Sample records for web coverage processing

  1. rasdaman Array Database: current status

    NASA Astrophysics Data System (ADS)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    rasdaman (Raster Data Manager) is a free open-source Array Database Management System which provides functionality for storing and processing massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed with the help of SQL queries), scalability (rasdaman's distributed architecture enables it to run seamlessly on cloud infrastructures, with performance increasing as computation resources are added), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (the legacy communication protocol has been replaced with a new one based on cutting-edge technology: Google Protocol Buffers and ZeroMQ). The data the system handles includes 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these datasets cannot be stored as raw arrays alone, since location information is also needed to geoposition the contents correctly on Earth. This is defined by ISO 19123 as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman in order to support the Geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service is an extension which provides zoom and pan navigation over images provided by a map server. Starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data. Support is also provided for several extensions of this service: the Subsetting Extension, the Scaling Extension, and, starting with version 9.1, the Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting away the specifics of the standard request definitions. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering of multi-dimensional raster coverages. rasdaman exposes this service through the WCS Processing Extension. Demonstrations are provided online via the Earthlook website (earthlook.org), which presents use cases from a wide variety of application domains, using the rasdaman system as the processing engine.
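
    A hedged illustration of the standards-based access described above: the following Python sketch requests a spatial subset of a hypothetical 2-D coverage from a rasdaman/petascope endpoint via WCS 2.0 KVP. The endpoint URL and coverage name are invented placeholders, not a real service.

      # Sketch only: fetch a lat/long subset of a hypothetical coverage
      # via OGC WCS 2.0 (Subsetting Extension). Requires the third-party
      # "requests" package. Endpoint and coverageId are assumptions.
      import requests

      ENDPOINT = "http://example.org/rasdaman/ows"  # hypothetical petascope URL

      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "GetCoverage",
          "coverageId": "MyCoverage",               # hypothetical coverage name
          "subset": ["Lat(40,45)", "Long(10,15)"],  # repeated subset= parameters
          "format": "image/tiff",
      }
      resp = requests.get(ENDPOINT, params=params)
      resp.raise_for_status()
      with open("subset.tif", "wb") as f:
          f.write(resp.content)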

  2. Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Riechert, Maik; Blower, Jon; Griffiths, Guy

    2016-04-01

    Coverage data, typically big in volume, assigns values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Relevant factors here are processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways of working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies such as JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, alongside simple rendered images served through standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. Development also focused on making it a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including client-side reclassification of a land cover dataset within the browser, where the user can influence the reclassification result, by making use of the above technologies.
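
    A minimal sketch of the general shape of a CoverageJSON document for a tiny 2-D grid, built in Python; the key names follow the CoverageJSON structure, while the axes, parameter and values are invented for illustration.

      # Sketch only: assemble and print a toy CoverageJSON document.
      # All axis values, the parameter and its unit are made up.
      import json

      coverage = {
          "type": "Coverage",
          "domain": {
              "type": "Domain",
              "domainType": "Grid",
              "axes": {
                  "x": {"values": [-10.0, -9.5, -9.0]},
                  "y": {"values": [50.0, 50.5]},
              },
          },
          "parameters": {
              "temperature": {
                  "type": "Parameter",
                  "unit": {"symbol": "K"},
                  "observedProperty": {"label": {"en": "Air temperature"}},
              }
          },
          "ranges": {
              "temperature": {
                  "type": "NdArray",
                  "dataType": "float",
                  "axisNames": ["y", "x"],
                  "shape": [2, 3],
                  "values": [275.1, 275.3, 275.6, 274.8, 275.0, 275.2],
              }
          },
      }
      print(json.dumps(coverage, indent=2))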

  3. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known from databases to be advantageous: declarativeness (describe results rather than algorithms), safety in evaluation (no request can keep a server busy indefinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning à la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it does not anticipate any particular protocol. One such protocol is given by the OGC Web Coverage Service (WCS) Processing Extension standard, which ties WCPS into WCS. Another protocol, which makes WCPS an OGC Web Processing Service (WPS) profile, is under preparation. Thereby, WCPS bridges WCS and WPS. The conceptual model of WCPS relies on the coverage model of WCS, which in turn is based on ISO 19123. WCS currently addresses raster-type coverages, where a coverage is seen as a function mapping points from a spatio-temporal extent (its domain) into values of some cell type (its range). A retrievable coverage has an identifier associated with it, plus the CRSs supported and, for each range field (aka band, channel), the applicable interpolation methods. The WCPS language offers access to one or several such coverages via a functional, side-effect-free language. The following example, which derives the NDVI (Normalized Difference Vegetation Index) from given coverages C1, C2, and C3 within the regions identified by the binary mask R, illustrates the language concept: for c in ( C1, C2, C3 ), r in ( R ) return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" ). The result is a list of three HDF-EOS encoded images containing masked NDVI values. Note that the same request can operate on coverages of any dimensionality. The expressive power of WCPS includes statistics, image, and signal processing, but stops short of recursion in order to maintain safe evaluation.
    As both syntax and semantics of any WCPS expression are well known, the language is Semantic Web ready: clients can construct WCPS requests on the fly, and servers can optimize such requests (this has been investigated extensively with the rasdaman raster database system) and automatically distribute them for processing in a WCPS-enabled computing cloud. The WCPS Reference Implementation is being finalized now that the standard is stable; it will be released as open source once ready. Among the future tasks is extending WCPS to general meshes, in synchronization with the WCS standard. In this talk WCPS is presented in the context of OGC standardization. The author is co-chair of OGC's WCS Working Group (WG) and Coverages WG.
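
    As a sketch, the NDVI query quoted above could be submitted through the WCS Processing Extension roughly as follows in Python; the query text is taken from the abstract, while the service endpoint is a hypothetical placeholder.

      # Sketch only: submit a WCPS query via a ProcessCoverages request.
      # Requires the third-party "requests" package; the endpoint is invented.
      import requests

      ENDPOINT = "http://example.org/wcps"  # hypothetical WCPS-enabled WCS endpoint

      wcps_query = """
      for c in ( C1, C2, C3 ), r in ( R )
      return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" )
      """

      resp = requests.get(ENDPOINT, params={
          "service": "WCS",
          "version": "2.0.1",
          "request": "ProcessCoverages",
          "query": wcps_query,
      })
      resp.raise_for_status()  # the response carries the encoded result images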

  4. Assessing Pre-Service Candidates' Web-Based Electronic Portfolios.

    ERIC Educational Resources Information Center

    Lamson, Sharon; Thomas, Kelli R.; Aldrich, Jennifer; King, Andy

    This paper describes processes undertaken by Central Missouri State University's Department of Curriculum and Instruction to prepare teacher candidates to create Web-based professional portfolios, Central's expectations for content coverage within the electronic portfolios, and evaluation procedures. It also presents data on portfolio construction…

  5. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  6. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  7. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  8. Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.

    2006-12-01

    An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out the key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
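
    To make the "slices by height or by time" idea concrete, here is a hedged Python sketch of such an extraction phrased in present-day WCS 2.0 KVP style (OWS-4 itself used earlier WCS versions); the endpoint, coverage id and axis labels are invented placeholders.

      # Sketch only: slice a hypothetical 4-D model-output cube at one
      # pressure level and one time step, returning a 2-D field as NetCDF.
      import requests

      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "GetCoverage",
          "coverageId": "GOCART_AEROSOL",  # hypothetical model-output coverage
          "subset": ['height(850)', 'time("2006-10-01T00:00:00Z")'],
          "format": "application/netcdf",
      }
      resp = requests.get("http://example.org/wcs", params=params)
      resp.raise_for_status()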

  9. Footprint Database and web services for the Herschel space observatory

    NASA Astrophysics Data System (ADS)

    Verebélyi, Erika; Dobos, László; Kiss, Csaba

    2015-08-01

    Using all telemetry and observational meta-data, we created a searchable database of Herschel observation footprints. Data from the Herschel space observatory is freely available for everyone but no uniformly processed catalog of all observations has been published yet. As a first step, we unified the data model for all three Herschel instruments in all observation modes and compiled a database of sky coverage information. As opposed to methods using a pixellation of the sphere, in our database, sky coverage is stored in exact geometric form allowing for precise area calculations. Indexing of the footprints allows for very fast search among observations based on pointing, time, sky coverage overlap and meta-data. This enables us, for example, to find moving objects easily in Herschel fields. The database is accessible via a web site and also as a set of REST web service functions which makes it usable from program clients like Python or IDL scripts. Data is available in various formats including Virtual Observatory standards.

  10. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML description. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis on coverage data via SQL-style queries. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to be scalable to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. Implementation will be based on the existing rasdaman server technology. Services using rasdaman technology are being installed to serve the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is being provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the available data sets to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, data for a particular area, or data within a particular range of pixel values, can be selected from a single coverage. Queries on multiple surfaces can be constructed to calculate, for example, the thickness between two surfaces in a 3D model, or the depth from the ground surface to the top of a particular geologic unit; a sketch of such a query follows below. In the first version of the service a simple interface showing some example queries has been implemented in order to show the potential of the technologies. The project aims to develop the services in light of user feedback, in terms of the data available, the functionality and the interface. User feedback on the services guides the software and standards development aspects of the project, leading to enhanced versions of the software which will be implemented in upgraded versions of the services during the lifetime of the project.
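
    A hedged sketch of the kind of multi-coverage WCPS query described above, computing the thickness between two surfaces as the pixelwise difference of two elevation coverages; the coverage names and endpoint are invented placeholders.

      # Sketch only: subtract a base surface from a top surface in WCPS,
      # returning the thickness grid as GeoTIFF. Names are hypothetical.
      import requests

      wcps_query = """
      for top in ( SURFACE_TOP ), base in ( SURFACE_BASE )
      return encode( top - base, "tiff" )
      """

      resp = requests.get("http://example.org/ows", params={
          "service": "WCS", "version": "2.0.1",
          "request": "ProcessCoverages", "query": wcps_query,
      })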

  11. [Utilization and coverage of a Food and Nutritional Surveillance System in Rio Grande do Sul state, Brazil].

    PubMed

    Jung, Natália Miranda; Bairros, Fernanda de Souza; Neutzling, Marilda Borges

    2014-05-01

    This article seeks to describe the utilization and coverage percentages of the Nutritional and Food Surveillance System (SISVAN-Web) in the Regional Health Offices of Rio Grande do Sul in 2010, and to assess their correlation with socio-economic, demographic and health system organization variables at the time. It is an ecological study that used secondary data from SISVAN-Web, the Department of Primary Health Care, the IT Department of the Unified Health System and the Brazilian Institute of Geography and Statistics. The evaluation of utilization and coverage data was restricted to nutritional status. The utilization percentage of SISVAN-Web refers to the proportion of cities that fed data into the system. Total coverage was defined as the percentage of individuals in all stages of the life cycle monitored by SISVAN-Web. It was found that 324 cities fed the application, corresponding to a utilization percentage of 65.3%. Greater system coverage was observed in all Regional Health Coordination (RHC) Units for ages 0-5 years and 5-10 years. There was a significant association between the utilization percentage of SISVAN-Web and Family Health Strategy coverage in each RHC Unit. The results of this study indicate low utilization and coverage percentages of SISVAN-Web in Rio Grande do Sul.

  12. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  13. Coverage Gains After the Affordable Care Act Among the Uninsured in Minnesota.

    PubMed

    Call, Kathleen Thiede; Lukanen, Elizabeth; Spencer, Donna; Alarcón, Giovann; Kemmick Pintor, Jessie; Baines Simon, Alisha; Gildemeister, Stefan

    2015-11-01

    We determined whether and how Minnesotans who were uninsured in 2013 gained health insurance coverage in 2014, 1 year after the Affordable Care Act (ACA) expanded Medicaid coverage and enrollment. Insurance status and enrollment experiences came from the Minnesota Health Insurance Transitions Study (MH-HITS), a follow-up telephone survey of children and adults in Minnesota who had no health insurance in the fall of 2013. The ACA had a tempered success in Minnesota. Outreach and enrollment efforts were effective; one half of those previously uninsured gained coverage, although many (nearly 62%) reported difficulty signing up. Of the previously uninsured who gained coverage, 44% obtained their coverage through MNsure, Minnesota's insurance marketplace. Most of those who remained uninsured had heard of MNsure and had visited the Web site. Many still struggled with the enrollment process or reported being deterred by the cost of coverage. Targeting outreach, simplifying the enrollment process, focusing on affordability, and continuing funding for in-person assistance will be important in the future.

  14. Coverage Gains After the Affordable Care Act Among the Uninsured in Minnesota

    PubMed Central

    Lukanen, Elizabeth; Spencer, Donna; Alarcón, Giovann; Kemmick Pintor, Jessie; Baines Simon, Alisha; Gildemeister, Stefan

    2015-01-01

    Objectives. We determined whether and how Minnesotans who were uninsured in 2013 gained health insurance coverage in 2014, 1 year after the Affordable Care Act (ACA) expanded Medicaid coverage and enrollment. Methods. Insurance status and enrollment experiences came from the Minnesota Health Insurance Transitions Study (MH-HITS), a follow-up telephone survey of children and adults in Minnesota who had no health insurance in the fall of 2013. Results. The ACA had a tempered success in Minnesota. Outreach and enrollment efforts were effective; one half of those previously uninsured gained coverage, although many (nearly 62%) reported difficulty signing up. Of the previously uninsured who gained coverage, 44% obtained their coverage through MNsure, Minnesota's insurance marketplace. Most of those who remained uninsured had heard of MNsure and had visited the Web site. Many still struggled with the enrollment process or reported being deterred by the cost of coverage. Conclusions. Targeting outreach, simplifying the enrollment process, focusing on affordability, and continuing funding for in-person assistance will be important in the future. PMID:26447912

  15. The quality of online antidepressant drug information: an evaluation of English and Finnish language Web sites.

    PubMed

    Prusti, Marjo; Lehtineva, Susanna; Pohjanoksa-Mäntylä, Marika; Bell, J Simon

    2012-01-01

    The Internet is a frequently used source of drug information, including among people with mental disorders. Online drug information may be narrow in scope, incomplete, and contain errors of omission. The aim of this study was to evaluate the quality of online antidepressant drug information in English and Finnish. Forty Web sites were identified using the search terms antidepressants and masennuslääkkeet in English and Finnish, respectively. Included Web sites (14 English, 8 Finnish) were evaluated for aesthetics, interactivity, content coverage, and content correctness using published criteria. All Web sites were assessed using the Date, Author, References, Type, Sponsor (DARTS) and DISCERN quality assessment tools. English and Finnish Web sites had similar aesthetics, content coverage, and content correctness scores. English Web sites were more interactive than Finnish Web sites (P<.05). Overall, adverse drug reactions were covered on 21 of 22 Web sites; however, drug-alcohol interactions were addressed on only 9 of 22 Web sites, and dose was addressed on only 6 of 22 Web sites. Few (2/22 Web sites) provided incorrect information. The DISCERN score was significantly correlated with content coverage (r=0.670, P<.01), content correctness (r=0.663, P<.01), and the DARTS score (r=0.459, P<.05). No Web site provided information about all aspects of antidepressant treatment. Nevertheless, few Web sites provided incorrect information. Both English and Finnish Web sites were similar in terms of aesthetics, content coverage, and content correctness. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. HyspIRI Low Latency Concept and Benchmarks

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    Topics include HyspIRI low latency data ops concept, HyspIRI data flow, ongoing efforts, experiment with Web Coverage Processing Service (WCPS) approach to injecting new algorithms into SensorWeb, low fidelity HyspIRI IPM testbed, compute cloud testbed, open cloud testbed environment, Global Lambda Integrated Facility (GLIF) and OCC collaboration with Starlight, delay tolerant network (DTN) protocol benchmarking, and EO-1 configuration for preliminary DTN prototype.

  17. Searching the World Wide Web

    PubMed

    Lawrence; Giles

    1998-04-03

    The coverage and recency of the major World Wide Web search engines were analyzed, yielding some surprising results. The coverage of any one engine is significantly limited: no single engine indexes more than about one-third of the "indexable Web," the coverage of the six engines investigated varies by an order of magnitude, and combining the results of the six engines yields about 3.5 times as many documents on average as compared with the results from only one engine. Analysis of the overlap between pairs of engines gives an estimated lower bound on the size of the indexable Web of 320 million pages.
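
    The pairwise-overlap estimate follows classical capture-recapture reasoning: if two engines sample the Web roughly independently, the total size is estimated as the product of their index sizes divided by the size of their overlap. A toy Python calculation with invented counts:

      # Toy illustration only; all counts are made up and do not reproduce
      # the paper's data. Estimate: N ~= n_a * n_b / n_ab.
      n_a = 100_000_000   # pages indexed by engine A (hypothetical)
      n_b = 120_000_000   # pages indexed by engine B (hypothetical)
      n_ab = 37_500_000   # pages indexed by both (hypothetical)

      estimated_web_size = n_a * n_b / n_ab
      print(f"Estimated indexable Web: {estimated_web_size:.3g} pages")  # ~3.2e+08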

  18. A Query Language for Handling Big Observation Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon; Koppe, Roland

    2017-04-01

    The Sensor Web provides a framework for the standardized Web-based sharing of environmental observations and sensor metadata. While these standards address the issue of varying data formats and protocols, the fast-growing size of observational data imposes new challenges for their application. Most solutions for handling big observational datasets currently focus on remote sensing applications, while big in-situ datasets relying on vector features still lack a solid approach. Conventional Sensor Web technologies may not be adequate, as the sheer size of the data transmitted and the amount of metadata accumulated may render traditional OGC Sensor Observation Services (SOS) unusable. Besides novel approaches to storing and processing observation data in place, e.g. by harnessing big data technologies from mainstream IT, the access layer has to be amended to utilize and integrate these large observational data archives into applications and to enable analysis. For this, an extension to the SOS will be discussed that establishes a query language to dynamically process and filter observations at the storage level, similar to the OGC Web Coverage Service (WCS) and its Web Coverage Processing Service (WCPS) extension. This will enable applications to request, e.g., spatially or temporally aggregated data sets at a resolution they are able to display or require. The approach will be developed and implemented in cooperation with the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, whose catalogue of data comprises marine observations of physical, chemical and biological phenomena from a wide variety of sensors, both mobile (research vessels, aircraft or underwater vehicles) and stationary (buoys or research stations). Observations are made with a high temporal resolution, and the resulting time series may span multiple decades.

  19. Spatial data standards meet meteorological data - pushing the boundaries

    NASA Astrophysics Data System (ADS)

    Wagemann, Julia; Siemen, Stephan; Lamy-Thepaut, Sylvie

    2017-04-01

    The data archive of the European Centre for Medium-Range Weather Forecasts (ECMWF) holds around 120 PB of data and is the world's largest archive of meteorological data. This information is of great value for many Earth Science disciplines, but the complexity of the data (up to five dimensions and different time axis domains) and its native data format GRIB, while an efficient archive format, limit the overall data uptake, especially by users outside the MetOcean domain. ECMWF's MARS WebAPI is a very efficient and flexible system for expert users to access and retrieve meteorological data, though challenging for users outside the MetOcean domain. With the help of web-based standards for data access and processing, ECMWF wants to make more than 1 PB of meteorological and climate data more easily accessible to users across different Earth Science disciplines. As climate data provider for the H2020 project EarthServer-2, ECMWF explores the feasibility of giving on-demand access to its MARS archive via the OGC standard interface Web Coverage Service (WCS). Despite the potential a WCS offers for climate and meteorological data, the standards-based modelling of such data entails many challenges and reveals the boundaries of the current Web Coverage Service 2.0 standard. Challenges range from valid semantic data models for meteorological data to optimal and efficient data structures for a scalable web service. The presentation reviews the applicability of the current Web Coverage Service 2.0 standard to meteorological and climate data and discusses challenges that must be overcome in order to achieve real interoperability and to ensure the conformant sharing and exchange of meteorological data.
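
    A hedged sketch of what such on-demand WCS access to a meteorological field might look like from Python, with the server trimming in time and space and converting from the native archive format to NetCDF; the endpoint, coverage id and axis labels are invented placeholders, not an actual ECMWF service path.

      # Sketch only: trim a hypothetical temperature coverage in time and
      # space and request NetCDF output via WCS 2.0 KVP.
      import requests

      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "GetCoverage",
          "coverageId": "temperature_2m",  # hypothetical coverage id
          "subset": ['ansi("2015-01-01","2015-12-31")',  # time trim
                     "Lat(30,70)", "Long(-20,40)"],      # spatial trim
          "format": "application/netcdf",
      }
      resp = requests.get("http://example.org/wcs", params=params)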

  20. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    NASA Astrophysics Data System (ADS)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, providing users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language, similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean; however, the more complex the query, the more processing-intensive it is. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries from Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service. The entire content is available as Jupyter Notebooks, as they prove to be a highly beneficial tool for generating reproducible workflows for environmental data analysis.
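
    In the spirit of the Python query-helper described above, here is a hedged sketch of a WCPS query that condenses a hypothetical 3-D (lat/long/time) cube into its 2-D temporal mean; the coverage name, the "ansi" time-axis label and the number of time steps are illustrative assumptions.

      # Sketch only: sum the cube along its time axis with a WCPS condense
      # expression, then divide by the (assumed) number of time steps.
      import requests

      N_STEPS = 365  # assumed number of time slices in the cube

      wcps_query = f"""
      for c in ( SST_CUBE )
      return encode(
        ( condense +
          over $t t( imageCrsDomain(c, ansi) )
          using c[ ansi($t) ] ) / {N_STEPS},
        "netcdf" )
      """

      resp = requests.get("http://example.org/ows", params={
          "service": "WCS", "version": "2.0.1",
          "request": "ProcessCoverages", "query": wcps_query,
      })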

  21. An Innovative Open Data-driven Approach for Improved Interpretation of Coverage Data at NASA JPL's PO.DAAC

    NASA Astrophysics Data System (ADS)

    McGibbney, L. J.; Armstrong, E. M.

    2016-12-01

    Figuratively speaking, Scientific Datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth Sciences, the most notable general examples include the HDF family, NetCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the Oceanographic, Atmospheric and Meteorological sciences. Such file formats contain coverage data, i.e. a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that this is still not a straightforward issue due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way that fits the linked data publication paradigm, hence lowering the barrier of interpretation for consumers via mobile devices and client applications, etc., as well as for data producers who can build next-generation Web-friendly services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data, embedded within SD landing pages as well as via semantic data repositories. We are currently evaluating how utilization of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers are not currently addressing content-based optimization within their SD landing pages for better crawlability by commercial search engines.

  22. Cross-Dataset Analysis and Visualization Driven by Expressive Web Services

    NASA Astrophysics Data System (ADS)

    Dumitru, Alexandru Mircea; Merticariu, Vlad Catalin

    2015-04-01

    The deluge of data that is hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: rather than requiring the data to be downloaded and stored beforehand, they interact in real-time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data and moving the processing closer to the data. In this context we have created a visualization application for analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources with different spatial and time resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the potential of the OGC Web Coverage Service, the most exciting aspect being its processing extension, the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technology made the application versatile and portable. As the processing is done on the server side, we ensured that the minimal amount of data is transferred and that the processing is done on a fully capable server, leaving the client hardware resources to be used for rendering the visualization. The application offers a set of features to visualize and cross-compare the datasets. Users can select a region of interest in space and time on which an aerosol map layer is plotted. Hovmoeller time-latitude and time-longitude profiles can be displayed by selecting orthogonal cross-sections on the globe. Statistics about the selected dataset are also displayed in different text and plot formats. The datasets can also be cross-compared, either by using the delta map tool or the merged map tool. For more advanced users, a WCPS query console is also offered, allowing users to process their data with ad-hoc queries and then choose how to display the results. Overall, the user has a rich set of tools that can be used to visualize and cross-compare the aerosol datasets. With our application we have shown how the NASA WorldWind framework can be used to display results processed efficiently - and entirely - on the server side using the expressiveness of the OGC WCPS web service. The application serves not only as a proof of concept of a new paradigm in working with large geospatial data but also as a useful tool for environmental data analysts.
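
    A hedged sketch of the kind of WCPS query that could sit behind the delta-map tool described above: the pixelwise difference of two aerosol optical thickness coverages on a common grid. Coverage names are invented placeholders.

      # Sketch only: a WCPS delta map as the difference of two coverages;
      # submitted via a ProcessCoverages request, as in the sketches above.
      wcps_query = """
      for a in ( AOT_SENSOR_A ), b in ( AOT_SENSOR_B )
      return encode( a - b, "png" )
      """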

  23. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and a BPEL execution engine is adopted to run it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.

  24. Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul

    Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.

  25. Lessons from Communicating Space Science Over the Web

    NASA Technical Reports Server (NTRS)

    Dooling, David, Jr.; Triese, D.

    2000-01-01

    The Science Directorate at NASA's Marshall Space Flight Center uses the web in an aggressive manner to expand communications beyond the traditional "public affairs" or "media relations" routines. The key to success has been developing a balanced process that A) involves laboratory personnel and the NASA center community through a weekly Science Communications Roundtable, B) vests ownership and development of the product (i.e., the story) in the scientist and a writer resident in the laboratory, and C) taps the talents of the outside communications community through the Research/Roadmap Communications activity. The process is flexible and responsive, allowing Science@NASA to provide daily coverage for events, such as two materials science missions managed by NASA/Marshall. In addition to developing materials for the web, Science@NASA has conducted extensive research to determine what subjects people seek on the web, and the best methods to position stories so they will be found and read.

  26. Coverage and quality: A comparison of Web of Science and Scopus databases for reporting faculty nursing publication metrics.

    PubMed

    Powell, Kimberly R; Peterson, Shenita R

    Web of Science and Scopus are the leading databases of scholarly impact. Recent studies outside the field of nursing report differences in journal coverage and quality between them. A comparative analysis of nursing publications' reported impact was therefore conducted. Journal coverage by each database for the field of nursing was compared. Additionally, publications by 2014 nursing faculty were collected in both databases and compared for overall coverage and reported quality, as modeled by SCImago Journal Rank, peer review status, and MEDLINE inclusion. Individual author impact, modeled by the h-index, was calculated from each database for comparison. Scopus offered significantly higher journal coverage. For 2014 faculty publications, 100% of journals were found in Scopus; Web of Science offered 82%. No significant difference was found in the quality of reported journals. Author h-index was found to be higher in Scopus. When reporting faculty publications and scholarly impact, academic nursing programs may be better represented by Scopus, without compromising journal quality. Programs with strong interdisciplinary work should examine all areas of strength to ensure appropriate coverage. Copyright © 2017 Elsevier Inc. All rights reserved.
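
    The h-index used here for author impact is the largest h such that an author has h papers with at least h citations each; a small Python illustration with invented citation counts (they do not reproduce the study's data):

      # Toy illustration of the h-index comparison; all counts are made up.
      def h_index(citations):
          """Largest h such that h papers have at least h citations each."""
          h = 0
          for rank, c in enumerate(sorted(citations, reverse=True), start=1):
              if c >= rank:
                  h = rank
              else:
                  break
          return h

      scopus_counts = [24, 18, 11, 9, 8, 6, 2]  # hypothetical, from Scopus
      wos_counts = [21, 15, 9, 7, 5, 3, 1]      # hypothetical, from Web of Science
      print(h_index(scopus_counts), h_index(wos_counts))  # prints: 6 5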

  27. Discrepancies among Scopus, Web of Science, and PubMed coverage of funding information in medical journal articles.

    PubMed

    Kokol, Peter; Vošner, Helena Blažun

    2018-01-01

    The overall aim of the present study was to compare the coverage of existing research funding information for articles indexed in Scopus, Web of Science, and PubMed databases. The numbers of articles with funding information published in 2015 were identified in the three selected databases and compared using bibliometric analysis of a sample of twenty-eight prestigious medical journals. Frequency analysis of the number of articles with funding information showed statistically significant differences between Scopus, Web of Science, and PubMed databases. The largest proportion of articles with funding information was found in Web of Science (29.0%), followed by PubMed (14.6%) and Scopus (7.7%). The results show that coverage of funding information differs significantly among Scopus, Web of Science, and PubMed databases in a sample of the same medical journals. Moreover, we found that, currently, funding data in PubMed is more difficult to obtain and analyze compared with that in the other two databases.

  28. New Quality Metrics for Web Search Results

    NASA Astrophysics Data System (ADS)

    Metaxas, Panagiotis Takis; Ivanova, Lilia; Mustafaraj, Eni

    Web search results enjoy increasing importance in our daily lives. But what can be said about their quality, especially when querying a controversial issue? The traditional information retrieval metrics of precision and recall do not provide much insight in the case of web information retrieval. In this paper we examine new ways of evaluating quality in search results: coverage and independence. We give examples of how these new metrics can be calculated and what their values reveal regarding the two major search engines, Google and Yahoo. We have found evidence of low coverage for commercial and medical controversial queries, and high coverage for a political query that is highly contested. Given that search engines are unwilling to tune their search results manually, except in a few cases that have become the source of bad publicity, low coverage and independence reveal the efforts of dedicated groups to manipulate the search results.

  29. Big Geo Data Services: From More Bytes to More Barrels

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2016-04-01

    The data deluge is affecting the oil and gas industry just as much as many other industries. However, aside from the sheer volume there is the challenge of data variety, such as regular and irregular grids, multi-dimensional space/time grids, point clouds, and TINs and other meshes. A uniform conceptualization for modelling and serving them could save substantial effort, such as the proverbial "department of reformatting". The notion of a coverage can actually accomplish this. Its abstract model in ISO 19123, together with the concrete, interoperable OGC Coverage Implementation Schema (CIS), currently under adoption as ISO 19123-2, provides a common platform for representing any n-D grid type, point clouds, and general meshes. This is paired with the OGC Web Coverage Service (WCS) together with its datacube analytics language, the OGC Web Coverage Processing Service (WCPS). The OGC WCS Core Reference Implementation, rasdaman, relies on Array Database technology, i.e. a NewSQL/NoSQL approach. It supports the grid part of coverages, with installations of 100+ TB known and single queries parallelized across 1,000+ cloud nodes. Recent research attempts to address the point cloud and mesh part through a unified query model. The Holy Grail envisioned is that these approaches can be merged into a single service interface at some time. We present both grid and point cloud / mesh approaches and discuss status, implementation, standardization, and research perspectives, including a live demo.

  30. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

    Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is given by coverage data. ISO 19123 defines coverages, simplifying somewhat, as a representation of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1-D in-situ sensor time series, 2-D EO imagery, 3-D x/y/t image time series and x/y/z geophysical data, and 4-D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are given - however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data, based on an integration of W3C XQuery for alphanumeric data and OGC WCPS for raster data. Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development which will allow bespoke interactive clients to be composed quickly, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all Earth Sciences, including satellite and airborne earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift the data providers' semantic level from data stewardship to service stewardship.

  31. The Footprint Database and Web Services of the Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Dobos, László; Varga-Verebélyi, Erika; Verdugo, Eva; Teyssier, David; Exter, Katrina; Valtchanov, Ivan; Budavári, Tamás; Kiss, Csaba

    2016-10-01

    Data from the Herschel Space Observatory is freely available to the public but no uniformly processed catalogue of the observations has been published so far. To date, the Herschel Science Archive does not contain the exact sky coverage (footprint) of individual observations and supports search for measurements based on bounding circles only. Drawing on previous experience in implementing footprint databases, we built the Herschel Footprint Database and Web Services for the Herschel Space Observatory to provide efficient search capabilities for typical astronomical queries. The database was designed with the following main goals in mind: (a) provide a unified data model for meta-data of all instruments and observational modes, (b) quickly find observations covering a selected object and its neighbourhood, (c) quickly find every observation in a larger area of the sky, (d) allow for finding solar system objects crossing observation fields. As a first step, we developed a unified data model of observations of all three Herschel instruments for all pointing and instrument modes. Then, using telescope pointing information and observational meta-data, we compiled a database of footprints. As opposed to methods using pixellation of the sphere, we represent sky coverage in an exact geometric form allowing for precise area calculations. For easier handling of Herschel observation footprints with rather complex shapes, two algorithms were implemented to reduce the outline. Furthermore, a new visualisation tool to plot footprints with various spherical projections was developed. Indexing of the footprints using Hierarchical Triangular Mesh makes it possible to quickly find observations based on sky coverage, time and meta-data. The database is accessible via a web site http://herschel.vo.elte.hu and also as a set of REST web service functions, which makes it readily usable from programming environments such as Python or IDL. The web service allows downloading footprint data in various formats including Virtual Observatory standards.
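
    A hedged sketch of how such a REST footprint search might be called from Python: the host is taken from the text, but the "/search" path and parameter names are invented placeholders for illustration.

      # Sketch only: ask a footprint service which observations cover a
      # given sky position. Path and parameters are assumptions.
      import requests

      resp = requests.get(
          "http://herschel.vo.elte.hu/search",      # hypothetical endpoint path
          params={"ra": 83.822, "dec": -5.391,      # J2000 degrees (Orion Nebula)
                  "format": "json"},
      )
      observations = resp.json() if resp.ok else []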

  32. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service, along with the user interface, follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.

  33. QBCov: A Linked Data interface for Discrete Global Grid Systems, a new approach to delivering coverage data on the web

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Toyer, S.; Brizhinev, D.; Ledger, M.; Taylor, K.; Purss, M. B. J.

    2016-12-01

    We are witnessing a rapid proliferation of geoscientific and geospatial data from an increasing variety of sensors and sensor networks. This data presents great opportunities to resolve cross-disciplinary problems. However, working with it often requires an understanding of file formats and protocols seldom used outside of scientific computing, potentially limiting the data's value to other disciplines. In this paper, we present a new approach to serving satellite coverage data on the web, which improves ease-of-access using the principles of linked data. Linked data adapts the concepts and protocols of the human-readable web to machine-readable data; the number of developers familiar with web technologies makes linked data a natural choice for bringing coverages to a wider audience. Our approach to using linked data also makes it possible to efficiently service high-level SPARQL queries: for example, "Retrieve all Landsat ETM+ observations of San Francisco between July and August 2016" can easily be encoded in a single query. We validate the new approach, which we call QBCov, with a reference implementation of the entire stack, including a simple web-based client for interacting with Landsat observations. In addition to demonstrating the utility of linked data for publishing coverages, we investigate the heretofore unexplored relationship between Discrete Global Grid Systems (DGGS) and linked data. Our conclusions are informed by the aforementioned reference implementation of QBCov, which is backed by a hierarchical file format designed around the rHEALPix DGGS. Not only does the choice of a DGGS-based representation provide an efficient mechanism for accessing large coverages at multiple scales, but the ability of DGGS to produce persistent, unique identifiers for spatial regions is especially valuable in a linked data context. This suggests that DGGS has an important role to play in creating sustainable and scalable linked data infrastructures. QBCov is being developed as a contribution to the Spatial Data on the Web working group, a joint activity of the Open Geospatial Consortium and the World Wide Web Consortium.
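
    A hedged sketch of how the example query above might be phrased in SPARQL; every prefix, property and URI below is an invented placeholder, since the paper's actual vocabulary is not reproduced here.

      # Sketch only: the English query from the abstract as SPARQL,
      # using entirely hypothetical vocabulary terms.
      sparql_query = """
      PREFIX qb:  <http://purl.org/linked-data/cube#>
      PREFIX ex:  <http://example.org/qbcov#>
      PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

      SELECT ?obs WHERE {
        ?obs a qb:Observation ;
             ex:platform   "Landsat ETM+" ;
             ex:coversArea ex:SanFrancisco ;
             ex:time       ?t .
        FILTER ( ?t >= "2016-07-01T00:00:00Z"^^xsd:dateTime &&
                 ?t <  "2016-09-01T00:00:00Z"^^xsd:dateTime )
      }
      """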

  34. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source at www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use, with individual databases holding dozens of Terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman serves as the WCS Core Reference Implementation. In our talk we present concepts, architecture, operational services, and the standardization impact of open-source rasdaman, as well as experience gained.

  15. SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy; hide

    2010-01-01

    This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept, developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user controls into the flight software modules associated with an on-orbit sensor, and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, including a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will be presented, along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails into efforts to develop "cloud computing" methods and standards.

  16. CEOS Ocean Variables Enabling Research and Applications for Geo (COVERAGE)

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Vazquez, J.; Zlotnicki, V.

    2017-12-01

    The CEOS Ocean Variables Enabling Research and Applications for GEO (COVERAGE) initiative seeks to facilitate joint utilization of different satellite data streams on ocean physics, better integrated with biological and in situ observations, including near real-time data streams, in support of oceanographic and decision support applications for societal benefit. COVERAGE aligns with the programmatic objectives of CEOS (the Committee on Earth Observation Satellites) and the missions of GEO-MBON (Marine Biodiversity Observation Network) and GEO-Blue Planet, which are to advance and exploit synergies among the many observational programs devoted to ocean and coastal waters. COVERAGE is conceived of as a 3-year pilot project involving international collaboration. It focuses on implementing technologies, including cloud-based solutions, to provide a data-rich, web-based platform for integrated ocean data delivery and access: multi-parameter observations, easily discoverable and usable, organized by discipline, available in near real-time, collocated to a common grid and including climatologies. These will be complemented by a set of value-added data services available via the COVERAGE portal, including an advanced web-based visualization interface, subsetting/extraction, data collocation/matchup and other relevant on-demand processing capabilities. COVERAGE development will be organized around priority use cases and applications identified by GEO and agency partners. The initial phase will be to develop collocated 25 km products from the four Ocean Virtual Constellations (VCs): Sea Surface Temperature, Sea Level, Ocean Color, and Sea Surface Winds. This aims to stimulate work among the ocean VCs while developing products and system functionality based on community recommendations. Products such as anomalies from a time mean would build on the theme of applications relevant to the CEOS/GEO mission and vision. Here we provide an overview of the COVERAGE initiative with an emphasis on the international collaborative aspects entailed, with the intent of soliciting community feedback as we develop and implement the system.

  17. The EarthServer Geology Service: web coverage services for geosciences

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2014-05-01

    The EarthServer FP7 project is implementing web coverage services using the OGC WCS and WCPS standards for a range of earth science domains: cryospheric, atmospheric, oceanographic, planetary, and geological. BGS is providing the geological service (http://earthserver.bgs.ac.uk/). Geoscience has used remotely sensed data from satellites and planes for some considerable time, but other areas of the geosciences are less familiar with the use of coverage data. This is rapidly changing with the development of new sensor networks and the move from geological maps to geological spatial models. The BGS geology service is designed initially to address two coverage data use cases and three levels of data access restriction. Databases of remotely sensed data are typically very large and commonly held offline, making it time-consuming for users to assess and then download data. The service is designed to allow the spatial selection, editing and display of Landsat and aerial photographic imagery, including band selection and contrast stretching. This enables users to rapidly view data, assess its usefulness for their purposes, and then enhance and download it if it is suitable. At present the service contains six-band Landsat 7 imagery (Blue, Green, Red, NIR 1, NIR 2, MIR) and three-band false colour aerial photography (NIR, green, blue), totalling around 1 TB. Increasingly, 3D spatial models are being produced in place of traditional geological maps. Models make explicit the spatial information that is implicit on maps, and thus are seen as a better way of delivering geoscience information to non-geoscientists. However, web delivery of models, including the provision of suitable visualisation clients, has proved more challenging than delivering maps. The EarthServer geology service is delivering 35 surfaces as coverages, comprising the modelled superficial deposits of the Glasgow area. These can be viewed using a 3D web client developed in the EarthServer project by Fraunhofer. As well as remotely sensed imagery and 3D models, the geology service is also delivering DTM coverages which can be viewed in the 3D client in conjunction with both imagery and models. The service is accessible through a web GUI which allows the imagery to be viewed against a range of background maps and DTMs, and in the 3D client; spatial selection to be carried out graphically; the results of image enhancement to be displayed; and selected data to be downloaded. The GUI also provides access to the Glasgow model in the 3D client, as well as tutorial material. In the final year of the project it is intended to increase the volume of data to 20 TB and enhance the WCPS processing, including depth and thickness querying of 3D models. We have also investigated the use of GeoSciML, developed to describe and interchange the information on geological maps, to describe model surface coverages. EarthServer is developing a combined WCPS and XQuery query language, and we will investigate applying this to the GeoSciML-described surfaces to answer questions such as 'find all units with a predominant sand lithology within 25m of the surface'.
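
    The spatial-selection and band-selection use case maps naturally onto a WCS 2.0 GetCoverage request. A minimal sketch follows, in which the coverage identifier, axis labels and band names are assumptions for illustration rather than the service's actual values:

        # Trim a Landsat coverage to a small area and three bands, as GeoTIFF.
        # Coverage id, axis labels, and band names are illustrative assumptions.
        import requests

        resp = requests.get(
            "http://earthserver.bgs.ac.uk/rasdaman/ows",
            params=[("service", "WCS"), ("version", "2.0.1"),
                    ("request", "GetCoverage"),
                    ("coverageId", "landsat7_gb"),
                    ("subset", "E(350000,360000)"),    # easting, metres
                    ("subset", "N(650000,660000)"),    # northing, metres
                    ("rangesubset", "Red,Green,Blue"),
                    ("format", "image/tiff")])
        with open("subset.tif", "wb") as f:
            f.write(resp.content)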

  18. Translating access into utilization: lessons from the design and evaluation of a health insurance Web site to promote reproductive health care for young women in Massachusetts.

    PubMed

    Janiak, Elizabeth; Rhodes, Elizabeth; Foster, Angel M

    2013-12-01

    Following state-level health care reform in Massachusetts, young women reported confusion over coverage of contraception and other sexual and reproductive health services under newly available health insurance products. To address this gap, a plain-language Web site titled "My Little Black Book for Sexual Health" was developed by a statewide network of reproductive health stakeholders. The purpose of this evaluation was to assess the health literacy demands and usability of the site among its target audience, women ages 18-26 years. We performed an evaluation of the literacy demands of the Web site's written content and tested the Web site's usability in a health communications laboratory. Participants found the Web site visually appealing and its overall design concept accessible. However, the Web site's literacy demands were high, and all participants encountered problems navigating through the Web site. Following this evaluation, the Web site was modified to be more usable and more comprehensible to women of all health literacy levels. To avail themselves of sexual and reproductive health services newly available under expanded health insurance coverage, young women require customized educational resources that are rigorously evaluated to ensure accessibility. To maximize utilization of reproductive health services under expanded health insurance coverage, US women require customized educational resources commensurate with their literacy skills. The application of established research methods from the field of health communications will enable advocates to evaluate and adapt these resources to best serve their targeted audiences. © 2013.

  19. Can EO afford big data - an assessment of the temporal and monetary costs of existing and emerging big data workflows

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2014-05-01

    The cost of working with extremely large data sets is an increasingly important issue within the Earth Observation community. From global-coverage data at any resolution to small-coverage data at extremely high resolution, the community has always produced big data, and this will only increase as new sensors are deployed and their data made available. Over time, standard workflows have emerged, facilitated by the production and adoption of standard technologies. Groups such as the International Organisation for Standardisation (ISO) and the Open Geospatial Consortium (OGC) have been a driving force in this area for many years. The production of standard protocols and interfaces such as OPeNDAP, Web Coverage Service (WCS), Web Processing Service (WPS) and newer emerging standards such as the Web Coverage Processing Service (WCPS) has helped to galvanise these workflows. As an example of a traditional workflow, assume a researcher wants to assess the temporal trend in chlorophyll concentration. This would involve a discovery phase, an acquisition phase, a processing phase and finally a derived product or analysis phase. Each element of this workflow has an associated temporal and monetary cost. Firstly, the researcher requires a high-bandwidth connection, or the acquisition phase would take too long. Secondly, the researcher must have their own expensive equipment for use in the processing phase. Both of these elements cost money and time, which can make the whole process prohibitive for scientists from the developing world or "citizen scientists" who do not have the necessary processing infrastructure. The use of emerging technologies can help improve both the monetary and time costs associated with these existing workflows. By utilising a WPS that is hosted at the same location as the data, a user is able to apply processing to the data without needing their own processing infrastructure. This, however, limits the user to predefined processes made available by the data provider. The emerging OGC WCPS standard, combined with big data analytics engines, may provide a mechanism to improve this situation. The technology allows users to create their own queries using an SQL-like query language and apply them over an available large data archive, once again at the data provider's end. This not only removes the processing cost whilst still allowing user-defined processes; it also reduces the bandwidth required, as only the final analysis or derived product needs to be downloaded. The maturity of the new technologies is at a stage where their use should be justified by a quantitative assessment rather than simply by the fact that they are new developments. We will present a study of the time and cost requirements for a selection of existing workflows and then show how new and emerging standards and technologies can help both to reduce the cost to the user by shifting processing to the data, and to reduce the bandwidth required for analysing large datasets, making analysis of big-data archives possible for a greater and more diverse audience.
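
    The bandwidth argument is easy to make concrete: a server-side aggregation returns a handful of bytes where the traditional workflow would download the whole archive subset. A sketch, with the endpoint and coverage name as placeholder assumptions:

        # Compute a spatio-temporal mean on the server; only the aggregate
        # crosses the network. Endpoint and coverage name are illustrative.
        import requests

        wcps = ('for c in (chlor_a_monthly) '
                'return avg(c[Lat(50:51), Long(-5:-4)])')
        resp = requests.get("https://example.org/rasdaman/ows",
                            params={"service": "WCS", "version": "2.0.1",
                                    "request": "ProcessCoverages", "query": wcps})
        print(resp.text)  # a single number instead of a bulk download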

  20. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to traditional research scientists as well as to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data, through distributed systems, to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
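
    For readers unfamiliar with the two OGC services mentioned, a WMS request returns a rendered picture while a WCS request returns the underlying data values. A minimal WMS 1.3.0 GetMap sketch, with the endpoint and layer name as illustrative assumptions:

        # Fetch a rendered global map of one layer as PNG.
        # Server URL and layer name are assumptions for illustration.
        import requests

        resp = requests.get(
            "https://example.gsfc.nasa.gov/wms",
            params={"service": "WMS", "version": "1.3.0", "request": "GetMap",
                    "layers": "surface_temperature", "styles": "",
                    "crs": "EPSG:4326", "bbox": "-90,-180,90,180",
                    "width": "1024", "height": "512", "format": "image/png"})
        with open("map.png", "wb") as f:
            f.write(resp.content)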

  1. EarthServer2 : The Marine Data Service - Web based and Programmatic Access to Ocean Colour Open Data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2017-04-01

    The ESA Ocean Colour - Climate Change Initiative (ESA OC-CCI) has produced a long-term, high-quality global dataset with associated per-pixel uncertainty data. This dataset has now grown to several hundred terabytes (uncompressed) and is freely available to download. However, the sheer size of the dataset can act as a barrier to many users; large network bandwidth, local storage and processing requirements can prevent researchers without the backing of a large organisation from taking advantage of this raw data. The EC H2020 project, EarthServer2, aims to create a federated data service providing access to more than 1 petabyte of earth science data. Within this federation the Marine Data Service already provides an innovative on-line toolkit for filtering, analysing and visualising OC-CCI data. Data are made available, filtered and processed at source through standards-based interfaces, the Open Geospatial Consortium Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). This work was initiated in the EC FP7 EarthServer project, where it was found that the unfamiliarity and complexity of these interfaces themselves created a barrier to wider uptake. The continuation project, EarthServer2, addresses these issues by providing higher-level tools for working with these data. We will present some examples of these tools. Many researchers wish to extract time series data from discrete points of interest. We will present a web-based interface, based on NASA/ESA WebWorldWind, for selecting points of interest and plotting time series from a chosen dataset. In addition, a CSV file of locations and times, such as a ship's track, can be uploaded and these points extracted and returned in a CSV file, allowing researchers to work with the extract locally, for instance in a spreadsheet. We will also present a set of Python and JavaScript APIs that have been created to complement and extend the web-based GUI. These APIs allow the selection of single points and areas for extraction. The extracted data is returned as structured data (for instance a Python array) which can then be passed directly to local processing code. We will highlight how the libraries can be used by the community and integrated into existing systems, for instance by the use of Jupyter notebooks to share Python code examples which can then be used by other researchers as a basis for their own work.
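
    The project's actual Python API is not reproduced in this record, but the point-extraction workflow it wraps can be sketched directly against the WCPS interface; the coverage name, axis labels, endpoint, and response parsing below are all assumptions:

        # Extract a chlorophyll time series at a ship-track point and parse it
        # into a Python list. All identifiers are illustrative assumptions.
        import requests

        wcps = ('for c in (occci_chlor_a) '
                'return encode(c[Lat(50.25), Long(-4.17)], "csv")')
        resp = requests.get("https://example.org/rasdaman/ows",
                            params={"service": "WCS", "version": "2.0.1",
                                    "request": "ProcessCoverages", "query": wcps})
        series = [float(v) for v in resp.text.strip("{}[] \n").split(",")]
        print(series[:5])  # ready to pass to local analysis code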

  2. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  3. Teaching Critical Evaluation Skills for World Wide Web Resources.

    ERIC Educational Resources Information Center

    Tate, Marsha; Alexander, Jan

    1996-01-01

    Outlines a lesson plan used by an academic library to evaluate the quality of World Wide Web information. Discusses the traditional evaluation criteria of accuracy, authority, objectivity, currency, and coverage as they apply to the unique characteristics of Web pages: their marketing orientation, variety of information, and instability. The…

  4. MALINA: a web service for visual analytics of human gut microbiota whole-genome metagenomic reads.

    PubMed

    Tyakht, Alexander V; Popenko, Anna S; Belenikin, Maxim S; Altukhov, Ilya A; Pavlenko, Alexander V; Kostryukova, Elena S; Selezneva, Oksana V; Larin, Andrei K; Karpova, Irina Y; Alexeev, Dmitry G

    2012-12-07

    MALINA is a web service for the bioinformatic analysis of whole-genome metagenomic data obtained from human gut microbiota sequencing. As input data, it accepts metagenomic reads from various sequencing technologies, including long reads (such as Sanger and 454 sequencing) and next-generation reads (including SOLiD and Illumina). To the authors' knowledge, it is the first metagenomic web service capable of processing SOLiD color-space reads. The web service allows phylogenetic and functional profiling of metagenomic samples using the coverage depth resulting from the alignment of the reads to a catalogue of reference sequences, built into the pipeline, which contains prevalent microbial genomes and genes of the human gut microbiota. The obtained metagenomic composition vectors are processed by the statistical analysis and visualization module, which contains methods for clustering, dimension reduction and group comparison. Additionally, the MALINA database includes vectors of bacterial and functional composition for human gut microbiota samples from a large number of existing studies, allowing their comparative analysis together with user samples, namely datasets from the Russian Metagenome project, MetaHIT and the Human Microbiome Project (downloaded from http://hmpdacc.org). MALINA is made freely available on the web at http://malina.metagenome.ru. The website is implemented in JavaScript (using Ext JS), Microsoft .NET Framework, MS SQL and Python, with all major browsers supported.

  5. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in the exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image-dragging functionality. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class, and are freely available on the Web.

  6. Influenza during pregnancy: Incidence, vaccination coverage and attitudes toward vaccination in the French web-based cohort G-GrippeNet.

    PubMed

    Loubet, Paul; Guerrisi, Caroline; Turbelin, Clément; Blondel, Béatrice; Launay, Odile; Bardou, Marc; Goffinet, François; Colizza, Vittoria; Hanslik, Thomas; Kernéis, Solen

    2016-04-29

    Pregnancy is a risk factor for severe influenza. However, data on influenza incidence during pregnancy are scarce. Likewise, no data are available on influenza vaccine coverage in France since the national recommendation in 2012. We aimed to assess these points using a novel nationwide web-based surveillance system, G-GrippeNet. During the 2014/2015 influenza season, pregnant women living in metropolitan France were enrolled through a web platform (https://www.grippenet.fr/). Throughout the season, participants were asked to report, on a weekly basis, whether they had experienced symptoms of influenza-like illness (ILI). Reported ILI episodes were used to calculate incidence density rates based on each participant's period of participation. Vaccination coverage was estimated after weighting on age and education level from national data on pregnant women. Factors associated with higher vaccination coverage were obtained through a logistic regression with Odds Ratios (OR) corrected with the Zhang and Yu method. A total of 153 women were enrolled. The ILI incidence density rate was 1.8 per 100 person-weeks (95% CI, 1.5-2.1). This rate was higher in women older than 40 years (RR = 3.0, 95% CI [1.1-8.3], p = 0.03) and during the first/second trimesters compared to the third trimester (RR = 4.0, 95% CI [1.4-12.0], p = 0.01). Crude vaccination coverage was 39% (95% CI, 31-47) and weighted vaccination coverage was estimated at 26% (95% CI, 20-34). Health care provider recommendation for vaccination (corrected OR = 7.8; 95% CI [3.0-17.1]) and non-smoking status (cOR = 2.1; 95% CI [1.2-6.9]) were associated with higher vaccine uptake. This original web-based longitudinal surveillance study design proved feasible in a population of pregnant women. The first results are of interest and underline that public health policies should emphasize vaccination promotion through health care providers. Copyright © 2016 Elsevier Ltd. All rights reserved.
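
    As a reminder of the measure used here, an incidence density rate divides episode counts by accumulated person-time; a back-of-the-envelope sketch with made-up counts (not the study's data):

        # Incidence density per 100 person-weeks, with a rough Poisson-based CI.
        # The counts below are invented for illustration only.
        import math

        episodes, person_weeks = 100, 5600
        rate = episodes / person_weeks * 100
        se = math.sqrt(episodes) / person_weeks * 100
        print(f"{rate:.1f} per 100 person-weeks "
              f"(95% CI {rate - 1.96*se:.1f}-{rate + 1.96*se:.1f})")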

  7. Opinion Integration and Summarization

    ERIC Educational Resources Information Center

    Lu, Yue

    2011-01-01

    As Web 2.0 applications become increasingly popular, more and more people express their opinions on the Web in various ways in real time. Such wide coverage of topics and abundance of users make the Web an extremely valuable source for mining people's opinions about all kinds of topics. However, since the opinions are usually expressed as…

  8. COV2HTML: a visualization and analysis tool of bacterial next generation sequencing (NGS) data for postgenomics life scientists.

    PubMed

    Monot, Marc; Orgeur, Mickael; Camiade, Emilie; Brehier, Clément; Dupuy, Bruno

    2014-03-01

    COV2HTML is an interactive web interface, addressed to biologists, which allows both coverage visualization and analysis of NGS alignments performed on prokaryotic organisms (bacteria and phages). It combines two processes: a tool that converts huge NGS mapping or coverage files into light, specific coverage files containing information on genetic elements; and a visualization interface allowing real-time analysis of data with optional integration of statistical results. To demonstrate the scope of COV2HTML, the program was tested with data from two published studies. The first data set was from an RNA-seq analysis of Campylobacter jejuni, based on a comparison of two conditions with two replicates. We were able to recover 26 out of 27 genes highlighted in the publication using COV2HTML. The second data set comprised stranded TSS and RNA-seq data on the archaeon Sulfolobus solfataricus. COV2HTML was able to highlight most of the TSSs from the article and allows biologists to visualize both TSS and RNA-seq data on the same screen. The strength of the COV2HTML interface is making NGS data analysis possible without software installation, login, or a long training period. A web version is accessible at https://mmonot.eu/COV2HTML/. This website is free and open to users without any login requirement.

  9. Datacube Interoperability, Encoding Independence, and Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model of statistics and OLAP, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally are included in the concept of coverages, which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable, yet format-independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the authors of this abstract and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as its Coverage Download Service, including the datacube analytics language, the Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both the OGC and INSPIRE Reference Implementation. In the global EarthServer initiative, rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier well within 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are the values?), a range set (what are the values?), and a range type (what do the values mean?), as well as a "bag" for arbitrary metadata. With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled representations. Further, CIS 1.1 offers a unified model for any kind of regular and irregular grid, also allowing sensor models as per SensorML. Encodings include ASCII formats like GML, JSON and RDF as well as binary formats like GeoTIFF, NetCDF, JPEG2000 and GRIB2; further, a container concept allows mixed representations within one coverage file utilizing zip or other convenient package formats. Through the tight integration with Sensor Web Enablement (SWE), a lossless "transport" from the sensor world into the coverage world is ensured. The corresponding service model of WCS supports datacube operations ranging from simple data extraction to complex ad-hoc analytics with WCPS. Notably, W3C has set out on a coverage model as well; it has been designed relatively independently of the abovementioned standards, but there is informal agreement to link it into the CIS universe (which allows for different, yet interchangeable, representations). Particularly interesting in the W3C proposal is the detailed semantic modeling of metadata; as CIS 1.1 supports RDF, a tight coupling seems feasible.
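
    The domain/range/range-type anatomy is easy to picture with a toy example; the following schematic dictionary mirrors that structure but is not a normative CIS encoding:

        # A toy 2x2 coverage: where the values sit (domain), the values
        # themselves (range), and what they mean (range type). Schematic only.
        coverage = {
            "domainSet": {"axes": {"Lat": [50.0, 50.5], "Long": [-4.0, -3.5]}},
            "rangeSet":  {"values": [284.1, 284.3, 283.9, 284.0]},
            "rangeType": {"field": "sea_surface_temperature", "uom": "K"},
            "metadata":  {"note": "arbitrary application metadata goes here"},
        }
        print(coverage["rangeType"]["field"])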

  10. The EarthServer Federation: State, Role, and Contribution to GEOSS

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Baumann, Peter

    2016-04-01

    The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. With its service interface rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from open-source OpenLayers and QGIS, through open-source NASA WorldWind, to proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users can simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube/metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.

  11. Mining Genotype-Phenotype Associations from Public Knowledge Sources via Semantic Web Querying.

    PubMed

    Kiefer, Richard C; Freimuth, Robert R; Chute, Christopher G; Pathak, Jyotishman

    2013-01-01

    Gene Wiki Plus (GeneWiki+, or GWP) and the Online Mendelian Inheritance in Man (OMIM) are publicly available resources for sharing information about disease-gene and gene-SNP associations in humans. While immensely useful to the scientific community, both resources are manually curated, thereby making the data entry and publication process time-consuming and, to some degree, error-prone. To this end, this study investigates Semantic Web technologies to validate existing and potentially discover new genotype-phenotype associations in GWP and OMIM. In particular, we demonstrate the applicability of SPARQL queries for identifying associations not explicitly stated for commonly occurring chronic diseases in GWP and OMIM, and report our preliminary findings for coverage, completeness, and validity of the associations. Our results highlight the benefits of Semantic Web querying technology to validate existing disease-gene associations as well as identify novel associations, although further evaluation and analysis are required before such information can be applied and used effectively.

  12. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments.

    PubMed

    Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan

    2015-01-01

    Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and can best be ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating the expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for patients. An additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed.

  13. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments

    PubMed Central

    2015-01-01

    Background Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and can best be ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. Methods In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Results Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. Conclusions The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating the expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for patients. An additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed. PMID:26356007

  14. Conducting Web-Based Surveys. ERIC Digest.

    ERIC Educational Resources Information Center

    Solomon, David J.

    Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…

  15. 5 CFR 1650.42 - How to obtain a financial hardship withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... form or use the TSP Web site to initiate a request. A participant's ability to complete a financial hardship withdrawal on the Web will depend on his or her retirement system coverage and marital status. (b...

  16. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  17. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  18. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  19. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

    We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools web server is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    PubMed

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% missing values. This expansion in knowledge base coverage allowed complex disease diagnostic queries to be solved that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms yield a significant performance overhead. We observed that plausible reasoning approaches, by generating tentative inferences and leveraging the domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.

  1. A cross disciplinary study of link decay and the effectiveness of mitigation techniques

    PubMed Central

    2013-01-01

    Background The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. Results We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Conclusion Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved. PMID:24266891

  2. A cross disciplinary study of link decay and the effectiveness of mitigation techniques.

    PubMed

    Hennessey, Jason; Ge, Steven

    2013-01-01

    The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved.

  3. Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories

    NASA Astrophysics Data System (ADS)

    Scharl, Arno

    International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.

  4. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into the structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets, from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  5. Comparing Unique Title Coverage of Web of Science and Scopus in Earth and Atmospheric Sciences

    ERIC Educational Resources Information Center

    Barnett, Philip; Lascar, Claudia

    2012-01-01

    The current journal titles in earth and atmospheric sciences, that are unique to each of two databases, Web of Science and Scopus, were identified using different methods. Comparing by subject category shows that Scopus has hundreds of unique titles, and Web of Science just 16. The titles unique to each database have low SCImago Journal Rank…

  6. Evolution of the Data Access Protocol in Response to Community Needs

    NASA Astrophysics Data System (ADS)

    Gallagher, J.; Caron, J. L.; Davis, E.; Fulker, D.; Heimbigner, D.; Holloway, D.; Howe, B.; Moe, S.; Potter, N.

    2012-12-01

    Under the aegis of the OPULS (OPeNDAP-Unidata Linked Servers) Project, funded by NOAA, OPeNDAP's Data Access Protocol is being updated from version 2 (DAP2) to version 4 (DAP4). DAP4 is the first major upgrade in almost two decades and will embody three main areas of advancement. First, the data-model extensions developed by the OPULS team focus on three areas: better support for coverages, access to HDF5 files, and access to relational databases. DAP2 support for coverages (defined as sampled functions) was limited to simple rectangular coverages that work well for (some) model outputs and processed satellite data but cannot represent trajectories or satellite swath data, for example. We have extended the coverage concept in DAP4 to remove these limitations. These changes are informed by work at Unidata on the Common Data Model and also by the OGC's abstract coverages specification. In a similar vein, we have extended DAP2's support for relations by including the concept of foreign keys, so that tables can be explicitly related to one another. Second, the web interfaces - web services - that provide access to data via DAP will be more clearly defined and will use other, orthogonal standards where appropriate. An important case is the XML interface, which provides a cleaner way to build other response media types such as JSON and RDF (for metadata) and to build support for Atom, thus simplifying the integration of DAP servers with tools that support OpenSearch. Input from the ESIP federation and work performed with IOOS have informed our choices here. Last, DAP4-compliant servers will support richer data-processing capabilities than DAP2, enabling a wider array of server functions that manipulate data before returning values. Two projects are currently exploring just what can be done even with DAP2's server-function model: the MIIC project at LaRC and OPULS itself (with work performed at the University of Washington). Both projects have demonstrated that server functions can be used to perform operations on large volumes of data and return results that are far smaller than would be required to achieve the same outcomes via client-side processing. We are using information from these efforts to inform the design of server functions in DAP4. Each of the three areas of DAP4 advancement is being guided by input from a number of community members, including an OPULS Advisory Committee.
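
    A brief illustration of the kind of request a DAP server answers: the constraint expression appended to the dataset URL selects a hyperslab on the server, in DAP2's [start:stride:stop] index syntax. The server URL and variable name below are illustrative assumptions:

        # Request one time step of a gridded variable in ASCII form.
        # The dataset URL and variable name are placeholders.
        import requests

        url = ("https://example.org/opendap/sst.nc.ascii"
               "?sst[0:1:0][0:1:179][0:1:359]")
        print(requests.get(url).text[:300])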

  7. Nursing challenges for universal health coverage: a systematic review

    PubMed Central

    Schveitzer, Mariana Cabral; Zoboli, Elma Lourdes Campos Pavone; Vieira, Margarida Maria da Silva

    2016-01-01

    Objectives to identify nursing challenges for universal health coverage, based on the findings of a systematic review focused on the health workforce's understanding of the role of humanization practices in Primary Health Care. Method systematic review and meta-synthesis, from the following information sources: PubMed, CINAHL, Scielo, Web of Science, PsycInfo, SCOPUS, DEDALUS and Proquest, using the keyword Primary Health Care associated, separately, with the following keywords: humanization of assistance, holistic care/health, patient centred care, user embracement, personal autonomy, holism, attitude of health personnel. Results thirty studies published between 1999 and 2011. Primary Health Care work processes are complex and present difficulties for conducting integrative care, especially for nursing, but humanizing practices have shown an important role in the development of positive work environments, quality of care and people-centered care by promoting access and universal health coverage. Conclusions nursing challenges for universal health coverage are related to education and training, to better working conditions and to a clear definition of the nursing role in primary health care. It is necessary to overcome difficulties such as fragmented concepts of health and care, and to invest in multidisciplinary teamwork, community empowerment, professional-patient bonds, user embracement and soft technologies, to promote quality of life, holistic care and universal health coverage. PMID:27143536

  8. Development and formative evaluation of an innovative mHealth intervention for improving coverage of community-based maternal, newborn and child health services in rural areas of India

    PubMed Central

    Modi, Dhiren; Gopalan, Ravi; Shah, Shobha; Venkatraman, Sethuraman; Desai, Gayatri; Desai, Shrey; Shah, Pankaj

    2015-01-01

    Background A new cadre of village-based frontline health workers, called Accredited Social Health Activists (ASHAs), was created in India. However, coverage of selected community-based maternal, newborn and child health (MNCH) services remains low. Objective This article describes the process of development and formative evaluation of a complex mHealth intervention (ImTeCHO) to increase the coverage of proven MNCH services in rural India by improving the performance of ASHAs. Design The Medical Research Council (MRC) framework for developing complex interventions was used. Gaps were identified in the usual care provided by ASHAs, based on a literature search and SEWA Rural's three decades of grassroots experience. The components of the intervention (mHealth strategies) were designed to overcome the gaps in care. The intervention, in the form of the ImTeCHO mobile phone and web application, along with the delivery model, was developed to incorporate these mHealth strategies. The intervention was piloted through 45 ASHAs among 45 villages in Gujarat (population: 45,000) over 7 months in 2013 to assess the acceptability, feasibility, and usefulness of the intervention and to identify barriers to its delivery. Results Inadequate supervision and support to ASHAs were noted as a gap in usual care, resulting in low coverage of selected MNCH services and care received by complicated cases. Therefore, the ImTeCHO application was developed to integrate mHealth strategies in the form of job aids to ASHAs to assist with scheduling, behavior change communication, diagnosis, and patient management, along with supervision and support of ASHAs. During the pilot, the intervention and its delivery were found to be largely acceptable, feasible, and useful. A few changes were made to the intervention and its delivery, including 1) a new helpline for ASHAs, 2) further simplification of processes within the ImTeCHO incentive management system and 3) additional web-based features for enhancing the value and supervision of Primary Health Center (PHC) staff. Conclusions The effectiveness of the improved ImTeCHO intervention will now be tested through a cluster randomized trial. PMID:25697233

  9. Camp Insurance 101: Understanding the Fundamentals of a Camp Insurance Program.

    ERIC Educational Resources Information Center

    Garner, Ian

    2001-01-01

    This short course on insurance for camps discusses coverage, including the various types of liability, property, and other types of coverage; the difference between direct writers, brokers, agents, and captive agents; choosing an insurance company; and checking on the financial stability of recommended carriers. Three Web sites are given for…

  10. 22 CFR 126.5 - Canadian exemptions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... such persons publicly available through the Internet Web site of the Directorate of Defense Trade... coverage area on the surface of the earth less than 200 nautical miles in diameter, where “coverage area” is defined as that area on the surface of the earth that is illuminated by the main beam width of the...

  11. Lexical Coverage of TED Talks: Implications for Vocabulary Instruction

    ERIC Educational Resources Information Center

    Nurmukhamedov, Ulugbek

    2017-01-01

    Teachers of English are often in search of authentic audio and video materials that promote learners' listening comprehension and vocabulary development. TED Talks, a set of freely available web presentations, could be a useful resource to promote vocabulary instruction. The present replication study examines the lexical coverage of TED Talks by…

  12. Mining Genotype-Phenotype Associations from Public Knowledge Sources via Semantic Web Querying

    PubMed Central

    Kiefer, Richard C.; Freimuth, Robert R.; Chute, Christopher G.; Pathak, Jyotishman

    Gene Wiki Plus (GeneWiki+) and the Online Mendelian Inheritance in Man (OMIM) are publicly available resources for sharing information about disease-gene and gene-SNP associations in humans. While immensely useful to the scientific community, both resources are manually curated, making the data entry and publication process time-consuming and, to some degree, error-prone. To this end, this study investigates Semantic Web technologies to validate existing, and potentially discover new, genotype-phenotype associations in GeneWiki+ and OMIM. In particular, we demonstrate the applicability of SPARQL queries for identifying associations not explicitly stated for commonly occurring chronic diseases in GeneWiki+ and OMIM, and report our preliminary findings for coverage, completeness, and validity of the associations. Our results highlight the benefits of Semantic Web querying technology to validate existing disease-gene associations as well as to identify novel associations, although further evaluation and analysis are required before such information can be applied and used effectively. PMID:24303249
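
    The abstract's approach can be illustrated with a short Python sketch using SPARQLWrapper. This is a hypothetical illustration only: the endpoint URL, prefix and predicate names are invented stand-ins, not the actual GeneWiki+/OMIM vocabularies used in the study.

        # Hypothetical query for gene-SNP-disease associations; the endpoint
        # and the ex: predicates are placeholders, not the study's schema.
        from SPARQLWrapper import SPARQLWrapper, JSON

        sparql = SPARQLWrapper("http://example.org/genewiki/sparql")  # invented endpoint
        sparql.setQuery("""
            PREFIX ex: <http://example.org/schema/>
            SELECT ?gene ?snp WHERE {
                ?gene ex:associatedWithDisease ex:Type2Diabetes .
                ?snp  ex:locatedInGene ?gene .
            }
            LIMIT 25
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["gene"]["value"], row["snp"]["value"])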

  13. Recent El Niño brought downpour of media coverage

    NASA Astrophysics Data System (ADS)

    Hare, Steven R.

    Media coverage of the 1997-1998 tropical ocean warming event made the term "El Nino" a household word. So pervasive was coverage of El Nino that it became the fodder of late night talk show monologues and an oft-invoked gremlin responsible for many of society's ailments. As a fisheries biologist studying climate impacts on marine resources, I followed the event very closely and created an El Nino Web site (http://www.iphc.washington.edu/PAGES/IPHC/Staff/hare/html/1997ENSO/1997ENSO.html) in the spring of 1997, when the magnitude of the event was becoming obvious. As part of my daily routine in updating the Web page, I began tracking El Nino media coverage over the Internet. Between June 1997 and July 1998, I accumulated links to stories about El Nino. I attempted to maintain a constant level of effort so that the number of stories accurately reflected the level of coverage given to the event as it progressed. In fisheries lingo, this is known as a Catch Per Unit Effort (CPUE) index. Because Internet content is often removed after a period of time, a retrospective accumulation of daily stories would not yield as accurate a count as the contemporary CPUE index I maintained.

  14. The Effectiveness of Web Search Engines to Index New Sites from Different Countries

    ERIC Educational Resources Information Center

    Pirkola, Ari

    2009-01-01

    Introduction: Investigates how effectively Web search engines index new sites from different countries. The primary interest is whether new sites are indexed equally or whether search engines are biased towards certain countries. If major search engines show biased coverage it can be considered a significant economic and political problem because…

  15. 2009-2010 Seasonal Influenza Vaccination Coverage among College Students from 8 Universities in North Carolina

    ERIC Educational Resources Information Center

    Poehling, Katherine A.; Blocker, Jill; Ip, Edward H.; Peters, Timothy R.; Wolfson, Mark

    2012-01-01

    Objective: The authors sought to describe the 2009-2010 seasonal influenza vaccine coverage of college students. Participants: A total of 4,090 college students from 8 North Carolina universities participated in a confidential, Web-based survey in October-November 2009. Methods: Associations between self-reported 2009-2010 seasonal influenza…

  16. Discovery Mechanisms for the Sensor Web

    PubMed Central

    Jirka, Simon; Bröring, Arne; Stasch, Christoph

    2009-01-01

    This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported through catalogue services, the field of sensor networks and the corresponding discovery mechanisms remains a challenge. The focus of this article is on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach is presented that was developed within the EU-funded project "OSIRIS". This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata and integrate sensor discovery into already existing catalogues. PMID:22574038

  17. Management intensity and vegetation complexity affect web-building spiders and their prey.

    PubMed

    Diehl, Eva; Mader, Viktoria L; Wolters, Volkmar; Birkhofer, Klaus

    2013-10-01

    Agricultural management and vegetation complexity affect arthropod diversity and may alter trophic interactions between predators and their prey. Web-building spiders are abundant generalist predators and important natural enemies of pests. We analyzed how management intensity (tillage, cutting of the vegetation, grazing by cattle, and synthetic and organic inputs) and vegetation complexity (plant species richness, vegetation height, coverage, and density) affect rarefied richness and composition of web-building spiders and their prey with respect to prey availability and aphid predation in 12 habitats, ranging from an uncut fallow to a conventionally managed maize field. Spiders and prey from webs were collected manually and the potential prey were quantified using sticky traps. The species richness of web-building spiders and the order richness of prey increased with plant diversity and vegetation coverage. Prey order richness was lower at tilled compared to no-till sites. Hemipterans (primarily aphids) were overrepresented, while dipterans, hymenopterans, and thysanopterans were underrepresented in webs compared to sticky traps. The per spider capture efficiency for aphids was higher at tilled than at no-till sites and decreased with vegetation complexity. After accounting for local densities, 1.8 times more aphids were captured at uncut compared to cut sites. Our results emphasize the functional role of web-building spiders in aphid predation, but suggest negative effects of cutting or harvesting. We conclude that reduced management intensity and increased vegetation complexity help to conserve local invertebrate diversity, and that web-building spiders at sites under low management intensity (e.g., semi-natural habitats) contribute to aphid suppression at the landscape scale.

  18. Volcview: A Web-Based Platform for Satellite Monitoring of Volcanic Activity and Eruption Response

    NASA Astrophysics Data System (ADS)

    Schneider, D. J.; Randall, M.; Parker, T.

    2014-12-01

    The U.S. Geological Survey (USGS), in cooperation with University and State partners, operates five volcano observatories that employ specialized software packages and computer systems to process and display real-time data coming from in-situ geophysical sensors and from near-real-time satellite sources. However, access to these systems, both inside and outside the observatory offices, is limited in some cases by factors such as software cost, network security, and bandwidth. Thus, a variety of Internet-based tools have been developed by the USGS Volcano Science Center to: 1) improve accessibility to data sources for staff scientists across volcano monitoring disciplines; 2) allow access for observatory partners and for after-hours, on-call duty scientists; 3) provide situational awareness for emergency managers and the general public. Herein we describe VolcView (volcview.wr.usgs.gov), a freely available, web-based platform for display and analysis of near-real-time satellite data. Initial geographic coverage is of the volcanoes in Alaska, the Russian Far East, and the Commonwealth of the Northern Mariana Islands. Coverage of other volcanoes in the United States will be added in the future. Near-real-time satellite data from NOAA, NASA and JMA satellite systems are processed to create image products for detection of elevated surface temperatures and volcanic ash and SO2 clouds. VolcView uses HTML5 and the canvas element to provide image overlays (volcano location and alert status, annotation, and location information) and image products that can be queried to provide data values, along with location and measurement capabilities. Use over the past year during the eruptions of Pavlof, Veniaminof, and Cleveland volcanoes in Alaska by the Alaska Volcano Observatory, the National Weather Service, and the U.S. Air Force has reinforced the utility of shared situational awareness and has guided further development. Planned additions include overlay of volcanic cloud trajectory and dispersion models, atmospheric temperature profiles, and incorporation of monitoring alerts from ground- and satellite-based algorithms. Challenges for future development include reducing the latency in satellite data reception and processing, and increasing the geographic coverage from polar-orbiting satellite platforms.

  19. 5 CFR 1650.41 - How to obtain an age-based withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... record keeper a properly completed paper TSP age-based withdrawal request form or use the TSP Web site to initiate a request. A participant's ability to complete an age-based withdrawal on the Web will depend on his or her retirement system coverage, marital status, and whether or not part or all of the...

  20. Indexing Aids at Corporate Websites: The Use of Robots.txt and META Tags.

    ERIC Educational Resources Information Center

    Drott, M. Carl

    2002-01-01

    This study examines 60 corporate Web sites to see if they provided support for automatic indexing, particularly the use of robots.txt and META tags for keywords and description. Discusses the use of Java and cookies and suggests that an increase in indexing aids would improve overall index coverage of the Web. (Author/LRW)
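
    The two indexing aids the study looks for can be checked mechanically. A minimal Python sketch, using only the standard library and an invented site URL, might look like this:

        # Check a (hypothetical) corporate site for robots.txt support and for
        # keyword/description META tags, the aids examined in the study.
        from urllib import robotparser
        from urllib.request import urlopen
        from html.parser import HTMLParser

        site = "http://www.example.com"  # placeholder corporate site

        rp = robotparser.RobotFileParser()
        rp.set_url(site + "/robots.txt")
        rp.read()
        print("crawler allowed on /:", rp.can_fetch("*", site + "/"))

        class MetaScanner(HTMLParser):
            def handle_starttag(self, tag, attrs):
                if tag == "meta":
                    d = dict(attrs)
                    if (d.get("name") or "").lower() in ("keywords", "description"):
                        print(d.get("name"), "=>", (d.get("content") or "")[:60])

        MetaScanner().feed(urlopen(site).read().decode("utf-8", "ignore"))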

  1. 75 FR 58414 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web site prior to the... topic, including panel materials, is available at http://www.cms.hhs.gov/center/coverage.asp . We...

  2. Operational Interoperable Web Coverage Service for Earth Observing Satellite Data: Issues and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Yang, W.; Min, M.; Bai, Y.; Lynnes, C.; Holloway, D.; Enloe, Y.; di, L.

    2008-12-01

    In the past few years, there has been growing interest, among major earth observing satellite (EOS) data providers, in serving data through the interoperable Web Coverage Service (WCS) interface protocol, developed by the Open Geospatial Consortium (OGC). The interface protocol defined in the WCS specifications allows client software to make customized requests of multi-dimensional EOS data, including spatial and temporal subsetting, resampling and interpolation, and coordinate reference system (CRS) transformation. A WCS server describes an offered coverage, i.e., a data product, through a response to a client's DescribeCoverage request. The description includes the offered coverage's spatial/temporal extents and resolutions, supported CRSs, supported interpolation methods, and supported encoding formats. Based on such information, a client can request the entire coverage, or a subset of it, at any spatial/temporal resolution and in any of the supported CRSs, formats, and interpolation methods. When implementing a WCS server, a data provider has different approaches to presenting its data holdings to clients. One of the most straightforward, and commonly used, approaches is to offer individual physical data files as separate coverages. Such an implementation, however, results in too many offered coverages for large data holdings, and it also cannot fully represent the relationship among different, but spatially and/or temporally associated, data files. It is desirable to decouple offered coverages from physical data files so that the former are more coherent, especially in the spatial and temporal domains. Therefore, some servers offer one single coverage for a set of spatially coregistered time series data files, such as a daily global precipitation coverage linked to many global single-day precipitation files; others offer one single coverage for multiple temporally coregistered files that together form a large spatial extent. In either case, a server needs to assemble an output coverage in real time by combining a potentially large number of physical files, which can be operationally difficult. The task becomes more challenging if an offered coverage involves spatially and temporally unregistered physical files. In this presentation, we will discuss issues and lessons learned in providing NASA's AIRS Level 2 atmospheric products, which are in satellite swath CRS and in 6-minute-segment granule files, as virtual global coverages. We'll discuss the WCS server's on-the-fly georectification, mosaicking, quality screening, performance, and scalability.
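
    The request sequence described above can be sketched as two key-value-pair HTTP calls. The endpoint, coverage identifier and parameter values below are invented for illustration; they are not the actual AIRS service the abstract describes.

        # A DescribeCoverage/GetCoverage round trip against a hypothetical
        # WCS 1.1.1 server; identifiers and URLs are placeholders.
        import requests

        endpoint = "http://example.gov/wcs"  # invented server
        common = {"service": "WCS", "version": "1.1.1"}

        # 1) Learn the coverage's extents, CRSs, formats and interpolations.
        desc = requests.get(endpoint, params={**common,
            "request": "DescribeCoverage", "identifiers": "AIRS_L2_EXAMPLE"})
        print(desc.text[:200])

        # 2) Request a spatio-temporal subset of the offered coverage.
        cov = requests.get(endpoint, params={**common,
            "request": "GetCoverage",
            "identifier": "AIRS_L2_EXAMPLE",
            "BoundingBox": "-90,-180,90,180,urn:ogc:def:crs:EPSG::4326",
            "TimeSequence": "2008-06-01",
            "format": "application/x-netcdf"})
        open("subset.nc", "wb").write(cov.content)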

  3. Web Coverage Service Challenges for NASA's Earth Science Data

    NASA Technical Reports Server (NTRS)

    Cantrell, Simon; Khan, Abdul; Lynnes, Christopher

    2017-01-01

    In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) is available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has focused recently on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations in WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated user needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.
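
    For comparison, the axis-labelled subsetting that motivates the move to WCS 2.0.1 can be sketched as below; the server URL and coverage identifier are placeholders, not an actual EOSDIS endpoint.

        # WCS 2.0.1 GetCoverage with named-axis trimming; names are invented.
        import requests

        r = requests.get("http://example.gov/wcs", params={
            "service": "WCS",
            "version": "2.0.1",
            "request": "GetCoverage",
            "coverageId": "EXAMPLE_LST_COVERAGE",
            "subset": ["Lat(30,45)", "Long(-120,-100)"],  # one subset= pair per axis
            "format": "image/tiff"})
        open("lst_subset.tif", "wb").write(r.content)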

  4. Sentinel-1 automatic processing chain for volcanic and seismic areas monitoring within the Geohazards Exploitation Platform (GEP)

    NASA Astrophysics Data System (ADS)

    De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco

    2016-04-01

    The microwave remote sensing scenario is rapidly evolving through the development of new sensor technology for Earth Observation (EO). In particular, Sentinel-1A (S1A) is the first of a constellation of sensors designed to provide a satellite data stream for the Copernicus European program. Sentinel-1A has been specifically designed to provide, over land, Differential Interferometric Synthetic Aperture Radar (DInSAR) products to analyze and investigate Earth's surface displacements. S1A peculiarities include wide ground coverage (250 km of swath), C-band operational frequency and short revisit time (which will drop from 12 to 6 days when the twin system Sentinel-1B is placed in orbit during 2016). Such characteristics, together with the global coverage acquisition policy, make the Sentinel-1 constellation extremely suitable for studying and monitoring volcanic and seismic areas worldwide, thus allowing the generation of both ground displacement information with increasing rapidity and new geological understanding. The main acquisition mode over land is the so-called Interferometric Wide Swath (IWS), which is based on the Terrain Observation by Progressive Scans (TOPS) technique and guarantees the mentioned S1A large-coverage characteristics at the expense of a non-trivial interferometric processing. Moreover, the satellite spatial coverage and the reduced revisit time will lead to an exponential increase of the data archives that, after the launch of Sentinel-1B, will reach about 3TB per day. Therefore, the EO scientific community needs, on the one hand, automated and effective DInSAR tools able to address the S1A processing complexity and, on the other hand, the computing and storage capacities to handle the expected large amount of data. It is thus becoming crucial to move processors and tools close to the satellite archives, since the approach of downloading and processing data with in-house computing facilities is no longer efficient. To address these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at bringing together data, processing tools and results to make them accessible to the EO scientific community, with particular emphasis on the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of a well-known DInSAR algorithm, referred to as Small BAseline Subset (P-SBAS), within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, aimed at automatically processing S1A data for the generation of SBAS displacement time-series. Its main characteristics, as well as a number of experimental results obtained using the implemented web tool, will also be shown. This work is partially supported by: the RITMARE project of the Italian MIUR, the DPC-CNR agreement and the ESA GEP project.

  5. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. The relevant Open Geospatial Consortium (OGC) data services and specifications are the Web Coverage Service (WCS) for data, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.
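
    The "pictures of data" idea maps onto a plain WMS GetMap call. As a hedged sketch, with an invented endpoint and layer name:

        # Fetch a rendered PNG of a (hypothetical) GES DISC layer via WMS 1.3.0.
        import requests

        r = requests.get("http://example.gov/wms", params={
            "service": "WMS", "version": "1.3.0", "request": "GetMap",
            "layers": "EXAMPLE_PRECIPITATION",   # invented layer name
            "crs": "EPSG:4326", "bbox": "-90,-180,90,180",  # lat/lon order in 1.3.0
            "width": 1024, "height": 512, "format": "image/png"})
        open("map.png", "wb").write(r.content)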

  6. Coverage of, and compliance with, mass drug administration under the programme to eliminate lymphatic filariasis in India: a systematic review.

    PubMed

    Babu, Bontha V; Babu, Gopalan R

    2014-09-01

    India's mass drug administration (MDA) programme to eliminate lymphatic filariasis (PELF) covers all 250 endemic districts, but compliance with treatment is not adequate for the programme to succeed in eradicating this neglected tropical disease. The objective of our study was to systematically review published studies on the coverage of, and compliance with, MDA under the PELF in India. We searched several databases (PubMed/Medline, Google Scholar, CINAHL/EBSCO, Web of Knowledge including Web of Science, and OVID) and, by applying selection criteria, identified a total of 36 papers to include in the review. Overall MDA coverage rates varied between 48.8% and 98.8%, while compliance rates ranged from 20.8% to 93.7%. The coverage-compliance gap is large in many MDA programmes. The effective level of compliance, ≥65%, was reported in only 10 of a total of 31 MDAs (5 of 20 MDAs in rural areas and 2 of 12 MDAs in urban areas). The review has identified a gap between coverage and compliance, and potentially correctable causes of this gap. These causes need to be addressed if the Indian programme is to advance towards elimination of lymphatic filariasis.

  7. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, mostly consist of coverage data, defined by ISO and OGC as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The transatlantic EarthServer initiative, running from 2011 through 2014, has united 11 partners to establish Big Earth Data Analytics. A key ingredient has been flexibility for users to ask whatever they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level, standards-based query languages which unify data and metadata search in a simple, yet powerful way. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments, based on network optimization and further criteria. The EarthServer platform comprises rasdaman, the pioneering and leading Array DBMS built for any-size multi-dimensional raster data, extended with support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level coverage query language. Reviewers have attested that "With no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.
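
    As a flavour of such a coverage query language, the following sketch sends a WCPS expression through the WCS Processing extension of a rasdaman/Petascope-style endpoint. The endpoint and coverage name are invented; the query averages a coverage server-side so that only a scalar travels over the network.

        # Server-side aggregation via WCPS; names are placeholders.
        import requests

        wcps = 'for c in (OceanTemperature) return avg(c)'  # invented coverage
        r = requests.get("http://example.org/rasdaman/ows", params={
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps})
        print(r.text)  # a single number instead of the full array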

  8. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses.

    PubMed

    Falagas, Matthew E; Pitsouni, Eleni I; Malietzis, George A; Pappas, Georgios

    2008-02-01

    The evolution of the electronic age has led to the development of numerous medical databases on the World Wide Web, offering search facilities on a particular subject and the ability to perform citation analysis. We compared the content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar. The official Web pages of the databases were used to extract information on the range of journals covered, search facilities and restrictions, and update frequency. We used the example of a keyword search to evaluate the usefulness of these databases in biomedical information retrieval and a specific published article to evaluate their utility in performing citation analysis. All databases were practical in use and offered numerous search facilities. PubMed and Google Scholar are accessed for free. The keyword search with PubMed offers optimal update frequency and includes online early articles; other databases can rate articles by number of citations, as an index of importance. For citation analysis, Scopus offers about 20% more coverage than Web of Science, whereas Google Scholar offers results of inconsistent accuracy. PubMed remains an optimal tool in biomedical electronic research. Scopus covers a wider journal range, of help both in keyword searching and citation analysis, but it is currently limited to recent articles (published after 1995) compared with Web of Science. Google Scholar, as for the Web in general, can help in the retrieval of even the most obscure information but its use is marred by inadequate, less often updated, citation information.

  9. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.
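
    The subsetting challenge has a concrete shape: instead of seeking within a local file, a service must issue ranged reads against object storage. A hedged sketch with boto3, using an invented bucket and key:

        # Read only the first 64 KiB (e.g., a header region) of a granule held
        # in S3-style web object storage; bucket and key are placeholders.
        import boto3

        s3 = boto3.client("s3")
        resp = s3.get_object(
            Bucket="example-eo-archive",
            Key="granules/example_granule.hdf",
            Range="bytes=0-65535")
        header = resp["Body"].read()
        print(len(header), "bytes fetched without downloading the whole granule")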

  10. The impact of the web and social networks on vaccination. New challenges and opportunities offered to fight against vaccine hesitancy.

    PubMed

    Stahl, J-P; Cohen, R; Denis, F; Gaudelus, J; Martinot, A; Lery, T; Lepetit, H

    2016-05-01

    Vaccine hesitancy is a growing and threatening trend, increasing the risk of disease outbreaks and potentially defeating health authorities' strategies. We aimed to describe the significant role of social networks and the Internet in vaccine hesitancy, and more generally in vaccine attitudes and behaviors. Presentation and discussion of lessons learnt from: (i) the monitoring and analysis of web and social network contents on vaccination; (ii) the tracking of Google search terms used by web users; (iii) the analysis of Google search suggestions related to vaccination; (iv) results from the Vaccinoscopie© study, online annual surveys of representative samples of 6500 to 10,000 French mothers, monitoring the vaccine behaviors and attitudes of French parents as well as the vaccination coverage of their children, since 2008; and (v) various studies published in the scientific literature. Social networks and the web play a major role in disseminating information about vaccination. They have modified the vaccination decision-making process and, more generally, the doctor/patient relationship. The Internet may fuel controversial issues related to vaccination and durably impact public opinion, but it may also provide new tools to fight against vaccine hesitancy. Vaccine hesitancy should be fought on the Internet battlefield, and for this purpose communication strategies should take into account the new threats and opportunities offered by the web and social networks.

  11. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  12. Finding Citations to Social Work Literature: The Relative Benefits of Using "Web of Science," "Scopus," or "Google Scholar"

    ERIC Educational Resources Information Center

    Bergman, Elaine M. Lasda

    2012-01-01

    Past studies of citation coverage of "Web of Science," "Scopus," and "Google Scholar" do not demonstrate a consistent pattern that can be applied to the interdisciplinary mix of resources used in social work research. To determine the utility of these tools to social work researchers, an analysis of citing references to well-known social work…

  13. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers javascript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations, surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
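
    The kind of spectral query PlanetServer issues can be sketched as a WCPS band ratio; the coverage and band identifiers below are placeholders, not the project's actual CRISM names.

        # Band-ratio image computed server-side from a hyperspectral cube.
        import requests

        query = ('for c in (CRISM_EXAMPLE_CUBE) '
                 'return encode((float) c.band_233 / c.band_13, "png")')
        r = requests.get("http://example.org/rasdaman/ows", params={
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": query})
        open("ratio.png", "wb").write(r.content)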

  14. Developing Toolsets for AirBorne Data (TAD): Overview of Design Concept

    NASA Astrophysics Data System (ADS)

    Parker, L.; Perez, J.; Chen, G.; Benson, A.; Peeters, M. C.

    2013-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gas and aerosol properties. Even though the spatial and temporal coverage is limited, the aircraft data offer high resolution and comprehensive simultaneous coverage of many variables, e.g. ozone precursors, intermediate photochemical species, and photochemical products. The recent NASA Earth Venture Program has generated an unprecedented amount of aircraft observations in terms of the sheer number of measurements and data volume. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community's needs for aircraft data for scientific research on climate change and air quality relevant issues, particularly to: 1) provide timely access to a broad user community; 2) provide an intuitive user interface to facilitate quick discovery of the variables and data; 3) provide data products and tools to facilitate model assessment activities, e.g., merge files and data subsetting capabilities; 4) provide simple utility 'calculators', e.g., unit conversion and aerosol size distribution processing; and 5) provide Web Coverage Service capable tools to enhance the data usability. The general strategy and design of TAD will be presented.
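
    One of the planned utility 'calculators' might look like the following sketch, which converts a trace-gas mixing ratio to a mass concentration. The function is illustrative only, assuming 25 °C and 1 atm (molar volume 24.45 L/mol); it is not taken from the TAD design.

        def ppbv_to_ugm3(ppbv: float, molar_mass_g_mol: float) -> float:
            """Convert ppbv to micrograms per cubic metre at 25 degC, 1 atm."""
            return ppbv * molar_mass_g_mol / 24.45

        # 40 ppbv of ozone (M = 48 g/mol) is roughly 78.5 ug/m3.
        print(ppbv_to_ugm3(40.0, 48.0))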

  15. Evaluation of the content and accessibility of web sites for accredited orthopaedic sports medicine fellowships.

    PubMed

    Mulcahey, Mary K; Gosselin, Michelle M; Fadale, Paul D

    2013-06-19

    The Internet is a common source of information for orthopaedic residents applying for sports medicine fellowships, with the web sites of the American Orthopaedic Society for Sports Medicine (AOSSM) and the San Francisco Match serving as central databases. We sought to evaluate the web sites for accredited orthopaedic sports medicine fellowships with regard to content and accessibility. We reviewed the existing web sites of the ninety-five accredited orthopaedic sports medicine fellowships included in the AOSSM and San Francisco Match databases from February to March 2012. A Google search was performed to determine the overall accessibility of program web sites and to supplement information obtained from the AOSSM and San Francisco Match web sites. The study sample consisted of the eighty-seven programs whose web sites connected to information about the fellowship. Each web site was evaluated for its informational value. Of the ninety-five programs, fifty-one (54%) had links listed in the AOSSM database. Three (3%) of all accredited programs had web sites that were linked directly to information about the fellowship. Eighty-eight (93%) had links listed in the San Francisco Match database; however, only five (5%) had links that connected directly to information about the fellowship. Of the eighty-seven programs analyzed in our study, all eighty-seven web sites (100%) provided a description of the program and seventy-six web sites (87%) included information about the application process. Twenty-one web sites (24%) included a list of current fellows. Fifty-six web sites (64%) described the didactic instruction, seventy (80%) described team coverage responsibilities, forty-seven (54%) included a description of cases routinely performed by fellows, forty-one (47%) described the role of the fellow in seeing patients in the office, eleven (13%) included call responsibilities, and seventeen (20%) described a rotation schedule. Two Google searches identified direct links for 67% to 71% of all accredited programs. Most accredited orthopaedic sports medicine fellowships lack easily accessible or complete web sites in the AOSSM or San Francisco Match databases. Improvement in the accessibility and quality of information on orthopaedic sports medicine fellowship web sites would facilitate the ability of applicants to obtain useful information.

  16. Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.

    2013-12-01

    Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented, demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and Southeast region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available, and applications to access them, will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series, and other National Water Census processing tools, will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure that data used and produced by funded projects are available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools. Recent progress on this front includes data web services for Coupled Model Intercomparison Project Phase 5 based downscaled climate projections, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services, and ways to discover others, will be presented through demonstration of a recently open-sourced project from a web application or scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at the CIDA with partner groups such as 52°North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.

  17. The Sensor Management for Applied Research Technologies (SMART) Project

    NASA Technical Reports Server (NTRS)

    Goodman, Michael; Jedlovec, Gary; Conover, Helen; Botts, Mike; Robin, Alex; Blakeslee, Richard; Hood, Robbie; Ingenthron, Susan; Li, Xiang; Maskey, Manil; hide

    2007-01-01

    NASA seeks on-demand data processing and analysis of Earth science observations to facilitate timely decision-making that can lead to the realization of the practical benefits of satellite instruments, airborne and surface remote sensing systems. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep "learning curve" associated with each sensor, data type, and associated products. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output.

  18. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the rasdaman Raster Data Manager, which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL-style query language. rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well-established standards Web Coverage Service (WCS) and Web Map Service (WMS), as well as the emerging standard Web Coverage Processing Service (WCPS). The WCPS standard allows ad-hoc queries to be run on the data stored within rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access to, and visualisation of, massive-scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, if a user wanted to generate a plot of global average chlorophyll for a complete decade-long time series, they would only have to download the result instead of terabytes of data. Firstly, we will present a brief overview of the capabilities of rasdaman and the WCPS query language, to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web-based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users to plot data products easily. However, the query language also allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser, with dynamic updating of the resultant image. To visualise very large datasets, rasdaman can dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web-based GIS interface, increasing the number of potential users of the system. Lastly, we will show the advances in dynamic web-based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries, we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points, and visualisation of the uncertainty of data alongside the actual data values.

  19. Climate Feedback: Bringing the Scientific Community to Provide Direct Feedback on the Credibility of Climate Media Coverage

    NASA Astrophysics Data System (ADS)

    Vincent, E. M.; Matlock, T.; Westerling, A. L.

    2015-12-01

    While most scientists recognize climate change as a major societal and environmental issue, social and political will to tackle the problem is still lacking. One of the biggest obstacles is inaccurate reporting, or even outright misinformation, in climate change coverage, which results in confusion of the general public on the issue. In today's era of instant access to information, what we read online usually falls outside our field of expertise, and it is a real challenge to evaluate what is credible. The emerging technology of web annotation could be a game changer, as it allows knowledgeable individuals to attach notes to any piece of text of a webpage and to share them with readers, who will be able to see the annotations in context, like comments on a PDF. Here we present the Climate Feedback initiative, which is bringing together a community of climate scientists who collectively evaluate the scientific accuracy of influential climate change media coverage. Scientists annotate articles sentence by sentence and assess whether they are consistent with scientific knowledge, allowing readers to see where and why the coverage is, or is not, based on science. Scientists also summarize the essence of their critical commentary in the form of a simple article-level overall credibility rating that quickly informs readers about the credibility of the entire piece. Web annotation allows readers to 'hear' directly from the experts and to sense the consensus in a personal way, as one can literally see how many scientists agree with a given statement. It also allows a broad population of scientists to interact with the media, notably early-career scientists. In this talk, we will present results on the impacts annotations have on readers, regarding their evaluation of the trustworthiness of the information they read, and on journalists, regarding their reception of scientists' comments. Several dozen scientists have contributed to this effort to date, and the system offers potential to scale up as it relies on a crowdsourced process where each scientist only makes small contributions that get aggregated together. The project aims to build a network of scientists with varied expertise and to organize their efforts at a global scale to efficiently peer-review major news coverage on climate.

  20. Information Management of Web Application Based Environmental Performance Management in Concentrating Division of PTFI

    NASA Astrophysics Data System (ADS)

    Susanto, Arif; Mulyono, Nur Budi

    2018-02-01

    The change of environmental management system standards to the latest version, ISO 14001:2015, may change the data and information needed for decision making and for achieving objectives across the organization. Information management is the organization's responsibility for ensuring effectiveness and efficiency from the creation, storage and processing of information through to its distribution, in support of operations and effective decision making in environmental performance management. The objective of this research was to set up an information management program, and to adopt supporting technology for it, within the PTFI Concentrating Division, in line with the organization's objectives for environmental management based on the ISO 14001:2015 environmental management system standard. Materials and methods covered the technical aspects of information management, i.e. web-based application development using usage-centered design. The results showed that the use of Single Sign On made it easier for users to interact with the environmental management system. The web-based application was developed by creating an entity relationship diagram (ERD) and by performing information extraction focused on attributes, keys and the determination of constraints; the ERD was derived from the relational database schemas of a number of environmental performance databases in the Concentrating Division.

  1. A Framework for Integrating Oceanographic Data Repositories

    NASA Astrophysics Data System (ADS)

    Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.

    2010-12-01

    Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
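
    An OpenSearch description reduces a search service to a URL template whose placeholders are filled per query. A minimal sketch, with an invented template, of how an S2S-style client could expand one:

        import re
        from urllib.parse import quote_plus

        # Invented template; real services publish theirs in an OpenSearch
        # description document.
        template = ("http://example.org/ocean/search?"
                    "q={searchTerms}&page={startPage?}&format=atom")

        def fill(template: str, **params: str) -> str:
            # Substitute {name} and optional {name?} placeholders, URL-encoding values.
            return re.sub(r"\{(\w+)\??\}",
                          lambda m: quote_plus(params.get(m.group(1), "")),
                          template)

        print(fill(template, searchTerms="temperature profile", startPage="1"))
        # -> http://example.org/ocean/search?q=temperature+profile&page=1&format=atom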

  2. Recommendation of standardized health learning contents using archetypes and semantic web technologies.

    PubMed

    Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2012-01-01

    Linking Electronic Healthcare Record (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. This would guide citizens to reliable information available on the web and help them use it properly. In this paper, we describe an approach in that direction, based on the use of dual-model EHR standards and standardized educational contents. The recommendation method will be based on the semantic coverage of the learning content repository for a particular archetype, which will be calculated by applying semantic web technologies such as ontologies and semantic annotations.

  3. Virtual Sensors in a Web 2.0 Digital Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D. J.; Marini, L.; Kooper, R.; Rodriguez, A.; Myers, J. D.

    2008-12-01

    The lack of rainfall data in many watersheds is one of the major barriers to modeling and studying many environmental and hydrological processes and to supporting decision making. There are simply not enough rain gages on the ground. To overcome this data scarcity issue, a Web 2.0 digital watershed has been developed at NCSA (National Center for Supercomputing Applications), where users can point-and-click on a web-based Google Maps interface and create new precipitation virtual sensors at any location within the same coverage region as a NEXRAD station. A set of scientific workflows is implemented to perform spatial, temporal and thematic transformations on the near-real-time NEXRAD Level II data. Such workflows can be triggered by the users' actions and generate either rainfall-rate or rainfall-accumulation streaming data at a user-specified time interval. We will discuss some underlying components of this digital watershed, which consists of a semantic content management middleware, a semantically enhanced streaming data toolkit, virtual sensor management functionality, and a RESTful (REpresentational State Transfer) web service that can trigger the workflow execution. Such a loosely coupled architecture presents a generic framework for constructing a Web 2.0 style digital watershed. An implementation of this architecture at the Upper Illinois River Basin will be presented. We will also discuss the implications of the virtual sensor concept for the broad environmental observatory community and how this concept will help us move towards a participatory digital watershed.
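
    The point-and-click flow implies a simple web service contract: the client posts a location and product choice, and the service registers the virtual sensor and triggers the workflow. A hedged sketch, with an entirely invented endpoint and payload:

        # Register a hypothetical precipitation virtual sensor; the API shown
        # here is illustrative, not the actual NCSA service.
        import requests

        resp = requests.post("http://example.org/watershed/api/sensors", json={
            "type": "precipitation",
            "lat": 40.11, "lon": -88.24,   # a point inside a NEXRAD coverage region
            "product": "rainfall_rate",
            "interval_minutes": 15})
        print(resp.status_code, resp.json().get("sensor_id"))  # hypothetical response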

  4. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" -- based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On the server side, highly effective optimizations -- such as parallel and distributed query processing -- ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.

  5. Evaluation of physical activity web sites for use of behavior change theories.

    PubMed

    Doshi, Amol; Patrick, Kevin; Sallis, James F; Calfas, Karen

    2003-01-01

    Physical activity (PA) Web sites were assessed for their use of behavior change theories, including constructs of the health belief model, Transtheoretical Model, social cognitive theory, and the theory of reasoned action and planned behavior. An evaluation template for assessing PA Web sites was developed, and content validity and interrater reliability were demonstrated. Two independent raters evaluated 24 PA Web sites. Web sites varied widely in application of theory-based constructs, ranging from 5 to 48 on a 100-point scale. The most common intervention strategies were general information, social support, and realistic goal areas. Coverage of theory-based strategies was low, varying from 26% for social cognitive theory to 39% for health belief model. Overall, PA Web sites provided little assessment, feedback, or individually tailored assistance for users. They were unable to substantially tailor the on-line experience for users at different stages of change or different demographic characteristics.

  6. Review on dog rabies vaccination coverage in Africa: a question of dog accessibility or cost recovery?

    PubMed

    Jibat, Tariku; Hogeveen, Henk; Mourits, Monique C M

    2015-02-01

    Rabies still poses a significant human health problem throughout most of Africa, where the majority of human cases result from dog bites. Mass dog vaccination is considered the most effective method to prevent rabies in humans. Our objective was to systematically review research articles on dog rabies parenteral vaccination coverage in Africa in relation to dog accessibility and the vaccination cost recovery arrangement (i.e., free of charge or owner-charged). A systematic literature search was made in the databases of CAB Abstracts (EBSCOhost and OvidSP), Scopus, Web of Science, PubMed, Medline (EBSCOhost and OvidSP) and AJOL (African Journal Online) for peer-reviewed articles on 1) rabies control, 2) dog rabies vaccination coverage and 3) dog demography in Africa. Identified articles were subsequently screened and selected using predefined selection criteria such as year of publication (viz. ≥ 1990), type of study (cross-sectional), objective(s) of the study (i.e., vaccination coverage rates, dog demographics and financial arrangements of vaccination costs), language of publication (English) and geographical focus (Africa). The selection process resulted in sixteen peer-reviewed articles, which were used to review dog demography, dog ownership status, and dog rabies vaccination coverage throughout Africa. The main review findings indicate that 1) the majority (up to 98.1%) of dogs in African countries are owned (and as such accessible), 2) puppies younger than 3 months of age constitute a considerable proportion (up to 30%) of the dog population and 3) male dogs dominate in numbers (up to 3.6 times the female dog population). Dog rabies parenteral vaccination coverage was compared between "free of charge" and "owner-charged" vaccination schemes using meta-analysis. Results indicate that the rabies vaccination coverage achieved under free-of-charge vaccination schemes (68%) is closer to the World Health Organization recommended coverage rate (70%) than the coverage achieved under owner-charged dog rabies vaccination schemes (18%). Most dogs in Africa are owned and accessible for parenteral vaccination against rabies if the campaign is performed free of charge.

  7. Review on Dog Rabies Vaccination Coverage in Africa: A Question of Dog Accessibility or Cost Recovery?

    PubMed Central

    Jibat, Tariku; Hogeveen, Henk; Mourits, Monique C. M.

    2015-01-01

    Background Rabies still poses a significant human health problem throughout most of Africa, where the majority of human cases result from dog bites. Mass dog vaccination is considered the most effective method to prevent rabies in humans. Our objective was to systematically review research articles on dog rabies parenteral vaccination coverage in Africa in relation to dog accessibility and the vaccination cost recovery arrangement (i.e., free of charge or owner charged). Methodology/Principal Findings A systematic literature search was made in the databases of CAB Abstracts (EBSCOhost and OvidSP), Scopus, Web of Science, PubMed, Medline (EBSCOhost and OvidSP) and AJOL (African Journal Online) for peer-reviewed articles on 1) rabies control, 2) dog rabies vaccination coverage and 3) dog demography in Africa. Identified articles were subsequently screened and selected using predefined selection criteria such as year of publication (viz. ≥ 1990), type of study (cross-sectional), objective(s) of the study (i.e., vaccination coverage rates, dog demographics and financial arrangements of vaccination costs), language of publication (English) and geographical focus (Africa). The selection process resulted in sixteen peer-reviewed articles, which were used to review dog demography, dog ownership status, and dog rabies vaccination coverage throughout Africa. The main review findings indicate that 1) the majority (up to 98.1%) of dogs in African countries are owned (and as such accessible), 2) puppies younger than 3 months of age constitute a considerable proportion (up to 30%) of the dog population and 3) male dogs dominate in numbers (up to 3.6 times the female dog population). Dog rabies parenteral vaccination coverage was compared between “free of charge” and “owner charged” vaccination schemes using meta-analysis. Results indicate that the rabies vaccination coverage achieved under free-of-charge vaccination schemes (68%) is closer to the World Health Organization recommended coverage rate (70%) than the coverage achieved under owner-charged dog rabies vaccination schemes (18%). Conclusions/Significance Most dogs in Africa are owned and accessible for parenteral vaccination against rabies if the campaign is performed “free of charge”. PMID:25646774

  8. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from the computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask for what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level raster query language. We present the EarthServer project with its vision and approaches, relate it to the current state of standardization, and demonstrate it by way of large-scale data centers and their services using rasdaman.
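
    As an illustration of the query-language approach, here is a minimal WCPS request sketch. The endpoint and coverage name are placeholders; the KVP binding shown follows common rasdaman/petascope deployments but should be checked against a given server's documentation.

```python
import requests

# Placeholder endpoint; real deployments expose something like .../rasdaman/ows.
ENDPOINT = "http://example.org/rasdaman/ows"

# A WCPS query: server-side subsetting and encoding, so only the small
# derived result crosses the network -- the core "Big Earth Data" idea.
# "AvgLandTemp" is an assumed coverage name for illustration.
wcps_query = """
for c in (AvgLandTemp)
return encode(
    c[Lat(53.08), Long(8.80), ansi("2014-01":"2014-12")],
    "csv")
"""

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
}
r = requests.get(ENDPOINT, params=params, timeout=60)
r.raise_for_status()
print(r.text)  # e.g. a CSV time series of monthly temperatures
```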

  9. The Americans with Disabilities Amendment Act--are you ready for the changes?

    PubMed

    Leiker, Michelle

    2008-12-01

    Significant change is coming quickly, and employers need to be prepared. The Act will move the focus from a "disability" inquiry to an individualized interactive process, and will likely increase the number of individuals protected under the ADA. The defenses and employer modes of responding to disability claims will be narrowed while the range of ADA coverage will expand considerably. Additional information on the ADA and the recent amendments can be obtained by calling the Department of Justice's ADA Information Line (800.514.0301), the EEOC (800.669.4000), or by visiting the DOJ's ADA Web site (http://www.ada.gov/).

  10. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture; virtualizing data products by computing them on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney, ESDIS Deputy Project Manager.
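
    A minimal sketch of the access pattern at issue: a service reading only part of an object in web object storage via an HTTP range read, instead of pulling a whole granule off a file system. The bucket and key are placeholders, not actual NASA archive locations.

```python
import boto3

# Placeholder bucket/key; illustrates subsetting in cloud WOS, the kind of
# access a WCS server must adopt when data no longer sit on a POSIX file system.
s3 = boto3.client("s3")

# Read only bytes 0-1023 (e.g. a file header) instead of the whole granule.
resp = s3.get_object(
    Bucket="example-eo-archive",
    Key="granules/MOD021KM.A2017001.hdf",
    Range="bytes=0-1023",
)
header = resp["Body"].read()
print(len(header), "bytes fetched without downloading the full file")
```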

  11. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacities and computational power of grid infrastructures for geospatial applications, while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces: Web Mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic, terrain, buildings and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of the segments, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation. This computationally intensive calculation needs to be performed for a major part of the European landscape. A LINUX version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-GRID computer network. Results and performance indicators will be presented. The presentation is an extension of last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project", which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid Computing and Spatial Data Infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.

  12. EarthServer: Visualisation and use of uncertainty as a data exploration tool

    NASA Astrophysics Data System (ADS)

    Walker, Peter; Clements, Oliver; Grant, Mike

    2013-04-01

    The Ocean Science/Earth Observation community generates huge datasets from satellite observation. Until recently it has been difficult to obtain matching uncertainty information for these datasets and to apply it to their processing. In order to make use of uncertainty information when analysing "Big Data", we need both the uncertainty itself (attached to the underlying data) and a means of working with the combined product without requiring the entire dataset to be downloaded. The European Commission FP7 project EarthServer (http://earthserver.eu) is addressing the problem of access to, and ad-hoc analysis of, extreme-size Earth Science data using cutting-edge Array Database technology. The core software (rasdaman) and its web services wrapper (Petascope) allow huge datasets to be accessed using Open Geospatial Consortium (OGC) standard interfaces, including the well-established Web Coverage Service (WCS) and Web Map Service (WMS) standards as well as the emerging Web Coverage Processing Service (WCPS) standard. The WCPS standard allows ad-hoc queries to be run on any of the data stored within rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. The ESA Ocean Colour - Climate Change Initiative (OC-CCI) project (http://www.esa-oceancolour-cci.org/) is producing high-resolution, global ocean colour datasets over the full time period (1998-2012) for which high-quality observations are available. This climate data record includes per-pixel uncertainty data for each variable, based on an analytic method that classifies how much and which types of water are present in a pixel and assigns uncertainty based on robust comparisons to global in-situ validation datasets. These uncertainty values take two forms, Root Mean Square (RMS) and bias uncertainty, respectively representing the expected variability and the expected offset error. By combining the data produced through the OC-CCI project with the software from the EarthServer project, we can produce a novel data offering that supports traditional exploration and access mechanisms such as WMS and WCS. However, the real benefits can be seen when utilising WCPS to explore the data. We will show two major benefits of this infrastructure. Firstly, we will show that the visualisation of the combined chlorophyll and uncertainty datasets through a web-based GIS portal gives users the ability to instantaneously assess the quality of the data they are exploring, using traditional web-based plotting techniques as well as novel web-based three-dimensional visualisation. Secondly, we will showcase the benefits available when combining these data with the WCPS standard. The uncertainty data can be utilised in queries using the standard WCPS query language. This allows selection of data, either for download or for use within a query, based on the respective uncertainty values, as well as the possibility of incorporating both the chlorophyll data and the uncertainty data into complex queries to produce additional novel data products. By filtering on uncertainty at the data source rather than at the client, we minimise traffic over the network, allowing huge datasets to be worked on with a minimal time penalty.
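
    A sketch of how an uncertainty-aware WCPS request might look. The endpoint and coverage names are hypothetical (not the actual OC-CCI service), and the exact masking syntax varies between WCPS implementations; some servers require an explicit cast for the boolean mask.

```python
import requests

ENDPOINT = "http://example.org/rasdaman/ows"  # placeholder endpoint

# Illustrative query: combine a chlorophyll coverage with its RMS uncertainty
# coverage so that pixels above the uncertainty threshold are zeroed out
# server-side, before anything crosses the network.
wcps_query = """
for chl in (CHL), rms in (CHL_RMS_UNCERTAINTY)
return encode(
    (chl * (rms < 0.3))[Lat(40:60), Long(-30:-10), ansi("2010-06")],
    "netcdf")
"""

r = requests.get(ENDPOINT, params={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
}, timeout=120)
r.raise_for_status()
with open("chl_filtered.nc", "wb") as f:
    f.write(r.content)  # only the filtered subset is downloaded
```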

  13. Analyzing the public discourse on works of fiction - Detection and visualization of emotion in online coverage about HBO's Game of Thrones.

    PubMed

    Scharl, Arno; Hubmann-Haidvogel, Alexander; Jones, Alistair; Fischl, Daniel; Kamolov, Ruslan; Weichselbraun, Albert; Rafelsberger, Walter

    2016-01-01

    This paper presents a Web intelligence portal that captures and aggregates news and social media coverage about "Game of Thrones", an American drama television series created for the HBO television network based on George R.R. Martin's series of fantasy novels. The system collects content from the Web sites of Anglo-American news media as well as from four social media platforms: Twitter, Facebook, Google+ and YouTube. An interactive dashboard with trend charts and synchronized visual analytics components not only shows how often Game of Thrones events and characters are being mentioned by journalists and viewers, but also provides a real-time account of concepts that are being associated with the unfolding storyline and each new episode. Positive or negative sentiment is computed automatically, which sheds light on the perception of actors and new plot elements.

  14. Analyzing the public discourse on works of fiction – Detection and visualization of emotion in online coverage about HBO’s Game of Thrones

    PubMed Central

    Scharl, Arno; Hubmann-Haidvogel, Alexander; Jones, Alistair; Fischl, Daniel; Kamolov, Ruslan; Weichselbraun, Albert; Rafelsberger, Walter

    2016-01-01

    This paper presents a Web intelligence portal that captures and aggregates news and social media coverage about “Game of Thrones”, an American drama television series created for the HBO television network based on George R.R. Martin’s series of fantasy novels. The system collects content from the Web sites of Anglo-American news media as well as from four social media platforms: Twitter, Facebook, Google+ and YouTube. An interactive dashboard with trend charts and synchronized visual analytics components not only shows how often Game of Thrones events and characters are being mentioned by journalists and viewers, but also provides a real-time account of concepts that are being associated with the unfolding storyline and each new episode. Positive or negative sentiment is computed automatically, which sheds light on the perception of actors and new plot elements. PMID:27065510

  15. Digging into the low molecular weight peptidome with the OligoNet web server.

    PubMed

    Liu, Youzhong; Forcisi, Sara; Lucio, Marianna; Harir, Mourad; Bahut, Florian; Deleris-Bou, Magali; Krieger-Weber, Sibylle; Gougeon, Régis D; Alexandre, Hervé; Schmitt-Kopplin, Philippe

    2017-09-15

    Bioactive peptides play critical roles in regulating many biological processes. Recently, natural short-peptide biomarkers have been drawing significant attention and are considered a "hidden treasure" of drug candidates. The high resolution and high mass accuracy provided by mass spectrometry (MS)-based untargeted metabolomics enable rapid detection and wide coverage of the low-molecular-weight peptidome. However, translating unknown masses (<1,500 Da) into putative peptides is often limited by the lack of automatic data processing tools and by the limits of peptide databases. The web server OligoNet responds to this challenge by attempting to decompose each individual mass in a metabolomics dataset into a combination of amino acids. It provides an additional network-based data interpretation named the "Peptide Degradation Network" (PDN), which unravels interesting relations between annotated peptides and generates potential functional patterns. The ab initio PDN built from yeast metabolic profiling data shows great similarity with well-known metabolic networks and can aid biological interpretation. OligoNet also allows easy evaluation and interpretation of annotated peptides in systems biology, and is freely accessible at https://daniellyz200608105.shinyapps.io/OligoNet/.
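
    A toy decomposition in the spirit of OligoNet, not its actual algorithm, showing how a single neutral mass can map to several amino-acid compositions. The residue masses are standard monoisotopic values; the tolerance and water-mass handling follow common practice and are illustrative assumptions.

```python
from itertools import combinations_with_replacement

# Monoisotopic residue masses (Da) for a handful of amino acids.
RESIDUES = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203,
    "P": 97.05276, "V": 99.06841, "L": 113.08406,
}
WATER = 18.01056  # one water per peptide (N- and C-terminus)

def decompose(mass, max_len=5, tol=0.005):
    """Return amino-acid multisets whose residue masses sum to `mass`."""
    hits = []
    for n in range(1, max_len + 1):
        for combo in combinations_with_replacement(RESIDUES, n):
            total = sum(RESIDUES[aa] for aa in combo) + WATER
            if abs(total - mass) <= tol:
                hits.append("".join(combo))
    return hits

# 188.116 Da is ambiguous: Gly+Leu and Ala+Val are isobaric compositions.
print(decompose(188.116))  # -> ['GL', 'AV']
```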

  16. TIGER 2010 Boundaries

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). This web service includes the State and County boundaries from the TIGER shapefiles compiled into a single national coverage for each layer. The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB).

  17. RAINE Public Communities

    EPA Pesticide Factsheets

    The file geodatabase (fgdb) contains the New England Town Boundaries and information related specifically to the Resilience and Adaptation in New England (RAINE) web application. This includes data tables relating to particular aspects of towns, notably features, funding, impacts, partners, plans, and programs (refer to the V_MAP_STATIC tables). The New England Town Boundary coverage is a compilation of coverages received from the six New England State GIS Offices. The EPA New England GIS Center appended the coverages into a single file and generated attributes to link to the Facility Identification Online system. The feature class points represent the communities (Communities in the gdb) and the featured RAINE communities (RAINE_Communities_201609), which contain more detailed information held within the included data tables.

  18. Acta Dermatovenerologica Alpina, Pannonica et Adriatica accepted for coverage in Thomson Reuters' Emerging Sources Citation Index (ESCI).

    PubMed

    Poljak, Mario; Miljković, Jovan; Triglav, Tina

    2016-09-01

    Acta Dermatovenerologica Alpina, Pannonica et Adriatica (Acta Dermatovenerol APA) is the leading journal in dermatology and sexually transmitted infections in the region. Several important steps were taken during the last 25 years to improve the journal's quality, global visibility, and international impact. After a 1-year trial period, Thomson Reuters recently informed the editorial office that they had accepted Acta Dermatovenerol APA for coverage in Thomson Reuters' new index in the Web of Science Core Collection called the Emerging Sources Citation Index (ESCI). The coverage of Acta Dermatovenerol APA begins with the journal content published online in 2016; that is, from volume 25 onwards.

  19. NDE of Fiber Reinforced Foam Composite Structures for Future Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Walker, James; Roth, Don; Hopkins, Dale

    2010-01-01

    This slide presentation reviews the complexities of non-destructive evaluation (NDE) of fiber reinforced foam composite structures to be used for future aerospace vehicles. Various views of fiber reinforced foam materials are shown and described. Conventional NDE methods for composites are reviewed, such as micro-computed X-ray tomography, thermography, shearography, and phased array ultrasonics (PAUT). While these methods appear to work well on the face sheet and the face sheet-to-core bond, they do not provide adequate coverage of the webs. There is a need for additional methods that can examine the webs and the web-to-foam-core bond.

  20. Towards end-to-end models for investigating the effects of climate and fishing in marine ecosystems

    NASA Astrophysics Data System (ADS)

    Travers, M.; Shin, Y.-J.; Jennings, S.; Cury, P.

    2007-12-01

    End-to-end models that represent ecosystem components from primary producers to top predators, linked through trophic interactions and affected by the abiotic environment, are expected to provide valuable tools for assessing the effects of climate change and fishing on ecosystem dynamics. Here, we review the main process-based approaches used for marine ecosystem modelling, focusing on the extent of the food web modelled, the forcing factors considered, the trophic processes represented, as well as the potential use and further development of the models. We consider models of a subset of the food web, models which represent the first attempts to couple low and high trophic levels, integrated models of the whole ecosystem, and size spectrum models. Comparisons within and among these groups of models highlight the preferential use of functional groups at low trophic levels and species at higher trophic levels and the different ways in which the models account for abiotic processes. The model comparisons also highlight the importance of choosing an appropriate spatial dimension for representing organism dynamics. Many of the reviewed models could be extended by adding components and by ensuring that the full life cycles of species components are represented, but end-to-end models should provide full coverage of ecosystem components, the integration of physical and biological processes at different scales and two-way interactions between ecosystem components. We suggest that this is best achieved by coupling models, but there are very few existing cases where the coupling supports true two-way interaction. The advantages of coupling models are that the extent of discretization and representation can be targeted to the part of the food web being considered, making their development time- and cost-effective. Processes such as predation can be coupled to allow the propagation of forcing factors effects up and down the food web. However, there needs to be a stronger focus on enabling two-way interaction, carefully selecting the key functional groups and species, reconciling different time and space scales and the methods of converting between energy, nutrients and mass.
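
    To make the notion of two-way coupling concrete, here is a minimal sketch of a low-trophic-level (plankton) model and a high-trophic-level (fish) model exchanging forcing in both directions each timestep. The equations and parameters are illustrative toy choices, not any of the reviewed models.

```python
# Bottom-up: fish growth depends on plankton supplied by the lower model.
# Top-down: plankton mortality depends on predation imposed by the fish model.
def step_plankton(P, fish_predation, r=0.5, K=10.0, dt=0.1):
    # Logistic growth minus top-down predation mortality.
    return P + dt * (r * P * (1 - P / K) - fish_predation * P)

def step_fish(F, P, attack=0.08, mortality=0.05, dt=0.1):
    # Fish biomass grows with available plankton, decays with mortality.
    return F + dt * (attack * P * F - mortality * F)

P, F = 5.0, 1.0
for t in range(1000):
    predation = 0.08 * F          # forcing passed DOWN the food web
    P = step_plankton(P, predation)
    F = step_fish(F, P)           # food availability passed UP
print(f"P={P:.2f}, F={F:.2f}")
```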

  1. Successful introduction of an underutilized elderly pneumococcal vaccine in a national immunization program by integrating the pre-existing public health infrastructure.

    PubMed

    Yang, Tae Un; Kim, Eunsung; Park, Young-Joon; Kim, Dongwook; Kwon, Yoon Hyung; Shin, Jae Kyong; Park, Ok

    2016-03-18

    Although pneumococcal vaccines had been recommended for the elderly population in South Korea for a considerable period of time, coverage had remained well below the optimal level. To increase the vaccination rate by integrating the pre-existing public health infrastructure and governmental funding, the Korean government introduced an elderly pneumococcal vaccination into the national immunization program with a 23-valent pneumococcal polysaccharide vaccine in May 2013. The aim of this study was to assess the performance of the program in increasing the vaccine coverage rate and maintaining a stable vaccine supply and safe vaccination during the first 20 months of the program. We qualitatively and quantitatively analyzed the process of introducing the program and its outcomes in terms of systematic organization, efficiency, and stability at the national level. A staggered introduction during the first year utilizing the public sector, with a target coverage of 60%, was implemented based on the public demand for an elderly pneumococcal vaccination, vaccine supply capacity, vaccine delivery capacity, safety, and sustainability. During the 20-month program period, the pneumococcal vaccine coverage rate among the population aged ≥65 years increased from 5.0% to 57.3% without a noticeable vaccine shortage or safety issues. A web-based integrated immunization information system, which includes the immunization registry, vaccine supply chain management, and surveillance of adverse events following immunization, reduced programmatic errors and harmonized the overall performance of the program. Introduction of an elderly pneumococcal vaccination in the national immunization program based on strong government commitment, meticulous preparation, financial support, and the pre-existing public health infrastructure resulted in an efficient, stable, and sustainable increase in vaccination coverage. Copyright © 2016. Published by Elsevier Ltd.

  2. Power Search.

    ERIC Educational Resources Information Center

    Haskin, David

    1997-01-01

    Compares six leading Web search engines (AltaVista, Excite, HotBot, Infoseek, Lycos, and Northern Light), looking at the breadth of their coverage, accuracy, and ease of use, and finds a clear favorite of the six. Includes tips that can improve search results. (AEF)

  3. eodataservice.org: how to enable cross-continental interoperability of the European Space Agency and Australian Geoscience Landsat datacubes

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Barboni, Damiano; Natali, Stefano; Evans, Ben; Steer, Adam; Hogan, Patrik; Baumann, Peter

    2017-04-01

    Globally, billions of dollars are invested annually in Earth observations that support public services, commercial activity, and scientific inquiry. The Common Data Framework [1] for Earth Observation data summarises the current standards for the international community, promoting a common approach so that this significant data can be readily accessible. Concurrently, the "Copernicus Cooperation Arrangement" between the European Commission and the Australian Government is just one of a number of recent agreements signed to facilitate satellite Earth Observation data sharing among user communities. The typical approach implemented in these initiatives is the establishment of a regional data access hub, managed by the regional entity, to collect data at full scale or over the local region, improve access services and provide a high-performance environment in which all the data can be analysed. Furthermore, a number of datacube-aware platforms and services have emerged that enable a new collaborative approach for analysing the vast quantities of satellite imagery and other Earth observations, making it quicker and easier to explore a time series of image data. In this context, the H2020-funded EarthServer-2 project brings together multiple organisations in Europe, Australia and the United States to allow federated data holdings to be analysed using web-based access to petabytes of multidimensional geospatial datasets. The aim is to ensure that these large spatial data sources can be accessed based on OGC standards, namely the Web Coverage Service (WCS) and the Web Coverage Processing Service (WCPS), which provide efficient and timely retrieval of large volumes of geospatial data as well as on-the-fly processing. In this study, we provide an overview of the existing European Space Agency and Australian Geoscience Landsat datacubes, how the regional datacube structures differ, how interoperability is enabled through standards, and finally how the datacubes can be visualized on a virtual globe (the NASA-ESA WebWorldWind) based on a WC(P)S query via any standard internet browser. The current study is co-financed by the European Space Agency under the MaaS project (ESRIN Contract No. 4000114186/15/I-LG) and the European Union's Horizon 2020 research and innovation programme under the EarthServer-2 project (Grant Agreement No. 654367). [1] Common framework for Earth-Observation data, March 23, 2016 (https://www.whitehouse.gov/sites/default/files/microsites/ostp/common_framework_for_earth_observation_data.pdf)
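
    For concreteness, here is a sketch of a standards-based WCS 2.0 GetCoverage call of the kind that works identically against either datacube. The endpoint and coverage id are placeholders; the repeated `subset` key-value parameters follow the WCS 2.0 KVP binding.

```python
import requests

ENDPOINT = "http://example.org/rasdaman/ows"  # placeholder endpoint

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "LandsatNDVI",   # assumed coverage name
    # A list value makes `requests` repeat the key, as WCS 2.0 KVP requires.
    "subset": [
        "Lat(-35.0,-34.0)",
        "Long(148.0,149.0)",
        'ansi("2016-01-01","2016-12-31")',
    ],
    "format": "application/netcdf",
}
r = requests.get(ENDPOINT, params=params, timeout=120)
r.raise_for_status()
with open("landsat_subset.nc", "wb") as f:
    f.write(r.content)
```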

  4. TIGER 2010 Boundaries

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). This web service includes the State, County, and Census Block Groups boundaries from the TIGER shapefiles compiled into a single national coverage for each layer. The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB).

  5. Remote Sensing Information Gateway: A free application and web service for fast, convenient, interoperable access to large repositories of atmospheric data

    NASA Astrophysics Data System (ADS)

    Plessel, T.; Szykman, J.; Freeman, M.

    2012-12-01

    EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data, by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and relies on the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system, with an emphasis on achieving convenience, high performance, data integrity and security.
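
    A discovery-side sketch using the OWSLib client library against an OGC-WCS endpoint of this kind. The URL below is a placeholder, not the actual rsigserver address, and the supported WCS version should be checked against the service's capabilities document.

```python
from owslib.wcs import WebCoverageService

# Placeholder URL; substitute the real OGC-WCS endpoint and version.
wcs = WebCoverageService("https://example.epa.gov/rsig/rsigserver?",
                         version="1.0.0")

# Enumerate the advertised coverages (variables) before requesting subsets
# by variable, geographic domain and time range.
for name, meta in wcs.contents.items():
    print(name, "-", meta.title)
```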

  6. Serving Satellite Remote Sensing Data to User Community through the OGC Interoperability Protocols

    NASA Astrophysics Data System (ADS)

    di, L.; Yang, W.; Bai, Y.

    2005-12-01

    Remote sensing is one of the major methods for collecting geospatial data. Huge amounts of remote sensing data have been collected by space agencies and private companies around the world. For example, NASA's Earth Observing System (EOS) is generating more than 3 TB of remote sensing data per day. The data collected by EOS are processed, distributed, archived, and managed by the EOS Data and Information System (EOSDIS). Currently, EOSDIS is managing several petabytes of data. All of those data are not only valuable for global change research, but also useful for local and regional applications and decision making. How to make the data easily accessible to and usable by the user community is one of the key issues for realizing the full potential of these valuable datasets. In the past several years, the Open Geospatial Consortium (OGC) has developed several interoperability protocols aimed at making geospatial data easily accessible to and usable by the user community through the Internet. The protocols particularly relevant to the discovery, access, and integration of multi-source satellite remote sensing data are the Catalog Service for Web (CS/W) and Web Coverage Service (WCS) specifications. The OGC CS/W specifies the interfaces, HTTP protocol bindings, and a framework for defining application profiles required to publish and access digital catalogues of metadata for geographic data, services, and related resource information. The OGC WCS specification defines the interfaces between web-based clients and servers for accessing on-line multi-dimensional, multi-temporal geospatial coverages in an interoperable way. Based on the definitions by OGC and ISO 19123, coverage data include all remote sensing images as well as gridded model outputs. The Laboratory for Advanced Information Technology and Standards (LAITS), George Mason University, has been working for many years on developing and implementing OGC specifications to better serve NASA Earth science data to the user community. We have developed the NWGISS software package, which implements multiple OGC specifications, including OGC WMS, WCS, CS/W, and WFS. As part of the NASA REASON GeoBrain project, the NWGISS WCS and CS/W servers have been extended to provide operational access to NASA EOS data in the data pools through OGC protocols and to make both services chainable in web-service chaining. The extensions to the WCS server include the implementation of WCS 1.0.0 and WCS 1.0.2 and the development of a WSDL description of the WCS services. In order to find the on-line EOS data resources, the CS/W server has been extended at the backend to search metadata in NASA ECHO. This presentation reports on those extensions and discusses lessons learned from the implementation. It also discusses the advantages, disadvantages, and future improvements of OGC specifications, particularly the WCS.

  7. Surveying ourselves: examining the use of a web-based approach for a physician survey.

    PubMed

    Matteson, Kristen A; Anderson, Britta L; Pinto, Stephanie B; Lopes, Vrishali; Schulkin, Jay; Clark, Melissa A

    2011-12-01

    A survey was distributed, using a sequential mixed-mode approach, to a national sample of obstetrician-gynecologists. Responses to the web-based mode and the on-paper mode were compared to determine whether there were systematic differences between respondents. Only two differences between respondents to the two modes were identified. University-based physicians were more likely to complete the web-based mode than private practice physicians, and mail respondents reported a greater volume of endometrial ablations compared to online respondents. The web-based mode had better data quality than the paper-based mailed mode, with fewer missing and inappropriate responses. Together, these findings suggest that, although a few differences were identified, the web-based survey mode attained adequate representativeness and improved data quality. Given the metrics examined in this study, exclusive use of web-based data collection may be appropriate for physician surveys, with a minimal reduction in sample coverage and without a reduction in data quality.

  8. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPUs), for web mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface, so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) need only a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides the important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios replayed by a scenario player. The software system architecture and open interfaces facilitate global coverage, so that the system is applicable to any region in the world, and allow the integration of different sensor systems as well as other hazard types and use cases beyond tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  9. J-Plus Web Portal

    NASA Astrophysics Data System (ADS)

    Civera Lorenzo, Tamara

    2017-10-01

    A brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), introducing the different services available to retrieve images and catalogue data. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual- and single-mode catalogue data comprising parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several online data access tools or services, each suited to a particular need: a coverage map, a sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
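
    As an illustration of the Virtual Observatory route, here is a minimal IVOA Simple Cone Search call; RA, DEC and SR (all in decimal degrees) are the parameters the standard defines. The service URL below is a placeholder standing in for the J-PLUS EDR cone-search endpoint.

```python
import requests

# Placeholder URL; consult the portal for the actual cone-search endpoint.
SCS_URL = "http://archive.cefca.es/catalogues/vo/cone/jplus-edr"

# Standard Simple Cone Search parameters: center (RA, DEC) and radius (SR).
params = {"RA": 180.0, "DEC": 32.5, "SR": 0.05}
r = requests.get(SCS_URL, params=params, timeout=60)
r.raise_for_status()
print(r.text[:500])  # VOTable XML listing sources within the cone
```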

  10. ArcticDEM Year 3; Improving Coverage, Repetition and Resolution

    NASA Astrophysics Data System (ADS)

    Morin, P. J.; Porter, C. C.; Cloutier, M.; Howat, I.; Noh, M. J.; Willis, M. J.; Candela, S. G.; Bauer, G.; Kramer, W.; Bates, B.; Williamson, C.

    2017-12-01

    Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. The ArcticDEM project is using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency, petascale computing, and open-source photogrammetry software to produce a time-tagged 2 m posting elevation model and a 5 m posting mosaic of the entire Arctic region. As ArcticDEM enters its third year, the region has gone from having some of the sparsest and poorest elevation data to some of the most precise and complete data of any region on the globe. To date, we have produced and released over 80,000,000 km2 of data as 57,000 time-stamped DEMs at 2 m posting. The Arctic, on average, is covered four times, though there are hotspots with more than 100 DEMs. In addition, the version 1 release includes a 5 m posting mosaic covering the entire 20,000,000 km2 region. All products are publicly available through arcticdem.org, ESRI web services, and a web viewer. The final year of the project will consist of a complete refiltering of clouds/water and re-mosaicking of all elevation data. Since the inception of the project, post-processing techniques have improved significantly, resulting in fewer voids, better registration, sharper coastlines, and fewer inaccuracies due to clouds. All ArcticDEM data will be released in 2018. Data, documentation, web services and the web viewer are available at arcticdem.org.

  11. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as the Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support the Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST, in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World, version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
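
    A hypothetical sketch of a PES-style REST call: the endpoint path and parameter names below are illustrative assumptions, but the pattern (send a polygon, get back a population estimate for a target year with accuracy measures) follows the description above.

```python
import requests

# Hypothetical endpoint; consult SEDAC's documentation for the real API.
PES_URL = "https://example.sedac.ciesin.columbia.edu/popest/api/estimate"

polygon = {  # GeoJSON polygon around the area of interest
    "type": "Polygon",
    "coordinates": [[[-73.97, 40.78], [-73.97, 40.80],
                     [-73.95, 40.80], [-73.95, 40.78],
                     [-73.97, 40.78]]],
}

# Assumed request body: a geometry plus one of the supported years.
r = requests.post(PES_URL, json={"geometry": polygon, "year": 2015},
                  timeout=60)
r.raise_for_status()
print(r.json())  # e.g. {"population": ..., "accuracy": ...}
```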

  12. PHL7/441: Fixing a Broken Line between the Perceived "Anarchy" of the Web and a Process-Comfortable Pharmaceutical Company

    PubMed Central

    Vercellesi, L

    1999-01-01

    Introduction In 1998 a pharmaceutical company published its Web site to provide: an institutional presence multifunctional information to primary customers and general public a new way of access to the company a link to existing company-sponsored sites a platform for future projects Since the publication, some significant integration have been added; in particular one is a primary interactive service, addressed to a selected audience. The need has been felt to foster new projects and establish the idea of routinely considering the site as a potential tool in the marketing mix, to provide advanced services to customers. Methods Re-assessment of the site towards objectives. Assessment of its perception with company potential suppliers. Results The issue "web use" was discussed in various management meetings; the trend of use of Internet among the primary customers was known; major concerns expressed were about staffing and return of investment for activities run in the Web. These perceptions are being addressed by making the company more comfortable by: Running the site through a detailed process and clear procedures, defining A new process of maintenance of the site, involving representatives of all the functions. Procedures and guidelines. A master file of approved answers and company contacts. Categories of activities (information, promotion, education, information to investors, general services, target-specific services). Measures for all the activities run in the Web site Specifically for the Web site a concise periodical report is being assessed, covering 1. Statistics about hits and mails, compared to the corporate data. Indication of new items published. Description by the "supplier" of new or ongoing innovative projects, to transfer best practice. Basic figures on the Italian trend in internet use and specifically in the pharmaceutical and medical fields. Comments to a few competitor sites. Examples of potential uses deriving from other Web sites. Discussion The comparatively low use of Internet in Italy has affected the systematic professional exploitation of the company site. The definition of "anarchic" commonly linked to the Web by local media has lead to the attempt to "master" and "normalize" the site with a stricter approach than usual: most procedures and guidelines have been designed from scratch as not available for similar activities traditionally run. A short set of information has been requested for inclusion in the report: its wide coverage will help to receive a flavour of the global parallel new world developing in the net. Hopefully this approach will help to create a comfortable attitude towards the medium in the whole organisation and to acquire a working experience with the net.

  13. Journalism as health education: media coverage of a nonbranded pharma web site.

    PubMed

    Mackert, Michael; Love, Brad; Holton, Avery E

    2011-03-01

    As healthcare consumers increasingly use the Internet as a source for health information, direct-to-consumer (DTC) prescription drug advertising online merits additional attention. The purpose of this research was to investigate media coverage of the joint marketing program linking the movie Happy Feet and the nonbranded disease education Web site FluFacts-a resource from Tamiflu flu treatment manufacturer Roche Laboratories Inc. Twenty-nine articles (n = 29) were found covering the Happy Feet-FluFacts marketing campaign. A coding guide was developed to assess elements of the articles, including those common in the sample and information that ideally would be included in these articles. Two coders independently coded the articles, achieving intercoder agreement of κ = 0.98 before resolving disagreements to arrive at a final dataset. The majority of articles reported that Roche operated FluFacts (51.7%) and mentioned the product Tamiflu (58.6%). Almost half (48.3%) reported FluFacts was an educational resource; yet, no articles mentioned other antiviral medications or nonmedical options for preventing the flu. Almost a quarter of the articles (24.1%) provided a call to action-telling readers to visit FluFacts or providing a link for them to do so. Findings suggest that journalists' coverage of this novel campaign-likely one of the goals of the campaign-helped spread the message of the Happy Feet-FluFacts relationship, often omitting other useful health information. Additional research is needed to better understand online DTC campaigns and how consumers react to these campaigns and resulting media coverage and to inform the policymakers' decisions regarding DTC advertising online.

  14. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of the Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted in desktop GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed, very large datasets. The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high-bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data-serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
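
    A minimal sketch of the OPeNDAP access pattern the GDP relies on: open a remote dataset lazily and pull only the requested subset over the network. The URL and variable/dimension names are placeholders for a downscaled-climate archive endpoint, and the example assumes an OPeNDAP-capable xarray backend (netCDF4 or pydap) is installed.

```python
import xarray as xr

# Placeholder OPeNDAP URL and names; substitute a real THREDDS endpoint.
url = "http://example.org/thredds/dodsC/downscaled/tasmax.nc"

ds = xr.open_dataset(url)  # lazy open: only metadata is transferred
subset = ds["tasmax"].sel(
    lat=slice(43.0, 47.0),                      # spatial window
    lon=slice(-93.0, -87.0),
    time=slice("2050-01-01", "2059-12-31"),     # decade of interest
)
# Data transfer happens here, and only for the requested subset.
print(float(subset.mean()))
```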

  15. Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example

    NASA Astrophysics Data System (ADS)

    Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang

    2013-04-01

    By applying the advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with a Service Oriented Architecture (SOA), this study derived global datasets from tile-based input data and implemented a Virtual Web Map Service (VWMS) and a Virtual Web Coverage Service (VWCS) to provide software tools for the visualization and acquisition of global data. Taking the MODIS Normalized Difference Vegetation Index (NDVI) as an example, this study demonstrates the feasibility, efficiency and key features of the proposed approach.
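
    A sketch of the tile-mosaicking step with GDAL's Python bindings, under the assumption that per-tile NDVI rasters already exist: a lightweight VRT mosaic is built first, then materialized as a BigTIFF, which lifts the 4 GB classic-TIFF size limit for a global output. File names are placeholders.

```python
from osgeo import gdal

# Placeholder tile names; in practice these would be the reprojected
# MODIS NDVI tiles covering the globe.
tiles = ["h10v04_ndvi.tif", "h11v04_ndvi.tif", "h12v04_ndvi.tif"]

# A VRT is a lightweight XML mosaic; no pixels are copied at this point.
vrt = gdal.BuildVRT("global_ndvi.vrt", tiles)

# Materialize the mosaic as a single BigTIFF for downstream VWMS/VWCS use.
gdal.Translate("global_ndvi.tif", vrt,
               creationOptions=["BIGTIFF=YES", "COMPRESS=DEFLATE", "TILED=YES"])
vrt = None  # close the dataset
```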

  16. Assessing vaccination coverage in infants, survey studies versus the Flemish immunisation register: achieving the best of both worlds.

    PubMed

    Braeckman, Tessa; Lernout, Tinne; Top, Geert; Paeps, Annick; Roelants, Mathieu; Hoppenbrouwers, Karel; Van Damme, Pierre; Theeten, Heidi

    2014-01-09

    Infant immunisation coverage in Flanders, Belgium, is monitored through repeated coverage surveys. With the increased use of Vaccinnet, the web-based ordering system for vaccines in Flanders set up in 2004 and linked to an immunisation register, this database could become an alternative for quickly estimating vaccination coverage. To evaluate its current accuracy, coverage estimates generated from Vaccinnet alone were compared with estimates from the most recent survey (2012), which combined interview data with data from Vaccinnet and medical files. Coverage rates from registrations in Vaccinnet were systematically lower than the corresponding estimates obtained through the survey (mean difference 7.7%). This difference increased with dose number for vaccines that require multiple doses. Differences in administration date between the two sources were observed for 3.8-8.2% of registered doses. Underparticipation in Vaccinnet thus significantly impacts the register-based immunisation coverage estimates, an effect amplified by underregistration of administered doses among vaccinators using Vaccinnet. Therefore, survey studies, despite being labour-intensive and expensive, currently provide more complete and reliable results than register-based estimates alone in Flanders. However, further improvement of Vaccinnet's completeness will likely allow more accurate estimates in the near future. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Leveraging Big Data for Exploring Occupational Diseases-Related Interest at the Level of Scientific Community, Media Coverage and Novel Data Streams: The Example of Silicosis as a Pilot Study

    PubMed Central

    Bragazzi, Nicola Luigi; Toletone, Alessandra; Brigo, Francesco; Durando, Paolo

    2016-01-01

    Objective Silicosis is an untreatable but preventable occupational disease caused by exposure to silica. It can progressively evolve to lung impairment, respiratory failure and death, even after exposure has ceased. However, little is known about occupational diseases-related interest at the level of the scientific community, media coverage and web behavior. This article aims at filling this gap in knowledge, taking silicosis as a case study. Methods We investigated silicosis-related web activities using Google Trends (GT), capturing Internet behavior worldwide in the years 2004–2015. GT-generated data were then compared with the silicosis-related scientific production (i.e., PubMed and Google Scholar), the media coverage (i.e., Google News), the Wikipedia traffic (i.e., Wikitrends) and the usage of new media (i.e., YouTube and Twitter). Results A peak in silicosis-related web searches was noticed in 2010–2011: interestingly, both scientific article production and media coverage markedly increased after these years in a statistically significant way. The public interest and the level of public engagement were witnessed by an increase in likes, comments, hashtags, and re-tweets. However, only a small fraction of the posted/uploaded material contained accurate scientific information. Conclusions GT could be useful for assessing the reaction of the public and the level of public engagement with novel risk factors associated with occupational diseases (and possibly related changes in disease natural history), and with the effectiveness of preventive workplace practices and legislative measures adopted to improve occupational health. Further, occupational clinicians should become aware of the topics most frequently searched by patients and proactively address these concerns during the medical examination. Institutional bodies and organizations should be more present and active in digital tools and media to disseminate and communicate scientifically accurate information. This manuscript should be read as a preliminary, exploratory communication, paving the way for further studies. PMID:27806115
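
    A sketch of the kind of Google Trends pull described above, using the unofficial community `pytrends` package. The study itself used the GT website; this wrapper is not an official Google API, and its interface may change.

```python
from pytrends.request import TrendReq

# Community wrapper around Google Trends, not an official API.
pytrends = TrendReq(hl="en-US")
pytrends.build_payload(["silicosis"], timeframe="2004-01-01 2015-12-31")

interest = pytrends.interest_over_time()  # relative search volume over time
print(interest["silicosis"].idxmax())     # date of the 2010-2011 peak
```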

  18. Using RxNorm and NDF-RT to classify medication data extracted from electronic health records: experiences from the Rochester Epidemiology Project.

    PubMed

    Pathak, Jyotishman; Murphy, Sean P; Willaert, Brian N; Kremers, Hilal M; Yawn, Barbara P; Rocca, Walter A; Chute, Christopher G

    2011-01-01

    RxNorm and NDF-RT, published by the National Library of Medicine (NLM) and the Department of Veterans Affairs (VA), respectively, are two publicly available federal medication terminologies. In this study, we evaluate the applicability of RxNorm and the National Drug File-Reference Terminology (NDF-RT) for the extraction and classification of medication data retrieved, using structured querying and natural language processing techniques, from electronic health records at two different medical centers within the Rochester Epidemiology Project (REP). Specifically, we explore how mappings between RxNorm concept codes and NDF-RT drug classes can be leveraged for hierarchical organization and grouping of REP medication data, identify gaps and coverage issues, and analyze NLM's recently released NDF-RT Web service API. Our study concludes that RxNorm and NDF-RT can be applied together for the classification of medication data extracted from multiple EHR systems, although several issues and challenges remain to be addressed. We further conclude that the Web service APIs developed by the NLM provide useful functionalities for such activities.
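
    A minimal sketch of this normalization-and-classification workflow follows, assuming NLM's current public RxNav REST endpoints: a free-text drug name is mapped to an RxNorm concept identifier (RxCUI), and its drug classes are then retrieved. Note that the NDF-RT Web service evaluated in the paper has since been retired; the RxClass API used here is its successor, so this illustrates the idea rather than the paper's exact calls.

        import requests

        BASE = "https://rxnav.nlm.nih.gov/REST"

        def rxcui_for(name):
            """Map a free-text drug name to an RxNorm concept identifier (RxCUI)."""
            r = requests.get(f"{BASE}/rxcui.json", params={"name": name}, timeout=30)
            ids = r.json().get("idGroup", {}).get("rxnormId", [])
            return ids[0] if ids else None

        def classes_for(rxcui):
            """Return drug class names associated with an RxCUI via RxClass."""
            r = requests.get(f"{BASE}/rxclass/class/byRxcui.json",
                             params={"rxcui": rxcui}, timeout=30)
            infos = (r.json().get("rxclassDrugInfoList", {})
                             .get("rxclassDrugInfo", []))
            return sorted({i["rxclassMinConceptItem"]["className"] for i in infos})

        cui = rxcui_for("metformin")
        print(cui, classes_for(cui) if cui else "not found")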

  19. A tool for NDVI time series extraction from wide-swath remotely sensed images

    NASA Astrophysics Data System (ADS)

    Li, Zhishan; Shi, Runhe; Zhou, Cong

    2015-09-01

    Normalized Difference Vegetation Index (NDVI) is one of the most widely used indicators for monitoring vegetation coverage on the land surface. The time-series features of NDVI are capable of reflecting dynamic changes of various ecosystems. Calculating NDVI via Moderate Resolution Imaging Spectroradiometer (MODIS) and other wide-swath remotely sensed images provides an important way to monitor the spatial and temporal characteristics of large-scale NDVI. However, difficulties still exist for ecologists who want to extract such information correctly and efficiently, because of several specialized processing steps required for the original remote sensing images, including radiometric calibration, geometric correction, multiple data composition and curve smoothing. In this study, we developed an efficient and convenient online toolbox with a friendly graphical user interface for non-remote-sensing professionals who want to extract NDVI time series. Technically, it is based on Java Web and Web GIS. Moreover, the Struts, Spring and Hibernate frameworks (SSH) are integrated in the system for easy maintenance and expansion. Latitude, longitude and time period are the key inputs that users need to provide, and the NDVI time series are calculated automatically.
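
    The core computation that such a toolbox automates is simple; a minimal sketch is given below, assuming two co-registered, calibrated reflectance arrays (for MODIS, band 1 is red and band 2 is near-infrared). A per-pixel time series is then this calculation repeated per acquisition date at the user's coordinates, followed by compositing and curve smoothing.

        import numpy as np

        def ndvi(red, nir):
            """NDVI = (NIR - Red) / (NIR + Red), NaN where the denominator is zero."""
            red = red.astype("float64")
            nir = nir.astype("float64")
            denom = nir + red
            out = np.full_like(denom, np.nan)
            np.divide(nir - red, denom, out=out, where=denom != 0)
            return out

        # Toy 2x2 scene: vegetated pixels give values near 1, bare soil near 0.
        red = np.array([[0.05, 0.30], [0.06, 0.25]])
        nir = np.array([[0.60, 0.35], [0.55, 0.28]])
        print(ndvi(red, nir))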

  20. 42 CFR 431.428 - Reporting requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... providing insurance coverage to beneficiaries and uninsured populations. (4) Outcomes of care, quality of... performance of the demonstration. (8) The status of the evaluation and information regarding progress in... must publish its draft annual report on its public Web site within 30 days of submission to CMS. (1...

  1. 42 CFR 431.428 - Reporting requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... providing insurance coverage to beneficiaries and uninsured populations. (4) Outcomes of care, quality of... performance of the demonstration. (8) The status of the evaluation and information regarding progress in... must publish its draft annual report on its public Web site within 30 days of submission to CMS. (1...

  2. 42 CFR 431.428 - Reporting requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... providing insurance coverage to beneficiaries and uninsured populations. (4) Outcomes of care, quality of... performance of the demonstration. (8) The status of the evaluation and information regarding progress in... must publish its draft annual report on its public Web site within 30 days of submission to CMS. (1...

  3. PhosphOrtholog: a web-based tool for cross-species mapping of orthologous protein post-translational modifications.

    PubMed

    Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa

    2015-08-19

    Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog ( www.phosphortholog.com ) that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. Here we describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog increased the number of cross-species mapped sites in all our example data sets by more than double when compared to those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledgebase of functional PTM sites. Moreover, PhosphOrtholog is generic being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
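
    The alignment-based mapping idea can be illustrated with a short sketch, which is not the tool's actual code: two orthologous protein sequences are globally aligned, and a 0-based modification-site index is translated from one sequence to the other, returning None when the site falls in a gap. The sequences and site position are toy examples; Biopython's PairwiseAligner stands in for whatever aligner the service uses.

        from Bio import Align

        def map_site(seq_a, seq_b, site_a):
            """Map a 0-based residue index in seq_a to seq_b via global alignment."""
            aligner = Align.PairwiseAligner()
            aligner.mode = "global"
            alignment = aligner.align(seq_a, seq_b)[0]
            for (a_start, a_end), (b_start, b_end) in zip(*alignment.aligned):
                if a_start <= site_a < a_end:            # site in an aligned block
                    return b_start + (site_a - a_start)  # same offset in the block
            return None  # site lies in a gap: no orthologous residue

        human = "MSKSESPKEPEQLRKLFIGGLSFETT"   # toy sequences, not real orthologs
        rat   = "MSKTESPKEPEQLRKLFIGGLSFETT"
        print(map_site(human, rat, 3))        # -> 3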

  4. Affordable Care Act Impact on Medicaid Coverage of Smoking-Cessation Treatments.

    PubMed

    McMenamin, Sara B; Yoeun, Sara W; Halpin, Helen A

    2018-04-01

    Four sections of the Affordable Care Act address the expansion of Medicaid coverage for recommended smoking-cessation treatments for: (1) pregnant women (Section 4107), (2) all enrollees through a financial incentive (1% Federal Medical Assistance Percentage increase) to offer comprehensive coverage (Section 4106), (3) all enrollees through Medicaid formulary requirements (Section 2502), and (4) Medicaid expansion enrollees (Section 2001). The purpose of this study is to document changes in Medicaid coverage for smoking-cessation treatments since the passage of the Affordable Care Act and to assess how implementation has differentially affected Medicaid coverage policies for: pregnant women, enrollees in traditional Medicaid, and Medicaid expansion enrollees. From January through June 2017, data were collected and analyzed from 51 Medicaid programs (50 states plus the District of Columbia) through a web-based survey and review of benefits documents to assess coverage policies for smoking-cessation treatments. Forty-seven Medicaid programs have increased coverage for smoking-cessation treatments post-implementation of the Affordable Care Act by adopting one or more of the four smoking-cessation treatment provisions. Coverage for pregnant women increased in 37 states, coverage for newly eligible expansion enrollees increased in 32 states, and 15 states added coverage and/or removed copayments in order to apply for a 1% increase in the Federal Medical Assistance Percentage. Coverage for all recommended pharmacotherapy and group and individual counseling increased from seven states in 2009 to 28 states in 2017. The Affordable Care Act was successful in improving and expanding state Medicaid coverage of effective smoking-cessation treatments. Many programs are not fully compliant with the law, and additional guidance and clarification from the Centers for Medicare and Medicaid Services may be needed. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  5. The Number of Scholarly Documents on the Public Web

    PubMed Central

    Khabsa, Madian; Giles, C. Lee

    2014-01-01

    The number of scholarly documents available on the web is estimated using capture/recapture methods by studying the coverage of two major academic search engines: Google Scholar and Microsoft Academic Search. Our estimates show that at least 114 million English-language scholarly documents are accessible on the web, of which Google Scholar has nearly 100 million. Of these, we estimate that at least 27 million (24%) are freely available since they do not require a subscription or payment of any kind. In addition, at a finer scale, we also estimate the number of scholarly documents on the web for fifteen fields: Agricultural Science, Arts and Humanities, Biology, Chemistry, Computer Science, Economics and Business, Engineering, Environmental Sciences, Geosciences, Material Science, Mathematics, Medicine, Physics, Social Sciences, and Multidisciplinary, as defined by Microsoft Academic Search. In addition, we show that among these fields the percentage of documents defined as freely available varies significantly, i.e., from 12 to 50%. PMID:24817403

  6. The number of scholarly documents on the public web.

    PubMed

    Khabsa, Madian; Giles, C Lee

    2014-01-01

    The number of scholarly documents available on the web is estimated using capture/recapture methods by studying the coverage of two major academic search engines: Google Scholar and Microsoft Academic Search. Our estimates show that at least 114 million English-language scholarly documents are accessible on the web, of which Google Scholar has nearly 100 million. Of these, we estimate that at least 27 million (24%) are freely available since they do not require a subscription or payment of any kind. In addition, at a finer scale, we also estimate the number of scholarly documents on the web for fifteen fields: Agricultural Science, Arts and Humanities, Biology, Chemistry, Computer Science, Economics and Business, Engineering, Environmental Sciences, Geosciences, Material Science, Mathematics, Medicine, Physics, Social Sciences, and Multidisciplinary, as defined by Microsoft Academic Search. In addition, we show that among these fields the percentage of documents defined as freely available varies significantly, i.e., from 12 to 50%.
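
    Both records above rest on the classic two-sample capture/recapture idea; a minimal sketch of the Lincoln-Petersen estimator follows, with invented counts chosen only to show the arithmetic (they are not the paper's inputs). Treating each search engine's coverage as one "capture" sample, the overlap between the two plays the role of the recaptures.

        def lincoln_petersen(n1, n2, overlap):
            """Estimate total population size N ~= n1 * n2 / overlap."""
            if overlap == 0:
                raise ValueError("no overlap: estimator is undefined")
            return n1 * n2 / overlap

        # Hypothetical coverage counts for two engines and their intersection:
        estimate = lincoln_petersen(n1=100_000_000, n2=50_000_000,
                                    overlap=44_000_000)
        print(f"~{estimate / 1e6:.0f} million scholarly documents")  # ~114 million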

  7. Internet survey of the influence of environmental factors on human health: environmental epidemiologic investigation using the web-based daily questionnaire for health

    PubMed Central

    Sano, Tomomi; Akahane, Manabu; Sugiura, Hiroaki; Ohkusa, Yasushi; Okabe, Nobuhiko; Imamura, Tomoaki

    2012-01-01

    With increasing Internet coverage, web-based surveys have become a feasible option for epidemiological studies. We performed an investigation in Japan in winter 2008 using the web-based daily questionnaire for health (WDQH). The WDQH is a web-based questionnaire survey formulated to obtain information about the daily physical condition of the general public on a real-time basis, in order to study correlations between changes in physical health and changes in environmental factors. Respondents were asked whether they felt ill and had specific symptoms, including fever. We analysed the environmental factors along with the health conditions obtained from the WDQH. Four factors were found to influence health: minimum temperature, hours of sunlight, median humidity, and weekday or holiday. The WDQH allowed a daily health survey of the general population in real time via the Internet. PMID:22946467

  8. A survey of the current status of web-based databases indexing Iranian journals.

    PubMed

    Merat, Shahin; Khatibzadeh, Shahab; Mesgarpour, Bita; Malekzadeh, Reza

    2009-05-01

    The scientific output of Iran has been increasing rapidly in recent years. Unfortunately, most papers are published in journals which are not indexed by popular indexing systems and many of them are in Persian without English translation. This makes the results of Iranian scientific research unavailable to other researchers, including Iranians. The aim of this study was to evaluate the quality of current web-based databases indexing scientific articles published in Iran. We identified web-based databases which indexed scientific journals published in Iran using popular search engines. The sites were then subjected to a series of tests to evaluate their coverage, search capabilities, stability, accuracy of information, consistency, accessibility, ease of use, and other features. Results were compared with each other to identify strengths and shortcomings of each site. Five web sites were identified. None had complete coverage of scientific Iranian journals. The search capabilities were less than optimal in most sites. English translations of research titles, author names, keywords, and abstracts of Persian-language articles did not follow standards. Some sites did not cover abstracts. Numerous typing errors make searches ineffective and citation indexing unreliable. None of the currently available indexing sites is capable of presenting Iranian research to the international scientific community. The government should intervene by enforcing policies designed to facilitate indexing through a systematic approach. The policies should address Iranian journals, authors, and indexing sites. Iranian journals should be required to provide their indexing data, including references, electronically; authors should provide correct indexing information to journals; and indexing sites should improve their software to meet standards set by the government.

  9. The peer review system (PRS) for quality assurance and treatment improvement in radiation therapy

    NASA Astrophysics Data System (ADS)

    Le, Anh H. T.; Kapoor, Rishabh; Palta, Jatinder R.

    2012-02-01

    Peer reviews are needed across all disciplines of medicine to address complex medical challenges in disease care, medical safety, insurance coverage handling, and public safety. Radiation therapy utilizes technologically advanced imaging for treatment planning, often with excellent efficacy. Since planning data requirements are substantial, patients are at risk for repeat diagnostic procedures or suboptimal therapeutic intervention due to a lack of knowledge regarding previous treatments. The Peer Review System (PRS) will make this critical radiation therapy information readily available on demand via Web technology. The PRS has been developed with current Web technology, the .NET framework, and an in-house DICOM library. With the advantages of Web server-client architecture, including an IIS web server, SOAP Web Services and Silverlight on the client side, patient data can be visualized through a web browser and distributed across multiple locations via local area networks and the Internet. The PRS will significantly improve the quality, safety, and accessibility of treatment plans in cancer therapy. Furthermore, the secure Web-based PRS with DICOM-RT compliance will provide flexible utilities for organization, sorting, and retrieval of imaging studies and treatment plans to optimize patient treatment and ultimately improve patient safety and treatment quality.

  10. Handbook of Research on Electronic Surveys and Measurements

    ERIC Educational Resources Information Center

    Reynolds, Rodney, Ed.; Woods, Robert, Ed.; Baker, Jason, Ed.

    2007-01-01

    The "Handbook of Research on Electronic Surveys and Measurements" is the comprehensive reference source for innovative knowledge on electronic surveys. This commanding handbook of research provides complete coverage of the challenges associated with the use of the Internet to develop online surveys, administer Web-based instruments, and conduct…

  11. Flipping Introduction to MIS for a Connected World

    ERIC Educational Resources Information Center

    Law, Wai K.

    2014-01-01

    It has been increasingly challenging to provide an introductory coverage of the rapidly expanding fields in Information Systems (IS). The task has been further complicated by the popularity of web resources and cloud services. A new generation of technically savvy learners, while recognizing the significance of information systems, expects…

  12. The web and public confidence in MMR vaccination in Italy.

    PubMed

    Aquino, Francesco; Donzelli, Gabriele; De Franco, Emanuela; Privitera, Gaetano; Lopalco, Pier Luigi; Carducci, Annalaura

    2017-08-16

    Measles, mumps and rubella (MMR) vaccination coverage in Italy has been decreasing since 2012 and, at present, none of the Italian regions has achieved the 95% coverage target. A decision of the Court of Justice of Rimini in March 2012 that awarded vaccine-injury compensation for a case of autism has been indicated as a probable trigger event leading to a reduction of vaccine confidence in Italy. The aim of the study was to explore the relationship between MMR vaccination coverage and online search trends and social network activity on the topic "autism and MMR vaccine" during the period 2010-2015. A significant inverse correlation was found between MMR vaccination coverage and Internet search activity, tweets and Facebook posts. New media might have played a role in spreading misinformation. Media monitoring could be useful to assess the level of vaccine hesitancy and to plan and target effective information campaigns. Copyright © 2017 Elsevier Ltd. All rights reserved.
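
    A minimal sketch of the kind of rank correlation reported above follows; the yearly figures are invented placeholders (the study's actual data, granularity and test may differ), so only the shape of the computation matters here.

        from scipy.stats import spearmanr

        years        = [2010, 2011, 2012, 2013, 2014, 2015]
        mmr_coverage = [90.6, 90.1, 89.9, 88.1, 86.6, 85.3]  # % coverage, hypothetical
        search_index = [18, 20, 27, 41, 55, 63]              # relative volume, hypothetical

        rho, p = spearmanr(mmr_coverage, search_index)
        print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # strongly negative rho expected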

  13. 2009–2010 Seasonal Influenza Vaccination Coverage Among College Students From 8 Universities in North Carolina

    PubMed Central

    Poehling, Katherine A.; Blocker, Jill; Ip, Edward H.; Peters, Timothy R.; Wolfson, Mark

    2012-01-01

    Objective We sought to describe the 2009–2010 seasonal influenza vaccine coverage of college students. Participants 4090 college students from eight North Carolina universities participated in a confidential, web-based survey in October-November 2009. Methods Associations between self-reported 2009–2010 seasonal influenza vaccination and demographic characteristics, campus activities, parental education, and email usage were assessed by bivariate analyses and by a mixed-effects model adjusting for clustering by university. Results Overall, 20% of students (range 14%–30% by university) reported receiving 2009–2010 seasonal influenza vaccine. Being a freshman, attending a private university, having a college-educated parent, and participating in academic clubs/honor societies predicted receipt of influenza vaccine in the mixed-effects model. Conclusions The self-reported 2009–2010 influenza vaccine coverage was one-quarter of the 2020 Healthy People goal (80%) for healthy persons 18–64 years of age. College campuses have the opportunity to enhance influenza vaccine coverage among their diverse student populations. PMID:23157195

  14. 2009-2010 seasonal influenza vaccination coverage among college students from 8 universities in North Carolina.

    PubMed

    Poehling, Katherine A; Blocker, Jill; Ip, Edward H; Peters, Timothy R; Wolfson, Mark

    2012-01-01

    The authors sought to describe the 2009-2010 seasonal influenza vaccine coverage of college students. A total of 4,090 college students from 8 North Carolina universities participated in a confidential, Web-based survey in October-November 2009. Associations between self-reported 2009-2010 seasonal influenza vaccination and demographic characteristics, campus activities, parental education, and e-mail usage were assessed by bivariate analyses and by a mixed-effects model adjusting for clustering by university. Overall, 20% of students (range 14%-30% by university) reported receiving 2009-2010 seasonal influenza vaccine. Being a freshman, attending a private university, having a college-educated parent, and participating in academic clubs/honor societies predicted receipt of influenza vaccine in the mixed-effects model. The self-reported 2009-2010 influenza vaccine coverage was one-quarter of the 2020 Healthy People goal (80%) for healthy persons 18 to 64 years of age. College campuses have the opportunity to enhance influenza vaccine coverage among their diverse student populations.

  15. Sampling properties of directed networks

    NASA Astrophysics Data System (ADS)

    Son, S.-W.; Christensen, C.; Bizhani, G.; Foster, D. V.; Grassberger, P.; Paczuski, M.

    2012-10-01

    For many real-world networks only a small “sampled” version of the original network may be investigated; those results are then used to draw conclusions about the actual system. Variants of breadth-first search (BFS) sampling, which are based on epidemic processes, are widely used. Although it is well established that BFS sampling fails, in most cases, to capture the IN component(s) of directed networks, a description of the effects of BFS sampling on other topological properties is all but absent from the literature. To systematically study the effects of sampling biases on directed networks, we compare BFS sampling to random sampling on complete large-scale directed networks. We present new results and a thorough analysis of the topological properties of seven complete directed networks (prior to sampling), including three versions of Wikipedia, three different sources of sampled World Wide Web data, and an Internet-based social network. We detail the differences that sampling method and coverage can make to the structural properties of sampled versions of these seven networks. Most notably, we find that sampling method and coverage affect both the bow-tie structure and the number and structure of strongly connected components in sampled networks. In addition, at a low sampling coverage (i.e., less than 40%), the values of average degree, variance of out-degree, degree autocorrelation, and link reciprocity are overestimated by 30% or more in BFS-sampled networks and only attain values within 10% of the corresponding values in the complete networks when sampling coverage is in excess of 65%. These results may cause us to rethink what we know about the structure, function, and evolution of real-world directed networks.
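
    The contrast between sampling schemes can be reproduced in miniature with networkx, as in the hedged sketch below: a random directed graph stands in for a complete real-world network, and link reciprocity (one of the properties the study found overestimated at low BFS coverage) is compared across a BFS sample, a uniform random-node sample, and the full graph.

        import random
        import networkx as nx

        G = nx.gnp_random_graph(5000, 0.002, directed=True, seed=1)
        target = int(0.3 * G.number_of_nodes())  # 30% sampling coverage

        # BFS sample: follow out-links from a random seed until the target size
        # (assumes the seed's out-component is large enough).
        seed = random.choice(list(G.nodes))
        bfs_nodes = [seed]
        for _, v in nx.bfs_edges(G, seed):
            if len(bfs_nodes) >= target:
                break
            bfs_nodes.append(v)

        rnd_nodes = random.sample(list(G.nodes), target)

        for label, nodes in [("BFS", bfs_nodes), ("random", rnd_nodes), ("full", G.nodes)]:
            sub = G.subgraph(nodes)
            print(label, "reciprocity:", round(nx.overall_reciprocity(sub), 3))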

  16. Resource Management Scheme Based on Ubiquitous Data Analysis

    PubMed Central

    Lee, Heung Ki; Jung, Jaehee

    2014-01-01

    Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
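
    The scheme's core control loop can be sketched in a few lines: predict the next interval's request volume from recent history and size the pool of pregenerated web processes accordingly. The weighted moving average below is a stand-in for the paper's log-mining prediction, and the per-worker capacity constant is an assumption.

        from collections import deque

        REQS_PER_WORKER = 50          # assumed capacity of one web process
        history = deque(maxlen=4)     # request counts for recent intervals

        def predict_next(counts):
            """Weighted moving average: more recent intervals weigh more."""
            weights = range(1, len(counts) + 1)
            return sum(w * c for w, c in zip(weights, counts)) / sum(weights)

        def target_pool_size(counts, min_workers=2, max_workers=64):
            expected = predict_next(counts) if counts else 0
            needed = -(-int(expected) // REQS_PER_WORKER)  # ceiling division
            return max(min_workers, min(max_workers, needed))

        for observed in [120, 180, 260, 410, 390]:  # hypothetical per-interval counts
            history.append(observed)
            print(f"observed={observed:4d} -> pool size {target_pool_size(history)}")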

  17. Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices

    NASA Astrophysics Data System (ADS)

    Kröpfl, M.; Buchmüller, D.; Leberl, F.

    2012-07-01

    Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite System (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we will use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and a client software consuming the web service through an API. The application allows an insurance claim specialist to add claims to a cloud-based database including a claim location. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called in order to try and match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split among two developers.

  18. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbeds will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches needed to improve the application of climate science research results to urban climate resilience.
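
    A minimal sketch of the standards-based access pattern used in such a testbed follows: an OGC WCS 2.0 GetCoverage request for a spatial subset, issued through the OWSLib client. The endpoint URL, coverage identifier and axis labels are placeholders; a real server advertises its own in the capabilities document.

        from owslib.wcs import WebCoverageService

        wcs = WebCoverageService("https://example.org/wcs", version="2.0.1")
        print(list(wcs.contents))  # coverage identifiers offered by the server

        response = wcs.getCoverage(
            identifier=["flood_depth_forecast"],  # hypothetical coverage id
            subsets=[("Lat", 37.2, 38.2), ("Long", -123.0, -121.7)],
            format="application/netcdf",
        )
        with open("flood_subset.nc", "wb") as f:
            f.write(response.read())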

  19. The bare necessities? A realist review of necessity argumentations used in health care coverage decisions.

    PubMed

    Kleinhout-Vliek, Tineke; de Bont, Antoinette; Boer, Bert

    2017-07-01

    Policy makers and insurance companies decide on coverage of care by both calculating (cost-) effectiveness and assessing the necessity of coverage. To investigate argumentations pertaining to necessity used in coverage decisions made by policy makers and insurance companies, as well as those argumentations used by patients, authors, the public and the media. This study is designed as a realist review, adhering to the RAMESES quality standards. Embase, Medline and Web of Science were searched and 98 articles were included that detailed necessity-based argumentations. We identified twenty necessity-based argumentation types. Seven are only used to argue in favour of coverage, five solely for arguing against coverage, and eight are used to argue both ways. A positive decision appears to be facilitated when patients or the public set the decision on the agenda. Moreover, half the argumentation types are only used by patients, authors, the public and the media, whereas the other half is also used by policy makers and insurance companies. The latter group is more accepted and used in more different countries. The majority of necessity-based argumentation types is used for either favouring or opposing coverage, and not for both. Patients, authors, the public and the media use a broader repertoire of argumentation types than policy makers and insurance companies. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Commercial insurance coverage for outpatient cardiac rehabilitation in patients with heart failure in the United States.

    PubMed

    Thirapatarapong, Wilawan; Thomas, Randal J; Pack, Quinn; Sharma, Saurabh; Squires, Ray W

    2014-01-01

    Although cardiac rehabilitation (CR) improves outcomes in patients with heart failure (HF), studies suggest variable uptake by patients with HF, as well as variable coverage by insurance carriers. The purpose of this study was to determine the percentage of large commercial health insurance companies that provide coverage for outpatient CR for patients with HF. We identified a sample of the largest US commercial health care insurers and analyzed their CR coverage policies for patients with HF. We surveyed 44 large private health care insurance companies, reviewed company Web sites, and, when unclear, contacted companies by e-mail or telephone. We excluded insurance clearinghouses because they did not directly provide health care insurance. Of 44 eligible insurance companies, 29 (66%) reported that they provide coverage for outpatient CR in patients with HF. The majority of companies (83%) covered CR for patients with any type of HF. A minority (10%) did not cover CR for patients with HF if it was considered a preexisting condition. A significant percentage of commercial health care insurance companies in the United States report that they currently cover outpatient CR for patients with HF. Because health insurance coverage is associated with patient participation in CR, it is anticipated that patients with HF will increasingly participate in CR in coming years.

  1. Dynamics of Phenanthrenequinone on Carbon Nano-Onion Surfaces Probed by Quasielastic Neutron Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anjos, Daniela M; Mamontov, Eugene; Brown, Gilbert M

    We used quasielastic neutron scattering (QENS) to study the dynamics of phenanthrenequinone (PQ) on the surface of onion-like carbon (OLC), or so-called carbon onions, as a function of surface coverage and temperature. For both the high- and low-coverage samples, we observed two diffusion processes: a faster process and a process nearly an order of magnitude slower. On the high-coverage surface, the slow diffusion process is of long-range translational character, whereas the fast diffusion process is spatially localized on a length scale of ~4.7 Å. On the low-coverage surface, both diffusion processes are spatially localized: on the same length scale of ~4.7 Å for the fast diffusion and on a somewhat larger length scale for the slow diffusion. Arrhenius temperature dependence is observed except for the long-range diffusion on the high-coverage surface. We attribute the fast diffusion process to the generic localized in-cage dynamics of PQ molecules, and the slow diffusion process to the long-range translational dynamics of PQ molecules, which, depending on the coverage, may be either spatially restricted or long-range. On the low-coverage surface, uniform surface coverage is not attained, and the PQ molecules experience the effect of spatial constraints on their long-range translational dynamics. Unexpectedly, the dynamics of PQ molecules on OLC as a function of temperature and surface coverage bears qualitative resemblance to the dynamics of water molecules on oxide surfaces, including practically temperature-independent residence times for the low-coverage surface. The dynamical features that we observed may be universal across different classes of surface adsorbates.
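
    The Arrhenius temperature dependence noted above corresponds to fitting residence times with tau(T) = tau0 * exp(Ea / (kB * T)); the sketch below performs such a fit on made-up illustrative numbers (not the paper's data), just to show the procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        KB = 8.617333e-5  # Boltzmann constant, eV/K

        def arrhenius(T, tau0, Ea):
            """Residence time tau(T) = tau0 * exp(Ea / (kB * T))."""
            return tau0 * np.exp(Ea / (KB * T))

        T = np.array([250.0, 275.0, 300.0, 325.0, 350.0])  # K, hypothetical
        tau = np.array([120.0, 75.0, 52.0, 38.0, 29.0])    # ps, hypothetical

        (tau0, Ea), _ = curve_fit(arrhenius, T, tau, p0=(1.0, 0.1))
        print(f"tau0 = {tau0:.2f} ps, activation energy Ea = {Ea:.3f} eV")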

  2. Mars reconnaissance orbiter's high resolution imaging science experiment (HiRISE)

    USGS Publications Warehouse

    McEwen, A.S.; Eliason, E.M.; Bergstrom, J.W.; Bridges, N.T.; Hansen, C.J.; Delamere, W.A.; Grant, J. A.; Gulick, V.C.; Herkenhoff, K. E.; Keszthelyi, L.; Kirk, R.L.; Mellon, M.T.; Squyres, S. W.; Thomas, N.; Weitz, C.M.

    2007-01-01

    The HiRISE camera features a 0.5 m diameter primary mirror, 12 m effective focal length, and a focal plane system that can acquire images containing up to 28 Gb (gigabits) of data in as little as 6 seconds. HiRISE will provide detailed images (0.25 to 1.3 m/pixel) covering ~1% of the Martian surface during the 2-year Primary Science Phase (PSP) beginning November 2006. Most images will include color data covering 20% of the potential field of view. A top priority is to acquire ~1000 stereo pairs and apply precision geometric corrections to enable topographic measurements to better than 25 cm vertical precision. We expect to return more than 12 Tb of HiRISE data during the 2-year PSP, and use pixel binning, conversion from 14 to 8 bit values, and a lossless compression system to increase coverage. HiRISE images are acquired via 14 CCD detectors, each with 2 output channels, and with multiple choices for pixel binning and number of Time Delay and Integration lines. HiRISE will support Mars exploration by locating and characterizing past, present, and future landing sites, unsuccessful landing sites, and past and potentially future rover traverses. We will investigate cratering, volcanism, tectonism, hydrology, sedimentary processes, stratigraphy, aeolian processes, mass wasting, landscape evolution, seasonal processes, climate change, spectrophotometry, glacial and periglacial processes, polar geology, and regolith properties. An Internet Web site (HiWeb) will enable anyone in the world to suggest HiRISE targets on Mars and to easily locate, view, and download HiRISE data products. Copyright 2007 by the American Geophysical Union.

  3. The Montage Image Mosaic Toolkit As A Visualization Engine.

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has been used since 2003 to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image-stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: 1. Creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created. 2. Integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to Javascript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed. 3. Creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS9 is funded by the Chandra X-ray Center (NAS8-03060) and NASA's Universe of Learning (STScI-509913).

  4. Criminal Justice Research in Libraries and on the Internet.

    ERIC Educational Resources Information Center

    Nelson, Bonnie R.

    In addition to covering the enduring elements of traditional research on criminal justice, this new edition provides full coverage of research using the World Wide Web, hypertext documents, computer indexes, and other online resources. It gives an in-depth explanation of such concepts as databases, networks, and full text, and covers the Internet…

  5. Fueling a Contagion of Campus Bloodshed

    ERIC Educational Resources Information Center

    Fox, James Alan

    2008-01-01

    The gun smoke had barely cleared from the lecture hall at Northern Illinois University where last week a former graduate student had executed five students before killing himself when local and national scribes began speculating about a new trend in mass murder American-style. The "Chicago Tribune" Web site, quick with coverage of the…

  6. 42 CFR 411.39 - Automobile and liability insurance (including self-insurance), no-fault insurance, and workers...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false Automobile and liability insurance (including self-insurance), no-fault insurance, and workers' compensation: Final conditional payment amounts via Web portal... Coverage That Limits Medicare Payment: General Provisions § 411.39 Automobile and liability insurance...

  7. 42 CFR 411.39 - Automobile and liability insurance (including self-insurance), no-fault insurance, and workers...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 2 2014-10-01 2014-10-01 false Automobile and liability insurance (including self-insurance), no-fault insurance, and workers' compensation: Final conditional payment amounts via Web portal... Coverage That Limits Medicare Payment: General Provisions § 411.39 Automobile and liability insurance...

  8. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
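
    The slice-to-slice inference such models rely on can be shown in miniature: in the sketch below, a hidden "defect present" state evolves between time slices and each slice emits an observable test outcome; the filter predicts across the slice and then conditions on the observation. All numbers are illustrative; the paper's networks are larger and learned from test data.

        import numpy as np

        # P(state_t | state_{t-1}) over states [no-defect, defect]
        transition = np.array([[0.9, 0.1],
                               [0.2, 0.8]])
        # P(test fails | state)
        p_fail = np.array([0.05, 0.7])

        belief = np.array([0.5, 0.5])  # prior over the hidden state
        for failed in [True, True, False, True]:  # observed test outcomes
            belief = belief @ transition          # predict across the time slice
            likelihood = p_fail if failed else 1 - p_fail
            belief = belief * likelihood          # condition on the observation
            belief /= belief.sum()
            print(f"fail={failed} -> P(defect) = {belief[1]:.3f}")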

  9. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    Europe has a wealth of water information and tools applicable to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, comprises the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice), as sketched after this record. - Flood modelling using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML, WCS, WFS) and with visualization utilities (WMS). The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data are translated into the OGC WaterML 2.0 time series format and ingested into an SOS 2.0 service. SOS data are visualized in an SOS client that is able to handle time series. The meteorological forecast data, together with the WaterML 2.0 time series and terrain data (with the supervision of an operator manipulating the WPS user interface), are input to a flood modelling algorithm. The WPS is able to produce flood datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored by an emergency response control service. Acronyms AS: Alert Service ES: Event Service ICT: Information and Communication Technology NS: Notification Service OGC: Open Geospatial Consortium RIBASE: River Basin Standards Interoperability Pilot SOS: Sensor Observation Service WaterML: Water Markup Language WCS: Web Coverage Service WMS: Web Map Service WPS: Web Processing Service
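
    The gauge-data extraction step can be sketched as a plain KVP-encoded SOS 2.0 GetObservation request asking for WaterML 2.0 output, as below. The endpoint, offering and observed property are placeholders for whatever a real service advertises in its capabilities document.

        import requests

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "scheldt_gauge_network",   # hypothetical offering
            "observedProperty": "water_level",     # hypothetical property
            "temporalFilter": ("om:phenomenonTime,"
                               "2015-11-01T00:00:00Z/2015-11-08T00:00:00Z"),
            "responseFormat": "http://www.opengis.net/waterml/2.0",
        }
        r = requests.get("https://example.org/sos", params=params, timeout=60)
        r.raise_for_status()
        print(r.text[:500])  # WaterML 2.0 XML, ready to feed the flood-model WPS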

  10. The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.

    2010-12-01

    Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loosely coupled modeling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) for commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
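
    The OPeNDAP-style access the GDP builds on can be sketched as follows: open a remote gridded dataset served by a THREDDS Data Server and pull only the needed spatiotemporal subset, without downloading whole files. The URL and the variable and coordinate names are placeholders.

        import xarray as xr

        url = "https://example.org/thredds/dodsC/gridded/precip_downscaled.nc"
        ds = xr.open_dataset(url)  # lazy: only metadata is fetched here

        subset = ds["precip"].sel(
            lat=slice(42.0, 44.0),
            lon=slice(-91.0, -88.0),
            time=slice("2000-01-01", "2000-12-31"),
        )
        # Watershed-style areal summary of the kind the GDP automates:
        print(subset.mean(dim=("lat", "lon")).values)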

  11. Extreme Mergers from the Massive Cluster Survey

    NASA Astrophysics Data System (ADS)

    Morris, Roger

    2010-09-01

    We propose to observe two extraordinary, high-redshift galaxy clusters from the Massive Cluster Survey. Both targets are very rare, triple merger systems (one a nearly co-linear merger), and likely lie at the deepest nodes of the cosmic web. Both targets show multiple strong gravitational lensing arcs in the cluster cores. These targets have only very short (10 ks) Chandra observations and are unobserved by XMM-Newton. The X-ray data will be used to probe the mass distribution of hot, baryonic gas, and to reveal the details of the merger physics and the process of cluster assembly. We will also search for hints of X-ray emission from filaments between the merging clumps. Subaru and Hubble Space Telescope imaging data are in hand; we request additional HST coverage for one object.

  12. From chromatogram to analyte to metabolite. How to pick horses for courses from the massive web resources for mass spectral plant metabolomics

    PubMed Central

    Perez de Souza, Leonardo; Naake, Thomas; Tohge, Takayuki; Fernie, Alisdair R

    2017-01-01

    The grand challenge currently facing metabolomics is the expansion of the coverage of the metabolome from a minor percentage of the metabolic complement of the cell toward the level of coverage afforded by other post-genomic technologies such as transcriptomics and proteomics. In plants, this problem is exacerbated by the sheer diversity of chemicals that constitute the metabolome, with the number of metabolites in the plant kingdom generally considered to be in excess of 200 000. In this review, we focus on web resources that can be exploited in order to improve analyte and ultimately metabolite identification and quantification. There is a wide range of available software that not only aids in this but also in the related area of peak alignment; however, for the uninitiated, choosing which program to use is a daunting task. For this reason, we provide an overview of the pros and cons of the software as well as comments regarding the level of programing skills required to effectively exploit their basic functions. In addition, the torrent of available genome and transcriptome sequences that followed the advent of next-generation sequencing has opened up further valuable resources for metabolite identification. All things considered, we posit that only via a continued communal sharing of information such as that deposited in the databases described within the article are we likely to be able to make significant headway toward improving our coverage of the plant metabolome. PMID:28520864

  13. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence.

    PubMed

    Alphy, Anna; Prabakaran, S

    2015-01-01

    Nowadays, to enrich e-business, websites are personalized for each user based on an understanding of their interests and behavior. The main challenges of online usage data are information overload and its dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide dynamic recommendations to users that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of a bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in recommendations.
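
    The evaluation measures named above are standard; a minimal sketch of how they can be computed for one user's recommendation list follows (toy item sets, and the paper's exact definitions, e.g. of coverage, may differ).

        def evaluate(recommended, relevant, catalog):
            hits = recommended & relevant
            precision = len(hits) / len(recommended) if recommended else 0.0
            recall = len(hits) / len(relevant) if relevant else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            coverage = len(recommended) / len(catalog)  # fraction of catalog surfaced
            return {"precision": precision, "recall": recall,
                    "f1": f1, "coverage": coverage}

        catalog = set(range(100))
        print(evaluate(recommended={1, 2, 3, 4, 5},
                       relevant={2, 3, 9},
                       catalog=catalog))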

  14. Capturing citation activity in three health sciences departments: a comparison study of Scopus and Web of Science.

    PubMed

    Sarkozy, Alexandra; Slyman, Alison; Wu, Wendy

    2015-01-01

    Scopus and Web of Science are the two major citation databases that collect and disseminate bibliometric statistics about research articles, journals, institutions, and individual authors. Liaison librarians are now regularly called upon to utilize these databases to assist faculty in finding citation activity on their published works for tenure and promotion, grant applications, and more. But questions about the accuracy, scope, and coverage of these tools deserve closer scrutiny. Discrepancies in citation capture led to a systematic study on how Scopus and Web of Science compared in a real-life situation encountered by liaisons: comparing three different disciplines at a medical school and nursing program. How many articles would each database retrieve for each faculty member using the author-searching tools provided? How many cited references for each faculty member would each tool generate? Results demonstrated troubling differences in publication and citation activity capture between Scopus and Web of Science. Implications for librarians are discussed.

  15. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence

    PubMed Central

    Alphy, Anna; Prabakaran, S.

    2015-01-01

    Nowadays, to enrich e-business, websites are personalized for each user based on an understanding of their interests and behavior. The main challenges of online usage data are information overload and its dynamic nature. In this paper, to address these issues, we propose WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide dynamic recommendations to users that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of a bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in recommendations. PMID:26229978

  16. The iMars WebGIS - Spatio-Temporal Data Queries and Single Image Map Web Services

    NASA Astrophysics Data System (ADS)

    Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Muller, Jan-Peter; van Gasselt, Stephan; Sidiropoulos, Panagiotis; Lanz-Kroechert, Julia

    2017-04-01

    Introduction: Web-based planetary image dissemination platforms usually show outline coverages of the data and offer querying for metadata as well as preview and download, e.g. the HRSC Mapserver (Walter & van Gasselt, 2014). Here we introduce a new approach for a system dedicated to change detection by simultaneous visualisation of single-image time series in a multi-temporal context. While the usual form of presenting multi-orbit datasets is the merge of the data into a larger mosaic, we want to stay with the single image as an important snapshot of the planetary surface at a specific time. In the context of the EU FP-7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the high-precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs. Additionally we make use of the existing bundle-adjusted HRSC single images available at the PDS archives. A prototype demonstrating the presented features is available at http://imars.planet.fu-berlin.de. Multi-temporal database: In order to locate multiple coverage of images and select images based on spatio-temporal queries, we consolidate available coverage catalogs for various NASA imaging missions into a relational database management system with geometry support. We harvest available metadata entries during our processing pipeline using the Integrated Software for Imagers and Spectrometers (ISIS) software. Currently, this database contains image outlines from the MGS/MOC, MRO/CTX and MO/THEMIS instruments with imaging dates ranging from 1996 to the present. For the MEx/HRSC data, we already maintain a database which we automatically update with custom software based on the VICAR environment. Web Map Service with time support: The MapServer software is connected to the database and provides Web Map Services (WMS) with time support based on the START_TIME image attribute. It allows temporal WMS GetMap requests by setting an additional TIME parameter in the request. The values of the parameter represent an interval defined by its lower and upper bounds. As the WMS time standard only supports one time variable, only the start times of the images are considered. If no time values are submitted with the request, the full time range of all images is assumed as the default. Dynamic single-image WMS: To compare images from different acquisition times at sites of multiple coverage, we have to load every image as a single WMS layer. Due to the vast number of single images we need a way to set up the layers dynamically - the map server does not know beforehand which images are to be served. We use the MapScript interface to dynamically access MapServer's objects and configure the file name and path of the requested image in the map configuration. The layers are created on the fly, each representing a single image. On the frontend side, a vendor-specific WMS request parameter (PRODUCTID) has to be appended to the regular set of WMS parameters. The request is then passed on to the MapScript instance. Web Map Tile Cache: In order to speed up WMS request handling, a MapCache instance has been integrated into the pipeline. As it is not aware of the available PDS product IDs which will be queried, the PRODUCTID parameter is configured as an additional dimension of the cache. The WMS request is received by the Apache webserver configured with the MapCache module. If the tile is available in the tile cache, it is immediately committed to the client. If not, the tile request is forwarded to Apache and the MapScript module. The Python script intercepts the WMS request and extracts the product ID from the parameter chain. It loads the layer object from the map file and appends the file name and path of the requested image. After optional further image processing inside the script (stretching, color matching), the request is submitted to the MapServer backend, which in turn delivers the response back to the MapCache instance. Web frontend: We have implemented a web-GIS frontend based on various OpenLayers components. The basemap is a global color-hillshaded HRSC bundle-adjusted DTM mosaic with a resolution of 50 m per pixel. The new bundle-block-adjusted quadrangle mosaics of the MC-11 quadrangle, both image and DTM, are included with opacity slider options. The layer user interface has been adapted from the ol3-layerswitcher and extended by foldable and switchable groups, layer sorting (by resolution, by time and alphabetically) and reordering (drag-and-drop). A collapsible time panel accommodates a time slider interface where the user can filter the visible data by a range of Mars or Earth dates and/or by solar longitudes. The visualisation of time series of single images is controlled by a specific toolbar enabling the workflow of image selection (by point or bounding box), dynamic image loading and playback of single images in a video player-like environment. During a stress-test campaign we demonstrated that the system is capable of serving up to 10 simultaneous users on its current lightweight development hardware. It is planned to relocate the software to more powerful hardware by the time of this conference. Conclusions/Outlook: The iMars webGIS is an expert tool for the detection and visualisation of surface changes. We demonstrate a technique to dynamically retrieve and display single images based on the time-series structure of the data. Together with the multi-temporal database and its MapServer/MapCache backend, it provides a stable and high-performance environment for the dissemination of the various iMars products. Acknowledgements: This research has received funding from the EU's FP7 Programme under iMars 607379 and by the German Space Agency (DLR Bonn), grant 50 QM 1301 (HRSC on Mars Express).
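
    For readers who want to reproduce the dynamic single-image dispatch described above, the sketch below shows the core of such an interception script, assuming the Python MapScript bindings. The map file path, layer name, and product-ID-to-path scheme are hypothetical stand-ins, not the iMars implementation.

    ```python
    import mapscript

    PRODUCT_ROOT = "/data/imars/acro"  # hypothetical image repository

    def handle_wms_request(query_params):
        """Serve one PDS product as a single-image WMS layer (illustrative)."""
        wms_map = mapscript.mapObj("/etc/mapserver/imars.map")  # hypothetical map file
        layer = wms_map.getLayerByName("single_image")          # hypothetical template layer

        # Consume the vendor-specific parameter identifying the product,
        # and point the template layer at the corresponding file.
        product_id = query_params.pop("PRODUCTID")
        layer.data = "%s/%s.tif" % (PRODUCT_ROOT, product_id)

        # Rebuild the OGC request from the remaining standard WMS parameters.
        req = mapscript.OWSRequest()
        for key, value in query_params.items():
            req.setParameter(key, value)

        # Dispatch and capture the rendered tile from MapServer's output buffer.
        mapscript.msIO_installStdoutToBuffer()
        wms_map.OWSDispatch(req)
        return mapscript.msIO_getStdoutBufferBytes()
    ```

    Note that the standard WMS parameters (including TIME) pass through unchanged; only the vendor-specific PRODUCTID is consumed by the script itself.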

  17. Interaction mining and skill-dependent recommendations for multi-objective team composition

    PubMed Central

    Dorn, Christoph; Skopik, Florian; Schall, Daniel; Dustdar, Schahram

    2011-01-01

    Web-based collaboration and virtual environments supported by various Web 2.0 concepts enable the application of numerous monitoring, mining and analysis tools to study human interactions and team formation processes. The composition of an effective team requires a balance between adequate skill fulfillment and sufficient team connectivity. The underlying interaction structure reflects social behavior and relations of individuals and determines to a large degree how well people can be expected to collaborate. In this paper we address an extended team formation problem that not only requires direct interactions to determine team connectivity but additionally uses implicit recommendations of collaboration partners to support even sparsely connected networks. We provide two heuristics based on Genetic Algorithms and Simulated Annealing for discovering efficient team configurations that yield the best trade-off between skill coverage and team connectivity. Our self-adjusting mechanism aims to discover the best combination of direct interactions and recommendations when deriving connectivity. We evaluate our approach based on multiple configurations of a simulated collaboration network that closely resembles real-world expert networks. We demonstrate that our algorithm successfully identifies efficient team configurations even when removing up to 40% of experts from various social network configurations. PMID:22298939
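
    To make the skill-coverage/connectivity trade-off concrete, here is a minimal Python sketch of such a fitness function. This is not the authors' formulation; the skill sets, interaction graph, and equal weighting are illustrative. A Genetic Algorithm or Simulated Annealing heuristic would evaluate candidate teams against exactly this kind of objective.

    ```python
    import networkx as nx

    def team_fitness(team, required_skills, member_skills, graph, alpha=0.5):
        """Toy objective trading off skill coverage against team connectivity."""
        # Skill coverage: fraction of required skills held by at least one member.
        covered = set().union(*(member_skills[m] for m in team))
        coverage = len(covered & required_skills) / len(required_skills)
        # Connectivity: edge density of the subgraph induced by the team.
        sub = graph.subgraph(team)
        n = len(team)
        possible = n * (n - 1) / 2
        connectivity = sub.number_of_edges() / possible if possible else 0.0
        return alpha * coverage + (1 - alpha) * connectivity

    # Illustrative data: a GA/SA search would score many candidate teams this way.
    g = nx.Graph([("ann", "bob"), ("bob", "cit"), ("ann", "cit"), ("cit", "dan")])
    skills = {"ann": {"db"}, "bob": {"ui"}, "cit": {"db", "ml"}, "dan": {"ml"}}
    print(team_fitness(("ann", "bob", "cit"), {"db", "ui", "ml"}, skills, g))
    ```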

  18. Metabolite profiling of a NIST Standard Reference Material for human plasma (SRM 1950): GC-MS, LC-MS, NMR, and clinical laboratory analyses, libraries, and web-based resources.

    PubMed

    Simón-Manso, Yamil; Lowenthal, Mark S; Kilpatrick, Lisa E; Sampson, Maureen L; Telu, Kelly H; Rudnick, Paul A; Mallard, W Gary; Bearden, Daniel W; Schock, Tracey B; Tchekhovskoi, Dmitrii V; Blonder, Niksa; Yan, Xinjian; Liang, Yuxue; Zheng, Yufang; Wallace, William E; Neta, Pedatsur; Phinney, Karen W; Remaley, Alan T; Stein, Stephen E

    2013-12-17

    Recent progress in metabolomics and the development of increasingly sensitive analytical techniques have renewed interest in global profiling, i.e., semiquantitative monitoring of all chemical constituents of biological fluids. In this work, we have performed global profiling of NIST SRM 1950, "Metabolites in Human Plasma", using GC-MS, LC-MS, and NMR. Metabolome coverage, difficulties, and reproducibility of the experiments on each platform are discussed. A total of 353 metabolites have been identified in this material. GC-MS provides 65 unique identifications, and most of the identifications from NMR overlap with the LC-MS identifications, except for some small sugars that are not directly found by LC-MS. Also, repeatability and intermediate precision analyses show that the SRM 1950 profiling is reproducible enough to consider this material as a good choice to distinguish between analytical and biological variability. Clinical laboratory data show that most results are within the reference ranges for each assay. In-house computational tools have been developed or modified for MS data processing and interactive web display. All data and programs are freely available online at http://peptide.nist.gov/ and http://srmd.nist.gov/.

  19. ATLAS, CMS and new challenges for public communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Lucas; Barney, David; Goldfarb, Steven

    On 30 March 2010 the first high-energy collisions brought the LHC experiments into the era of research and discovery. Millions of viewers worldwide tuned in to the webcasts and followed the news via Web 2.0 tools, such as blogs, Twitter, and Facebook, with 205,000 unique visitors to CERN's Web site. Media coverage at the experiments and in institutes all over the world yielded more than 2,200 news items including 800 TV broadcasts. We describe the new multimedia communications challenges, due to the massive public interest in the LHC programme, and the corresponding responses of the ATLAS and CMS experiments, in the areas of Web 2.0 tools, multimedia, webcasting, videoconferencing, and collaborative tools. We discuss the strategic convergence of the two experiments' communications services, information systems and public database of outreach material.

  20. ATLAS, CMS and New Challenges for Public Communication

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas; Barney, David; Goldfarb, Steven

    2011-12-01

    On 30 March 2010 the first high-energy collisions brought the LHC experiments into the era of research and discovery. Millions of viewers worldwide tuned in to the webcasts and followed the news via Web 2.0 tools, such as blogs, Twitter, and Facebook, with 205,000 unique visitors to CERN's Web site. Media coverage at the experiments and in institutes all over the world yielded more than 2,200 news items including 800 TV broadcasts. We describe the new multimedia communications challenges, due to the massive public interest in the LHC programme, and the corresponding responses of the ATLAS and CMS experiments, in the areas of Web 2.0 tools, multimedia, webcasting, videoconferencing, and collaborative tools. We discuss the strategic convergence of the two experiments' communications services, information systems and public database of outreach material.

  1. School and Community in Shock as Newspaper Tries to Honor Three Who Die in Car Wreck.

    ERIC Educational Resources Information Center

    Zwiebel, Kathleen

    1999-01-01

    Describes how the staffs of the yearbook, newspaper, online Web page, and literary magazine (at the Pottsville Area High School in Pennsylvania) worked together on a special insert to cover the deaths of three students in an auto accident. Appends an exercise on deciding coverage of automobile accidents. (RS)

  2. 29 CFR 2590.715-1251 - Preservation of right to maintain existing coverage.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... paragraphs (f) and (g)(2) of this section, if a group health plan (including a group health plan that was... grandfathered health plan. (2) Disclosure of grandfather status—(i) To maintain status as a grandfathered health... Web site has a table summarizing which protections do and do not apply to grandfathered health plans...

  3. 45 CFR 147.140 - Preservation of right to maintain existing coverage.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... Subject to paragraphs (f) and (g)(2) of this section, if a group health plan (including a group health... to be a grandfathered health plan. (2) Disclosure of grandfather status—(i) To maintain status as a... Labor at 1-866-444-3272 or www.dol.gov/ebsa/healthreform. This Web site has a table summarizing which...

  4. 76 FR 53475 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    .... Your comments should focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web... disease. Background information about this topic, including panel materials, is available at http://www...

  5. 75 FR 73094 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ... available. Your comments should focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web... (CKD) patients (pre- dialysis and dialysis). Background information about this topic, including panel...

  6. 77 FR 53204 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    .... Your comments should focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following web... management of heart failure. Background information about this topic, including panel materials, is available...

  7. 77 FR 64997 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web site prior to the... neurodegenerative disease. Background information about this topic, including panel materials, is available at http...

  8. 75 FR 8980 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... available. Your comments should focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web... prostate cancer. Background information about this topic, including panel materials, is available at http...

  9. ACMES: fast multiple-genome searches for short repeat sequences with concurrent cross-species information retrieval

    PubMed Central

    Reneker, Jeff; Shyu, Chi-Ren; Zeng, Peiyu; Polacco, Joseph C.; Gassmann, Walter

    2004-01-01

    We have developed a web server for the life sciences community to use to search for short repeats of DNA sequence of length between 3 and 10 000 bases within multiple species. This search employs a unique and fast hash function approach. Our system also applies information retrieval algorithms to discover knowledge of cross-species conservation of repeat sequences. Furthermore, we have incorporated a part of the Gene Ontology database into our information retrieval algorithms to broaden the coverage of the search. Our web server and tutorial can be found at http://acmes.rnet.missouri.edu. PMID:15215469
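
    The paper's specific hash function is not reproduced here, but the underlying idea of position-hashing k-mers to expose repeats can be sketched in a few lines of Python (illustrative only):

    ```python
    from collections import defaultdict

    def repeated_kmers(sequence, k):
        """Index every k-mer by position; any bucket with 2+ hits is a repeat.

        A generic position-hash sketch; ACMES's actual hash function and its
        handling of repeat lengths from 3 to 10,000 bases are not reproduced here.
        """
        positions = defaultdict(list)
        for i in range(len(sequence) - k + 1):
            positions[sequence[i:i + k]].append(i)
        return {kmer: pos for kmer, pos in positions.items() if len(pos) > 1}

    print(repeated_kmers("GATTACAGATTACA", 7))  # {'GATTACA': [0, 7]}
    ```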

  10. 2007 costs and coverage of antiretrovirals under Medicare Part D for people with HIV/AIDS living in North Carolina.

    PubMed

    Sengupta, Sohini

    2008-01-01

    Effective January 1, 2006, Medicare Part D became a new source of prescription drug coverage for people with HIV/AIDS in the United States. The implementation of Part D has affected access to antiretrovirals for people with HIV/AIDS. In North Carolina, access can be difficult because of the state's struggling safety net programs and the growing HIV-infected populations among Blacks and in poor rural counties. This analysis examines Medicare Part D antiretroviral coverage in 2007 for beneficiaries with HIV/AIDS in North Carolina, particularly those who did not qualify as dual eligibles or for a full low-income subsidy. Data describing program coverage were obtained from the Web site www.medicare.gov, and descriptive analyses were performed to assess changes in antiretroviral coverage in Part D prescription drug plans in North Carolina. Most of the 26 antiretrovirals are covered in some way by 76 North Carolina prescription drug plans. There may, however, be variability in coverage associated with (a) antiretroviral classification within formularies; (b) drug premiums; (c) whether premiums can be waived; (d) annual deductibles; and (e) whether coverage is provided in the "doughnut hole." The data may not reflect actual patterns of drug use and realized access to the drugs. The findings are limited to antiretroviral coverage in North Carolina's Part D offerings but could be generalized to other states with similar prescription drug plan costs and coverage. These concerns continue to pose significant challenges to accessing antiretrovirals for Part D beneficiaries with HIV/AIDS in North Carolina. Variability demonstrated within prescription drug plans will continue, and beneficiaries with HIV/AIDS who do not qualify as dual eligibles or for low-income subsidies will need to evaluate these issues when selecting a prescription drug plan in future enrollment periods.

  11. Land User and Land Cover Maps of Europe: a Webgis Platform

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.

    2016-06-01

    This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built completely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate the comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality is provided in order to have a visual assessment of the available land coverages based on other user-generated contents available on the Internet. It is supposed to be a first step towards a calibration/validation service that will be made available in the future.

  12. Analyzing CRISM hyperspectral imagery using PlanetServer.

    NASA Astrophysics Data System (ADS)

    Figuera, Ramiro Marco; Pham Huu, Bang; Minin, Mikhail; Flahaut, Jessica; Halder, Anik; Rossi, Angelo Pio

    2017-04-01

    Mineral characterization of planetary surfaces bears great importance for space exploration. In order to perform it, orbital hyperspectral imagery is widely used. In our research we use Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) [1] TRDR L observations with a spectral range of 1 to 4 µm. PlanetServer comprises a server, a web client and a Python client/API. The server side uses the Array DataBase Management System (DBMS) Raster Data Manager (Rasdaman) Community Edition [2]. OGC standards such as the Web Coverage Processing Service (WCPS) [3], an SQL-like language capable of querying information along the image cube, are implemented in the PetaScope component [4]. The client side uses NASA's Web World Wind [5] allowing the user to access the data in an intuitive way. The client consists of a globe where all cubes are deployed, a main menu where projections, base maps and RGB combinations are provided, and a plot dock where the spectral information is shown. The RGB combinator tool allows band combinations, such as the CRISM summary products [6], using WCPS. The spectral information is retrieved using WCPS and shown in the plot dock/widget. The USGS splib06a library [7] is available to compare CRISM vs. laboratory spectra. The Python API provides an environment to create RGB combinations that can be embedded into existing pipelines. All employed libraries and tools are open source and can be easily adapted to other datasets. PlanetServer stands as a promising tool for spectral analysis on planetary bodies. M3/Moon and OMEGA datasets will soon be available. [1] S. Murchie et al., "Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on Mars Reconnaissance Orbiter (MRO)," J. Geophys. Res. E Planets, 2007. [2] P. Baumann, A. Dehmel, P. Furtado, R. Ritsch, and N. Widmann, "The multidimensional database system RasDaMan," ACM SIGMOD Rec., vol. 27, no. 2, pp. 575-577, Jun. 1998. [3] P. Baumann, "The OGC web coverage processing service (WCPS) standard," Geoinformatica, vol. 14, no. 4, Jul. 2010. [4] A. Aiordǎchioaie and P. Baumann, "PetaScope: An open-source implementation of the OGC WCS Geo service standards suite," Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 6187 LNCS, pp. 160-168, Jun. 2010. [5] P. Hogan, C. Maxwell, R. Kim, and T. Gaskins, "World Wind 3D Earth Viewing," Apr. 2007. [6] C. E. Viviano-Beck et al., "Revised CRISM spectral parameters and summary products based on the currently detected mineral diversity on Mars," J. Geophys. Res. E Planets, vol. 119, no. 6, pp. 1403-1431, Jun. 2014. [7] R. N. Clark et al., "USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231," 2007. [Online]. Available: http://speclab.cr.usgs.gov/spectral.lib06.
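
    For readers unfamiliar with WCPS, a query of the kind PlanetServer issues to retrieve a spectrum at one pixel can be sketched as follows. The endpoint, coverage name, and axis labels are hypothetical stand-ins; rasdaman's Petascope accepts such queries via the ProcessCoverages request.

    ```python
    import requests

    # Hypothetical Petascope endpoint and CRISM coverage name; the WCPS query
    # extracts a full spectrum at one pixel, as the plot dock does.
    ENDPOINT = "http://planetserver.example/rasdaman/ows"
    query = """
    for c in (frt00003e12_07_if166l_trr3)
    return encode(c[ E(120), N(85) ], "csv")
    """

    resp = requests.post(ENDPOINT, data={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    })
    print(resp.text)  # comma-separated reflectance values along the band axis
    ```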

  13. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users' abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web services standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize the data set prior to download. Google Earth visualizations of the data set are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
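
    As an illustration of how such OGC coverage services are consumed programmatically, here is a minimal sketch using the OWSLib Python client. The endpoint, coverage identifier, and subset parameters are hypothetical, not an actual ORNL DAAC service description.

    ```python
    from owslib.wcs import WebCoverageService

    # Hypothetical endpoint and coverage identifier for a DAAC-style WCS.
    wcs = WebCoverageService("https://daac.example.gov/wcs", version="1.0.0")

    response = wcs.getCoverage(
        identifier="npp_net_primary_productivity",  # hypothetical layer name
        bbox=(-125.0, 24.0, -66.0, 50.0),           # conterminous US subset
        crs="EPSG:4326",
        format="GeoTIFF",
        width=590, height=260,
    )
    with open("npp_subset.tif", "wb") as f:
        f.write(response.read())
    ```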

  14. A Relevancy Algorithm for Curating Earth Science Data Around Phenomenon

    NASA Technical Reports Server (NTRS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-01-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a standalone web service that is utilized to augment search and usability of data in a variety of tools.

  15. A relevancy algorithm for curating earth science data around phenomenon

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-09-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a stand-alone web service that is utilized to augment search and usability of data in a variety of tools.
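
    The paper's ranking algorithm is described in the reference itself; as a toy stand-in, a keyword-overlap scorer over metadata text illustrates the general shape of such a relevancy function. The terms, records, and scoring below are purely illustrative.

    ```python
    def relevancy(metadata_text, phenomenon_terms):
        """Toy relevancy score: normalized frequency of phenomenon keywords.

        A stand-in for the paper's ranking algorithm, not a reproduction of it.
        """
        words = metadata_text.lower().split()
        hits = sum(words.count(term) for term in phenomenon_terms)
        return hits / max(len(words), 1)

    # Hypothetical metadata records curated around the phenomenon "hurricane".
    records = {
        "GPM precipitation L3": "gridded precipitation rain rate from gpm dpr",
        "MODIS land cover":     "annual land cover classification from modis",
    }
    terms = {"precipitation", "rain", "hurricane"}
    ranked = sorted(records, key=lambda r: relevancy(records[r], terms), reverse=True)
    print(ranked)
    ```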

  16. Reconstruction of the first web space in symbrachydactyly using the reverse radial forearm flap.

    PubMed

    Gülgönen, Ayan; Güdemez, Eftal

    2007-02-01

    To present a new approach for the reconstruction of severe first web contractures using a distally based reverse radial forearm flap in symbrachydactyly patients. This study included 6 hands in 5 patients. Subjective evaluation included appearance, parent satisfaction (and patient satisfaction when appropriate), and ability to perform daily activities such as thumb-index grasp and pinch at follow-up evaluations. We measured the angle between the first and second rays using a goniometer at maximum radial abduction, and pinch and grasp strengths were evaluated as an objective assessment. The average follow-up period was 2 years. All parents and patients were happy with the aesthetic appearance. They were completely satisfied in their daily living activities. The average first web angle measurement was 56 degrees. An average improvement of 39 degrees in the web angle was achieved. For the 4 patients with unilateral involvement, the average pinch strength measurement was 80% of the normal contralateral hand and the grip strength was 75% of the normal contralateral hand. The reverse radial forearm flap was found to be a safe and simple method in the reconstruction of severe first web contractures in symbrachydactyly patients. This method provided good coverage of appropriate thickness and skin quality, and supple soft tissue that filled the first web space. Therapeutic IV.

  17. Health Care Coverage Decision Making in Low- and Middle-Income Countries: Experiences from 25 Coverage Schemes.

    PubMed

    Gutierrez, Hialy; Shewade, Ashwini; Dai, Minghan; Mendoza-Arana, Pedro; Gómez-Dantés, Octavio; Jain, Nishant; Khonelidze, Irma; Nabyonga-Orem, Juliet; Saleh, Karima; Teerawattananon, Yot; Nishtar, Sania; Hornberger, John

    2015-08-01

    Lessons learned by countries that have successfully implemented coverage schemes for health services may be valuable for other countries, especially low- and middle-income countries (LMICs), which likewise are seeking to provide/expand coverage. The research team surveyed experts in population health management from LMICs for information on characteristics of health care coverage schemes and factors that influenced decision-making processes. The level of coverage provided by the different schemes varied. Nearly all the health care coverage schemes involved various representatives and stakeholders in their decision-making processes. Maternal and child health, cardiovascular diseases, cancer, and HIV were among the highest priorities guiding coverage development decisions. Evidence used to inform coverage decisions included medical literature, regional and global epidemiology, and coverage policies of other coverage schemes. Funding was the most commonly reported reason for restricting coverage. This exploratory study provides an overview of health care coverage schemes from participating LMICs and contributes to the scarce evidence base on coverage decision making. Sharing knowledge and experiences among LMICs can support efforts to establish systems for accessible, affordable, and equitable health care.

  18. Where Do I Start? A School Library Handbook. Second Edition

    ERIC Educational Resources Information Center

    Linworth, 2012

    2012-01-01

    If you're new to running a library or looking for a refresher, this book can serve as your first reference source for school library operation, providing overview information on a wealth of topics, lists of resources for more in-depth information, and coverage of current topics such as Web 2.0, fundraising, digital booktalks, and cybersafety.…

  19. Ethical Principles Associated with the Publication of Research in ASHA's Scholarly Journals: Importance and Adequacy of Coverage

    ERIC Educational Resources Information Center

    Ingham, Janis C.; Minifie, Fred D.; Horner, Jennifer; Robey, Randall R.; Lansing, Charissa; McCartney, James H.; Slater, Sarah C.; Moss, Sharon E.

    2011-01-01

    Purpose: The purpose of this 2-part study was to determine the importance of specific topics relating to publication ethics and adequacy of the American Speech-Language-Hearing Association's (ASHA's) policies regarding these topics. Method: A 56-item Web-based survey was sent to (a) ASHA journal editors, associate editors, and members of the…

  20. Near real time water quality monitoring of Chivero and Manyame lakes of Zimbabwe

    NASA Astrophysics Data System (ADS)

    Muchini, Ronald; Gumindoga, Webster; Togarepi, Sydney; Pinias Masarira, Tarirai; Dube, Timothy

    2018-05-01

    Zimbabwe's water resources are under pressure from both point and non-point sources of pollution, hence the need for regular and synoptic assessment. In-situ and laboratory based methods of water quality monitoring are point based and do not provide a synoptic coverage of the lakes. This paper presents novel methods for retrieving water quality parameters in Chivero and Manyame lakes, Zimbabwe, from remotely sensed imagery. Remotely sensed derived water quality parameters are further validated using in-situ data. It also presents an application, developed in VB6, for automated retrieval of those parameters, as well as a web portal for disseminating the water quality information to relevant stakeholders. The web portal is developed using GeoServer, OpenLayers and HTML. Results show the spatial variation of water quality and an automated remote sensing and GIS system with a web front end to disseminate water quality information.

  1. Struct2Net: a web service to predict protein–protein interactions using a structure-based approach

    PubMed Central

    Singh, Rohit; Park, Daniel; Xu, Jinbo; Hosur, Raghavendra; Berger, Bonnie

    2010-01-01

    Struct2Net is a web server for predicting interactions between arbitrary protein pairs using a structure-based approach. Prediction of protein–protein interactions (PPIs) is a central area of interest and successful prediction would provide leads for experiments and drug design; however, the experimental coverage of the PPI interactome remains inadequate. We believe that Struct2Net is the first community-wide resource to provide structure-based PPI predictions that go beyond homology modeling. Also, most web resources for predicting PPIs currently rely on functional genomic data (e.g. GO annotation, gene expression, cellular localization, etc.). Our structure-based approach is independent of such methods and only requires the sequence information of the proteins being queried. The web service allows multiple querying options, aimed at maximizing flexibility. For the most commonly studied organisms (fly, human and yeast), predictions have been pre-computed and can be retrieved almost instantaneously. For proteins from other species, users have the option of getting a quick-but-approximate result (using orthology over pre-computed results) or having a full-blown computation performed. The web service is freely available at http://struct2net.csail.mit.edu. PMID:20513650

  2. Extreme Mergers from the Massive Cluster Survey

    NASA Astrophysics Data System (ADS)

    Morris, R.

    2010-09-01

    We will observe an extraordinary, high-redshift galaxy cluster from the Massive Cluster Survey. The target is a very rare triple-merger system and likely lies at one of the deepest nodes of the cosmic web. The target shows multiple strong gravitational lensing arcs in the cluster core. This target possesses only a very short (10 ks) Chandra observation and is unobserved by XMM-Newton. The X-ray data from this joint Chandra/HST proposal will be used to probe the mass distribution of hot, baryonic gas, and to reveal the details of the merger physics and the process of cluster assembly. We will also search for hints of X-ray emission from filaments between the merging clumps. Subaru and some Hubble Space Telescope imaging data are in hand; we will gather additional HST coverage for a lensing analysis.

  3. Comparison of anticancer drug coverage decisions in the United States and United Kingdom: does the evidence support the rhetoric?

    PubMed

    Mason, Anne; Drummond, Michael; Ramsey, Scott; Campbell, Jonathan; Raisch, Dennis

    2010-07-10

    In contrast to the United States, several European countries have health technology assessment programs for drugs, many of which assess cost effectiveness. Coverage decisions that consider cost effectiveness may lead to restrictions in access. For a purposive sample of five decision-making bodies, we analyzed US and United Kingdom coverage decisions on all anticancer drugs approved by the US Food and Drug Administration (FDA) from 2004 to 2008. Data sources for the timing and outcome of licensing and coverage decisions included published and unpublished documentation, Web sites, and personal communication. The FDA approved 59 anticancer drugs over the study period, of which 46 were also approved by the European Medicines Agency. In the United States, 100% of drugs were covered, mostly without restriction. However, the United Kingdom bodies made positive coverage decisions for less than half of licensed drugs (National Institute for Health and Clinical Excellence [NICE]: 39%; Scottish Medicines Consortium [SMC]: 43%). Whereas the Centers for Medicare and Medicaid Services (CMS) and the Department of Veterans Affairs (VA) covered all 59 drugs from the FDA license date, delays were evident for some Regence Group decisions that were informed by cost effectiveness (median, 0 days; semi-interquartile range [SIQR], 122 days; n = 22). Relative to the European Medicines Agency license date, median time to coverage was 783 days (SIQR, 170 days) for NICE and 231 days (SIQR, 129 days) for the SMC. Anticancer drug coverage decisions that consider cost effectiveness are associated with greater restrictions and slower time to coverage. However, this approach may represent an explicit alternative to rationing achieved through the use of patient copayments.

  4. Patterns of internationalization and criteria for research assessment in the social sciences and humanities.

    PubMed

    Sivertsen, Gunnar

    This article investigates the developments during the last decades in the use of languages, publication types and publication channels in the social sciences and humanities (SSH). The purpose is to develop an understanding of the processes of internationalization and to apply this understanding in a critical examination of two often used general criteria in research evaluations in the SSH. One of them is that the coverage of a publication in Scopus or Web of Science is seen in itself as an expression of research quality and of internationalization. The other is that a specific international language, English, and a specific type of publication, journal articles, are perceived as supreme in a general hierarchy of languages and publication types. Simple distinctions based on these criteria are contrary to the heterogeneous publication patterns needed in the SSH to organize their research adequately, present their results properly, reach their audiences efficiently, and thereby fulfil their missions. Research quality, internationalization, and societal relevance can be promoted in research assessment in the SSH without categorical hierarchies of publications. I will demonstrate this by using data from scholarly publishing in the SSH that go beyond the coverage in the commercial data sources in order to give a more comprehensive representation of scholarly publishing in the SSH.

  5. Using health-facility data to assess subnational coverage of maternal and child health indicators, Kenya.

    PubMed

    Maina, Isabella; Wanjala, Pepela; Soti, David; Kipruto, Hillary; Droti, Benson; Boerma, Ties

    2017-10-01

    To develop a systematic approach to obtain the best possible national and subnational statistics for maternal and child health coverage indicators from routine health-facility data. Our approach aimed to obtain improved numerators and denominators for calculating coverage at the subnational level from health-facility data. This involved assessing data quality and determining adjustment factors for incomplete reporting by facilities, then estimating local target populations based on interventions with near-universal coverage (first antenatal visit and first dose of pentavalent vaccine). We applied the method to Kenya at the county level, where routine electronic reporting by facilities is in place via the district health information software system. Reporting completeness for facility data was well above 80% in all 47 counties and the consistency of data over time was good. Coverage of the first dose of pentavalent vaccine, adjusted for facility reporting completeness, was used to obtain estimates of the county target populations for maternal and child health indicators. The county and national statistics for the four-year period 2012/13 to 2015/16 showed good consistency with results of the 2014 Kenya demographic and health survey. Our results indicated a stagnation of immunization coverage in almost all counties, a rapid increase of facility-based deliveries and caesarean sections and limited progress in antenatal care coverage. While surveys will continue to be necessary to provide population-based data, web-based information systems for health facility reporting provide an opportunity for more frequent, local monitoring of progress in maternal and child health.
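
    The numerator/denominator adjustment can be illustrated with a small worked example in Python. All figures, and the assumed 95% coverage of the near-universal intervention, are illustrative stand-ins, not Kenyan data.

    ```python
    def adjust_for_reporting(reported_count, completeness):
        # Inflate facility-reported counts for incomplete reporting
        # (e.g. completeness 0.9 means 90% of facilities reported).
        return reported_count / completeness

    def estimate_target_population(penta1_count, completeness, assumed_coverage=0.95):
        # Derive the local denominator from a near-universal intervention
        # (first pentavalent dose); the 95% assumed coverage is illustrative.
        return adjust_for_reporting(penta1_count, completeness) / assumed_coverage

    # Illustrative county: 18,000 reported first-dose records and 15,200
    # reported facility deliveries, both at 90% reporting completeness.
    target = estimate_target_population(18_000, 0.90)
    delivery_coverage = adjust_for_reporting(15_200, 0.90) / target
    print(f"estimated target population: {target:.0f}")   # ~21053
    print(f"facility delivery coverage:  {delivery_coverage:.1%}")  # ~80.2%
    ```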

  6. From chromatogram to analyte to metabolite. How to pick horses for courses from the massive web resources for mass spectral plant metabolomics.

    PubMed

    Perez de Souza, Leonardo; Naake, Thomas; Tohge, Takayuki; Fernie, Alisdair R

    2017-07-01

    The grand challenge currently facing metabolomics is the expansion of the coverage of the metabolome from a minor percentage of the metabolic complement of the cell toward the level of coverage afforded by other post-genomic technologies such as transcriptomics and proteomics. In plants, this problem is exacerbated by the sheer diversity of chemicals that constitute the metabolome, with the number of metabolites in the plant kingdom generally considered to be in excess of 200 000. In this review, we focus on web resources that can be exploited in order to improve analyte and ultimately metabolite identification and quantification. There is a wide range of available software that not only aids in this but also in the related area of peak alignment; however, for the uninitiated, choosing which program to use is a daunting task. For this reason, we provide an overview of the pros and cons of the software as well as comments regarding the level of programing skills required to effectively exploit their basic functions. In addition, the torrent of available genome and transcriptome sequences that followed the advent of next-generation sequencing has opened up further valuable resources for metabolite identification. All things considered, we posit that only via a continued communal sharing of information such as that deposited in the databases described within the article are we likely to be able to make significant headway toward improving our coverage of the plant metabolome.

  7. Validation and discovery of genotype-phenotype associations in chronic diseases using linked data.

    PubMed

    Pathak, Jyotishman; Kiefer, Richard; Freimuth, Robert; Chute, Christopher

    2012-01-01

    This study investigates federated SPARQL queries over Linked Open Data (LOD) in the Semantic Web to validate existing, and potentially discover new genotype-phenotype associations from public datasets. In particular, we report our preliminary findings for identifying such associations for commonly occurring chronic diseases using the Online Mendelian Inheritance in Man (OMIM) and Database for SNPs (dbSNP) within the LOD knowledgebase and compare them with Gene Wiki for coverage and completeness. Our results indicate that Semantic Web technologies can play an important role for in-silico identification of novel disease-gene-SNP associations, although additional verification is required before such information can be applied and used effectively.
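
    A federated SPARQL query of the kind used in this study can be sketched with the SPARQLWrapper Python library. The endpoints and predicates below are hypothetical placeholders, not the actual OMIM/dbSNP vocabularies; the SERVICE clause is what federates the query across a second endpoint.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Hypothetical LOD endpoint; the SERVICE clause joins disease-gene triples
    # here with SNP triples held at a second, remote endpoint.
    sparql = SPARQLWrapper("https://lod.example.org/sparql")
    sparql.setQuery("""
    SELECT ?gene ?snp WHERE {
      ?disease <http://example.org/associatedGene> ?gene .
      SERVICE <https://snp.example.org/sparql> {
        ?snp <http://example.org/locatedInGene> ?gene .
      }
    }
    LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["gene"]["value"], row["snp"]["value"])
    ```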

  8. VO-compliant libraries of high resolution spectra of cool stars

    NASA Astrophysics Data System (ADS)

    Montes, D.

    2008-10-01

    In this contribution we describe a Virtual Observatory (VO) compliant version of the libraries of high resolution spectra of cool stars described by Montes et al. (1997; 1998; and 1999). Since their publication the fully reduced spectra in FITS format have been available via FTP and on the World Wide Web. However, in the VO all the spectra will be accessible using a common web interface following the standards of the International Virtual Observatory Alliance (IVOA). These libraries include F, G, K and M field stars, from dwarfs to giants. The spectral coverage is from 3800 to 10000 Å, with spectral resolution ranging from 0.09 to 3.0 Å.

  9. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide their mission essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for Javascript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  10. Local Television News Coverage of the Affordable Care Act: Emphasizing Politics Over Consumer Information.

    PubMed

    Gollust, Sarah E; Baum, Laura M; Niederdeppe, Jeff; Barry, Colleen L; Fowler, Erika Franklin

    2017-05-01

    To examine the public health and policy-relevant messages conveyed through local television news during the first stage of Affordable Care Act (ACA) implementation, when about 10 million Americans gained insurance. We conducted a content analysis of 1569 ACA-related local evening television news stories, obtained from sampling local news aired between October 1, 2013, and April 19, 2014. Coders systematically collected data using a coding instrument tracking major messages and information sources cited in the news. Overall, only half of all ACA-related news coverage focused on health insurance products, whereas the remainder discussed political disagreements over the law. Major policy tools of the ACA-the Medicaid expansion and subsidies available-were cited in less than 10% of news stories. Number of enrollees (27%) and Web site glitches (33%) were more common features of coverage. Sources with a political affiliation were by far the most common source of information (> 40%), whereas research was cited in less than 4% of stories. The most common source of news for Americans provided little public health-relevant substance about the ACA during its early implementation, favoring political strategy in coverage.

  11. Patient-oriented cancer information on the internet: a comparison of wikipedia and a professionally maintained database.

    PubMed

    Rajagopalan, Malolan S; Khanna, Vineet K; Leiter, Yaacov; Stott, Meghan; Showalter, Timothy N; Dicker, Adam P; Lawrence, Yaacov R

    2011-09-01

    A wiki is a collaborative Web site, such as Wikipedia, that can be freely edited. Because of a wiki's lack of formal editorial control, we hypothesized that the content would be less complete and accurate than that of a professional peer-reviewed Web site. In this study, the coverage, accuracy, and readability of cancer information on Wikipedia were compared with those of the patient-orientated National Cancer Institute's Physician Data Query (PDQ) comprehensive cancer database. For each of 10 cancer types, medically trained personnel scored PDQ and Wikipedia articles for accuracy and presentation of controversies by using an appraisal form. Reliability was assessed by using interobserver variability and test-retest reproducibility. Readability was calculated from word and sentence length. Evaluators were able to rapidly assess articles (18 minutes/article), with a test-retest reliability of 0.71 and interobserver variability of 0.53. For both Web sites, inaccuracies were rare, less than 2% of information examined. PDQ was significantly more readable than Wikipedia: Flesch-Kincaid grade level 9.6 versus 14.1. There was no difference in depth of coverage between PDQ and Wikipedia (29.9, 34.2, respectively; maximum possible score 72). Controversial aspects of cancer care were relatively poorly discussed in both resources (2.9 and 6.1 for PDQ and Wikipedia, respectively, NS; maximum possible score 18). A planned subanalysis comparing common and uncommon cancers demonstrated no difference. Although the wiki resource had similar accuracy and depth as the professionally edited database, it was significantly less readable. Further research is required to assess how this influences patients' understanding and retention.
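
    The readability measure used here, the Flesch-Kincaid grade level, is a standard formula over word and sentence length: 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. A minimal Python version, with a rough vowel-group syllable heuristic, looks like this:

    ```python
    import re

    def flesch_kincaid_grade(text):
        """Flesch-Kincaid grade level from word and sentence counts.

        Syllables are approximated by counting vowel groups, a common
        rough heuristic; published tools use more careful counters.
        """
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(max(len(re.findall(r"[aeiouyAEIOUY]+", w)), 1) for w in words)
        return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

    # Trivially simple text can legitimately score at or below grade zero.
    print(round(flesch_kincaid_grade("The cat sat on the mat. It was warm."), 1))
    ```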

  12. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014, Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367) tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), the use of WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System Bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour combination products, dynamically generated and accessed also through OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as its core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible on http://planetserver.eu. All of its code base is going to be available on GitHub, at https://github.com/planetserver References: Baumann, P., et al. (2015) Big Data Analytics for Earth Sciences: the EarthServer approach, International Journal of Digital Earth, doi: 10.1080/17538947.2014.1003106. Cantini, F. et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784. Gaddis, L., and T. Hare (2015), Status of tools and data for planetary research, Eos, 96, doi: 10.1029/2015EO041125. Hogan, P., 2011. NASA World Wind: Infrastructure for Spatial Data. Technical report. Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications ACM. Oosthoek, J.H.P, et al. (2013) Advances in Space Research. doi: 10.1016/j.asr.2013.07.002. Rossi, A. P., et al. (2014) PlanetServer/EarthServer: Big Data analytics in Planetary Science. Geophysical Research Abstracts, Vol. 16, #EGU2014-5149.

  13. Principal facts for gravity data collected in South Dakota: a web site for distribution of data

    USGS Publications Warehouse

    Kucks, Robert P.; Zawislak, Ronald L.

    2001-01-01

    Principal facts for 12266 new gravity stations and 2880 stations previously released in paper form (Klasner and Kucks, 1988) for the state of South Dakota are presented. These data were contracted to fill a gap in existing data coverage for the state. Observed and Bouguer anomaly data for this regional compilation are available here in digital form.

  14. Can SPOC (Self-Paced Online Course) Live Long and Prosper? A Comparison Study of a New Species of Online Course Delivery

    ERIC Educational Resources Information Center

    Southard, Sheryne; Meddaugh, Joshua; France-Harris, Antoinette

    2015-01-01

    Numerous formats exist for online course delivery: pure online, blended or hybrid, flipped and web-enhanced. The literature is replete with comparison studies on the efficacy of online, hybrid and traditional format courses. However, the self-paced online course, a relatively new and rare variation, has received very little coverage in the body of…

  15. Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.

    PubMed

    Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S

    2015-05-01

    Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter. Facilitate creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen; and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM.
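
    The shortlist-then-refine paradigm can be sketched with a toy in-memory map in Python. The real services use the full NLM SNOMED CT to ICD-10-CM map plus custom adjudication rules; the mappings and refinement predicate below are illustrative only.

    ```python
    # Toy fragment of a SNOMED CT -> ICD-10-CM candidate map (illustrative).
    SNOMED_TO_ICD10 = {
        "44054006": ["E11.9", "E11.65"],   # type 2 diabetes mellitus
        "38341003": ["I10"],               # hypertensive disorder
    }

    def candidate_codes(snomed_code, refine=None):
        """Return a shortlist of ICD-10-CM candidates, optionally refined."""
        candidates = SNOMED_TO_ICD10.get(snomed_code, [])
        if refine:
            # A refinement rule narrows the shortlist, e.g. by clinical context.
            candidates = [c for c in candidates if refine(c)]
        return candidates

    # Full shortlist, then refine to the more specific (longer) codes.
    print(candidate_codes("44054006"))
    print(candidate_codes("44054006", refine=lambda c: len(c) > 5))
    ```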

  16. Supporting NEESPI with Data Services - The SIB-ESS-C e-Infrastructure

    NASA Astrophysics Data System (ADS)

    Gerlach, R.; Schmullius, C.; Frotscher, K.

    2009-04-01

    Data discovery and retrieval is commonly among the first steps performed for any Earth science study. The way scientific data is searched and accessed has changed significantly over the past two decades. In particular, the development of the World Wide Web and the technologies that evolved along with it shortened the data discovery and data exchange process. On the other hand the amount of data collected and distributed by earth scientists has increased exponentially requiring new concepts for data management and sharing. One such concept to meet the demand is to build up Spatial Data Infrastructures (SDI) or e-Infrastructures. These infrastructures usually contain components for data discovery allowing users (or other systems) to query a catalogue or registry and retrieve metadata information on available data holdings and services. Data access is typically granted using FTP/HTTP protocols or, more advanced, through Web Services. A Service Oriented Architecture (SOA) approach based on standardized services enables users to benefit from interoperability among different systems and to integrate distributed services into their application. The Siberian Earth System Science Cluster (SIB-ESS-C) being established at the University of Jena (Germany) is such a spatial data infrastructure following these principles and implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). The prime objective is to provide researchers focusing on Siberia with the technical means for data discovery, data access, data publication and data analysis. The region of interest covers the entire Asian part of the Russian Federation from the Ural to the Pacific Ocean including the Ob-, Lena- and Yenissey river catchments. The aim of SIB-ESS-C is to provide a comprehensive set of data products for Earth system science in this region. Although SIB-ESS-C will be equipped with processing capabilities for in-house data generation (mainly from Earth Observation), current data holdings of SIB-ESS-C have been created in collaboration with a number of partners in previous and ongoing research projects (e.g. SIBERIA-II, SibFORD, IRIS). At the current development stage the SIB-ESS-C system comprises a federated metadata catalogue accessible through the SIB-ESS-C Web Portal or from any OGC-CSW compliant client. Due to full interoperability with other metadata catalogues, users of the SIB-ESS-C Web Portal are able to search external metadata repositories. The Web Portal contains also a simple visualization component which will be extended to a comprehensive visualization and analysis tool in the near future. All data products are already accessible as a Web Mapping Service and will be made available as Web Feature and Web Coverage Services soon, allowing users to directly incorporate the data into their application. The SIB-ESS-C infrastructure will be further developed as one node in a network of similar systems (e.g. NASA GIOVANNI) in the NEESPI region.
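
    Any OGC CSW-compliant client can query such a federated catalogue. Here is a minimal sketch with the OWSLib Python library, using a hypothetical endpoint and search term:

    ```python
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    # Hypothetical SIB-ESS-C-style catalogue endpoint; any CSW 2.0.2 service
    # exposing ISO metadata can be queried the same way.
    csw = CatalogueServiceWeb("http://sibessc.example/csw")
    csw.getrecords2(
        constraints=[PropertyIsLike("csw:AnyText", "%Yenissey%")],
        maxrecords=10,
    )
    for identifier, record in csw.records.items():
        print(identifier, "-", record.title)
    ```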

  17. Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project

    NASA Technical Reports Server (NTRS)

    Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.

    2006-01-01

NASA requires timely on-demand data and analysis capabilities to enable practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three-year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, Sensor Planning Services (SPS), Sensor Observation Services (SOS), Sensor Alert Services (SAS), and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and autonomous sensor tasking for unique data collection.
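    A sketch of a Sensor Observation Service (SOS) GetObservation call using plain OGC key-value-pair parameters, to illustrate the SWE service pattern named above; the endpoint, offering and observed property are hypothetical placeholders.

    ```python
    # Minimal SOS 1.0.0 GetObservation request via HTTP KVP parameters.
    import requests

    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": "ATMOSPHERIC_TEMPERATURE",
        "observedProperty": "urn:ogc:def:phenomenon:OGC:temperature",
        "responseFormat": "text/xml;subtype=\"om/1.0.0\"",
        "eventTime": "2006-01-01T00:00:00Z/2006-01-02T00:00:00Z",
    }
    resp = requests.get("https://example.org/smart-odm/sos", params=params)
    resp.raise_for_status()
    print(resp.text[:500])  # start of the O&M observation document (XML)
    ```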

  18. NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF

    PubMed Central

    Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E.; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P.; Ascoli, Giorgio A.

    2009-01-01

Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications. The integration of NeuroMorpho.Org with version 1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level. PMID:18949582

  19. NeuroMorpho.Org implementation of digital neuroscience: dense coverage and integration with the NIF.

    PubMed

    Halavi, Maryam; Polavaram, Sridevi; Donohue, Duncan E; Hamilton, Gail; Hoyt, Jeffrey; Smith, Kenneth P; Ascoli, Giorgio A

    2008-09-01

Neuronal morphology affects network connectivity, plasticity, and information processing. Uncovering the design principles and functional consequences of dendritic and axonal shape necessitates quantitative analysis and computational modeling of detailed experimental data. Digital reconstructions provide the required neuromorphological descriptions in a parsimonious, comprehensive, and reliable numerical format. NeuroMorpho.Org is the largest web-accessible repository service for digitally reconstructed neurons and one of the integrated resources in the Neuroscience Information Framework (NIF). Here we describe the NeuroMorpho.Org approach as an exemplary experience in designing, creating, populating, and curating a neuroscience digital resource. The simple three-tier architecture of NeuroMorpho.Org (web client, web server, and relational database) encompasses all necessary elements to support a large-scale, integrate-able repository. The data content, while heterogeneous in scientific scope and experimental origin, is unified in format and presentation by an in-house standardization protocol. The server application (MRALD) is secure, customizable, and developer-friendly. Centralized processing and expert annotation yield a comprehensive set of metadata that enriches and complements the raw data. The thoroughly tested interface design allows for optimal and effective data search and retrieval. Availability of data in both original and standardized formats ensures compatibility with existing resources and fosters further tool development. Other key functions enable extensive exploration and discovery, including 3D and interactive visualization of branching, frequently measured morphometrics, and reciprocal links to the original PubMed publications. The integration of NeuroMorpho.Org with version 1 of the NIF (NIFv1) provides the opportunity to access morphological data in the context of other relevant resources and diverse subdomains of neuroscience, opening exciting new possibilities in data mining and knowledge discovery. The outcome of such coordination is the rapid and powerful advancement of neuroscience research at both the conceptual and technological level.

  20. Decision support methodology to establish priorities on the inspection of structures

    NASA Astrophysics Data System (ADS)

    Cortes, V. Juliette; Sterlacchini, Simone; Bogaard, Thom; Frigerio, Simone; Schenato, Luca; Pasuto, Alessandro

    2014-05-01

For hydro-meteorological hazards in mountain areas, the regular inspection of check dams and bridges is important because their functional status affects water-sediment processes. Inspecting these structures is time-consuming for organizations, however, owing to the large number of such structures in many regions. Trained citizen-volunteers can support civil protection and technical services by increasing the frequency, timeliness and coverage of monitoring of the functional status of hydraulic structures. Technicians can then evaluate and validate these reports to obtain an index of the structure's status, so that preventive actions, such as the cleaning of obstructions, can be initiated, or potential problems can be pre-screened for a second-level inspection. This study proposes a decision support methodology that technicians can use to assess an index for three parameters representing the functional status of the structure: a) the condition of the structure at the opening of the stream flow, b) the level of obstruction at the structure, and c) the level of erosion in the stream bank. The calculation of the index for each parameter is based on fuzzy logic theory, which handles the varying precision of the reports and converts the linguistic rating scales into numbers representing the structure's status. A weighting method (the Analytic Hierarchy Process, AHP) and a multi-criteria method (TOPSIS) can be used by technicians to combine the different ratings according to the component elements of the structure and the completeness of the reports. Finally, technicians can set decision rules based on the worst rating and a threshold for the functional indexes. The methodology was implemented as a prototype web-based tool to be tested with technicians of the Civil Protection in the Fella basin, Northern Italy. Results at this stage comprise the design and implementation of the web-based tool with GIS interaction to evaluate available reports and to set priorities on the inspection of structures. Keywords: decision-making, multi-criteria methods, torrent control structures, web-based tools.
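    A compact TOPSIS sketch for combining the three functional-status ratings into an inspection priority; the scores and weights below are invented illustrations of the method, not values from the study.

    ```python
    # TOPSIS: rank structures by closeness to the ideal condition.
    import numpy as np

    # Rows: structures; columns: opening condition, obstruction level,
    # bank erosion (all scaled to [0, 1], higher = worse).
    scores = np.array([
        [0.2, 0.7, 0.4],   # check dam A
        [0.8, 0.3, 0.6],   # bridge B
        [0.5, 0.5, 0.5],   # check dam C
    ])
    weights = np.array([0.5, 0.3, 0.2])  # e.g. derived with AHP

    # Vector-normalize each criterion column, then apply the weights.
    v = scores / np.linalg.norm(scores, axis=0) * weights

    # All criteria are "costs" here, so the ideal is the column minimum.
    ideal, anti_ideal = v.min(axis=0), v.max(axis=0)
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti_ideal, axis=1)
    closeness = d_neg / (d_pos + d_neg)  # 1 = best condition

    # Lower closeness = worse condition = higher inspection priority.
    print("inspection priority (worst first):", np.argsort(closeness))
    ```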

  1. Sustaining enrollment in health insurance for vulnerable populations: lessons from Massachusetts.

    PubMed

    Capoccia, Victor; Croze, Colette; Cohen, Martin; O'Brien, John P

    2013-04-01

Since 2008 Massachusetts has had universal health insurance with an individual mandate. As a result, only about 3% of the population is uninsured. However, patients who use behavioral health services are uninsured at much higher rates. This 2011 study sought to understand the reasons for the discrepancy and to identify approaches to reduce disenrollment and sustain coverage. The qualitative study was based on structured interviews and focus groups. Structured interviews were conducted with 15 policy makers, consumer advocates, and chief executive officers of provider organizations, and three focus groups were held with 33 patient volunteers. The interviews and focus groups identified several points at which disenrollment can occur, all of which contribute to "churn" (the process by which disenrolled persons who remain eligible are reenrolled in the same or a different plan): missing and incomplete documentation; acute and chronic conditions and long-term disabilities that interfere with a patient's ability to respond to program communications; and lack of awareness among beneficiaries of the consequences of changes that trigger termination and of the need to transfer to another program. Although safeguards are built into the system to avoid some disenrollments, the policies and procedures that drive the system rest on a default assumption of ineligibility or disenrollment until the individual establishes eligibility and completes requirements. Practices that can sustain enrollment include real-time, Web-based, prepopulated enrollment and redetermination processes; redetermination flexibility for designated chronic illnesses; and standardized performance metrics for churn and its associated costs. Changes in the information system infrastructure and in outreach, enrollment, disenrollment, and reenrollment procedures can improve continuity and retention of health insurance coverage.

  2. Social influence of a religious hero: the late Cardinal Stephen Kim Sou-hwan's effect on cornea donation and volunteerism.

    PubMed

    Bae, Hyuhn-Suhck; Brown, William J; Kang, Seok

    2011-01-01

This study examined the mediated influence of a celebrated religious hero in South Korea, Cardinal Stephen Kim, through two forms of involvement (parasocial interaction and identification) on intentions toward cornea donation and volunteerism, and it investigated how news of his death diffused through the media. A structural equation modeling analysis of a Web-based voluntary survey of more than 1,200 people in South Korea revealed a multistep social influence process, beginning with parasocial interaction with Cardinal Kim and leading to identification with him, which predicted intentions toward cornea donation and volunteerism. Additional investigations found that news of Cardinal Kim's death diffused rapidly through media and interpersonal communication. The results demonstrate that religious leaders who achieve celebrity hero status can prompt public discussion of important issues quite quickly through extensive media coverage, enabling them to promote prosocial behavior and positively affect public health.

  3. Mobile cloud-computing-based healthcare service by noncontact ECG monitoring.

    PubMed

    Fong, Ee-May; Chung, Wan-Young

    2013-12-02

The noncontact electrocardiogram (ECG) measurement technique has gained popularity in recent years owing to its noninvasive nature and its convenience for daily use. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. The healthcare service continuously collects biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features, such as health status summaries, medication QR code scanning, and reminders, are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (a Web server system and its dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. A Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service.
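    A minimal sketch of how a mobile monitoring terminal might push ECG samples to a healthcare cloud endpoint over HTTP; the URL, payload schema and lack of authentication are illustrative assumptions, not the system described above.

    ```python
    # Push one batch of ECG samples to a (hypothetical) cloud endpoint.
    import time
    import requests

    payload = {
        "patient_id": "demo-001",
        "timestamp": time.time(),
        "sampling_rate_hz": 250,
        "ecg_samples": [0.12, 0.15, 0.11, 0.60, 1.10, 0.40],  # mV
    }
    resp = requests.post(
        "https://example.org/healthcare-cloud/api/ecg",
        json=payload,
        timeout=5,
    )
    resp.raise_for_status()
    print("server ack:", resp.status_code)
    ```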

  4. Mobile Cloud-Computing-Based Healthcare Service by Noncontact ECG Monitoring

    PubMed Central

    Fong, Ee-May; Chung, Wan-Young

    2013-01-01

The noncontact electrocardiogram (ECG) measurement technique has gained popularity in recent years owing to its noninvasive nature and its convenience for daily use. This paper presents mobile cloud computing for a healthcare system in which a noncontact ECG measurement method is employed to capture biomedical signals from users. The healthcare service continuously collects biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features, such as health status summaries, medication QR code scanning, and reminders, are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (a Web server system and its dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. A Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service. PMID:24316562

  5. Telestroke 10 years later--'telestroke 2.0'.

    PubMed

    Switzer, Jeffrey A; Levine, Steven R; Hess, David C

    2009-01-01

The lack of physicians with specialty stroke training represents a significant challenge to the future of stroke care. This deficit limits both quality stroke care and clinical research initiatives. The use of telemedicine for stroke ('telestroke') has been an attempt to overcome this shortage and extend stroke expertise to locations that lack coverage. However, the initial telestroke systems required a point-to-point connection for transmission and only provided videoconferencing, which limited their generalizability and usefulness. 'Telestroke 2.0' is the authors' vision of an integrative web-based telestroke system combining high-quality audio-video transmission; the ability to carry out consults and teleradiology from any desktop or laptop computer with web access; decision and technical support; creation of billable physician documentation; and electronic medical record connectivity. These features will facilitate the development of statewide and regional telestroke call networks, with an opportunity for physician supply companies to fill in coverage gaps. In addition, telestroke 2.0 may improve acute stroke research by increasing trial efficiency via the addition of non-academic recruitment sites, enhancing trial validity by centralizing neurologic examinations via recorded encounters, and generalizing clinical trial results to community hospital settings. Greater diffusion and long-term sustainability of telestroke systems will depend upon improvements in patient and hospital reimbursement for acute stroke and telestroke care. Copyright 2009 S. Karger AG, Basel.

  6. Feasibility of using global system for mobile communication (GSM)-based tracking for vaccinators to improve oral poliomyelitis vaccine campaign coverage in rural Pakistan.

    PubMed

    Chandir, Subhash; Dharma, Vijay Kumar; Siddiqi, Danya Arif; Khan, Aamir Javed

    2017-09-05

Despite multiple rounds of immunization campaigns, it has not been possible to achieve optimal immunization coverage for poliovirus in Pakistan. Supplementary activities to improve immunization coverage, such as door-to-door campaigns, are constrained by several factors, including inaccurate hand-drawn maps and a lack of means to objectively monitor field teams in real time, resulting in suboptimal vaccine coverage during campaigns. Global System for Mobile Communications (GSM)-based tracking of vaccinators' mobile subscriber identity modules (SIMs) provides a low-cost solution to identify missed areas and ensure effective immunization coverage. We conducted a pilot study to investigate the feasibility of using GSM technology to track vaccinators, observing indicators including acceptability, ease of implementation, cost and scalability, as well as the likelihood of ownership by District Health Officials. The real-time location of the field teams was displayed on a GSM tracking web dashboard accessible to supervisors and managers for effective monitoring of workforce attendance, including 'time in/time out', and for discerning whether all target areas, specifically remote and high-risk locations, had been reached. Direct access to this information by supervisors eliminated the possibility of data fudging and inaccurate reporting by workers regarding their mobility. The tracking cost per vaccinator was USD 0.26/month. Our study shows that GSM-based tracking is potentially a cost-efficient approach, results in better monitoring and accountability, is scalable, and offers the potential for improved geographic coverage of health services. Copyright © 2017 Elsevier Ltd. All rights reserved.
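    A sketch of how location fixes for a vaccination team could be checked against a target-area boundary to estimate geographic coverage, in the spirit of the dashboard described above; the polygon and track points are invented for illustration.

    ```python
    # Count how many of a vaccinator's GPS fixes fall inside the
    # assigned target area (toy geometry, lon/lat coordinates).
    from shapely.geometry import Point, Polygon

    # Hypothetical target-area boundary.
    target_area = Polygon([(67.00, 25.00), (67.05, 25.00),
                           (67.05, 25.05), (67.00, 25.05)])

    # Location fixes reported for one vaccinator during a campaign day.
    track = [Point(67.01, 25.01), Point(67.02, 25.02),
             Point(66.99, 25.01), Point(67.04, 25.04)]

    inside = sum(1 for p in track if target_area.contains(p))
    print(f"{inside}/{len(track)} fixes inside the assigned area")
    ```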

  7. Staff Acceptance of Tele-ICU Coverage

    PubMed Central

    Chan, Paul S.; Cram, Peter

    2011-01-01

Background: Remote coverage of ICUs is increasing, but staff acceptance of this new technology is incompletely characterized. We conducted a systematic review to summarize existing research on acceptance of tele-ICU coverage among ICU staff. Methods: We searched for published articles pertaining to critical care telemedicine systems (also known as tele-ICU) between January 1950 and March 2010 using PubMed, Cumulative Index to Nursing and Allied Health Literature, Global Health, Web of Science, and the Cochrane Library, as well as abstracts and presentations delivered at national conferences. Studies were included if they provided original qualitative or quantitative data on staff perceptions of tele-ICU coverage. Studies were imported into content analysis software and coded by tele-ICU configuration, methodology, participants, and findings (e.g., positive and negative staff evaluations). Results: Review of 3,086 citations yielded 23 eligible studies. Findings were grouped into four categories of staff evaluation: overall acceptance level of tele-ICU coverage (measured in 70% of studies), impact on patient care (measured in 96%), impact on staff (measured in 100%), and organizational impact (measured in 48%). Overall acceptance was high, despite initial ambivalence. A favorable impact on patient care was perceived by > 82% of participants. Staff impact referenced enhanced collaboration, autonomy, and training, although scrutiny, malfunctions, and contradictory advice were cited as potential barriers. Staff perceived the organizational impact to vary. An important limitation was the lack of rigorous methodology and validated survey instruments in many of the available studies. Conclusions: Initial reports suggest high levels of staff acceptance of tele-ICU coverage, but more methodologically rigorous study is required. PMID:21051386

  8. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

Geoscience research and applications often involve a geospatial processing workflow: a sequence of operations that uses a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial content and capabilities are increasingly becoming available online as interoperable Web services, which significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures that support asynchronous behavior. A sample geospatial processing workflow from the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) testbed is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
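    A sketch of the asynchronous interaction pattern the paper discusses, in the form of a WPS 1.0.0 client that requests stored execution and then polls the status document instead of blocking; the endpoint, process name and the deliberately naive XML parsing are placeholder assumptions.

    ```python
    # Asynchronous WPS pattern: Execute with storeExecuteResponse=true,
    # then poll the statusLocation URL until the process finishes.
    import time
    import requests

    execute_url = (
        "https://example.org/wps?service=WPS&version=1.0.0&request=Execute"
        "&identifier=Buffer&storeExecuteResponse=true&status=true"
        "&DataInputs=distance=10"
    )
    status_doc = requests.get(execute_url).text

    # The ExecuteResponse carries a statusLocation attribute to poll
    # (string-split parsing kept deliberately naive for the sketch).
    status_url = status_doc.split('statusLocation="')[1].split('"')[0]

    while True:
        doc = requests.get(status_url).text
        if "ProcessSucceeded" in doc or "ProcessFailed" in doc:
            break
        time.sleep(5)  # a real client resumes other work between polls
    print("workflow step finished")
    ```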

  9. 77 FR 38033 - Notice of Establishment of a Commodity Import Approval Process Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-26

    ... Process Web Site AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice. SUMMARY: We are announcing the creation of a new Plant Protection and Quarantine Web site that will provide stakeholders with... comment on draft risk assessments. This Web site will make the commodity import approval process more...

  10. World Wide Web Based Image Search Engine Using Text and Image Content Features

    NASA Astrophysics Data System (ADS)

    Luo, Bo; Wang, Xiaogang; Tang, Xiaoou

    2003-01-01

Using both text and image content features, a hybrid image retrieval system for the World Wide Web is developed in this paper. We first use a text-based image meta-search engine to retrieve images from the Web, based on the text information on the image host pages, to provide an initial image set. Because of the high speed and low cost of the text-based approach, we can easily retrieve a broad set of images with a high recall rate and a relatively low precision. An image-content-based ordering is then performed on the initial image set: all the images are clustered into different folders based on image content features. In addition, the images can be re-ranked by the content features according to user feedback. Such a design makes it truly practical to use both text and image content for image retrieval over the Internet. Experimental results confirm the efficiency of the system.
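    A sketch of the content-based ordering step described above: cluster the initially text-retrieved images by their visual feature vectors so that visually similar images land in the same "folder"; the feature vectors are toy data, and k-means stands in for whatever clustering the paper actually uses.

    ```python
    # Group retrieved images into folders by visual-feature similarity.
    import numpy as np
    from sklearn.cluster import KMeans

    # One feature vector per retrieved image (e.g. colour histograms).
    features = np.array([
        [0.9, 0.1], [0.8, 0.2],   # likely one visual theme
        [0.1, 0.9], [0.2, 0.8],   # another theme
    ])
    image_ids = ["img1", "img2", "img3", "img4"]

    labels = KMeans(n_clusters=2, n_init=10,
                    random_state=0).fit_predict(features)
    folders = {}
    for img, lab in zip(image_ids, labels):
        folders.setdefault(int(lab), []).append(img)
    print(folders)  # e.g. {0: ['img1', 'img2'], 1: ['img3', 'img4']}
    ```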

  11. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

In recent decades two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models that require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies may be a valuable approach to enabling advanced ESS applications. Both geo-information and Grid technologies have now reached a high level of maturity, making it possible to build such an integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) Projects, is widely deployed in Europe and beyond, has proved highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Convergence between OWS and gLite technologies would therefore be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, there are obstacles to overcome in achieving this harmonization. First, a semantic mismatch must be addressed: gLite handles low-level (machine-oriented) concepts like "file", "data", "instruments" and "job", while geo-information services handle higher-level (human-oriented) concepts like "coverage", "observation", "measurement" and "model". Second, an architectural mismatch must be addressed: OWS implements a Web service-oriented architecture that is stateless and synchronous, with no embedded security (this is delegated to other specifications), while gLite implements the Grid paradigm in an architecture that is stateful and asynchronous (though not fully event-based), with strong embedded security (based on the VO paradigm).
In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWS. To mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European and international projects have investigated different aspects of this integration, developing demonstrators and proofs of concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European Projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, OGF). In its first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of the other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Proofs of concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG has developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of Web and Grid security models. The presentation will communicate the G-OWS organization, activities, future plans and means of involving the ESSI community. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010.
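    For concreteness, a sketch of the kind of WCS GetCoverage request a client would issue against a Grid-enabled WCS like the one the G-OWS WG implemented; the endpoint and coverage name are hypothetical, and the request shown is a standard WCS 1.0.0 key-value-pair call.

    ```python
    # WCS 1.0.0 GetCoverage via HTTP KVP, saving the result as GeoTIFF.
    import requests

    params = {
        "service": "WCS",
        "version": "1.0.0",
        "request": "GetCoverage",
        "coverage": "ndvi_demo",
        "crs": "EPSG:4326",
        "bbox": "10.0,45.0,12.0,47.0",
        "width": 256,
        "height": 256,
        "format": "GeoTIFF",
    }
    resp = requests.get("https://example.org/g-ows/wcs", params=params)
    resp.raise_for_status()
    with open("coverage.tif", "wb") as f:
        f.write(resp.content)
    ```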

  12. Using internet GIS technology for early warning, response and controlling the quality of the public health sector.

    PubMed

    Ptochos, Dimitrios; Panopoulos, Dimitrios; Metaxiotis, Kostas; Askounis, Dimitrios

    2004-01-01

Recent EU and Greek Government legislation highlights the need for the modernisation of the public health management system and the improvement of the overall health of EU citizens. In addition, the spread of epidemics even in developed countries makes the enhancement of public health services imperative. To best confront these challenges, the National Technical University of Athens, in cooperation with the Greek Ministry of Health and Welfare and the European Commission (EC), designed and developed an integrated public health information network, named GEPIMI (Integrated Geographical System for EPIdemiological and other Medical Information), in the framework of a three-year pilot project. This pilot project, funded by the Greek Ministry of Health and Welfare and the EC under the INTERREG II Programme, established an advanced and integrated web-based information system that can process and move information in real time, allowing public health authorities to monitor events at hundreds or thousands of public health facilities at once. The system is established among hospitals, primary healthcare authorities and health agents in Greece, Bulgaria, Albania, FYROM, and Turkey. The project aims to demonstrate the best practices, prospects, applications and high potential of Telematics Healthcare Networks in Europe, with a view to promoting cooperation and interconnection between European communities in the field of Telematics Healthcare Applications. The GEPIMI System, implemented as an innovative web-based system, constitutes a replication of a highly effective mechanism. It incorporates state-of-the-art technologies such as Geographic Information Systems (GIS), web-based databases, GPS, and smart card technology, and it supports a variety of health-related web applications, including early warning and response for epidemics, remote management of medical records, seamless healthcare coverage, comprehensive statistical analysis of data, decision-making procedures, and inter-communication between international scientific fora.

  13. HPV-QUEST: A highly customized system for automated HPV sequence analysis capable of processing Next Generation sequencing data set.

    PubMed

    Yin, Li; Yao, Jiqiang; Gardner, Brent P; Chang, Kaifen; Yu, Fahong; Goodenow, Maureen M

    2012-01-01

Next Generation sequencing (NGS) applied to human papillomaviruses (HPV) can provide sensitive methods to investigate the molecular epidemiology of multiple-type HPV infection. A genotyping system with a comprehensive collection of updated HPV reference sequences and the capacity to handle NGS data sets has been lacking. HPV-QUEST was developed as an automated and rapid HPV genotyping system. The web-based HPV-QUEST subtyping algorithm was developed using HTML, PHP and the Perl scripting language, with MySQL as the database backend. HPV-QUEST includes a database of annotated HPV reference sequences with updated nomenclature, covering 5 genera, 14 species and 150 mucosal and cutaneous types, against which blasted query sequences are genotyped. HPV-QUEST processes up to 10 megabases of sequence within 1 to 2 minutes. Results are reported in HTML, text and Excel formats and display E-value, BLAST score, and local and coverage identities; they provide the genus, species, type, infection site and risk for the best-matched reference HPV sequence, and produce results ready for additional analyses.
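    A sketch of the core genotype-assignment step such a pipeline performs: pick each query's best BLAST hit against annotated references. The input is assumed to be standard BLAST tabular output (outfmt 6); the file name and reference-to-type map are invented, not HPV-QUEST internals.

    ```python
    # Assign each query the genotype of its best-scoring BLAST hit.
    import csv

    ref_type = {"HPV16_REF": "HPV16 (high risk)",
                "HPV6_REF": "HPV6 (low risk)"}  # hypothetical map

    best = {}
    with open("hits.tsv") as fh:  # BLAST outfmt 6: 12 tab-separated cols
        for row in csv.reader(fh, delimiter="\t"):
            query, subject = row[0], row[1]
            identity, bitscore = float(row[2]), float(row[11])
            if query not in best or bitscore > best[query][1]:
                best[query] = (subject, bitscore, identity)

    for query, (subject, bitscore, identity) in best.items():
        print(query, "->", ref_type.get(subject, subject),
              f"(identity {identity:.1f}%, bitscore {bitscore:.0f})")
    ```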

  14. Image Re-Ranking Based on Topic Diversity.

    PubMed

    Qian, Xueming; Lu, Dan; Wang, Yaxiong; Zhu, Li; Tang, Yuan Yan; Wang, Meng

    2017-08-01

Social media sharing websites allow users to annotate images with free tags, which significantly contribute to the development of web image retrieval. Tag-based image search is an important method for finding images shared by users in social networks. However, making the top-ranked results both relevant and diverse is challenging. In this paper, we propose a topic-diverse ranking approach for tag-based image retrieval that aims to promote topic coverage performance. First, we construct a tag graph based on the similarity between each pair of tags. Then, a community detection method is applied to mine the topic community of each tag. After that, inter-community and intra-community ranking are introduced to obtain the final retrieval results. In the inter-community ranking process, an adaptive random walk model is employed to rank the communities based on the multi-information of each topic community. In addition, we build an inverted index structure for images to accelerate the search process. Experimental results on the Flickr and NUS-WIDE data sets show the effectiveness of the proposed approach.
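    A sketch of the topic-diversity idea: detect tag communities on a tag-similarity graph, then interleave images drawn from different topic communities so the top results cover several topics. The graph, scores and the modularity-based community detector are toy stand-ins, not the paper's adaptive random walk model.

    ```python
    # Topic communities on a tag graph, then round-robin re-ranking.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Tag graph: edges weighted by tag-to-tag similarity (toy values).
    g = nx.Graph()
    g.add_weighted_edges_from([
        ("beach", "sea", 0.9), ("sea", "sand", 0.8),
        ("forest", "tree", 0.9), ("tree", "leaf", 0.7),
    ])
    topics = list(greedy_modularity_communities(g, weight="weight"))

    # Images ranked by relevance within each topic community (toy data).
    ranked_by_topic = [["img1", "img4"], ["img2", "img3"]]

    # Interleave across topics so top results span several topics.
    diverse = [img for tier in zip(*ranked_by_topic) for img in tier]
    print(topics)
    print(diverse)  # ['img1', 'img2', 'img4', 'img3']
    ```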

  15. Reducing Unintended Pregnancies Through Web-Based Reproductive Life Planning and Contraceptive Action Planning among Privately Insured Women: Study Protocol for the MyNewOptions Randomized, Controlled Trial.

    PubMed

    Chuang, Cynthia H; Velott, Diana L; Weisman, Carol S; Sciamanna, Christopher N; Legro, Richard S; Chinchilli, Vernon M; Moos, Merry-K; Francis, Erica B; Confer, Lindsay N; Lehman, Erik B; Armitage, Christopher J

    2015-01-01

The Affordable Care Act mandates that most women of reproductive age with private health insurance have full contraceptive coverage with no out-of-pocket costs, creating an actionable time for women to evaluate their contraceptive choices without cost considerations. The MyNewOptions study is a three-arm, randomized, controlled trial testing web-based interventions aimed at assisting privately insured women in making contraceptive choices that are consistent with their reproductive goals. Privately insured women between the ages of 18 and 40 not intending pregnancy were randomly assigned to one of three groups: 1) a reproductive life planning (RLP) intervention, 2) a reproductive life planning intervention enriched with contraceptive action planning (RLP+), or 3) an information-only control group. Both the RLP and RLP+ guide women to identify their individualized reproductive goals and contraceptive method requirements. The RLP+ additionally includes a contraceptive action planning component, which uses if-then scenarios that allow the user to problem-solve situations that make it difficult to adhere to their contraceptive method. All three groups have access to a reproductive options library containing information about their contraceptive coverage and the attributes of alternative contraceptive methods. Women completed a baseline survey, with follow-up surveys every 6 months for 2 years concurrent with intervention boosters. Study outcomes include contraceptive use and adherence. ClinicalTrials.gov identifier: NCT02100124. Results from the MyNewOptions study will demonstrate whether web-based reproductive life planning, with or without contraceptive action planning, helps insured women make patient-centered contraceptive choices compared with an information-only control condition. Copyright © 2015 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  16. Data Discovery and Access via the Heliophysics Events Knowledgebase (HEK)

    NASA Astrophysics Data System (ADS)

    Somani, A.; Hurlburt, N. E.; Schrijver, C. J.; Cheung, M.; Freeland, S.; Slater, G. L.; Seguin, R.; Timmons, R.; Green, S.; Chang, L.; Kobashi, A.; Jaffey, A.

    2011-12-01

The HEK is an integrated system that helps direct scientists to solar events and data from a variety of providers. The system is fully operational, and adoption of HEK has been growing since the launch of NASA's SDO mission. In this presentation we describe the different components that comprise HEK. The Heliophysics Events Registry (HER) and Heliophysics Coverage Registry (HCR) form the two major databases behind the system. The HCR allows the user to search coverage event metadata for a variety of instruments, while the HER allows the user to search annotated event metadata for a variety of instruments. Both the HCR and HER are accessible via a web API that can return search results in machine-readable formats (e.g., XML and JSON). A variety of SolarSoft services are also provided to allow users to search the HEK as well as obtain and manipulate data. Other components include: the Event Detection System (EDS), which continually runs feature-finding algorithms on SDO data to populate the HER with relevant events; a web form for users to request SDO data cutouts for multiple AIA channels as well as HMI line-of-sight magnetograms; iSolSearch, which allows a user to browse events in the HER and search for specific events over a specific time interval, all within a graphical web page; Panorama, a software tool for rapid visualization of large volumes of solar image data in multiple channels/wavelengths, from which the user can easily create WYSIWYG movies and launch the Annotator tool to describe events and features; and EVACS, a JOGL-powered client for the HER and HCR that displays the searched-for events on a full-disk magnetogram of the Sun while showing more detailed information for each event.
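    A sketch of querying the HEK web API for flare events as JSON. The base URL is the public HEK endpoint, but treat the exact parameter names and values as assumptions rather than a definitive reference (in practice many users go through SunPy's HEK client instead).

    ```python
    # Search the HEK for flare events in a one-day window (JSON output).
    import requests

    params = {
        "cmd": "search",
        "type": "column",
        "event_type": "fl",                    # flares
        "event_starttime": "2011-08-09T00:00:00",
        "event_endtime": "2011-08-10T00:00:00",
        "event_coordsys": "helioprojective",   # parameter names assumed
        "x1": -1200, "x2": 1200, "y1": -1200, "y2": 1200,
        "cosec": 2,                            # ask for JSON
    }
    resp = requests.get("https://www.lmsal.com/hek/her", params=params)
    events = resp.json().get("result", [])
    print(len(events), "events found")
    ```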

  17. Patient-Oriented Cancer Information on the Internet: A Comparison of Wikipedia and a Professionally Maintained Database

    PubMed Central

    Rajagopalan, Malolan S.; Khanna, Vineet K.; Leiter, Yaacov; Stott, Meghan; Showalter, Timothy N.; Dicker, Adam P.; Lawrence, Yaacov R.

    2011-01-01

Purpose: A wiki is a collaborative Web site, such as Wikipedia, that can be freely edited. Because of a wiki's lack of formal editorial control, we hypothesized that its content would be less complete and accurate than that of a professional peer-reviewed Web site. In this study, the coverage, accuracy, and readability of cancer information on Wikipedia were compared with those of the patient-oriented National Cancer Institute's Physician Data Query (PDQ) comprehensive cancer database. Methods: For each of 10 cancer types, medically trained personnel scored PDQ and Wikipedia articles for accuracy and presentation of controversies by using an appraisal form. Reliability was assessed by using interobserver variability and test-retest reproducibility. Readability was calculated from word and sentence length. Results: Evaluators were able to rapidly assess articles (18 minutes/article), with a test-retest reliability of 0.71 and an interobserver variability of 0.53. For both Web sites, inaccuracies were rare, affecting less than 2% of the information examined. PDQ was significantly more readable than Wikipedia: Flesch-Kincaid grade level 9.6 versus 14.1. There was no difference in depth of coverage between PDQ and Wikipedia (29.9 and 34.2, respectively; maximum possible score 72). Controversial aspects of cancer care were relatively poorly discussed in both resources (2.9 and 6.1 for PDQ and Wikipedia, respectively, NS; maximum possible score 18). A planned subanalysis comparing common and uncommon cancers demonstrated no difference. Conclusion: Although the wiki resource had similar accuracy and depth to the professionally edited database, it was significantly less readable. Further research is required to assess how this influences patients' understanding and retention. PMID:22211130
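    For reference, a sketch of the Flesch-Kincaid grade-level computation used for the readability comparison, with a deliberately naive syllable counter; the formula is the standard published one, while the counter and sample text are illustrative.

    ```python
    # Flesch-Kincaid grade level:
    #   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    import re

    def count_syllables(word):
        # Rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fk_grade(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * (len(words) / sentences)
                + 11.8 * (syllables / len(words)) - 15.59)

    sample = "The tumour was treated with radiotherapy. Outcomes improved."
    print(f"FK grade level: {fk_grade(sample):.1f}")
    ```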

  18. Five Years of BEACO2N: First Results and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Shusterman, A.; Cohen, R. C.

    2017-12-01

The BErkeley Atmospheric CO2 Observation Network (BEACO2N) is an ongoing greenhouse gas and air quality monitoring campaign based in the San Francisco Bay Area of Northern California. BEACO2N is a distributed network instrument consisting of low- to moderate-cost commercial sensors for CO2 and other pollutants installed on top of schools, museums, and other outreach-minded institutions. The reduced cost of each individual sensor "node" enables the deployment of a larger number of nodes, resulting in a web of approximately 50 sites with an average node-to-node distance of 2 km. Operating in some variation of this configuration since 2012, BEACO2N offers greater spatio-temporal coverage than any other fixed CO2 monitoring network to date. This high-resolution information allows us to faithfully represent the true heterogeneity of urban emission processes and to distinguish between specific sources that are often regulated independently but typically treated en masse by sparser, conventional surface monitors. However, maintaining and appropriately interpreting a network of BEACO2N's size presents a number of unique data quality and data coverage challenges. Here we describe the quantitative capabilities of the BEACO2N platform, first results from initial attempts at constraining greenhouse gas emission estimates, and other lessons learned over the first five years of operation.

  19. Analyzing the test process using structural coverage

    NASA Technical Reports Server (NTRS)

    Ramsey, James; Basili, Victor R.

    1985-01-01

A large, commercially developed FORTRAN program was modified to produce structural coverage metrics. The modified program was executed on a set of functionally generated acceptance tests and a large sample of operational usage cases. The resulting structural coverage metrics were combined with fault and error data to evaluate structural coverage. It was shown that, in this software environment, the functionally generated tests seem to be a good approximation of operational use. The relative proportions of the exercised statement subclasses change as the structural coverage of the program increases. A method was also proposed for evaluating whether two sets of input data exercise a program in a similar manner. Evidence was provided implying that, in this environment, faults revealed in a procedure are independent of the number of times the procedure is executed, and that it may be reasonable to use procedure coverage in software models that use statement coverage. Finally, the evidence suggests that it may be possible to use structural coverage to aid in the management of the acceptance test process.
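    A small sketch of the idea of comparing how two input sets exercise a program, using Python's standard-library trace module as a stand-in for the FORTRAN instrumentation; the function under test and both input sets are invented for illustration.

    ```python
    # Compare statement coverage produced by two input sets.
    import trace

    def classify(x):
        if x < 0:
            return "negative"
        if x == 0:
            return "zero"
        return "positive"

    def coverage_counts(inputs):
        tracer = trace.Trace(count=True, trace=False)
        for x in inputs:
            tracer.runfunc(classify, x)
        # counts maps (filename, lineno) -> execution count
        return tracer.results().counts

    acceptance = coverage_counts([-3, 5, 7])     # functionally generated
    operational = coverage_counts([1, 2, 3, 0])  # sampled operational use

    # Two input sets exercise the program similarly if they reach
    # (roughly) the same set of statements.
    lines_a = {ln for (_, ln) in acceptance}
    lines_o = {ln for (_, ln) in operational}
    overlap = len(lines_a & lines_o) / len(lines_a | lines_o)
    print(f"Jaccard overlap of executed statements: {overlap:.2f}")
    ```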

  20. Einstein Online: A Web-based Course for K-12 Teachers from the American Museum of Natural History

    NASA Astrophysics Data System (ADS)

    Steiner, Robert

    2004-05-01

The American Museum of Natural History, in collaboration with Hebrew University and the Skirball Cultural Center, has created a major exhibit on Albert Einstein, including extensive coverage of his contributions to relativity, quantum mechanics and unified field theories as well as the social and political dimensions of his life. Leveraging the assets of this exhibit as well as the expertise of the Museum's Department of Astrophysics and its Education Department, a six-week online professional development course for K-12 teachers has been created, providing inquiries into some of the frontiers of physics through rich media resources, facilitated discussion forums and assignments. The course, which requires only minimal Web access, offers a unique opportunity for teachers across the United States to explore modern physics guided by a working scientist and a skilled online facilitator. The course includes original essays by Museum scientists, images, video, simulations, web links and digital resources for classroom use. The course design, development, implementation and evaluation are reviewed.

  1. Experimental evaluation of the impact of packet capturing tools for web services.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee

Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While this is not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.

  2. Moving on From Representativeness: Testing the Utility of the Global Drug Survey.

    PubMed

    Barratt, Monica J; Ferris, Jason A; Zahnow, Renee; Palamar, Joseph J; Maier, Larissa J; Winstock, Adam R

    2017-01-01

A decline in response rates in traditional household surveys, combined with increased internet coverage and decreased research budgets, has resulted in the increased attractiveness of web survey research designs based on purposive and voluntary opt-in sampling strategies. In the study of hidden or stigmatised behaviours, such as cannabis use, web survey methods are increasingly common. However, opt-in web surveys are often heavily criticised for their lack of a sampling frame and unknown representativeness. In this article, we outline the current state of the debate about the relevance of pursuing representativeness, the state of probability sampling methods, and the utility of non-probability web survey methods, especially for accessing hidden or minority populations. Our article has two aims: (1) to present a comprehensive description of the methodology we use at Global Drug Survey (GDS), an annual cross-sectional web survey, and (2) to compare the age and sex distributions of cannabis users who voluntarily completed (a) a household survey or (b) a large web-based purposive survey (GDS), across three countries: Australia, the United States, and Switzerland. We find that within each set of country comparisons, the demographic distributions among recent cannabis users are broadly similar, demonstrating that the age and sex distributions of those who volunteer to be surveyed are not vastly different between these non-probability and probability methods. We conclude that opt-in web surveys of hard-to-reach populations are an efficient way of gaining in-depth understanding of stigmatised behaviours and are appropriate, as long as they are not used to estimate drug use prevalence in the general population.

  3. Moving on From Representativeness: Testing the Utility of the Global Drug Survey

    PubMed Central

    Barratt, Monica J; Ferris, Jason A; Zahnow, Renee; Palamar, Joseph J; Maier, Larissa J; Winstock, Adam R

    2017-01-01

A decline in response rates in traditional household surveys, combined with increased internet coverage and decreased research budgets, has resulted in the increased attractiveness of web survey research designs based on purposive and voluntary opt-in sampling strategies. In the study of hidden or stigmatised behaviours, such as cannabis use, web survey methods are increasingly common. However, opt-in web surveys are often heavily criticised for their lack of a sampling frame and unknown representativeness. In this article, we outline the current state of the debate about the relevance of pursuing representativeness, the state of probability sampling methods, and the utility of non-probability web survey methods, especially for accessing hidden or minority populations. Our article has two aims: (1) to present a comprehensive description of the methodology we use at Global Drug Survey (GDS), an annual cross-sectional web survey, and (2) to compare the age and sex distributions of cannabis users who voluntarily completed (a) a household survey or (b) a large web-based purposive survey (GDS), across three countries: Australia, the United States, and Switzerland. We find that within each set of country comparisons, the demographic distributions among recent cannabis users are broadly similar, demonstrating that the age and sex distributions of those who volunteer to be surveyed are not vastly different between these non-probability and probability methods. We conclude that opt-in web surveys of hard-to-reach populations are an efficient way of gaining in-depth understanding of stigmatised behaviours and are appropriate, as long as they are not used to estimate drug use prevalence in the general population. PMID:28924351

  4. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    NASA Astrophysics Data System (ADS)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing, web coverage and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, use cases and best-practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface based on the ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface for integrating statistical downscaling services will be released in the upcoming version; this will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the results of the global climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open-source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiment (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some may later be integrated into the European Copernicus platform.
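    A sketch of pulling a rendered map from a WMS endpoint of the kind climate4impact exposes, via OWSLib; the endpoint URL and layer name are placeholder assumptions.

    ```python
    # WMS GetMap via OWSLib, saving the rendered image as PNG.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/climate4impact/wms",
                        version="1.3.0")
    img = wms.getmap(
        layers=["tas"],               # near-surface air temperature
        srs="EPSG:4326",
        bbox=(-180, -90, 180, 90),
        size=(800, 400),
        format="image/png",
        transparent=True,
    )
    with open("tas.png", "wb") as f:
        f.write(img.read())
    ```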

  5. AirNow Information Management System - Global Earth Observation System of Systems Data Processor for Real-Time Air Quality Data Products

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Dye, T. S.; White, J. E.; Dickerson, P.; Pasch, A. N.; Miller, D. S.; Chan, A. C.

    2012-12-01

Built upon the success of the U.S. Environmental Protection Agency's (EPA) AirNow program (www.AirNow.gov), the AirNow-International (AirNow-I) system contains an enhanced suite of software programs that process and quality-control real-time air quality and environmental data and distribute customized maps, files, and data feeds. The goals of the AirNow-I program are similar to those of the successful U.S. program and include fostering the exchange of environmental data; making advances in air quality knowledge and applications; and building a community of people, organizations, and decision makers in environmental management. In 2010, Shanghai became the first city in China to run this state-of-the-art air quality data management and notification system. AirNow-I consists of a suite of modules (software programs and schedulers) centered on a database. One such module is the Information Management System (IMS), which can automatically produce maps and other data products through the use of GIS software to provide the most current air quality information to the public. Developed with Global Earth Observation System of Systems (GEOSS) interoperability in mind, IMS is based on non-proprietary standards, with a preference for formal international standards. The system depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. In particular, the specifications include standards for service-oriented architecture and web-based interfaces, such as a web mapping service (WMS), web coverage service (WCS), web feature service (WFS), sensor web services, and Really Simple Syndication (RSS) feeds. IMS is flexible, open, redundant, and modular. It also allows the merging of data grids to create complex grids that show comprehensive air quality conditions. For example, the AirNow Satellite Data Processor (ASDP) was recently developed to merge PM2.5 estimates from National Aeronautics and Space Administration (NASA) satellite data and AirNow observational data, creating more precise maps and gridded data products for under-monitored areas. The ASDP can easily incorporate other data feeds, including fire and smoke locations, to build enhanced real-time air quality data products. In this presentation, we provide an overview of the features and functions of IMS, an explanation of how data moves through IMS, the rationale for the system architecture, and highlights of the ASDP as an example of the modularity and scalability of IMS.

  6. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Combining information from different geospatial collections is increasingly in demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojection, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process into an independent service. An indexing service crawls data collections, either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY gives the user the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready access for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or, for raw data arrays, Web Coverage Services (WCS). The presentation will show cases where we have used this new capability to provide a significant improvement over previous approaches.
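
    A hedged sketch of how a client might pull a raw data array from the WCS interface such a server exposes follows; the endpoint, coverage identifier, subset and time value are invented placeholders, not GSKY's actual offerings.

    ```python
    # Sketch: retrieving a raw data array via an OGC WCS GetCoverage request.
    # Endpoint, coverage name, bbox and time are hypothetical placeholders.
    import requests

    params = {
        "service": "WCS",
        "version": "1.0.0",
        "request": "GetCoverage",
        "coverage": "landsat8_nbar",        # hypothetical coverage identifier
        "crs": "EPSG:4326",
        "bbox": "147.0,-36.0,149.0,-34.0",  # minx,miny,maxx,maxy (lon/lat in WCS 1.0.0)
        "time": "2016-01-01",
        "width": "512",
        "height": "512",
        "format": "GeoTIFF",
    }
    resp = requests.get("https://example.org/ows", params=params, timeout=60)
    resp.raise_for_status()
    with open("tile.tif", "wb") as f:
        f.write(resp.content)
    ```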

  7. Increasing Coverage of Hepatitis B Vaccination in China

    PubMed Central

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-01-01

    This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- and Chinese-language studies carried out in mainland China, using PubMed, the Cochrane databases, Web of Knowledge, China National Knowledge Infrastructure, Wanfang Data, and other relevant databases. Nineteen articles about the effectiveness and impact of interventions on improving the coverage of hepatitis B vaccine were included. Strong or moderate evidence showed that reinforcing health education, training and supervision, providing subsidies for facility birth, strengthening coordination among health care providers, and using out-of-cold-chain storage for vaccines were all important to improving vaccination coverage. We found evidence that community education was the most commonly used intervention, and that outreach programs such as the out-of-cold-chain strategy were more effective in increasing vaccination coverage in remote areas where the facility birth rate was relatively low. The essential impact factors were found to be strong government commitment and the cooperation of different government departments. Public interventions relying on basic health care systems combined with outreach care services were critical elements in improving the hepatitis B vaccination rate in China. This success could not have occurred without exceptional national commitment. PMID:27175710

  8. Increasing Coverage of Hepatitis B Vaccination in China: A Systematic Review of Interventions and Implementation Experiences.

    PubMed

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-05-01

    This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- and Chinese-language studies carried out in mainland China, using PubMed, the Cochrane databases, Web of Knowledge, China National Knowledge Infrastructure, Wanfang Data, and other relevant databases. Nineteen articles about the effectiveness and impact of interventions on improving the coverage of hepatitis B vaccine were included. Strong or moderate evidence showed that reinforcing health education, training and supervision, providing subsidies for facility birth, strengthening coordination among health care providers, and using out-of-cold-chain storage for vaccines were all important to improving vaccination coverage. We found evidence that community education was the most commonly used intervention, and that outreach programs such as the out-of-cold-chain strategy were more effective in increasing vaccination coverage in remote areas where the facility birth rate was relatively low. The essential impact factors were found to be strong government commitment and the cooperation of different government departments. Public interventions relying on basic health care systems combined with outreach care services were critical elements in improving the hepatitis B vaccination rate in China. This success could not have occurred without exceptional national commitment.

  9. Analysis of Media Coverage on Breastfeeding Policy in Washington State.

    PubMed

    DeMarchis, Alessandra; Ritter, Gaelen; Otten, Jennifer; Johnson, Donna

    2018-02-01

    Media coverage and message framing about breastfeeding policies can influence important policy decisions in institutional and governmental settings. Research aim: This study aimed to describe the media coverage of breastfeeding policies, and the message frames found in print newspapers and web-only news publications, in Washington State between 2000 and 2014. For this retrospective media analysis, 131 news articles published from January 2000 through June 2014 in Washington State that specifically discussed breastfeeding policy were identified, coded, and analyzed to explore the content of the sample and examine how arguments supporting or opposing breastfeeding policy were framed. The coding scheme was developed cooperatively and found to be reliable across coders. The number of articles published each year about breastfeeding policy grew overall between 2000 and 2014 and peaked during periods of specific policy development. Seventy-four articles had a neutral tone, 49 supported breastfeeding policy, and 4 were in opposition. Nine distinct supporting frames and six distinct opposing frames were identified. Common supporting frames were the health benefits of breastfeeding and the need for policies because of the challenges of breastfeeding in public. The most common opposing frame was the indecency of breastfeeding in public. There is limited but growing media coverage of breastfeeding policies. For the most part, coverage is supportive of the need for policies. Breastfeeding advocates can apply information about media message frames to craft effective policy development strategies that counteract negative perceptions and promote the benefits of breastfeeding policies.

  10. A flexible geospatial sensor observation service for diverse sensor data based on Web service

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min

    Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes the development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services — Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the OGC Sensor Web Enablement (SWE) specifications is designed, following the OGC "core" and "transaction" specifications, and implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using the Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
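
    The abstract factory pattern mentioned above can be sketched as follows. The real framework is implemented in Java; all class and source names here are illustrative inventions, not the framework's actual API.

    ```python
    # Illustrative abstract factory for sensor data adapters, in the spirit of the
    # framework described above. All names are hypothetical placeholders.
    from abc import ABC, abstractmethod

    class SensorDataAdapter(ABC):
        """Finds, stores, and manages observations from one kind of source."""
        @abstractmethod
        def fetch_observations(self) -> list:
            ...

    class LiveSensorAdapter(SensorDataAdapter):
        def fetch_observations(self) -> list:
            return [{"source": "live", "value": 21.3}]        # stub reading

    class SimulationAdapter(SensorDataAdapter):
        def fetch_observations(self) -> list:
            return [{"source": "simulation", "value": 20.9}]  # stub reading

    class AdapterFactory(ABC):
        @abstractmethod
        def create_adapter(self) -> SensorDataAdapter:
            ...

    class LiveSensorFactory(AdapterFactory):
        def create_adapter(self) -> SensorDataAdapter:
            return LiveSensorAdapter()

    class SimulationFactory(AdapterFactory):
        def create_adapter(self) -> SensorDataAdapter:
            return SimulationAdapter()

    def ingest(factory: AdapterFactory) -> list:
        """Client code depends only on the abstract interfaces."""
        return factory.create_adapter().fetch_observations()

    print(ingest(LiveSensorFactory()))
    print(ingest(SimulationFactory()))
    ```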

  11. Choosing health care online: a 7-Eleven case study.

    PubMed

    Fuller, Margaret; Beauregard, Cindy

    2003-01-01

    This article describes 7-Eleven's success in offering Web-based health care enrollment to its diverse workforce, which made the introduction of such a service delivery strategy unusually challenging. Through its efforts, 7-Eleven was able to meet several important objectives, including helping employees better appreciate the value of their benefits, providing employees with increased services and convenience, and encouraging employees to make more cost-effective choices in their health care coverage.

  12. The Geology of Burma (Myanmar): An Annotated Bibliography of Burma’s Geology, Geography and Earth Science

    DTIC Science & Technology

    2008-09-01

    about the region includes maps and links to related Web sites. Notes: Named Corp: Mekong River Commission. Genre/Form: Article/Paper/Report. Map... unequalled in its coverage of international literature of the core scientific and technical periodicals. Papers are selected, read, and classified... includes refereed scientific papers; trade journal and magazine articles, product reviews, directories and any other relevant material. GEOBASE has a

  13. Depth-of-processing effects as college students use academic advising Web sites.

    PubMed

    Boatright-Horowitz, Su L; Langley, Michelle; Gunnip, Matthew

    2009-06-01

    This research examined students' cognitive and affective responses to an academic advising Web site. Specifically, we investigated whether exposure to our Web site increased student reports that they would access university Web sites to obtain various types of advising information. A depth-of-processing (DOP) manipulation revealed this effect as students engaged in semantic processing of Web content but not when they engaged in superficial examination of the physical appearance of the same Web site. Students appeared to scan online academic advising materials for information of immediate importance without noticing other information or hyperlinks (e.g., regarding internships and careers). Suggestions are presented for increasing the effectiveness of academic advising Web sites.

  14. Availability of the OGC geoprocessing standard: March 2011 reality check

    NASA Astrophysics Data System (ADS)

    Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier

    2012-10-01

    This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification supports standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
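
    The availability check underlying such a survey reduces to probing candidate endpoints with a WPS GetCapabilities request; a minimal sketch follows, with a hypothetical endpoint list standing in for crawler output. The liveness test used here is a simplification, not the paper's actual methodology.

    ```python
    # Sketch: probe candidate endpoints with WPS GetCapabilities and record
    # which ones answer. The endpoint list is a hypothetical placeholder.
    import requests

    candidates = [
        "https://example.org/wps",
        "https://example.net/cgi-bin/wps",
    ]

    def is_live_wps(url: str) -> bool:
        params = {"service": "WPS", "request": "GetCapabilities"}
        try:
            resp = requests.get(url, params=params, timeout=15)
            # A conforming server answers with an XML Capabilities document.
            return resp.ok and b"Capabilities" in resp.content
        except requests.RequestException:
            return False

    for url in candidates:
        print(url, "up" if is_live_wps(url) else "down")
    ```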

  15. Recent advancements on the development of web-based applications for the implementation of seismic analysis and surveillance systems

    NASA Astrophysics Data System (ADS)

    Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.

    2014-12-01

    Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded web server, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. Web applications can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, HTML email has brought the possibility of specialized web applications to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command-and-control system for an otherwise command-line-driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future web applications will be as powerful and efficient as native applications. Hence, the work described here is a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
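
    A hedged sketch of consuming event data from a QuakeML web service such as Moleserv provides is shown below. The URL is a placeholder, and the parsing leans only on standard QuakeML element names, sidestepping namespace prefixes with ElementPath wildcards (Python 3.8+).

    ```python
    # Sketch: fetch a QuakeML document from an event web service and print
    # origin coordinates. The endpoint URL is a hypothetical placeholder.
    import requests
    import xml.etree.ElementTree as ET

    resp = requests.get("https://example.org/moleserv/events", timeout=30)
    resp.raise_for_status()

    root = ET.fromstring(resp.content)
    # '{*}' wildcards (Python 3.8+) match any XML namespace.
    for event in root.iterfind(".//{*}event"):
        origin = event.find("{*}origin")
        if origin is None:
            continue
        lat = origin.findtext("{*}latitude/{*}value")
        lon = origin.findtext("{*}longitude/{*}value")
        print("event at", lat, lon)
    ```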

  16. An overview of the web-based Google Earth coincident imaging tool

    USGS Publications Warehouse

    Chander, Gyanesh; Kilough, B.; Gowda, S.

    2010-01-01

    The Committee on Earth Observing Satellites (CEOS) Visualization Environment (COVE) tool is a browser-based application that leverages Google Earth web to display satellite sensor coverage areas. The analysis tool can also be used to identify near-simultaneous surface observation locations for two or more satellites. The National Aeronautics and Space Administration (NASA) CEOS System Engineering Office (SEO) worked with the CEOS Working Group on Calibration and Validation (WGCV) to develop the COVE tool. The CEOS member organizations currently operate and plan hundreds of Earth Observation (EO) satellites. Standard cross-comparison exercises between multiple sensors, to compare near-simultaneous surface observations and to identify corresponding image pairs, are time-consuming and labor-intensive. COVE is a suite of tools developed to make such tasks easier.

  17. PigGIS: Pig Genomic Informatics System

    PubMed Central

    Ruan, Jue; Guo, Yiran; Li, Heng; Hu, Yafeng; Song, Fei; Huang, Xin; Kristiensen, Karsten; Bolund, Lars; Wang, Jun

    2007-01-01

    Pig Genomic Information System (PigGIS) is a web-based repository of pig (Sus scrofa) genomic information, mainly engineered for biomedical research, that locates pig genes from their human homologs and positions single nucleotide polymorphisms (SNPs) in different pig populations. It utilizes a variety of sequence data, including whole genome shotgun (WGS) reads and expressed sequence tags (ESTs), and achieves a successful mapping solution to the low-coverage genome problem. With the data presently available, we have identified a total of 15 700 pig consensus sequences covering 18.5 Mb of the homologous human exons. We have also recovered 18 700 SNPs and 20 800 unique 60mer oligonucleotide probes for future pig genome analyses. PigGIS can be freely accessed via the web. PMID:17090590

  18. Ten years of change: National Library of Medicine TOXMAP gets a new look.

    PubMed

    Hochstein, Colette; Gemoets, Darren; Goshorn, Jeanne

    2014-01-01

    The United States National Library of Medicine (NLM) TOXNET® databases <http://toxnet.nlm.nih.gov> provide broad coverage of environmental health information on a wide variety of topics, including access to the U.S. Environmental Protection Agency (EPA) Toxics Release Inventory (TRI) data. The NLM web-based geographic information system (GIS), TOXMAP® <http://toxmap.nlm.nih.gov/>, provides interactive maps which show where TRI chemicals are released into the environment, and links to TOXNET for information about these chemicals. TOXMAP also displays locations of Superfund sites on the EPA National Priorities List, as well as information about the chemical contaminants at these sites. This column focuses on a new version of TOXMAP which brings it up to date with current web GIS technologies and user expectations.

  19. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    NASA Astrophysics Data System (ADS)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it examines the development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications, through a series of case studies, with the focus on SMEs that develop web applications as management information systems rather than e-commerce sites, informational sites, online communities or web portals. The study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  20. Consumption processes and food web structure in the Columbia River Estuary

    NASA Astrophysics Data System (ADS)

    Simenstad, Charles A.; Small, Lawrence F.; David McIntire, C.

    Consumption processes at several trophic levels tend to converge in the central (estuarine-mixing) region of the Columbia River estuary, where living and detrital food resources are entrained within the energy null of the turbidity maximum zone. Primary consumers in this region are generalist and omnivorous feeders, capable of exploiting both autotrophic and heterotrophic food web pathways. In the presence of higher standing stocks of their prey resources, feeding by secondary and tertiary consumers is also concentrated, or more effective, in the estuarine-mixing region of the estuary. During the 1980-1981 studies of the estuary, total consumer (metazoan) production averaged 5.5 g C m⁻² within the estuary. Of the estimated 15 × 10³ mt C yr⁻¹ attributed to primary consumption in the water column, 83% was the result of suspension-feeding pelagic zooplankton. In comparison to grazing on phytoplankton, it was estimated that approximately 84% of primary consumption in the water column was based on suspended detritus and, presumably, associated microbiota. Endemic primary consumers, principally epibenthic crustaceans such as the calanoid copepod Eurytemora affinis, the harpacticoid copepod Scottolana canadensis, and the crangonid shrimp Crangon franciscorum, accounted for a high proportion of the consumption of suspended particles. Wetland herbivores inhabiting the estuary's extensive marshes, on the other hand, were estimated to account for only 2 to 17% of total estuarine primary consumption. Trophic linkages to secondary and tertiary consumers were more evenly apportioned among pelagic fishes, motile macroinvertebrates, and benthic infauna. High, comparatively unknown fluxes of migratory or wide-ranging tertiary consumers, such as piscivorous birds, seals and sea lions, made estimation of their annual consumption rates in the estuary highly tenuous. The physical processes of mixing and stratification, sediment accretion and erosion, and salinity intrusion appear to be the fundamental determinants of consumption processes in the Columbia River estuary, and perhaps in other similarly energetic estuarine systems, by promoting concentrations of consumers in low-energy habitats such as the turbidity maximum and peripheral bays.

  1. Integrated web visualizations for protein-protein interaction databases.

    PubMed

    Jeanquartier, Fleur; Jean-Quartier, Claire; Holzinger, Andreas

    2015-06-16

    Understanding living systems is crucial for curing diseases. To achieve this task we have to understand biological networks based on protein-protein interactions. Bioinformatics has come up with a great number of databases and tools that support analysts in exploring protein-protein interactions on an integrated level for knowledge discovery. They provide predictions and correlations, indicate possibilities for future experimental research and fill gaps to complete the picture of biochemical processes. There are numerous, huge databases of protein-protein interactions used to gain insights into answering some of the many questions of systems biology. Many computational resources integrate interaction data with additional information on molecular background. However, the vast number of diverse Bioinformatics resources poses an obstacle to the goal of understanding. We present a survey of databases that enable the visual analysis of protein networks. We selected M=10 out of N=53 resources supporting visualization and tested them against the following set of criteria: interoperability, data integration, quantity of possible interactions, data visualization quality and data coverage. The study reveals differences in usability, visualization features and quality, as well as in the quantity of interactions. StringDB is the recommended first choice. CPDB presents a comprehensive dataset and IntAct lets the user change the network layout. A comprehensive comparison table is available on the web; the supplementary table can be accessed at http://tinyurl.com/PPI-DB-Comparison-2015. Only some web resources featuring graph visualization can be successfully applied to interactive visual analysis of protein-protein interactions. The study results underline the necessity for further enhancement of visualization integration in biochemical analysis tools. Identified challenges are data comprehensiveness, confidence, interactive features and visualization maturity.

  2. Evaluating Web accessibility at different processing phases

    NASA Astrophysics Data System (ADS)

    Fernandes, N.; Lopes, R.; Carriço, L.

    2012-09-01

    Modern Web sites use several techniques (e.g. DOM manipulation) that allow for the injection of new content into their Web pages (e.g. AJAX), as well as manipulation of the HTML DOM tree. As a consequence, the Web pages that are presented to users (i.e. after browser processing) differ from the original structure and content transmitted through HTTP communication (i.e. before browser processing). This poses a series of challenges for Web accessibility evaluation, especially for automated evaluation software. This article details an experimental study designed to understand the differences introduced by browser processing into accessibility evaluation. We implemented a Javascript-based evaluator, QualWeb, that can perform WCAG 2.0 based accessibility evaluations in the two phases of browser processing. Our study shows that there are, in fact, considerable differences between the HTML DOM trees in both phases, which result in distinct evaluation results. We discuss the impact of these results in light of the potential problems that these differences can pose to designers and developers who use accessibility evaluators that function before browser processing.
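
    The two evaluation phases can be made concrete with a small comparison of the markup as transmitted over HTTP versus the DOM after browser processing. The sketch below uses requests, BeautifulSoup and Selenium (not the authors' QualWeb tooling), with a placeholder URL; element counts stand in for a full WCAG evaluation.

    ```python
    # Sketch: contrast the HTML as delivered over HTTP with the DOM after the
    # browser has run scripts. Uses requests, bs4 and selenium; URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup
    from selenium import webdriver

    url = "https://example.org/"

    # Phase 1: markup as transmitted, before any script execution.
    raw_html = requests.get(url, timeout=30).text
    raw_count = len(BeautifulSoup(raw_html, "html.parser").find_all(True))

    # Phase 2: DOM after the browser has executed scripts and injected content.
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    driver.get(url)
    rendered_count = len(
        BeautifulSoup(driver.page_source, "html.parser").find_all(True)
    )
    driver.quit()

    print(f"elements before processing: {raw_count}, after: {rendered_count}")
    ```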

  3. Adding Processing Functionality to the Sensor Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt

    2017-04-01

    The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors on the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher-level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing, and the integration differs from application to application. Standardizing not only the provision of sensor data but also its processing facilitates sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview of ongoing research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. At first, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, OGC Web Processing Service profiles for linear regression and spatio-temporal interpolation are introduced, which consume sensor data from, and upload predictions to, Sensor Observation Services. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and geoprocessing services and present an example of how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as Open Source Software.
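
    As a hedged sketch of invoking such a geoprocessing profile, the snippet below uses OWSLib's WPS client. OWSLib and its monitorExecution helper are real, but the endpoint, process identifier and input names are hypothetical placeholders for the kind of interpolation profile described above.

    ```python
    # Sketch: executing a WPS process on SOS data with OWSLib (pip install owslib).
    # Endpoint, process identifier and input names are hypothetical placeholders.
    from owslib.wps import WebProcessingService, monitorExecution

    wps = WebProcessingService("https://example.org/wps")

    # Discover the offered processes.
    for process in wps.processes:
        print(process.identifier, "-", process.title)

    # Execute a (hypothetical) interpolation process on an SOS data reference.
    inputs = [
        ("observations", "https://example.org/sos?service=SOS&request=GetObservation"),
        ("method", "spatio-temporal-kriging"),
    ]
    execution = wps.execute("interpolate", inputs)
    monitorExecution(execution)  # poll until the asynchronous job finishes
    print(execution.status)
    ```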

  4. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
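
    One commonly needed derivative the abstract mentions is a spatial statistic over gridded data. The sketch below shows a generic area-weighted mean with latitude-dependent cell weights, as an illustration of the kind of computation such Web processing services perform, not the portal's actual algorithm; the grid values are synthetic.

    ```python
    # Generic illustration of an area-weighted spatial mean over a lat/lon grid.
    # The values are synthetic placeholders, not portal data.
    import numpy as np

    lats = np.linspace(30.0, 50.0, 41)                   # grid-cell center latitudes
    values = np.random.default_rng(0).random((41, 81))   # synthetic data grid

    # Cell area on a regular lon/lat grid scales with cos(latitude).
    weights = np.cos(np.deg2rad(lats))[:, np.newaxis]
    weights = np.broadcast_to(weights, values.shape)

    weighted_mean = np.average(values, weights=weights)
    print(f"area-weighted mean: {weighted_mean:.4f}")
    ```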

  5. "What is Palliative Care?"

    PubMed

    Kozlov, Elissa; Carpenter, Brian D

    2017-04-01

    Americans rely on the Internet for health information, and people are likely to turn to online resources to learn about palliative care as well. The purpose of this study was to analyze online palliative care information pages to evaluate the breadth of their content. We also compared how frequently basic facts about palliative care appeared on the Web pages with expert rankings of the importance of those facts to understanding palliative care. Twenty-six pages were identified. Two researchers independently coded each page for content. Palliative care professionals (n = 20) rated the importance of content domains for comparison with content frequency in the Web pages. We identified 22 recurring broad concepts about palliative care. Each information page included, on average, 9.2 of these broad concepts (standard deviation [SD] = 3.36, range = 5-15). Similarly, each broad concept was present in an average of 45% of the Web pages (SD = 30.4%, range = 8%-96%). Significant discrepancies emerged between expert ratings of the importance of the broad concepts and the frequency of their appearance in the Web pages (rτ = .25, P > .05). This study demonstrates that palliative care information pages available online vary considerably in their content coverage. Furthermore, information that palliative care professionals rate as important for consumers to know is not always included in Web pages. We developed guidelines for information pages for the purpose of educating consumers in a consistent way about palliative care.

  6. A systematic review of nursing research priorities on health system and services in the Americas.

    PubMed

    Garcia, Alessandra Bassalobre; Cassiani, Silvia Helena De Bortoli; Reveiz, Ludovic

    2015-03-01

    To systematically review literature on priorities in nursing research on health systems and services in the Region of the Americas, as a step toward developing a nursing research agenda that will advance the Regional Strategy for Universal Access to Health and Universal Health Coverage. This was a systematic review of the literature available from the following databases: Web of Science, PubMed, LILACS, and Google. Documents considered were published in 2008-2014; in English, Spanish, or Portuguese; and addressed the topic in the Region of the Americas. The selected documents had their priority-setting process evaluated according to the "nine common themes for good practice in health research priorities." A content analysis collected all study questions and topics and sorted them by category and subcategory. Of 185 full-text articles/documents assessed for eligibility, 23 were selected: 12 from peer-reviewed journals, 6 from nursing publications, 4 from Ministries of Health, and 1 from an international organization. Journal publications had stronger methodological rigor; the majority did not present a clear implementation or evaluation plan. After compiling the 444 study questions and topics from these documents, the content analysis resulted in a document with 5 categories and 16 subcategories regarding nursing research priorities on health systems and services. Research priority-setting is a highly important process for health services improvement and resource optimization, but implementation and evaluation plans are rarely included. The resulting document will serve as a basis for the development of a new nursing research agenda focused on health systems and services, shaped to advance universal health coverage and universal access to health.

  7. Effect of hydrogen coverage on hydrogenation of o-cresol on Pt(111)

    NASA Astrophysics Data System (ADS)

    Li, Yaping; Liu, Zhimin; Crossley, Steven P.; Jentoft, Friederike C.; Wang, Sanwu

    2018-06-01

    The conversion of phenolics over metal catalysts is an important process for upgrading biofuels. With density functional calculations, hydrogenation of o-cresol on the hydrogen-covered Pt(111) surface was investigated. The results show that the coverage of hydrogen plays a significant role in the reaction rate while it does not affect the reaction selectivity. The reaction barriers of the hydrogenation process leading to the formation of both 2-methyl-cyclohexanone (the intermediate product) and 2-methyl-cyclohexanol (the final product) at high H coverages (∼1 ML) are found to be smaller by 0.14-0.69 eV than those at lower H coverages (∼1/25 ML). After both hydrogen and cresol are adsorbed on Pt(111) from their initial gas phase state, the reaction energy of each hydrogenation step on the surface is also dependent on the hydrogen coverage. On the H-covered Pt(111) surface, most steps of hydrogenation involve exothermic reactions when the hydrogen coverage is high while they are endothermic reactions at low hydrogen coverages. The differences in reaction rate and reaction energy between high and low H coverages can be understood with the coverage-dependent bonding strength and configurations.

  8. Process property studies of melt blown thermoplastic polyurethane polymers

    NASA Astrophysics Data System (ADS)

    Lee, Youn Eung

    The primary goal of this research was to determine optimum processing conditions to produce commercially acceptable melt blown (MB) thermoplastic polyurethane (TPU) webs. The 6-inch MB line and the 20-inch wide Accurate Products MB pilot line at the Textiles and Nonwovens Development Center (TANDEC), The University of Tennessee, Knoxville, were utilized for this study. The MB TPU trials were performed in four phases: Phase 1 focused on the envelope of MB operating conditions for different TPU polymers; Phase 2 focused on the production of commercially acceptable MB TPU webs; Phase 3 focused on optimizing the processing conditions of MB TPU webs and determining the significant relationships between processing parameters and web properties using statistical analyses; building on the first three phases, Phase 4 comprised a more extensive study of fiber and web formation in the MB TPU process and the development of a multiple linear regression model relating MB TPU processing conditions to web properties. In conclusion, the basic MB process was fundamentally valid for MB TPU; however, the process was more complicated for TPU than for PP, because the web structures and properties of MB TPUs are very sensitive to MB process conditions. Furthermore, different TPU grades responded very differently to MB processing and exhibited different web structures and properties. In Phase 3 and Phase 4, fiber diameters of less than 5 μm were produced from TPU237, TPU245 and TPU280 pellets, and the mechanical strengths of MB TPU webs, including tensile strength, tear strength, abrasion resistance and tensile elongation, were notably good. In addition, the statistical model showed useful trends for processing parameters versus properties of MB TPU webs. Die and air temperature showed multicollinearity problems, and fiber diameter was notably affected by air flow rate, throughput and die/air temperature. Most of the MB TPU web properties, including mechanical strength, air permeability and fiber diameter, were affected by air velocity and die temperature.
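
    The shape of the Phase 4 regression can be sketched as follows: a web property (here fiber diameter) regressed on processing parameters by least squares. The data below are synthetic placeholders, not measurements from the study, and the parameter names are illustrative.

    ```python
    # Hedged sketch of a multiple linear regression of a web property on
    # processing parameters. All data here are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 40
    die_temp = rng.uniform(210, 240, n)     # hypothetical die temperature, deg C
    air_rate = rng.uniform(10, 30, n)       # hypothetical air flow rate
    throughput = rng.uniform(0.2, 0.8, n)   # hypothetical polymer throughput

    # Synthetic response with noise, standing in for measured fiber diameter (um).
    diameter = (12 - 0.02 * die_temp - 0.15 * air_rate + 4 * throughput
                + rng.normal(0, 0.3, n))

    # Least-squares fit: diameter ~ intercept + die_temp + air_rate + throughput.
    X = np.column_stack([np.ones(n), die_temp, air_rate, throughput])
    coef, *_ = np.linalg.lstsq(X, diameter, rcond=None)
    print("intercept and coefficients:", np.round(coef, 3))
    ```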

  9. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows, especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standards-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancements at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.

  10. Chemistry on the world-wide-web: a ten year experiment.

    PubMed

    Goodman, Jonathan M

    2004-11-21

    The server logs for access to the Cambridge Chemistry webserver show how use of the server has increased over the last ten years, with access doubling every year and a half. This growth has started to slow, and extrapolation of the data suggests that the current rate of access is close to a plateau of ten million downloads a year. The transition for chemists from no internet access to saturation coverage, therefore, appears almost complete.
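
    The doubling claim translates into a simple exponential model capped by the reported plateau; the toy calculation below reproduces that arithmetic. The starting value is a hypothetical placeholder, not the paper's actual fit.

    ```python
    # Toy arithmetic behind "doubling every year and a half": exponential growth
    # capped by the reported plateau. The year-0 rate is a hypothetical placeholder.
    doubling_time = 1.5        # years
    plateau = 10_000_000       # downloads per year, as reported
    start = 20_000             # hypothetical year-0 access rate

    for year in range(0, 16, 3):
        projected = start * 2 ** (year / doubling_time)
        print(f"year {year:2d}: {min(projected, plateau):>12,.0f} downloads/yr")
    ```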

  11. Effects of mercury deposition and coniferous forests on the mercury contamination of fish in the south central United States

    USGS Publications Warehouse

    Drenner, Ray W.; Chumchal, Matthew M.; Jones, Christina M.; Lehmann, Christopher M.B.; Gay, David A.; Donato, David I.

    2013-01-01

    Mercury (Hg) is a toxic metal that is found in aquatic food webs and is hazardous to human and wildlife health. We examined the relationship between Hg deposition, land coverage by coniferous and deciduous forests, and average Hg concentrations in largemouth bass (Micropterus salmoides)-equivalent fish (LMBE) in 14 ecoregions located within all or part of six states in the South Central U.S. In 11 ecoregions, the average Hg concentrations in 35.6-cm total length LMBE were above 300 ng/g, the threshold concentration of Hg recommended by the U.S. Environmental Protection Agency for the issuance of fish consumption advisories. Percent land coverage by coniferous forests within ecoregions had a significant linear relationship with average Hg concentrations in LMBE while percent land coverage by deciduous forests did not. Eighty percent of the variance in average Hg concentrations in LMBE between ecoregions could be accounted for by estimated Hg deposition after adjusting for the effects of coniferous forests. Here we show for the first time that fish from ecoregions with high atmospheric Hg pollution and coniferous forest coverage pose a significant hazard to human health. Our study suggests that models that use Hg deposition to predict Hg concentrations in fish could be improved by including the effects of coniferous forests on Hg deposition.

  12. Responsible conduct of research in communication sciences and disorders: faculty and student perceptions.

    PubMed

    Minifie, Fred D; Robey, Randall R; Horner, Jennifer; Ingham, Janis C; Lansing, Charissa; McCartney, James H; Alldredge, Elham-Eid; Slater, Sarah C; Moss, Sharon E

    2011-02-01

    Two Web-based surveys (Surveys I and II) were used to assess perceptions of faculty and students in Communication Sciences and Disorders (CSD) regarding the responsible conduct of research (RCR). Survey questions addressed 9 RCR domains thought important to the responsible conduct of research: (a) human subjects protections; (b) research involving animals; (c) publication practices and responsible authorship; (d) mentor/trainee responsibilities; (e) collaborative science; (f) peer review; (g) data acquisition, management, sharing, and ownership; (h) conflicts of interest; and (i) research misconduct. Respondents rated each of 37 topics for importance and for sufficiency of instructional coverage. Respondents to Survey I were 137 faculty members from 68 (26%) of the 261 graduate programs in CSD. By comparison, 237 students from 39 (15%) programs responded to Survey II. Data about the importance and sufficiency of coverage of each of the 37 items were transformed into z scores to reveal relative ratings among the 37 topics. Data presentations were grouped for topics in each of the 9 RCR domains. Ratings indicated the relatively high importance assigned among the 37 topics by CSD faculty and students. Sufficiency of coverage of those same topics received lower ratings. The results of these surveys support the notion that students in CSD perceive that they are receiving information about RCR. The data pertaining to sufficiency of coverage provide a basis for improving instruction in this important aspect of research education.
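
    The z-score transformation the survey analysis relies on is elementary; a minimal sketch with hypothetical rating data follows, re-expressing each topic's rating in standard deviations from the mean so topics can be compared on a common scale.

    ```python
    # Minimal sketch of the z-score transform used to compare ratings across
    # topics. The ratings below are hypothetical placeholders.
    import statistics

    ratings = [4.8, 4.1, 3.9, 4.6, 2.7, 3.3]  # hypothetical importance ratings

    mean = statistics.mean(ratings)
    stdev = statistics.stdev(ratings)
    z_scores = [(r - mean) / stdev for r in ratings]
    print([round(z, 2) for z in z_scores])
    ```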

  13. Optimizing Crawler4j using MapReduce Programming Model

    NASA Astrophysics Data System (ADS)

    Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.

    2017-06-01

    The World Wide Web is a decentralized system that consists of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Secondly, they are used for web archiving, where web pages are stored for later analysis. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be improved by using the capabilities of modern parallel processing technologies. In order to address parallelism and the throughput of crawling, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach intends to carve out a new methodology towards optimizing web crawling by achieving significant performance gains.
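
    The MapReduce decomposition can be illustrated with a small local simulation: a map step that emits (host, 1) per discovered URL and a reduce step that aggregates counts per host. This is a generic illustration of the programming model, not the paper's actual Crawler4j integration; on a cluster the two functions become Hadoop map and reduce tasks and the grouping is done by the framework.

    ```python
    # Generic MapReduce shape applied to crawl output, simulated locally.
    # The URL list is a hypothetical placeholder for crawler output.
    from collections import defaultdict
    from urllib.parse import urlparse

    crawled_urls = [
        "https://example.org/a", "https://example.org/b", "https://example.net/",
    ]

    def map_phase(urls):
        for url in urls:                   # each mapper handles a shard of URLs
            yield urlparse(url).netloc, 1

    def reduce_phase(pairs):
        counts = defaultdict(int)          # on Hadoop, keys arrive grouped/sorted
        for host, n in pairs:
            counts[host] += n
        return dict(counts)

    print(reduce_phase(map_phase(crawled_urls)))
    ```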

  14. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet requirements driven by the diversifying government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS) and OGC Web Coverage Services (WCS), leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications with the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for a mission-critical understanding of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  15. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
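
    A generic sketch of the coverage-aware variant filtering such a tool performs is shown below: calls are kept only with adequate read depth and variant allele fraction. The thresholds, gene names and coordinates are illustrative placeholders, not the tool's actual pipeline or data.

    ```python
    # Generic sketch of coverage/quality filtering of variant calls.
    # Thresholds and the records below are illustrative placeholders.
    MIN_DEPTH = 30    # minimum reads covering the position
    MIN_VAF = 0.20    # minimum variant allele fraction

    variants = [
        # (gene, position, total_depth, variant_reads) -- placeholder values
        ("GENE_A", 1001, 212, 98),
        ("GENE_B", 2002, 18, 9),    # fails the depth filter
        ("GENE_C", 3003, 160, 12),  # fails the VAF filter
    ]

    def passes(depth: int, var_reads: int) -> bool:
        return depth >= MIN_DEPTH and var_reads / depth >= MIN_VAF

    for gene, pos, depth, var_reads in variants:
        status = "PASS" if passes(depth, var_reads) else "filtered"
        print(f"{gene}:{pos}\tDP={depth}\tVAF={var_reads / depth:.2f}\t{status}")
    ```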

  16. Lightweight monitoring and control system for coal mine safety using REST style.

    PubMed

    Cheng, Bo; Cheng, Xin; Chen, Junliang

    2015-01-01

    The complex environment of a coal mine requires the underground environment, devices and miners to be constantly monitored to ensure safe coal production. However, existing coal mines do not meet these coverage requirements because blind spots occur when using a wired network. In this paper, we develop a Web-based, lightweight remote monitoring and control platform using a wireless sensor network (WSN) with the REST style to collect temperature, humidity and methane concentration data in a coal mine using sensor nodes. This platform also collects information on personnel positions inside the mine. We implement a RESTful application programming interface (API) that provides access to underground sensors and instruments through the Web, such that underground coal mine physical devices can be easily interfaced to remote monitoring and control applications. We also implement three different scenarios for Web-based, lightweight remote monitoring and control of coal mine safety and measure and analyze the system performance. Finally, we present the conclusions from this study and discuss future work.
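
    A hedged sketch of a RESTful monitoring endpoint in the style the paper describes follows, using Flask. The routes, node identifiers and readings are hypothetical placeholders, not the paper's actual API.

    ```python
    # Sketch of a RESTful monitoring API in the style described above (Flask).
    # Routes, node names and readings are hypothetical placeholders.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Stand-in for readings collected from WSN sensor nodes underground.
    READINGS = {
        "node-12": {"temperature_c": 24.1, "humidity_pct": 88.0, "ch4_ppm": 310},
        "node-17": {"temperature_c": 22.6, "humidity_pct": 91.5, "ch4_ppm": 275},
    }

    @app.route("/api/nodes", methods=["GET"])
    def list_nodes():
        return jsonify(sorted(READINGS))

    @app.route("/api/nodes/<node_id>", methods=["GET"])
    def node_detail(node_id):
        reading = READINGS.get(node_id)
        if reading is None:
            return jsonify({"error": "unknown node"}), 404
        return jsonify(reading)

    if __name__ == "__main__":
        app.run(port=8080)
    ```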

  17. A comparative study of six European databases of medically oriented Web resources.

    PubMed

    Abad García, Francisca; González Teruel, Aurora; Bayo Calduch, Patricia; de Ramón Frias, Rosa; Castillo Blasco, Lourdes

    2005-10-01

    The paper describes six European medically oriented databases of Web resources, pertaining to five quality-controlled subject gateways, and compares their performance. The characteristics, coverage, procedure for selecting Web resources, record structure, searching possibilities, and existence of user assistance were described for each database. Performance indicators for each database were obtained by means of searches carried out using the key words "myocardial infarction." Most of the databases originated in the 1990s in an academic or library context and include all types of Web resources of an international nature. Five databases use Medical Subject Headings. The number of fields per record varies between three and nineteen. The language of the search interfaces is mostly English, and some of them allow searches in other languages. In some databases, the search can be extended to PubMed. Organizing Medical Networked Information, Catalogue et Index des Sites Médicaux Francophones, and Diseases, Disorders and Related Topics produced the best results. The usefulness of these databases as quick reference resources is clear. In addition, their lack of content overlap means that, for the user, they complement each other. Their continued survival faces three challenges: the instability of the Internet, maintenance costs, and lack of use in spite of their potential usefulness.

  18. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens knowledge about their territory.

  19. Innovations in communication technologies for measles supplemental immunization activities: lessons from Kenya measles vaccination campaign, November 2012

    PubMed Central

    Mbabazi, William B; Tabu, Collins W; Chemirmir, Caleb; Kisia, James; Ali, Nasra; Corkum, Melissa G; Bartley, Gene L

    2015-01-01

    Background: To achieve a measles-free world, effective communication must be part of all elimination plans. The choice of communication approaches must be evidence-based, locally appropriate, interactive and community-owned. In this article, we document the innovative approach of using house visits supported by a web-enabled mobile phone application to create a real-time platform for adaptive management of supplemental measles immunization days in Kenya. Methods: One thousand nine hundred and fifty-two Red Cross volunteers were recruited, trained and deployed to conduct house-to-house canvassing in 11 urban districts of Kenya. Three days before the campaigns, volunteers conducted house visits with a uniform approach and package of messages. All house visits were documented using a web-enabled mobile phone application (episurveyor®) that relayed the information collected in real time to all campaign management levels. During the campaigns, volunteers reported daily immunizations to their co-ordinators. Post-campaign house visits were also conducted within 4 days, to verify immunization of eligible children, assess information sources and detect adverse events following immunization. Results: Fifty-six per cent of the 164 643 households visited said that they had heard about the planned 2012 measles vaccination campaign 1–3 days before start dates. Twenty-five per cent of households would likely have missed the measles supplemental dose had they not been reassured by the house visit. Pre- and post-campaign reasons for refusal showed that targeted communication reduced misconceptions, fear of injections and trust in herbal remedies. Daily reporting of immunizations using mobile phones informed changes in service delivery plans for better immunization coverage. House visits were more often remembered (70%) as sources of information compared with traditional mass awareness channels like megaphones (41%) and radio (37%). Conclusions: In high-density settlements, house-to-house visits are easy and more penetrative compared with traditional media approaches. Using mobile phones to document campaign processes and outputs provides real-time evidence for service delivery planning to improve immunization coverage. PMID:24920218

  20. Building Student-Centered Web Sites in the K12 Classroom.

    ERIC Educational Resources Information Center

    Hall, Alison; Basile, Brigitte

    This paper examines the process of constructing a student-centered World Wide Web site and provides recommendations for improving this process. In the project, preservice teachers instructed the fifth grade students about how to design and develop a Web site on weather. The topics of the sessions included Internet ethics, using the Web,…

  1. An Exploratory Study of User Searching of the World Wide Web: A Holistic Approach.

    ERIC Educational Resources Information Center

    Wang, Peiling; Tenopir, Carol; Laymman, Elizabeth; Penniman, David; Collins, Shawn

    1998-01-01

    Examines Web users' behaviors and needs and tests a methodology for studying users' interaction with the Web. A process-tracing technique, together with tests of cognitive style, anxiety levels, and self-report computer experience, provided data on how users interact with the Web in the process of finding factual information. (Author/AEF)

  2. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  3. New options for national population surveys: The implications of internet and smartphone coverage.

    PubMed

    Couper, Mick P; Gremel, Garret; Axinn, William; Guyer, Heidi; Wagner, James; West, Brady T

    2018-07-01

    Challenges to survey data collection have increased the costs of social research via face-to-face surveys so much that it may become extremely difficult for social scientists to continue using these methods. A key drawback to less expensive Internet-based alternatives is the threat of biased results from coverage errors in survey data. The rise of Internet-enabled smartphones presents an opportunity to re-examine the issue of Internet coverage for surveys and its implications for coverage bias. Two questions (on Internet access and smartphone ownership) were added to the National Survey of Family Growth (NSFG), a U.S. national probability survey of women and men age 15-44, using a continuous sample design. We examine 16 quarters (4 years) of data, from September 2012 to August 2016. Overall, we estimate that 82.9% of the target NSFG population has Internet access, and 81.6% has a smartphone. Combined, this means that about 90.7% of U.S. residents age 15-44 have Internet access, via either traditional devices or a smartphone. We find some evidence of compensatory coverage when looking at key race/ethnicity and age subgroups. For instance, while Black teens (15-18) have the lowest estimated rate of Internet access (81.9%) and the lowest rate of smartphone usage (72.6%), an estimated 88.0% of this subgroup has some form of Internet access. We also examine the socio-demographic correlates of Internet and smartphone coverage, separately and combined, as indicators of technology access in this population. In addition, we look at the effect of differential coverage on key estimates produced by the NSFG, related to fertility, family formation, and sexual activity. While this does not address nonresponse or measurement biases that may differ for alternative modes, our paper has implications for possible coverage biases that may arise when switching to a Web-based mode of data collection, either for follow-up surveys or to replace the main face-to-face data collection. Copyright © 2018. Published by Elsevier Inc.
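
    The combined figure can be checked by inclusion-exclusion on the two reported rates; the overlap below (the share with both Internet access and a smartphone) is derived arithmetic, not a number reported in the record:

    ```python
    internet, smartphone, either = 0.829, 0.816, 0.907
    # Inclusion-exclusion: P(A and B) = P(A) + P(B) - P(A or B)
    both = internet + smartphone - either
    print(f"Implied share with both: {both:.1%}")  # ~73.8%
    ```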

  4. Information as Power: An Anthology of Selected United States Army War College Student Papers. Volume 1

    DTIC Science & Technology

    2006-01-01

    audience. Things like web logs are providing a different way to access news and commentary. Bloggers invite the contributions of their readers...enough, there were wide differences in media coverage – some left leaning press and bloggers were quick to criticize this as a “war crime within a war...in the JOC. It is now after dinner and we are less than 12 hours from election time on January 29, 2005. The beautiful part is that this

  5. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.

    1981-01-01

    The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.

  6. Extending CATH: increasing coverage of the protein structure universe and linking structure with function

    PubMed Central

    Cuff, Alison L.; Sillitoe, Ian; Lewis, Tony; Clegg, Andrew B.; Rentzsch, Robert; Furnham, Nicholas; Pellegrini-Calace, Marialuisa; Jones, David; Thornton, Janet; Orengo, Christine A.

    2011-01-01

    CATH version 3.3 (class, architecture, topology, homology) contains 128 688 domains, 2386 homologous superfamilies and 1233 fold groups, and reflects a major focus on classifying structural genomics (SG) structures and transmembrane proteins, both of which are likely to add structural novelty to the database and therefore increase the coverage of protein fold space within CATH. For CATH version 3.4 we have significantly improved the presentation of sequence information and associated functional information for CATH superfamilies. The CATH superfamily pages now reflect both the functional and structural diversity within the superfamily and include structural alignments of close and distant relatives within the superfamily, annotated with functional information and details of conserved residues. A significantly more efficient search function for CATH has been established by implementing the search server Solr (http://lucene.apache.org/solr/). The CATH v3.4 webpages have been built using the Catalyst web framework. PMID:21097779

  7. Coverage, universal access and equity in health: a characterization of scientific production in nursing.

    PubMed

    Mendoza-Parra, Sara

    2016-01-01

    To characterize the scientific contribution nursing has made regarding coverage, universal access and equity in health, and to understand this production in terms of subjects and objects of study. This was cross-sectional, documentary research; the units of analysis were 97 journals and 410 documents, retrieved from the Web of Science in the category "nursing". Descriptors associated with coverage, access and equity in health, and the MeSH thesaurus, were applied. We used bibliometric laws and indicators, and analyzed the most important articles according to citation counts and collaboration. The document retrieval allowed for 25 years of observation of production, with institutional and international collaboration rates of 31% and 7%, respectively. The mean number of coauthors per article was 3.5, with a transience rate of 93%. The visibility index was 67.7%, and 24.6% of production was concentrated in four core journals. A review in the nursing category with 286 citations, and a Brazilian author who was the most productive, are worth highlighting. The nursing collective should strengthen future research on the subject, defining lines and sub-lines of research, increasing internationalization and building it with the joint participation of academia and the nursing community.

  8. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as collecting it. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  9. The Evolution of Landsat Data Systems and Science Products

    NASA Astrophysics Data System (ADS)

    Dwyer, J. L.

    2011-12-01

    The series of Landsat satellite missions has collected observations of the Earth's surface since 1972, resulting in the richest archive of remotely sensed data covering the global land masses at scales from which natural and human-induced changes can be distinguished. This observational record will continue to be extended with the launch of the Landsat Data Continuity Mission, or Landsat 8, in December of 2012, carrying the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) instruments. The data streams from these instruments will be significantly enhanced yet compatible with data acquired by heritage Landsat instruments. The radiometry and geometry of the OLI and TIRS data will be calibrated and combined into single, multi-band Level-1 terrain-corrected image products. Coefficients will be included in the product metadata to convert OLI data to at-sensor radiance or reflectance and to convert TIRS data to at-aperture radiances. A quality assurance band will contain pixel-based information regarding the presence of clouds, shadows, and terrain occlusion. The raw data as well as the Level-1 products will be stored online and made freely accessible through web coverage services. Rescaled Level-1 OLI and TIRS images will be made available via web mapping services to enable inventory searches and for ready use in geospatial applications. The architecture of the Landsat science data processing systems is scalable to accommodate additional processing and storage nodes in response to archive growth and increased demands on processing and distribution. The data collected by the various Landsat instruments have been inter-calibrated to enable the generation of higher-level science data products that are of consistent quality through time and from which geophysical and biophysical parameters of the land surface can be derived for use in process models and decision support systems. Data access and delivery services have evolved in response to increasing demand for Landsat data in a broad range of applications, and the demand for additional processing capabilities and services is expected to grow in the future to meet the needs for climate data records and essential climate variables.
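
    As a sketch of how such metadata gain/offset coefficients are conventionally applied (the variable names and the solar-elevation correction below are illustrative, not quoted from the product specification):

    ```python
    import numpy as np

    def dn_to_toa_reflectance(dn, refl_mult, refl_add, sun_elev_deg):
        """Scale quantized DNs by the metadata gain/offset, then correct
        for solar elevation to get top-of-atmosphere reflectance."""
        rho = refl_mult * dn.astype(np.float64) + refl_add
        return rho / np.sin(np.deg2rad(sun_elev_deg))
    ```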

  10. Online, On Demand Access to Coastal Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Long, J.; Bristol, S.; Long, D.; Thompson, S.

    2014-12-01

    Process-based numerical models for coastal waves, water levels, and sediment transport are initialized with digital elevation models (DEM) constructed by interpolating and merging bathymetric and topographic elevation data. These gridded surfaces must seamlessly span the land-water interface and may cover large regions where the individual raw data sources are collected at widely different spatial and temporal resolutions. In addition, the datasets are collected from different instrument platforms with varying accuracy and may or may not overlap in coverage. The lack of available tools and difficulties in constructing these DEMs lead scientists to 1) rely on previously merged, outdated, or over-smoothed DEMs; 2) discard more recent data that covers only a portion of the DEM domain; and 3) use inconsistent methodologies to generate DEMs. The objective of this work is to address the immediate need of integrating land and water-based elevation data sources and streamline the generation of a seamless data surface that spans the terrestrial-marine boundary. To achieve this, the U.S. Geological Survey (USGS) is developing a web processing service to format and initialize geoprocessing tasks designed to create coastal DEMs. The web processing service is maintained within the USGS ScienceBase data management system and has an associated user interface. Through the map-based interface, users define a geographic region that identifies the bounds of the desired DEM and a time period of interest. This initiates a query for elevation datasets within federal science agency data repositories. A geoprocessing service is then triggered to interpolate, merge, and smooth the data sources creating a DEM based on user-defined configuration parameters. Uncertainty and error estimates for the DEM are also returned by the geoprocessing service. Upon completion, the information management platform provides access to the final gridded data derivative and saves the configuration parameters for future reference. The resulting products and tools developed here could be adapted to future data sources and projects beyond the coastal environment.
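
    The record describes the service only at a high level; a client call might look roughly like the sketch below, in which the endpoint URL, parameter names and response fields are all invented for illustration:

    ```python
    import requests

    # Hypothetical endpoint and parameters; the actual API of the
    # USGS/ScienceBase geoprocessing service is not published here.
    job = requests.post(
        "https://sciencebase.example.gov/coastal-dem/jobs",
        json={
            "bbox": [-75.2, 35.1, -75.0, 35.3],          # lon/lat DEM bounds
            "time_range": ["2012-01-01", "2014-01-01"],  # data-collection window
            "cell_size_m": 10.0,                         # output grid resolution
        },
        timeout=30,
    )
    status_url = job.json()["status_url"]  # poll until the DEM and its
                                           # uncertainty grid are ready
    ```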

  11. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software update. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualization and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system clearly demonstrates the feasibility of performing image processing and plan evaluation through a web browser and exhibits potential for future cloud-based radiotherapy.

  12. [Development of a consented set of criteria to evaluate post-rehabilitation support services].

    PubMed

    Parzanka, Susanne; Himstedt, Christian; Deck, Ruth

    2015-01-01

    Existing rehabilitation aftercare offers in Germany are heterogeneous, and there is a lack of transparency in terms of indications and methods as well as of (nationwide) availability and financial coverage. There is also no systematic and transparent synopsis. To close this gap, a systematic review was conducted and a web-based database created for post-rehabilitation support. To allow a consistent assessment of the included aftercare offers, a quality profile of universally valid criteria was developed. This paper outlines the scientific approach. The procedure adapts the RAND/UCLA method, with the participation of the advisory board of the ReNa project. Preparations for the set included systematic searches for possible criteria to assess the quality of aftercare offers. These criteria were first collected without any pre-selection. Each item of the adjusted collection was evaluated by each member of the advisory board with respect to "relevance", "feasibility" and "suitability for public coverage". Inter-rater analysis was conducted using the median rating and a classification into consensus and dissent. All items that were considered "relevant" and "feasible" in the three stages of consensus building and deemed "suitable for public coverage" were transferred into the final set of criteria (ReNa set). A total of 82 publications were selected out of the 656 findings taken into account, which delivered 3,603 criteria of possible initial relevance. After removal of 2,598 redundant criteria, the panel needed to assess a set of 1,005 items. Finally, we performed a quality assessment of aftercare offers using a set of 35 descriptive criteria merged into 8 conceptual clusters. The consented ReNa set of 35 items delivers a first generally valid tool to describe the quality of structures, standards and processes of aftercare offers. The project thus developed into a complete collection of profiles characterizing each post-rehabilitation support service included in the database. Copyright © 2015. Published by Elsevier GmbH.

  13. Optimized Global Digital Elevation Data Records (Invited)

    NASA Astrophysics Data System (ADS)

    Kobrick, M.; Farr, T.; Crippen, R. E.

    2009-12-01

    The Shuttle Radar Topography Mission (SRTM) used radar interferometry to map the Earth's topography between ±60° latitude - representing 80% of the land surface. The resulting digital elevation models bettered existing topographic data sets (including restricted military data) in accuracy, areal coverage and uniformity by several orders of magnitude, and the resulting data records have found broad application in most of the geosciences, military operations, even Google Earth. Despite their popularity, the SRTM data have several limitations, including lack of coverage in polar regions and occasional small voids, or areas of no data, in regions of high slope or low radar backscatter. Fortunately, additional data sets have become available that, although lacking SRTM's data quality, are sufficient to mitigate many of these limitations. Primary among these is the Global Digital Elevation Model (GDEM) produced from ASTER stereo pairs. The MEaSUREs program is sponsoring an effort to merge these sets to produce and distribute an improved collection of data records that will optimize the topographic data, as well as make available additional non-topographic data products from the SRTM mission. There are four main areas of effort: (1) A systematic program to combine SRTM elevation data with those from other sensors, principally GDEM but also including SPOT stereo, the USGS’s National Elevation Data Set and others, to fill voids in the DEMs according to a prioritized plan, as well as extend the coverage beyond the current 60° latitude limit. (2) Combine the topographic data records with ICESat laser altimeter topography profiles to produce and distribute data records with enhanced ground control. (3) Document the existing SRTM radar image and ancillary data records, as well as generate image mosaics at multiple scales and distribute them via the world wide web. (4) Generate, document and distribute a standard and representative set of SRTM raw radar echo data, along with the appropriate ancillary tracking and pointing data necessary to process the echoes into DEMs using improved algorithms or
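
    A minimal sketch of the void-filling step, assuming the primary (SRTM) and secondary (e.g. GDEM) rasters are already co-registered on the same grid and vertical datum; production merging additionally blends edges and removes inter-dataset biases:

    ```python
    import numpy as np

    def fill_voids(primary, secondary, nodata=-32768):
        """Replace nodata cells in the primary DEM with values taken
        from a co-registered secondary DEM."""
        filled = primary.copy()
        voids = primary == nodata
        filled[voids] = secondary[voids]
        return filled
    ```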

  14. OneGeology Web Services and Portal as a global geological SDI - latest standards and technology

    NASA Astrophysics Data System (ADS)

    Duffy, Tim; Tellez-Arenas, Agnes

    2014-05-01

    The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 by the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-five-star service accreditation scheme utilising the ISO/OGC Web Map Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simple lithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation, to query their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standard based WMS (Web Map Service) from an available WWW server. This may be hosted either within the Geological Survey or at a neighbouring, regional or other institution that offers to serve the data for them, i.e. offers to help technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together, and it is now possible for European Geological Surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf). The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services and has new functionality to view and use such services, including multiple projection support. Keywords: OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
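
    For reference, a WMS 1.3.0 GetMap request of the kind these services answer can be assembled as below; the endpoint and layer name are placeholders rather than an actual OneGeology service:

    ```python
    from urllib.parse import urlencode

    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": "example_geology_layer",  # placeholder layer name
        "STYLES": "",                       # default style
        "CRS": "EPSG:4326",
        "BBOX": "49.0,-8.0,61.0,2.0",       # lat/lon axis order for EPSG:4326 in WMS 1.3
        "WIDTH": 800, "HEIGHT": 600,
        "FORMAT": "image/png",
    }
    url = "https://example.org/ogc/wms?" + urlencode(params)
    ```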

  15. NMRPro: an integrated web component for interactive processing and visualization of NMR spectra.

    PubMed

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-07-01

    The popularity of using NMR spectroscopy in metabolomics and natural products has driven the development of an array of NMR spectral analysis tools and databases. In particular, web applications have become well used recently because they are platform-independent and easy to extend through reusable web components. Currently available web applications provide the analysis of NMR spectra. However, they still lack the necessary processing and interactive visualization functionalities. To overcome these limitations, we present NMRPro, a web component that can be easily incorporated into current web applications, enabling easy-to-use online interactive processing and visualization. NMRPro integrates server-side processing with client-side interactive visualization through three parts: a Python package to efficiently process large NMR datasets on the server side, a Django app managing server-client interaction, and SpecdrawJS for client-side interactive visualization. Demo and installation instructions are available at http://mamitsukalab.org/tools/nmrpro/. Contact: mohamed@kuicr.kyoto-u.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Supporting Reflective Activities in Information Seeking on the Web

    NASA Astrophysics Data System (ADS)

    Saito, Hitomi; Miwa, Kazuhisa

    Recently, many opportunities have emerged to use the Internet in daily life and classrooms. However, with the growth of the World Wide Web (Web), it is becoming increasingly difficult to find target information on the Internet. In this study, we explore a method for developing users' information-seeking ability on the Web and construct a search-process feedback system supporting reflective activities in information seeking on the Web. Reflection is defined as a cognitive activity for monitoring, evaluating, and modifying one's thinking and process. In the field of learning science, many researchers have investigated reflective activities that facilitate learners' problem solving and deep understanding. The characteristics of this system are: (1) to show learners' search processes on the Web, described on the basis of a cognitive schema, and (2) to prompt learners to reflect on their search processes. We expect that users of this system can reflect on their search processes by receiving information on their own search processes provided by the system, and that these types of reflective activity help them to deepen their understanding of information-seeking activities. We conducted an experiment to investigate the effects of our system. The experimental results confirmed that (1) the system actually facilitated the learners' reflective activities by providing process visualization and prompts, and (2) the learners who reflected on their search processes more actively understood their own search processes more deeply.

  17. The Effects of a Web-Based Nursing Process Documentation Program on Stress and Anxiety of Nursing Students in South Korea.

    PubMed

    Lee, Eunjoo; Noh, Hyun Kyung

    2016-01-01

    To examine the effects of a web-based nursing process documentation system on the stress and anxiety of nursing students during their clinical practice. A quasi-experimental design was employed. The experimental group (n = 110) used a web-based nursing process documentation program for their case reports as part of assignments for a clinical practicum, whereas the control group (n = 106) used traditional paper-based case reports. Stress and anxiety levels were measured with a numeric rating scale before, 2 weeks after, and 4 weeks after using the web-based nursing process documentation program during a clinical practicum. The data were analyzed using descriptive statistics, t tests, chi-square tests, and repeated-measures analyses of variance. Nursing students who used the web-based nursing process documentation program showed significantly lower levels of stress and anxiety than the control group. A web-based nursing process documentation program could be used to reduce the stress and anxiety of nursing students during a clinical practicum, which would ultimately benefit nursing students by increasing the satisfaction with and effectiveness of the clinical practicum. © 2015 NANDA International, Inc.

  18. VisSearch: A Collaborative Web Searching Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2005-01-01

    VisSearch is a collaborative Web searching environment intended for sharing Web search results among people with similar interests, such as college students taking the same course. It facilitates students' Web searches by visualizing various Web searching processes. It also collects the visualized Web search results and applies an association rule…

  19. Web site development: applying aesthetics to promote breast health education and awareness.

    PubMed

    Thomas, Barbara; Goldsmith, Susan B; Forrest, Anne; Marshall, Renée

    2002-01-01

    This article describes the process of establishing a Web site as part of a collaborative project using visual art to promote breast health education. The need for a more "user-friendly", comprehensive breast health Web site that is aesthetically rewarding was identified after an analysis of current Web sites available through the World Wide Web. Two predetermined sets of criteria, accountability and aesthetics, were used to analyze these sites and to generate ideas for creating a breast health education Web site using visual art. Results of the analyses conducted are included, as well as the factors to consider when incorporating them into a Web site. The process specified is thorough and can be applied to establish a Web site that is aesthetically rewarding and informative for a variety of educational purposes.

  20. Repurposing of open data through large scale hydrological modelling - hypeweb.smhi.se

    NASA Astrophysics Data System (ADS)

    Strömbäck, Lena; Andersson, Jafet; Donnelly, Chantal; Gustafsson, David; Isberg, Kristina; Pechlivanidis, Ilias; Strömqvist, Johan; Arheimer, Berit

    2015-04-01

    Hydrological modelling demands large amounts of spatial data, such as soil properties, land use, topography, lakes and reservoirs, ice and snow coverage, water management (e.g. irrigation patterns and regulations), meteorological data and observed water discharge in rivers. By using such data, the hydrological model will in turn provide new data that can be used for new purposes (i.e. re-purposing). This presentation will give an example of how readily available open data from public portals have been re-purposed by using the Hydrological Predictions for the Environment (HYPE) model in a number of large-scale model applications covering numerous subbasins and rivers. HYPE is a dynamic, semi-distributed, process-based, and integrated catchment model. The model output is launched as new Open Data at the web site www.hypeweb.smhi.se to be used for (i) Climate change impact assessments on water resources and dynamics; (ii) The European Water Framework Directive (WFD) for characterization and development of measure programs to improve the ecological status of water bodies; (iii) Design variables for infrastructure constructions; (iv) Spatial water-resource mapping; (v) Operational forecasts (1-10 days and seasonal) on floods and droughts; (vi) Input to oceanographic models for operational forecasts and marine status assessments; (vii) Research. The following regional domains have been modelled so far with different resolutions (number of subbasins within brackets): Sweden (37 000), Europe (35 000), Arctic basin (30 000), La Plata River (6 000), Niger River (800), Middle-East North-Africa (31 000), and the Indian subcontinent (6 000). The HYPE web site provides several interactive web applications for exploring results from the models. The user can explore an overview of various water variables for historical and future conditions. Moreover, the user can explore and download historical time series of discharge for each basin and compare the performance of the model against observed river flow. The presentation will describe the Open Data sources used, show the functionality of the web site and discuss model performance and experience from this world-wide hydrological modelling of multiple basins using open data.

  1. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols, the advantages offered by the Grid technology - such as providing secure interoperability between the distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain represented by the OGC Web services with the Grid domain represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the complex functionality specific to geospatial applications, the high-power computation and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is visible from the enviroGRIDS portal and, consequently, from end-user applications such as decision-maker/citizen-oriented applications. The enviroGRIDS portal is the user's single point of entry into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
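
    As an illustration of the OGC side of this interoperability, a WCS 2.0 GetCoverage request in key-value-pair form follows the pattern below (the endpoint and coverage identifier are placeholders):

    ```python
    from urllib.parse import urlencode

    kvp = urlencode([
        ("SERVICE", "WCS"), ("VERSION", "2.0.1"), ("REQUEST", "GetCoverage"),
        ("COVERAGEID", "black_sea_catchment_dem"),  # placeholder coverage id
        ("SUBSET", "Lat(41.0,47.0)"),               # spatial trim, latitude
        ("SUBSET", "Long(27.0,42.0)"),              # spatial trim, longitude
        ("FORMAT", "image/tiff"),
    ])
    url = "https://example.org/ogc/wcs?" + kvp
    ```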

  2. Novel data sources for women's health research: mapping breast screening online information seeking through Google trends.

    PubMed

    Fazeli Dehkordy, Soudabeh; Carlos, Ruth C; Hall, Kelli S; Dalton, Vanessa K

    2014-09-01

    Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows Internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. To capture the temporal variations of information seeking about dense breasts, the Web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Newsworthy events and legislative actions appear to correlate well with peaks in search volume for "dense breast". Geographic regions with the highest search volumes have passed, denied, or are currently considering dense breast legislation. Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
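
    The study worked through the Google Trends web interface itself; for readers who want to reproduce this kind of query programmatically, the third-party pytrends library (not used by the authors) supports roughly the following:

    ```python
    from pytrends.request import TrendReq  # unofficial Google Trends client

    pytrends = TrendReq(hl="en-US")
    pytrends.build_payload(["dense breast"], timeframe="2012-01-01 2014-06-30", geo="US")
    over_time = pytrends.interest_over_time()                    # weekly search-volume index
    by_state = pytrends.interest_by_region(resolution="REGION")  # state-level comparison
    ```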

  3. Quality of health information on acute myocardial infarction and stroke in the world wide web.

    PubMed

    Bastos, Ana; Paiva, Dagmara; Azevedo, Ana

    2014-01-01

    The quality of health information on the Internet may be low. This is a concerning issue in cardiovascular diseases, which warrant patient self-management. We aimed to assess the quality of Portuguese websites as a source of health information on acute myocardial infarction and stroke. We used the search terms 'enfarte miocardio' and 'acidente vascular cerebral' (Portuguese terms for myocardial infarction and stroke) on Google®, on April 5th and 7th 2011, respectively, using Internet Explorer®. The first 200 URLs retrieved in each search were independently visited, and Portuguese websites in Portuguese language were selected. We analysed and classified 121 websites for structural characteristics, information coverage and accuracy of the web pages with items defined a priori, for trustworthiness in general according to the Health on the Net Foundation, and regarding treatments using the DISCERN instrument (48 websites). Websites were most frequently commercial (49.5%), not exclusively dedicated to acute myocardial infarction/stroke (94.2%), and with information on medical facts (59.5%), using images, video or animation (60.3%). Websites' trustworthiness was low. None of the websites displayed the Health on the Net Foundation seal. Acute myocardial infarction/stroke websites differed in information coverage, but the accuracy of the information was acceptable, although often incomplete. The quality of information on acute myocardial infarction/stroke in Portuguese websites was acceptable. Trustworthiness was low, impairing users' ability to identify potentially more reliable content.

  4. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcano eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in the OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance expandability to integrate additional outside data/map sources.

  6. Nuclear science abstracts (NSA) database 1948–1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the International Nuclear Science and Technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by the Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12–18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV. Research Center (GRC) service. GRC is a single online web-based search service to well known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  7. Enhancing acronym/abbreviation knowledge bases with semantic information.

    PubMed

    Torii, Manabu; Liu, Hongfang

    2007-10-11

    In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. Toward the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF,LF) derived from text, we i) assess the coverage of LFs and pairs (SF,LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic category and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF,LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.

  8. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    PubMed

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. GMZ: A GML Compression Model for WebGIS

    NASA Astrophysics Data System (ADS)

    Khandelwal, A.; Rajan, K. S.

    2017-09-01

    Geography Markup Language (GML) is an XML specification for expressing geographical features. Defined by the Open Geospatial Consortium (OGC), it is widely used for storage and transmission of maps over the Internet. XML schemas provide the convenience of defining custom feature profiles in GML for specific needs, as seen in the widely popular CityGML, simple features profile, coverage, etc. The simple features profile (SFP) is a simpler subset of GML with support for point, line and polygon geometries. SFP has been constructed to cover the most commonly used GML geometries. Web Feature Service (WFS) serves query results in SFP by default. But SFP falls short of being an ideal choice due to its high verbosity and size-heavy nature, which provides immense scope for compression. GMZ is a lossless compression model developed for SFP-compliant GML files. Our experiments indicate GMZ achieves reasonably good compression ratios and can be useful in WebGIS-based applications.

  10. Enabling Interactive Measurements from Large Coverage Microscopy

    PubMed Central

    Bajcsy, Peter; Vandecreme, Antoine; Amelot, Julien; Chalfoun, Joe; Majurski, Michael; Brady, Mary

    2017-01-01

    Microscopy could be an important tool for characterizing stem cell products if quantitative measurements could be collected over multiple spatial and temporal scales. Cells change states over time and are several orders of magnitude smaller than cell products, but modern microscopes are already capable of imaging large spatial areas, repeating imaging over time, and acquiring images over several spectra. However, characterizing stem cell products from such large image collections is challenging because of data size, required computations, and the lack of interactive quantitative measurements needed to determine release criteria. We present a measurement web system consisting of available algorithms, extensions to a client-server framework using Deep Zoom, and the configuration know-how to provide the information needed for inspecting the quality of a cell product. The cell and other data sets are accessible via the prototype web-based system at http://isg.nist.gov/deepzoomweb. PMID:28663600

  11. Rail-RNA: scalable analysis of RNA-seq splicing and coverage.

    PubMed

    Nellore, Abhinav; Collado-Torres, Leonardo; Jaffe, Andrew E; Alquicira-Hernández, José; Wilks, Christopher; Pritt, Jacob; Morton, James; Leek, Jeffrey T; Langmead, Ben

    2017-12-15

    RNA sequencing (RNA-seq) experiments now span hundreds to thousands of samples. Current spliced alignment software is designed to analyze each sample separately. Consequently, no information is gained from analyzing multiple samples together, and extra work is required to obtain analysis products that incorporate data from across samples. We describe Rail-RNA, a cloud-enabled spliced aligner that analyzes many samples at once. Rail-RNA eliminates redundant work across samples, making it more efficient as samples are added. For many samples, Rail-RNA is more accurate than annotation-assisted aligners. We use Rail-RNA to align 667 RNA-seq samples from the GEUVADIS project on Amazon Web Services in under 16 h for US$0.91 per sample. Rail-RNA outputs alignments in SAM/BAM format, but it also outputs (i) base-level coverage bigWigs for each sample; (ii) coverage bigWigs encoding normalized mean and median coverages at each base across the samples analyzed; and (iii) exon-exon splice junctions and indels (features) in columnar formats that juxtapose coverages in the samples in which a given feature is found. Supplementary outputs are ready for use with downstream packages for reproducible statistical analysis. We use Rail-RNA to identify expressed regions in the GEUVADIS samples and show that both annotated and unannotated (novel) expressed regions exhibit consistent patterns of variation across populations and with respect to known confounding variables. Rail-RNA is open-source software available at http://rail.bio. anellore@gmail.com or langmea@cs.jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  12. Impact of a website-based educational program for increasing vaccination coverage among adolescents.

    PubMed

    Esposito, Susanna; Bianchini, Sonia; Tagliabue, Claudia; Umbrello, Giulia; Madini, Barbara; Di Pietro, Giada; Principi, Nicola

    2018-04-03

    Data regarding the use of technology to improve adolescents' knowledge of vaccines are scarce. The main aim of this study was to evaluate whether different web-based educational programmes for adolescents might increase their vaccination coverage. Overall, 917 unvaccinated adolescents (389 males, 42.4%; mean age ± standard deviation, 14.0 ± 2.2 years) were randomized 1:1:1 into the following groups: no intervention (n = 334), website educational program only (n = 281), or website plus face-to-face lesson (n = 302). The use of the website plus the lesson significantly increased the overall knowledge of various aspects of vaccine-preventable disease and reduced the fear of vaccines (p < 0.001). A significant increase in vaccination coverage was observed for tetanus, diphtheria, acellular pertussis and conjugated meningococcal ACYW vaccines in the 2 groups using the website (p < 0.001), and better results were observed in the group that had also received the lesson; in this last group, a significant increase in vaccination coverage was also observed for the meningococcal B vaccine (p < 0.001). Overall, the majority of the participants liked the experience of the website, although they considered it important to further discuss vaccines with parents, experts and teachers. This study is the first to evaluate website-based education of adolescents covering all of the vaccines recommended for this age group. Our results demonstrate the possibility of increasing vaccination coverage by using a website-based educational program with tailored information. However, to be most effective, this program should be supplemented with face-to-face discussions of vaccines at school and at home. Thus, specific education should also include teachers and parents so that they will be prepared to discuss with adolescents what is true and false in the vaccination field.

  13. Immunization, urbanization and slums - a systematic review of factors and interventions.

    PubMed

    Crocker-Buque, Tim; Mindra, Godwin; Duncan, Richard; Mounier-Jack, Sandra

    2017-06-08

    In 2014, over half (54%) of the world's population lived in urban areas and this proportion will increase to 66% by 2050. This urbanizing trend has been accompanied by an increasing number of people living in urban poor communities and slums. Lower immunization coverage is found in poorer urban dwellers in many contexts. This study aims to identify factors associated with immunization coverage in poor urban areas and slums, and to identify interventions to improve coverage. We conducted a systematic review, searching Medline, Embase, Global Health, CINAHL, Web of Science and The Cochrane Database with broad search terms for studies published between 2000 and 2016. Of 4872 unique articles, 327 abstracts were screened, leading to 63 included studies: 44 considering factors and 20 evaluating interventions (one in both categories) in 16 low or middle-income countries. A wide range of socio-economic characteristics were associated with coverage in different contexts. Recent rural-urban migration had a universally negative effect. Parents commonly reported lack of awareness of immunization importance and difficulty accessing services as reasons for under-immunization of their children. Physical distance to clinics and aspects of service quality also impacted uptake. We found evidence of effectiveness for interventions involving multiple components, especially if they have been designed with community involvement. Outreach programmes were effective where physical distance was identified as a barrier. Some evidence was found for the effective use of SMS (text) messaging services, community-based education programmes and financial incentives, which warrant further evaluation. No interventions were identified that provided services to migrants from rural areas. Different factors affect immunization coverage in different urban poor and slum contexts. Immunization services should be designed in collaboration with slum-dwelling communities, considering the local context. Interventions should be designed and tested to increase immunization in migrants from rural areas.

  14. Challenges in Cost-Effectiveness Analysis Modelling of HPV Vaccines in Low- and Middle-Income Countries: A Systematic Review and Practice Recommendations.

    PubMed

    Ekwunife, Obinna I; O'Mahony, James F; Gerber Grote, Andreas; Mosch, Christoph; Paeck, Tatjana; Lhachimi, Stefan K

    2017-01-01

    Low- and middle-income countries (LMICs) face a number of challenges in implementing cervical cancer prevention programmes that do not apply in high-income countries. This review assessed how context-specific challenges of implementing cervical cancer prevention strategies in LMICs were accounted for in existing cost-effectiveness analysis (CEA) models of human papillomavirus (HPV) vaccination. The databases of MEDLINE, EMBASE, NHS Economic Evaluation Database, EconLit, Web of Science, and the Center for the Evaluation of Value and Risk in Health (CEA) Registry were searched for studies published from 2006 to 2015. A descriptive, narrative, and interpretative synthesis of data was undertaken. Of the 33 studies included in the review, the majority acknowledged cost per vaccinated girl (CVG) (26 studies) and vaccine coverage rate (21 studies) as particular challenges for LMICs, while nine studies identified screening coverage rate as a challenge. Most of the studies estimated CVG as a composite of different cost items. However, the basis for the items within this composite cost was unclear. The majority used an assumption rather than an observed rate to represent screening and vaccination coverage rates. CVG, vaccine coverage and screening coverage were shown by some studies through sensitivity analyses to reverse the conclusions regarding cost-effectiveness, thereby significantly affecting policy recommendations. While many studies recognized aspects of the particular challenges of HPV vaccination in LMICs, greater efforts need to be made in adapting models to account for these challenges. These include adapting costings of HPV vaccine delivery from other countries, learning from the outcomes of cervical cancer screening programmes in the same geographical region, and taking into account the country's previous experience with other vaccination programmes.

  15. Development of an Internet-Based Obesity Prevention Program for Children

    PubMed Central

    Gabriele, Jeanne M.; Stewart, Tiffany M.; Sample, Alicia; Davis, Allison B.; Allen, Ray; Martin, Corby K.; Newton, Robert L.; Williamson, Donald A.

    2010-01-01

    Background Childhood obesity is a growing problem, particularly among rural Louisiana school children. Traditionally, school-based obesity prevention programs have used a primary prevention approach. Finding methods to deliver secondary prevention programs to large numbers of students without singling out overweight students has been a challenge. An innovative approach to achieving this goal is through use of an Internet intervention targeted toward a student's weight status. This article describes the Louisiana (LA) Health Internet intervention, including the student Web site, the Internet counselor Web site, and the Internet counseling process. Method The LA Health Internet intervention had separate interfaces for students and Internet counselors. The main features of the student site were behavioral weight loss lessons, lesson activities, chat with an Internet counselor, and email. The Internet counselor site contained these same features, plus a student directory and various means of obtaining student information to guide counseling. Based on their baseline weight status, students received lessons and counseling that promoted either weight loss or weight maintenance. Intervention was delivered during class time, and teachers scheduled Internet counseling sessions with intervention personnel. Results The LA Health Internet intervention was initially implemented within 14 schools; 773 students were granted access to the site. From Fall 2007 to Spring 2009, 1174 hours of Internet counselor coverage were needed to implement the Internet counseling component of this intervention. Conclusion The LA Health Internet intervention is an innovative and feasible method of delivering a secondary prevention program within a school setting to large numbers of students. PMID:20513340

  16. On-demand server-side image processing for web-based DICOM image display

    NASA Astrophysics Data System (ADS)

    Sakusabe, Takaya; Kimura, Michio; Onogi, Yuzo

    2000-04-01

    Low-cost image delivery is needed in modern networked hospitals. If a hospital has hundreds of clients, the cost of client systems becomes a major problem, so a Web-based system is the most effective solution. A plain Web browser, however, cannot display medical images with certain image processing applied, such as a lookup-table transformation. We developed a Web-based medical image display system using a Web browser and on-demand server-side image processing. All images displayed on a Web page are generated from DICOM files on a server and delivered on demand. User interaction on the Web page is handled by a client-side scripting technology such as JavaScript. This combination gives the look and feel of an imaging workstation, both in functionality and in speed. Real-time update of images while tracing mouse motion is achieved in the Web browser without any client-side image processing, which would otherwise require client-side plug-in technology such as Java Applets or ActiveX. We tested the performance of the system in three cases: a single client, a small number of clients on a fast network, and a large number of clients on a normal-speed network. The results show that the communication overhead is very slight and that the system scales well with the number of clients.
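
    A minimal sketch of such on-demand server-side rendering, assuming Flask, pydicom, NumPy and Pillow rather than whatever stack the authors used in 2000; the URL layout, file path and window/level query parameters are hypothetical.

      import io

      import numpy as np
      import pydicom
      from flask import Flask, request, send_file
      from PIL import Image

      app = Flask(__name__)

      @app.route("/image/<study_id>")
      def render(study_id):
          # Hypothetical storage layout; one DICOM file per study id.
          ds = pydicom.dcmread(f"/data/dicom/{study_id}.dcm")
          pixels = ds.pixel_array.astype(np.float32)
          # Lookup-table (window/level) transform applied on the server, so the
          # browser only ever receives a plain 8-bit PNG.
          center = float(request.args.get("wc", 40))
          width = max(float(request.args.get("ww", 400)), 1.0)
          lo, hi = center - width / 2, center + width / 2
          windowed = np.clip((pixels - lo) / (hi - lo), 0.0, 1.0) * 255.0
          buf = io.BytesIO()
          Image.fromarray(windowed.astype(np.uint8)).save(buf, format="PNG")
          buf.seek(0)
          return send_file(buf, mimetype="image/png")

    Client-side JavaScript then only needs to swap the image URL as the mouse moves, which is what keeps all pixel processing on the server.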

  17. The new Inventory of Italian Glaciers: Present knowledge, applied methods and preliminary results

    NASA Astrophysics Data System (ADS)

    Smiraglia, Claudio; Diolaiuti, Guglielmina; D'Agata, Carlo; Maragno, Davide; Baroni, Carlo; Mortara, Gianni; Perotti, Luigi; Bondesan, Aldino; Salvatore, Cristina; Vagliasindi, Marco; Vuillermoz, Elisa

    2013-04-01

    A new glacier inventory is an indispensable requirement in Italy, given the importance of evaluating present glacier coverage and the recent changes driven by climate. Furthermore, Alpine glaciers represent a non-negligible water and tourism resource; managing and promoting them requires knowledge of their distribution, size and features. The first Italian Glacier Inventory dates back to 1959-1962. It was compiled by the Italian Glaciological Committee (CGI) in cooperation with the National Research Council (CNR); this first inventory was mainly based on field data coupled with photographs (acquired in the field) and high-resolution maps. The Italian glaciation comprised 754 ice bodies which altogether covered 525 km2. In the Eighties a new inventory was compiled to insert Italian data into the World Glacier Inventory (WGI); aerial photos taken at the end of the Seventies (in some cases affected by a high, non-negligible snow coverage) were used as the main source of data. No other national inventory was compiled after that period. Nevertheless, during the last decade most of the Italian Alpine regions have produced regional and local glacier inventories, which in several cases are also available and can be queried through web sites and web GIS applications. The actual need is now to obtain a complete, homogeneous and contemporary picture of the Italian glaciation which encompasses the already available regional and local data and all the new updated information coming from new sources (e.g. orthophotos, satellite images, etc.). The challenge was accepted by the University of Milan, the EvK2CNR Committee and the Italian Glaciological Committee who, with the sponsorship of Levissima Spa, are presently working to compile the new updated Italian Glacier Inventory. The first project step is to produce a unique homogeneous glacier database including glacier boundary, surface area and the main fundamental glacier features (following the well-known guidelines of the World Glacier Monitoring Service summarized by Paul et al., 2010). The identification of the Italian glacier bodies and the evaluation of glacier area and main features are performed by analysing aerial orthophotos acquired in the time frame 2007-2012 (pixel size 0.5 m). Moreover, the database will be improved and updated by analysing regional data and by processing high-resolution satellite images acquired over the last 2 years. In Lombardy, the analysis of the 2007 orthophotos permitted evaluation of a glacier coverage of about 90 km2. This value is about 75% of the glacier surface area reported for Lombardy glaciers in the Italian Inventory compiled by CGI-CNR in 1959-62.

  18. JADDS - towards a tailored global atmospheric composition data service for CAMS forecasts and reanalysis

    NASA Astrophysics Data System (ADS)

    Stein, Olaf; Schultz, Martin G.; Rambadt, Michael; Saini, Rajveer; Hoffmann, Lars; Mallmann, Daniel

    2017-04-01

    Global model data of atmospheric composition produced by the Copernicus Atmospheric Monitoring Service (CAMS) has been collected at FZ Jülich since 2010 and serves as boundary conditions for Regional Air Quality (RAQ) modellers world-wide. RAQ models need time-resolved meteorological as well as chemical lateral boundary conditions for their individual model domains. While the meteorological data usually come from well-established global forecast systems, the chemical boundary conditions are not always well defined. In the past, many models used 'climatic' boundary conditions for the tracer concentrations, which can lead to significant concentration biases, particularly for tracers with longer lifetimes which can be transported over long distances (e.g. over the whole northern hemisphere) with the mean wind. The Copernicus approach utilizes extensive near-realtime data assimilation of atmospheric composition data observed from space, which gives additional reliability to the global modelling data and is well received by the RAQ communities. An existing Web Coverage Service (WCS) for sharing these individually tailored model results is currently being re-engineered to make use of a modern, scalable database technology in order to improve performance, enhance flexibility, and allow the operation of catalogue services. The new Jülich Atmospheric Data Distributions Server (JADDS) adheres to the Web Coverage Service WCS 2.0 standard as defined by the Open Geospatial Consortium (OGC). This enables user groups to flexibly define the datasets they need by selecting a subset of chemical species or restricting the geographical boundaries or the length of the time series. The data are made available in the form of different catalogues stored locally on our server. In addition, the Jülich OWS Interface (JOIN) provides interoperable web services allowing for easy download and visualization of datasets delivered from WCS servers via the internet. We will present the prototype JADDS server and address the major issues identified when relocating large four-dimensional datasets into a RASDAMAN raster array database. So far the RASDAMAN support for data available in netCDF format is limited with respect to metadata related to variables and axes. For community-wide accepted solutions, selected data coverages shall result in downloadable netCDF files including metadata complying with the netCDF CF Metadata Conventions standard (http://cfconventions.org/). This can be achieved by adding custom metadata elements for RASDAMAN bands (model levels) on data ingestion. Furthermore, an optimization strategy for the ingestion of several TB of 4D model output data will be outlined.
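
    For illustration, a WCS 2.0 GetCoverage request with spatial and temporal subsetting, as a RAQ modeller might issue it against such a server, can be sketched in Python as below; the endpoint URL and coverage identifier are hypothetical, while the key-value-pair syntax follows the OGC WCS 2.0 core and subsetting extensions.

      import requests

      endpoint = "https://example.org/jadds/wcs"  # hypothetical endpoint
      params = [
          ("service", "WCS"),
          ("version", "2.0.1"),
          ("request", "GetCoverage"),
          ("coverageId", "CAMS_O3"),  # hypothetical coverage identifier
          ("subset", "Lat(35.0,72.0)"),  # regional model domain
          ("subset", "Long(-15.0,40.0)"),
          ("subset", 'ansi("2017-01-01T00:00:00Z","2017-01-07T00:00:00Z")'),
          ("format", "application/netcdf"),
      ]
      resp = requests.get(endpoint, params=params, timeout=120)
      resp.raise_for_status()
      with open("cams_o3_subset.nc", "wb") as fh:
          fh.write(resp.content)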

  19. Automatic cloud coverage assessment of Formosat-2 image

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2011-11-01

    The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated in a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images for the NSPO Image Processing System generally consists of two major steps. Firstly, an unsupervised K-means method is used to automatically estimate the cloud statistics of a Formosat-2 image. Secondly, cloud coverage is estimated from the Formosat-2 image by manual examination. Clearly, a more accurate Automatic Cloud Coverage Assessment (ACCA) method would increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, based mainly on research results from Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes pre-processing and post-processing analysis. For the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel re-examination, and a cross-band filter method. The Box-Counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis, increasing the efficiency of the manual examination.
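
    A toy version of the pre-processing stage, assuming scikit-learn and scikit-image, can combine an unsupervised two-class K-means split with Otsu's threshold on a single band; this sketches the idea only, not the authors' full cross-band pipeline.

      import numpy as np
      from skimage.filters import threshold_otsu
      from sklearn.cluster import KMeans

      def cloud_fraction(band):
          """Estimate cloud coverage (0..1) from one reflectance band."""
          flat = band.reshape(-1, 1).astype(np.float32)
          # Unsupervised 2-class K-means; the brighter cluster is taken as cloud.
          km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(flat)
          cloud_label = int(np.argmax(km.cluster_centers_.ravel()))
          kmeans_mask = km.labels_.reshape(band.shape) == cloud_label
          # Otsu's threshold as an independent check on the same band.
          otsu_mask = band > threshold_otsu(band)
          # Count only pixels flagged by both methods.
          return float(np.mean(kmeans_mask & otsu_mask))

      # Synthetic example: dark scene with one bright "cloud" patch.
      scene = np.random.rand(256, 256) * 0.2
      scene[60:120, 60:160] += 0.7
      print(f"estimated cloud coverage: {cloud_fraction(scene):.1%}")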

  20. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
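
    The core geoprocessing operation behind such a service can be sketched with SciPy's Gaussian kernel density estimator; the coordinates below are synthetic stand-ins for geocoded case addresses, and a REST endpoint would simply wrap the function.

      import numpy as np
      from scipy.stats import gaussian_kde

      def density_grid(lons, lats, n=100):
          """Smoothed case-density surface over the data's bounding box."""
          kde = gaussian_kde(np.vstack([lons, lats]))
          gx = np.linspace(lons.min(), lons.max(), n)
          gy = np.linspace(lats.min(), lats.max(), n)
          mx, my = np.meshgrid(gx, gy)
          return gx, gy, kde(np.vstack([mx.ravel(), my.ravel()])).reshape(n, n)

      rng = np.random.default_rng(0)
      # Synthetic cluster of cases near central Barcelona (approx. 2.17 E, 41.39 N).
      lons = rng.normal(2.17, 0.02, 300)
      lats = rng.normal(41.39, 0.02, 300)
      gx, gy, dens = density_grid(lons, lats)
      print("densest grid cell:", np.unravel_index(dens.argmax(), dens.shape))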

  1. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  2. The Four Levels of Web Site Development Expertise.

    ERIC Educational Resources Information Center

    Ingram, Albert L.

    2000-01-01

    Discusses the design of Web pages and sites and proposes a four-level model of Web development expertise that can serve as a curriculum overview or as a plan for an individual's professional development. Highlights include page design, media use, client-side processing, server-side processing, and site structure. (LRW)

  3. Multilingual event extraction for epidemic detection.

    PubMed

    Lejeune, Gaël; Brixtel, Romain; Doucet, Antoine; Lucas, Nadine

    2015-10-01

    This paper presents a multilingual news surveillance system applied to tele-epidemiology. It has been shown that multilingual approaches improve timeliness in the detection of epidemic events across the globe, eliminating the wait for local news to be translated into major languages. We present here a system to extract epidemic events in potentially any language, provided a Wikipedia seed for common disease names exists. The Daniel system presented herein relies on properties that are common to news writing (the journalistic genre), the most useful being repetition and saliency. Wikipedia is used to screen common disease names to be matched with repeated character strings. Language variations, such as declensions, are handled by processing text at the character level, rather than at the word level. This additionally makes it possible to handle various writing systems in a similar fashion. As no multilingual ground truth existed to evaluate the Daniel system, we built a multilingual corpus from the Web, and collected annotations from native speakers of Chinese, English, Greek, Polish and Russian, with no connection or interest in the Daniel system. This data set is available online freely, and can be used for the evaluation of other event extraction systems. Experiments for 5 languages out of 17 tested are detailed in this paper: Chinese, English, Greek, Polish and Russian. The Daniel system achieves an average F-measure of 82% in these 5 languages. It reaches 87% on BEcorpus, the state-of-the-art corpus in English, slightly below top-performing systems, which are tailored with numerous language-specific resources. The consistent performance of Daniel on multiple languages is an important contribution to the reactivity and the coverage of epidemiological event detection systems. Most event extraction systems rely on extensive resources that are language-specific. While their sophistication yields excellent results (over 90% precision and recall), it restricts their coverage in terms of languages and geographic areas. In contrast, in order to detect epidemic events in any language, the Daniel system only requires a list of a few hundred disease names and locations, which can actually be acquired automatically. The system can perform consistently well on any language, with precision and recall around 82% on average, according to this paper's evaluation. Daniel's character-based approach is especially interesting for morphologically rich and low-resourced languages. Because it exploits few resources and relies on state-of-the-art string matching algorithms, Daniel can process thousands of documents per minute on a simple laptop. In the context of epidemic surveillance, reactivity and geographic coverage are of primary importance, since no one knows where the next event will strike, and therefore in what vernacular language it will first be reported. By being able to process any language, the Daniel system offers unique coverage for poorly endowed languages, and can complement state-of-the-art techniques for major languages. Copyright © 2015 Elsevier B.V. All rights reserved.
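
    A minimal sketch of the character-level matching idea, with a made-up multilingual seed standing in for the Wikipedia-derived disease list: names are counted on raw character strings, with repetition used as the saliency signal, and no word segmentation or language-specific morphology is involved.

      def repeated_disease_mentions(text, disease_names):
          """Report disease names occurring at least twice, by raw substring count."""
          text_cf = text.casefold()  # robust lowercasing across alphabets
          counts = {}
          for name in disease_names:
              n = text_cf.count(name.casefold())
              if n >= 2:  # repetition as the saliency signal of news writing
                  counts[name] = n
          return counts

      seed = ["cholera", "грипп", "dengue"]  # any language, any writing system
      article = ("Cholera outbreak reported in the region... officials confirm "
                 "new cholera cases spreading along the river.")
      print(repeated_disease_mentions(article, seed))  # {'cholera': 2}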

  4. Crisis Communication in the Area of Risk Management: The CriCoRM Project.

    PubMed

    Scarcella, Carmelo; Antonelli, Laura; Orizio, Grazia; Rossmann, Costanze; Ziegler, Lena; Meyer, Lisa; Garcia-Jimenez, Leonarda; Losada, Jose Carlos; Correia, Joao; Soares, Joana; Covolo, Loredana; Lirangi, Enrico; Gelatti, Umberto

    2013-09-02

    During the last H1N1 pandemic, the importance of crisis communication as an essential part of health crisis management emerged. The Project aims specifically to improve the understanding of crisis communication dynamics and effective tools, and to allow public health institutions to communicate better with the public during health emergencies. The project will perform different activities: i) state of the art review; ii) identification of key stakeholders; iii) communicational analysis performed using data collected on stakeholder communication activities and their outcomes, considering the lessons learnt from the analysis of the reasons for differing public reactions during pandemics; iv) improvement of the existing guidelines; v) development of Web 2.0 tools such as a web platform and feed service, and implementation of impact assessment algorithms; vi) organization of exercises and training on these issues. In the context of health security policies at an EU level, the project aims to find a common and innovative approach to health crisis communication, the need for which was displayed by the differing reactions to the H1N1 pandemic policies. The focus on new social media tools aims to enhance the role of e-health, and the project aims to use these tools in the specific field of health institutions and citizens. The development of Web 2.0 tools for health crisis communication will allow an effective two-way exchange of information between public health institutions and citizens. An effective communication strategy will increase population compliance with public health recommendations. Significance for public health: The specific aim of the project is to develop a European strategy approach on how to communicate with the population and with the different stakeholder groups involved in the crisis management process, based on an analysis of the communication process during the H1N1 pandemic (content analysis of press releases, press coverage and forum discussions) and on interviews with key stakeholders in health crisis communication. The development of Web 2.0 tools providing rapid responses will allow real-time verification of awareness of social trends and citizens' response. Furthermore, the project would like to offer these resources to the EU Public Health Institutions and EU citizens to improve their interaction, and hence reinforce citizens' right to patient-centred health care. The project proposal has been designed in accordance with the general principles of ethics and the EU Charter of Fundamental Rights with regard to human rights, values, freedom, solidarity, and better protection of European citizens.

  5. Crisis Communication in the Area of Risk Management: The CriCoRM Project

    PubMed Central

    Scarcella, Carmelo; Antonelli, Laura; Orizio, Grazia; Rossmann, Costanze; Ziegler, Lena; Meyer, Lisa; Garcia-Jimenez, Leonarda; Losada, Jose Carlos; Correia, Joao; Soares, Joana; Covolo, Loredana; Lirangi, Enrico; Gelatti, Umberto

    2013-01-01

    Background During the last H1N1 pandemic, the importance of crisis communication as an essential part of health crisis management emerged. The Project aims specifically to improve the understanding of crisis communication dynamics and effective tools, and to allow public health institutions to communicate better with the public during health emergencies. Design and methods The Project will perform different activities: i) state of the art review; ii) identification of key stakeholders; iii) communicational analysis performed using data collected on stakeholder communication activities and their outcomes, considering the lessons learnt from the analysis of the reasons for differing public reactions during pandemics; iv) improvement of the existing guidelines; v) development of Web 2.0 tools such as a web platform and feed service, and implementation of impact assessment algorithms; vi) organization of exercises and training on these issues. Expected impact of the study for public health In the context of health security policies at an EU level, the project aims to find a common and innovative approach to health crisis communication, the need for which was displayed by the differing reactions to the H1N1 pandemic policies. The focus on new social media tools aims to enhance the role of e-health, and the project aims to use these tools in the specific field of health institutions and citizens. The development of Web 2.0 tools for health crisis communication will allow an effective two-way exchange of information between public health institutions and citizens. An effective communication strategy will increase population compliance with public health recommendations. Significance for public health The specific aim of the project is to develop a European strategy approach on how to communicate with the population and with the different stakeholder groups involved in the crisis management process, based on an analysis of the communication process during the H1N1 pandemic (content analysis of press releases, press coverage and forum discussions) and on interviews with key stakeholders in health crisis communication. The development of Web 2.0 tools providing rapid responses will allow real-time verification of awareness of social trends and citizens’ response. Furthermore, the project would like to offer these resources to the EU Public Health Institutions and EU citizens to improve their interaction, and hence reinforce citizens’ right to patient-centred health care. The project proposal has been designed in accordance with the general principles of ethics and the EU Charter of Fundamental Rights with regard to human rights, values, freedom, solidarity, and better protection of European citizens. PMID:25170491

  6. The Web as an Information Resource in K-12 Education: Strategies for Supporting Students in Searching and Processing Information

    ERIC Educational Resources Information Center

    Kuiper, Els; Volman, Monique; Terwel, Jan

    2005-01-01

    The use of the Web in K-12 education has increased substantially in recent years. The Web, however, does not support the learning processes of students as a matter of course. In this review, the authors analyze what research says about the demands that the use of the Web as an information resource in education makes on the support and supervision…

  7. [A solution for display and processing of DICOM images in web PACS].

    PubMed

    Xue, Wei-jing; Lu, Wen; Wang, Hai-yang; Meng, Jian

    2009-03-01

    We used Java Applet technology to support the display of DICOM images in an ordinary Web browser, and thereby to extend the processing functions for medical images. We first analyzed the DICOM file format and designed a class that acquires the pixel data, then designed two Applet classes: one processes the DICOM image, while the other displays the DICOM image processed by the first. Both are embedded in the view page and communicate through the AppletContext object. The method designed in this paper lets users display and process DICOM images directly in an ordinary Web browser, giving a Web PACS not only the advantages of the B/S model but also those of the C/S model. The Java Applet is the key to extending the Web browser's functionality in a Web PACS, and this work provides a guideline for the sharing of medical images.

  8. 9 CFR 381.7 - Coverage of all poultry and poultry products processed in official establishments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Coverage of all poultry and poultry... AND POULTRY PRODUCTS INSPECTION AND VOLUNTARY INSPECTION AND CERTIFICATION POULTRY PRODUCTS INSPECTION REGULATIONS Administration; Application of Inspection and Other Requirements § 381.7 Coverage of all poultry...

  9. Low cost solar array project cell and module formation research area: Process research of non-CZ silicon material

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Liquid diffusion masks and liquid applied dopants to replace the CVD Silox masking and gaseous diffusion operations specified for forming junctions in the Westinghouse baseline process sequence for producing solar cells from dendritic web silicon were investigated. The baseline diffusion masking and drive processes were compared with those involving direct liquid applications to the dendritic web silicon strips. Attempts were made to control the number of variables by subjecting dendritic web strips cut from a single web crystal to both types of operations. Data generated reinforced earlier conclusions that efficiency levels at least as high as those achieved with the baseline back junction formation process can be achieved using liquid diffusion masks and liquid dopants. The deliveries of dendritic web sheet material and solar cells specified by the current contract were made as scheduled.

  10. Analysis and Development of a Web-Enabled Planning and Scheduling Database Application

    DTIC Science & Technology

    2013-09-01

    The study establishes an entity-relationship diagram for the desired process, constructs an operable database using MySQL, and provides a web-enabled interface for the population of the database.

  11. Process model-based atomic service discovery and composition of composite semantic web services using web ontology language for services (OWL-S)

    NASA Astrophysics Data System (ADS)

    Paulraj, D.; Swamynathan, S.; Madhaiyan, M.

    2012-11-01

    Web Service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest as a means of supporting business-to-business (B2B) or enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Web Ontology Language for Services (OWL-S). The profile of the service is a derived and concise description, but not a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to show that the process model is a better choice than the service profile for service discovery; second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method which uses process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works carried out in this area have proposed solutions only for the composition of atomic services; this article proposes a solution for the composition of composite semantic web services.

  12. Monitoring the web to support vaccine coverage: results of two years of the portal VaccinarSì.

    PubMed

    Ferro, Antonio; Odone, Anna; Siddu, Andrea; Colucci, Massimiliano; Anello, Paola; Longone, Michela; Marcon, Elena; Castiglia, Paolo; Bonanni, Paolo; Signorelli, Carlo

    2015-01-01

    The increasingly widespread use of the Internet by the population to collect information regarding health and medical treatments, and the circulation of many non-scientific documents on the effectiveness and safety of vaccines, led the Italian Society of Hygiene (SItI), in 2013, to promote a portal providing verified and easily understood scientific information to counteract the rampant misinformation on health treatments and combat the phenomenon of vaccine hesitancy. The project was launched in May 2013 and provides a portal with six main sections (vaccine-preventable diseases, registered vaccines, benefits and risks of vaccination, against misinformation, pros & cons, and travel immunizations) and other headings that relate to scientific events, comics and news coverage concerning vaccines. The contents are validated and evaluated by a scientific committee of high-profile scientists and experts in computer-mediated communication. In the first two years of activity, the portal published more than 250 web pages on all aspects related to vaccinations. The number of individual users was 860,411, with a constant increase over time. Of these, about 21.7% returned to the website at least once. The total visits in 24 months were 1,099,670, with a total page count of 2,530,416. Contacts were almost exclusively Italian (95.6%), with a higher proportion of males (54.1%) and younger age groups (25-34 years, 33.5%; 18-24 years, 27.5%). The data also show a significant position of the website in the major web search engines. The website has been certified by the Health On the Net Foundation. It is connected with the main social networks and has recently opened its first regional section (Veneto). The strong, progressive increase in web contacts, the involvement of several institutional bodies, and the appreciation of various stakeholders give an absolutely positive assessment of the first two years of the VaccinarSì project. The success of the website suggests future developments, with updates, sections devoted to regional problems, in-depth news analysis, and international expansion. The authors conclude that initiatives like this should be implemented and constitute an effective way to counteract vaccine hesitancy.

  13. OrthoVenn: a web server for genome wide comparison and annotation of orthologous clusters across multiple species.

    PubMed

    Wang, Yi; Coleman-Derr, Devin; Chen, Guoping; Gu, Yong Q

    2015-07-01

    Genome wide analysis of orthologous clusters is an important component of comparative genomics studies. Identifying the overlap among orthologous clusters can enable us to elucidate the function and evolution of proteins across multiple species. Here, we report a web platform named OrthoVenn that is useful for genome wide comparisons and visualization of orthologous clusters. OrthoVenn provides coverage of vertebrates, metazoa, protists, fungi, plants and bacteria for the comparison of orthologous clusters and also supports uploading of customized protein sequences from user-defined species. An interactive Venn diagram, summary counts, and functional summaries of the disjunction and intersection of clusters shared between species are displayed as part of the OrthoVenn result. OrthoVenn also includes in-depth views of the clusters using various sequence analysis tools. Furthermore, OrthoVenn identifies orthologous clusters of single copy genes and allows for a customized search of clusters of specific genes through key words or BLAST. OrthoVenn is an efficient and user-friendly web server freely accessible at http://probes.pw.usda.gov/OrthoVenn or http://aegilops.wheat.ucdavis.edu/OrthoVenn. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
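
    The set logic behind such a Venn comparison is plain intersection and difference over per-species sets of cluster identifiers, as in the following sketch with made-up cluster ids.

      from itertools import combinations

      clusters = {
          "wheat": {"c1", "c2", "c3", "c5"},
          "rice":  {"c1", "c2", "c4"},
          "maize": {"c1", "c3", "c4", "c6"},
      }

      # Clusters shared by all species (the core of the Venn diagram).
      print("shared by all:", set.intersection(*clusters.values()))  # {'c1'}

      # Pairwise overlaps.
      for a, b in combinations(clusters, 2):
          print(f"{a} & {b}:", clusters[a] & clusters[b])

      # Species-specific clusters (present in exactly one species).
      for sp, ids in clusters.items():
          others = set().union(*(v for k, v in clusters.items() if k != sp))
          print(f"{sp} only:", ids - others)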

  14. PSORTb 3.0: improved protein subcellular localization prediction with refined localization subcategories and predictive capabilities for all prokaryotes.

    PubMed

    Yu, Nancy Y; Wagner, James R; Laird, Matthew R; Melli, Gabor; Rey, Sébastien; Lo, Raymond; Dao, Phuong; Sahinalp, S Cenk; Ester, Martin; Foster, Leonard J; Brinkman, Fiona S L

    2010-07-01

    PSORTb has remained the most precise bacterial protein subcellular localization (SCL) predictor since it was first made available in 2003. However, the recall needs to be improved and no accurate SCL predictors yet make predictions for archaea, nor differentiate important localization subcategories, such as proteins targeted to a host cell or bacterial hyperstructures/organelles. Such improvements should preferably be encompassed in a freely available web-based predictor that can also be used as a standalone program. We developed PSORTb version 3.0 with improved recall, higher proteome-scale prediction coverage, and new refined localization subcategories. It is the first SCL predictor specifically geared for all prokaryotes, including archaea and bacteria with atypical membrane/cell wall topologies. It features an improved standalone program, with a new batch results delivery system complementing its web interface. We evaluated the most accurate SCL predictors using 5-fold cross validation plus we performed an independent proteomics analysis, showing that PSORTb 3.0 is the most accurate but can benefit from being complemented by Proteome Analyst predictions. http://www.psort.org/psortb (download open source software or use the web interface). psort-mail@sfu.ca Supplementary data are available at Bioinformatics online.
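
    The per-localization evaluation behind such comparisons reduces to class-wise precision and recall over (true, predicted) label pairs, as in this small sketch with illustrative labels.

      from collections import Counter

      def per_class_precision_recall(pairs):
          tp, fp, fn = Counter(), Counter(), Counter()
          for true, pred in pairs:
              if true == pred:
                  tp[true] += 1
              else:
                  fp[pred] += 1
                  fn[true] += 1
          classes = set(tp) | set(fp) | set(fn)
          return {c: (tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0,
                      tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0)
                  for c in classes}

      pairs = [("cytoplasmic", "cytoplasmic"), ("periplasmic", "cytoplasmic"),
               ("outer membrane", "outer membrane"), ("periplasmic", "periplasmic")]
      for cls, (p, r) in sorted(per_class_precision_recall(pairs).items()):
          print(f"{cls:>15}: precision={p:.2f} recall={r:.2f}")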

  15. Surfing the web during pandemic flu: availability of World Health Organization recommendations on prevention

    PubMed Central

    2010-01-01

    Background People often search for information on influenza A(H1N1)v prevention on the web. The extent to which information found on the Internet is consistent with recommendations issued by the World Health Organization is unknown. Methods We conducted a search for "swine flu" accessing 3 of the most popular search engines through different proxy servers located in 4 English-speaking countries (Australia, Canada, UK, USA). We explored each site resulting from the searches, up to 4 clicks starting from the search engine page, analyzing the availability of World Health Organization recommendations for swine flu prevention. Results Information on hand cleaning was reported on 79% of the 147 websites analyzed; staying home when sick was reported on 77.5% of the websites; disposing of tissues after sneezing on 75.5% of the websites. Availability of other recommendations was lower. The probability of finding preventative recommendations consistent with the World Health Organization varied by country, type of website, and search engine. Conclusions Despite media coverage of H1N1 influenza, relevant information for prevention is not easily found on the web. Strategies for delivering information to the general public through this channel should be improved. PMID:20854690

  16. Surfing the web during pandemic flu: availability of World Health Organization recommendations on prevention.

    PubMed

    Gesualdo, Francesco; Romano, Mariateresa; Pandolfi, Elisabetta; Rizzo, Caterina; Ravà, Lucilla; Lucente, Daniela; Tozzi, Alberto E

    2010-09-20

    People often search for information on influenza A(H1N1)v prevention on the web. The extent to which information found on the Internet is consistent with recommendations issued by the World Health Organization is unknown. We conducted a search for "swine flu" accessing 3 of the most popular search engines through different proxy servers located in 4 English-speaking countries (Australia, Canada, UK, USA). We explored each site resulting from the searches, up to 4 clicks starting from the search engine page, analyzing the availability of World Health Organization recommendations for swine flu prevention. Information on hand cleaning was reported on 79% of the 147 websites analyzed; staying home when sick was reported on 77.5% of the websites; disposing of tissues after sneezing on 75.5% of the websites. Availability of other recommendations was lower. The probability of finding preventative recommendations consistent with the World Health Organization varied by country, type of website, and search engine. Despite media coverage of H1N1 influenza, relevant information for prevention is not easily found on the web. Strategies for delivering information to the general public through this channel should be improved.

  17. Effective strategies for scaling up evidence-based practices in primary care: a systematic review.

    PubMed

    Ben Charif, Ali; Zomahoun, Hervé Tchala Vignon; LeBlanc, Annie; Langlois, Léa; Wolfenden, Luke; Yoong, Sze Lin; Williams, Christopher M; Lépine, Roxanne; Légaré, France

    2017-11-22

    While an extensive array of existing evidence-based practices (EBPs) have the potential to improve patient outcomes, little is known about how to implement EBPs on a larger scale. Therefore, we sought to identify effective strategies for scaling up EBPs in primary care. We conducted a systematic review with the following inclusion criteria: (i) study design: randomized and non-randomized controlled trials, before-and-after (with/without control), and interrupted time series; (ii) participants: primary care-related units (e.g., clinical sites, patients); (iii) intervention: any strategy used to scale up an EBP; (iv) comparator: no restrictions; and (v) outcomes: no restrictions. We searched MEDLINE, Embase, PsycINFO, Web of Science, CINAHL, and the Cochrane Library from database inception to August 2016 and consulted clinical trial registries and gray literature. Two reviewers independently selected eligible studies, then extracted and analyzed data following the Cochrane methodology. We extracted components of scaling-up strategies and classified them into five categories: infrastructure, policy/regulation, financial, human resources-related, and patient involvement. We extracted scaling-up process outcomes, such as coverage, and provider/patient outcomes. We validated data extraction with study authors. We included 14 studies. They were published since 2003 and primarily conducted in low-/middle-income countries (n = 11). Most were funded by governmental organizations (n = 8). The clinical area most represented was infectious diseases (HIV, tuberculosis, and malaria, n = 8), followed by newborn/child care (n = 4), depression (n = 1), and preventing seniors' falls (n = 1). Study designs were mostly before-and-after (without control, n = 8). The most frequently targeted unit of scaling up was the clinical site (n = 11). The component of a scaling-up strategy most frequently mentioned was human resources-related (n = 12). All studies reported patient/provider outcomes. Three studies reported scaling-up coverage, but no study quantitatively reported achieving a coverage of 80% in combination with a favorable impact. We found few studies assessing strategies for scaling up EBPs in primary care settings. It is uncertain whether any strategies were effective, as most studies focused more on patient/provider outcomes and less on scaling-up process outcomes. Minimal consensus on the metrics of scaling up is needed for assessing the scaling up of EBPs in primary care. This review is registered as PROSPERO CRD42016041461.

  18. Randomized evaluation of a web based interview process for urology resident selection.

    PubMed

    Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y

    2012-04-01

    We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of facilities and the opportunity to ask current residents any questions. To maintain fairness the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty) 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to traditional on-site interview. However, applicants perceived the web based interview process as less effective than traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  19. Interactions of the space debris environment with mega constellations-Using the example of the OneWeb constellation

    NASA Astrophysics Data System (ADS)

    Radtke, Jonas; Kebschull, Christopher; Stoll, Enrico

    2017-02-01

    Recently, several announcements have been made of plans to deploy satellite constellations, containing several hundred to thousands of rather small satellites, into Low Earth Orbit (LEO). The purpose of these constellations is to provide worldwide internet coverage, even to the remotest areas. Examples of these mega-constellations are one from SpaceX, announced to comprise about 4000 satellites, the Norwegian STEAM network, said to contain 4257 satellites, and the OneWeb constellation, which forms one of the smaller constellations with 720 satellites. OneWeb was chosen as the example constellation: of all announced constellations, OneWeb has by far delivered the most information, both regarding constellation design and its plans to counter space debris issues. In this paper, an overview of the planned OneWeb constellation setup is given first. From this description, a mission life-cycle is deduced, splitting the complete orbital lifetime of the satellites into four phases. Then, using ESA-MASTER, the flux on both individual constellation satellites and the complete constellation is computed for each of the mission phases, and the collision probabilities are derived. The focus of this analysis is on catastrophic collisions. The analysis is then varied parametrically for different operational altitudes of the constellation as well as different lifetimes, with different assumptions about the success of post-mission disposal (PMD). Subsequently, the expected mean number of collision avoidance manoeuvres during all active mission phases is estimated using ARES from ESA's DRAMA tool suite, with the same variations as in the flux analysis. Lastly, the characteristics of hypothetical OneWeb satellite fragmentation clouds, calculated using the NASA breakup model, are described, and the impact of collision clouds from OneWeb satellites on the constellation itself is analysed.
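
    The headline quantity of such flux analyses can be sketched with a Poisson impact model: for debris flux F (impacts per m2 per year above the catastrophic size threshold), collision cross-section A, mission duration T and N satellites, the probability of at least one collision is 1 - exp(-F*A*T*N). The numbers below are illustrative, not MASTER or OneWeb values.

      import math

      def collision_probability(flux, area, years, n_sats=1):
          """Expected impact count and P(at least one impact), Poisson model."""
          expected = flux * area * years * n_sats
          return expected, 1.0 - math.exp(-expected)

      F = 1e-6  # impacts / m^2 / yr (illustrative flux)
      A = 4.0   # m^2 collision cross-section (illustrative)
      lam, p = collision_probability(F, A, years=5)
      lam_c, p_c = collision_probability(F, A, years=5, n_sats=720)
      print(f"single satellite:      E[impacts]={lam:.2e}, P(>=1)={p:.2e}")
      print(f"720-sat constellation: E[impacts]={lam_c:.4f}, P(>=1)={p_c:.4f}")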

  20. Perceived stress and life satisfaction: social network service use as a moderator.

    PubMed

    Niu, Qikun; Liu, Yihao; Sheng, Zitong; He, Yue; Shao, Xiaolin

    2011-01-01

    Social Network Service (SNS) has become a buzzword in recent media coverage with the development of the second generation of Web-based communities. In China, SNS plays an increasingly important role in its users' daily lives, especially among students. With a sample of 471 college students, we tested the direct relationship between perceived stress and life satisfaction using a regression analysis. Moreover, we found that SNS use could buffer the negative effect of perceived stress on life satisfaction. This study has practical implications for Internet users' SNS use.
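
    The buffering claim corresponds to a moderation model: life satisfaction regressed on stress, SNS use and their interaction, with a positive interaction coefficient indicating the buffer. The sketch below uses simulated data and assumes the statsmodels formula API; it is not the study's dataset or exact analysis.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n = 471  # same sample size as the study, but the data are simulated
      stress = rng.normal(0, 1, n)
      sns_use = rng.normal(0, 1, n)
      # Simulated buffering: stress hurts less when SNS use is high.
      life_sat = (-0.5 * stress + 0.1 * sns_use
                  + 0.2 * stress * sns_use + rng.normal(0, 1, n))

      df = pd.DataFrame({"life_sat": life_sat, "stress": stress, "sns": sns_use})
      model = smf.ols("life_sat ~ stress * sns", data=df).fit()
      print(model.params[["stress", "sns", "stress:sns"]])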

  1. A Practical Tutorial on Modified Condition/Decision Coverage

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.

    2001-01-01

    This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
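
    As a worked illustration of the MC/DC criterion itself (not the tutorial's 5-step procedure), the sketch below enumerates, for the decision A and (B or C), the pairs of test cases in which exactly one condition changes and the decision outcome changes with it.

      from itertools import product

      def decision(a, b, c):
          return a and (b or c)

      def bits(t):
          return "".join(str(int(x)) for x in t)  # (True, False, True) -> "101"

      tests = list(product([False, True], repeat=3))
      for i, name in enumerate("ABC"):
          pairs = []
          for t in tests:
              u = list(t)
              u[i] = not u[i]
              u = tuple(u)
              if t < u and decision(*t) != decision(*u):
                  pairs.append(f"{bits(t)}/{bits(u)}")
          print(f"condition {name}: independence pairs {pairs}")

      # A minimal MC/DC test set takes one pair per condition, e.g.
      # {010, 110, 100, 101}: 4 tests instead of all 8 combinations.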

  2. Constellation Coverage Analysis

    NASA Technical Reports Server (NTRS)

    Lo, Martin W. (Compiler)

    1997-01-01

    The design of satellite constellations requires an understanding of the dynamic global coverage provided by the constellations. Even for a small constellation with a simple circular orbit propagator, the combinatorial nature of the analysis frequently renders the problem intractable. Particularly for the initial design phase where the orbital parameters are still fluid and undetermined, the coverage information is crucial to evaluate the performance of the constellation design. We have developed a fast and simple algorithm for determining the global constellation coverage dynamically using image processing techniques. This approach provides a fast, powerful and simple method for the analysis of global constellation coverage.
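
    The image-processing idea can be sketched as follows: rasterize each satellite footprint onto a latitude/longitude grid, take the union of the masks, and weight cells by the cosine of latitude when reporting the covered fraction. The sub-satellite points and footprint radius below are illustrative, and a simple spherical-cap footprint is assumed.

      import numpy as np

      NLAT, NLON = 180, 360
      lats = np.radians(np.linspace(-89.5, 89.5, NLAT))
      lons = np.radians(np.linspace(-179.5, 179.5, NLON))
      LON, LAT = np.meshgrid(lons, lats)

      def footprint_mask(sub_lat, sub_lon, cap_deg):
          """Grid cells within a spherical cap of angular radius cap_deg."""
          sl, so = np.radians(sub_lat), np.radians(sub_lon)
          cos_angle = (np.sin(sl) * np.sin(LAT)
                       + np.cos(sl) * np.cos(LAT) * np.cos(LON - so))
          return np.arccos(np.clip(cos_angle, -1.0, 1.0)) <= np.radians(cap_deg)

      # Union of footprints for an illustrative 6-satellite snapshot.
      covered = np.zeros((NLAT, NLON), dtype=bool)
      for sub_lat, sub_lon in [(0, -150), (0, -90), (0, -30),
                               (0, 30), (0, 90), (0, 150)]:
          covered |= footprint_mask(sub_lat, sub_lon, cap_deg=25)

      # Area-weighted instantaneous global coverage.
      print(f"coverage: {np.average(covered, weights=np.cos(LAT)):.1%}")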

  3. Interactivity, Information Processing, and Learning on the World Wide Web.

    ERIC Educational Resources Information Center

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  4. Practice Patterns of Radiation Field Design for Sentinel Lymph Node-Positive Early-Stage Breast Cancer.

    PubMed

    Azghadi, Soheila; Daly, Megan; Mayadev, Jyoti

    2016-10-01

    Recent randomized trials have led to decreased use of completion axillary lymph node dissection (ALND) in early-stage breast cancer patients with a positive sentinel lymph node (SLN), causing controversy surrounding radiotherapy coverage of the axilla. We investigated the practice variation among radiation oncologists for regional nodal coverage for clinicopathologic scenarios and evaluated axillary field design decision-making processes. A customized, web-based questionnaire was e-mailed to 983 community (n = 617) and academic (n = 366) radiation oncologists with a breast cancer subspecialty practicing in the United States. The survey consisted of 18 multiple-choice questions evaluating general clinical preferences surrounding radiation therapy (RT) field design for patients with early-stage breast cancer and a positive SLN. Seven case scenarios were developed to investigate the field design in the setting of specific clinical and pathologic risk factors. Nodal coverage was classified as standard tangents (STs), high tangents (HTs), STs and a supraclavicular field (SCF), or STs and full axillary coverage (AX). A total of 145 evaluable responses were collected, with a response rate of 15.0%. Of the respondents, 12 (8.3%) reported using completion ALND for patients with 1 to 3 positive SLNs without extracapsular extension (ECE) and 66 (45.5%) performed ALND with 1 to 3 positive SLNs with ECE. For micrometastatic SLNs, with no lymphovascular system invasion, 115 (87.1%) used STs or HTs. The use of neoadjuvant chemotherapy (NAC) influenced RT field design for patients with a positive SLN without ECE, with 64 (48.5%) using STs and SCF or STs and AX treatment without NAC and 94 (70.7%) using SCF and AX after NAC. With macrometastatic SLN involvement, most respondents preferred SCF (45.27%) and AX (45.66%). In contrast, for micrometastatic involvement, HTs (43.61%) were frequently chosen. Forty (27.8%) reported using online predictive nomograms to predict further axillary involvement, with no difference between the academic and community radiation oncologists (P = .11). In SLN biopsy-positive early-stage breast cancer with omission of completion ALND, axillary RT is increasingly used to cover the undissected axilla. Most respondents use SCF or AX for patients with low to intermediate pathologic features. Online prediction nomograms are used by a few practitioners to assist in clinical decision-making in this setting. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Ab Initio Surface Phase Diagrams for Coadsorption of Aromatics and Hydrogen on the Pt(111) Surface

    DOE PAGES

    Ferguson, Glen Allen; Vorotnikov, Vassili; Wunder, Nicholas; ...

    2016-11-02

    Supported metal catalysts are commonly used for the hydrogenation and deoxygenation of biomass-derived aromatic compounds in catalytic fast pyrolysis. To date, the substrate-adsorbate interactions under reaction conditions crucial to these processes remain poorly understood, yet understanding this is critical to constructing detailed mechanistic models of the reactions important to catalytic fast pyrolysis. Density functional theory (DFT) has been used in identifying mechanistic details, but many of these works assume surface models that are not representative of realistic conditions, for example, under which the surface is covered with some concentration of hydrogen and aromatic compounds. In this study, we investigate hydrogen-guaiacol coadsorption on Pt(111) using van der Waals-corrected DFT and ab initio thermodynamics over a range of temperatures and pressures relevant to bio-oil upgrading. We find that relative coverage of hydrogen and guaiacol is strongly dependent on the temperature and pressure of the system. Under conditions relevant to ex situ catalytic fast pyrolysis (CFP; 620-730 K, 1-10 bar), guaiacol and hydrogen chemisorb to the surface with a submonolayer hydrogen (~0.44 ML H), while under conditions relevant to hydrotreating (470-580 K, 10-200 bar), the surface exhibits a full-monolayer hydrogen coverage with guaiacol physisorbed to the surface. These results correlate with experimentally observed selectivities, which show ring saturation to methoxycyclohexanol at hydrotreating conditions and deoxygenation to phenol at CFP-relevant conditions. Additionally, the vibrational energy of the adsorbates on the surface significantly contributes to surface energy at higher coverage. Ignoring this contribution results in not only quantitatively, but also qualitatively incorrect interpretation of coadsorption, shifting the phase boundaries by more than 200 K and ~10-20 bar and predicting no guaiacol adsorption under CFP and hydrotreating conditions. We discuss the implications of this work in the context of modeling hydrogenation and deoxygenation reactions on Pt(111), and we find that only the models representative of equilibrium surface coverage can capture the hydrogenation kinetics correctly. Lastly, as a major outcome of this work, we introduce a freely available web-based tool, dubbed the Surface Phase Explorer (SPE), which allows researchers to conveniently determine surface composition for any one- or two-component system at thermodynamic equilibrium over a wide range of temperatures and pressures on any crystalline surface using standard DFT output.
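
    The ab initio thermodynamics step reduces, for each candidate phase, to a surface free energy gamma(T, p) = (E_phase - E_clean - n_H*mu_H(T, p) - n_G*mu_G(T, p)) / A, the stable phase at given chemical potentials being the one that minimizes gamma; scanning mu over its (T, p) dependence traces out the phase diagram. The sketch below uses placeholder energies, not the paper's DFT values.

      A_SURF = 1.0  # surface area, arbitrary units

      # (name, E_phase - E_clean [eV], n_H, n_guaiacol); illustrative numbers
      phases = [
          ("clean surface",            0.00, 0, 0),
          ("0.44 ML H + chemisorbed", -4.10, 4, 1),
          ("1 ML H + physisorbed",    -5.60, 9, 1),
      ]

      def stable_phase(mu_h, mu_g):
          """Name of the phase minimizing gamma at the given potentials (eV)."""
          def gamma(phase):
              _, de, n_h, n_g = phase
              return (de - n_h * mu_h - n_g * mu_g) / A_SURF
          return min(phases, key=gamma)[0]

      # mu_H grows with hydrogen pressure and falls with temperature, so a scan
      # over mu_H walks from CFP-like toward hydrotreating-like conditions.
      for mu_h in (-1.30, -0.40, -0.20):
          print(f"mu_H = {mu_h:+.2f} eV: {stable_phase(mu_h, mu_g=-1.0)}")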

  6. Client-Side Event Processing for Personalized Web Advertisement

    NASA Astrophysics Data System (ADS)

    Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad

    The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the user currently visiting a particular piece of Web content. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking into account the user's current preferences. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user's interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by processing complex events on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.

  7. Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.

  8. An Educational Tool for Browsing the Semantic Web

    ERIC Educational Resources Information Center

    Yoo, Sujin; Kim, Younghwan; Park, Seongbin

    2013-01-01

    The Semantic Web is an extension of the current Web where information is represented in a machine processable way. It is not separate from the current Web and one of the confusions that novice users might have is where the Semantic Web is. In fact, users can easily encounter RDF documents that are components of the Semantic Web while they navigate…

  9. Learning Time-Varying Coverage Functions

    PubMed Central

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2015-01-01

    Coverage functions are an important class of discrete functions that capture the law of diminishing returns arising naturally from applications in social network analysis, machine learning, and algorithmic game theory. In this paper, we propose a new problem of learning time-varying coverage functions, and develop a novel parametrization of these functions using random features. Based on the connection between time-varying coverage functions and counting processes, we also propose an efficient parameter learning algorithm based on likelihood maximization, and provide a sample complexity analysis. We applied our algorithm to the influence function estimation problem in information diffusion in social networks, and show that with few assumptions about the diffusion processes, our algorithm is able to estimate influence significantly more accurately than existing approaches on both synthetic and real world data. PMID:25960624
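
    For readers unfamiliar with coverage functions, the toy example below shows the diminishing-returns property the abstract refers to; the sets and weights are invented, and the paper's random-feature parametrization and time dependence are omitted.

      # A coverage function f(S) = total weight of items covered by at least
      # one chosen set -- submodular, hence diminishing returns.
      items_covered_by = {
          "a": {1, 2, 3},
          "b": {3, 4},
          "c": {4, 5, 6},
      }
      weight = {1: 0.5, 2: 0.2, 3: 0.9, 4: 0.4, 5: 0.1, 6: 0.3}

      def coverage(S):
          covered = set().union(*(items_covered_by[s] for s in S)) if S else set()
          return sum(weight[i] for i in covered)

      # The marginal gain of adding "b" shrinks as the chosen set grows:
      print(coverage({"b"}) - coverage(set()))       # 1.3
      print(coverage({"a", "b"}) - coverage({"a"}))  # 0.4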

  10. Learning Time-Varying Coverage Functions.

    PubMed

    Du, Nan; Liang, Yingyu; Balcan, Maria-Florina; Song, Le

    2014-12-08

    Coverage functions are an important class of discrete functions that capture the law of diminishing returns arising naturally from applications in social network analysis, machine learning, and algorithmic game theory. In this paper, we propose a new problem of learning time-varying coverage functions, and develop a novel parametrization of these functions using random features. Based on the connection between time-varying coverage functions and counting processes, we also propose an efficient parameter learning algorithm based on likelihood maximization, and provide a sample complexity analysis. We applied our algorithm to the influence function estimation problem in information diffusion in social networks, and show that with few assumptions about the diffusion processes, our algorithm is able to estimate influence significantly more accurately than existing approaches on both synthetic and real world data.

  11. Tuberculosis-Diagnostic Expert System: an architecture for translating patients information from the web for use in tuberculosis diagnosis.

    PubMed

    Osamor, Victor C; Azeta, Ambrose A; Ajulo, Oluseyi O

    2014-12-01

    Over 1.5-2 million tuberculosis deaths occur annually. Medical professionals are faced with many challenges in delivering good health care without assisted automation in hospitals where several patients need the doctor's attention. Our objective was to automate the pre-laboratory screening process for tuberculosis infection to aid diagnosis and make it fast and accessible to the public via the Internet. The expert system we have built is also designed to serve people who do not have access to medical experts but want to check their medical status. A rule-based approach has been used, and unified modeling language and the client-server architecture technique were applied to model the system and to develop it as a web-based expert system for tuberculosis diagnosis. Algorithmic rules in the Tuberculosis-Diagnosis Expert System necessitate decision coverage where tuberculosis is either suspected or not suspected. The architecture consists of a rule base, knowledge base, and patient database. These units interact with the inference engine, which receives patients' data through the Internet via a user interface. We present the architecture of the Tuberculosis-Diagnosis Expert System and its implementation. We evaluated it for usability to determine the level of effectiveness, efficiency and user satisfaction. The result of the usability evaluation reveals that the system has a usability score of 4.08 on a scale of 5, indicating better-than-average system performance. Several existing expert systems have been developed to support different medical diagnoses, but none is designed to translate tuberculosis patients' symptomatic data for online pre-laboratory screening. Our Tuberculosis-Diagnosis Expert System is an effective solution for the implementation of the needed web-based expert system diagnosis. © The Author(s) 2013.
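
    A minimal sketch of a rule-based screening step of the sort the abstract describes is shown below; the symptom names, rule weights and threshold are invented for illustration and are not the authors' clinical rule base.

      # Toy rule-based screening: accumulate weights of matched rules and map
      # the score onto the "suspected / not suspected" decision coverage.
      RULES = [
          ({"cough_over_2_weeks", "night_sweats"}, 2),
          ({"weight_loss", "fever"}, 1),
          ({"coughing_blood"}, 3),
      ]

      def screen(symptoms):
          """Return the screening decision for a set of reported symptoms."""
          score = sum(w for (condition, w) in RULES if condition <= symptoms)
          return "tuberculosis suspected" if score >= 3 else "not suspected"

      print(screen({"cough_over_2_weeks", "night_sweats", "fever", "weight_loss"}))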

  12. Plans for future on-line access to the historical astronomical literature through the Astrophysics Data System.

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Kurtz, M. J.; Coletti, D.

    1997-09-01

    The NASA Astrophysics Data System provides access to about 1 million abstracts and 50,000 journal articles. This service is funded by NASA and is accessible world-wide through the World Wide Web, free and without restrictions, at: http://adswww.harvard.edu We currently have on-line journals starting with 1975. We plan to extend the coverage of the journals and also include scans from observatory publications in our database. Eventually we plan to provide access to scans of the complete journal literature and as much observatory literature as possible. In order to accomplish this, we have started discussions with the preservation group at the Harvard University Library. Harvard University Library, together with the Library at the Center for Astrophysics, is in the process of microfilming their collection of observatory publications. We are working together with this project to prepare for scanning the microfilms and making these scans available through the ADS. We are also collecting older journals and preparing them for scanning. We already have the Monthly Notices of the Royal Astronomical Society in hand from Volume 1, and have been promised a large part of the Astronomische Nachrichten prior to 1945. We will start scanning these volumes soon. All volumes that can be fed automatically through the scanning machine should be scanned and put on-line within the next 6-12 months. In order to scan volumes that are too brittle, we need additional funding. We hope to obtain additional funding to cover such scanning for 1998. In order to cover more of the astronomical literature, we need donations of astronomical literature. We have a web page that lists the volumes that we need so we can scan them. If you have any of these journals (or other astronomical literature), please contact us. The web page is at: http://adshome.harvard.edu/pubs/missing_journals.html We would appreciate any contributions, even smaller sets, since it will become more and more difficult to find complete sets.

  13. Influenza Vaccination Coverage Among School Employees: Assessing Knowledge, Attitudes, and Behaviors

    PubMed Central

    de Perio, Marie A.; Wiegand, Douglas M.; Brueck, Scott E.

    2015-01-01

    BACKGROUND Influenza can spread among students, teachers, and staff in school settings. Vaccination is the most effective method to prevent influenza. We determined 2012–2013 influenza vaccination coverage among school employees, assessed knowledge and attitudes regarding the vaccine, and determined factors associated with vaccine receipt. METHODS We surveyed 412 (49%) of 841 employees at 1 suburban Ohio school district in March 2013. The Web-based survey assessed personal and work characteristics, vaccine receipt, and knowledge and attitudes regarding the vaccine. RESULTS Overall, 238 (58%) respondents reported getting the 2012–2013 influenza vaccine. The most common reason for getting the vaccine was to protect oneself or one’s family (87%). Beliefs that the vaccine was not needed (32%) or that it was not effective (21%) were the most common reasons for not getting it. Factors independently associated with vaccine receipt were having positive attitudes toward the vaccine, feeling external pressure to get it, and feeling personal control over whether to get it. CONCLUSIONS Influenza vaccine coverage among school employees should be improved. Messages encouraging school employees to get the vaccine should address misconceptions about the vaccine. Employers should use methods to maximize employee vaccination as part of a comprehensive influenza prevention program. PMID:25117893

  14. Coverage, universal access and equity in health: a characterization of scientific production in nursing

    PubMed Central

    Mendoza-Parra, Sara

    2016-01-01

    Objectives: to characterize the scientific contribution nursing has made regarding coverage, universal access and equity in health, and to understand this production in terms of subjects and objects of study. Material and methods: this was cross-sectional, documentary research; the units of analysis were 97 journals and 410 documents, retrieved from the Web of Science in the category "nursing". Descriptors associated with coverage, access and equity in health, and the MeSH thesaurus, were applied. We used bibliometric laws and indicators, and analyzed the most important articles according to number of citations and collaboration. Results: the retrieved documents allowed for the observation of 25 years of production, with institutional and international collaboration rates of 31% and 7%, respectively. The mean number of coauthors per article was 3.5, with a transience rate of 93%. The visibility index was 67.7%, and 24.6% of production was concentrated in four core journals. Worth highlighting are a review in the nursing category with 286 citations and a Brazilian author who was the most productive. Conclusions: the nursing collective should strengthen future research on the subject, defining lines and sub-lines of research, increasing internationalization and building it with the joint participation of academia and the nursing community. PMID:26959329

  15. A vorticity transport model to restore spatial gaps in velocity data

    NASA Astrophysics Data System (ADS)

    Ameli, Siavash; Shadden, Shawn

    2017-11-01

    Often measurements of velocity data do not have full spatial coverage in the probed domain or near boundaries. These gaps can be due to missing measurements or masked regions of corrupted data. Such gaps confound interpretation and are problematic when the data are used to compute Lagrangian or trajectory-based analyses. Various techniques have been proposed to overcome coverage limitations in velocity data, such as unweighted least-squares fitting, empirical orthogonal function analysis, variational interpolation, and boundary modal analysis. In this talk, we present a vorticity transport PDE to reconstruct regions of missing velocity vectors. The transport model involves both nonlinear anisotropic diffusion and advection. This approach is shown to preserve the main features of the flow even in cases of large gaps, and the reconstructed regions are continuous up to second order. We illustrate results for high-frequency radar (HFR) measurements of ocean surface currents, as this is a common application with limited coverage. We demonstrate that the error of the method is of the same order as the error of the original velocity data. In addition, we have developed a web-based gateway for data restoration, and we will demonstrate a practical application using available data. This work is supported by NSF Grant No. 1520825.
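
    The sketch below shows only the simplest core of such a reconstruction: iterating a discrete Laplacian over the masked region until it relaxes. The paper's model additionally includes advection and nonlinear anisotropic diffusion, so this is an illustrative simplification, with an invented grid and gap.

      # Isotropic-diffusion gap filling: update only the masked cells with a
      # discrete Laplacian until the gap region relaxes toward its surroundings.
      import numpy as np

      def fill_gaps(field, mask, iters=500):
          """field: 2D array; mask: True where data are missing."""
          f = field.copy()
          f[mask] = np.mean(field[~mask])  # crude initial guess for the gap
          for _ in range(iters):
              lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                     np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
              f[mask] += 0.2 * lap[mask]   # diffuse inside the gap only
          return f

      vort = np.random.rand(64, 64)        # stand-in vorticity field
      gap = np.zeros_like(vort, dtype=bool)
      gap[20:30, 20:30] = True             # a square region of missing data
      print(fill_gaps(vort, gap)[25, 25])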

  16. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE PAGES

    Theiler, James Patrick; Korber, Bette Tina Marie

    2017-01-29

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
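
    As a toy illustration of scoring potential-epitope coverage (the directed-graph optimization itself is beyond a short sketch), the code below counts how often a candidate antigen's k-mers occur in a population; the sequences and the epitope length are invented.

      # Score a candidate antigen by the population frequency of the potential
      # epitopes (fixed-length k-mers) it contains.
      from collections import Counter

      K = 4  # toy epitope length; T-cell epitopes are typically ~9-mers

      def kmers(seq):
          return {seq[i:i + K] for i in range(len(seq) - K + 1)}

      population = ["MKTAYIAK", "MKTAYVAK", "MRTAYIAK"]  # invented sequences
      freq = Counter(km for s in population for km in kmers(s))

      def epitope_coverage(candidate):
          """Sum of population counts of the candidate's potential epitopes."""
          return sum(freq[km] for km in kmers(candidate))

      print(epitope_coverage("MKTAYIAK"))  # covers the most common variants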

  17. Graph-based optimization of epitope coverage for vaccine antigen design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, James Patrick; Korber, Bette Tina Marie

    Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, include a comparison of different heuristics that can be used when graphs are not acyclic, and we describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we also show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.

  18. High Spatial Resolution Europa Coverage by the Galileo Near Infrared Mapping Spectrometer (NIMS)

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The NIMS instrument on the Galileo spacecraft, which is being used to map the mineral and ice properties over the surfaces of the Jovian moons, produces global spectral images at modest spatial resolution and high resolution spectral images for small selected regions on the satellites. This map illustrates the high resolution coverage of Europa obtained by NIMS through the April 1997 G7 orbit.

    The areas covered are displayed on a Voyager-derived map. A good sampling of the dark trailing-side material (180 to 360 degrees) has been obtained, with less coverage of Europa's leading side.

    The false-color composites use red, green and blue to represent the infrared brightnesses at 0.7, 1.51 and 1.82 microns respectively. Considerable variations are evident and are related to the composition and sizes of the surface grains.

    The Jet Propulsion Laboratory, Pasadena, CA manages the mission for NASA's Office of Space Science, Washington, DC.

    This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov.

  19. [The benefits prevail – why electronic immunization records are advantageous to the general practitioner and his patients].

    PubMed

    Burkhardt, Tobias

    2016-01-01

    Immunization coverage throughout the Swiss population is still not optimal, and therefore preventable diseases such as measles have not yet been eliminated in Switzerland. In addition, new vaccination protocols are available and official recommendations are becoming increasingly complex. The website www.myvaccines.ch has been in use since 2011 with the primary goal of increasing immunization coverage. This service was established by the vaccinologist Professor Claire-Anne Siegrist from the University of Geneva and is free of charge for all Swiss doctors and pharmacists. It enables general practitioners and pediatricians to document the vaccination history of their patients in a new electronic immunization record. After a simple and quick process, the web-based software proposes up-to-date recommendations for new or follow-up vaccinations following the current Swiss Immunization Plan of the Federal Department of Health. Within this single practice, 1446 files have been recorded within the past three years. As a consequence, a total of 4378 immunizations have been administered, a mean of 3.03 immunizations per patient. After the introduction of the electronic immunization record, immunization rates increased dramatically for all antigens (factor 2.1 to 41.5). Overall, patient acceptance was high – the doctor's investment was positively recognized and his approach to patient care was perceived as modern. As a result, the practice has become competent in immunization. In summary, this positive outcome strongly supports recommending the free program www.myvaccines.ch to all general practitioners and pediatricians in Switzerland.

  20. Memetic Algorithm-Based Multi-Objective Coverage Optimization for Wireless Sensor Networks

    PubMed Central

    Chen, Zhi; Li, Shuai; Yue, Wenjing

    2014-01-01

    Maintaining effective coverage and extending the network lifetime as much as possible has become one of the most critical issues in the coverage of WSNs. In this paper, we propose a multi-objective coverage optimization algorithm for WSNs, namely MOCADMA, which models the coverage control of WSNs as a multi-objective optimization problem. MOCADMA uses a memetic algorithm with a dynamic local search strategy to optimize the coverage of WSNs and achieve objectives such as high network coverage, effective node utilization and more residual energy. In MOCADMA, the alternative solutions are represented as chromosomes in matrix form, and the optimal solutions are selected through numerous iterations of the evolution process, including selection, crossover, mutation, local enhancement, and fitness evaluation. The experimental and evaluation results show that MOCADMA can maintain sensing coverage well, achieve higher network coverage while improving energy efficiency, effectively prolong the network lifetime, and deliver a significant improvement over some existing algorithms. PMID:25360579
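
    A bare-bones memetic loop of the kind the abstract describes, combining a genetic population with a local-search improvement step, might look like the sketch below; the grid, sensing radius, energy penalty and single-objective fitness are invented stand-ins, and crossover is collapsed into a mutate-and-improve move for brevity.

      # Skeleton of a memetic loop: evolutionary selection plus local search.
      # The encoding and toy fitness are stand-ins, not MOCADMA itself.
      import random

      GRID, SENSORS, POP, GENS = 8, 6, 20, 50

      def fitness(chrom):
          # toy objective: covered cells minus an energy penalty per sensor
          covered = {(r + dr, c + dc) for (r, c) in chrom
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)}
          return len(covered) - 2 * len(chrom)

      def random_chrom():
          cells = [(r, c) for r in range(GRID) for c in range(GRID)]
          return set(random.sample(cells, SENSORS))

      def local_search(chrom):
          # move one sensor to a random cell; keep the move if fitness improves
          cand = set(chrom)
          cand.discard(random.choice(sorted(cand)))
          cand.add((random.randrange(GRID), random.randrange(GRID)))
          return cand if fitness(cand) > fitness(chrom) else chrom

      pop = [random_chrom() for _ in range(POP)]
      for _ in range(GENS):
          pop.sort(key=fitness, reverse=True)
          parents = pop[:POP // 2]                       # selection
          children = [local_search(set(random.choice(parents)))
                      for _ in range(POP - len(parents))]  # variation + local search
          pop = parents + children
      print(fitness(max(pop, key=fitness)))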

  1. Memetic algorithm-based multi-objective coverage optimization for wireless sensor networks.

    PubMed

    Chen, Zhi; Li, Shuai; Yue, Wenjing

    2014-10-30

    Maintaining effective coverage and extending the network lifetime as much as possible has become one of the most critical issues in the coverage of WSNs. In this paper, we propose a multi-objective coverage optimization algorithm for WSNs, namely MOCADMA, which models the coverage control of WSNs as a multi-objective optimization problem. MOCADMA uses a memetic algorithm with a dynamic local search strategy to optimize the coverage of WSNs and achieve objectives such as high network coverage, effective node utilization and more residual energy. In MOCADMA, the alternative solutions are represented as chromosomes in matrix form, and the optimal solutions are selected through numerous iterations of the evolution process, including selection, crossover, mutation, local enhancement, and fitness evaluation. The experimental and evaluation results show that MOCADMA can maintain sensing coverage well, achieve higher network coverage while improving energy efficiency, effectively prolong the network lifetime, and deliver a significant improvement over some existing algorithms.

  2. No Longer Conveyor but Creator: Developing an Epistemology of the World Wide Web.

    ERIC Educational Resources Information Center

    Trombley, Laura E. Skandera; Flanagan, William G.

    2001-01-01

    Discusses the impact of the World Wide Web in terms of epistemology. Topics include technological innovations, including new dimensions of virtuality; the accessibility of information; tracking Web use via cookies; how the Web transforms the process of learning and knowing; linking information sources; and the Web as an information delivery…

  3. Discovering Decision Knowledge from Web Log Portfolio for Managing Classroom Processes by Applying Decision Tree and Data Cube Technology.

    ERIC Educational Resources Information Center

    Chen, Gwo-Dong; Liu, Chen-Chung; Ou, Kuo-Liang; Liu, Baw-Jhiune

    2000-01-01

    Discusses the use of Web logs to record student behavior that can assist teachers in assessing performance and making curriculum decisions for distance learning students who are using Web-based learning systems. Adopts decision tree and data cube information processing methodologies for developing more effective pedagogical strategies. (LRW)

  4. Applying Constructivist and Objectivist Learning Theories in the Design of a Web-based Course: Implications for Practice.

    ERIC Educational Resources Information Center

    Moallem, Mahnaz

    2001-01-01

    Provides an overview of the process of designing and developing a Web-based course using instructional design principles and models, including constructivist and objectivist theories. Explains the process of implementing an instructional design model in designing a Web-based undergraduate course and evaluates the model based on course evaluations.…

  5. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  6. APASdb: a database describing alternative poly(A) sites and selection of heterogeneous cleavage sites downstream of poly(A) signals

    PubMed Central

    You, Leiming; Wu, Jiexin; Feng, Yuchao; Fu, Yonggui; Guo, Yanan; Long, Liyuan; Zhang, Hui; Luan, Yijie; Tian, Peng; Chen, Liangfu; Huang, Guangrui; Huang, Shengfeng; Li, Yuxin; Li, Jie; Chen, Chengyong; Zhang, Yaqing; Chen, Shangwu; Xu, Anlong

    2015-01-01

    Increasing numbers of genes have been shown to utilize alternative polyadenylation (APA) 3′-processing sites depending on the cell and tissue type and/or physiological and pathological conditions at the time of processing, and the construction of a genome-wide database on APA is urgently needed for a better understanding of poly(A) site selection and APA-directed gene expression regulation in a given biological context. Here we present a web-accessible database, named APASdb (http://mosas.sysu.edu.cn/utr), which can visualize the precise map and usage quantification of different APA isoforms for all genes. The datasets are deeply profiled by the sequencing APA sites (SAPAS) method, which is capable of high-throughput sequencing of the 3′-ends of polyadenylated transcripts. Thus, APASdb details all the heterogeneous cleavage sites downstream of poly(A) signals, and maintains near-complete coverage of APA sites, much better than previous databases using conventional methods. Furthermore, APASdb provides the quantification of a given APA variant among transcripts with different APA sites by computing their corresponding normalized reads, making the database more useful. In addition, APASdb supports URL-based retrieval, browsing and display of exon-intron structure, poly(A) signals, poly(A) site location and usage reads, and 3′-untranslated regions (3′-UTRs). Currently, APASdb covers APA in various biological processes and diseases in human, mouse and zebrafish. PMID:25378337

  7. Web-based interactive 2D/3D medical image processing and visualization software.

    PubMed

    Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid

    2010-05-01

    There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To match the extendibility of current local medical image processing software, each layer is highly independent of the other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources on the client side. The user interface is designed such that users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  8. Judging nursing information on the WWW: a theoretical understanding.

    PubMed

    Cader, Raffik; Campbell, Steve; Watson, Don

    2009-09-01

    This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.

  9. When the oocyte becomes an embryo. The social life of an ambiguous scientific image in Italian newspapers (1996-2007).

    PubMed

    Beltrame, Lorenzo; Giovanetti, Silvia

    2009-01-01

    This paper analyses the social life of a scientific image frequently used in media coverage of human genetic and biotechnology issues. The expression "social life of an image" refers to the set of functions the image performs in the public sphere and the relations it interweaves with narratives and discourses. The paper starts from the assumption that the sense of an image is reflexive to its local contexts of use. Therefore, the social meaning of an image relies on the web of discursive links which constitute its social life. The paper explores the functions performed by a photograph of a fertilized oocyte in Italian media coverage of human genetic issues. We conclude by asserting that this image (a) acts as a visual link between several controversial issues; (b) has become the icon of a general master-frame of human genetics issues; and (c) takes a non-neutral part in the debates over these issues.

  10. WCS Challenges for NASA's Earth Science Data

    NASA Astrophysics Data System (ADS)

    Cantrell, S.; Swentek, L.; Khan, A.

    2017-12-01

    In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) is available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has focused recently on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations with WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated user needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.
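
    For reference, a WCS 2.0.1 GetCoverage request in KVP encoding can be composed as below; the endpoint and coverage identifier are hypothetical placeholders, not actual EOSDIS values.

      # Compose a WCS 2.0.1 GetCoverage request (KVP encoding). WCS 2.0 allows
      # one "subset" key per axis, so the parameters are given as pairs.
      from urllib.parse import urlencode

      params = [
          ("service", "WCS"),
          ("version", "2.0.1"),
          ("request", "GetCoverage"),
          ("coverageId", "EXAMPLE_DATASET"),   # placeholder coverage ID
          ("subset", "Lat(30,40)"),            # trim to a latitude band
          ("subset", "Long(-110,-100)"),       # and a longitude band
          ("format", "image/tiff"),
      ]
      print("https://example.gov/wcs?" + urlencode(params))  # placeholder host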

  11. An Architecture for Autonomic Web Service Process Planning

    NASA Astrophysics Data System (ADS)

    Moore, Colm; Xue Wang, Ming; Pahl, Claus

    Web service composition is a technology that has received considerable attention in the last number of years. Languages and tools to aid in the process of creating composite Web services have received specific attention. Web service composition is the process of linking single Web services together in order to accomplish more complex tasks. One area of Web service composition that has not received as much attention is dynamic error handling and re-planning, enabling autonomic composition. Given a repository of service descriptions and a task to complete, it is possible for AI planners to automatically create a plan that will achieve this goal. If, however, a service in the plan is unavailable or erroneous, the plan will fail. Motivated by this problem, this paper suggests autonomous re-planning as a means to overcome dynamic problems. Our solution involves automatically recovering from faults and creating a context-dependent alternate plan. We present an architecture that serves as a basis for the central activities of autonomous composition, monitoring and fault handling.
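
    A minimal sketch of the execute-monitor-replan idea, assuming a registry that maps capabilities to alternative services (all names below are invented):

      # Execute a service plan; on failure, "replan" by trying any alternative
      # service that advertises the same capability.
      registry = {
          "geocode": ["geo_a", "geo_b"],
          "route":   ["route_a"],
          "book":    ["book_a", "book_b"],
      }
      broken = {"geo_a", "book_a"}  # simulate unavailable services

      def invoke(service):
          if service in broken:
              raise RuntimeError(service + " unavailable")
          return service + ":ok"

      def execute(plan):
          results = []
          for capability in plan:
              for candidate in registry[capability]:  # replanning = try alternates
                  try:
                      results.append(invoke(candidate))
                      break
                  except RuntimeError:
                      continue
              else:
                  raise RuntimeError("no service left for " + capability)
          return results

      print(execute(["geocode", "route", "book"]))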

  12. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403

  13. A web service system supporting three-dimensional post-processing of medical images based on WADO protocol.

    PubMed

    He, Longjun; Xu, Lang; Ming, Xing; Liu, Qian

    2015-02-01

    Three-dimensional post-processing operations on the volume data generated by a series of CT or MR images are of great significance for image reading and diagnosis. As a part of the DICOM standard, the WADO service defines how to access DICOM objects on the Web, but it does not cover three-dimensional post-processing operations on series images. This paper analyzed the technical features of three-dimensional post-processing operations on volume data, and then designed and implemented a web service system for three-dimensional post-processing of medical images based on the WADO protocol. In order to improve the scalability of the proposed system, the business tasks and calculation operations were separated into two modules. The results showed that the proposed system can support three-dimensional post-processing services of medical images for multiple clients at the same moment, which meets the demand for accessing three-dimensional post-processing operations on volume data on the web.
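
    For context, a WADO-URI request for a single DICOM object looks like the sketch below; the parameter names follow the WADO specification (DICOM PS3.18), while the host and UIDs are placeholders.

      # Build a WADO-URI retrieval request for one DICOM object.
      from urllib.parse import urlencode

      query = urlencode({
          "requestType": "WADO",
          "studyUID":  "1.2.840.113619.2.1",      # placeholder UIDs
          "seriesUID": "1.2.840.113619.2.1.1",
          "objectUID": "1.2.840.113619.2.1.1.1",
          "contentType": "application/dicom",
      })
      print("https://pacs.example.org/wado?" + query)  # placeholder host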

  14. Providing Web Interfaces to the NSF EarthScope USArray Transportable Array

    NASA Astrophysics Data System (ADS)

    Vernon, Frank; Newman, Robert; Lindquist, Kent

    2010-05-01

    Since April 2004 the EarthScope USArray seismic network has grown to over 850 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform agnostic low-latency web-based applications. We present a framework of tools that mediate between the world wide web and Boulder Real Time Technologies Antelope Environmental Monitoring System data acquisition and archival software. These tools provide comprehensive information to audiences ranging from network operators and geoscience researchers, to funding agencies and the general public. This ranges from network-wide to station-specific metadata, state-of-health metrics, event detection rates, archival data and dynamic report generation over a station's two year life span. Leveraging open source web-site development frameworks for both the server side (Perl, Python and PHP) and client-side (Flickr, Google Maps/Earth and jQuery) facilitates the development of a robust extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web-standards. Typical seismic data warehouses allow online users to query and download data collected from regional networks, without the scientist directly visually assessing data coverage and/or quality. Using a suite of web-based protocols, we have recently developed an online seismic waveform interface that directly queries and displays data from a relational database through a web-browser. Using the Python interface to Datascope and the Python-based Twisted network package on the server side, and the jQuery Javascript framework on the client side to send and receive asynchronous waveform queries, we display broadband seismic data using the HTML Canvas element that is globally accessible by anyone using a modern web-browser. We are currently creating additional interface tools to create a rich-client interface for accessing and displaying seismic data that can be deployed to any system running the Antelope Real Time System. The software is freely available from the Antelope contributed code Git repository (http://www.antelopeusersgroup.org).

  15. Internet Technology in Magnetic Resonance: A Common Gateway Interface Program for the World-Wide Web NMR Spectrometer

    NASA Astrophysics Data System (ADS)

    Buszko, Marian L.; Buszko, Dominik; Wang, Daniel C.

    1998-04-01

    A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser has been described. The program, running on a UNIX workstation, uses multiple processes to handle concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance.
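
    The parent/child split with shared memory can be illustrated with a modern analogue; the sketch below uses Python multiprocessing as a stand-in for the original UNIX fork/shared-memory implementation, with an invented toy acquisition loop.

      # The child "acquires data" into a shared-memory buffer while the parent
      # waits and then serves the result, mirroring the paper's process split.
      from multiprocessing import Process, Array
      import time

      def acquire(buf):
          for i in range(len(buf)):   # pretend to digitize a signal point by point
              buf[i] = float(i) * 0.1
              time.sleep(0.01)

      if __name__ == "__main__":
          shared = Array("d", 8)       # shared-memory buffer of 8 doubles
          child = Process(target=acquire, args=(shared,))
          child.start()                # child process: data acquisition
          child.join()
          print(list(shared))          # parent process: report to the client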

  16. Rapid Onboard Data Product Generation with Multicore Processors and FPGA

    NASA Astrophysics Data System (ADS)

    Mandl, D.; Sohlberg, R. A.; Cappelaere, P. G.; Frye, S. W.; Ly, V.; Handy, M.; Ambrosia, V. G.; Sullivan, D. V.; Bland, G.; Pastor, E.; Crago, S.; Flatley, C.; Shah, N.; Bronston, J.; Creech, T.

    2012-12-01

    The Intelligent Payload Module (IPM) is an experimental testbed with multicore processors and a Field Programmable Gate Array (FPGA). This effort is funded by the NASA Earth Science Technology Office as part of an Advanced Information Systems Technology (AIST) 2011 research grant to investigate the use of high-performance onboard processing to create an onboard data processing pipeline that can rapidly take a subset of onboard imaging spectrometer data through (1) radiance-to-reflectance conversion, (2) atmospheric correction, (3) geolocation and co-registration, and (4) Level 2 data product generation. The requirements are driven by the mission concept for the HyspIRI NASA Decadal mission, although other NASA Decadal missions could use the same concept. The system is being set up to make use of the same ground and flight software used by other satellites at NASA/GSFC. Furthermore, a Web Coverage Processing Service (WCPS) is installed as part of the flight software, which enables a user on the ground to specify the desired algorithm to run onboard against the data in real time. Benchmark demonstrations are being run throughout the three-year effort on helicopter and airplane platforms with various instruments, to demonstrate configurations that would be compatible with the HyspIRI mission and other similar missions. This presentation will lay out the demonstrations conducted to date along with benchmark performance metrics and future demonstration efforts and objectives.
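
    A WCPS query of the kind a ground user could submit for onboard evaluation might look like the band-math example below; the coverage and band names are invented, and the variable syntax ($c) follows common rasdaman WCPS usage rather than this system's documentation.

      # A WCPS request computing NDVI server-side (here: onboard) from two
      # bands of a hypothetical coverage, returned as a GeoTIFF.
      wcps_query = """
      for $c in (ONBOARD_SCENE)
      return encode(
          ($c.nir - $c.red) / ($c.nir + $c.red),
          "image/tiff")
      """
      print(wcps_query.strip())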

  17. Novel Data Sources for Women’s Health Research: Mapping Breast Screening Online Information Seeking Through Google Trends

    PubMed Central

    Dehkordy, Soudabeh Fazeli; Carlos, Ruth C.; Hall, Kelli S.; Dalton, Vanessa K.

    2015-01-01

    Rationale and Objectives Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. Materials and Methods In order to capture the temporal variations of information seeking about dense breasts, the web search query "dense breast" was entered into the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Results Newsworthy events and legislative actions appear to correlate well with peaks in search volume for "dense breast". The geographic regions with the highest search volumes have passed, have rejected, or are currently considering dense breast legislation. Conclusions Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. PMID:24998689
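
    One way to pull a comparable search-interest series programmatically is the community pytrends package, sketched below; it was not used by the authors, and the timeframe and geography shown are illustrative.

      # Fetch relative search volume for "dense breast" via pytrends, an
      # unofficial Google Trends client.
      from pytrends.request import TrendReq

      pytrends = TrendReq(hl="en-US")
      pytrends.build_payload(["dense breast"],
                             timeframe="2013-01-01 2014-12-31",  # illustrative
                             geo="US")
      interest = pytrends.interest_over_time()  # weekly relative search volume
      print(interest.head())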

  18. Exploring NASA Satellite Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Shen, S.; Zhao, P.; Gerasimov, I. V.; Vollmer, B.; Vicente, G. A.; Pham, L.

    2013-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for the interpretation of extreme events (such as volcano eruptions or dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by providing satellite data as 'images' with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We will present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from the Ozone Monitoring Instrument (OMI), or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting. The portal interface will connect to the backend services with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources.

  19. Improving information retrieval with multiple health terminologies in a quality-controlled gateway.

    PubMed

    Soualmia, Lina F; Sakji, Saoussen; Letord, Catherine; Rollin, Laetitia; Massari, Philippe; Darmoni, Stéfan J

    2013-01-01

    The Catalog and Index of French-language Health Internet resources (CISMeF) is a quality-controlled health gateway, primarily for Web resources in French (n=89,751). Recently, we achieved a major improvement in the structure of the catalogue by setting up multiple terminologies, based on twelve health terminologies available in French, to overcome the potential weaknesses of the MeSH thesaurus, which has been the main and pivotal terminology used for indexing and retrieval since 1995. The main aim of this study was to estimate the added value of exploiting several terminologies and their semantic relationships to improve Web resource indexing and retrieval in CISMeF, in order to provide additional health resources that meet users' expectations. Twelve terminologies were integrated into the CISMeF information system to set up multiple-terminology indexing and retrieval. The same set of thirty queries was run (i) by exploiting the hierarchical structure of the MeSH, and (ii) by exploiting the additional twelve terminologies and their semantic links. The two search modes were evaluated and compared. The overall coverage of the multiple-terminology search mode was improved in comparison with the coverage of the MeSH alone (16,283 vs. 14,159 results, +15%). Of these additional results, an estimated 56.6% were relevant, 24.7% intermediate and 18.7% irrelevant. The multiple-terminology approach improved information retrieval. These results suggest that integrating additional health terminologies improved recall. Since performing the study, 21 other terminologies have been added, which should enable broader studies of multiple-terminology information retrieval.

  20. Cervical carcinoma in the European Union: an update on disease burden, screening program state of activation, and coverage as of March 2014.

    PubMed

    Altobelli, Emma; Lattanzi, Amedeo

    2015-03-01

    Cervical cancer (CC) is defined as a disease of disparity, owing to marked differences in CC incidence and mortality between developed and developing countries. As a continent, Europe is no exception. This study examines the state of activation of CC screening in the European Union as of March 2014, reviews CC incidence and mortality data, and highlights the initiatives adopted to extend program coverage to nonresponders. The present study is based on the most recent data available from PubMed-indexed journals, the Web sites of the health ministries of each member state, and the Web sites of national cancer observatories; failing these sources, information was sought in scientific journals published in the local language. In 2003, the European Council recommended that priority be given to organized screening program activation. Nonetheless, a number of European Union member states still lack population-based organized screening programs, and few have implemented programs directed at disadvantaged populations. Several investigations have demonstrated that the women at highest CC risk are those who are unscreened or underscreened. Since 2003, several member states have made significant efforts to set up effective prevention programs by adopting international quality standards and centralizing screening organization and result evaluation. However, several developed countries and some new central-eastern European member states still have poorly organized prevention programs, with adverse consequences for women's health. A diagnosis of CC is emotionally traumatic, but the disease is highly preventable; when found early, it is highly treatable and associated with long survival and good quality of life.

  1. Supporting tobacco control: stimulating local newspaper coverage with a technical assistance website for local coalitions.

    PubMed

    Buller, David B; Bettinghaus, Erwin P; Helme, Donald; Young, Walter F; Borland, Ron; Maloy, Julie A; Cutter, Gary R; Andersen, Peter A; Walther, Joseph B

    2011-11-01

    A large and growing literature confirms that well-designed web-based programs can be effective in preventing or treating several chronic diseases. This study examined how the Internet can deliver information and train community activists, and specifically tested the effects of web-based technical assistance on local tobacco control coalitions' efforts to use media advocacy to advance their agendas. The authors compared a highly interactive, Enhanced website (intervention) to a noninteractive, Basic text-based website (comparison) in Colorado communities. A total of 24 tobacco control coalitions led by local county health departments and nursing services were enrolled in the project and randomly assigned to use either the intervention or comparison website. A total of 73 local daily and weekly newspapers were identified in the service areas of 23 of the 24 coalitions. A posttest assessment of newspaper coverage was conducted to locate all newspaper articles with tobacco control information published between January 1 and April 9, 2004, the last 3 months of the intervention. There was no evidence of a treatment effect on the overall frequency of newspaper articles on tobacco-related issues; there was, however, evidence that newspapers in counties where the coalition had access to the Enhanced website printed more stories focused on local/regional issues and more anti-tobacco local/regional stories than in counties where coalitions had access to the Basic website. Coalitions can improve their influence on local media for community tobacco control when high-quality online technical assistance, training, and resources are available to them.

  2. News trends and web search query of HIV/AIDS in Hong Kong.

    PubMed

    Chiu, Alice P Y; Lin, Qianying; He, Daihai

    2017-01-01

    The HIV epidemic in Hong Kong has worsened in recent years, with major contributions from the high-risk subgroup of men who have sex with men (MSM). Internet use is prevalent among the majority of the local population, who seek health information online. This study examines the impact of HIV/AIDS and MSM news coverage on web search queries in Hong Kong. Relevant news coverage about HIV/AIDS and MSM from January 1st, 2004 to December 31st, 2014 was obtained from the WiseNews database. News trends were created by computing the number of relevant articles by type, topic, place of origin and sub-population. We then obtained relevant search volumes from Google and analysed causality between news trends and Google Trends using the Granger causality test and orthogonal impulse response functions. We found that editorial news has an impact on Google searches for "HIV", with the search term's popularity peaking on average two weeks after the news is published. Similarly, editorial news has an impact on the frequency of "AIDS" searches two weeks later. MSM-related news trends have a more fluctuating impact on "MSM" Google searches, with the time lag varying anywhere from one week to ten weeks. This infodemiological study shows that news trends have a positive impact on online search behavior around HIV/AIDS and MSM-related issues for up to ten weeks afterwards. Health promotion professionals could make use of this brief time window to tailor the timing of HIV awareness campaigns and public health interventions to maximise their reach and effectiveness.
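
    A Granger-causality check of the kind used in the study can be run with statsmodels, as sketched below on synthetic stand-ins for the news-trend and search-volume series.

      # Test whether a news-trend series Granger-causes a search-volume series.
      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(0)
      news = rng.normal(size=120)                                  # weekly news counts (synthetic)
      search = np.roll(news, 2) + rng.normal(scale=0.5, size=120)  # lagged response

      # column 0 = effect (search), column 1 = candidate cause (news);
      # the function returns (and prints) F-tests for each lag up to maxlag.
      data = np.column_stack([search, news])
      results = grangercausalitytests(data, maxlag=4)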

  3. Soil food web properties explain ecosystem services across European land use systems.

    PubMed

    de Vries, Franciska T; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C; d'Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W H Gera; Hotes, Stefan; Mortimer, Simon R; Setälä, Heikki; Sgardelis, Stefanos P; Uteseny, Karoline; van der Putten, Wim H; Wolters, Volkmar; Bardgett, Richard D

    2013-08-27

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world.

  4. Soil food web properties explain ecosystem services across European land use systems

    PubMed Central

    de Vries, Franciska T.; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A.; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C.; d’Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W. H. Gera; Hotes, Stefan; Mortimer, Simon R.; Setälä, Heikki; Sgardelis, Stefanos P.; Uteseny, Karoline; van der Putten, Wim H.; Wolters, Volkmar; Bardgett, Richard D.

    2013-01-01

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world. PMID:23940339

  5. Contributing, Exchanging and Linking for Learning: Supporting Web Co-Discovery in One-to-One Environments

    ERIC Educational Resources Information Center

    Liu, Chen-Chung; Don, Ping-Hsing; Chung, Chen-Wei; Lin, Shao-Jun; Chen, Gwo-Dong; Liu, Baw-Jhiune

    2010-01-01

    While Web discovery is usually undertaken as a solitary activity, Web co-discovery may transform Web learning activities from the isolated individual search process into interactive and collaborative knowledge exploration. Recent studies have proposed Web co-search environments on a single computer, supported by multiple one-to-one technologies.…

  6. Examining effects of medical cannabis narratives on beliefs, attitudes, and intentions related to recreational cannabis: A web-based randomized experiment.

    PubMed

    Sznitman, Sharon R; Lewis, Nehama

    2018-04-01

    This experimental study tests the effects of exposure to video narratives about successful symptom relief with Medical Cannabis (MC) on attitudes, beliefs, and intentions related to recreational cannabis use. Patient video testimonials were modeled after those found in extant media coverage. Israeli participants (N = 396) recruited through an online survey company were randomly assigned to view a narrative or a non-narrative video containing equivalent information about MC. Video content was further manipulated based on whether the protagonist had a stigmatized disease or not, and whether attribution of responsibility for his disease was internal or external. Exposure to patient testimonials indirectly increased positive attitudes, beliefs and intentions related to recreational cannabis use through changing attitudes, beliefs and intentions related to MC. Furthermore, exposure to narratives in which the patient was presented as not to blame for contracting his illness (external attribution) was associated with more positive attitudes, beliefs and intentions toward MC, a factor that was significantly associated with more positive attitudes, beliefs and intentions related to recreational cannabis use. These results suggest that narrative news media coverage of MC may influence public attitudes toward recreational cannabis. Because such media stories continue to be commonplace, it is important to examine potential spillover effects of this coverage on public perceptions of recreational cannabis. Cannabis prevention programs should address the role of media coverage in shaping public opinion and the distinction between medical and recreational cannabis use. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Aeolian processes over gravel beds: Field wind tunnel simulation and its application atop the Mogao Grottoes, China

    NASA Astrophysics Data System (ADS)

    Zhang, Weimin; Tan, Lihai; Zhang, Guobin; Qiu, Fei; Zhan, Hongtao

    2014-12-01

    The aeolian processes of erosion, transport and deposition are threatening the Mogao Grottoes, a world cultural heritage site. A field wind tunnel experiment was conducted atop the Mogao Grottoes using weighing sensors to quantify aeolian processes over protective gravel beds. Results reveal that aeolian erosion and deposition over gravel beds are influenced primarily by gravel coverage and wind speed. Erosion is the main aeolian process over gravel beds, and its strength is determined mainly by gravel coverage: strong (<30%), medium (30-50%) and slight (>50%). Aeolian deposition occurs only when gravel coverage is equal to or greater than 30% and wind speeds are between 8 and 12 m s-1, and this process continues until the equilibrium coverage is reached. In addition, changes in external sand supply affect the transition between aeolian deposition and erosion over gravel beds, and the quantity of sand transport at heights of 0-24 mm is an important indicator of aeolian deposition and erosion over gravel beds. Our results also demonstrate that making the best use of the wind regime atop the Mogao Grottoes and constructing an artificial gobi surface in staggered arrays, with 30% coverage, 30-mm-high gravels and 40-mm spacing, can trap westerly invading sand flow and enable the stronger easterly wind to return sand deposited on the gravel surface to the Mingsha Mountain, thereby minimizing the damage of the blown-sand flux to the Mogao Grottoes.

  8. Consumer trophic diversity as a fundamental mechanism linking predation and ecosystem functioning.

    PubMed

    Hines, Jes; Gessner, Mark O

    2012-11-01

    1. Primary production and decomposition, two fundamental processes determining the functioning of ecosystems, may be sensitive to changes in biodiversity and food web interactions. 2. The impacts of food web interactions on ecosystem functioning are generally quantified by experimentally decoupling these linked processes and examining either primary production-based (green) or decomposition-based (brown) food webs in isolation. This decoupling may strongly limit our ability to assess the importance of food web interactions on ecosystem processes. 3. To evaluate how consumer trophic diversity mediates predator effects on ecosystem functioning, we conducted a mesocosm experiment and a field study using an assemblage of invertebrates that naturally co-occur on North Atlantic coastal saltmarshes. We measured the indirect impact of predation on primary production and leaf decomposition as a result of prey communities composed of herbivores alone, detritivores alone or both prey in combination. 4. We find that primary consumers can influence ecosystem process rates not only within, but also across green and brown sub-webs. Moreover, by feeding on a functionally diverse consumer assemblage comprised of both herbivores and detritivores, generalist predators can diffuse consumer effects on decomposition, primary production and feedbacks between the two processes. 5. These results indicate that maintaining functional diversity among primary consumers can alter the consequences of traditional trophic cascades, and they emphasize the role of the detritus-based sub-web when seeking key biotic drivers of plant production. Clearly, traditional compartmentalization of empirical food webs can limit our ability to predict the influence of food web interactions on ecosystem functioning. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.

  9. "WWW.MDTF.ORG": a World Wide Web forum for developing open-architecture, freely distributed, digital teaching file software by participant consensus.

    PubMed

    Katzman, G L; Morris, D; Lauman, J; Cochella, C; Goede, P; Harnsberger, H R

    2001-06-01

    To foster a community-supported evaluation process for open-source digital teaching file (DTF) development and maintenance. The mechanisms used to support this process include standard web browsers, web servers, forum software, and custom additions to the forum software to potentially enable a mediated voting protocol. The web server will also serve as a focal point for beta and release software distribution, which is the desired end goal of this process. We foresee that www.mdtf.org will provide for widespread distribution of open-source DTF software that will include function and interface design decisions from community participation on the website forums.

  10. WebGL and web audio software lightweight components for multimedia education

    NASA Astrophysics Data System (ADS)

    Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław

    2017-08-01

    The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve the interactivity of lightweight WebGL and Web Audio components. For instance, users can define source audio nodes (including synthetic sources), destination audio nodes, and nodes for audio processing such as sound-wave shaping, spectral band filtering, and convolution-based modification. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel image processing and analysis by shaders is offered, such as nonlinear filtering, histograms of gradients, and Bayesian classifiers.
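    The node-graph style of audio processing described above maps directly onto the W3C Web Audio API. The following is a minimal, self-contained sketch (not taken from the WEBSA tools themselves) of a synthetic source feeding a spectral band filter and a convolution-based modifier; the node choices and parameter values are illustrative assumptions.

```typescript
// Minimal Web Audio processing chain of the kind the abstract describes:
// synthetic source -> spectral band filter -> convolution -> destination.
// Node names follow the standard Web Audio API; values are illustrative.
const ctx = new AudioContext();

// Synthetic source node: a 440 Hz sawtooth oscillator.
const source = ctx.createOscillator();
source.type = "sawtooth";
source.frequency.value = 440;

// Spectral band filtering: a band-pass biquad centred at 1 kHz.
const bandPass = ctx.createBiquadFilter();
bandPass.type = "bandpass";
bandPass.frequency.value = 1000;
bandPass.Q.value = 5;

// Convolution-based modification: a decaying noise burst stands in for
// a real impulse-response buffer, which would normally be loaded.
const convolver = ctx.createConvolver();
const ir = ctx.createBuffer(1, ctx.sampleRate / 2, ctx.sampleRate);
const data = ir.getChannelData(0);
for (let i = 0; i < data.length; i++) {
  data[i] = (Math.random() * 2 - 1) * Math.exp(-4 * i / data.length);
}
convolver.buffer = ir;

// Wire the graph; connect() returns its target, so the chain reads left to right.
source.connect(bandPass).connect(convolver).connect(ctx.destination);
source.start();
```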

  11. USGS "Did You Feel It?" internet-based macroseismic intensity maps

    USGS Publications Warehouse

    Wald, D.J.; Quitoriano, V.; Worden, B.; Hopper, M.; Dewey, J.W.

    2011-01-01

    The U.S. Geological Survey (USGS) "Did You Feel It?" (DYFI) system is an automated approach for rapidly collecting macroseismic intensity data from Internet users' shaking and damage reports and generating intensity maps immediately following earthquakes; it has been operating for over a decade (1999-2011). DYFI-based intensity maps made rapidly available through the DYFI system fundamentally depart from more traditional maps made available in the past. The maps are made more quickly, provide more complete coverage and higher resolution, provide for citizen input and interaction, and allow data collection at rates and quantities never before considered. These aspects of Internet data collection, in turn, allow for data analyses, graphics, and ways to communicate with the public, opportunities not possible with traditional data-collection approaches. Yet web-based contributions also pose considerable challenges, as discussed herein. After a decade of operational experience with the DYFI system and users, we document refinements to the processing and algorithmic procedures since DYFI was first conceived. We also describe a number of automatic post-processing tools, operations, applications, and research directions, all of which utilize the extensive DYFI intensity datasets now gathered in near-real time. DYFI can be found online at the website http://earthquake.usgs.gov/dyfi/. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia.

  12. Investigation of cloud properties and atmospheric stability with MODIS

    NASA Technical Reports Server (NTRS)

    Menzel, Paul

    1995-01-01

    In the past six months several milestones were accomplished. The MODIS Airborne Simulator (MAS) was flown in a 50 channel configuration for the first time in January 1995 and the data were calibrated and validated; in the same field campaign the approach for validating MODIS radiances using the MAS and High resolution Interferometer Sounder (HIS) instruments was successfully tested on GOES-8. Cloud masks for two scenes (one winter and the other summer) of AVHRR local area coverage from the Gulf of Mexico to Canada were processed and forwarded to the SDST for MODIS Science Team investigation; a variety of surface and cloud scenes were evident. Beta software preparations continued with incorporation of the EOS SDP Toolkit. SCAR-C data was processed and presented at the biomass burning conference. Preparations for SCAR-B accelerated with generation of a home page for access to real time satellite data related to biomass burning; this will be available to the scientists in Brazil via internet on the World Wide Web. The CO2 cloud algorithm was compared to other algorithms that differ in their construction of clear radiance fields. The HIRS global cloud climatology was completed for six years. The MODIS science team meeting was attended by five of the UW scientists.

  13. Shaking the Trees: The Psychology of Collecting in U.S. Newspaper Coverage of the College Admissions Process

    ERIC Educational Resources Information Center

    Bishop, Ronald

    2009-01-01

    A frame analysis was conducted to explore themes in recent coverage by print journalists of the college application process, with special attention paid to the use by reporters of "keywords, stock phrases, stereotyped images, sources of information, and sentences that provide reinforcing clusters of facts or judgments" (Entman, p. 52) about this…

  14. Tension-Enhanced Hydrogen Evolution Reaction on Vanadium Disulfide Monolayer

    NASA Astrophysics Data System (ADS)

    Pan, Hui

    2016-02-01

    Water electrolysis is an efficient way to produce hydrogen. Finding efficient, cheap, and eco-friendly electrocatalysts is essential to the development of this technology. In this work, we present a first-principles study of the effects of tension on the hydrogen evolution reaction of a novel electrocatalyst, the vanadium disulfide (VS2) monolayer. Two electrocatalytic processes, individual and collective, are investigated. We show that the catalytic ability of the VS2 monolayer at higher hydrogen coverage can be efficiently improved by escalating tension. We find that the individual process is more likely to occur over a wide range of hydrogen coverages, while the collective process is possible at a certain hydrogen coverage under the same tension. The best hydrogen evolution reaction, with near-zero Gibbs free energy, can be achieved by tuning tension. We further show that the change of catalytic activity with tension and hydrogen coverage is induced by the change of free-carrier density around the Fermi level; that is, the higher the carrier density, the better the catalytic performance. It is expected that tension can be a simple way to improve catalytic activity, leading to the design of novel electrocatalysts for efficient hydrogen production from water electrolysis.
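    For readers unfamiliar with the "near-zero Gibbs free energy" criterion mentioned above, the conventional hydrogen-adsorption descriptor used in first-principles HER studies takes the form below; the zero-point-energy and entropy corrections shown are the standard convention, not values taken from this paper.

```latex
% Conventional HER activity descriptor (a sketch of the usual convention;
% the paper's exact correction terms are not reproduced here).
% \Delta E_{\mathrm{H}}(\theta) is the DFT hydrogen adsorption energy at
% coverage \theta under a given tension.
\Delta G_{\mathrm{H^*}}(\theta) \;=\; \Delta E_{\mathrm{H}}(\theta)
  \;+\; \Delta E_{\mathrm{ZPE}} \;-\; T\,\Delta S_{\mathrm{H}}
% Optimal catalytic activity corresponds to \Delta G_{\mathrm{H^*}} \approx 0.
```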

  15. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary, systems using the same Web technologies and specifications according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, it is necessary that its architecture satisfy all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all geospatial resources must be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformations are candidate functionalities for a uniform interface.
However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really ascend to the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web have its own identifier (e.g. a URI). This allows citation and re-use of resources simply by providing the URI. OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured, with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML), although the underlying protocol is generic. Concerning point c) (self-descriptive messages), the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point d) (hypermedia as the engine of application state) is actually where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact, existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would allow defining not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would allow building an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier of access to geospatial applications for non-specialists (e.g. the success of Google Maps and other Web mapping applications). • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated/replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it is really simple, and complex interactions would therefore require several transfers. • In the geospatial domain some of the most valuable resources are processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use-cases, not covering all possible applications.
The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
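    A concrete, if simplified, reading of the uniform-interface argument above: every coverage is a URI-identified resource, and generic operations such as subsetting and resampling become query parameters on a single generic GET. The parameter names and URI scheme below are assumptions for illustration, not an OGC or REST standard.

```typescript
// Sketch of a uniform interface for geospatial resources. Every resource
// is identified by a URI (REST constraint a) and manipulated through
// representations (constraint b); subsetting, resampling, and CRS
// transformation are expressed as generic query parameters.
interface CoverageRequest {
  uri: string;                        // resource identifier
  subset?: { bbox: [number, number, number, number]; time?: string };
  resample?: { resolution: number };  // candidate generic operation
  crs?: string;                       // coordinate reference system
}

// One generic GET serves any coverage resource; the Accept header keeps
// the message self-descriptive (constraint c).
async function getCoverage(req: CoverageRequest): Promise<ArrayBuffer> {
  const params = new URLSearchParams();
  if (req.subset) {
    params.set("bbox", req.subset.bbox.join(","));
    if (req.subset.time) params.set("time", req.subset.time);
  }
  if (req.resample) params.set("resolution", String(req.resample.resolution));
  if (req.crs) params.set("crs", req.crs);
  const res = await fetch(`${req.uri}?${params}`, {
    headers: { Accept: "application/octet-stream" },
  });
  if (!res.ok) throw new Error(`GET ${req.uri} failed: ${res.status}`);
  return res.arrayBuffer();
}
```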

  16. Multiscale Drivers of Global Environmental Health

    NASA Astrophysics Data System (ADS)

    Desai, Manish Anil

    In this dissertation, I motivate, develop, and demonstrate three such approaches for investigating multiscale drivers of global environmental health: (1) a metric for analyzing contributions and responses to climate change from global to sectoral scales, (2) a framework for unraveling the influence of environmental change on infectious diseases at regional to local scales, and (3) a model for informing the design and evaluation of clean cooking interventions at community to household scales. The full utility of climate debt as an analytical perspective will remain untapped without tools that can be manipulated by a wide range of analysts, including global environmental health researchers. Chapter 2 explains how international natural debt (IND) apportions global radiative forcing from fossil fuel carbon dioxide and methane, the two most significant climate altering pollutants, to individual entities -- primarily countries but also subnational states and economic sectors, with even finer scales possible -- as a function of unique trajectories of historical emissions, taking into account the quite different radiative efficiencies and atmospheric lifetimes of each pollutant. Owing to its straightforward and transparent derivation, IND can readily operationalize climate debt to consider issues of equity and efficiency and drive scenario exercises that explore the response to climate change at multiple scales. Collectively, the analyses presented in this chapter demonstrate how IND can inform a range of key question on climate change mitigation at multiple scales, compelling environmental health towards an appraisal of the causes and not just the consequences of climate change. The environmental change and infectious disease (EnvID) conceptual framework of Chapter 3 builds on a rich history of prior efforts in epidemiologic theory, environmental science, and mathematical modeling by: (1) articulating a flexible and logical system specification; (2) incorporating transmission groupings linked to public health intervention strategies; (3) emphasizing the intersection of proximal environmental characteristics and transmission cycles; (4) incorporating a matrix formulation to identify knowledge gaps and facilitate an integration of research; and (5) highlighting hypothesis generation amidst dynamic processes. A systems based approach leverages the reality that studies relevant to environmental change and infectious disease are embedded within a wider web of interactions. As scientific understanding advances, the EnvID framework can help integrate the various factors at play in determining environment-disease relationships and the connections between intrinsically multiscale causal networks. In Chapter 4, the coverage effect model functions primarily as a "proof of concept" analysis to address whether the efficacy of a clean cooking technology may be determined by the extent of not only household level use but also community level coverage. Such coverage dependent efficacy, or a "coverage effect," would transform how interventions are studied and deployed. Ensemble results are consistent with the concept that an appreciable coverage effect from clean cooking interventions can manifest within moderately dense communities. Benefits for users derive largely from direct effects; initially, at low coverage levels, almost exclusively so. Yet, as coverage expands within a user's community, a coverage effect becomes increasingly beneficial. 
In contrast, non-users, despite also experiencing comparable exposure reductions from community-level intervention use, cannot proportionately benefit because their exposures remain overwhelmingly dominated by household-level use of traditional solid-fuel cookstoves. The coverage effect model strengthens the rationale for public health programs and policies to encourage clean cooking technologies, with an added incentive to realize high coverage within contiguous areas. The implications of the modeling exercise extend to priorities for data collection, underscoring the importance of outdoor pollution concentrations during, as well as before and/or after, community cooking windows, and also routine measurement of ventilation, meteorology, time-activity patterns, and cooking practices. The possibility of a coverage effect necessitates appropriate strategies to estimate not only direct effects but also coverage and total effects to avoid impaired conclusions. The specter of accelerating social and ecological change challenges efforts to respond to climate change, re/emerging infectious diseases, and household air pollution. Environmental health possesses a well-established and well-tested repertoire of methods, but contending with multiscale drivers of risk requires complementary approaches as well. Integrating metrics, frameworks, and models -- and their insights -- into its analytical arsenal can help global environmental health meet the challenges of today and tomorrow. (Abstract shortened by ProQuest.)

  17. Electrohydrodynamic spinning of random-textured silver webs for electrodes embedded in flexible organic solar cells

    NASA Astrophysics Data System (ADS)

    Yoon, Dai Geon; Chin, Byung Doo; Bail, Robert

    2017-03-01

    A convenient process for fabricating a transparent conducting electrode on a flexible substrate is essential for numerous low-cost optoelectronic devices, including organic solar cells (OSCs), touch sensors, and free-form lighting applications. Solution-processed metal-nanowire arrays are attractive due to their low sheet resistance and optical clarity. However, the limited conductance at wire junctions and the rough surface topology still need improvement. Here, we present a facile electrohydrodynamic spinning process using a silver (Ag)-polymer composite paste with high viscosity. Unlike the metal-nanofiber web formed by conventional electrospinning, a relatively thick, but still invisible to the naked eye, random Ag-web pattern was formed on a glass substrate. The process parameters, such as the nozzle diameter, voltage, flow rate, standoff height, and nozzle-scanning speed, were systematically engineered. The formed random-texture Ag webs were embedded in a flexible substrate by in-situ photo-polymerization, release from the glass substrate, and post-annealing. OSCs with a donor-acceptor polymeric heterojunction photoactive layer were prepared on the Ag-web-embedded flexible films with various Ag-web densities. The short-circuit current and the power conversion efficiency of an OSC with a Ag-web-embedded electrode were not as high as those of the control sample with an indium-tin-oxide electrode. However, the Ag-web textures embedded in the OSC served well as electrodes when bent (6-mm radius), showing a power conversion efficiency of 2.06% (2.72% for the flat OSC), and the electrical stability of the Ag-web-textured patterns was maintained for up to 1,000 cycles of bending.

  18. Public anxiety and information seeking following the H1N1 outbreak: blogs, newspaper articles, and Wikipedia visits.

    PubMed

    Tausczik, Yla; Faasse, Kate; Pennebaker, James W; Petrie, Keith J

    2012-01-01

    Web-based methodologies may provide a new and unique insight into public response to an infectious disease outbreak. This naturalistic study investigates the effectiveness of new web-based methodologies in assessing anxiety and information seeking in response to the 2009 H1N1 outbreak by examining language use in weblogs ("blogs"), newspaper articles, and web-based information seeking. Language use in blogs and newspaper articles was assessed using Linguistic Inquiry and Word Count, and information seeking was examined using the number of daily visits to H1N1-relevant Wikipedia articles. The results show that blogs mentioning "swine flu" used significantly higher levels of anxiety, health, and death words and lower levels of positive emotion words than control blogs. Change in language use on blogs was strongly related to change in language use in newspaper coverage for the same day. Both the measure of anxiety in blogs mentioning "swine flu" and the number of Wikipedia visits followed similar trajectories, peaking shortly after the announcement of H1N1 and then declining rapidly. Anxiety measured in blogs preceded information seeking on Wikipedia. These results show that the public reaction to H1N1 was rapid and short-lived. This research suggests that analysis of web behavior can provide a source of naturalistic data on the level and changing pattern of public anxiety and information seeking following the outbreak of a public health emergency.

  19. Principles versus procedures in making health care coverage decisions: addressing inevitable conflicts.

    PubMed

    Sabik, Lindsay M; Lie, Reidar K

    2008-01-01

    It has been suggested that focusing on procedures when setting priorities for health care avoids the conflicts that arise when attempting to agree on principles. A prominent example of this approach is "accountability for reasonableness." We will argue that the same problem arises with procedural accounts; reasonable people will disagree about central elements in the process. We consider the procedural condition of appeal process and three examples of conflicts over coverage decisions: a patients' rights law in Norway, health technologies coverage recommendations in the UK, and care withheld by HMOs in the US. In each case a process is at the center of controversy, illustrating the difficulties in establishing procedures that are widely accepted as legitimate. Further work must be done in developing procedural frameworks.

  20. Web-based data collection: detailed methods of a questionnaire and data gathering tool

    PubMed Central

    Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R

    2006-01-01

    There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and describes the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and the data tables needed to store the results, in a design that attempts to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556

  1. Evaluating Web Usability

    ERIC Educational Resources Information Center

    Snider, Jean; Martin, Florence

    2012-01-01

    Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review, and marketing focus-group and demographic data were utilized to conduct the usability evaluation. The results indicated that…

  2. Stable Isotope Tracers of Process in Great Lakes Food Webs

    EPA Science Inventory

    Stable isotope analyses of biota are now commonly used to discern trophic pathways between consumers and their foods. However, those same isotope data also hold information about processes that influence the physicochemical setting of food webs as well as biological processes ope...

  3. Evaluating the Construct-Coverage of the e-rater[R] Scoring Engine. Research Report. ETS RR-09-01

    ERIC Educational Resources Information Center

    Quinlan, Thomas; Higgins, Derrick; Wolff, Susanne

    2009-01-01

    This report evaluates the construct coverage of the e-rater[R] scoring engine. The matter of construct coverage depends on whether one defines writing skill in terms of process or product. Originally, the e-rater engine consisted of a large set of components with a proven ability to predict human holistic scores. By organizing these capabilities…

  4. The New Zealand Major Trauma Registry: the foundation for a data-driven approach in a contemporary trauma system.

    PubMed

    Isles, Siobhan; Christey, Grant; Civil, Ian; Hicks, Peter

    2017-10-06

    To describe the development of the New Zealand Major Trauma Registry (NZ-MTR) and the initial experiences of its use. The background to the development of the NZ-MTR is reviewed and the processes undertaken to implement a single instance of a web-based national registry are described. A national minimum dataset was defined and utilised. Key structures to support the Registry, such as a data governance group, were established. The NZ-MTR was successfully implemented and is the foundation for a new, data-driven model of quality improvement. In its first year of operation over 1,300 patients were entered into the Registry, although coverage is not yet universal. Overall incidence is 40.8 major trauma cases/100,000 population. The incidence in the Māori population was 69/100,000 compared with 31/100,000 in the non-Māori population. Case fatality rate was 9%. Three age peaks were observed: at 20-24 years, 50-59 years and above 85 years. Road traffic crashes accounted for 50% of all caseload. A significant proportion of major trauma patients (21%) were transferred to one or more hospitals before reaching a definitive care facility. Despite the challenges of working across multiple jurisdictions, initiation of a single-instance web-based registry has been achieved. The NZ-MTR enables New Zealand to have a national view of trauma treatment and outcomes for the first time. It will inform quality improvement and injury prevention initiatives and potentially decrease the burden of injury on all New Zealanders.

  5. Online Access to Weather Satellite Imagery Through the World Wide Web

    NASA Technical Reports Server (NTRS)

    Emery, W.; Baldwin, D.

    1998-01-01

    Both global area coverage (GAC) and high-resolution picture transmission (HRPT) data from the Advanced Very High Resolution Radiometer (AVHRR) are made available to Internet users through an online data access system. Older GOES-7 data are also available. Created as a "testbed" data system for NASA's future Earth Observing System Data and Information System (EOSDIS), this testbed provides an opportunity to test both the technical requirements of an online data system and the different ways in which the general user community would employ such a system. Initiated in December 1991, the basic data system experienced five major evolutionary changes in response to user requests and requirements. Features added with these changes were the addition of online browse, user subsetting, dynamic image processing/navigation, a stand-alone data storage system, and movement from an X-Windows graphical user interface (GUI) to a World Wide Web (WWW) interface. Over its lifetime, the system has had as many as 2500 registered users. The system on the WWW has had over 2500 hits since October 1995. Many of these hits are by casual users that only take the GIF images directly from the interface screens and do not specifically order digital data. Still, there is a consistent stream of users ordering the navigated image data and related products (maps and so forth). We have recently added a real-time, seven-day, northwestern United States normalized difference vegetation index (NDVI) composite that has generated considerable interest. Index Terms: data system, earth science, online access, satellite data.

  6. Teachers' Attitudes Toward WebQuests as a Method of Teaching

    ERIC Educational Resources Information Center

    Perkins, Robert; McKnight, Margaret L.

    2005-01-01

    One of the latest uses of technology gaining popular status in education is the WebQuest, a process that involves students using the World Wide Web to solve a problem. The goals of this project are to: (a) determine if teachers are using WebQuests in their classrooms; (b) ascertain whether teachers feel WebQuests are effective for teaching…

  7. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling.

    PubMed

    Devi, R Suganya; Manjula, D; Siddharth, R K

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling.

  8. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    PubMed Central

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web which has to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of Depth First Search Algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and its metadata such as title, keywords, and description are extracted. This content is very essential for any type of analyser work to be carried on the Big Data obtained as a result of Web Crawling. PMID:26137592
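    As a rough illustration of the crawl-and-extract pipeline the two records above describe, the sketch below performs a depth-limited depth-first traversal of hyperlinks and records each page's title, keywords, and description. The paper's modified Depth First Search is not reproduced here, so this only captures the general shape of such a crawler.

```typescript
// Depth-limited DFS crawl that stores each visited page's URL together
// with the metadata an analyser needs (title, keywords, description).
interface PageMeta {
  url: string;
  title: string;
  keywords: string;
  description: string;
}

const visited = new Set<string>();

// Pull the content of a named <meta> tag out of raw HTML source.
function meta(html: string, name: string): string {
  const m = html.match(
    new RegExp(`<meta[^>]+name=["']${name}["'][^>]+content=["']([^"']*)`, "i"),
  );
  return m ? m[1] : "";
}

async function crawl(url: string, depth: number, out: PageMeta[]): Promise<void> {
  if (depth < 0 || visited.has(url)) return;
  visited.add(url);
  const html = await (await fetch(url)).text();

  // Extract metadata from the page source alongside the link itself.
  const title = html.match(/<title>([^<]*)<\/title>/i)?.[1] ?? "";
  out.push({ url, title, keywords: meta(html, "keywords"),
             description: meta(html, "description") });

  // Depth-first descent into the hyperlinks found in the source code.
  for (const m of html.matchAll(/<a[^>]+href=["'](https?:\/\/[^"']+)/gi)) {
    await crawl(m[1], depth - 1, out);
  }
}
```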

  9. A National Crop Progress Monitoring System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.

    2011-12-01

    Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed using the data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratios of change relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with NASS's remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To facilitate the operational requirement and increase the accessibility of data and products by different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of the Open Geospatial Consortium (OGC). Sensor observations and data are accessed through Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) if available. Products are also served through such open-specification-compliant services. For rendering and presentation, Web Map Service (WMS) is used. A Web-service-based system is set up and deployed at dss.csiss.gmu.edu/NDVIDownload. Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
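    The two vegetation indices named above are simple per-pixel computations on red and near-infrared surface reflectance (the bands MOD09 provides). A minimal sketch follows; the VCI scaling against multi-year NDVI extremes uses the common definition, and the project's exact MVCI formulation is not reproduced.

```typescript
// NDVI = (NIR - Red) / (NIR + Red), from surface reflectance bands.
function ndvi(nir: number, red: number): number {
  return (nir - red) / (nir + red);
}

// VCI rescales today's NDVI against the multi-year min/max for the same
// pixel and time of year, expressed as a percentage of the historical range.
function vci(ndviNow: number, ndviMin: number, ndviMax: number): number {
  return (100 * (ndviNow - ndviMin)) / (ndviMax - ndviMin);
}

// Example: NIR = 0.45, Red = 0.08 gives NDVI ~ 0.70 (dense vegetation);
// against a historical range of [0.2, 0.8], the VCI is ~83%.
const n = ndvi(0.45, 0.08);
console.log(n.toFixed(2), vci(n, 0.2, 0.8).toFixed(0));
```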

  10. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    PubMed

    Eysenbach, Gunther; Trudel, Mathieu

    2005-12-30

    Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can process publisher-submitted "citing articles" (submitted, for example, as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.

  11. A web-based solution for 3D medical image visualization

    NASA Astrophysics Data System (ADS)

    Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo

    2015-03-01

    In this presentation, we present a web-based 3D medical image visualization solution which enables interactive large medical image data processing and visualization over the web platform. To improve the efficiency of our solution, we adopt GPU accelerated techniques to process images on the server side while rapidly transferring images to the HTML5 supported web browser on the client side. Compared to traditional local visualization solution, our solution doesn't require the users to install extra software or download the whole volume dataset from PACS server. By designing this web-based solution, it is feasible for users to access the 3D medical image visualization service wherever the internet is available.

  12. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    DTIC Science & Technology

    2017-02-01

    Image processing algorithms are made into client applications that can be accessed from an image processing web service developed following Representational State Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service.

  13. Effects of Reflection Category and Reflection Quality on Learning Outcomes during Web-Based Portfolio Assessment Process: A Case Study of High School Students in Computer Application Course

    ERIC Educational Resources Information Center

    Chou, Pao-Nan; Chang, Chi-Cheng

    2011-01-01

    This study examines the effects of reflection category and reflection quality on learning outcomes during the Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflection, and join…

  14. Internet Technology in Magnetic Resonance: A Common Gateway Interface Program for the World-Wide Web NMR Spectrometer

    PubMed

    Buszko; Buszko; Wang

    1998-04-01

    A custom-written Common Gateway Interface (CGI) program for remote control of an NMR spectrometer using a World Wide Web browser has been described. The program, running on a UNIX workstation, uses multiple processes to handle concurrent tasks of interacting with the user and with the spectrometer. The program's parent process communicates with the browser and sends out commands to the spectrometer; the child process is mainly responsible for data acquisition. Communication between the processes is via the shared memory mechanism. The WWW pages that have been developed for the system make use of the frames feature of web browsers. The CGI program provides an intuitive user interface to the NMR spectrometer, making, in effect, a complex system an easy-to-use Web appliance. Copyright 1998 Academic Press.
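    The parent/child division of labour described above translates loosely into modern terms as follows. This Node/TypeScript sketch substitutes an IPC message channel for the original UNIX shared-memory mechanism, and the child module name and command format are hypothetical.

```typescript
// Loose analogue of the two-process CGI design: a parent process that
// talks to the browser and issues spectrometer commands, and a child
// process responsible for data acquisition. The original used shared
// memory; Node's IPC channel stands in here, so only the division of
// labour is illustrated.
import { fork } from "node:child_process";
import { createServer } from "node:http";

const acquirer = fork("acquire.js"); // hypothetical child: acquisition loop
let latestFid: number[] = [];

// Child streams acquired data points back to the parent.
acquirer.on("message", (points: number[]) => { latestFid = points; });

// Parent process: handle browser requests and forward commands.
createServer((req, res) => {
  if (req.url === "/acquire") {
    acquirer.send({ cmd: "start", scans: 16 }); // command to the acquisition side
    res.end("acquisition started");
  } else if (req.url === "/data") {
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify(latestFid)); // latest acquired data
  } else {
    res.end("NMR web control: /acquire, /data");
  }
}).listen(8080);
```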

  15. A study of an adaptive replication framework for orchestrated composite web services.

    PubMed

    Mohamed, Marwa F; Elyamany, Hany F; Nassar, Hamed M

    2013-01-01

    Replication is considered one of the most important techniques for improving the Quality of Service (QoS) of published Web Services. It has achieved impressive success in managing resource sharing and usage in order to moderate the energy consumed in IT environments. For a robust and successful replication process, attention should be paid to suitable timing as well as the constraints and capabilities under which the process runs. The replication process is time-consuming, since outsourcing new replicas to other hosts is lengthy. Furthermore, most of the business processes that are implemented over the Web nowadays are composed of multiple Web services working together in two main styles: orchestration and choreography. Accomplishing replication over such business processes is another challenge due to the complexity and flexibility involved. In this paper, we present an adaptive replication framework for regular and orchestrated composite Web services. The suggested framework includes a number of components for detecting unexpected and undesirable events that might occur when consuming the original published Web services, including failure or overloading. It also includes a specific replication controller to manage the replication process and select the best host to encapsulate a new replica. In addition, it includes a component for predicting the incoming load in order to decrease the time needed for outsourcing new replicas, enhancing performance greatly. A simulation environment has been created to measure the performance of the suggested framework. The results indicate that adaptive replication with the prediction scenario is the best option for enhancing the performance of the replication process in an online business environment.
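    The framework's main control loop, as described, combines event detection, load prediction, and host selection. The sketch below is an assumed rendering of those three components; the interfaces, the exponential-moving-average predictor, and the threshold are illustrative, not the paper's algorithms.

```typescript
// Assumed shapes for monitored services and candidate replica hosts.
interface Host { id: string; cpuLoad: number; latencyMs: number }
interface ServiceStatus { failed: boolean; requestsPerSec: number }

// Simple load predictor: exponential moving average over recent rates.
function predictLoad(history: number[], alpha = 0.5): number {
  return history.reduce((ema, x) => alpha * x + (1 - alpha) * ema, history[0] ?? 0);
}

// Replication controller: pick the host with the most headroom.
function selectBestHost(hosts: Host[]): Host {
  return hosts.reduce((best, h) =>
    h.cpuLoad + h.latencyMs / 100 < best.cpuLoad + best.latencyMs / 100 ? h : best);
}

// Replicate when a failure is detected or the predicted load is about to spike.
function maybeReplicate(status: ServiceStatus, history: number[], hosts: Host[]) {
  const predicted = predictLoad(history);
  if (status.failed || predicted > 100 /* req/s, assumed threshold */) {
    const host = selectBestHost(hosts);
    console.log(`outsourcing new replica to ${host.id} (predicted ${predicted} req/s)`);
  }
}
```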

  16. Work of the Web Weavers: Web Development in Academic Libraries

    ERIC Educational Resources Information Center

    Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.

    2009-01-01

    Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…

  17. Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective

    ERIC Educational Resources Information Center

    Hadjerrouit, Said

    2005-01-01

    In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…

  18. 78 FR 76187 - 30-Day Notice of Proposed Information Collection: Exchange Programs Alumni Web Site Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-16

    ...: Exchange Programs Alumni Web Site Registration ACTION: Notice of request for public comment and submission... Information Collection: Exchange Programs Alumni Web site Registration. OMB Control Number: 1405-0192. Type of... proposed collection: The International Exchange Alumni Web site requires information to process users...

  19. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid-response observations from high-resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate and image phenomena such as wildfires, volcanoes, floods and ice breakup with high-resolution instruments. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based, artificial intelligence software.

  20. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Due to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides an efficient, flexible and reliable implementation for encountering different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on an SOA platform to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow for acquiring and processing remotely sensed data, detecting fire, and sending notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are represented using web-based processing of remote sensing imagery utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as an engine of composition. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.
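    The heart of such a composition is issuing WPS Execute requests and chaining their outputs. The sketch below builds an Execute document of the general WPS 1.0.0 shape; the process and input identifiers ("ExtractFireEvents", "modisCoverage", "threshold") are hypothetical.

```typescript
// Build a WPS 1.0.0-style Execute request for a named process with
// literal inputs. Identifiers are hypothetical; only the general
// envelope shape of an Execute document is shown.
function wpsExecute(processId: string, inputs: Record<string, string>): string {
  const inputXml = Object.entries(inputs).map(([id, value]) => `
    <wps:Input>
      <ows:Identifier>${id}</ows:Identifier>
      <wps:Data><wps:LiteralData>${value}</wps:LiteralData></wps:Data>
    </wps:Input>`).join("");
  return `<?xml version="1.0" encoding="UTF-8"?>
<wps:Execute service="WPS" version="1.0.0"
    xmlns:wps="http://www.opengis.net/wps/1.0.0"
    xmlns:ows="http://www.opengis.net/ows/1.1">
  <ows:Identifier>${processId}</ows:Identifier>
  <wps:DataInputs>${inputXml}
  </wps:DataInputs>
</wps:Execute>`;
}

// One chained step: detect fires in a MODIS scene; a second Execute
// (not shown) would pass the result to a notification process.
const body = wpsExecute("ExtractFireEvents", {
  modisCoverage: "MOD021KM_2013-08-01",   // assumed scene identifier
  threshold: "330",                       // brightness temperature (K), assumed
});
console.log(body);
```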

  1. The New Web-Based Hera Data Processing System at the HEASARC

    NASA Technical Reports Server (NTRS)

    Pence, W.

    2011-01-01

    The HEASARC at NASA/GSFC has provided an on-line astronomical data processing system called Hera for several years. Hera provides a complete data processing environment, including installed software packages, local data storage, and the CPU resources needed to process the user's data. The original design of Hera, however, has two requirements that have limited its usefulness for some users, namely: 1) the user must download and install a small helper program on their own computer before using Hera, and 2) Hera requires that several computer ports/sockets be allowed to communicate through any local firewalls on the user's machine. Both of these restrictions can be problematic for some users; therefore we are now migrating Hera into a purely Web-based environment which only requires a standard Web browser. The first release of Web Hera is now publicly available at http://heasarc.gsfc.nasa.gov/webheara/. It currently provides a standard graphical interface for running hundreds of different data processing programs that are available in the HEASARC's ftools software package. Over the next year we plan to add more features to Web Hera, including an interactive command-line interface and more display capabilities.

  2. [Quantification of acetabular coverage in normal adult].

    PubMed

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. A practical AutoLISP program on PC AutoCAD has been developed by us to quantify the acetabular coverage through numerical expression of the images of computed tomography. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial CT scans. The slices were prepared with a fixed coordinate system and in continuous sections of 5 mm in thickness. The contours of the cartilage of each section were digitized into a PC computer and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized as normal. Polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.
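
    A minimal sketch of the kind of per-slice geometry behind such coverage ratios, assuming digitized contours are available as polygon vertex lists. It uses the shapely library and toy rectangles purely for illustration; it is not the authors' AutoLISP/AutoCAD program.

    ```python
    from shapely.geometry import Polygon

    # Illustrative contours (x, y in mm): femoral head cross-section and
    # acetabular cartilage region digitized from one CT slice.
    femoral_head = Polygon([(0, 0), (50, 0), (50, 50), (0, 50)])
    acetabular_cartilage = Polygon([(0, 20), (50, 20), (50, 60), (0, 60)])

    # Coverage ratio: covered head area / total head area on this slice.
    covered = femoral_head.intersection(acetabular_cartilage).area
    coverage_ratio = covered / femoral_head.area
    print(f"coverage ratio: {coverage_ratio:.0%}")  # 60% for these toy shapes
    ```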

  3. Patient Information about Gout: An International Review of Existing Educational Resources.

    PubMed

    Johnston, Megan E; Treharne, Gareth J; Chapman, Peter T; Stamp, Lisa K

    2015-06-01

    Inadequate patient information about gout may contribute to poor disease outcomes. We reviewed existing educational resources for gout to identify strengths and weaknesses and compare resources cross-nationally. Content, readability, and dietary recommendations were reviewed using a sample of 30 resources (print and Web-based) from 6 countries. More than half of the resources were written at a highly complex level. Some content areas were lacking coverage, including comorbidity risks, uric acid target levels, and continuing allopurinol during acute attacks. Our findings suggest significant room for improvement in gout patient educational resources, particularly regarding self-management.

  4. Issues in Data Fusion for Satellite Aerosol Measurements for Applications with GIOVANNI System at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Gopalan, Arun; Zubko, Viktor; Leptoukh, Gregory G.

    2008-01-01

    We look at issues, barriers and approaches for Data Fusion of satellite aerosol data as available from the GES DISC GIOVANNI Web Service. Daily Global Maps of AOT from a single satellite sensor alone contain gaps that arise due to various sources (sun glint regions, clouds, orbital swath gaps at low latitudes, bright underlying surfaces etc.). The goal is to develop a fast, accurate and efficient method to improve the spatial coverage of the Daily AOT data to facilitate comparisons with Global Models. Data Fusion may be supplemented by Optimal Interpolation (OI) as needed.
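
    As a hedged sketch of the gap-filling idea (a naive neighborhood average, not the Optimal Interpolation scheme mentioned above), the snippet below fills missing cells of a synthetic daily AOT grid:

    ```python
    import numpy as np

    # Synthetic daily AOT grid with gaps as NaN (clouds, glint, swath gaps).
    rng = np.random.default_rng(0)
    aot = rng.uniform(0.05, 0.6, size=(6, 8))
    aot[rng.random(aot.shape) < 0.3] = np.nan

    def fill_gaps_once(grid):
        """Replace each NaN with the mean of its valid 3x3 neighbors."""
        filled = grid.copy()
        rows, cols = grid.shape
        for i in range(rows):
            for j in range(cols):
                if np.isnan(grid[i, j]):
                    window = grid[max(i-1, 0):i+2, max(j-1, 0):j+2]
                    if np.any(~np.isnan(window)):
                        filled[i, j] = np.nanmean(window)
        return filled

    print("gaps before:", int(np.isnan(aot).sum()))
    print("gaps after :", int(np.isnan(fill_gaps_once(aot)).sum()))
    ```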

  5. KML Super Overlay to WMS Translator

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
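
    A hedged sketch of the kind of KML such a translator emits: a Region-gated NetworkLink whose href is a WMS GetMap URL, so Google Earth fetches the tile only when the region becomes visible. The WMS endpoint and layer name are invented placeholders.

    ```python
    # Generate a KML NetworkLink that loads a WMS tile only when its
    # region becomes visible (the core of a super overlay). The WMS
    # endpoint and layer name below are hypothetical placeholders.
    north, south, east, west = 40.0, 39.0, -104.0, -105.0
    wms = ("http://example.org/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
           "&LAYERS=satellite&SRS=EPSG:4326&FORMAT=image/png"
           f"&BBOX={west},{south},{east},{north}&WIDTH=256&HEIGHT=256")

    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <NetworkLink>
        <Region>
          <LatLonAltBox>
            <north>{north}</north><south>{south}</south>
            <east>{east}</east><west>{west}</west>
          </LatLonAltBox>
          <Lod><minLodPixels>128</minLodPixels></Lod>
        </Region>
        <Link><href>{wms}</href></Link>
      </NetworkLink>
    </kml>"""
    print(kml)
    ```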

  6. Drying of fiber webs

    DOEpatents

    Warren, David W.

    1997-01-01

    A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers.

  7. Impact of cost sharing on prescription drugs used by Medicare beneficiaries.

    PubMed

    Goedken, Amber M; Urmie, Julie M; Farris, Karen B; Doucette, William R

    2010-06-01

    Incentive-based prescription drug cost sharing can encourage seniors to use generic medications. Little information exists about prescription drug cost sharing and generic use in employer-sponsored plans after the implementation of Medicare Part D. To compare prescription drug cost sharing across prescription insurance type for Medicare beneficiaries after Medicare Part D, to assess the impact of that cost sharing on the number of medications used, and to examine how generic utilization rates differ before and after Medicare Part D and across the type of insurance. This longitudinal study of Medicare beneficiaries aged 65 years and older used Web-based surveys administered in 2005 and 2007 by Harris Interactive® to collect information on prescription drug coverage and medication use. Co-payment plans were categorized as low, medium, or high co-payment plans. Multiple regression was used to assess the impact of co-payment rank on the number of prescription drugs. t-Tests and analysis of variance were used to compare generic use over time and between coverage types. One thousand two hundred twenty and 1024 respondents completed the baseline and follow-up surveys, respectively. Among 3-tier co-payment plans, brand drug co-payments were higher for Part D plans ($26 for preferred brand and $55 for nonpreferred brand) than employer-based plans ($20 for preferred brand and $39 for nonpreferred brand). Co-payment was not a significant predictor for the number of prescription drugs. Generic use was lowest among beneficiaries in employer plans both before and after Part D. In 2007, generic use among beneficiaries with Part D was not significantly different from the generic use for beneficiaries with no drug coverage. Medicare beneficiaries in Part D had higher cost sharing amounts than those with employer coverage, but higher cost sharing was not significantly linked to lower prescription use. Generic use for Part D beneficiaries was higher than that for beneficiaries with employer coverage but the same as that for beneficiaries without drug coverage. Copyright 2010 Elsevier Inc. All rights reserved.

  8. Internet Hospitals in China: Cross-Sectional Survey

    PubMed Central

    Lin, Lingyan; Fan, Si; Lin, Fen; Wang, Long; Guo, Tongjun; Ma, Chuyang; Zhang, Jingkun; Chen, Yixin

    2017-01-01

    Background The Internet hospital, an innovative approach to providing health care, is rapidly developing in China because it has the potential to provide widely accessible outpatient service delivery via Internet technologies. To date, China’s Internet hospitals have not been systematically investigated. Objective The aim of this study was to describe the characteristics of China’s Internet hospitals, and to assess their health service capacity. Methods We searched Baidu, the popular Chinese search engine, to identify Internet hospitals, using search terms such as “Internet hospital,” “web hospital,” or “cloud hospital.” All Internet hospitals in mainland China were eligible for inclusion if they were officially registered. Our search was carried out until March 31, 2017. Results We identified 68 Internet hospitals, of which 43 have been put into use and 25 were under construction. Of the 43 established Internet hospitals, 13 (30%) were in the hospital informatization stage, 24 (56%) were in the Web ward stage, and 6 (14%) were in full Internet hospital stage. Patients accessed outpatient service delivery via website (74%, 32/43), app (42%, 18/43), or offline medical consultation facility (37%, 16/43) from the Internet hospital. Furthermore, 25 (58%) of the Internet hospitals asked doctors to deliver health services at a specific Web clinic, whereas 18 (42%) did not. The consulting methods included video chat (60%, 26/43), telephone (19%, 8/43), and graphic message (28%, 12/43); 13 (30%) Internet hospitals cannot be consulted online any more. Only 6 Internet hospitals were included in the coverage of health insurance. The median number of doctors available online was zero (interquartile range [IQR] 0 to 5; max 16,492). The median consultation fee per time was ¥20 (approximately US $2.90, IQR ¥0 to ¥200). Conclusions Internet hospitals provide convenient outpatient service delivery. However, many of the Internet hospitals are not yet mature and are faced with various issues such as online doctor scarcity and the unavailability of health insurance coverage. China’s Internet hospitals are heading in the right direction to improve provision of health services, but much more remains to be done. PMID:28676472

  9. Beyond wishful thinking; medical community presence on the web and challenges of pervasive healthcare.

    PubMed

    Moisil, Ioana; Barbat, Boldur E

    2004-01-01

    Romanian healthcare is facing a number of challenges, from the growing general costs, through requests for better services, inadequate territorial coverage, medical errors and a growing incidence of chronic diseases, to the burden of debt toward the pharmaceutical industry. For the last 14 years decision factors have been searching for the magic formula in restructuring the healthcare sector. Eventually, the government has come to appreciate the benefits of IT solutions. Our paper presents recent advances in wireless technologies and their impact on healthcare, in parallel with the results of a study aimed to acknowledge the presence of the medical community on Romanian WWW and to evaluate the degree of accessibility for the general population. We have documented Web sites promoting health services, discussion forums for patients, online medical advice, medical image teleprocessing, health education, health research and documentation, pharmaceutical products, e-procurement, health portals, medical links, hospitals and other health units present on the Web. Initial results have shown that if the current trend in price decreases for mobile communications continues and if the government is able to provide funding for the communication infrastructure needed for pervasive healthcare systems together with the appropriate regulations and standards, this can be a long-term viable solution of the healthcare crisis.

  10. Effectiveness of interventions that apply new media to improve vaccine uptake and vaccine coverage.

    PubMed

    Odone, Anna; Ferrari, Antonio; Spagnoli, Francesca; Visciarelli, Sara; Shefer, Abigail; Pasquarella, Cesira; Signorelli, Carlo

    2015-01-01

    Vaccine-preventable diseases (VPD) are still a major cause of morbidity and mortality worldwide. In high and middle-income settings, immunization coverage is relatively high. However, in many countries coverage rates of routinely recommended vaccines are still below the targets established by international and national advisory committees. Progress in the field of communication technology might provide useful tools to enhance immunization strategies. To systematically collect and summarize the available evidence on the effectiveness of interventions that apply new media to promote vaccination uptake and increase vaccination coverage. We conducted a systematic literature review. Studies published from January 1999 to September 2013 were identified by searching electronic resources (Pubmed, Embase), manual searches of references and expert consultation. We focused on interventions that targeted recommended vaccinations for children, adolescents and adults and: (1) aimed at increasing community demand for immunizations, or (2) were provider-based interventions. We limited the study setting to countries that are members of the Organisation for Economic Co-operation and Development (OECD). The primary outcome was a measure of vaccination (vaccine uptake or vaccine coverage). Considered secondary outcomes included willingness to receive immunization, attitudes and perceptions toward vaccination, and perceived helpfulness of the intervention. Nineteen studies were included in the systematic review. The majority of the studies were conducted in the US (74%, n = 14); 68% (n = 13) of the studies were experimental, the rest having an observational study design. Eleven (58%) reported results on the primary outcome. Retrieved studies explored the role of: text messaging (n.7, 37%), smartphone applications (n.1, 5%), Youtube videos (n.1, 5%), Facebook (n.1, 5%), targeted websites and portals (n.4, 21%), software for physicians and health professionals (n.4, 21%), and email communication (n.1, 5%). There is some evidence that text messaging, accessing immunization campaign websites, using patient-held web-based portals and computerized reminders increase immunization coverage rates. Insufficient evidence is available on the use of social networks, email communication and smartphone applications. Although there is great potential for improving vaccine uptake and vaccine coverage by implementing programs and interventions that apply new media, scant data are available and further rigorous research - including cost-effectiveness assessments - is needed.

  11. Effectiveness of interventions that apply new media to improve vaccine uptake and vaccine coverage

    PubMed Central

    Odone, Anna; Ferrari, Antonio; Spagnoli, Francesca; Visciarelli, Sara; Shefer, Abigail; Pasquarella, Cesira; Signorelli, Carlo

    2014-01-01

    Background Vaccine-preventable diseases (VPD) are still a major cause of morbidity and mortality worldwide. In high and middle-income settings, immunization coverage is relatively high. However, in many countries coverage rates of routinely recommended vaccines are still below the targets established by international and national advisory committees. Progress in the field of communication technology might provide useful tools to enhance immunization strategies. Objective To systematically collect and summarize the available evidence on the effectiveness of interventions that apply new media to promote vaccination uptake and increase vaccination coverage. Design We conducted a systematic literature review. Studies published from January 1999 to September 2013 were identified by searching electronic resources (Pubmed, Embase), manual searches of references and expert consultation. Study setting We focused on interventions that targeted recommended vaccinations for children, adolescents and adults and: (1) aimed at increasing community demand for immunizations, or (2) were provider-based interventions. We limited the study setting to countries that are members of the Organisation for Economic Co-operation and Development (OECD). Main outcome measures The primary outcome was a measure of vaccination (vaccine uptake or vaccine coverage). Considered secondary outcomes included willingness to receive immunization, attitudes and perceptions toward vaccination, and perceived helpfulness of the intervention. Results Nineteen studies were included in the systematic review. The majority of the studies were conducted in the US (74%, n = 14); 68% (n = 13) of the studies were experimental, the rest having an observational study design. Eleven (58%) reported results on the primary outcome. Retrieved studies explored the role of: text messaging (n.7, 37%), smartphone applications (n.1, 5%), Youtube videos (n.1, 5%), Facebook (n.1, 5%), targeted websites and portals (n.4, 21%), software for physicians and health professionals (n.4, 21%), and email communication (n.1, 5%). There is some evidence that text messaging, accessing immunization campaign websites, using patient-held web-based portals and computerized reminders increase immunization coverage rates. Insufficient evidence is available on the use of social networks, email communication and smartphone applications. Conclusion Although there is great potential for improving vaccine uptake and vaccine coverage by implementing programs and interventions that apply new media, scant data are available and further rigorous research - including cost-effectiveness assessments - is needed. PMID:25483518

  12. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.

  13. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225
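
    To make the pipeline-customization idea concrete, here is a minimal sketch of chaining text-processing components that declare the formats they consume and produce, in the spirit of (but not taken from) Argo; every class, format name, and analytic below is invented for illustration.

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Component:
        """A processing step that consumes and produces a declared format."""
        name: str
        consumes: str
        produces: str
        run: Callable[[object], object]

    def compose(pipeline: List[Component]):
        """Check format compatibility between adjacent steps, then chain them."""
        for a, b in zip(pipeline, pipeline[1:]):
            if a.produces != b.consumes:
                raise ValueError(f"{a.name} -> {b.name}: "
                                 f"{a.produces} != {b.consumes}")
        def run_all(document):
            for step in pipeline:
                document = step.run(document)
            return document
        return run_all

    # Invented steps: read raw text, tag chemical mentions, serialize.
    reader = Component("reader", "text/plain", "doc",
                       lambda t: {"text": t, "anns": []})
    tagger = Component("tagger", "doc", "doc",
                       lambda d: {**d, "anns": d["anns"] + [("CHEMICAL", "aspirin")]})
    writer = Component("writer", "doc", "bioc-like", lambda d: str(d))

    pipeline = compose([reader, tagger, writer])
    print(pipeline("aspirin inhibits COX-1"))
    ```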

  14. A web-based tree crown condition training and evaluation tool for urban and community forestry

    Treesearch

    Matthew F. Winn; Neil A. Clark; Philip A. Araman; Sang-Mook Lee

    2007-01-01

    Training personnel for natural resource related field work can be a costly and time-consuming process. For that reason, web-based training is considered by many to be a more attractive alternative to on-site training. The U.S. Forest Service Southern Research Station unit with Virginia Tech cooperators in Blacksburg, Va., is in the process of constructing a web site...

  15. Measuring Listening Comprehension Skills of 5th Grade School Students with the Help of Web Based System

    ERIC Educational Resources Information Center

    Acat, M. Bahaddin; Demiral, Hilmi; Kaya, Mehmet Fatih

    2016-01-01

    The main purpose of this study is to measure listening comprehension skills of 5th grade school students with the help of web based system. This study was conducted on 5th grade students studying at the primary schools of Eskisehir. The scale used in the process of the study is "Web Based Listening Scale". In the process of the study,…

  16. Confetti: A Multiprotease Map of the HeLa Proteome for Comprehensive Proteomics*

    PubMed Central

    Guo, Xiaofeng; Trudgian, David C.; Lemoff, Andrew; Yadavalli, Sivaramakrishna; Mirzaei, Hamid

    2014-01-01

    Bottom-up proteomics largely relies on tryptic peptides for protein identification and quantification. Tryptic digestion often provides limited coverage of protein sequence because of issues such as peptide length, ionization efficiency, and post-translational modification colocalization. Unfortunately, a region of interest in a protein, for example, because of proximity to an active site or the presence of important post-translational modifications, may not be covered by tryptic peptides. Detection limits, quantification accuracy, and isoform differentiation can also be improved with greater sequence coverage. Selected reaction monitoring (SRM) would also greatly benefit from being able to identify additional targetable sequences. In an attempt to improve protein sequence coverage and to target regions of proteins that do not generate useful tryptic peptides, we deployed a multiprotease strategy on the HeLa proteome. First, we used seven commercially available enzymes in single, double, and triple enzyme combinations. A total of 48 digests were performed. 5223 proteins were detected by analyzing the unfractionated cell lysate digest directly, with 42% mean sequence coverage. Additional strong-anion exchange fractionation of the most complementary digests permitted identification of over 3000 more proteins, with improved mean sequence coverage. We then constructed a web application (https://proteomics.swmed.edu/confetti) that allows the community to examine a target protein or protein isoform in order to discover the enzyme or combination of enzymes that would yield peptides spanning a certain region of interest in the sequence. Finally, we examined the use of nontryptic digests for SRM. From our strong-anion exchange fractionation data, we were able to identify three or more proteotypic SRM candidates within a single digest for 6056 genes. Surprisingly, in 25% of these cases the digest producing the most observable proteotypic peptides was neither trypsin nor Lys-C. SRM analysis of Asp-N versus tryptic peptides for eight proteins determined that Asp-N yielded higher signal in five of eight cases. PMID:24696503
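
    A minimal sketch of how sequence coverage is conventionally computed from identified peptides: locate each peptide's span in the protein, merge overlapping spans, and divide covered residues by protein length. The protein and peptides below are toy data, not from the study.

    ```python
    def sequence_coverage(protein: str, peptides: list) -> float:
        """Fraction of protein residues covered by at least one peptide."""
        spans = []
        for pep in peptides:
            start = protein.find(pep)
            while start != -1:            # count every occurrence
                spans.append((start, start + len(pep)))
                start = protein.find(pep, start + 1)
        spans.sort()
        covered, end_max = 0, 0
        for start, end in spans:          # merge overlapping spans
            start = max(start, end_max)
            if end > start:
                covered += end - start
            end_max = max(end_max, end)
        return covered / len(protein)

    protein = "MKWVTFISLLFLFSSAYSRGVFRRDAHKSEVAHRFKDLGEENFKALVLIAF"
    peptides = ["MKWVTFISLL", "FLFSSAYSRG", "SEVAHRFK"]
    print(f"{sequence_coverage(protein, peptides):.0%}")  # 55% for this toy case
    ```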

  17. Methods used for immunization coverage assessment in Canada, a Canadian Immunization Research Network (CIRN) study.

    PubMed

    Wilson, Sarah E; Quach, Susan; MacDonald, Shannon E; Naus, Monika; Deeks, Shelley L; Crowcroft, Natasha S; Mahmud, Salaheddin M; Tran, Dat; Kwong, Jeff; Tu, Karen; Gilbert, Nicolas L; Johnson, Caitlin; Desai, Shalini

    2017-08-03

    Accurate and complete immunization data are necessary to assess vaccine coverage, safety and effectiveness. Across Canada, different methods and data sources are used to assess vaccine coverage, but these have not been systematically described. Our primary objective was to examine and describe the methods used to determine immunization coverage in Canada. The secondary objective was to compare routine infant and childhood coverage estimates derived from the Canadian 2013 Childhood National Immunization Coverage Survey (cNICS) with estimates collected from provinces and territories (P/Ts). We collected information from key informants regarding their provincial, territorial or federal methods for assessing immunization coverage. We also collected P/T coverage estimates for select antigens and birth cohorts to determine absolute differences between these and estimates from cNICS. Twenty-six individuals across 16 public health organizations participated between April and August 2015. Coverage surveys are conducted regularly for toddlers in Quebec and in one health authority in British Columbia. Across P/Ts, different methodologies for measuring coverage are used (e.g., valid doses, grace periods). Most P/Ts, except Ontario, measure up-to-date (UTD) coverage and 4 P/Ts also assess on-time coverage. The degree of concordance between P/T and cNICS coverage estimates varied by jurisdiction, antigen and age group. In addition to differences in the data sources and processes used for coverage assessment, there are also differences between Canadian P/Ts in the methods used for calculating immunization coverage. Comparisons between P/T and cNICS estimates leave remaining questions about the proportion of children fully vaccinated in Canada.

  18. Methods used for immunization coverage assessment in Canada, a Canadian Immunization Research Network (CIRN) study

    PubMed Central

    Quach, Susan; MacDonald, Shannon E.; Naus, Monika; Deeks, Shelley L.; Crowcroft, Natasha S.; Mahmud, Salaheddin M.; Tran, Dat; Kwong, Jeff; Tu, Karen; Johnson, Caitlin; Desai, Shalini

    2017-01-01

    ABSTRACT Accurate and complete immunization data are necessary to assess vaccine coverage, safety and effectiveness. Across Canada, different methods and data sources are used to assess vaccine coverage, but these have not been systematically described. Our primary objective was to examine and describe the methods used to determine immunization coverage in Canada. The secondary objective was to compare routine infant and childhood coverage estimates derived from the Canadian 2013 Childhood National Immunization Coverage Survey (cNICS) with estimates collected from provinces and territories (P/Ts). We collected information from key informants regarding their provincial, territorial or federal methods for assessing immunization coverage. We also collected P/T coverage estimates for select antigens and birth cohorts to determine absolute differences between these and estimates from cNICS. Twenty-six individuals across 16 public health organizations participated between April and August 2015. Coverage surveys are conducted regularly for toddlers in Quebec and in one health authority in British Columbia. Across P/Ts, different methodologies for measuring coverage are used (e.g., valid doses, grace periods). Most P/Ts, except Ontario, measure up-to-date (UTD) coverage and 4 P/Ts also assess on-time coverage. The degree of concordance between P/T and cNICS coverage estimates varied by jurisdiction, antigen and age group. In addition to differences in the data sources and processes used for coverage assessment, there are also differences between Canadian P/Ts in the methods used for calculating immunization coverage. Comparisons between P/T and cNICS estimates leave remaining questions about the proportion of children fully vaccinated in Canada. PMID:28708945

  19. 75 FR 66413 - 30-Day Notice of Proposed Information Collection: Exchange Programs Alumni Web Site Registration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-28

    ...: Exchange Programs Alumni Web Site Registration, DS-7006 ACTION: Notice of request for public comment and... Collection The Exchange Programs Alumni Web site requires information to process users' voluntary requests for participation in the Web site. Other than contact information, which is required for website...

  20. A Comparison between Quantity Surveying and Information Technology Students on Web Application in Learning Process

    ERIC Educational Resources Information Center

    Keng, Tan Chin; Ching, Yeoh Kah

    2015-01-01

    The use of web applications has become a trend in many disciplines including education. In view of the influence of web application in education, this study examines web application technologies that could enhance undergraduates' learning experiences, with focus on Quantity Surveying (QS) and Information Technology (IT) undergraduates. The…

  1. 20 CFR 656.17 - Basic labor certification process.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... participant in the job fair. (B) Employer's Web site. The use of the employer's Web site as a recruitment... involved in the application. (C) Job search Web site other than the employer's. The use of a job search Web...) The Department of Labor may issue or require the use of certain identifying information, including...

  2. The Adoption and Diffusion of Web Technologies into Mainstream Teaching.

    ERIC Educational Resources Information Center

    Hansen, Steve; Salter, Graeme

    2001-01-01

    Discusses various adoption and diffusion frameworks and methodologies to enhance the use of Web technologies by teaching staff. Explains the use of adopter-based models for product development; discusses the innovation-decision process; and describes PlatformWeb, a Web information system that was developed to help integrate a universities'…

  3. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  4. Health and medication information resources on the World Wide Web.

    PubMed

    Grossman, Sara; Zerilli, Tina

    2013-04-01

    Health care practitioners have increasingly used the Internet to obtain health and medication information. The vast number of Internet Web sites providing such information and concerns with their reliability makes it essential for users to carefully select and evaluate Web sites prior to use. To this end, this article reviews the general principles to consider in this process. Moreover, as cost may limit access to subscription-based health and medication information resources with established reputability, freely accessible online resources that may serve as an invaluable addition to one's reference collection are highlighted. These include government- and organization-sponsored resources (eg, US Food and Drug Administration Web site and the American Society of Health-System Pharmacists' Drug Shortage Resource Center Web site, respectively) as well as commercial Web sites (eg, Medscape, Google Scholar). Familiarity with such online resources can assist health care professionals in their ability to efficiently navigate the Web and may potentially expedite the information gathering and decision-making process, thereby improving patient care.

  5. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturity of web services, measuring the quality of composite web services developed by different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide a composed web service whose quality meets customers' requirements. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be established. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
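
    A hedged sketch of the standard QoS aggregation rules that such analyses formalize, independent of the Petri-net machinery: response times add in sequence, take the maximum across parallel branches, and are probability-weighted across exclusive choices, while reliabilities multiply. All figures are illustrative.

    ```python
    # Aggregate QoS of a workflow-based composition from per-service figures.
    # response: seconds; reliability: probability of success. Toy values.
    services = {
        "auth":    {"response": 0.10, "reliability": 0.999},
        "billing": {"response": 0.40, "reliability": 0.995},
        "ship":    {"response": 0.30, "reliability": 0.990},
        "email":   {"response": 0.20, "reliability": 0.998},
    }

    def _prod(xs):
        out = 1.0
        for x in xs:
            out *= x
        return out

    def sequence(*qos):
        return {"response": sum(q["response"] for q in qos),
                "reliability": _prod(q["reliability"] for q in qos)}

    def parallel(*qos):   # AND-split/AND-join: wait for the slowest branch
        return {"response": max(q["response"] for q in qos),
                "reliability": _prod(q["reliability"] for q in qos)}

    def choice(branches):  # XOR-split: [(probability, qos), ...]
        return {"response": sum(p * q["response"] for p, q in branches),
                "reliability": sum(p * q["reliability"] for p, q in branches)}

    # auth, then billing and shipping in parallel, then email 90% of the time.
    flow = sequence(services["auth"],
                    parallel(services["billing"], services["ship"]),
                    choice([(0.9, services["email"]),
                            (0.1, {"response": 0.0, "reliability": 1.0})]))
    print(flow)
    ```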

  6. Universal Health Coverage - The Critical Importance of Global Solidarity and Good Governance Comment on "Ethical Perspective: Five Unacceptable Trade-offs on the Path to Universal Health Coverage".

    PubMed

    Reis, Andreas A

    2016-06-07

    This article provides a commentary to Ole Norheim's editorial entitled "Ethical perspective: Five unacceptable trade-offs on the path to universal health coverage." It reinforces its message that an inclusive, participatory process is essential for ethical decision-making and underlines the crucial importance of good governance in setting fair priorities in healthcare. Solidarity on both national and international levels is needed to make progress towards the goal of universal health coverage (UHC). © 2016 by Kerman University of Medical Sciences.

  7. Influence of surface coverage on the chemical desorption process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minissale, M.; Dulieu, F., E-mail: francois.dulieu@obspm.fr

    2014-07-07

    In cold astrophysical environments, some molecules are observed in the gas phase whereas they should have been depleted, frozen on dust grains. In order to solve this problem, astrochemists have proposed that a fraction of molecules synthesized on the surface of dust grains could desorb just after their formation. Recently the chemical desorption process has been demonstrated experimentally, but the key parameters at play have not yet been fully understood. In this article, we propose a new procedure to analyze the ratio of di-oxygen and ozone synthesized after O atoms adsorption on oxidized graphite. We demonstrate that the chemical desorption efficiency of the two reaction paths (O+O and O+O₂) differs by one order of magnitude. We show the importance of the surface coverage: for the O+O reaction, the chemical desorption efficiency is close to 80% at zero coverage and tends to zero at one monolayer coverage. The coverage dependence of O+O chemical desorption is proved by varying the amount of pre-adsorbed N₂ on the substrate from 0 to 1.5 ML. Finally, we discuss the relevance of the different physical parameters that could play a role in the chemical desorption process: binding energy, enthalpy of formation, and energy transfer from the new molecule to the surface or to other adsorbates.

  8. Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service

    PubMed Central

    Hatano, Kenji; Ohe, Kazuhiko

    2003-01-01

    An information retrieval system for the Japanese Standard Disease-Code Master using an XML Web Service has been developed. XML Web Service is a new distributed processing system based on standard internet technologies. With the seamless remote method invocation of an XML Web Service, users are able to get the latest disease-code master information from their rich desktop applications or internet web sites that refer to this service. PMID:14728364
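
    A hedged sketch of what such a remote method invocation looks like on the wire: posting a SOAP envelope to a lookup endpoint. The URL, namespace, and operation name are invented for illustration and are not the actual service interface.

    ```python
    import requests

    # Hypothetical SOAP endpoint for a disease-code master lookup service.
    ENDPOINT = "http://example.org/disease-master/service"

    envelope = """<?xml version="1.0" encoding="UTF-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <SearchDisease xmlns="http://example.org/disease-master">
          <keyword>diabetes</keyword>
          <maxResults>10</maxResults>
        </SearchDisease>
      </soap:Body>
    </soap:Envelope>"""

    resp = requests.post(ENDPOINT, data=envelope.encode("utf-8"),
                         headers={"Content-Type": "text/xml; charset=utf-8",
                                  "SOAPAction": "SearchDisease"})
    print(resp.status_code, resp.text[:300])
    ```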

  9. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools with an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered in Bioinformatics open web services can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
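
    A hedged sketch of the back-end pattern described: a worker on the cluster polls for queued jobs, runs the registered tool, and posts results back. The endpoint paths and job schema are invented; the actual BOWS interface may differ.

    ```python
    import subprocess
    import time

    import requests

    # Hypothetical BOWS-style endpoints; the real service interface may differ.
    BASE = "http://example.org/bows/api"

    def poll_and_run():
        job = requests.get(f"{BASE}/jobs/next", timeout=30).json()
        if not job:                      # nothing queued
            return
        # Run the registered tool with the submitted parameters.
        result = subprocess.run(job["command"].split() + job["args"],
                                capture_output=True, text=True)
        requests.post(f"{BASE}/jobs/{job['id']}/result",
                      json={"stdout": result.stdout,
                            "returncode": result.returncode})

    if __name__ == "__main__":
        while True:                      # simple polling loop
            poll_and_run()
            time.sleep(10)
    ```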

  10. A novel architecture for information retrieval system based on semantic web

    NASA Astrophysics Data System (ADS)

    Zhang, Hui

    2011-12-01

    Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces a new challenge: information overload. The challenge now before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats; they are suitable for presentation, but machines cannot understand their meaning. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which opens new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when there is not enough knowledge in the retrieval system, it returns a large number of meaningless results to users because of the huge volume of information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
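
    A toy sketch of the routing decision described above: if a query's terms resolve to known ontology concepts, dispatch it to the semantic search engine, otherwise fall back to keyword search. The concept set and engine stubs are invented.

    ```python
    # Toy router: dispatch a query to a semantic engine when its terms are
    # covered by the ontology, else to a keyword engine. All data invented.
    ONTOLOGY_CONCEPTS = {"volcano", "eruption", "lava", "magma", "ash"}

    def semantic_search(query: str) -> str:
        return f"[semantic] reasoning over concepts in: {query!r}"

    def keyword_search(query: str) -> str:
        return f"[keyword] matching documents for: {query!r}"

    def route(query: str) -> str:
        terms = set(query.lower().split())
        known = terms & ONTOLOGY_CONCEPTS
        # Require at least half of the terms to be known concepts.
        if known and len(known) >= len(terms) / 2:
            return semantic_search(query)
        return keyword_search(query)

    print(route("volcano eruption"))     # routed to the semantic engine
    print(route("cheap flights paris"))  # falls back to keyword search
    ```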

  11. Accredited hand surgery fellowship Web sites: analysis of content and accessibility.

    PubMed

    Trehan, Samir K; Morrell, Nathan T; Akelman, Edward

    2015-04-01

    To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  12. Conducting Retrospective Ontological Clinical Trials in ICD-9-CM in the Age of ICD-10-CM.

    PubMed

    Venepalli, Neeta K; Shergill, Ardaman; Dorestani, Parvaneh; Boyd, Andrew D

    2014-01-01

    To quantify the impact of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) transition in cancer clinical trials by comparing coding accuracy and data discontinuity in backward ICD-10-CM to ICD-9-CM mapping via two tools, and to develop a standard ICD-9-CM and ICD-10-CM bridging methodology for retrospective analyses. While the transition to ICD-10-CM has been delayed until October 2015, its impact on cancer-related studies utilizing ICD-9-CM diagnoses has been inadequately explored. Three high impact journals with broad national and international readerships were reviewed for cancer-related studies utilizing ICD-9-CM diagnoses codes in study design, methods, or results. Forward ICD-9-CM to ICD-10-CM mapping was performed using a translational methodology with the Motif web portal ICD-9-CM conversion tool. Backward mapping from ICD-10-CM to ICD-9-CM was performed using both Centers for Medicare and Medicaid Services (CMS) general equivalence mappings (GEMs) files and the Motif web portal tool. Generated ICD-9-CM codes were compared with the original ICD-9-CM codes to assess data accuracy and discontinuity. While both methods yielded additional ICD-9-CM codes, the CMS GEMs method provided incomplete coverage with 16 of the original ICD-9-CM codes missing, whereas the Motif web portal method provided complete coverage. Of these 16 codes, 12 ICD-9-CM codes were present in 2010 Illinois Medicaid data, and accounted for 0.52% of patient encounters and 0.35% of total Medicaid reimbursements. Extraneous ICD-9-CM codes from both methods (Centers for Medicare and Medicaid Services general equivalent mapping [CMS GEMs, n = 161; Motif web portal, n = 246]) in excess of original ICD-9-CM codes accounted for 2.1% and 2.3% of total patient encounters and 3.4% and 4.1% of total Medicaid reimbursements from the 2010 Illinois Medicaid database. Longitudinal data analyses post-ICD-10-CM transition will require backward ICD-10-CM to ICD-9-CM coding, and data comparison for accuracy. Researchers must be aware that all methods for backward coding are not comparable in yielding original ICD-9-CM codes. The mandated delay is an opportunity for organizations to better understand areas of financial risk with regards to data management via backward coding. Our methodology is relevant for all healthcare-related coding data, and can be replicated by organizations as a strategy to mitigate financial risk.
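
    A hedged sketch of the mechanical core of GEMs-based backward mapping: parse a CMS GEMs file (whitespace-separated source code, target code, and flags per line) and collect the ICD-9-CM codes reachable from a set of ICD-10-CM codes. The file name and study codes are placeholders.

    ```python
    from collections import defaultdict

    def load_gems(path: str) -> dict:
        """Parse a CMS GEMs file: 'SOURCE TARGET FLAGS' per line."""
        mapping = defaultdict(set)
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                if len(parts) >= 2:
                    source, target = parts[0], parts[1]
                    mapping[source].add(target)
        return mapping

    # Placeholder file name, e.g. the CMS ICD-10-CM to ICD-9-CM GEMs file.
    gems = load_gems("icd10_to_icd9_gem.txt")

    study_icd10_codes = ["C3490", "E119"]   # placeholder study codes
    for icd10 in study_icd10_codes:
        icd9_list = sorted(gems.get(icd10, []))
        # Codes with no entry reveal coverage gaps like those described above.
        print(icd10, "->", icd9_list or "NO MAPPING")
    ```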

  13. Insurance Enrollment at a Student-Run Free Clinic After the Patient Protection and Affordable Care Act.

    PubMed

    McGeehan, Megan; DeMaria, Rebecca; Charney, Pamela; Batavia, Ashita S

    2017-08-01

    The Patient Protection and Affordable Care Act (ACA) aims to increase insurance coverage through government subsidies. Medical student-run free clinics (SRFC) are an important entry point into the healthcare system for the uninsured. SRFCs do not have a standardized approach for navigating the complexities of enrollment. The Weill Cornell Community Clinic (WCCC) developed a unique enrollment model that may inform other SRFCs. Our objective is to describe enrollment processes at SRFCs throughout New York City, and to evaluate enrollment outcomes and persistent barriers to coverage at WCCC. We surveyed SRFC leadership throughout NYC to understand enrollment processes. We evaluated enrollment outcomes at WCCC through chart review and structured phone interviews. Subjects included WCCC patients seen in clinic between October 1, 2013 and September 30, 2015 (N = 140). Demographic information, method of insurance enrollment, and qualitative description of enrollment barriers were collected. SRFCs in New York City have diverse enrollment processes. 48% (N = 42) of WCCC patients obtained health insurance. Immigration status was a barrier to coverage in 21% of patients. Failure to gain coverage was predicted by larger household size (p = 0.02). Gender and employment status were not associated with remaining uninsured. The main barriers to enrollment were inability to afford premiums and lack of interest. Insurance enrollment processes at SRFCs in New York City are mostly ad hoc and outcomes are rarely tracked. Following implementation of the ACA, WCCC stands out for its structured approach, with approximately half of eligible WCCC patients gaining coverage during the study period.

  14. E-Government Goes Semantic Web: How Administrations Can Transform Their Information Processes

    NASA Astrophysics Data System (ADS)

    Klischewski, Ralf; Ukena, Stefan

    E-government applications and services are built mainly on access to, retrieval of, integration of, and delivery of relevant information to citizens, businesses, and administrative users. In order to perform such information processing automatically through the Semantic Web, machine-readable enhancements of web resources are needed, based on the understanding of the content and context of the information in focus. While these enhancements are far from trivial to produce, administrations in their role of information and service providers so far find little guidance on how to migrate their web resources and enable a new quality of information processing; even research is still seeking best practices. Therefore, the underlying research question of this chapter is: what are the appropriate approaches which guide administrations in transforming their information processes toward the Semantic Web? In search for answers, this chapter analyzes the challenges and possible solutions from the perspective of administrations: (a) the reconstruction of the information processing in the e-government in terms of how semantic technologies must be employed to support information provision and consumption through the Semantic Web; (b) the required contribution to the transformation is compared to the capabilities and expectations of administrations; and (c) available experience with the steps of transformation are reviewed and discussed as to what extent they can be expected to successfully drive the e-government to the Semantic Web. This research builds on studying the case of Schleswig-Holstein, Germany, where semantic technologies have been used within the frame of the Access-eGov project in order to semantically enhance electronic service interfaces with the aim of providing a new way of accessing and combining e-government services.

  15. Four-dimensional characterization of a sheet-forming web

    DOEpatents

    Sari-Sarraf, Hamed; Goddard, James S.

    2003-04-22

    A method and apparatus are provided by which a sheet-forming web may be characterized in four dimensions. Light images of the web are recorded at a point adjacent the initial stage of the web, for example, near the headbox in a paperforming operation. The images are digitized, and the resulting data is processed by novel algorithms to provide a four-dimensional measurement of the web. The measurements include two-dimensional spatial information, the intensity profile of the web, and the depth profile of the web. These measurements can be used to characterize the web, predict its properties and monitor production events, and to analyze and quantify headbox flow dynamics.
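
    A hedged sketch of one of the listed measurements: reducing a digitized frame of the web to a cross-direction intensity profile, with synthetic data standing in for camera images. The patent's depth-profile recovery involves considerably more than this simple reduction.

    ```python
    import numpy as np

    # Synthetic 8-bit grayscale frame of the forming web (rows = machine
    # direction, columns = cross direction); a real system would digitize
    # camera images here.
    rng = np.random.default_rng(1)
    frame = rng.integers(80, 200, size=(480, 640), dtype=np.uint8)

    # Cross-direction intensity profile: average brightness per column.
    profile = frame.mean(axis=0)

    print("columns:", profile.size)
    print("min/mean/max intensity: "
          f"{profile.min():.1f} / {profile.mean():.1f} / {profile.max():.1f}")
    ```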

  16. Designing a Web Site to Share Information with Parents

    ERIC Educational Resources Information Center

    Englund, Lillian White

    2009-01-01

    This article discusses the development and use of an on-line portfolio process. It presents a background rationale for the need and effectiveness of a communication tool that supports the use of the portfolio process throughout the education of a child with identified disabilities. The process for developing the individualized Web page is…

  17. Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages

    PubMed Central

    Trudel, Mathieu

    2005-01-01

    Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. Finally, WebCite can process publisher submitted “citing articles” (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724

  18. Coverage of Certain Preventive Services Under the Affordable Care Act. Final rules.

    PubMed

    2015-07-14

    This document contains final regulations regarding coverage of certain preventive services under section 2713 of the Public Health Service Act (PHS Act), added by the Patient Protection and Affordable Care Act, as amended, and incorporated into the Employee Retirement Income Security Act of 1974 and the Internal Revenue Code. Section 2713 of the PHS Act requires coverage without cost sharing of certain preventive health services by non-grandfathered group health plans and health insurance coverage. These regulations finalize provisions from three rulemaking actions: Interim final regulations issued in July 2010 related to coverage of preventive services, interim final regulations issued in August 2014 related to the process an eligible organization uses to provide notice of its religious objection to the coverage of contraceptive services, and proposed regulations issued in August 2014 related to the definition of "eligible organization,'' which would expand the set of entities that may avail themselves of an accommodation with respect to the coverage of contraceptive services.

  19. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application for image processing of electron micrographs. Ordinarily, Eos offers only character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, and is therefore not user-friendly: its users must be expert at image processing of electron micrographs and must also have some knowledge of computer science. Not everyone who needs Eos, however, is expert with a CUI, so we extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating it with a web browser. The advantage of a browser front end is not only the GUI itself but also the ability to run Eos in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the browser. Eos has more than 400 commands related to image processing for electron microscopy, and each command is used differently. Since the beginning of its development, Eos has managed its user interfaces through an interface definition file named "OptionControlFile", written in CSV (comma-separated values) format; each command has an OptionControlFile that records the information needed to generate its interface and describe its usage. Because this mechanism is mature and convenient, the GUI system we developed, named "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the OptionControlFile and produces a web user interface automatically. The basic actions of the client-side system are implemented: web forms are auto-generated with functions for command execution, image preview, and file upload to a web server, so the system can execute each Eos command with its own unique options and carry out image analysis. Two problems remained, the image file format for visualization and the workspace for analysis: file format information is needed to check whether input and output files are correct, and a common workspace is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses the web client through a browser, and manipulates local files through the GUI. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in heterogeneous distributed environments. Users put their resources, such as microscope images and text files, into the server-side environment supported by PIONE, and experts write PIONE rule definitions that describe image-processing workflows; PIONE then runs each processing step on a suitable computer, following the defined rules. PIONE also supports interactive manipulation, so a user can try a command with various setting values; here we contribute the auto-generation of a GUI for a PIONE workflow. As an advanced function, we developed a module that logs user actions, including the setting values used in image processing and the sequence of commands executed. Used effectively, such logs offer many advantages: for example, when an expert discovers some know-how of image processing, other users can share the logs that capture it, and by analyzing logs we may derive recommended workflows for image analysis. Toward a social platform of image processing for electron microscopists, we have also developed the necessary system infrastructure. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
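
    To make the OptionControlFile idea concrete, the sketch below parses a small CSV interface definition and emits an HTML form, in the spirit of what Zephyr automates. The CSV column layout (option name, type, default, description), the sample options, and the command name are invented for illustration; the actual Eos format is defined by its developers.

        # Hypothetical sketch: auto-generating a web form from a CSV interface
        # definition, in the spirit of Eos's OptionControlFile. The assumed
        # columns (name, type, default, description) are illustrative only.
        import csv
        import html
        import io

        CSV_TEXT = (
            "-i,filename,,input image file\n"
            "-o,filename,,output image file\n"
            "-W,int,256,window width in pixels\n"
        )

        def form_from_option_control(csv_text: str, command: str) -> str:
            """Render an HTML form for one command from its option definitions."""
            fields = []
            for name, typ, default, desc in csv.reader(io.StringIO(csv_text)):
                input_type = "number" if typ == "int" else "text"
                fields.append(
                    f'<label>{html.escape(name)} ({html.escape(desc)}): '
                    f'<input type="{input_type}" name="{html.escape(name)}" '
                    f'value="{html.escape(default)}"></label><br>'
                )
            return (f'<form action="/run/{html.escape(command)}" method="post">\n'
                    + "\n".join(fields)
                    + '\n<input type="submit" value="Execute"></form>')

        print(form_from_option_control(CSV_TEXT, "exampleEosCommand"))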

  20. Process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial.

    PubMed

    Knowlden, Adam P; Sharma, Manoj

    2014-09-01

    Family-and-home-based interventions are an important vehicle for preventing childhood obesity. Systematic process evaluations have not been routinely conducted in assessment of these interventions. The purpose of this study was to plan and conduct a process evaluation of the Enabling Mothers to Prevent Pediatric Obesity Through Web-Based Learning and Reciprocal Determinism (EMPOWER) randomized control trial. The trial was composed of two web-based, mother-centered interventions for prevention of obesity in children between 4 and 6 years of age. Process evaluation used the components of program fidelity, dose delivered, dose received, context, reach, and recruitment. Categorical process evaluation data (program fidelity, dose delivered, dose exposure, and context) were assessed using Program Implementation Index (PII) values. Continuous process evaluation variables (dose satisfaction and recruitment) were assessed using ANOVA tests to evaluate mean differences between groups (experimental and control) and sessions (sessions 1 through 5). Process evaluation results found that both groups (experimental and control) were equivalent, and interventions were administered as planned. Analysis of web-based intervention process objectives requires tailoring of process evaluation models for online delivery. Dissemination of process evaluation results can advance best practices for implementing effective online health promotion programs. © 2014 Society for Public Health Education.
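
    As a minimal sketch of the continuous-variable comparison described above, the following runs a one-way ANOVA on made-up dose-satisfaction scores; the numbers are illustrative placeholders, not EMPOWER study data.

        # One-way ANOVA comparing mean satisfaction between two groups,
        # as in the continuous process-evaluation analysis described above.
        # The scores are fabricated for illustration.
        from scipy.stats import f_oneway

        experimental = [4.2, 4.5, 4.1, 4.8, 4.4]  # e.g., mean scores, sessions 1-5
        control = [4.0, 4.3, 4.2, 4.6, 4.1]

        f_stat, p_value = f_oneway(experimental, control)
        # A large p-value is consistent with the reported group equivalence.
        print(f"F = {f_stat:.3f}, p = {p_value:.3f}")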

  1. Drying of fiber webs

    DOEpatents

    Warren, D.W.

    1997-04-15

    A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.

  2. The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data

    ERIC Educational Resources Information Center

    Powers, Amanda Clay; Shedd, Julie; Hill, Clay

    2011-01-01

    Gathering qualitative information about usage behavior of library Web sites is a time-consuming process requiring the active participation of patron communities. Libraries that collect virtual reference transcripts, however, hold valuable data regarding how the library Web site is used that could benefit Web designers. An analysis of virtual…

  3. Use of Web Technology to Access and Update College Plans

    ERIC Educational Resources Information Center

    Valeau, Edward J.; Luan, Jing

    2007-01-01

    In this study, the process and outcome of a web-based planning application, called Ports of Call, are discussed. The application allows college management to create, edit, and report out activities relating to college plans, all through a web browser. Its design was based on best practices in modern web technology and the application can be easily…

  4. Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.

    ERIC Educational Resources Information Center

    Wang, Peiling; Hawk, William B.; Tenopir, Carol

    2000-01-01

    Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…

  5. A WebQuest for Spatial Skills

    ERIC Educational Resources Information Center

    Wood, Pamela L.; Quitadamo, Ian J.; DePaepe, James L.; Loverro, Ian

    2007-01-01

    The WebQuest is a four-step process integrated at appropriate points in the Animal Studies unit. Through the WebQuest, students create a series of habitat maps that build on the knowledge gained from conducting the various activities of the unit. The quest concludes with an evaluation using the WebQuest rubric and an oral presentation of a final…

  6. E-Learning Technologies: Employing Matlab Web Server to Facilitate the Education of Mathematical Programming

    ERIC Educational Resources Information Center

    Karagiannis, P.; Markelis, I.; Paparrizos, K.; Samaras, N.; Sifaleras, A.

    2006-01-01

    This paper presents new web-based educational software (webNetPro) for "Linear Network Programming." It includes many algorithms for "Network Optimization" problems, such as shortest path problems, minimum spanning tree problems, maximum flow problems and other search algorithms. Therefore, webNetPro can assist the teaching process of courses such…

  7. A Web Browser Interface to Manage the Searching and Organizing of Information on the Web by Learners

    ERIC Educational Resources Information Center

    Li, Liang-Yi; Chen, Gwo-Dong

    2010-01-01

    Information Gathering is a knowledge construction process. Web learners make a plan for their Information Gathering task based on their prior knowledge. The plan is evolved with new information encountered and their mental model is constructed through continuously assimilating and accommodating new information gathered from different Web pages. In…

  8. Environmental Response Laboratory Network (ERLN) WebEDR Quick Reference Guide

    EPA Pesticide Factsheets

    The Web Electronic Data Review (WebEDR) is a web-based system that performs automated data processing on laboratory-submitted Electronic Data Deliverables (EDDs). It enables users to perform technical audits on the data and to check results against Measurement Quality Objectives (MQOs).
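
    As a hypothetical sketch of the kind of audit such a system automates, the snippet below checks one EDD result row against measurement quality objectives; the field names and limits are invented for illustration and are not the ERLN schema.

        # Hypothetical MQO audit of one EDD result row. Field names and
        # numeric limits are invented placeholders, not the ERLN schema.
        MQOS = {"lead": {"max_rpd_pct": 20.0, "min_recovery_pct": 80.0}}

        def audit(row: dict) -> list[str]:
            """Return the list of MQO violations for one result row."""
            mqo, findings = MQOS[row["analyte"]], []
            if row["rpd_pct"] > mqo["max_rpd_pct"]:
                findings.append("relative percent difference above MQO limit")
            if row["recovery_pct"] < mqo["min_recovery_pct"]:
                findings.append("spike recovery below MQO limit")
            return findings

        print(audit({"analyte": "lead", "rpd_pct": 25.0, "recovery_pct": 92.0}))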

  9. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  10. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  11. Coverage and Consumption of Micronutrient Powders, Fortified Staples, and Iodized Salt Among Children Aged 6 to 23 Months in Selected Neighborhoods of Nairobi County, Kenya.

    PubMed

    Leyvraz, Magali; David-Kigaru, Dorcus M; Macharia-Mutie, Catherine; Aaron, Grant J; Roefs, Marlene; Tumilowicz, Alison

    2018-03-01

    Intake of micronutrient-rich foods among children aged 6 to 23 months in Nairobi is low. This study aimed to assess existing coverage and utilization of micronutrient powders (MNPs), fortified staples, and iodized salt among children aged 6 to 23 months prior to implementation of an MNP program. A cross-sectional survey among caregivers of children aged 6 to 23 months (n = 618) was implemented in 7 neighborhoods within Nairobi County, representing the implementation area of the new MNP program. Results for MNP coverage and utilization showed that 28.5% of all caregivers were aware of MNP, 18.5% had ever received MNP for their child, and 10.8% had fed MNP to their child in the previous 7 days. Effective coverage (ie, the child had been given the MNP at least 3 times in the previous 7 days) was 5.8%. Effective coverage of infants and young children with poor feeding practices was significantly lower compared with those with non-poor feeding practices (coverage ratio, 0.34; confidence interval, 0.12-0.70). Most households purchased iodized salt (96.9%), fortified oil (61.0%), and fortified maize flour (93.9%). An estimated 23.9% of the vitamin A requirements of children (6-23 months) were provided by fortified oil, and 50.7% of iron requirements by fortified maize flour. Most households consumed processed milk (81%). Coverage of MNPs in the surveyed neighborhoods was low. Coverage of fortified salt, oil, and maize flour was high and provided a significant amount of micronutrients to children. Processed milk has potential as a vehicle for food fortification.

  12. News trends and web search query of HIV/AIDS in Hong Kong

    PubMed Central

    Chiu, Alice P. Y.; Lin, Qianying

    2017-01-01

    Background The HIV epidemic in Hong Kong has worsened in recent years, with major contributions from the high-risk subgroup of men who have sex with men (MSM). Internet use is prevalent among the majority of the local population, who seek health information online. This study examines the impact of HIV/AIDS and MSM news coverage on web search queries in Hong Kong. Methods Relevant news coverage about HIV/AIDS and MSM from January 1st, 2004 to December 31st, 2014 was obtained from the WiseNews database. News trends were created by computing the number of relevant articles by type, topic, place of origin and sub-populations. We then obtained relevant search volumes from Google and analysed causality between news trends and Google Trends using the Granger causality test and orthogonal impulse response functions. Results We found that editorial news has an impact on "HIV" Google searches, with the search term's popularity peaking an average of two weeks after the news is published. Similarly, editorial news has an impact on the frequency of "AIDS" searches two weeks later. MSM-related news trends have a more fluctuating impact on "MSM" Google searches, with the time lag varying from one week to ten weeks. Conclusions This infodemiological study shows that news trends have a positive impact on online search behavior for HIV/AIDS- or MSM-related issues for up to ten weeks afterwards. Health promotion professionals could use this brief time window to tailor the timing of HIV awareness campaigns and public health interventions to maximise their reach and effectiveness. PMID:28922376
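
    As a sketch of the causality analysis described, the following runs a Granger causality test on synthetic weekly series standing in for article counts and search volumes; the data are fabricated solely to show the mechanics of the statsmodels call.

        # Granger causality on synthetic weekly series: does the news-count
        # series help predict the search-volume series? Data are fabricated.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        news = rng.poisson(5, 200).astype(float)           # weekly article counts
        search = np.roll(news, 2) + rng.normal(0, 1, 200)  # responds ~2 weeks later

        data = pd.DataFrame({"search": search, "news": news})
        # Tests H0 "news does not Granger-cause search" for lags 1..4.
        grangercausalitytests(data[["search", "news"]], maxlag=4)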

  13. Challenges in Visualizing Satellite Level 2 Atmospheric Data with GIS approach

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Zhao, P.; Pham, L.; Meyer, D. J.

    2017-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'images', including accurate pixel coverage area delineation and the science team's recommended quality screening for individual geophysical parameters. However, visualizing remotely sensed non-gridded products poses several challenges: (1) different geodetics of space-borne instruments; (2) data often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files, each holding data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolating to a grid; and (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables, and subsetting and download of data in different formats. The back end of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.
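
    As a sketch of how a client might pull a spatial subset from such a WCS back end: the endpoint URL and coverage identifier below are placeholders, but the key-value parameters follow the OGC WCS 2.0 GetCoverage convention.

        # WCS 2.0 GetCoverage request for a lat/lon subset of one coverage.
        # The host and coverageId are placeholders, not a real GES DISC URL.
        import requests

        params = {
            "service": "WCS",
            "version": "2.0.1",
            "request": "GetCoverage",
            "coverageId": "EXAMPLE_L2_NO2",              # placeholder name
            "subset": ["Lat(30,50)", "Long(-120,-90)"],  # repeated subset keys
            "format": "application/netcdf",
        }
        resp = requests.get("https://example.gov/wcs", params=params)
        resp.raise_for_status()
        with open("subset.nc", "wb") as f:
            f.write(resp.content)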

  14. Using geographic information systems (GIS) to identify communities in need of health insurance outreach: An OCHIN practice-based research network (PBRN) report.

    PubMed

    Angier, Heather; Likumahuwa, Sonja; Finnegan, Sean; Vakarcs, Trisha; Nelson, Christine; Bazemore, Andrew; Carrozza, Mark; DeVoe, Jennifer E

    2014-01-01

    Our practice-based research network (PBRN) is conducting an outreach intervention to increase health insurance coverage for patients seen in the network. To assist with outreach site selection, we sought an understandable way to use electronic health record (EHR) data to locate uninsured patients. Health insurance information was displayed within a web-based mapping platform to demonstrate the feasibility of using geographic information systems (GIS) to visualize EHR data. This study used EHR data from 52 clinics in the OCHIN PBRN. We included cross-sectional coverage data for patients aged 0 to 64 years with at least 1 visit to a study clinic during 2011 (n = 228,284). Our PBRN was successful in using GIS to identify intervention sites. Through use of the maps, we found geographic variation in insurance rates of patients seeking care in OCHIN PBRN clinics. Insurance rates also varied by age: the percentage of adults without insurance ranged from 13.2% to 86.8%; rates of children lacking insurance ranged from 1.1% to 71.7%. GIS also showed some areas where households with median incomes nonetheless had low insurance rates. EHR data can be imported into a web-based GIS mapping tool to visualize patient information. Using EHR data, we were able to observe smaller areas than could be seen using only publicly available data. Using this information, we identified appropriate OCHIN PBRN clinics for dissemination of an EHR-based insurance outreach intervention. GIS could also be used by clinics to visualize other patient-level characteristics to target clinic outreach efforts or interventions. © Copyright 2014 by the American Board of Family Medicine.

  15. Challenges in Obtaining and Visualizing Satellite Level 2 Data in GIS

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Zhao, Peisheng; Pham, Long; Meyer, David J.

    2017-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'images', including accurate pixel coverage area delineation and the science team's recommended quality screening for individual geophysical parameters. However, visualizing remotely sensed non-gridded products poses several challenges: (1) different geodetics of space-borne instruments; (2) data often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files, each holding data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolating to a grid; and (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables, and subsetting and download of data in different formats. The back end of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.

  16. Exploiting Aura OMI Level 2 Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.

    2014-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or in the interpretation of extreme events (such as volcano eruptions or dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with 'images', including accurate pixel-level (Level 2) information, pixel coverage area delineation, and the science team's recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) strives to best support the user community for NASA Earth science data, including through Software-as-a-Service (SaaS). Here we present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This new visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls in its back-end infrastructure. The service allows users to select data sources (e.g., multiple parameters from the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), define areas of interest and temporal extents, zoom, pan, overlay, slide, and subset and reformat data. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources (such as the Global Imagery Browse Services, GIBS).
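
    Complementing the WCS subsetting sketch earlier, here is a WMS 1.3.0 GetMap request for a rendered overlay of one parameter; the host and layer name are placeholders, while the parameters follow the WMS standard (note that EPSG:4326 in WMS 1.3.0 uses latitude-first axis order).

        # WMS 1.3.0 GetMap request for a PNG overlay of one Level 2 layer.
        # The host and layer name are placeholders, not a real service.
        import requests

        params = {
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "EXAMPLE_OMI_NO2",  # placeholder layer
            "styles": "",
            "crs": "EPSG:4326",
            "bbox": "30,-120,50,-90",     # minLat,minLon,maxLat,maxLon
            "width": 1024,
            "height": 512,
            "format": "image/png",
            "transparent": "true",
        }
        resp = requests.get("https://example.gov/wms", params=params)
        resp.raise_for_status()
        with open("no2_overlay.png", "wb") as f:
            f.write(resp.content)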

  17. Accuracy and impact of spatial aids based upon satellite enumeration to improve indoor residual spraying spatial coverage.

    PubMed

    Bridges, Daniel J; Pollard, Derek; Winters, Anna M; Winters, Benjamin; Sikaala, Chadwick; Renn, Silvia; Larsen, David A

    2018-02-23

    Indoor residual spraying (IRS) is a key tool in the fight to control, eliminate and ultimately eradicate malaria. IRS protection is based on a communal effect such that an individual's protection primarily relies on the community-level coverage of IRS with limited protection being provided by household-level coverage. To ensure a communal effect is achieved through IRS, achieving high and uniform community-level coverage should be the ultimate priority of an IRS campaign. Ensuring high community-level coverage of IRS in malaria-endemic areas is challenging given the lack of information available about both the location and number of households needing IRS in any given area. A process termed 'mSpray' has been developed and implemented and involves use of satellite imagery for enumeration for planning IRS and a mobile application to guide IRS implementation. This study assessed (1) the accuracy of the satellite enumeration and (2) how various degrees of spatial aid provided through the mSpray process affected community-level IRS coverage during the 2015 spray campaign in Zambia. A 2-stage sampling process was applied to assess accuracy of satellite enumeration to determine number and location of sprayable structures. Results indicated an overall sensitivity of 94% for satellite enumeration compared to finding structures on the ground. After adjusting for structure size, roof, and wall type, households in Nchelenge District where all types of satellite-based spatial aids (paper-based maps plus use of the mobile mSpray application) were used were more likely to have received IRS than Kasama district where maps used were not based on satellite enumeration. The probability of a household being sprayed in Nchelenge district where tablet-based maps were used, did not differ statistically from that of a household in Samfya District, where detailed paper-based spatial aids based on satellite enumeration were provided. IRS coverage from the 2015 spray season benefited from the use of spatial aids based upon satellite enumeration. These spatial aids can guide costly IRS planning and implementation leading to attainment of higher spatial coverage, and likely improve disease impact.

  18. Low cost silicon solar array project large area silicon sheet task: Silicon web process development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.

    1977-01-01

    Growth configurations were developed which produced crystals having low residual stress levels. The properties of a 106 mm diameter round crucible were evaluated and it was found that this design had greatly enhanced temperature fluctuations arising from convection in the melt. Thermal modeling efforts were directed to developing finite element models of the 106 mm round crucible and an elongated susceptor/crucible configuration. Also, the thermal model for the heat loss modes from the dendritic web was examined for guidance in reducing the thermal stress in the web. An economic analysis was prepared to evaluate the silicon web process in relation to price goals.

  19. TreeVector: scalable, interactive, phylogenetic trees for the web.

    PubMed

    Pethica, Ralph; Barker, Gary; Kovacs, Tim; Gough, Julian

    2010-01-28

    Phylogenetic trees are complex data forms that need to be graphically displayed to be human-readable. Traditional techniques of plotting phylogenetic trees focus on rendering a single static image, but increases in the production of biological data and large-scale analyses demand scalable, browsable, and interactive trees. We introduce TreeVector, a Scalable Vector Graphics (SVG)- and Java-based method that allows trees to be integrated and viewed seamlessly in standard web browsers with no extra software required, and to be modified and linked using standard web technologies. There are now many bioinformatics servers and databases with a range of dynamic processes and updates to cope with the increasing volume of data. TreeVector is designed as a framework to integrate with these processes and produce user-customized phylogenies automatically. We also address the strengths of phylogenetic trees as part of a linked-in browsing process rather than as an end graphic for print. TreeVector is fast and easy to use and is available to download precompiled, but is also open source. It can also be run from the web server listed below or from the user's own web server. It has already been deployed on two recognized and widely used database Web sites.
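
    To give a flavor of why SVG suits this task (this is a toy illustration, not TreeVector's code), a few lines of Python can emit a two-leaf tree as markup that any browser renders natively and that web pages can style and link:

        # Toy SVG rendering of a two-leaf tree; browsers display the output
        # inline with no extra software, which is the property TreeVector
        # exploits. Not TreeVector's actual code.
        def two_leaf_tree_svg(name_a: str, name_b: str) -> str:
            return f"""<svg xmlns="http://www.w3.org/2000/svg" width="260" height="90">
        <path d="M10,45 H40 M40,20 V70 M40,20 H120 M40,70 H120"
              stroke="black" fill="none"/>
        <text x="125" y="25">{name_a}</text>
        <text x="125" y="75">{name_b}</text>
        </svg>"""

        with open("tree.svg", "w") as f:
            f.write(two_leaf_tree_svg("Homo sapiens", "Pan troglodytes"))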

  20. Youpi: A Web-based Astronomical Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Monnerville, M.; Sémah, G.

    2010-12-01

    Youpi stands for "YOUpi is your processing PIpeline". It is a portable, easy-to-use web application providing high-level functionality for performing data reduction on scientific FITS images. It is built on top of open-source processing tools released to the community by Terapix, in order to organize your data on a computer cluster, manage your processing jobs in real time, and facilitate teamwork by allowing fine-grained sharing of results and data. On the server side, Youpi is written in the Python programming language and uses the Django web framework. On the client side, Ajax techniques are used along with the Prototype and script.aculo.us JavaScript libraries.

  1. Declassifying coverage. New guidance documents issued by the CMS may clarify the great unknown of Medicare coverage determination.

    PubMed

    Barr, Paul

    2005-03-21

    Heard of Section 731 of the Medicare Modernization Act? The provision, meant to clear up the generally murky and unwieldy Medicare reimbursement approval process, could quicken the national approval process for medical procedures and devices. The changes will create a more definable approach to OK'ing new technologies. "I like the idea of bringing more science to bear," says Richard Pico.

  2. Using Web-Based Activities to Promote Reading: An Exploratory Study with Teenagers (Uso de actividades en la red para promover la lectura: un estudio exploratorio con adolescentes)

    ERIC Educational Resources Information Center

    Rátiva Velandia, Marlén; Pedreros Torres, Andrés Leonardo; Núñez Alí, Mónica

    2012-01-01

    It is considered valuable to take advantage of web activities to improve and qualify the English teaching and learning processes, especially in the promotion of reading comprehension. In this article we share the process and results of a study that focused on some activities based on web materials that were designed and used with 10th grade…

  3. Macroscopic characterisations of Web accessibility

    NASA Astrophysics Data System (ADS)

    Lopes, Rui; Carriço, Luis

    2010-12-01

    The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic in the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation within Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation within Web Science's goals. The study revealed novel accessibility properties of the Web not found at microscopic levels, as well as properties of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools' warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, for education on Web accessibility, and on the computational limits of large-scale Web accessibility evaluations.

  4. Connection Development: Web Lessons from Westchester.

    ERIC Educational Resources Information Center

    Freedman, Maurice J.

    1996-01-01

    Committed to utilizing information technology, the Westchester Library System (New York) made the World Wide Web publicly accessible. Describes the planning, implementation, and management process; obstacles involving financing; establishing Internet connectivity; and vendor negotiations. Westchester hired a Web manager, created Internet use…

  5. Solution Kinetics Database on the Web

    National Institute of Standards and Technology Data Gateway

    SRD 40 NDRL/NIST Solution Kinetics Database on the Web (Web, free access)   Data for free radical processes involving primary radicals from water, inorganic radicals and carbon-centered radicals in solution, and singlet oxygen and organic peroxyl radicals in various solvents.

  6. New French Coverage with Evidence Development for Innovative Medical Devices: Improvements and Unresolved Issues.

    PubMed

    Martelli, Nicolas; van den Brink, Hélène; Borget, Isabelle

    2016-01-01

    We describe here recent modifications to the French Coverage with Evidence Development (CED) scheme for innovative medical devices. CED can be defined as temporary coverage for a novel health product during collection of the additional evidence required to determine whether definitive coverage is possible. The principal refinements to the scheme include a more precise definition of what may be considered an innovative product, the possibility for device manufacturers to request CED either independently or in partnership with hospitals, and the establishment of processing deadlines for health authorities. In the long term, these modifications may increase the number of applications to the CED scheme, which could lead to unsustainable funding for future projects. It will also be necessary to ensure that the study conditions required by national health authorities are suitable for medical devices and that processing deadlines are met for the scheme to be fully operational. Overall, the modifications recently applied to the French CED scheme for innovative medical devices should increase the transparency of the process, and therefore be more appealing to medical device manufacturers. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  7. Can We Envision a Bettor's Guide to Climate Prediction Markets?

    NASA Astrophysics Data System (ADS)

    Trexler, M.

    2017-12-01

    It's one thing to set up a climate prediction market; it's another to find enough informed traders to make the market work. Climate bets could range widely, from purely scientific or atmospheric metrics to bets that involve the interplay of science, policy, economic, and behavioral outcomes. For a topic as complex and politicized as climate change, a Bettor's Guide to Climate Predictions could substantially expand and diversify the pool of individuals trading in the market, increasing both its liquidity and its decision-support value. The Climate Web is an online and publicly accessible beta version of such a Bettor's Guide, implementing the knowledge management adage "if only we knew what we know." The Climate Web not only curates the key literature, news coverage, and websites relating to more than 100 climate topics, from extreme-event exceedance curves to climate economics to climate risk scenarios; it also extracts and links together thousands of ideas and graphics across all of those topics. The Climate Web integrates the many disciplinary silos that characterize today's often dysfunctional climate policy conversations, allowing rapid cross-silo exploration and understanding. As a Bettor's Guide it would allow prediction market traders to better research and understand their potential bets, and to quickly survey key thinking and uncertainties relating to those bets. The availability of such a Bettor's Guide to Climate Predictions should make traders willing to place more bets than they otherwise would, and should facilitate higher-quality betting. The presentation will introduce the knowledge management dimensions and challenges of climate prediction markets, and introduce the Climate Web as one solution to those challenges.

  8. Python Winding Itself Around Datacubes: How to Access Massive Multi-Dimensional Arrays in a Pythonic Way

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Misev, Dimitar; Baumann, Peter

    2017-04-01

    While Python has developed into the lingua franca of data science, there is often a paradigm break when accessing specialized tools. In particular, for one of the core data categories in science and engineering, massive multi-dimensional arrays, out-of-memory solutions typically employ their own, different models. We discuss this situation using the example of the scalable open-source array engine rasdaman ("raster data manager"), which offers access to and processing of Petascale multi-dimensional arrays through an SQL-style array query language, rasql. Such queries are executed in the server on a storage engine utilizing adaptive array partitioning, and on a processing engine implementing a "tile streaming" paradigm that allows processing of arrays massively larger than server RAM. The rasdaman QL has acted as a blueprint for the forthcoming ISO Array SQL and for the Open Geospatial Consortium (OGC) geo-analytics language, the Web Coverage Processing Service, adopted in 2008. Not surprisingly, rasdaman is the OGC and INSPIRE Reference Implementation for their "Big Earth Data" standards suite. Recently, rasdaman has been augmented with a Python interface which allows users to interact transparently with the database (credit goes to Siddharth Shukla's Master's thesis at Jacobs University). Programmers do not need to know the rasdaman query language, as the operators are silently transformed, through lazy evaluation, into queries. Arrays delivered are likewise automatically transformed into their Python representation. In the talk, the rasdaman concept will be illustrated with the help of large-scale, real-life examples of operational satellite image and weather data services, and sample Python code.
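
    To convey the lazy-evaluation idea, the wrapper below accumulates operations into a rasql expression and only builds the query when the result is needed. This is an invented illustration, not the actual rasdaman Python client API.

        # Invented sketch of lazy evaluation: subsetting and arithmetic on
        # the proxy build up a rasql expression; nothing is executed until
        # the query string is shipped to the server. Not the real client API.
        class LazyArray:
            def __init__(self, expr: str):
                self.expr = expr  # rasql fragment built so far

            def __getitem__(self, s):  # subsetting -> rasql interval
                lo, hi = s
                return LazyArray(
                    f"{self.expr}[{lo.start}:{lo.stop},{hi.start}:{hi.stop}]")

            def __add__(self, k):  # induced (cell-wise) scalar operation
                return LazyArray(f"({self.expr} + {k})")

            def to_query(self, collection: str) -> str:
                return f"select {self.expr} from {collection} as c"

        a = LazyArray("c")[100:200, 50:80] + 10
        print(a.to_query("mr2"))
        # -> select (c[100:200,50:80] + 10) from mr2 as c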

  9. A Virtual Tour of the Radio Astronomy Process

    NASA Astrophysics Data System (ADS)

    Conrad, S. B.; Finley, D. G.; Claussen, M. J.; Ulvestad, J. S.

    2000-12-01

    In the summer of 2000, two teachers working on a Master of Science Teaching degree at New Mexico Tech, and participating in the Research Experience for Teachers (RET) program sponsored by the National Science Foundation, spent eight weeks as interns researching and working on projects at the National Radio Astronomy Observatory (NRAO) which will directly benefit students in their classrooms and also impact other science educators. One of the products of the internships is a set of web pages for the educational section of NRAO's web site. The purpose of these web pages is to familiarize students, teachers, and other people with the process that a radio astronomer goes through to do radio astronomy science. A virtual web tour of this process was created. This required interviewing radio astronomers and other professionals involved with this process at the NRAO (e.g. engineers, data analysts, and operations people), and synthesizing the interviews into a descriptive, visually based set of web pages. These pages meet the National as well as the New Mexico Standards and Benchmarks for Science Education. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. The NSF's RET program is gratefully acknowledged.

  10. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, and population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the methods used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, data source description, the statistical methodology used for analysis), geometric, and cartographic representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion The designed HERXML proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
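
    As a hypothetical sketch of the three-part record structure that HERXML is described as combining (semantic, geometric, and cartographic), the snippet below builds such a record with Python's standard XML tooling; the element names are invented for illustration, since the real schema is defined by the authors.

        # Invented HERXML-like record: semantic, geometric, and cartographic
        # parts in one XML document. Element names are placeholders, not the
        # published schema.
        import xml.etree.ElementTree as ET

        record = ET.Element("healthRecord")
        sem = ET.SubElement(record, "semantic")
        ET.SubElement(sem, "activity").text = "asthma clinic visits, 2008"
        ET.SubElement(sem, "dataSource").text = "New Brunswick Lung Association"
        ET.SubElement(sem, "statisticalMethod").text = "age-standardized rate"
        geo = ET.SubElement(record, "geometric")
        ET.SubElement(geo, "point", lat="45.96", lon="-66.64")
        carto = ET.SubElement(record, "cartographic")
        ET.SubElement(carto, "symbol", shape="circle", color="#cc0000")

        print(ET.tostring(record, encoding="unicode"))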

  11. 49 CFR 1560.205 - Redress process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Transportation Other Regulations Relating to Transportation (Continued) TRANSPORTATION SECURITY ADMINISTRATION... may obtain the forms and information necessary to initiate the redress process on the DHS TRIP Web... will provide the necessary forms and information to individuals through its Web site or by mail. (c...

  12. Hierarchical Scaffolding With Bambus

    PubMed Central

    Pop, Mihai; Kosack, Daniel S.; Salzberg, Steven L.

    2004-01-01

    The output of a genome assembler generally comprises a collection of contiguous DNA sequences (contigs) whose relative placement along the genome is not defined. A procedure called scaffolding is commonly used to order and orient these contigs using paired read information. This ordering of contigs is an essential step when finishing and analyzing the data from a whole-genome shotgun project. Most recent assemblers include a scaffolding module; however, users have little control over the scaffolding algorithm or the information produced. We thus developed a general-purpose scaffolder, called Bambus, which affords users significant flexibility in controlling the scaffolding parameters. Bambus was used recently to scaffold the low-coverage draft dog genome data. Most significantly, Bambus enables the use of linking data other than that inferred from mate-pair information. For example, the sequence of a completed genome can be used to guide the scaffolding of a related organism. We present several applications of Bambus: support for finishing, comparative genomics, analysis of the haplotype structure of genomes, and scaffolding of a mammalian genome at low coverage. Bambus is available as an open-source package from our Web site. PMID:14707177
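
    To give a flavor of what scaffolding does, the toy sketch below treats mate-pair links as edges in a graph and walks them to propose a contig order. Bambus's actual algorithm is far more sophisticated (it bundles links, weights them by library, and resolves conflicts); this is only the skeleton of the idea.

        # Toy scaffolding: mate-pair links define a directed graph over
        # contigs; a greedy walk proposes an order. Illustration only.
        from collections import defaultdict

        links = [("ctgA", "ctgB"), ("ctgB", "ctgC")]  # mate pairs spanning gaps

        graph = defaultdict(list)
        for a, b in links:
            graph[a].append(b)

        def scaffold(start: str) -> list[str]:
            """Greedy walk from a starting contig along mate-pair links."""
            order, seen = [start], {start}
            while graph[order[-1]]:
                nxt = graph[order[-1]][0]
                if nxt in seen:  # stop on cycles from conflicting links
                    break
                order.append(nxt)
                seen.add(nxt)
            return order

        print(scaffold("ctgA"))  # -> ['ctgA', 'ctgB', 'ctgC']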

  13. Hierarchical scaffolding with Bambus.

    PubMed

    Pop, Mihai; Kosack, Daniel S; Salzberg, Steven L

    2004-01-01

    The output of a genome assembler generally comprises a collection of contiguous DNA sequences (contigs) whose relative placement along the genome is not defined. A procedure called scaffolding is commonly used to order and orient these contigs using paired read information. This ordering of contigs is an essential step when finishing and analyzing the data from a whole-genome shotgun project. Most recent assemblers include a scaffolding module; however, users have little control over the scaffolding algorithm or the information produced. We thus developed a general-purpose scaffolder, called Bambus, which affords users significant flexibility in controlling the scaffolding parameters. Bambus was used recently to scaffold the low-coverage draft dog genome data. Most significantly, Bambus enables the use of linking data other than that inferred from mate-pair information. For example, the sequence of a completed genome can be used to guide the scaffolding of a related organism. We present several applications of Bambus: support for finishing, comparative genomics, analysis of the haplotype structure of genomes, and scaffolding of a mammalian genome at low coverage. Bambus is available as an open-source package from our Web site.

  14. Barriers to Gender Transition-Related Healthcare: Identifying Underserved Transgender Adults in Massachusetts

    PubMed Central

    White Hughto, Jaclyn M.; Rose, Adam J.; Pachankis, John E.; Reisner, Sari L.

    2017-01-01

    Purpose: The present study sought to examine whether individual (e.g., age, gender), interpersonal (e.g., healthcare provider discrimination), and structural (e.g., lack of insurance coverage) factors are associated with access to transition-related care in a statewide sample of transgender adults. Method: In 2013, 364 transgender residents of Massachusetts completed an electronic web-based survey online (87.1%) or in person (12.9%). A multivariable logistic regression model tested whether individual, interpersonal, and structural factors were associated with access to transition-related care. Results: Overall, 23.6% reported being unable to access transition-related care in the past 12 months. In a multivariable model, younger age, low income, low educational attainment, private insurance coverage, and healthcare discrimination were significantly associated with being unable to access transition-related care (all p<0.05). Discussion: Despite state nondiscrimination policies and universal access to healthcare, many of the Massachusetts transgender residents sampled were unable to access transition-related care. Multilevel interventions are needed, including supportive policies and policy enforcement, to ensure that underserved transgender adults can access medically necessary transition-related care. PMID:29082331

  15. Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition

    PubMed Central

    Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality-of-service (QoS) properties are crucial when selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composite Web service (TCWS) is formalized by CPN properties. To identify the best services by QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve the dominant Web services. This overcomes the significant information loss that results from reducing individual scores to an overall similarity score. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
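
    The skyline step can be sketched in a few lines: keep every service whose quality-of-service vector is not dominated by another candidate's. The services and numbers below are made up; lower latency and higher reliability are taken as better.

        # Skyline (Pareto) filtering of candidate services by QoS vectors.
        # Candidates and values are fabricated for illustration.
        candidates = {
            "svcA": (120, 0.99),  # (latency ms, reliability)
            "svcB": (80, 0.95),
            "svcC": (150, 0.90),  # dominated by svcA: slower and less reliable
        }

        def dominates(p, q):
            """p dominates q: no worse in every dimension, better in one."""
            no_worse = p[0] <= q[0] and p[1] >= q[1]
            better = p[0] < q[0] or p[1] > q[1]
            return no_worse and better

        skyline = [n for n, q in candidates.items()
                   if not any(dominates(p, q)
                              for m, p in candidates.items() if m != n)]
        print(skyline)  # -> ['svcA', 'svcB']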

  16. Reliable execution based on CPN and skyline optimization for Web service composition.

    PubMed

    Chen, Liping; Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality-of-service (QoS) properties are crucial when selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composite Web service (TCWS) is formalized by CPN properties. To identify the best services by QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve the dominant Web services. This overcomes the significant information loss that results from reducing individual scores to an overall similarity score. We evaluate our approach experimentally using both real and synthetically generated datasets.

  17. Synthesis of a Carbon-activated Microfiber from Spider Webs Silk

    NASA Astrophysics Data System (ADS)

    Taer, E.; Mustika, W. S.; Taslim, R.

    2017-03-01

    Carbon fiber from spider web silk has been produced through a simple carbonization process. Cobwebs are a source of strong, flexible natural fibers of micrometer size. The preparation of carbon microfibers from spider webs consists of carbonization and activation processes. Carbonization was performed in an N2 gas environment with a multi-step heating profile up to a temperature of 400 °C, while the activation process was carried out chemically with the assistance of a KOH activating agent. Physical properties were measured for surface morphology, element content and the degree of crystallinity. The measurements found that carbon microfibers from spider webs have diameters in the range of 0.5-25 micrometers. The activated carbon microfiber is amorphous, with a carbon content of 84%.

  18. The Interface Design and the Usability Testing of a Fossilization Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Wang, Shiang-Kwei; Yang, Chiachi

    2005-01-01

    This article describes practical issues related to the design and the development of a Web-Based Learning Environment (Web-LE) for high school students. The purpose of the Fossilization Web-LE was to help students understand the process of fossilization, which is a complex phenomenon and is affected by many factors. The instructional design team…

  19. Classroom Web Pages: A "How-To" Guide for Educators.

    ERIC Educational Resources Information Center

    Fehling, Eric E.

    This manual provides teachers, with very little or no technology experience, with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…

  20. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
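
    As an invented minimal sketch of the request-driven pattern described (stage names and the quality rule are placeholders, not the actual AIRSAR system), queued requests can be drained through processing and a quality gate before delivery and archiving:

        # Placeholder pipeline: queued requests are processed, quality-checked,
        # then delivered and archived. Not the actual AIRSAR system.
        from queue import Queue

        def run_job(request: dict) -> dict:
            """Stand-in for SAR processing of one data request."""
            return {"id": request["id"], "product": request["scene"] + ".slc"}

        def quality_ok(product: dict) -> bool:
            return product["product"].endswith(".slc")  # placeholder QA rule

        requests_q: Queue = Queue()
        requests_q.put({"id": "req-0001", "scene": "ts1234"})

        archive = []
        while not requests_q.empty():
            product = run_job(requests_q.get())
            if quality_ok(product):
                archive.append(product)  # deliver to the user and archive
        print(archive)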
