Sample records for web coverage services

  1. rasdaman Array Database: current status

    NASA Astrophysics Data System (ADS)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    rasdaman (Raster Data Manager) is a free, open-source array database management system that stores and processes massive amounts of raster data in the form of multidimensional arrays. Users can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed through SQL queries), scalability (rasdaman's distributed architecture enables it to run seamlessly on cloud infrastructures, with performance increasing as computation resources are added), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (the legacy communication protocol has been replaced with a new one based on Google Protocol Buffers and ZeroMQ). The system handles data such as 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored as raw arrays alone, since location information is also needed to geoposition the contents correctly on Earth; ISO 19123 defines such georeferenced data as coverage data. rasdaman provides coverage data support through the Petascope service, and extensions have been added on top of rasdaman to serve the geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service provides zoom and pan navigation over images served by a map server; starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data. Support is also provided for several extensions of this service: the Subsetting Extension, the Scaling Extension, and, starting with version 9.1, the Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting away the specifics of the standard request definitions. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering of multi-dimensional raster coverages. rasdaman exposes this service through the WCS Processing Extension. Demonstrations are provided online via the Earthlook website (earthlook.org), which presents use cases from a wide variety of application domains, using the rasdaman system as the processing engine.
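    As a sketch of the coverage-download and subsetting capability described above, the snippet below assembles a WCS 2.0 GetCoverage request in key-value-pair form. The endpoint URL, coverage identifier and axis names are hypothetical placeholders, not identifiers from any actual rasdaman deployment.

```python
from urllib.parse import urlencode

def get_coverage_url(endpoint, coverage_id, subsets, fmt="image/tiff"):
    """Build a WCS 2.0 GetCoverage KVP request with one trim per axis.

    `subsets` maps axis names (e.g. "Lat", "Long") to (lo, hi) bounds;
    WCS 2.0 KVP expresses each trim as a repeated subset= key.
    """
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("format", fmt),
    ]
    params += [("subset", f"{axis}({lo},{hi})") for axis, (lo, hi) in subsets.items()]
    return endpoint + "?" + urlencode(params)

# Hypothetical Petascope endpoint and coverage identifier.
url = get_coverage_url(
    "https://example.org/rasdaman/ows",
    "AvgLandTemp",
    {"Lat": (40, 50), "Long": (-10, 5)},
)
```

    The same KVP pattern extends to the Scaling Extension by adding, for example, a scaling parameter alongside the subset keys.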

  2. EarthServer - an FP7 project to enable the web delivery and analysis of 3D/4D models

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2013-04-01

    EarthServer aims at open access and ad-hoc analytics on big Earth Science data, based on the OGC geoservice standards Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS). The WCS model defines "coverages" as a unifying paradigm for multi-dimensional raster data, point clouds, meshes, etc., thereby addressing a wide range of Earth Science data including 3D/4D models. WCPS allows declarative SQL-style queries on coverages. The project is developing a pilot implementing these standards, and will also investigate the use of GeoSciML to describe coverages. Integration of WCPS with XQuery will in turn allow coverages to be queried in combination with their metadata and GeoSciML description. The unified service will support navigation, extraction, aggregation, and ad-hoc analysis of coverage data through SQL-style queries. Clients will range from mobile devices to high-end immersive virtual reality, and will enable 3D model visualisation using web browser technology coupled with developing web standards. EarthServer is establishing open-source client and server technology intended to scale to Petabyte/Exabyte volumes, based on distributed processing, supercomputing, and cloud virtualization. Implementation will be based on the existing rasdaman server technology. Services using rasdaman technology are being installed to serve the atmospheric, oceanographic, geological, cryospheric, planetary and general earth observation communities. The geology service (http://earthserver.bgs.ac.uk/) is being provided by BGS and at present includes satellite imagery, superficial thickness data, onshore DTMs and 3D models for the Glasgow area. It is intended to extend the data sets available to include 3D voxel models. Use of the WCPS standard allows queries to be constructed against single or multiple coverages. For example, on a single coverage, data for a particular area can be selected, or data within a particular range of pixel values. Queries on multiple surfaces can be constructed to calculate, for example, the thickness between two surfaces in a 3D model or the depth from the ground surface to the top of a particular geologic unit. In the first version of the service a simple interface showing some example queries has been implemented to demonstrate the potential of the technologies. The project aims to develop the services in light of user feedback on the data available, the functionality and the interface. This feedback guides the software and standards development aspects of the project, leading to enhanced versions of the software which will be implemented in upgraded versions of the services during the lifetime of the project.
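    The multi-surface thickness query described above can be sketched as query composition: the snippet below builds a WCPS-style query that subtracts a base surface from a top surface over a trimmed x/y region. The coverage names, axis labels and output encoding are hypothetical illustrations, not the actual BGS service's identifiers.

```python
def thickness_query(top, base, x_range, y_range, fmt="csv"):
    """Compose a WCPS query returning (top - base), i.e. unit thickness,
    over a common x/y trim, encoded in the requested format."""
    trim = (f"x({x_range[0]}:{x_range[1]}), "
            f"y({y_range[0]}:{y_range[1]})")
    return (f"for t in ({top}), b in ({base}) "
            f'return encode(t[{trim}] - b[{trim}], "{fmt}")')

# Hypothetical surface coverages over an invented easting/northing window.
query = thickness_query("TopSurface", "BaseSurface", (250000, 260000), (660000, 670000))
```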

  3. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. The OGC data services and specifications covered are: Web Coverage Service (WCS) for data; Web Map Service (WMS) for pictures of data; Web Map Tile Service (WMTS) for pictures of data tiles; and Styled Layer Descriptors (SLD) for rendered styles.

  4. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Such data include 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known from databases to be advantageous: declarativeness (describe results rather than algorithms), safety in evaluation (no request can keep a server busy indefinitely), and optimizability (enabling the server to rearrange a request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning à la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it does not anticipate any particular protocol. One such protocol is given by the OGC Web Coverage Service (WCS) Processing Extension standard, which ties WCPS into WCS. Another protocol, which makes WCPS an OGC Web Processing Service (WPS) profile, is under preparation. Thereby, WCPS bridges WCS and WPS. The conceptual model of WCPS relies on the coverage model of WCS, which in turn is based on ISO 19123. WCS currently addresses raster-type coverages, where a coverage is seen as a function mapping points from a spatio-temporal extent (its domain) into values of some cell type (its range). A retrievable coverage has an associated identifier, the CRSs it supports and, for each range field (also known as band or channel), the applicable interpolation methods. The WCPS language offers access to one or several such coverages via a functional, side-effect-free language. The following example, which derives the NDVI (Normalized Difference Vegetation Index) from given coverages C1, C2, and C3 within the regions identified by the binary mask R, illustrates the language concept: for c in ( C1, C2, C3 ), r in ( R ) return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" ) The result is a list of three HDF-EOS encoded images containing masked NDVI values. Note that the same request can operate on coverages of any dimensionality. The expressive power of WCPS includes statistics, image, and signal processing up to, but not including, recursion, to maintain safe evaluation. As both the syntax and the semantics of any WCPS expression are well known, the language is Semantic Web ready: clients can construct WCPS requests on the fly, and servers can optimize such requests (this has been investigated extensively with the rasdaman raster database system) and automatically distribute them for processing in a WCPS-enabled computing cloud. The WCPS Reference Implementation is being finalized now that the standard is stable; it will be released as open source once ready.
Among the future tasks is to extend WCPS to general meshes, in synchronization with the WCS standard. In this talk WCPS is presented in the context of OGC standardization. The author is co-chair of OGC's WCS Working Group (WG) and Coverages WG.
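    The semantics of the NDVI expression in the abstract can be mirrored locally, for instance with NumPy, to see what the server-side request computes. This is an illustrative analogue, not the WCPS evaluation itself: the mask is applied explicitly here, and the `(char)` cast of the original is omitted so the values stay floating-point.

```python
import numpy as np

def masked_ndvi(nir, red, mask):
    """NDVI (nir - red) / (nir + red), restricted to a binary mask
    region; cells outside the mask are set to 0."""
    nir = nir.astype(float)
    red = red.astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - red) / (nir + red)
    return np.where(mask, index, 0.0)

# Tiny invented 2x2 scene: two reflectance bands and a region mask.
nir = np.array([[0.8, 0.6], [0.5, 0.1]])
red = np.array([[0.2, 0.2], [0.5, 0.3]])
mask = np.array([[True, True], [False, True]])
result = masked_ndvi(nir, red, mask)
```

    As with the WCPS request, the same expression works unchanged for arrays of any dimensionality, since the arithmetic broadcasts cell-wise.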

  5. Discovery Mechanisms for the Sensor Web

    PubMed Central

    Jirka, Simon; Bröring, Arne; Stasch, Christoph

    2009-01-01

    This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported through catalogue services, the field of sensor networks and the corresponding discovery mechanisms remains a challenge. The focus of this article is on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach is presented that was developed within the EU-funded project "OSIRIS". This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata and integrate sensor discovery into already existing catalogues. PMID:22574038

  6. Translating access into utilization: lessons from the design and evaluation of a health insurance Web site to promote reproductive health care for young women in Massachusetts.

    PubMed

    Janiak, Elizabeth; Rhodes, Elizabeth; Foster, Angel M

    2013-12-01

    Following state-level health care reform in Massachusetts, young women reported confusion over coverage of contraception and other sexual and reproductive health services under newly available health insurance products. To address this gap, a plain-language Web site titled "My Little Black Book for Sexual Health" was developed by a statewide network of reproductive health stakeholders. The purpose of this evaluation was to assess the health literacy demands and usability of the site among its target audience, women ages 18-26 years. We performed an evaluation of the literacy demands of the Web site's written content and tested the Web site's usability in a health communications laboratory. Participants found the Web site visually appealing and its overall design concept accessible. However, the Web site's literacy demands were high, and all participants encountered problems navigating through the Web site. Following this evaluation, the Web site was modified to be more usable and more comprehensible to women of all health literacy levels. To avail themselves of sexual and reproductive health services newly available under expanded health insurance coverage, young women require customized educational resources that are rigorously evaluated to ensure accessibility. To maximize utilization of reproductive health services under expanded health insurance coverage, US women require customized educational resources commensurate with their literacy skills. The application of established research methods from the field of health communications will enable advocates to evaluate and adapt these resources to best serve their targeted audiences.

  7. Spatial data standards meet meteorological data - pushing the boundaries

    NASA Astrophysics Data System (ADS)

    Wagemann, Julia; Siemen, Stephan; Lamy-Thepaut, Sylvie

    2017-04-01

    The data archive of the European Centre for Medium-Range Weather Forecasts (ECMWF) holds around 120 PB of data and is the world's largest archive of meteorological data. This information is of great value for many Earth Science disciplines, but the complexity of the data (up to five dimensions and different time axis domains) and its native GRIB data format, though efficient for archiving, limit the overall data uptake, especially by users outside the MetOcean domain. ECMWF's MARS WebAPI is a very efficient and flexible system for expert users to access and retrieve meteorological data, but it is challenging for users outside the MetOcean domain. With the help of web-based standards for data access and processing, ECMWF wants to make more than 1 PB of meteorological and climate data more easily accessible to users across different Earth Science disciplines. As climate data provider for the H2020 project EarthServer-2, ECMWF is exploring the feasibility of giving on-demand access to its MARS archive via the OGC standard interface Web Coverage Service (WCS). Despite the potential a WCS offers for climate and meteorological data, the standards-based modelling of such data entails many challenges and reveals the boundaries of the current Web Coverage Service 2.0 standard. Challenges range from valid semantic data models for meteorological data to optimal and efficient data structures for a scalable web service. The presentation reviews the applicability of the current Web Coverage Service 2.0 standard to meteorological and climate data and discusses the challenges that must be overcome to achieve real interoperability and to ensure the conformant sharing and exchange of meteorological data.

  8. The EarthServer Geology Service: web coverage services for geosciences

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2014-05-01

    The EarthServer FP7 project is implementing web coverage services using the OGC WCS and WCPS standards for a range of earth science domains: cryospheric; atmospheric; oceanographic; planetary; and geological. BGS is providing the geological service (http://earthserver.bgs.ac.uk/). Geoscience has used remotely sensed data from satellites and planes for some considerable time, but other areas of the geosciences are less familiar with the use of coverage data. This is rapidly changing with the development of new sensor networks and the move from geological maps to geological spatial models. The BGS geology service is designed initially to address two coverage data use cases and three levels of data access restriction. Databases of remotely sensed data are typically very large and commonly held offline, making it time-consuming for users to assess and then download data. The service is designed to allow the spatial selection, editing and display of Landsat and aerial photographic imagery, including band selection and contrast stretching. This enables users to rapidly view data, assess its usefulness for their purposes, and then enhance and download it if it is suitable. At present the service contains six-band Landsat 7 (Blue, Green, Red, NIR 1, NIR 2, MIR) and three-band false colour aerial photography (NIR, green, blue), totalling around 1 TB. Increasingly, 3D spatial models are being produced in place of traditional geological maps. Models make explicit the spatial information that is implicit on maps and are thus seen as a better way of delivering geosciences information to non-geoscientists. However, web delivery of models, including the provision of suitable visualisation clients, has proved more challenging than delivering maps. The EarthServer geology service is delivering 35 surfaces as coverages, comprising the modelled superficial deposits of the Glasgow area. These can be viewed using a 3D web client developed in the EarthServer project by Fraunhofer. As well as remotely sensed imagery and 3D models, the geology service is also delivering DTM coverages which can be viewed in the 3D client in conjunction with both imagery and models. The service is accessible through a web GUI which allows the imagery to be viewed against a range of background maps and DTMs, and in the 3D client; spatial selection to be carried out graphically; the results of image enhancement to be displayed; and selected data to be downloaded. The GUI also provides access to the Glasgow model in the 3D client, as well as tutorial material. In the final year of the project it is intended to increase the volume of data to 20 TB and enhance the WCPS processing, including depth and thickness querying of 3D models. We have also investigated the use of GeoSciML, developed to describe and interchange the information on geological maps, to describe model surface coverages. EarthServer is developing a combined WCPS and XQuery query language, and we will investigate applying this to the GeoSciML-described surfaces to answer questions such as 'find all units with a predominant sand lithology within 25m of the surface'.

  9. Footprint Database and web services for the Herschel space observatory

    NASA Astrophysics Data System (ADS)

    Verebélyi, Erika; Dobos, László; Kiss, Csaba

    2015-08-01

    Using all telemetry and observational meta-data, we created a searchable database of Herschel observation footprints. Data from the Herschel space observatory is freely available for everyone but no uniformly processed catalog of all observations has been published yet. As a first step, we unified the data model for all three Herschel instruments in all observation modes and compiled a database of sky coverage information. As opposed to methods using a pixellation of the sphere, in our database, sky coverage is stored in exact geometric form allowing for precise area calculations. Indexing of the footprints allows for very fast search among observations based on pointing, time, sky coverage overlap and meta-data. This enables us, for example, to find moving objects easily in Herschel fields. The database is accessible via a web site and also as a set of REST web service functions which makes it usable from program clients like Python or IDL scripts. Data is available in various formats including Virtual Observatory standards.
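    A toy version of the pointing-based search described above can be sketched as follows. The real database matches exact footprint polygons with geometric indexing; this sketch only compares pointing centres against a search cone, and the observation table is invented for illustration.

```python
from math import radians, degrees, sin, cos, acos

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between two sky positions
    given in degrees (spherical law of cosines)."""
    a1, d1, a2, d2 = map(radians, (ra1, dec1, ra2, dec2))
    c = sin(d1) * sin(d2) + cos(d1) * cos(d2) * cos(a1 - a2)
    return degrees(acos(max(-1.0, min(1.0, c))))

def cone_search(observations, ra, dec, radius_deg):
    """Return ids of observations whose pointing lies within
    `radius_deg` of (ra, dec) - a stand-in for the indexed search."""
    return [
        obs_id
        for obs_id, (obs_ra, obs_dec) in observations.items()
        if angular_separation(ra, dec, obs_ra, obs_dec) <= radius_deg
    ]

# Invented pointing table: id -> (RA, Dec) in degrees.
matches = cone_search({"obs-1": (10.0, 0.0), "obs-2": (50.0, 0.0)}, 10.5, 0.0, 1.0)
```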

  10. Web Coverage Service Challenges for NASA's Earth Science Data

    NASA Technical Reports Server (NTRS)

    Cantrell, Simon; Khan, Abdul; Lynnes, Christopher

    2017-01-01

    In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) is available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has recently focused on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations in WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated users' needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.

  11. Remote Sensing Information Gateway: A free application and web service for fast, convenient, interoperable access to large repositories of atmospheric data

    NASA Astrophysics Data System (ADS)

    Plessel, T.; Szykman, J.; Freeman, M.

    2012-12-01

    EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high performance, data integrity and security.
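    The variable/domain/time subsetting the abstract credits to rsigserver can be pictured with a minimal in-memory filter. The field names and sample values below are invented for illustration; the real service streams subsets of far larger repositories over its WCS interface.

```python
def subset(records, variable, bbox, t0, t1):
    """Keep records matching one variable, a (lon_min, lat_min,
    lon_max, lat_max) bounding box, and a [t0, t1] time range."""
    lon_min, lat_min, lon_max, lat_max = bbox
    return [
        r for r in records
        if r["variable"] == variable
        and lon_min <= r["lon"] <= lon_max
        and lat_min <= r["lat"] <= lat_max
        and t0 <= r["time"] <= t1
    ]

# Invented sample records: variable, position, time step, value.
records = [
    {"variable": "ozone", "lon": -75.0, "lat": 40.0, "time": 3, "value": 41.0},
    {"variable": "ozone", "lon": -120.0, "lat": 34.0, "time": 3, "value": 55.0},
    {"variable": "pm25", "lon": -75.0, "lat": 40.0, "time": 3, "value": 9.5},
]
east_coast_ozone = subset(records, "ozone", (-80.0, 35.0, -70.0, 45.0), 0, 12)
```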

  12. Assessing Pre-Service Candidates' Web-Based Electronic Portfolios.

    ERIC Educational Resources Information Center

    Lamson, Sharon; Thomas, Kelli R.; Aldrich, Jennifer; King, Andy

    This paper describes processes undertaken by Central Missouri State University's Department of Curriculum and Instruction to prepare teacher candidates to create Web-based professional portfolios, Central's expectations for content coverage within the electronic portfolios, and evaluation procedures. It also presents data on portfolio construction…

  13. Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.

    2006-12-01

    An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to give users more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out the key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
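    The shape of a WPS call makes the semantics question concrete: in the sketch below, a WPS 1.0.0 Execute request is assembled in key-value-pair form, and the process and its inputs appear only as opaque names. The endpoint, process identifier and input names are hypothetical.

```python
from urllib.parse import quote

def wps_execute_url(endpoint, process_id, inputs):
    """Compose a WPS 1.0.0 Execute request in KVP form; inputs are
    joined as name=value pairs separated by semicolons, carrying no
    machine-readable semantics beyond their names."""
    data_inputs = ";".join(f"{name}={value}" for name, value in inputs.items())
    return (f"{endpoint}?service=WPS&version=1.0.0&request=Execute"
            f"&identifier={process_id}&datainputs={quote(data_inputs)}")

# Hypothetical process slicing a forecast model output by height.
url = wps_execute_url(
    "https://example.org/wps",
    "SliceModelByHeight",
    {"model": "GEOS", "height_hPa": "500"},
)
```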

  14. J-Plus Web Portal

    NASA Astrophysics Data System (ADS)

    Civera Lorenzo, Tamara

    2017-10-01

    Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), where the different services available to retrieve images and catalogue data are presented. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data comprising parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several online data access tools and services, each suited to a particular need: a coverage map, a sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).

  15. An Innovative Open Data-driven Approach for Improved Interpretation of Coverage Data at NASA JPL's PO.DAAC

    NASA Astrophysics Data System (ADS)

    McGibbney, L. J.; Armstrong, E. M.

    2016-12-01

    Figuratively speaking, Scientific Datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth Sciences, the most notable general examples include the HDF family, NetCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the oceanographic, atmospheric and meteorological sciences. Such file formats contain coverage data, i.e. a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that serving such data is still not straightforward, due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore alternative, web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way that fits in with the linked data publication paradigm, hence lowering the barrier to interpretation by consumers via mobile devices and client applications, etc., as well as for data producers, who can build next-generation web-friendly services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data, embedded within SD landing pages as well as via semantic data repositories.
We are currently evaluating how utilization of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers are not currently addressing content-based optimization within their SD landing pages for better crawlability by commercial search engines.
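    A minimal sketch of what such a web-friendly representation looks like: the snippet builds a bare-bones CoverageJSON "Coverage" for a tiny lon/lat grid. It omits the referencing and parameter metadata that a conformant document carries, and the parameter name and values are illustrative.

```python
import json

def make_coverage(lons, lats, values, parameter="sst"):
    """Bare-bones CoverageJSON Coverage: a Grid domain plus one
    NdArray range (referencing/parameter metadata omitted)."""
    return {
        "type": "Coverage",
        "domain": {
            "type": "Domain",
            "domainType": "Grid",
            "axes": {"x": {"values": lons}, "y": {"values": lats}},
        },
        "ranges": {
            parameter: {
                "type": "NdArray",
                "dataType": "float",
                "axisNames": ["y", "x"],
                "shape": [len(lats), len(lons)],
                "values": values,
            }
        },
    }

cov = make_coverage([0.0, 1.0], [10.0, 11.0], [288.1, 288.3, 288.0, 287.9])
doc = json.dumps(cov)  # plain JSON text, ready to embed in a landing page
```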

  16. Big Geo Data Services: From More Bytes to More Barrels

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Baumann, Peter

    2016-04-01

    The data deluge is affecting the oil and gas industry just as much as many other industries. However, aside from the sheer volume there is the challenge of data variety, such as regular and irregular grids, multi-dimensional space/time grids, point clouds, and TINs and other meshes. A uniform conceptualization for modelling and serving them could save substantial effort, such as the proverbial "department of reformatting". The notion of a coverage can actually accomplish this. Its abstract model in ISO 19123, together with the concrete, interoperable OGC Coverage Implementation Schema (CIS), currently under adoption as ISO 19123-2, provides a common platform for representing any n-D grid type, point clouds, and general meshes. This is paired with the OGC Web Coverage Service (WCS) together with its datacube analytics language, the OGC Web Coverage Processing Service (WCPS). The OGC WCS Core Reference Implementation, rasdaman, relies on Array Database technology, i.e. a NewSQL/NoSQL approach. It supports the grid part of coverages, with installations of 100+ TB known and single queries parallelized across 1,000+ cloud nodes. Recent research attempts to address the point cloud and mesh parts through a unified query model. The Holy Grail envisioned is that these approaches can be merged into a single service interface at some point. We present both the grid and the point cloud / mesh approaches and discuss status, implementation, standardization, and research perspectives, including a live demo.
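    The datacube analytics WCPS expresses can be sketched as query composition: the snippet below assembles a WCPS query that averages a coverage over a spatio-temporal window. The coverage name and axis labels are hypothetical, chosen only to suggest a subsurface/seismic-style datacube.

```python
def wcps_avg(coverage, trims):
    """Compose a WCPS query averaging a coverage over per-axis trims,
    e.g. a spatio-temporal window of a large datacube."""
    trim = ", ".join(f"{axis}({lo}:{hi})" for axis, (lo, hi) in trims.items())
    return f"for c in ({coverage}) return avg(c[{trim}])"

query = wcps_avg("SeismicCube", {"x": (0, 512), "y": (0, 512), "t": (10, 20)})
```

    Because the query is declarative, a server is free to push the aggregation down to storage or parallelize it across nodes, as described for rasdaman above.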

  17. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  18. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  19. 42 CFR 423.128 - Dissemination of Part D plan information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... coverage determination and redetermination processes via an Internet Web site; and (iii) A system that... determination by contacting the plan sponsor's toll free customer service line or by accessing the plan sponsor's internet Web site. (8) Quality assurance policies and procedures. A description of the quality...

  20. The Footprint Database and Web Services of the Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Dobos, László; Varga-Verebélyi, Erika; Verdugo, Eva; Teyssier, David; Exter, Katrina; Valtchanov, Ivan; Budavári, Tamás; Kiss, Csaba

    2016-10-01

    Data from the Herschel Space Observatory is freely available to the public, but no uniformly processed catalogue of the observations has been published so far. To date, the Herschel Science Archive does not contain the exact sky coverage (footprint) of individual observations and supports searching for observations only by bounding circle. Drawing on previous experience in implementing footprint databases, we built the Herschel Footprint Database and Web Services for the Herschel Space Observatory to provide efficient search capabilities for typical astronomical queries. The database was designed with the following main goals in mind: (a) provide a unified data model for the meta-data of all instruments and observational modes, (b) quickly find observations covering a selected object and its neighbourhood, (c) quickly find every observation in a larger area of the sky, (d) allow for finding solar system objects crossing observation fields. As a first step, we developed a unified data model of observations of all three Herschel instruments for all pointing and instrument modes. Then, using telescope pointing information and observational meta-data, we compiled a database of footprints. As opposed to methods using pixellation of the sphere, we represent sky coverage in an exact geometric form, allowing for precise area calculations. For easier handling of Herschel observation footprints with rather complex shapes, two algorithms were implemented to simplify the outlines. Furthermore, a new visualisation tool to plot footprints with various spherical projections was developed. Indexing of the footprints using the Hierarchical Triangular Mesh makes it possible to quickly find observations based on sky coverage, time and meta-data. The database is accessible via a web site (http://herschel.vo.elte.hu) and also as a set of REST web service functions, which makes it readily usable from programming environments such as Python or IDL.
The web service allows downloading footprint data in various formats including Virtual Observatory standards.
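As a rough illustration of calling such REST web service functions from Python, the sketch below builds a positional search request. The function name "FindObservations" and its parameters are hypothetical placeholders; the actual endpoint names are documented at http://herschel.vo.elte.hu:

```python
from urllib.parse import urlencode

# Sketch of querying the Herschel Footprint Database REST interface from
# Python. "FindObservations" and the ra/dec/radius parameter names are
# hypothetical placeholders; consult the service documentation for the
# real endpoint names and parameters.
BASE = "http://herschel.vo.elte.hu/ws"

def footprint_search(ra_deg: float, dec_deg: float, radius_deg: float) -> str:
    """Build a cone-search-style request URL around a sky position."""
    params = {"ra": ra_deg, "dec": dec_deg, "radius": radius_deg}
    return f"{BASE}/FindObservations?{urlencode(params)}"

# A region around the Orion Nebula:
print(footprint_search(83.82, -5.39, 0.5))
```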

  1. Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example

    NASA Astrophysics Data System (ADS)

    Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang

    2013-04-01

    By applying the advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with a Service Oriented Architecture (SOA), this study has derived global datasets using tile-based input data and implemented a Virtual Web Map Service (VWMS) and a Virtual Web Coverage Service (VWCS) to provide software tools for the visualization and acquisition of global data. Taking the MODIS Normalized Difference Vegetation Index (NDVI) as an example, this study demonstrates the feasibility, efficiency and key features of the proposed approach.
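For reference, NDVI itself is a simple band ratio; a minimal Python version of the standard formula:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Inputs are surface reflectances; the result ranges from -1 to 1."""
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in the near-infrared and absorbs red:
print(round(ndvi(0.50, 0.08), 3))  # → 0.724, i.e. vigorous vegetation
```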

  2. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflows (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service generates the concrete BPEL, and a BPEL execution engine runs it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, and sensor maps. A scenario of an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework, and the execution time and the influence of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.

  3. TIGER 2010 Boundaries

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). This web service includes the State and County boundaries from the TIGER shapefiles compiled into a single national coverage for each layer. The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB).

  4. Datacube Services in Action, Using Open Source and Open Standards

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Misev, D.

    2016-12-01

    Array Databases comprise novel, promising technology for massive spatio-temporal datacubes, extending the SQL paradigm of "any query, anytime" to n-D arrays. On the server side, such queries can be optimized, parallelized, and distributed based on partitioned array storage. The rasdaman ("raster data manager") system, which has pioneered Array Databases, is available in open source on www.rasdaman.org. Its declarative query language extends SQL with array operators which are optimized and parallelized on the server side. The rasdaman engine, which is part of OSGeo Live, is mature and in operational use on databases individually holding dozens of Terabytes. Further, the rasdaman concepts have strongly impacted international Big Data standards in the field, including the forthcoming MDA ("Multi-Dimensional Array") extension to ISO SQL, the OGC Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS) standards, and the forthcoming INSPIRE WCS/WCPS; in both OGC and INSPIRE, rasdaman is the WCS Core Reference Implementation. In our talk we present the concepts, architecture, operational services, and standardization impact of open-source rasdaman, as well as experiences made.

  5. TIGER 2010 Boundaries

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). This web service includes the State, County, and Census Block Groups boundaries from the TIGER shapefiles compiled into a single national coverage for each layer. The TIGER/Line Files are shapefiles and related database files (.dbf) that are an extract of selected geographic and cartographic information from the U.S. Census Bureau's Master Address File / Topologically Integrated Geographic Encoding and Referencing (MAF/TIGER) Database (MTDB).

  6. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    NASA Astrophysics Data System (ADS)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download Gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, giving users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean; however, the more complex the query, the more processing-intensive it becomes. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries from Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use-cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service.
The entire content is available with Jupyter Notebooks, as they prove to be a highly beneficial tool to generate reproducible workflows for environmental data analysis.
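The trim and slice operations described above map directly onto the WCS 2.0 GetCoverage KVP syntax, where each repeated "subset" key either trims an axis (a value range) or slices it away (a single value). A small Python sketch, with an illustrative endpoint and coverage id (URL encoding of the subset values is omitted for readability):

```python
# WCS 2.0 KVP allows repeating the "subset" key: a range of values trims an
# axis, a single value slices that axis away entirely. The endpoint and the
# coverage id "GlobalSST" are illustrative; percent-encoding of parentheses
# and quotes is omitted here to keep the URL readable.
def get_coverage_url(endpoint, coverage_id, subsets, fmt):
    kvp = ["service=WCS", "version=2.0.1", "request=GetCoverage",
           f"coverageId={coverage_id}"]
    kvp += [f"subset={s}" for s in subsets]
    kvp.append(f"format={fmt}")
    return endpoint + "?" + "&".join(kvp)

# Trim Lat and Long, slice the time axis to a single instant, get NetCDF:
url = get_coverage_url(
    "https://example.org/wcs", "GlobalSST",
    ['Lat(30,60)', 'Long(-30,10)', 'ansi("2017-06-15T00:00:00Z")'],
    "application/netcdf")
print(url)
```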

  7. A flexible geospatial sensor observation service for diverse sensor data based on Web service

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min

    Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services — Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T, and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the SWE is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
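A data consumer's interaction with such an SOS typically centres on a GetObservation call; the sketch below composes one using the SOS 2.0 KVP binding. The endpoint, offering and observed-property identifiers are illustrative assumptions, not taken from the described framework:

```python
from urllib.parse import urlencode

# Sketch of an OGC SOS 2.0 KVP GetObservation request, the kind of call a
# Sensor Web data consumer would issue against the SOS. The endpoint and
# the offering/property identifiers are illustrative placeholders.
def get_observation_url(endpoint: str) -> str:
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "water_height_gauge_1",
        "observedProperty": "WaterHeight",
        # Restrict to one month of measurements:
        "temporalFilter": "om:phenomenonTime,2008-01-01/2008-01-31",
    }
    return endpoint + "?" + urlencode(params)

url = get_observation_url("https://example.org/sos")
print(url)
```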

  8. Datacube Interoperability, Encoding Independence, and Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, Peter; Hirschorn, Eric; Maso, Joan

    2017-04-01

    Datacubes are commonly accepted as an enabling paradigm which provides a handy abstraction for accessing and analyzing the zillions of image files delivered by the manifold satellite instruments and climate simulations, among others. Additionally, datacubes are the classic model for statistical and OLAP applications, so a further information category can be integrated. From a standards perspective, spatio-temporal datacubes naturally are included in the concept of coverages, which encompass regular and irregular grids, point clouds, and general meshes - or, more abstractly, digital representations of spatio-temporally varying phenomena. ISO 19123, which is identical to OGC Abstract Topic 6, gives a high-level abstract definition, which is complemented by the OGC Coverage Implementation Schema (CIS), an interoperable yet format-independent concretization of the abstract model. Currently, ISO is working on adopting OGC CIS as ISO 19123-2; the existing ISO 19123 standard is under revision by one of the authors of this abstract and will become ISO 19123-1. The roadmap agreed by ISO further foresees adoption of the OGC Web Coverage Service (WCS) as an ISO standard, so that a complete data and service model will exist. In 2016, INSPIRE adopted WCS as its Coverage Download Service, including the datacube analytics language Web Coverage Processing Service (WCPS). The rasdaman technology (www.rasdaman.org) is both the OGC and INSPIRE Reference Implementation. In the global EarthServer initiative, rasdaman database sizes are exceeding 250 TB today, heading for the Petabyte frontier well in 2017. Technically, CIS defines a compact, efficient model for representing multi-dimensional datacubes in several ways. The classical coverage cube defines a domain set (where are values?), a range set (what are these values?), and a range type (what do the values mean?), as well as a "bag" for arbitrary metadata.
With CIS 1.1, coordinate/value pair sequences have been added, as well as tiled representations. Further, CIS 1.1 offers a unified model for any kind of regular and irregular grid, also allowing sensor models as per SensorML. Encodings include ASCII formats like GML, JSON and RDF as well as binary formats like GeoTIFF, NetCDF, JPEG2000 and GRIB2; further, a container concept allows mixed representations within one coverage file utilizing zip or other convenient package formats. Through the tight integration with the Sensor Web Enablement (SWE), a lossless "transport" from the sensor into the coverage world is ensured. The corresponding service model of WCS supports datacube operations ranging from simple data extraction to complex ad-hoc analytics with WCPS. Notably, W3C has set out on a coverage model as well; it has been designed relatively independently from the abovementioned standards, but there is informal agreement to link it into the CIS universe (which allows for different, yet interchangeable representations). Particularly interesting in the W3C proposal is the detailed semantic modeling of metadata; as CIS 1.1 supports RDF, a tight coupling seems feasible.
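The domain set / range set / range type decomposition can be illustrated with a toy 2x2 grid coverage in Python; the field names below are simplified for readability and are not the literal CIS 1.1 schema:

```python
# A toy 2x2 regular-grid coverage, decomposed the way CIS describes it:
# domain set (where are values?), range set (what are the values?),
# range type (what do they mean?), plus a metadata "bag". Field names
# are simplified for illustration, not the literal CIS 1.1 JSON schema.
coverage = {
    "domainSet": {
        "axes": ["Lat", "Long"],
        "origin": [50.0, 10.0],   # grid origin in degrees
        "spacing": [0.5, 0.5],    # cell size along each axis
        "gridSize": [2, 2],
    },
    "rangeSet": [280.1, 280.4, 279.8, 280.0],  # row-major cell values
    "rangeType": {"field": "air_temperature", "uom": "K"},
    "metadata": {"source": "toy example"},
}

# Internal consistency: one value per grid cell.
cells = (coverage["domainSet"]["gridSize"][0]
         * coverage["domainSet"]["gridSize"][1])
assert cells == len(coverage["rangeSet"])
```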

  9. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements of government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services give ASDC data holdings greater exposure to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization and improved geospatial analytical tools for a mission-critical understanding of the earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  10. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to traditional research scientists as well as the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses interoperable web infrastructures, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, while reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages inter-disciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically the Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and on the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.

  11. HyspIRI Low Latency Concept and Benchmarks

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    Topics include HyspIRI low latency data ops concept, HyspIRI data flow, ongoing efforts, experiment with Web Coverage Processing Service (WCPS) approach to injecting new algorithms into SensorWeb, low fidelity HyspIRI IPM testbed, compute cloud testbed, open cloud testbed environment, Global Lambda Integrated Facility (GLIF) and OCC collaboration with Starlight, delay tolerant network (DTN) protocol benchmarking, and EO-1 configuration for preliminary DTN prototype.

  12. A Query Language for Handling Big Observation Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon; Koppe, Roland

    2017-04-01

    The Sensor Web provides a framework for the standardized Web-based sharing of environmental observations and sensor metadata. While the issue of varying data formats and protocols is addressed by these standards, the fast-growing size of observational data is imposing new challenges for their application. Most solutions for handling big observational datasets currently focus on remote sensing applications, while big in-situ datasets relying on vector features still lack a solid approach. Conventional Sensor Web technologies may not be adequate, as the sheer size of the data transmitted and the amount of metadata accumulated may render traditional OGC Sensor Observation Services (SOS) unusable. Besides novel approaches to storing and processing observation data in place, e.g. by harnessing big data technologies from mainstream IT, the access layer has to be amended to utilize and integrate these large observational data archives into applications and to enable analysis. For this, an extension to the SOS will be discussed that establishes a query language to dynamically process and filter observations at the storage level, similar to the OGC Web Coverage Service (WCS) and its Web Coverage Processing Service (WCPS) extension. This will enable applications to request, e.g., spatially or temporally aggregated datasets at a resolution they can display or require. The approach will be developed and implemented in cooperation with the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, whose catalogue of data comprises marine observations of physical, chemical and biological phenomena from a wide variety of sensors, both mobile (research vessels, aircraft or underwater vehicles) and stationary (buoys or research stations). Observations are made with a high temporal resolution, and the resulting time series may span multiple decades.

  13. The EarthServer Federation: State, Role, and Contribution to GEOSS

    NASA Astrophysics Data System (ADS)

    Merticariu, Vlad; Baumann, Peter

    2016-04-01

    The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. With its service interface rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from the open-source OpenLayers, QGIS and NASA WorldWind to the proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.

  14. QBCov: A Linked Data interface for Discrete Global Grid Systems, a new approach to delivering coverage data on the web

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Toyer, S.; Brizhinev, D.; Ledger, M.; Taylor, K.; Purss, M. B. J.

    2016-12-01

    We are witnessing a rapid proliferation of geoscientific and geospatial data from an increasing variety of sensors and sensor networks. This data presents great opportunities to resolve cross-disciplinary problems. However, working with it often requires an understanding of file formats and protocols seldom used outside of scientific computing, potentially limiting the data's value to other disciplines. In this paper, we present a new approach to serving satellite coverage data on the web, which improves ease-of-access using the principles of linked data. Linked data adapts the concepts and protocols of the human-readable web to machine-readable data; the number of developers familiar with web technologies makes linked data a natural choice for bringing coverages to a wider audience. Our approach to using linked data also makes it possible to efficiently service high-level SPARQL queries: for example, "Retrieve all Landsat ETM+ observations of San Francisco between July and August 2016" can easily be encoded in a single query. We validate the new approach, which we call QBCov, with a reference implementation of the entire stack, including a simple web-based client for interacting with Landsat observations. In addition to demonstrating the utility of linked data for publishing coverages, we investigate the heretofore unexplored relationship between Discrete Global Grid Systems (DGGS) and linked data. Our conclusions are informed by the aforementioned reference implementation of QBCov, which is backed by a hierarchical file format designed around the rHEALPix DGGS. Not only does the choice of a DGGS-based representation provide an efficient mechanism for accessing large coverages at multiple scales, but the ability of DGGS to produce persistent, unique identifiers for spatial regions is especially valuable in a linked data context. This suggests that DGGS has an important role to play in creating sustainable and scalable linked data infrastructures. 
QBCov is being developed as a contribution to the Spatial Data on the Web working group--a joint activity of the Open Geospatial Consortium and World Wide Web Consortium.
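The example query from the abstract might be encoded roughly as follows; the prefixes and predicate names are hypothetical placeholders, since QBCov's actual vocabulary is defined by its reference implementation:

```python
# "Retrieve all Landsat ETM+ observations of San Francisco between July
# and August 2016", sketched as SPARQL. The ex: prefix and its predicates
# are invented for illustration and are not QBCov's real vocabulary.
query = """
PREFIX qb:  <http://purl.org/linked-data/cube#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX ex:  <http://example.org/qbcov#>

SELECT ?obs WHERE {
  ?obs a qb:Observation ;
       ex:sensor   "Landsat ETM+" ;
       ex:covers   ex:SanFrancisco ;
       ex:dateTime ?t .
  FILTER (?t >= "2016-07-01T00:00:00Z"^^xsd:dateTime &&
          ?t <  "2016-09-01T00:00:00Z"^^xsd:dateTime)
}
"""
print(query)
```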

  15. Cross-Dataset Analysis and Visualization Driven by Expressive Web Services

    NASA Astrophysics Data System (ADS)

    Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad

    2015-04-01

    The deluge of data hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: the data no longer needs to be downloaded and stored in advance; instead, services interact in real-time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data, and moving the processing closer to the data. In this context we have created a visualization application for the analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources with different spatial and temporal resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the potential of the OGC Web Coverage Service, the most exciting aspect being its processing extension, the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technologies made the application versatile and portable.
As the processing is done on the server side, we ensured that a minimal amount of data is transferred and that the processing runs on a fully capable server, leaving the client hardware resources free for rendering the visualization. The application offers a set of features to visualize and cross-compare the datasets. Users can select a region of interest in space and time on which an aerosol map layer is plotted. Hovmoeller time-latitude and time-longitude profiles can be displayed by selecting orthogonal cross-sections on the globe. Statistics about the selected dataset are also displayed in different text and plot formats. The datasets can also be cross-compared using either the delta map tool or the merged map tool. For more advanced users, a WCPS query console is offered, allowing users to process their data with ad-hoc queries and then choose how to display the results. Overall, the user has a rich set of tools to visualize and cross-compare the aerosol datasets. With our application we have shown how the NASA WorldWind framework can be used to display results processed efficiently - and entirely - on the server side using the expressiveness of the OGC WCPS web service. The application serves not only as a proof of concept of a new paradigm in working with large geospatial data but also as a useful tool for environmental data analysts.
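A delta-map comparison of the kind described above can be expressed as a single WCPS query that subtracts two coverages cell by cell and encodes the result as an image; the coverage names and region of interest below are illustrative, not the application's actual datasets:

```python
# Sketch of a "delta map" as a WCPS query: subtract two aerosol optical
# thickness coverages over the same region and return the difference as a
# PNG. Coverage names "AOT_SensorA"/"AOT_SensorB" and the Lat/Long bounds
# are illustrative placeholders.
roi = "Lat(30:60), Long(-20:40)"
delta_query = (
    f'for $a in (AOT_SensorA), $b in (AOT_SensorB) '
    f'return encode($a[{roi}] - $b[{roi}], "png")'
)
print(delta_query)
```

The subtraction runs entirely on the server; only the encoded difference image travels to the client.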

  16. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve.
An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
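Conceptually, a PES spatial query submits a user-defined polygon and a target year; the Python sketch below shows the shape of such a call. The endpoint path and parameter names are hypothetical placeholders, not the documented PES v2 interface:

```python
import json
from urllib.parse import urlencode

# Sketch of a Population Estimation Service spatial query: send a
# user-defined GeoJSON polygon and a year, get back a population estimate.
# The "/population/estimate" path and the "polygon"/"year" parameter names
# are hypothetical; consult the PES v2 documentation for the real API.
def pes_query_url(base: str, polygon_geojson: dict, year: int) -> str:
    params = {"polygon": json.dumps(polygon_geojson), "year": year}
    return f"{base}/population/estimate?{urlencode(params)}"

# A rough rectangle around New York City:
square = {"type": "Polygon",
          "coordinates": [[[-74.1, 40.6], [-73.8, 40.6],
                           [-73.8, 40.9], [-74.1, 40.9], [-74.1, 40.6]]]}
print(pes_query_url("https://example.org/pes", square, 2015))
```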

  17. OneGeology Web Services and Portal as a global geological SDI - latest standards and technology

    NASA Astrophysics Data System (ADS)

    Duffy, Tim; Tellez-Arenas, Agnes

    2014-05-01

    The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Map Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML v3.2 geological web data exchange language standard (http://www.geosciml.org/) with its 30+ associated IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI SimpleLithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation in queries to their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal v2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally at 1:1,000,000 scale, geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standard based WMS (Web Map Service) from an available WWW server. This may be hosted either within the Geological Survey or at a neighbouring, regional or other institution that offers to serve the data for them, i.e. offers to help technically by providing the web serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together, and it is now possible for European Geological Surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf).
The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
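
Serving data as an OGC WMS, as the contributing surveys do, means answering simple keyword-value HTTP requests. A minimal sketch of constructing a WMS 1.3.0 GetMap request in Python follows; the endpoint, layer name and bounding box are hypothetical placeholders, not real OneGeology services:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and layer; a real service advertises its layers
# in its GetCapabilities response.
endpoint = "https://example.org/geology/wms"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",            # version used by the accreditation scheme
    "REQUEST": "GetMap",
    "LAYERS": "GBR_Bedrock_Lithology",  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "49.8,-8.2,60.9,1.8",  # WMS 1.3.0 uses lat,lon axis order for EPSG:4326
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

getmap_url = endpoint + "?" + urlencode(params)
```

A GET on `getmap_url` would return a rendered map image that a portal such as portal.onegeology.org can overlay with other services.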

  18. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.

  19. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products and (4) enables automated delivery of the data products to the users' desktops. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.

  20. Operational Interoperable Web Coverage Service for Earth Observing Satellite Data: Issues and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Yang, W.; Min, M.; Bai, Y.; Lynnes, C.; Holloway, D.; Enloe, Y.; di, L.

    2008-12-01

    In the past few years, there has been growing interest, among major earth observing satellite (EOS) data providers, in serving data through the interoperable Web Coverage Service (WCS) interface protocol, developed by the Open Geospatial Consortium (OGC). The interface protocol defined in the WCS specifications allows client software to make customized requests for multi-dimensional EOS data, including spatial and temporal subsetting, resampling and interpolation, and coordinate reference system (CRS) transformation. A WCS server describes an offered coverage, i.e., a data product, through a response to a client's DescribeCoverage request. The description includes the offered coverage's spatial/temporal extents and resolutions, supported CRSs, supported interpolation methods, and supported encoding formats. Based on such information, a client can request the entire coverage, or a subset of it, at any spatial/temporal resolution and in any one of the supported CRSs, formats, and interpolation methods. When implementing a WCS server, a data provider has different approaches to present its data holdings to clients. One of the most straightforward, and commonly used, approaches is to offer individual physical data files as separate coverages. Such an implementation, however, will result in too many offered coverages for large data holdings, and it also cannot fully present the relationships among different, but spatially and/or temporally associated, data files. It is desirable to disconnect offered coverages from physical data files so that the former are more coherent, especially in the spatial and temporal domains. Therefore, some servers offer one single coverage for a set of spatially coregistered time series data files, such as a daily global precipitation coverage linked to many global single-day precipitation files; others offer one single coverage for multiple temporally coregistered files that together form a large spatial extent.
In either case, a server needs to assemble an output coverage in real time by combining a potentially large number of physical files, which can be operationally difficult. The task becomes more challenging if an offered coverage involves spatially and temporally un-registered physical files. In this presentation, we will discuss issues and lessons learned in providing NASA's AIRS Level 2 atmospheric products, which are in satellite swath CRS and in 6-minute segment granule files, as virtual global coverages. We'll discuss the WCS server's on-the-fly georectification, mosaicking, quality screening, performance, and scalability.
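
The DescribeCoverage/GetCoverage interaction described above can be sketched as keyword-value requests. This is a hedged illustration: the endpoint and coverage name are hypothetical, and real WCS servers may require slightly different parameters depending on the specification version they implement:

```python
from urllib.parse import urlencode

endpoint = "https://example.nasa.gov/wcs"   # hypothetical server
coverage = "AIRS_L2_VirtualGlobal"          # hypothetical offered coverage

# Ask the server to describe the coverage's extents, CRSs, formats, etc.
describe_url = endpoint + "?" + urlencode({
    "SERVICE": "WCS", "VERSION": "1.0.0",
    "REQUEST": "DescribeCoverage",
    "COVERAGE": coverage,
})

# Request a spatial/temporal subset, resampled to a 0.25-degree grid.
getcoverage_url = endpoint + "?" + urlencode({
    "SERVICE": "WCS", "VERSION": "1.0.0",
    "REQUEST": "GetCoverage",
    "COVERAGE": coverage,
    "CRS": "EPSG:4326",
    "BBOX": "-120,30,-100,45",   # lon/lat subset
    "TIME": "2008-07-01",        # one day's worth of 6-minute granules
    "RESX": "0.25", "RESY": "0.25",
    "FORMAT": "GeoTIFF",
})
```

Answering the GetCoverage request is where the server must mosaic and georectify the underlying swath granules on the fly.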

  1. Mobile cloud-computing-based healthcare service by noncontact ECG monitoring.

    PubMed

    Fong, Ee-May; Chung, Wan-Young

    2013-12-02

    The noncontact electrocardiogram (ECG) measurement technique has gained popularity in recent years owing to its noninvasive nature and convenience in daily life. This paper presents mobile cloud computing for a healthcare system where a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (Web server system and Web server dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service.

  2. Mobile Cloud-Computing-Based Healthcare Service by Noncontact ECG Monitoring

    PubMed Central

    Fong, Ee-May; Chung, Wan-Young

    2013-01-01

    The noncontact electrocardiogram (ECG) measurement technique has gained popularity in recent years owing to its noninvasive nature and convenience in daily life. This paper presents mobile cloud computing for a healthcare system where a noncontact ECG measurement method is employed to capture biomedical signals from users. Healthcare service is provided to continuously collect biomedical signals from multiple locations. To observe and analyze the ECG signals in real time, a mobile device is used as a mobile monitoring terminal. In addition, a personalized healthcare assistant is installed on the mobile device; several healthcare features such as health status summaries, medication QR code scanning, and reminders are integrated into the mobile application. Health data are synchronized to the healthcare cloud computing service (Web server system and Web server dataset) to ensure seamless healthcare monitoring anytime and anywhere a network connection is available. Together with a Web page application, medical data are easily accessed by medical professionals or family members. Web page performance evaluation was conducted to ensure minimal Web server latency. The system demonstrates better availability of off-site and up-to-the-minute patient data, which can help detect health problems early and keep elderly patients out of the emergency room, thus providing a better and more comprehensive healthcare cloud computing service. PMID:24316562

  3. MALINA: a web service for visual analytics of human gut microbiota whole-genome metagenomic reads.

    PubMed

    Tyakht, Alexander V; Popenko, Anna S; Belenikin, Maxim S; Altukhov, Ilya A; Pavlenko, Alexander V; Kostryukova, Elena S; Selezneva, Oksana V; Larin, Andrei K; Karpova, Irina Y; Alexeev, Dmitry G

    2012-12-07

    MALINA is a web service for bioinformatic analysis of whole-genome metagenomic data obtained from human gut microbiota sequencing. As input data, it accepts metagenomic reads from various sequencing technologies, including long reads (such as Sanger and 454 sequencing) and next-generation reads (including SOLiD and Illumina). To the authors' knowledge, it is the first metagenomic web service capable of processing SOLiD color-space reads. The web service allows phylogenetic and functional profiling of metagenomic samples using the coverage depth resulting from alignment of the reads to the catalogue of reference sequences, which is built into the pipeline and contains prevalent microbial genomes and genes of the human gut microbiota. The obtained metagenomic composition vectors are processed by the statistical analysis and visualization module, which contains methods for clustering, dimension reduction and group comparison. Additionally, the MALINA database includes vectors of bacterial and functional composition for human gut microbiota samples from a large number of existing studies, allowing their comparative analysis together with user samples, namely datasets from the Russian Metagenome project, MetaHIT and the Human Microbiome Project (downloaded from http://hmpdacc.org). MALINA is made freely available on the web at http://malina.metagenome.ru. The website is implemented in JavaScript (using Ext JS), Microsoft .NET Framework, MS SQL and Python, with all major browsers supported.

  4. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability, meeting requirements driven by increasingly diverse government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS) and OGC Web Coverage Services (WCS), leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  5. Flipping Introduction to MIS for a Connected World

    ERIC Educational Resources Information Center

    Law, Wai K.

    2014-01-01

    It has been increasingly challenging to provide an introductory coverage of the rapidly expanding fields in Information Systems (IS). The task has been further complicated by the popularity of web resources and cloud services. A new generation of technically savvy learners, while recognizing the significance of information systems, expects…

  6. Struct2Net: a web service to predict protein–protein interactions using a structure-based approach

    PubMed Central

    Singh, Rohit; Park, Daniel; Xu, Jinbo; Hosur, Raghavendra; Berger, Bonnie

    2010-01-01

    Struct2Net is a web server for predicting interactions between arbitrary protein pairs using a structure-based approach. Prediction of protein–protein interactions (PPIs) is a central area of interest, and successful prediction would provide leads for experiments and drug design; however, the experimental coverage of the PPI interactome remains inadequate. We believe that Struct2Net is the first community-wide resource to provide structure-based PPI predictions that go beyond homology modeling. Also, most web resources for predicting PPIs currently rely on functional genomic data (e.g. GO annotation, gene expression, cellular localization, etc.). Our structure-based approach is independent of such methods and only requires the sequence information of the proteins being queried. The web service allows multiple querying options, aimed at maximizing flexibility. For the most commonly studied organisms (fly, human and yeast), predictions have been pre-computed and can be retrieved almost instantaneously. For proteins from other species, users have the option of getting a quick-but-approximate result (using orthology over pre-computed results) or having a full-blown computation performed. The web service is freely available at http://struct2net.csail.mit.edu. PMID:20513650

  7. Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2012-04-01

    Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is given by coverage data. ISO 19123 defines coverages, simplifying, as a representation of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages Standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) Standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1D in-situ sensor timeseries, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, 4D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are given - however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data based on an integration of W3C XQuery for alphanumeric data and OGC-WCPS for raster data.
Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development which will allow developers to quickly compose bespoke interactive clients, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all Earth Sciences, including satellite and airborne earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift the data providers' semantic level from data stewardship to service stewardship.
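
To give a flavour of the declarative WCPS query language at the heart of EarthServer: the query below subsets a coverage along space and time and reduces it to a mean value entirely server-side. The coverage name is hypothetical and the axis names follow common rasdaman conventions, so this is a sketch rather than a query against any real endpoint:

```python
# Hypothetical 3D (x/y/t) coverage; the subsetting and aggregation are
# evaluated server-side, so only a single scalar travels over the network.
wcps_query = (
    'for c in (SeaSurfaceTemperature) '
    'return avg(c[Lat(40:45), Long(-5:0), ansi("2011-06-01":"2011-06-30")])'
)

# The query string would be sent as the "query" parameter of a
# WCPS ProcessCoverages HTTP request.
```

This server-side reduction is exactly what lets clients "directly manipulate and remix" terabyte-scale coverages without downloading them.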

  8. Choosing health care online: a 7-Eleven case study.

    PubMed

    Fuller, Margaret; Beauregard, Cindy

    2003-01-01

    This article describes 7-Eleven's success in offering Web-based health care enrollment to its diverse workforce, which made the introduction of such a service delivery strategy unusually challenging. Through its efforts, 7-Eleven was able to meet several important objectives, including helping employees better appreciate the value of their benefits, providing employees with increased services and convenience, and encouraging employees to make more cost-effective choices in their health care coverage.

  9. [Utilization and coverage of a Food and Nutritional Surveillance System in Rio Grande do Sul state, Brazil].

    PubMed

    Jung, Natália Miranda; Bairros, Fernanda de Souza; Neutzling, Marilda Borges

    2014-05-01

    This article seeks to describe the utilization and coverage percentage of the Nutritional and Food Surveillance System (SISVAN-Web) in the Regional Health Offices of Rio Grande do Sul in 2010 and to assess its correlation with socio-economic, demographic and health system organization variables at the time. It is an ecological study that used secondary data from the SISVAN-Web, the Department of Primary Health Care, the IT Department of the Unified Health System and the Brazilian Institute of Geography and Statistics. The evaluation of utilization and coverage data was restricted to nutritional status. The percentage of utilization of SISVAN-Web refers to the number of cities that fed the system. Total coverage was defined as the percentage of individuals in all stages of the life cycle monitored by SISVAN-Web. It was found that 324 cities fed the application, corresponding to a utilization percentage of 65.3%. Greater system coverage was observed in all Regional Health Coordination (RHC) Units for ages 0 to 5 years and 5-10 years. There was a significant association between the percentage of utilization of SISVAN-Web and Family Health Strategy coverage in each RHC Unit. The results of this study indicated low percentages of utilization and coverage of SISVAN-Web in Rio Grande do Sul.

  10. A National Crop Progress Monitoring System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.

    2011-12-01

    Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed using the data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratio changes relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with the NASS remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To facilitate the operational requirement and increase the accessibility of data and products by different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of Open Geospatial Consortium Inc. Sensor observations and data are accessed through Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) if available. Products are also served through such open-specification-compliant services. For rendering and presentation, Web Map Service (WMS) is used. A Web-service based system is set up and deployed at dss.csiss.gmu.edu/NDVIDownload.
Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
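
The vegetation indices named above follow from surface reflectance by simple arithmetic. A minimal sketch (the reflectance and min/max values are illustrative, not MODIS data):

```python
def ndvi(red, nir):
    # Normalized difference vegetation index from red and near-infrared
    # surface reflectance (the bands carried in MOD09).
    return (nir - red) / (nir + red)

def vci(ndvi_now, ndvi_min, ndvi_max):
    # Vegetation Condition Index (percent): current NDVI normalized
    # against its multi-year minimum and maximum for the same pixel/period.
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

n = ndvi(red=0.10, nir=0.40)                    # ≈ 0.6
condition = vci(n, ndvi_min=0.2, ndvi_max=0.8)  # ≈ 66.7
```

A VCI near 100 means vegetation is in its best observed condition for that date; values near 0 indicate stress relative to the multi-year record.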

  11. Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Riechert, Maik; Blower, Jon; Griffiths, Guy

    2016-04-01

    Coverage data, typically big in data volume, assigns values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Factors that are relevant here are the processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways for working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, next to having simple rendered images available using standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. The development also focused on it being a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including reclassification of a Land Cover dataset client-side within the browser with the ability for the user to influence the reclassification result by making use of the above technologies.
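
A minimal CoverageJSON document can be sketched as follows; this is a simplified illustration (the "referencing" section that ties axes to a CRS is omitted, and the parameter and values are made up for the example):

```python
import json

# Simplified CoverageJSON "Coverage": a 2x3 grid of land-cover classes.
coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
            "x": {"values": [-10.0, 0.0, 10.0]},
            "y": {"values": [40.0, 50.0]},
        },
        # a full document also carries a "referencing" section here
    },
    "parameters": {
        "LC": {
            "type": "Parameter",
            "observedProperty": {"label": {"en": "Land cover class"}},
        },
    },
    "ranges": {
        "LC": {
            "type": "NdArray",
            "dataType": "integer",
            "axisNames": ["y", "x"],
            "shape": [2, 3],
            "values": [1, 1, 2, 3, 2, 2],  # row-major over (y, x)
        },
    },
}

doc = json.dumps(coverage)
```

Because the range is a plain JSON array addressed by axis names and shape, a browser client can subset or reclassify it without any binary-format parsing, which is what enables the client-side land-cover reclassification use case.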

  12. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcano eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.

  13. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vincente, Gilbert

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcano eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as images, with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.

  14. Serving Satellite Remote Sensing Data to User Community through the OGC Interoperability Protocols

    NASA Astrophysics Data System (ADS)

    di, L.; Yang, W.; Bai, Y.

    2005-12-01

    Remote sensing is one of the major methods for collecting geospatial data. A huge amount of remote sensing data has been collected by space agencies and private companies around the world. For example, NASA's Earth Observing System (EOS) is generating more than 3 TB of remote sensing data per day. The data collected by EOS are processed, distributed, archived, and managed by the EOS Data and Information System (EOSDIS). Currently, EOSDIS is managing several petabytes of data. All of those data are not only valuable for global change research, but also useful for local and regional applications and decision-making. How to make the data easily accessible to and usable by the user community is one of the key issues for realizing the full potential of these valuable datasets. In the past several years, the Open Geospatial Consortium (OGC) has developed several interoperability protocols aimed at making geospatial data easily accessible to and usable by the user community through the Internet. The protocols particularly relevant to the discovery, access, and integration of multi-source satellite remote sensing data are the Catalog Service for the Web (CS/W) and Web Coverage Service (WCS) specifications. The OGC CS/W specifies the interfaces, HTTP protocol bindings, and a framework for defining application profiles required to publish and access digital catalogues of metadata for geographic data, services, and related resource information. The OGC WCS specification defines the interfaces between web-based clients and servers for accessing on-line multi-dimensional, multi-temporal geospatial coverages in an interoperable way. Based on definitions by OGC and ISO 19123, coverage data include all remote sensing images as well as gridded model outputs.
The Laboratory for Advanced Information Technology and Standards (LAITS), George Mason University, has been working for many years on developing and implementing OGC specifications to better serve NASA Earth science data to the user community. We have developed the NWGISS software package, which implements multiple OGC specifications, including OGC WMS, WCS, CS/W, and WFS. As part of the NASA REASON GeoBrain project, the NWGISS WCS and CS/W servers have been extended to provide operational access to NASA EOS data at data pools through OGC protocols and to make both services chainable in web-service chaining. The extensions to the WCS server include the implementation of WCS 1.0.0 and WCS 1.0.2, and the development of a WSDL description of the WCS services. In order to find the on-line EOS data resources, the CS/W server has been extended at the backend to search metadata in NASA ECHO. This presentation reports on those extensions and discusses lessons learned from the implementation. It also discusses the advantages, disadvantages, and future improvements of the OGC specifications, particularly the WCS.
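
The CS/W discovery step described above is also a keyword-value HTTP interaction. A hedged sketch of a CSW 2.0.2 GetRecords request follows; the endpoint is a hypothetical placeholder, and real catalogues may require additional parameters (such as an output schema) beyond this minimal set:

```python
from urllib.parse import urlencode

endpoint = "https://example.gmu.edu/csw"   # hypothetical CS/W server

# Free-text search of the catalogue for records mentioning MODIS.
getrecords_url = endpoint + "?" + urlencode({
    "SERVICE": "CSW",
    "VERSION": "2.0.2",
    "REQUEST": "GetRecords",
    "TYPENAMES": "csw:Record",
    "RESULTTYPE": "results",
    "ELEMENTSETNAME": "summary",
    "CONSTRAINTLANGUAGE": "CQL_TEXT",
    "CONSTRAINT": "AnyText LIKE '%MODIS%'",
    "MAXRECORDS": "10",
})
```

In a service chain, the URLs of matching coverages returned here would feed directly into subsequent WCS GetCoverage requests.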

  15. EarthServer2: The Marine Data Service - Web based and Programmatic Access to Ocean Colour Open Data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2017-04-01

    The ESA Ocean Colour - Climate Change Initiative (ESA OC-CCI) has produced a long-term, high-quality global dataset with associated per-pixel uncertainty data. This dataset has now grown to several hundred terabytes (uncompressed) and is freely available to download. However, the sheer size of the dataset can act as a barrier to many users; large network bandwidth, local storage and processing requirements can prevent researchers without the backing of a large organisation from taking advantage of this raw data. The EC H2020 project, EarthServer2, aims to create a federated data service providing access to more than 1 petabyte of earth science data. Within this federation the Marine Data Service already provides an innovative on-line tool-kit for filtering, analysing and visualising OC-CCI data. Data are made available, filtered and processed at source through a standards-based interface, the Open Geospatial Consortium Web Coverage Service and Web Coverage Processing Service. This work was initiated in the EC FP7 EarthServer project, where it was found that the unfamiliarity and complexity of these interfaces itself created a barrier to wider uptake. The continuation project, EarthServer2, addresses these issues by providing higher-level tools for working with these data. We will present some examples of these tools. Many researchers wish to extract time series data from discrete points of interest. We will present a web-based interface, based on NASA/ESA WebWorldWind, for selecting points of interest and plotting time series from a chosen dataset. In addition, a CSV file of locations and times, such as a ship's track, can be uploaded and these points extracted and returned in a CSV file, allowing researchers to work with the extract locally, for instance in a spreadsheet. We will also present a set of Python and JavaScript APIs that have been created to complement and extend the web-based GUI. These APIs allow the selection of single points and areas for extraction. 
The extracted data is returned as structured data (for instance a Python array) which can then be passed directly to local processing code. We will highlight how the libraries can be used by the community and integrated into existing systems, for instance by the use of Jupyter notebooks to share Python code examples which can then be used by other researchers as a basis for their own work.
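A point time-series extraction of the kind described above is typically expressed as a WCPS query. The sketch below composes such a query following the OGC WCPS 1.0 abstract syntax; the coverage name and axis labels (Lon, Lat, ansi) are assumptions about the server's coverage model, not actual Marine Data Service identifiers.

```python
def wcps_point_timeseries(coverage, lon, lat, t0, t1):
    """Compose a WCPS query extracting a time series at one point,
    encoded as CSV. Coverage name and axis labels are placeholders."""
    return (
        f'for c in ({coverage}) '
        f'return encode(c[Lon({lon}), Lat({lat}), '
        f'ansi("{t0}":"{t1}")], "csv")'
    )

# Hypothetical coverage name, for illustration only.
q = wcps_point_timeseries("CCI_V2_monthly_chlor_a", -4.15, 50.25,
                          "2005-01-01", "2005-12-31")
```

A client library would POST this query to the WCPS endpoint and parse the returned CSV into a local array; only the extracted values, not the underlying granules, cross the network.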

  16. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. 
The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT), which utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a roughly 10-fold increase in downloads through OGC Web services compared with the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
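The discovery step that makes this possible is the WMS GetCapabilities document, which advertises every layer a server offers. A minimal sketch of a client listing those layers is shown below; the embedded capabilities fragment is illustrative, not an actual ORNL DAAC response.

```python
import xml.etree.ElementTree as ET

# A trimmed, illustrative WMS 1.1.1 Capabilities fragment; a real
# ORNL DAAC response would list many more layers and metadata fields.
CAPS = """<WMT_MS_Capabilities version="1.1.1">
  <Capability>
    <Layer>
      <Title>ORNL DAAC (illustrative)</Title>
      <Layer><Name>biomass</Name><Title>Global Biomass</Title></Layer>
      <Layer><Name>npp</Name><Title>Net Primary Productivity</Title></Layer>
    </Layer>
  </Capability>
</WMT_MS_Capabilities>"""

def list_layers(caps_xml):
    """Return the named layers advertised in a WMS Capabilities document."""
    root = ET.fromstring(caps_xml)
    return [el.findtext("Name") for el in root.iter("Layer")
            if el.findtext("Name")]

layers = list_layers(CAPS)
```

Registering such capabilities documents with search catalogues is what lets tools like SDAT, or any generic OGC client, find and render the data sets without knowing the archive's internal layout.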

  17. Immunization, urbanization and slums - a systematic review of factors and interventions.

    PubMed

    Crocker-Buque, Tim; Mindra, Godwin; Duncan, Richard; Mounier-Jack, Sandra

    2017-06-08

    In 2014, over half (54%) of the world's population lived in urban areas, and this proportion is projected to increase to 66% by 2050. This urbanizing trend has been accompanied by an increasing number of people living in urban poor communities and slums. Lower immunization coverage is found among poorer urban dwellers in many contexts. This study aims to identify factors associated with immunization coverage in poor urban areas and slums, and to identify interventions to improve coverage. We conducted a systematic review, searching Medline, Embase, Global Health, CINAHL, Web of Science and The Cochrane Database with broad search terms for studies published between 2000 and 2016. Of 4872 unique articles, 327 abstracts were screened, leading to 63 included studies: 44 considering factors and 20 evaluating interventions (one in both categories) in 16 low- or middle-income countries. A wide range of socio-economic characteristics were associated with coverage in different contexts. Recent rural-urban migration had a universally negative effect. Parents commonly reported lack of awareness of immunization importance and difficulty accessing services as reasons for under-immunization of their children. Physical distance to clinics and aspects of service quality also impacted uptake. We found evidence of effectiveness for interventions involving multiple components, especially if they have been designed with community involvement. Outreach programmes were effective where physical distance was identified as a barrier. Some evidence was found for the effective use of SMS (text) messaging services, community-based education programmes and financial incentives, which warrant further evaluation. No interventions were identified that provided services to migrants from rural areas. Different factors affect immunization coverage in different urban poor and slum contexts. Immunization services should be designed in collaboration with slum-dwelling communities, considering the local context. 
Interventions should be designed and tested to increase immunization in migrants from rural areas.

  18. The quality of online antidepressant drug information: an evaluation of English and Finnish language Web sites.

    PubMed

    Prusti, Marjo; Lehtineva, Susanna; Pohjanoksa-Mäntylä, Marika; Bell, J Simon

    2012-01-01

    The Internet is a frequently used source of drug information, including among people with mental disorders. Online drug information may be narrow in scope, incomplete, and contain errors of omission. To evaluate the quality of online antidepressant drug information in English and Finnish. Forty Web sites were identified using the search terms antidepressants and masennuslääkkeet in English and Finnish, respectively. Included Web sites (14 English, 8 Finnish) were evaluated for aesthetics, interactivity, content coverage, and content correctness using published criteria. All Web sites were assessed using the Date, Author, References, Type, Sponsor (DARTS) and DISCERN quality assessment tools. English and Finnish Web sites had similar aesthetics, content coverage, and content correctness scores. English Web sites were more interactive than Finnish Web sites (P<.05). Overall, adverse drug reactions were covered on 21 of 22 Web sites; however, drug-alcohol interactions were addressed on only 9 of 22 Web sites, and dose was addressed on only 6 of 22 Web sites. Few (2/22 Web sites) provided incorrect information. The DISCERN score was significantly correlated with content coverage (r=0.670, P<.01), content correctness (r=0.663, P<.01), and the DARTS score (r=0.459, P<.05). No Web site provided information about all aspects of antidepressant treatment. Nevertheless, few Web sites provided incorrect information. Both English and Finnish Web sites were similar in terms of aesthetics, content coverage, and content correctness. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy; hide

    2010-01-01

    This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload, and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept, developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user control over the flight software modules associated with an on-orbit sensor, and thus provides a degree of flexibility that does not presently exist. The successful demonstrations to date will be presented, including a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will also be presented, along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails with efforts to develop "cloud computing" methods and standards.

  20. Searching the world wide Web

    PubMed

    Lawrence; Giles

    1998-04-03

    The coverage and recency of the major World Wide Web search engines were analyzed, yielding some surprising results. The coverage of any one engine is significantly limited: no single engine indexes more than about one-third of the "indexable Web," the coverage of the six engines investigated varies by an order of magnitude, and combining the results of the six engines yields about 3.5 times as many documents on average as the results from only one engine. Analysis of the overlap between pairs of engines gives an estimated lower bound on the size of the indexable Web of 320 million pages.
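The overlap-based size estimate rests on a capture-recapture argument: if two engines sampled the Web independently, the total size would be roughly the product of their result counts divided by their overlap (the Lincoln-Petersen estimator). A minimal sketch follows; the counts used are made up for illustration and are not the paper's data.

```python
def capture_recapture_size(n_a, n_b, n_overlap):
    """Lincoln-Petersen estimator: from two independent samples of
    sizes n_a and n_b with n_overlap items in common, estimate the
    total population size as N ~= n_a * n_b / n_overlap."""
    if n_overlap == 0:
        raise ValueError("no overlap: estimate is unbounded")
    return n_a * n_b / n_overlap

# Illustrative counts only: two engines return 600 and 400 matching
# pages for a query set, with 120 pages found by both.
estimate = capture_recapture_size(600, 400, 120)
```

Since real engines do not crawl independently (popular pages are over-represented in both samples), the overlap is inflated and the estimate biased low, which is why the paper reports it as a lower bound.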

  1. Development and formative evaluation of an innovative mHealth intervention for improving coverage of community-based maternal, newborn and child health services in rural areas of India

    PubMed Central

    Modi, Dhiren; Gopalan, Ravi; Shah, Shobha; Venkatraman, Sethuraman; Desai, Gayatri; Desai, Shrey; Shah, Pankaj

    2015-01-01

    Background A new cadre of village-based frontline health workers, called Accredited Social Health Activists (ASHAs), was created in India. However, coverage of selected community-based maternal, newborn and child health (MNCH) services remains low. Objective This article describes the process of development and formative evaluation of a complex mHealth intervention (ImTeCHO) to increase the coverage of proven MNCH services in rural India by improving the performance of ASHAs. Design The Medical Research Council (MRC) framework for developing complex interventions was used. Gaps were identified in the usual care provided by ASHAs, based on a literature search and SEWA Rural's three decades of grassroots experience. The components of the intervention (mHealth strategies) were designed to overcome the gaps in care. The intervention, in the form of the ImTeCHO mobile phone and web application, along with the delivery model, was developed to incorporate these mHealth strategies. The intervention was piloted through 45 ASHAs among 45 villages in Gujarat (population: 45,000) over 7 months in 2013 to assess the acceptability, feasibility, and usefulness of the intervention and to identify barriers to its delivery. Results Inadequate supervision and support to ASHAs were noted as a gap in usual care, resulting in low coverage of selected MNCH services and care received by complicated cases. Therefore, the ImTeCHO application was developed to integrate mHealth strategies in the form of a job aid to ASHAs to assist with scheduling, behavior change communication, diagnosis, and patient management, along with supervision and support of ASHAs. During the pilot, the intervention and its delivery were found to be largely acceptable, feasible, and useful. 
A few changes were made to the intervention and its delivery, including 1) a new helpline for ASHAs, 2) further simplification of processes within the ImTeCHO incentive management system and 3) additional web-based features for enhancing value and supervision of Primary Health Center (PHC) staff. Conclusions The effectiveness of the improved ImTeCHO intervention will be now tested through a cluster randomized trial. PMID:25697233

  2. Exploiting Aura OMI Level 2 Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.

    2014-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for the interpretation of extreme events such as volcanic eruptions and dust storms. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide the data along with 'Images', including accurate pixel-level (Level 2) information, pixel coverage area delineation, and science-team-recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) strives to best support the user community for NASA Earth Science data, including through software-as-a-service (SaaS) approaches. Here, we will present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This new visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls in its backend infrastructure. The service allows users to select data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), define areas of interest and temporal extents, zoom, pan, overlay, slide, and subset and reformat data. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources (such as the Global Imagery Browse Services (GIBS)).

  3. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the rasdaman Raster Data Manager, which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL-style query language. rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well-established standards Web Coverage Service (WCS) and Web Map Service (WMS), as well as the emerging standard Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive-scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, a user who wanted to generate a plot of global average chlorophyll for a complete decade-long time series would only have to download the result instead of terabytes of data. Firstly, we will present a brief overview of the capabilities of rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web-based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users to plot data products easily. 
However, the query language allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser with dynamic updating of the resultant image. To visualise very large datasets Rasdaman has the capability to dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web based GIS interface increasing the number of potential users of the system. Lastly we will show the advances in dynamic web based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points and visualisation of the uncertainty of data alongside the actual data values.
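A band-ratio product of the kind described above is a one-line WCPS expression evaluated server-side. The sketch below composes such a query; the coverage and band names are placeholders, not actual EarthServer identifiers, and the arithmetic follows the normalised-difference pattern common to band-ratio algorithms.

```python
def wcps_band_ratio(coverage, band_a, band_b):
    """Compose a WCPS query computing a per-pixel normalised band
    ratio, (a - b) / (a + b), encoded as PNG for direct display in a
    browser. Coverage and band names are illustrative placeholders."""
    return (
        f'for c in ({coverage}) '
        f'return encode((c.{band_a} - c.{band_b}) / '
        f'(c.{band_a} + c.{band_b}), "png")'
    )

# Hypothetical multi-spectral coverage with 'nir' and 'red' bands.
q = wcps_band_ratio("MERIS_scene", "nir", "red")
```

Because the server evaluates the expression and returns only the rendered image, the browser client can update the resultant picture dynamically as the user changes the band combination.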

  4. Exploring NASA Satellite Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Shen, S.; Zhao, P.; Gerasimov, I. V.; Vollmer, B.; Vicente, G. A.; Pham, L.

    2013-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for the interpretation of extreme events such as volcanic eruptions and dust storms. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by providing satellite data as 'Images' with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We will present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from the Ozone Monitoring Instrument (OMI), or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), defining areas of interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting. The portal interface will connect to the backend services with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources.

  5. JADDS - towards a tailored global atmospheric composition data service for CAMS forecasts and reanalysis

    NASA Astrophysics Data System (ADS)

    Stein, Olaf; Schultz, Martin G.; Rambadt, Michael; Saini, Rajveer; Hoffmann, Lars; Mallmann, Daniel

    2017-04-01

    Global model data of atmospheric composition produced by the Copernicus Atmosphere Monitoring Service (CAMS) has been collected since 2010 at FZ Jülich and serves as boundary conditions for Regional Air Quality (RAQ) modellers world-wide. RAQ models need time-resolved meteorological as well as chemical lateral boundary conditions for their individual model domains. While the meteorological data usually come from well-established global forecast systems, the chemical boundary conditions are not always well defined. In the past, many models used 'climatic' boundary conditions for the tracer concentrations, which can lead to significant concentration biases, particularly for tracers with longer lifetimes, which can be transported over long distances (e.g. over the whole northern hemisphere) by the mean wind. The Copernicus approach utilizes extensive near-realtime assimilation of atmospheric composition data observed from space, which gives additional reliability to the global modelling data and is well received by the RAQ communities. An existing Web Coverage Service (WCS) for sharing these individually tailored model results is currently being re-engineered to make use of a modern, scalable database technology in order to improve performance, enhance flexibility, and allow the operation of catalogue services. The new Jülich Atmospheric Data Distributions Server (JADDS) adheres to the Web Coverage Service (WCS) 2.0 standard as defined by the Open Geospatial Consortium (OGC). This enables user groups to flexibly define the datasets they need by selecting a subset of chemical species or restricting the geographical boundaries or the length of the time series. The data are made available in the form of different catalogues stored locally on our server. In addition, the Jülich OWS Interface (JOIN) provides interoperable web services allowing for easy download and visualization of datasets delivered from WCS servers via the internet. 
We will present the prototype JADDS server and address the major issues identified when relocating large four-dimensional datasets into a RASDAMAN raster array database. So far the RASDAMAN support for data available in netCDF format is limited with respect to metadata related to variables and axes. For community-wide accepted solutions, selected data coverages shall result in downloadable netCDF files including metadata complying with the netCDF CF Metadata Conventions standard (http://cfconventions.org/). This can be achieved by adding custom metadata elements for RASDAMAN bands (model levels) on data ingestion. Furthermore, an optimization strategy for ingestion of several TB of 4D model output data will be outlined.
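The CF-compliance requirement mentioned above largely amounts to attaching a small set of standard attributes to each variable before (or during) ingestion. The sketch below is a minimal pre-ingestion check; CF's exact requirements vary by variable (e.g. `units` plus either `standard_name` or `long_name`), so the trio used here is an illustrative simplification, and the ozone attributes are example values, not actual CAMS metadata.

```python
# Attributes the netCDF CF conventions commonly expect on a data
# variable; treated here as a single required set for simplicity.
REQUIRED_CF_ATTRS = ("units", "standard_name", "long_name")

def missing_cf_attrs(var_attrs):
    """Return the CF attributes a variable's attribute dict lacks."""
    return [a for a in REQUIRED_CF_ATTRS if a not in var_attrs]

# Illustrative variable metadata for a model ozone field.
ozone_attrs = {
    "units": "kg kg-1",
    "standard_name": "mass_fraction_of_ozone_in_air",
}
gaps = missing_cf_attrs(ozone_attrs)
```

In the JADDS setting, such a check would flag which custom metadata elements still need to be added to the rasdaman bands at ingestion so that downloaded netCDF files come out CF-compliant.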

  6. Challenges in Visualizing Satellite Level 2 Atmospheric Data with GIS approach

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Zhao, P.; Pham, L.; Meyer, D. J.

    2017-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide the data along with 'Images', including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, visualizing remotely sensed non-gridded products poses several challenges: (1) space-borne instruments use different geodetics; (2) data are often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data are chunked into granule files, each holding a portion (or all) of a satellite orbit; (4) there is no general rule for resampling or interpolating to a grid; and (5) geophysical retrievals are referenced only to pixel-center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subset and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served in OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.

  7. Challenges in Obtaining and Visualizing Satellite Level 2 Data in GIS

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Zhao, Peisheng; Pham, Long; Meyer, David J.

    2017-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide the data along with Images, including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, visualizing remotely sensed non-gridded products poses several challenges: (1) space-borne instruments use different geodetics; (2) data are often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data are chunked into granule files, each holding a portion (or all) of a satellite orbit; (4) there is no general rule for resampling or interpolating to a grid; and (5) geophysical retrievals are referenced only to pixel-center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subset and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows inclusion of outside data sources served in OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.
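Challenge (4) above, gridding ungridded Level 2 pixels, can be made concrete with a minimal sketch. Simple cell averaging is only one of many possible resampling choices (the abstract notes there is no general rule), and the pixel values below are invented for illustration.

```python
import math

def grid_l2_pixels(pixels, lon0, lat0, cell, ncols, nrows):
    """Average ungridded Level 2 pixels onto a regular lon/lat grid.

    `pixels` is a list of (lon, lat, value) triples; each pixel is
    assigned to the cell containing its center (ignoring pixel shape,
    per challenge (5)), and cell values are the mean of their pixels.
    """
    sums = [[0.0] * ncols for _ in range(nrows)]
    counts = [[0] * ncols for _ in range(nrows)]
    for lon, lat, val in pixels:
        col = math.floor((lon - lon0) / cell)
        row = math.floor((lat - lat0) / cell)
        if 0 <= col < ncols and 0 <= row < nrows:
            sums[row][col] += val
            counts[row][col] += 1
    return [[sums[r][c] / counts[r][c] if counts[r][c] else None
             for c in range(ncols)] for r in range(nrows)]

# Two swath pixels fall in cell (0, 0), one in cell (0, 1).
grid = grid_l2_pixels([(0.2, 0.3, 10.0), (0.4, 0.1, 20.0),
                       (1.5, 0.5, 30.0)], 0.0, 0.0, 1.0, 2, 1)
```

Empty cells come back as None, which is exactly the gap pattern a swath leaves on a regular grid and one reason an on-demand service may prefer to render the native pixels instead.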

  8. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase the Met Office's future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: - WMS requests are made for 256x256 tiles for fixed areas and zoom levels; - a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; - the Invent Weather Map Viewer uses the Google Maps API to request tiles from Edge Servers. (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. 
These are locally rendered as maps or graphs, and combined with the WMS pre-rendered images and text in a FLEX application, to provide sophisticated, user-impact-based views of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database, and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
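The multi-tier tile cache described above keys every 256x256 tile by (zoom, x, y). The standard Web Mercator "slippy map" scheme, the one the Google Maps API uses, computes those indices as follows; the abstract does not describe the Met Office cache's exact keying, so this is the conventional formula rather than their specific implementation.

```python
import math

def tile_indices(lat_deg, lon_deg, zoom):
    """Return the (x, y) index of the 256x256 Web Mercator tile
    containing a point, in the standard slippy-map scheme: at zoom z
    the world is a 2^z by 2^z grid of tiles."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

xy = tile_indices(51.5, 0.0, 10)  # central London at zoom 10
```

Fixing requests to this discrete grid is what makes the caching tiers effective: every viewer asking for the same area at the same zoom hits the same tile URLs, so Akamai's edge servers can serve almost all traffic from cache.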

  9. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    A wealth of water information and tools exists in Europe for river basin management, but fragmentation and a lack of coordination between countries persist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as the OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU funded ICT models, tools, protocols and policy briefs related to water and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant to the OGC SOS 2.0 Hydrology Profile Best Practice). - Modelling floods using a WPS 2.0, WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (SOS/WaterML 2.0, WCS, WFS) and with visualization utilities (WMS). 
The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data is translated into the OGC WaterML 2.0 time series data format and ingested into an SOS 2.0 service. SOS data is visualized in an SOS client that is able to handle time series. The meteorological forecast data, together with the WaterML 2.0 time series and terrain data, is input to a flood modelling algorithm, under the supervision of an operator manipulating the WPS user interface. The WPS is able to produce flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored by an emergency control response service. Acronyms: AS: Alert Service; ES: Event Service; ICT: Information and Communication Technology; NS: Notification Service; OGC: Open Geospatial Consortium; RIBASE: River Basin Standards Interoperability Pilot; SOS: Sensor Observation Service; WaterML: Water Markup Language; WCS: Web Coverage Service; WMS: Web Map Service; WPS: Web Processing Service
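The flood coverages produced by the WPS are offered through WCS 2.0. As a rough sketch of what a client request for such a coverage might look like, the following builds a WCS 2.0.1 GetCoverage KVP URL with spatial trimming via `subset` parameters; the endpoint and coverage identifier are hypothetical, not taken from the pilot:

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(endpoint, coverage_id, lat=None, lon=None,
                        fmt="image/tiff"):
    """Build a WCS 2.0.1 GetCoverage KVP request with optional trims.

    lat and lon are (low, high) tuples in the coverage's CRS; each one
    becomes a subset= key-value pair, so params is a list of pairs
    (a dict could not repeat the 'subset' key).
    """
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("format", fmt),
    ]
    if lat is not None:
        params.append(("subset", f"Lat({lat[0]},{lat[1]})"))
    if lon is not None:
        params.append(("subset", f"Long({lon[0]},{lon[1]})"))
    return endpoint + "?" + urlencode(params)
```

A WMS 1.3 GetMap request against the same coverage would differ only in the parameter set, which is what lets one output feed both download and visualization clients.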

  10. Perceived stress and life satisfaction: social network service use as a moderator.

    PubMed

    Niu, Qikun; Liu, Yihao; Sheng, Zitong; He, Yue; Shao, Xiaolin

    2011-01-01

    Social Network Service (SNS) has become a buzzword in recent media coverage with the development of the second generation of Web-based communities. In China, SNS plays an increasingly important role in its users' daily lives, especially among students. With a sample of 471 college students, we tested the direct relationship between perceived stress and life satisfaction using regression analysis. Moreover, we found that SNS use could buffer the negative effect of perceived stress on life satisfaction. This study has practical implications for Internet users' SNS use.
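The buffering (moderation) effect reported above is conventionally tested with a stress x SNS-use interaction term in the regression. A small illustrative sketch of the resulting simple-slope arithmetic, with made-up coefficients rather than values from the study:

```python
def simple_slope(b_stress, b_interaction, sns_level):
    """Slope of life satisfaction on perceived stress at one SNS level.

    In a moderated regression
        satisfaction = b0 + b1*stress + b2*sns + b3*stress*sns + e,
    the conditional effect of stress at SNS level s is b1 + b3*s.
    A positive interaction b3 weakens (buffers) a negative main
    effect b1 as SNS use increases.
    """
    return b_stress + b_interaction * sns_level

# Hypothetical coefficients, for illustration only:
# at low SNS use the stress slope is strongly negative,
# at higher SNS use it is attenuated.
low_use_slope = simple_slope(-0.5, 0.1, 0)
high_use_slope = simple_slope(-0.5, 0.1, 3)
```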

  11. A Framework for Integrating Oceanographic Data Repositories

    NASA Astrophysics Data System (ADS)

    Rozell, E.; Maffei, A. R.; Beaulieu, S. E.; Fox, P. A.

    2010-12-01

    Oceanographic research covers a broad range of science domains and requires a tremendous amount of cross-disciplinary collaboration. Advances in cyberinfrastructure are making it easier to share data across disciplines through the use of web services and community vocabularies. Best practices in the design of web services and vocabularies to support interoperability amongst science data repositories are only starting to emerge. Strategic design decisions in these areas are crucial to the creation of end-user data and application integration tools. We present S2S, a novel framework for deploying customizable user interfaces to support the search and analysis of data from multiple repositories. Our research methods follow the Semantic Web methodology and technology development process developed by Fox et al. This methodology stresses the importance of close scientist-technologist interactions when developing scientific use cases, keeping the project well scoped and ensuring the result meets a real scientific need. The S2S framework motivates the development of standardized web services with well-described parameters, as well as the integration of existing web services and applications in the search and analysis of data. S2S also encourages the use and development of community vocabularies and ontologies to support federated search and reduce the amount of domain expertise required in the data discovery process. S2S utilizes the Web Ontology Language (OWL) to describe the components of the framework, including web service parameters, and OpenSearch as a standard description for web services, particularly search services for oceanographic data repositories. We have created search services for an oceanographic metadata database, a large set of quality-controlled ocean profile measurements, and a biogeographic search service. 
S2S provides an application programming interface (API) that can be used to generate custom user interfaces, supporting data and application integration across these repositories and other web resources. Although initially targeted towards a general oceanographic audience, the S2S framework shows promise in many science domains, inspired in part by the broad disciplinary coverage of oceanography. This presentation will cover the challenges addressed by the S2S framework, the research methods used in its development, and the resulting architecture for the system. It will demonstrate how S2S is remarkably extensible, and can be generalized to many science domains. Given these characteristics, the framework can simplify the process of data discovery and analysis for the end user, and can help to shift the responsibility of search interface development away from data managers.
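S2S relies on OpenSearch descriptions, whose URL templates mark required parameters as {name} and optional ones with a trailing question mark, {name?}. A minimal substitution sketch, assuming a hypothetical search endpoint (namespaced parameters such as geo:box are omitted for brevity):

```python
import re

def fill_opensearch_template(template, **values):
    """Substitute parameters into an OpenSearch URL template.

    Required parameters look like {searchTerms}; optional ones carry
    a trailing '?', e.g. {startIndex?}. A missing optional parameter
    is replaced with an empty string; a missing required one is an
    error, mirroring the OpenSearch template rules.
    """
    def repl(match):
        name, optional = match.group(1), match.group(2) == "?"
        if name in values:
            return str(values[name])
        if optional:
            return ""
        raise KeyError(f"missing required OpenSearch parameter: {name}")
    return re.sub(r"\{(\w+)(\??)\}", repl, template)
```

Because every repository advertises the same template grammar, a single client like S2S can drive search services it has never seen before.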

  12. Internet Hospitals in China: Cross-Sectional Survey

    PubMed Central

    Lin, Lingyan; Fan, Si; Lin, Fen; Wang, Long; Guo, Tongjun; Ma, Chuyang; Zhang, Jingkun; Chen, Yixin

    2017-01-01

    Background The Internet hospital, an innovative approach to providing health care, is rapidly developing in China because it has the potential to provide widely accessible outpatient service delivery via Internet technologies. To date, China’s Internet hospitals have not been systematically investigated. Objective The aim of this study was to describe the characteristics of China’s Internet hospitals, and to assess their health service capacity. Methods We searched Baidu, the popular Chinese search engine, to identify Internet hospitals, using search terms such as “Internet hospital,” “web hospital,” or “cloud hospital.” All Internet hospitals in mainland China were eligible for inclusion if they were officially registered. Our search was carried out until March 31, 2017. Results We identified 68 Internet hospitals, of which 43 have been put into use and 25 were under construction. Of the 43 established Internet hospitals, 13 (30%) were in the hospital informatization stage, 24 (56%) were in the Web ward stage, and 6 (14%) were in full Internet hospital stage. Patients accessed outpatient service delivery via website (74%, 32/43), app (42%, 18/43), or offline medical consultation facility (37%, 16/43) from the Internet hospital. Furthermore, 25 (58%) of the Internet hospitals asked doctors to deliver health services at a specific Web clinic, whereas 18 (42%) did not. The consulting methods included video chat (60%, 26/43), telephone (19%, 8/43), and graphic message (28%, 12/43); 13 (30%) Internet hospitals could no longer be consulted online. Only 6 Internet hospitals were included in the coverage of health insurance. The median number of doctors available online was zero (interquartile range [IQR] 0 to 5; max 16,492). The median consultation fee per time was ¥20 (approximately US $2.90, IQR ¥0 to ¥200). Conclusions Internet hospitals provide convenient outpatient service delivery. 
However, many of the Internet hospitals are not yet mature and are faced with various issues such as online doctor scarcity and the unavailability of health insurance coverage. China’s Internet hospitals are heading in the right direction to improve provision of health services, but much more remains to be done. PMID:28676472

  13. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    NASA Astrophysics Data System (ADS)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

    One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, Use Cases and best practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface based on ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the global climate model results of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. 
The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiments (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could be later integrated into the European Copernicus platform.
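The advanced search interface sits on top of the ESGF federation's faceted search service. As a hedged sketch of how a faceted query might be assembled, the following builds a query URL in the style of the ESGF search API; the node URL and facet values are illustrative assumptions, not endpoints used by climate4impact:

```python
from urllib.parse import urlencode

def esgf_search_url(node, **facets):
    """Build a faceted search query against an ESGF-style index node.

    Facet names such as project, variable and experiment follow the
    ESGF search API convention; everything concrete here (node URL,
    facet values, result limit) is an illustrative assumption.
    """
    params = {"format": "application/solr+json", "limit": 10}
    params.update(facets)
    return node.rstrip("/") + "/esg-search/search?" + urlencode(params)
```

Usage might look like `esgf_search_url("https://esgf-node.example.org", project="CMIP5", variable="tas")`, narrowing the distributed archive to one variable of one project.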

  14. Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices

    NASA Astrophysics Data System (ADS)

    Kröpfl, M.; Buchmüller, D.; Leberl, F.

    2012-07-01

    Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite System (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we will use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and client software consuming the web service through an API. The application allows an insurance claim specialist to add claims to a cloud-based database including a claim location. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called in order to try and match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split between two developers.
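The proximity query at the heart of the field agent's workflow reduces to a great-circle distance filter. A minimal sketch, assuming claims are plain records with hypothetical id/lat/lon fields rather than the paper's actual Azure schema:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def claims_nearby(claims, lat, lon, radius_km):
    """Return claims within radius_km of the agent, nearest first.

    Each claim is a dict with 'id', 'lat' and 'lon' keys, standing in
    for rows in the cloud-hosted claims database.
    """
    hits = [(haversine_km(lat, lon, c["lat"], c["lon"]), c) for c in claims]
    return [c for d, c in sorted(hits, key=lambda t: t[0]) if d <= radius_km]
```

In practice a cloud database would push this filter server-side with a spatial index; the brute-force scan here just makes the geometry explicit.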

  15. Interoperable Data Access Services for NOAA IOOS

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2008-12-01

    The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
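One of the services named above, OpenDAP, lets clients request hyperslabs of a remote dataset through a constraint expression appended to the dataset URL. A small sketch of building such a URL (the dataset path and variable name are hypothetical, and the inclusive-bound index convention follows DAP2):

```python
def dap_subset_url(dataset_url, variable, *slices):
    """Build an OPeNDAP constraint expression for a hyperslab request.

    Each slice is a (start, stop) or (start, stride, stop) tuple of
    index bounds (inclusive, per the DAP2 convention), appended to the
    dataset URL as variable[start:stride:stop] selectors, one per
    array dimension.
    """
    parts = []
    for s in slices:
        if len(s) == 2:
            start, stop = s
            stride = 1
        else:
            start, stride, stop = s
        parts.append(f"[{start}:{stride}:{stop}]")
    return f"{dataset_url}.dods?{variable}" + "".join(parts)
```

A client such as a netCDF library would issue this URL and receive only the requested sub-array, which is what makes server-side aggregation at NDBC or CO-OPS practical for large holdings.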

  16. Are You Covered? Associations Between Patient Protection and Affordable Care Act Knowledge and Preventive Reproductive Service Use.

    PubMed

    Sawyer, Ashlee N; Kwitowski, Melissa A; Benotsch, Eric G

    2018-05-01

    Sexual and reproductive health conditions (eg, infections, cancers) represent public health concerns for American women. The present study examined how knowledge of the Patient Protection and Affordable Care Act (PPACA) relates to receipt of preventive reproductive health services among women. Cross-sectional online survey. Online questionnaires were completed via Amazon Mechanical Turk, a crowdsourcing website where individuals complete web-based tasks for compensation. Cisgendered women aged 18 to 44 years (N = 1083) from across the United States. Participants completed online questionnaires assessing demographics, insurance status, preventive service use, and knowledge of PPACA provisions. Chi-square tests showed that receipt of well-woman, pelvic, and breast examinations, as well as Pap smears, was related to insurance coverage, with those not having coverage at all during the previous year having significantly lower rates of use. Hierarchical logistic regressions determined the independent relationship between PPACA knowledge and use of health services after controlling for demographic factors and insurance status. Knowledge of PPACA provisions was associated with receiving well-woman, pelvic, and breast examinations, human papillomavirus vaccination, and sexually transmitted infections testing, after controlling for these factors. Results indicate that expanding knowledge about health-care legislation may be beneficial in increasing preventive reproductive health service use among women. Current findings provide support for increasing resources for outreach and education of the general population about the provisions and benefits of health-care legislation, as well as personal health coverage plans.

  17. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of and access to datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualized as an open access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). 
The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.

  18. BingEO: Enable Distributed Earth Observation Data for Environmental Research

    NASA Astrophysics Data System (ADS)

    Wu, H.; Yang, C.; Xu, Y.

    2010-12-01

    Our planet is facing great environmental challenges including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate various geospatial phenomena to support critical decision making. These models not only challenge our computing technology, but also challenge us to supply huge volumes of earth observation data. Through various policies and programs, open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing and access to these resources call for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences including environmental research. Based on Microsoft Bing Search Engine and Bing Map, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Map to: 1) Use Bing Search to discover Web Map Services (WMS) resources available over the internet; 2) Develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) Allow users to manually register data services; 4) Provide a Bing Maps-based Web application to visualize the data on a high-quality and easy-to-manipulate map platform and enable users to select the best data layers online. 
Given the amount of observation data accumulated already and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently and economically in earth science applications.
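A registry that monitors WMS resources typically polls each service's GetCapabilities document and checks which layers it advertises. A minimal sketch of both halves, with a hypothetical endpoint (the XML namespace is the standard WMS 1.3.0 one):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

def getcapabilities_url(endpoint):
    """GetCapabilities request a registry could poll to probe a WMS."""
    return endpoint + "?" + urlencode(
        {"SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.3.0"})

def layer_names(capabilities_xml):
    """Extract the <Name> of every <Layer> in a WMS 1.3.0 capabilities
    document, i.e. the layers a client may request via GetMap."""
    ns = {"wms": "http://www.opengis.net/wms"}
    root = ET.fromstring(capabilities_xml)
    return [n.text for n in root.findall(".//wms:Layer/wms:Name", ns)]
```

A monitoring loop would fetch the URL periodically, record response time and parse success, and mark services whose layer list disappears as degraded.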

  19. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increases access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long-tail of science" is the geoscience researchers that work separate from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster et al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster et al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into Cloud Computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. 
Several recent examples show progress in applying SaaS to geosciences: - NASA's Earth Data Coherent Web has a goal to improve science user experience using Web Services (e.g. W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW]. - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS). - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web service based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS. [UAF] Tools to access SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What's required is interoperability using web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV, LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in the past year toward offering web services for the long-tail of scientists, more needs to be done. Web services offer data access but more than access is needed for inter-use of data, e.g. defining data schemas that allow for data fusion, addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, semantic web, will address this large and currently underserved user community.

  20. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can ease the management of such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.
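The subsetting challenge named above comes down to this: without a seekable file, a coverage server must translate array indices into HTTP byte-range reads against the object store. A minimal sketch of that translation for a row-major 2D array (the layout and offset are illustrative assumptions, not a description of any NASA format):

```python
def row_byte_ranges(shape, itemsize, row_slice, col_slice, offset=0):
    """HTTP Range spans for a rectangular subset of a row-major 2D array.

    A coverage server fronting web object storage cannot seek in a
    local file; instead it can issue one ranged GET per requested row
    segment. shape is (nrows, ncols); row_slice and col_slice are
    (start, stop) half-open index pairs; offset is the byte position
    where the array begins inside the object. Returns a list of
    (first_byte, last_byte) inclusive pairs, as used in an HTTP
    'Range: bytes=first-last' header.
    """
    nrows, ncols = shape
    r0, r1 = row_slice
    c0, c1 = col_slice
    ranges = []
    for r in range(r0, r1):
        start = offset + (r * ncols + c0) * itemsize
        end = offset + (r * ncols + c1) * itemsize - 1  # inclusive bound
        ranges.append((start, end))
    return ranges
```

One ranged GET per row segment is the naive strategy; real servers coalesce adjacent ranges or re-chunk the data, which is one motivation for the "reorganizing data to be more analysis-friendly" opportunity above.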

  1. KML Super Overlay to WMS Translator

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
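The super overlay subdivision described above is a quadtree: each tile's area splits into four children that Google Earth loads only when their KML Region occupies enough screen pixels. A simplified sketch of the two core pieces (the minLodPixels threshold is an illustrative value, not the translator's actual setting):

```python
def split_bbox(north, south, east, west):
    """Split a lat/lon box into the four child quadrants of a
    super overlay, returned as (north, south, east, west) tuples
    in NW, NE, SW, SE order."""
    mid_lat = (north + south) / 2.0
    mid_lon = (east + west) / 2.0
    return [
        (north, mid_lat, mid_lon, west),   # NW
        (north, mid_lat, east, mid_lon),   # NE
        (mid_lat, south, mid_lon, west),   # SW
        (mid_lat, south, east, mid_lon),   # SE
    ]

def region_kml(north, south, east, west, min_lod=128):
    """Minimal KML <Region> gating one overlay tile by screen size:
    the tile's NetworkLink is only fetched once the region spans at
    least min_lod pixels on screen."""
    return (
        "<Region>"
        f"<LatLonAltBox><north>{north}</north><south>{south}</south>"
        f"<east>{east}</east><west>{west}</west></LatLonAltBox>"
        f"<Lod><minLodPixels>{min_lod}</minLodPixels></Lod>"
        "</Region>"
    )
```

Recursing with split_bbox while emitting one Region-gated NetworkLink (pointing at a WMS GetMap URL for that box) per node is what lets very large areas be served at high resolution without ever loading them all at once.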

  2. ATLAS, CMS and new challenges for public communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Lucas; Barney, David; Goldfarb, Steven

    On 30 March 2010 the first high-energy collisions brought the LHC experiments into the era of research and discovery. Millions of viewers worldwide tuned in to the webcasts and followed the news via Web 2.0 tools, such as blogs, Twitter, and Facebook, with 205,000 unique visitors to CERN's Web site. Media coverage at the experiments and in institutes all over the world yielded more than 2,200 news items including 800 TV broadcasts. We describe the new multimedia communications challenges, due to the massive public interest in the LHC programme, and the corresponding responses of the ATLAS and CMS experiments, in the areas of Web 2.0 tools, multimedia, webcasting, videoconferencing, and collaborative tools. We discuss the strategic convergence of the two experiments' communications services, information systems and public database of outreach material.

  3. ATLAS, CMS and New Challenges for Public Communication

    NASA Astrophysics Data System (ADS)

    Taylor, Lucas; Barney, David; Goldfarb, Steven

    2011-12-01

    On 30 March 2010 the first high-energy collisions brought the LHC experiments into the era of research and discovery. Millions of viewers worldwide tuned in to the webcasts and followed the news via Web 2.0 tools, such as blogs, Twitter, and Facebook, with 205,000 unique visitors to CERN's Web site. Media coverage at the experiments and in institutes all over the world yielded more than 2,200 news items including 800 TV broadcasts. We describe the new multimedia communications challenges, due to the massive public interest in the LHC programme, and the corresponding responses of the ATLAS and CMS experiments, in the areas of Web 2.0 tools, multimedia, webcasting, videoconferencing, and collaborative tools. We discuss the strategic convergence of the two experiments' communications services, information systems and public database of outreach material.

  4. CEOS Ocean Variables Enabling Research and Applications for Geo (COVERAGE)

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Vazquez, J.; Zlotnicki, V.

    2017-12-01

    The CEOS Ocean Variables Enabling Research and Applications for GEO (COVERAGE) initiative seeks to facilitate joint utilization of different satellite data streams on ocean physics, better integrated with biological and in situ observations, including near real-time data streams in support of oceanographic and decision support applications for societal benefit. COVERAGE aligns with programmatic objectives of CEOS (the Committee on Earth Observation Satellites) and the missions of GEO-MBON (Marine Biodiversity Observation Network) and GEO-Blue Planet, which are to advance and exploit synergies among the many observational programs devoted to ocean and coastal waters. COVERAGE is conceived as a 3-year pilot project involving international collaboration. It focuses on implementing technologies, including cloud-based solutions, to provide a data-rich, web-based platform for integrated ocean data delivery and access: multi-parameter observations, easily discoverable and usable, organized by disciplines, available in near real-time, collocated to a common grid and including climatologies. These will be complemented by a set of value-added data services available via the COVERAGE portal including an advanced Web-based visualization interface, subsetting/extraction, data collocation/matchup and other relevant on-demand processing capabilities. COVERAGE development will be organized around priority use cases and applications identified by GEO and agency partners. The initial phase will be to develop co-located 25km products from the four Ocean Virtual Constellations (VCs): Sea Surface Temperature, Sea Level, Ocean Color, and Sea Surface Winds. This aims to stimulate work among the ocean VCs while developing products and system functionality based on community recommendations. Products such as anomalies from a time mean would build on the theme of applications with relevance to the CEOS/GEO mission and vision. 
Here we provide an overview of the COVERAGE initiative, with an emphasis on the international collaborative aspects entailed, with the intent of soliciting community feedback as we develop and implement the initiative.

  5. The peer review system (PRS) for quality assurance and treatment improvement in radiation therapy

    NASA Astrophysics Data System (ADS)

    Le, Anh H. T.; Kapoor, Rishabh; Palta, Jatinder R.

    2012-02-01

Peer reviews are needed across all disciplines of medicine to address complex medical challenges in disease care, medical safety, insurance coverage handling, and public safety. Radiation therapy utilizes technologically advanced imaging for treatment planning, often with excellent efficacy. Since planning data requirements are substantial, patients are at risk for repeat diagnostic procedures or suboptimal therapeutic intervention due to a lack of knowledge regarding previous treatments. The Peer Review System (PRS) makes this critical radiation therapy information readily available on demand via web technology. The PRS has been developed with current web technology, the .NET Framework, and an in-house DICOM library. With the advantages of a web server-client architecture, including the IIS web server, SOAP web services, and Silverlight on the client side, patient data can be visualized through a web browser and distributed across multiple locations over the local area network and the Internet. The PRS will significantly improve the quality, safety, and accessibility of treatment plans in cancer therapy. Furthermore, the secure, web-based PRS with DICOM-RT compliance will provide flexible utilities for organization, sorting, and retrieval of imaging studies and treatment plans to optimize patient treatment and ultimately improve patient safety and treatment quality.

  6. Coverage and quality: A comparison of Web of Science and Scopus databases for reporting faculty nursing publication metrics.

    PubMed

    Powell, Kimberly R; Peterson, Shenita R

Web of Science and Scopus are the leading databases of scholarly impact. Recent studies outside the field of nursing report differences in journal coverage and quality. We conducted a comparative analysis of reported impact for nursing publications. Journal coverage by each database for the field of nursing was compared. Additionally, publications by 2014 nursing faculty were collected in both databases and compared for overall coverage and reported quality, as modeled by SCImago Journal Rank, peer review status, and MEDLINE inclusion. Individual author impact, modeled by the h-index, was calculated by each database for comparison. Scopus offered significantly higher journal coverage. For 2014 faculty publications, 100% of journals were found in Scopus; Web of Science offered 82%. No significant difference was found in the quality of reported journals. Author h-indexes were found to be higher in Scopus. When reporting faculty publications and scholarly impact, academic nursing programs may be better represented by Scopus, without compromising journal quality. Programs with strong interdisciplinary work should examine all areas of strength to ensure appropriate coverage. Copyright © 2017 Elsevier Inc. All rights reserved.
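The h-index compared above is simple to compute from a list of per-paper citation counts; a minimal sketch, not tied to either database's API, with hypothetical citation counts for one author:

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank       # this paper still satisfies the threshold
        else:
            break          # counts are sorted, so no later paper can
    return h

# Differing database coverage yields differing h-indexes for the same
# author: papers (or citing papers) indexed only in one database shift
# the citation list. Counts below are hypothetical.
wos_counts    = [25, 18, 12, 7, 3]
scopus_counts = [27, 20, 14, 9, 6, 4]
print(h_index(wos_counts), h_index(scopus_counts))  # 4 5
```

The one-extra-paper and slightly higher counts in the Scopus list are enough to move the h-index from 4 to 5, which is the kind of discrepancy the study reports.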

  7. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

GDI-Grid (English: SDI-Grid) is a research project funded by the German Federal Ministry of Education and Research (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacity and computational power of Grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces: web mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service), and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing, and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, adapting its LimA software to utilize Grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic volumes, terrain, buildings, noise protection screens, and population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on the local geometry. 
For each segment, propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation. This computationally intensive calculation needs to be performed for a major part of the European landscape. A Linux version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-Grid computer network. Results and performance indicators will be presented. The presentation is an extension of last year's presentation, "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project," which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid computing and spatial data infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.

  8. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

GST - Geosciences in space and time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels among partners. It originates from TUBAF's contribution to the EU project "ProMine," and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol." As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to the geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects; cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European geological surveys acting in a border region, means in particular the provision of IT technology to exchange spatially (and perhaps also temporally) indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. There are also quite a few markup languages (MLs) related to geographical or geological information, such as GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, among many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. 
It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards and quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.

  9. An Investigation into the Use of 3G Mobile Communications to Provide Telehealth Services in Rural KwaZulu-Natal

    PubMed Central

    Mars, Maurice

    2015-01-01

    Abstract Background: We investigated the use of third-generation (3G) mobile communications to provide telehealth services in remote health clinics in rural KwaZulu-Natal, South Africa. Materials and Methods: We specified a minimal set of services as our use case that would be representative of typical activity and to provide a baseline for analysis of network performance. Services included database access to manage chronic disease, local support and management of patients (to reduce unnecessary travel to the hospital), emergency care (up to 8 h for an ambulance to arrive), e-mail, access to up-to-date information (Web), and teleclinics. We made site measurements at a representative set of health clinics to determine the type of coverage (general packet radio service [GPRS]/3G), its capabilities to support videoconferencing (H323 and Skype™ [Microsoft, Redmond, WA]) and audio (Skype), and throughput for transmission control protocol (TCP) to gain a measure of application performance. Results: We found that none of the remote health clinics had 3G service. The GPRS service provided typical upload speed of 44 kilobits per second (Kbps) and download speed of 64 Kbps. This was not sufficient to support any form of videoconferencing. We also observed that GPRS had significant round trip time (RTT), in some cases in excess of 750 ms, and this led to slow start-up for TCP applications. Conclusions: We found audio was always so broken as to be unusable and further observed that many applications such as Web access would fail under conditions of very high RTT. We found some health clinics were so remote that they had no mobile service. 3G, where available, had measured upload speed of 331 Kbps and download speed of 446 Kbps and supported videoconferencing and audio at all sites, but we frequently experienced 3G changing to GPRS. 
We conclude that mobile communications currently provide insufficient coverage and capability to provide reliable clinical services and would advocate dedicated wireless services where reliable communication is essential and use of store and forward for mobile applications. PMID:24926731

  10. An investigation into the use of 3G mobile communications to provide telehealth services in rural KwaZulu-Natal.

    PubMed

    Clarke, Malcolm; Mars, Maurice

    2015-02-01

    We investigated the use of third-generation (3G) mobile communications to provide telehealth services in remote health clinics in rural KwaZulu-Natal, South Africa. We specified a minimal set of services as our use case that would be representative of typical activity and to provide a baseline for analysis of network performance. Services included database access to manage chronic disease, local support and management of patients (to reduce unnecessary travel to the hospital), emergency care (up to 8 h for an ambulance to arrive), e-mail, access to up-to-date information (Web), and teleclinics. We made site measurements at a representative set of health clinics to determine the type of coverage (general packet radio service [GPRS]/3G), its capabilities to support videoconferencing (H323 and Skype™ [Microsoft, Redmond, WA]) and audio (Skype), and throughput for transmission control protocol (TCP) to gain a measure of application performance. We found that none of the remote health clinics had 3G service. The GPRS service provided typical upload speed of 44 kilobits per second (Kbps) and download speed of 64 Kbps. This was not sufficient to support any form of videoconferencing. We also observed that GPRS had significant round trip time (RTT), in some cases in excess of 750 ms, and this led to slow start-up for TCP applications. We found audio was always so broken as to be unusable and further observed that many applications such as Web access would fail under conditions of very high RTT. We found some health clinics were so remote that they had no mobile service. 3G, where available, had measured upload speed of 331 Kbps and download speed of 446 Kbps and supported videoconferencing and audio at all sites, but we frequently experienced 3G changing to GPRS. 
We conclude that mobile communications currently provide insufficient coverage and capability to provide reliable clinical services and would advocate dedicated wireless services where reliable communication is essential and use of store and forward for mobile applications.
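The slow TCP start-up under high RTT observed in both studies above can be illustrated with an idealized slow-start model; a minimal sketch, assuming no packet loss, a fixed 1460-byte MSS, an initial window of one segment, and ignoring connection handshake time:

```python
def slow_start_rtts(size_bytes, mss=1460, init_cwnd=1):
    """Round trips needed to deliver size_bytes under idealized TCP
    slow start: the congestion window doubles every RTT, no loss,
    handshake and server processing ignored."""
    cwnd, sent, rtts = init_cwnd, 0, 0
    while sent < size_bytes:
        sent += cwnd * mss   # segments delivered this round trip
        cwnd *= 2            # window doubles each RTT in slow start
        rtts += 1
    return rtts

page = 50 * 1024                    # a modest 50 KB web page
rtts = slow_start_rtts(page)        # 6 round trips in this model
for rtt_ms in (100, 750):           # nominal RTT vs. the observed GPRS RTT
    print(f"RTT {rtt_ms} ms -> {rtts * rtt_ms / 1000:.1f} s to deliver {page} bytes")
```

Under this model the same six round trips cost 0.6 s at a 100 ms RTT but 4.5 s at the 750 ms RTTs measured on GPRS, before link bandwidth even becomes the bottleneck, which is consistent with the reported failure of interactive web applications.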

  11. Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.

    PubMed

    Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S

    2015-05-01

Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter. Facilitate creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen; and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to the creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
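The shortlist-then-refine paradigm described above can be sketched with a toy lookup. The map fragment and function below are hypothetical illustrations, not the PI-10 Services API (which is SOAP/.NET) and not the NLM map itself:

```python
# Hypothetical fragment of a SNOMED CT -> ICD-10-CM candidate map;
# the real NLM map is rule-based and far larger.
SNOMED_TO_ICD10 = {
    "44054006": ["E11.9", "E11.65", "E11.8"],  # type 2 diabetes mellitus
    "38341003": ["I10"],                       # essential hypertension
}

def shortlist(snomed_code, max_candidates=10):
    """Return an initial shortlist of candidate ICD-10-CM codes for a
    problem-list SNOMED CT code, mimicking the two-step
    shortlist-then-refine paradigm; refinement (picking one code,
    possibly via custom rules) would follow as a separate step."""
    return SNOMED_TO_ICD10.get(snomed_code, [])[:max_candidates]

print(shortlist("44054006"))  # candidate codes for one screen
```

Capping the shortlist keeps all candidates visible on a single screen, which is the usability goal stated in the abstract.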

  12. Internet Hospitals in China: Cross-Sectional Survey.

    PubMed

    Xie, Xiaoxu; Zhou, Weimin; Lin, Lingyan; Fan, Si; Lin, Fen; Wang, Long; Guo, Tongjun; Ma, Chuyang; Zhang, Jingkun; He, Yuan; Chen, Yixin

    2017-07-04

The Internet hospital, an innovative approach to providing health care, is rapidly developing in China because it has the potential to provide widely accessible outpatient service delivery via Internet technologies. To date, China's Internet hospitals have not been systematically investigated. The aim of this study was to describe the characteristics of China's Internet hospitals and to assess their health service capacity. We searched Baidu, the popular Chinese search engine, to identify Internet hospitals, using search terms such as "Internet hospital," "web hospital," or "cloud hospital." All Internet hospitals in mainland China were eligible for inclusion if they were officially registered. Our search was carried out until March 31, 2017. We identified 68 Internet hospitals, of which 43 had been put into use and 25 were under construction. Of the 43 established Internet hospitals, 13 (30%) were in the hospital informatization stage, 24 (56%) were in the web ward stage, and 6 (14%) were in the full Internet hospital stage. Patients accessed outpatient service delivery via website (74%, 32/43), app (42%, 18/43), or offline medical consultation facility (37%, 16/43) from the Internet hospital. Furthermore, 25 (58%) of the Internet hospitals asked doctors to deliver health services at a specific web clinic, whereas 18 (42%) did not. The consulting methods included video chat (60%, 26/43), telephone (19%, 8/43), and graphic message (28%, 12/43); 13 (30%) Internet hospitals could no longer be consulted online. Only 6 Internet hospitals were included in the coverage of health insurance. The median number of doctors available online was zero (interquartile range [IQR] 0 to 5; max 16,492). The median consultation fee was ¥20 (approximately US $2.90; IQR ¥0 to ¥200). Internet hospitals provide convenient outpatient service delivery. 
However, many of the Internet hospitals are not yet mature and are faced with various issues such as online doctor scarcity and the unavailability of health insurance coverage. China's Internet hospitals are heading in the right direction to improve provision of health services, but much more remains to be done. ©Xiaoxu Xie, Weimin Zhou, Lingyan Lin, Si Fan, Fen Lin, Long Wang, Tongjun Guo, Chuyang Ma, Jingkun Zhang, Yuan He, Yixin Chen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 04.07.2017.

  13. eodataservice.org: how to enable cross-continental interoperability of the European Space Agency and Australian Geoscience Landsat datacubes

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Barboni, Damiano; Natali, Stefano; Evans, Ben; Steer, Adam; Hogan, Patrik; Baumann, Peter

    2017-04-01

Globally, billions of dollars are invested annually in Earth observations that support public services, commercial activity, and scientific inquiry. The Common Data Framework [1] for Earth observation data summarises the current standards for the international community to adopt a common approach so that these significant data can be readily accessible. Concurrently, the "Copernicus Cooperation Arrangement" between the European Commission and the Australian Government is one of a number of recent agreements signed to facilitate satellite Earth observation data sharing among user communities. The typical approach implemented in these initiatives is the establishment of a regional data access hub, managed by the regional entity, to collect data at full scale or over the local region, improve access services, and provide a high-performance environment in which all the data can be analysed. Furthermore, a number of datacube-aware platforms and services have emerged that enable a new collaborative approach to analysing the vast quantities of satellite imagery and other Earth observations, making it quicker and easier to explore a time series of image data. In this context, the H2020-funded EarthServer-2 project brings together multiple organisations in Europe, Australia and the United States to allow federated data holdings to be analysed using web-based access to petabytes of multidimensional geospatial datasets. The aim is to ensure that these large spatial data sources can be accessed based on OGC standards, namely the Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), which provide efficient and timely retrieval of large volumes of geospatial data as well as on-the-fly processing. 
In this study, we provide an overview of the existing European Space Agency and Australian Geoscience Landsat datacubes, how the regional datacube structures differ, how interoperability is enabled through standards, and finally how the datacubes can be visualized on a virtual globe (NASA-ESA Web WorldWind) based on a WC(P)S query via any standard internet browser. The current study is co-financed by the European Space Agency under the MaaS project (ESRIN Contract No. 4000114186/15/I-LG) and the European Union's Horizon 2020 research and innovation programme under the EarthServer-2 project (Grant Agreement No. 654367). [1] Common Framework for Earth-Observation Data, March 23, 2016 (https://www.whitehouse.gov/sites/default/files/microsites/ostp/common_framework_for_earth_observation_data.pdf)
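A WCPS query of the kind issued from a browser against such a datacube can be sketched as a key-value-pair request. The endpoint and coverage name below are hypothetical; the query syntax follows the OGC WCPS standard, and rasdaman-style servers accept it via a WCS ProcessCoverages request:

```python
from urllib.parse import urlencode

# Hypothetical rasdaman-style endpoint and coverage name.
endpoint = "https://example.org/rasdaman/ows"
wcps = ('for c in (landsat8_nir) '
        'return encode(c[ansi("2016-06-01")], "image/png")')

# Build the KVP request URL; urlencode percent-escapes the query.
url = endpoint + "?" + urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps,
})
print(url)
```

The same URL pattern works from any HTTP client or a browser address bar, which is what enables the "query via any standard internet browser" workflow described above.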

  14. Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.

    2013-12-01

Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented, demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and Southeast-region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available, and applications to access them, will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series, and other National Water Census processing tools, will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure that data used and produced by funded projects are available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools. 
Recent progress on this front includes data web services for Coupled Model Intercomparison Project Phase 5 (CMIP5)-based downscaled climate projections, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services, and ways to discover others, will be presented through demonstration of a recently open-sourced project from a web application or scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at CIDA with partner groups such as 52°North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.

  15. Discrepancies among Scopus, Web of Science, and PubMed coverage of funding information in medical journal articles.

    PubMed

    Kokol, Peter; Vošner, Helena Blažun

    2018-01-01

    The overall aim of the present study was to compare the coverage of existing research funding information for articles indexed in Scopus, Web of Science, and PubMed databases. The numbers of articles with funding information published in 2015 were identified in the three selected databases and compared using bibliometric analysis of a sample of twenty-eight prestigious medical journals. Frequency analysis of the number of articles with funding information showed statistically significant differences between Scopus, Web of Science, and PubMed databases. The largest proportion of articles with funding information was found in Web of Science (29.0%), followed by PubMed (14.6%) and Scopus (7.7%). The results show that coverage of funding information differs significantly among Scopus, Web of Science, and PubMed databases in a sample of the same medical journals. Moreover, we found that, currently, funding data in PubMed is more difficult to obtain and analyze compared with that in the other two databases.

  16. New Quality Metrics for Web Search Results

    NASA Astrophysics Data System (ADS)

    Metaxas, Panagiotis Takis; Ivanova, Lilia; Mustafaraj, Eni

Web search results play an increasingly important role in our daily lives. But what can be said about their quality, especially when querying a controversial issue? The traditional information retrieval metrics of precision and recall do not provide much insight in the case of web information retrieval. In this paper we examine new ways of evaluating quality in search results: coverage and independence. We give examples of how these new metrics can be calculated and what their values reveal regarding the two major search engines, Google and Yahoo. We have found evidence of low coverage for commercial and medical controversial queries, and high coverage for a political query that is highly contested. Given that search engines are unwilling to tune their search results manually, except in a few cases that have become the source of bad publicity, low coverage and independence reveal the efforts of dedicated groups to manipulate the search results.
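One simple way to quantify how two engines' result lists relate is set overlap of their top-k URLs. This is our own simplification for illustration, not the paper's exact definitions of coverage and independence, and the URL lists below are hypothetical:

```python
def result_overlap(results_a, results_b):
    """Jaccard overlap of two top-k result URL sets: 1.0 means
    identical result sets, 0.0 means fully disjoint ones."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical top-4 URLs returned for the same controversial query.
engine_a = ["u1", "u2", "u3", "u4"]
engine_b = ["u3", "u4", "u5", "u6"]
print(round(result_overlap(engine_a, engine_b), 2))  # 2 shared of 6 distinct
```

High overlap combined with a narrow slice of the available viewpoints would be one signal of the low-coverage, low-independence situation the paper describes for manipulated queries.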

  17. AGU Climate Scientists Offer Question-and-Answer Service for Media

    NASA Astrophysics Data System (ADS)

    Jackson, Stacy

    2010-03-01

    In fall 2009, AGU launched a member-driven pilot project to improve the accuracy of climate science coverage in the media and to improve public understanding of climate science. The project's goal was to increase the accessibility of climate science experts to journalists across the full spectrum of media outlets. As a supplement to the traditional one-to-one journalist-expert relationship model, the project tested the novel approach of providing a question-and-answer (Q&A) service with a pool of expert scientists and a Web-based interface with journalists. Questions were explicitly limited to climate science to maintain a nonadvocacy, nonpartisan perspective.

  18. WCS Challenges for NASA's Earth Science Data

    NASA Astrophysics Data System (ADS)

    Cantrell, S.; Swentek, L.; Khan, A.

    2017-12-01

In an effort to ensure that data in NASA's Earth Observing System Data and Information System (EOSDIS) are available to a wide variety of users through the tools of their choice, NASA continues to focus on exposing data and services using standards-based protocols. Specifically, this work has recently focused on the Web Coverage Service (WCS). Experience has been gained in data delivery via GetCoverage requests, starting out with WCS v1.1.1. The pros and cons of both the version itself and different implementation approaches will be shared during this session. Additionally, due to limitations in WCS v1.1.1's ability to work with NASA's Earth science data, this session will also discuss the benefit of migrating to WCS 2.0.1 with EO-x to enrich this capability to meet a wide range of anticipated user needs. This will enable subsetting and various types of data transformations to be performed on a variety of EOS data sets.
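A WCS 2.0.1 GetCoverage request with subsetting, of the kind discussed above, can be sketched as a KVP URL. The endpoint and coverage id below are hypothetical; the parameter names follow the OGC WCS 2.0 GetCoverage KVP binding, with the `subset` key repeated once per axis:

```python
from urllib.parse import urlencode

# Hypothetical WCS endpoint and coverage id.
base = "https://example.nasa.gov/wcs"
params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "MOD11A2_LST"),
    ("subset", "Lat(30,40)"),        # trim the Lat axis to 30..40
    ("subset", "Long(-110,-100)"),   # trim the Long axis to -110..-100
    ("format", "image/tiff"),
]
# A list of pairs (not a dict) lets urlencode emit "subset" twice.
url = base + "?" + urlencode(params)
print(url)
```

Passing a sequence of pairs rather than a dict is the idiomatic way to produce repeated query keys, which the WCS subsetting syntax requires.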

  19. Improving antiretroviral therapy scale-up and effectiveness through service integration and decentralization.

    PubMed

    Suthar, Amitabh B; Rutherford, George W; Horvath, Tara; Doherty, Meg C; Negussie, Eyerusalem K

    2014-03-01

    Current service delivery systems do not reach all people in need of antiretroviral therapy (ART). In order to inform the operational and service delivery section of the WHO 2013 consolidated antiretroviral guidelines, our objective was to summarize systematic reviews on integrating ART delivery into maternal, newborn, and child health (MNCH) care settings in countries with generalized epidemics, tuberculosis (TB) treatment settings in which the burden of HIV and TB is high, and settings providing opiate substitution therapy (OST); and decentralizing ART into primary health facilities and communities. A summary of systematic reviews. The reviewers searched PubMed, Embase, PsycINFO, Web of Science, CENTRAL, and the WHO Index Medicus databases. Randomized controlled trials and observational cohort studies were included if they compared ART coverage, retention in HIV care, and/or mortality in MNCH, TB, or OST facilities providing ART with MNCH, TB, or OST facilities providing ART services separately; or primary health facilities or communities providing ART with hospitals providing ART. The reviewers identified 28 studies on integration and decentralization. Antiretroviral therapy integration into MNCH facilities improved ART coverage (relative risk [RR] 1.37, 95% confidence interval [CI] 1.05-1.79) and led to comparable retention in care. ART integration into TB treatment settings improved ART coverage (RR 1.83, 95% CI 1.48-2.23) and led to a nonsignificant reduction in mortality (RR 0.55, 95% CI 0.29-1.05). The limited data on ART integration into OST services indicated comparable rates of ART coverage, retention, and mortality. Partial decentralization into primary health facilities improved retention (RR 1.05, 95% CI 1.01-1.09) and reduced mortality (RR 0.34, 95% CI 0.13-0.87). Full decentralization improved retention (RR 1.12, 95% CI 1.08-1.17) and led to comparable mortality. Community-based ART led to comparable rates of retention and mortality. 
Integrating ART into MNCH, TB, and OST services was often associated with improvements in ART coverage, and decentralization of ART into primary health facilities and communities was often associated with improved retention. Neither integration nor decentralization was associated with adverse outcomes. These data contributed to recommendations in the WHO 2013 consolidated antiretroviral guidelines to integrate ART delivery into MNCH, TB, and OST services and to decentralize ART.
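Relative risks with 95% confidence intervals, as reported above, are conventionally computed from 2x2 counts via the log-RR normal approximation; a minimal sketch, where the counts are hypothetical since the review reports only the pooled estimates:

```python
import math

def relative_risk(events_a, total_a, events_b, total_b):
    """Relative risk of group A vs. group B with a 95% CI computed
    via the normal approximation on the log scale."""
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of ln(RR) for a 2x2 table.
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 80/100 covered with integrated ART delivery
# vs. 60/100 with separate services.
rr, lo, hi = relative_risk(80, 100, 60, 100)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the CI excludes 1.0, as for the integration results above (e.g. RR 1.37, 95% CI 1.05-1.79), the effect is statistically significant at the 5% level; a CI spanning 1.0, as for the TB mortality result, is reported as nonsignificant.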

  20. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney, ESDIS Deputy Project Manager.

  1. A cross disciplinary study of link decay and the effectiveness of mitigation techniques

    PubMed Central

    2013-01-01

    Background The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. Results We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Conclusion Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved. PMID:24266891
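The median-lifespan figure comes from survival analysis, which must account for pages still alive (censored) at the end of follow-up rather than simply averaging observed lifetimes. A toy Kaplan-Meier median over (duration, event) pairs, assuming a simplified data shape rather than the study's actual dataset:

```python
def km_median(durations, observed):
    """Kaplan-Meier estimate of the median lifespan.
    `observed[i]` is 1 if the page died at `durations[i]`,
    0 if it was still reachable (censored) at that time."""
    pairs = sorted(zip(durations, observed))
    at_risk = len(pairs)
    surv = 1.0
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = 0
        n_t = 0
        # Group all observations at the same time point
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            n_t += 1
            i += 1
        surv *= 1.0 - deaths / at_risk
        at_risk -= n_t
        if surv <= 0.5:
            return t  # first time survival drops to or below 50%
    return None  # median not reached within follow-up
```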

  2. A cross disciplinary study of link decay and the effectiveness of mitigation techniques.

    PubMed

    Hennessey, Jason; Ge, Steven

    2013-01-01

    The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved.

  3. Affordable Care Act Impact on Medicaid Coverage of Smoking-Cessation Treatments.

    PubMed

    McMenamin, Sara B; Yoeun, Sara W; Halpin, Helen A

    2018-04-01

    Four sections of the Affordable Care Act address the expansion of Medicaid coverage for recommended smoking-cessation treatments for: (1) pregnant women (Section 4107), (2) all enrollees through a financial incentive (1% Federal Medical Assistance Percentage increase) to offer comprehensive coverage (Section 4106), (3) all enrollees through Medicaid formulary requirements (Section 2502), and (4) Medicaid expansion enrollees (Section 2001). The purpose of this study is to document changes in Medicaid coverage for smoking-cessation treatments since the passage of the Affordable Care Act and to assess how implementation has differentially affected Medicaid coverage policies for: pregnant women, enrollees in traditional Medicaid, and Medicaid expansion enrollees. From January through June 2017, data were collected and analyzed from 51 Medicaid programs (50 states plus the District of Columbia) through a web-based survey and review of benefits documents to assess coverage policies for smoking-cessation treatments. Forty-seven Medicaid programs have increased coverage for smoking-cessation treatments post-implementation of the Affordable Care Act by adopting one or more of the four smoking-cessation treatment provisions. Coverage for pregnant women increased in 37 states, coverage for newly eligible expansion enrollees increased in 32 states, and 15 states added coverage and/or removed copayments in order to apply for a 1% increase in the Federal Medical Assistance Percentage. Coverage for all recommended pharmacotherapy and group and individual counseling increased from seven states in 2009 to 28 states in 2017. The Affordable Care Act was successful in improving and expanding state Medicaid coverage of effective smoking-cessation treatments. Many programs are not fully compliant with the law, and additional guidance and clarification from the Centers for Medicare and Medicaid Services may be needed. Copyright © 2018 American Journal of Preventive Medicine. 
Published by Elsevier Inc. All rights reserved.

  4. Impacts of environment on human diseases: a web service for the human exposome

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Vaartjes, Ilonca; Kamphuis, Carlijn; Strak, Maciek; Schmitz, Oliver; Soenario, Ivan; de Jong, Kor

    2017-04-01

The exposome is the totality of human environmental exposures from conception onwards. Identifying the contribution of the exposome to human diseases and health is a key issue in health research. Examples include the effect of air pollution exposure on cardiovascular diseases, the impact of disease vector (mosquito) and surface hydrology exposure on malaria, and the effect of fast-food restaurant exposure on obesity. Essential to health research is disentangling the effects of the exposome and genome on health. Ultimately this requires quantifying the totality of all human exposures for each individual in the studied population. This poses a massive challenge to geoscientists, as environmental data are required at high spatial and temporal resolution, with a spatial and temporal coverage representing the area inhabited by the population studied and a time span of several decades. These data must then be combined with the space-time paths of individuals to calculate personal exposures for each individual in the population. The Global and Geo Health Data Centre is taking on this challenge by providing a web service capable of enriching population data with exposome information. Our web service can serve environmental information either from archived national (up to 5 m spatial and 1 h temporal resolution) and global holdings or generated on the fly by environmental models running as microservices. On top of these environmental data services runs an individual exposure service that enables health researchers to select different spatial and temporal aggregation methods and to upload space-time paths of individuals. These are then enriched with personal exposures and returned to the user. We illustrate the service with an example of individual exposures to air pollutants calculated from hyper-resolution air pollution data and various approaches to estimating space-time paths of individuals.
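The core enrichment step, combining a space-time path with gridded environmental data, reduces to a time-weighted aggregation. A minimal sketch, in which the grid/path structures, variable, and units are hypothetical and not the centre's actual API:

```python
def personal_exposure(grid, path):
    """Time-weighted mean exposure along a space-time path.
    `grid` maps (row, col) -> concentration (e.g. NO2 in ug/m3);
    `path` is a list of (row, col, hours_spent) records, a toy
    stand-in for an individual's GPS track."""
    total_time = sum(hours for _, _, hours in path)
    weighted = sum(grid[(r, c)] * hours for r, c, hours in path)
    return weighted / total_time
```

A person spending 8 hours in a clean cell and 16 hours in a polluted one gets an exposure closer to the polluted value, which is exactly why residence-only exposure estimates can mislead.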

  5. EarthServer: Visualisation and use of uncertainty as a data exploration tool

    NASA Astrophysics Data System (ADS)

    Walker, Peter; Clements, Oliver; Grant, Mike

    2013-04-01

    The Ocean Science/Earth Observation community generates huge datasets from satellite observation. Until recently it has been difficult to obtain matching uncertainty information for these datasets and to apply this to their processing. In order to make use of uncertainty information when analysing "Big Data" we need both the uncertainty itself (attached to the underlying data) and a means of working with the combined product without requiring the entire dataset to be downloaded. The European Commission FP7 project EarthServer (http://earthserver.eu) is addressing the problem of accessing and ad-hoc analysis of extreme-size Earth Science data using cutting-edge Array Database technology. The core software (Rasdaman) and web services wrapper (Petascope) allow huge datasets to be accessed using Open Geospatial Consortium (OGC) standard interfaces including the well established standards, Web Coverage Service (WCS) and Web Map Service (WMS) as well as the emerging standard, Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on any of the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. The ESA Ocean Colour - Climate Change Initiative (OC-CCI) project (http://www.esa-oceancolour-cci.org/), is producing high-resolution, global ocean colour datasets over the full time period (1998-2012) where high quality observations were available. This climate data record includes per-pixel uncertainty data for each variable, based on an analytic method that classifies how much and which types of water are present in a pixel, and assigns uncertainty based on robust comparisons to global in-situ validation datasets. These uncertainty values take two forms, Root Mean Square (RMS) and Bias uncertainty, respectively representing the expected variability and expected offset error. 
By combining the data produced through the OC-CCI project with the software from the EarthServer project, we can produce a novel data offering that allows the use of traditional exploration and access mechanisms such as WMS and WCS. However, the real benefits can be seen when utilising WCPS to explore the data. We will show two major benefits of this infrastructure. First, we will show that the visualisation of the combined chlorophyll and uncertainty datasets through a web-based GIS portal gives users the ability to instantaneously assess the quality of the data they are exploring, using traditional web-based plotting techniques as well as novel web-based 3-dimensional visualisation. Second, we will showcase the benefits available when combining these data with the WCPS standard. The uncertainty data can be utilised in queries using the standard WCPS query language. This allows selection of data, either for download or for use within the query, based on the respective uncertainty values, as well as the possibility of incorporating both the chlorophyll data and the uncertainty data into complex queries to produce additional novel data products. By filtering with uncertainty at the data source rather than at the client, we can minimise traffic over the network, allowing huge datasets to be worked on with a minimal time penalty.
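Filtering by uncertainty at the source can be expressed directly in the WCPS query language. A sketch that builds such a query, masking a data coverage wherever its RMS uncertainty exceeds a threshold (the coverage names are placeholders, not the actual OC-CCI identifiers):

```python
def wcps_masked_query(data_cov, unc_cov, max_rms, fmt="netcdf"):
    """Build a WCPS query that zeroes out data pixels whose RMS
    uncertainty is at or above `max_rms`, so filtering happens
    server-side. Coverage names are hypothetical examples."""
    return (
        f"for c in ({data_cov}), u in ({unc_cov}) "
        f'return encode(c * (u < {max_rms}), "{fmt}")'
    )
```

The boolean expression `(u < 0.3)` evaluates to 1 or 0 per pixel, so the multiplication acts as a mask; only the masked result crosses the network, not the full coverage plus its uncertainty layer.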

  6. Nuclear science abstracts (NSA) database 1948--1974 (on the Internet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

Nuclear Science Abstracts (NSA) is a comprehensive abstract and index collection of the international nuclear science and technology literature for the period 1948 through 1976. Included are scientific and technical reports of the US Atomic Energy Commission, US Energy Research and Development Administration and its contractors, other agencies, universities, and industrial and research organizations. Coverage of the literature since 1976 is provided by the Energy Science and Technology Database. Approximately 25% of the records in the file contain abstracts. These are from the following volumes of the print Nuclear Science Abstracts: Volumes 12--18, Volume 29, and Volume 33. The database contains over 900,000 bibliographic records. All aspects of nuclear science and technology are covered, including: Biomedical Sciences; Metals, Ceramics, and Other Materials; Chemistry; Nuclear Materials and Waste Management; Environmental and Earth Sciences; Particle Accelerators; Engineering; Physics; Fusion Energy; Radiation Effects; Instrumentation; Reactor Technology; Isotope and Radiation Source Technology. The database includes all records contained in Volume 1 (1948) through Volume 33 (1976) of the printed version of Nuclear Science Abstracts (NSA). This worldwide coverage includes books, conference proceedings, papers, patents, dissertations, engineering drawings, and journal literature. This database is now available for searching through the GOV. Research Center (GRC) service. GRC is a single online web-based search service to well-known Government databases. Featuring powerful search and retrieval software, GRC is an important research tool. The GRC web site is at http://grc.ntis.gov.

  7. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web-service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management, and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
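A KVP-encoded OGC request of the kind such a portal's services accept can be assembled with nothing more than URL encoding; a sketch of a standard WMS 1.3.0 GetMap request (the endpoint and layer name are placeholders, not LMMP's actual identifiers):

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width, height):
    """Assemble a WMS 1.3.0 GetMap request URL. The bbox is
    (min_lat, min_lon, max_lat, max_lon) for EPSG:4326 axis order."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)
```

Because the interface is just a URL, the same request works from a desktop GIS, a browser, or the mobile clients mentioned above, which is the practical payoff of the open-standards approach.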

  8. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (i.e., craters, volcanoes), or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data formats and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of the OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many light-weight browsers, which facilitates both large and small focused science applications and public use.
Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
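The "rich scientific data streaming" interface named above, WCS, serves trimmed coverage subsets through KVP requests. A sketch of a WCS 2.0.1 GetCoverage call with latitude/longitude trims (the endpoint and coverage id are illustrative, not a real planetary service):

```python
def wcs_getcoverage_url(base, coverage_id, lat, lon):
    """Build a WCS 2.0.1 GetCoverage KVP request with two trim
    subsets. `lat` and `lon` are (low, high) tuples; axis labels
    follow the common Lat/Long convention, which individual
    servers may name differently."""
    return (
        f"{base}?service=WCS&version=2.0.1&request=GetCoverage"
        f"&coverageId={coverage_id}"
        f"&subset=Lat({lat[0]},{lat[1]})"
        f"&subset=Long({lon[0]},{lon[1]})"
    )
```

Unlike WMS, which returns a rendered picture, this returns the underlying data values for the requested window, which is why WCS is the interoperability workhorse for quantitative planetary analysis.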

  9. The Geo Data Portal an Example Physical and Application Architecture Demonstrating the Power of the "Cloud" Concept.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.

    2012-12-01

The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of the Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted with desktop GIS software on local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services, as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed, very large datasets.
The GDP project has developed a reusable architecture and advanced processing services that currently accesses archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offer an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
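Point 2 above relies on servers that accept subset requests in the URL itself; for OPeNDAP this is a constraint expression selecting a hyperslab of one variable. A sketch of building such a URL (the server path and variable name are hypothetical):

```python
def opendap_subset_url(base, var, slices):
    """Build an OPeNDAP URL with a constraint expression selecting
    a hyperslab of `var`. Each (start, stop) pair becomes an
    inclusive index range [start:stop] on one dimension."""
    index_expr = "".join(f"[{start}:{stop}]" for start, stop in slices)
    return f"{base}?{var}{index_expr}"
```

The server slices the array before transmission, so a client needing one decade of one region never downloads the full multi-terabyte archive, which is the replication-avoiding behavior the GDP architecture depends on.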

  10. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity.
Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify the scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.
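At its simplest, the coastal-inundation component of such a flood model is a threshold over an elevation grid; a deliberately toy sketch of that idea (the actual Testbed 11 model is far richer, accounting for hydrodynamics and connectivity):

```python
def inundation_mask(elevation, sea_level_rise):
    """Flag grid cells whose elevation (same units as the rise,
    e.g. meters) lies at or below the projected sea level --
    a 'bathtub' model, the crudest possible flood estimate."""
    return [[cell <= sea_level_rise for cell in row] for row in elevation]
```

A "what-if" slider in a decision-support client can re-run this kind of computation per candidate sea-level value; the standards work above is what lets that client fetch the elevation coverage (WCS) and invoke the model (WPS) interoperably.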

  11. Supporting NEESPI with Data Services - The SIB-ESS-C e-Infrastructure

    NASA Astrophysics Data System (ADS)

    Gerlach, R.; Schmullius, C.; Frotscher, K.

    2009-04-01

Data discovery and retrieval is commonly among the first steps performed for any Earth science study. The way scientific data are searched and accessed has changed significantly over the past two decades. In particular, the development of the World Wide Web and the technologies that evolved alongside it has shortened the data discovery and data exchange process. On the other hand, the amount of data collected and distributed by Earth scientists has increased exponentially, requiring new concepts for data management and sharing. One such concept to meet the demand is to build up Spatial Data Infrastructures (SDI) or e-Infrastructures. These infrastructures usually contain components for data discovery allowing users (or other systems) to query a catalogue or registry and retrieve metadata information on available data holdings and services. Data access is typically granted using FTP/HTTP protocols or, more advanced, through Web Services. A Service Oriented Architecture (SOA) approach based on standardized services enables users to benefit from interoperability among different systems and to integrate distributed services into their applications. The Siberian Earth System Science Cluster (SIB-ESS-C) being established at the University of Jena (Germany) is such a spatial data infrastructure, following these principles and implementing standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). The prime objective is to provide researchers focusing on Siberia with the technical means for data discovery, data access, data publication and data analysis. The region of interest covers the entire Asian part of the Russian Federation from the Ural to the Pacific Ocean, including the Ob, Lena and Yenissey river catchments. The aim of SIB-ESS-C is to provide a comprehensive set of data products for Earth system science in this region.
Although SIB-ESS-C will be equipped with processing capabilities for in-house data generation (mainly from Earth Observation), the current data holdings of SIB-ESS-C have been created in collaboration with a number of partners in previous and ongoing research projects (e.g. SIBERIA-II, SibFORD, IRIS). At the current development stage the SIB-ESS-C system comprises a federated metadata catalogue accessible through the SIB-ESS-C Web Portal or from any OGC-CSW compliant client. Due to full interoperability with other metadata catalogues, users of the SIB-ESS-C Web Portal are able to search external metadata repositories. The Web Portal also contains a simple visualization component which will be extended to a comprehensive visualization and analysis tool in the near future. All data products are already accessible as a Web Map Service and will soon be made available as Web Feature and Web Coverage Services, allowing users to directly incorporate the data into their applications. The SIB-ESS-C infrastructure will be further developed as one node in a network of similar systems (e.g. NASA GIOVANNI) in the NEESPI region.
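Discovery against an OGC-CSW catalogue of this kind is a GetRecords request; a minimal KVP sketch using a CQL text constraint (the endpoint is a placeholder, and strict servers may require additional parameters such as the constraint language version):

```python
def csw_getrecords_url(base, keyword, max_records=10):
    """Build a CSW 2.0.2 GetRecords KVP request that searches all
    text fields for `keyword`. '%25' is the URL-encoded '%'
    wildcard used in the CQL LIKE pattern."""
    return (
        f"{base}?service=CSW&version=2.0.2&request=GetRecords"
        f"&typeNames=csw:Record&elementSetName=summary"
        f"&constraintLanguage=CQL_TEXT"
        f"&constraint=AnyText+like+'%25{keyword}%25'"
        f"&maxRecords={max_records}"
    )
```

The response is ISO/Dublin Core metadata records, each of which can carry the WMS/WCS endpoints through which the actual data are then accessed.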

  12. Cluster randomized trial of a mHealth intervention "ImTeCHO" to improve delivery of proven maternal, neonatal, and child care interventions through community-based Accredited Social Health Activists (ASHAs) by enhancing their motivation and strengthening supervision in tribal areas of Gujarat, India: study protocol for a randomized controlled trial.

    PubMed

    Modi, Dhiren; Desai, Shrey; Dave, Kapilkumar; Shah, Shobha; Desai, Gayatri; Dholakia, Nishith; Gopalan, Ravi; Shah, Pankaj

    2017-06-09

To facilitate the delivery of proven maternal, neonatal, and child health (MNCH) services, a new cadre of village-based frontline workers, called Accredited Social Health Activists (ASHAs), was created in 2005 under the aegis of the National Rural Health Mission in India. Evaluations have noted that coverage of selected MNCH services to be delivered by the ASHAs is low. Reasons for low coverage are inadequate supervision of and support to ASHAs, apart from insufficient skills, poor quality of training, and the complexity of the tasks to be performed. The proposed study aims to implement and evaluate an innovative intervention based on mobile phone technology (mHealth) to improve the performance of ASHAs through better supervision and support in predominantly tribal and rural communities of Gujarat, India. This is a two-arm, stratified, cluster randomized trial of 36 months in which the units of randomization will be Primary Health Centers (PHCs), with 11 PHCs in each arm. The intervention is a newly built mobile phone application used in the public health system in three ways: (1) as a job aid to ASHAs to increase coverage of MNCH services; (2) as a job aid to ASHAs and Auxiliary Nurse Midwives (ANMs) to increase coverage of care among complicated cases by facilitating referrals, if indicated, and home-based care; (3) as a web interface for medical officers and PHC staff to improve supervision of and support to the ASHA program. Participants of the study are pregnant women, mothers, infants, ASHAs, and PHC staff. Primary outcome measures are a composite index made of critical, proven MNCH services and the proportion of neonates visited by ASHAs at home within the first week of birth. Secondary outcomes include coverage of selected MNCH services and care sought by complicated cases.
Outcomes will be measured by conducting household surveys at baseline and post-intervention which will be compared with usual practice in the control area, where the current level of services provided by the government will continue. The primary analysis will be intention to treat. This study will help answer some critical questions about the effectiveness and feasibility of implementing an mHealth solution in an area of MNCH services. Clinical Trial Registry of India, CTRI/2015/06/005847 . Registered on 3 June 2015.
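The cluster design assigns whole PHCs, not individuals, to arms; a sketch of the simple random split into two arms of 11 (ignoring the stratification the protocol specifies, so this is a simplification of the actual design):

```python
import random

def randomize_phcs(phcs, seed=0):
    """Randomly split a list of Primary Health Centers into two
    equal arms (intervention, control). A fixed seed makes the
    allocation reproducible for audit."""
    rng = random.Random(seed)
    shuffled = list(phcs)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]
```

Randomizing clusters rather than individuals avoids contamination between neighbors served by the same ASHA, at the cost of requiring the intention-to-treat analysis to account for within-cluster correlation.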

  13. Beyond wishful thinking; medical community presence on the web and challenges of pervasive healthcare.

    PubMed

    Moisil, Ioana; Barbat, Boldur E

    2004-01-01

Romanian healthcare is facing a number of challenges, from growing general costs, requests for better services, inadequate territorial coverage, medical errors and a growing incidence of chronic diseases, to the burden of debt toward the pharmaceutical industry. For the last 14 years decision makers have been searching for the magic formula for restructuring the healthcare sector. Eventually, the government has come to appreciate the benefits of IT solutions. Our paper presents recent advances in wireless technologies and their impact on healthcare, in parallel with the results of a study aimed at acknowledging the presence of the medical community on the Romanian WWW and evaluating its degree of accessibility for the general population. We have documented web sites promoting health services, discussion forums for patients, online medical advice, medical image teleprocessing, health education, health research and documentation, pharmaceutical products, e-procurement, health portals, medical links, and hospitals and other health units present on the Web. Initial results have shown that if the current trend of decreasing prices for mobile communications continues, and if the government is able to provide funding for the communication infrastructure needed for pervasive healthcare systems together with the appropriate regulations and standards, this can be a long-term viable solution to the healthcare crisis.

  14. MyOcean Central Information System - Achievements and Perspectives

    NASA Astrophysics Data System (ADS)

    Claverie, Vincent; Loubrieu, Thomas; Jolibois, Tony; de Dianous, Rémi; Blower, Jon; Romero, Laia; Griffiths, Guy

    2013-04-01

    Since 2009, MyOcean (http://www.myocean.eu) has been providing an operational service for forecasts, analysis and expertise on ocean currents, temperature, salinity, sea level, primary ecosystems and ice coverage. The production of observation and forecasting data is done by 42 Production Units (PU). Product download and visualisation are hosted by 25 Dissemination Units (DU). All these products and associated services are gathered in a single catalogue hiding the intricate distributed organization of PUs and DUs. Beyond applying the INSPIRE directive and OGC recommendations, MyOcean has had to overcome several technical challenges. This presentation focuses on three specific issues met by MyOcean that are relevant for many Spatial Data Infrastructures: user transaction accounting, large-volume download, and streamlining catalogue maintenance. Transaction accounting: set up powerful means to get detailed knowledge of system usage in order to subsequently improve the products (ocean observations, analysis and forecast datasets) and services (view, download) on offer. This subject drives the following ones: central authentication management for the distributed web service implementations (an add-on to the THREDDS Data Server for the WMS and NetCDF sub-setting services, plus a specific FTP); shared user management with co-funding projects, since, in addition to MyOcean, alternate projects also need consolidated information about the use of the co-funded products; and a central facility for user management, which provides users' rights to geographically distributed services and gathers transaction accounting history from those distributed services. Large-volume download: propose a user-friendly web interface to download large volumes of data (several gigabytes), as robust as basic FTP but intuitive and file/directory independent.
This should rely on a web service following the draft INSPIRE specification and OGC recommendations for download, taking into account that an FTP server is not user-friendly enough (users need to know filenames and directories) and a web page does not allow downloading several files at once. Streamlining the maintenance of the central catalogue: the major update for MyOcean v3 (April 2013) is the use of GeoNetwork for catalogue management. This improves the system at different levels: the editing interface is more user-friendly, and catalogue updates are managed in a workflow that allows higher flexibility for minor updates without giving up the high-level qualification requirements for the catalogue content. The distributed web services (download, view) are automatically harvested from the THREDDS Data Server, so manual editing of the catalogue is reduced, the associated typos are avoided, and the quality of the information is improved.

  15. Land Use and Land Cover Maps of Europe: a WebGIS Platform

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.

    2016-06-01

    This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built entirely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users can change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and control the opacity of each layer to facilitate comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality provides a visual assessment of the available land cover layers against user-generated content available on the Internet, and is intended as a first step towards a calibration/validation service that will be made available in the future.
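The WMS-based layer access described in this record can be sketched as a GetMap request in KVP encoding. The endpoint and layer name below are hypothetical, not the platform's actual identifiers:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL (KVP encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,  # WMS 1.3.0 uses CRS; 1.1.1 used SRS
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
        "TRANSPARENT": "TRUE",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical GeoServer endpoint and land-cover layer name:
url = wms_getmap_url("https://example.org/geoserver/wms",
                     "lulc:corine_2012",
                     bbox=(35.0, -10.0, 70.0, 40.0), width=800, height=600)
```

A client such as Leaflet issues equivalent requests internally; the sketch only makes the parameter set explicit.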

  16. ArcticDEM Year 3: Improving Coverage, Repetition and Resolution

    NASA Astrophysics Data System (ADS)

    Morin, P. J.; Porter, C. C.; Cloutier, M.; Howat, I.; Noh, M. J.; Willis, M. J.; Candela, S. G.; Bauer, G.; Kramer, W.; Bates, B.; Williamson, C.

    2017-12-01

    Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. The ArcticDEM project is using sub-meter, commercial imagery licensed by the National Geospatial-Intelligence Agency, petascale computing, and open source photogrammetry software to produce a time-tagged 2 m posting elevation model and a 5 m posting mosaic of the entire Arctic region. As ArcticDEM enters its third year, the region has gone from having some of the sparsest and poorest elevation data to some of the most precise and complete data of any region on the globe. To date, we have produced and released over 80,000,000 km2 as 57,000 individual 2 m posting, time-stamped DEMs. The Arctic, on average, is covered four times, though there are hotspots with more than 100 DEMs. In addition, the version 1 release includes a 5 m posting mosaic covering the entire 20,000,000 km2 region. All products are publicly available through arcticdem.org, ESRI web services, and a web viewer. The final year of the project will consist of a complete refiltering of clouds/water and re-mosaicing of all elevation data. Since inception of the project, post-processing techniques have improved significantly, resulting in fewer voids, better registration, sharper coastlines, and fewer inaccuracies due to clouds. All ArcticDEM data will be released in 2018. Data, documentation, web services and web viewer are available at arcticdem.org.

  17. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all these problems becomes an important requirement. The Grid promotes and facilitates secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and message transfer based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment, and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology - such as secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data, management and computation levels. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and interoperability between OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized for geospatial interoperability. Interoperability between geospatial and Grid infrastructures combines the specialized geospatial functionality with the high-performance computation and security of the Grid, enabling high spatial model resolution, wide geographical coverage, and flexible combination and interoperability of geographical models.
In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is visible from the enviroGRIDS portal and, consequently, from end-user applications such as decision-maker and citizen oriented applications. The enviroGRIDS portal is the user's single point of entry into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/

  18. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for collaboration between geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of some of the largest and most complex geospatial information. The most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for the full use of photogrammetry and remote sensing.

  19. Using RxNorm and NDF-RT to classify medication data extracted from electronic health records: experiences from the Rochester Epidemiology Project.

    PubMed

    Pathak, Jyotishman; Murphy, Sean P; Willaert, Brian N; Kremers, Hilal M; Yawn, Barbara P; Rocca, Walter A; Chute, Christopher G

    2011-01-01

    RxNorm and NDF-RT published by the National Library of Medicine (NLM) and Veterans Affairs (VA), respectively, are two publicly available federal medication terminologies. In this study, we evaluate the applicability of RxNorm and National Drug File-Reference Terminology (NDF-RT) for extraction and classification of medication data retrieved using structured querying and natural language processing techniques from electronic health records at two different medical centers within the Rochester Epidemiology Project (REP). Specifically, we explore how mappings between RxNorm concept codes and NDF-RT drug classes can be leveraged for hierarchical organization and grouping of REP medication data, identify gaps and coverage issues, and analyze the recently released NLM's NDF-RT Web service API. Our study concludes that RxNorm and NDF-RT can be applied together for classification of medication extracted from multiple EHR systems, although several issues and challenges remain to be addressed. We further conclude that the Web service APIs developed by the NLM provide useful functionalities for such activities.

  20. Supporting tobacco control: stimulating local newspaper coverage with a technical assistance website for local coalitions.

    PubMed

    Buller, David B; Bettinghaus, Erwin P; Helme, Donald; Young, Walter F; Borland, Ron; Maloy, Julie A; Cutter, Gary R; Andersen, Peter A; Walther, Joseph B

    2011-11-01

    A large and growing literature confirms that well-designed web-based programs can be effective in preventing or treating several chronic diseases. This study examined how the Internet can deliver information and train community activists, and specifically tested the effects of web-based technical assistance on local tobacco control coalitions' efforts to use media advocacy to advance their agendas. The authors compared a highly interactive, Enhanced website (intervention) to a noninteractive, Basic text-based website (comparison) in Colorado communities. A total of 24 tobacco control coalitions led by local county health departments and nursing services were enrolled in the project and randomly assigned to use either the intervention or comparison website. A total of 73 local daily and weekly newspapers were identified in the service areas of 23 of the 24 coalitions. A posttest assessment of newspaper coverage was conducted to locate all newspaper articles with tobacco control information published between January 1 and April 9, 2004, the last 3 months of the intervention. Although there was no evidence of a treatment effect on the frequency of newspaper articles on tobacco-related issues, there was evidence that newspapers in counties where the coalition had access to the Enhanced website printed more stories focused on local/regional issues and more anti-tobacco local/regional stories than in the counties where coalitions had access to the Basic website. Coalitions can improve their influence on local media for community tobacco control when high-quality online technical assistance, training, and resources are available to them.

  1. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow.
Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS) and Web Coverage Services (WCS), plus Federal Geographic Data Committee (FGDC) metadata, as a complement to the map viewers. KML/KMZ data files (soon to be compliant OGC specifications) also provide access.
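The WCS access pattern mentioned above, including the subsetting by region and variable, can be sketched as a WCS 2.0.1 GetCoverage request. The endpoint and coverage identifier below are hypothetical:

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(base_url, coverage_id, subsets,
                        fmt="application/netcdf"):
    """Build an OGC WCS 2.0.1 GetCoverage request URL with axis subsetting."""
    # KVP parameters as a list of pairs, since SUBSET may repeat per axis.
    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", coverage_id),
        ("FORMAT", fmt),
    ]
    # One SUBSET parameter per axis, e.g. SUBSET=Lat(30,60)
    for axis, lo, hi in subsets:
        params.append(("SUBSET", f"{axis}({lo},{hi})"))
    return base_url + "?" + urlencode(params)

# Hypothetical coverage identifier for a gridded climate product:
url = wcs_getcoverage_url("https://example.org/wcs",
                          "gsod_temperature",
                          [("Lat", 30, 60), ("Long", -130, -60)])
```

The same trim-style subsetting applies to time or height axes by adding further `SUBSET` pairs.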

  2. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014; Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as can derived imagery colour combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open source NASA WorldWind (e.g. Hogan, 2011) virtual globe as visualisation engine, and the array database rasdaman Community Edition as core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu.
All its code base is going to be available on GitHub, at https://github.com/planetserver References: Baumann, P., et al. (2015) Big Data Analytics for Earth Sciences: the EarthServer approach, International Journal of Digital Earth, doi: 10.1080/17538947.2014.1003106. Cantini, F. et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784. Gaddis, L., and T. Hare (2015), Status of tools and data for planetary research, Eos, 96, doi: 10.1029/2015EO041125. Hogan, P., 2011. NASA World Wind: Infrastructure for Spatial Data. Technical report. Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. Oosthoek, J.H.P., et al. (2013) Advances in Space Research. doi: 10.1016/j.asr.2013.07.002. Rossi, A. P., et al. (2014) PlanetServer/EarthServer: Big Data analytics in Planetary Science. Geophysical Research Abstracts, Vol. 16, #EGU2014-5149.
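The WCPS query pattern described in this record (posing a scientific question, such as a band ratio, to a server-side coverage) can be sketched as follows. The coverage and band names are illustrative placeholders, not actual PlanetServer identifiers, and the single `query` KVP parameter is one common way WCPS queries are submitted:

```python
from urllib.parse import urlencode

# A WCPS query computing a simple band ratio over a spatial subset of a
# hypothetical hyperspectral coverage, encoded as a PNG on the server side.
wcps_query = """
for c in (crism_frt00003e12)
return encode(
    (c.band_233 / c.band_13)[ Lat(-10:-9), Long(120:121) ],
    "image/png")
""".strip()

# The query is submitted to the WCPS endpoint as a single KVP parameter:
request_body = urlencode({"query": wcps_query})
```

The server evaluates the expression over the stored array and returns only the derived product, which is what makes this approach attractive for large archives.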

  3. Comparison of anticancer drug coverage decisions in the United States and United Kingdom: does the evidence support the rhetoric?

    PubMed

    Mason, Anne; Drummond, Michael; Ramsey, Scott; Campbell, Jonathan; Raisch, Dennis

    2010-07-10

    In contrast to the United States, several European countries have health technology assessment programs for drugs, many of which assess cost effectiveness. Coverage decisions that consider cost effectiveness may lead to restrictions in access. For a purposive sample of five decision-making bodies, we analyzed US and United Kingdom coverage decisions on all anticancer drugs approved by the US Food and Drug Administration (FDA) from 2004 to 2008. Data sources for the timing and outcome of licensing and coverage decisions included published and unpublished documentation, Web sites, and personal communication. The FDA approved 59 anticancer drugs over the study period, of which 46 were also approved by the European Medicines Agency. In the United States, 100% of drugs were covered, mostly without restriction. However, the United Kingdom bodies made positive coverage decisions for less than half of licensed drugs (National Institute for Health and Clinical Excellence [NICE]: 39%; Scottish Medicines Consortium [SMC]: 43%). Whereas the Centers for Medicare and Medicaid Services (CMS) and the Department of Veterans Affairs (VA) covered all 59 drugs from the FDA license date, delays were evident for some Regence Group decisions that were informed by cost effectiveness (median, 0 days; semi-interquartile range [SIQR], 122 days; n = 22). Relative to the European Medicines Agency license date, median time to coverage was 783 days (SIQR, 170 days) for NICE and 231 days (SIQR, 129 days) for the SMC. Anticancer drug coverage decisions that consider cost effectiveness are associated with greater restrictions and slower time to coverage. However, this approach may represent an explicit alternative to rationing achieved through the use of patient copayments.

  4. Issues in Data Fusion for Satellite Aerosol Measurements for Applications with GIOVANNI System at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Gopalan, Arun; Zubko, Viktor; Leptoukh, Gregory G.

    2008-01-01

    We look at issues, barriers and approaches for Data Fusion of satellite aerosol data as available from the GES DISC GIOVANNI Web Service. Daily Global Maps of AOT from a single satellite sensor alone contain gaps that arise from various sources (sun glint regions, clouds, orbital swath gaps at low latitudes, bright underlying surfaces, etc.). The goal is to develop a fast, accurate and efficient method to improve the spatial coverage of the Daily AOT data to facilitate comparisons with Global Models. Data Fusion may be supplemented by Optimal Interpolation (OI) as needed.

  5. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  6. Teaching Critical Evaluation Skills for World Wide Web Resources.

    ERIC Educational Resources Information Center

    Tate, Marsha; Alexander, Jan

    1996-01-01

    Outlines a lesson plan used by an academic library to evaluate the quality of World Wide Web information. Discusses the traditional evaluation criteria of accuracy, authority, objectivity, currency, and coverage as it applies to the unique characteristics of Web pages: their marketing orientation, variety of information, and instability. The…

  7. A New Look at Data Usage by Using Metadata Attributes as Indicators of Data Quality

    NASA Astrophysics Data System (ADS)

    Won, Y. I.; Wanchoo, L.; Behnke, J.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) stores and distributes data from EOS satellites, as well as ancillary, airborne, in-situ, and socio-economic data. Twelve EOSDIS data centers support different scientific disciplines by providing products and services tailored to specific science communities. Although discipline oriented, these data centers provide the common data management functions of ingest, archive and distribution, as well as documentation of their data and services on their websites. The Earth Science Data and Information System (ESDIS) Project collects these metrics from the EOSDIS data centers on a daily basis through a tool called the ESDIS Metrics System (EMS). These metrics are used in this study. The implementation of the Earthdata Login - formerly known as the User Registration System (URS) - across the various NASA data centers provides the EMS additional information about users obtaining data products from EOSDIS data centers. These additional user attributes collected by the Earthdata Login, such as the user's primary area of study, can augment the understanding of data usage, which in turn can help the EOSDIS program better understand users' needs. This study will review the key metrics (users, distributed volume, and files) in multiple ways to gain an understanding of the significance of the metadata. Characterizing the usability of data by key metadata elements such as discipline and study area will assist in understanding how the users have evolved over time. The data usage pattern based on version numbers may also provide some insight into the level of data quality. In addition, the data metrics by various services such as the Open-source Project for a Network Data Access Protocol (OPeNDAP), Web Map Service (WMS), Web Coverage Service (WCS), and subsets, will address how these services have extended the usage of data.
Overall, this study will present data and metadata usage through metrics analyses and will assist data centers in better supporting the needs of their users.

  8. Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project

    NASA Technical Reports Server (NTRS)

    Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.

    2006-01-01

    NASA requires timely on-demand data and analysis capabilities to enable practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three-year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, Sensor Planning Services - SPS, Sensor Observation Services - SOS, Sensor Alert Services - SAS and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and autonomous sensor tasking for unique data collection.

  9. 5 CFR 831.202 - Continuation of coverage for food service employees of the House of Representatives.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Continuation of coverage for food service... PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) RETIREMENT Coverage § 831.202 Continuation of coverage for food service employees of the House of Representatives. (a) Congressional...

  10. 29 CFR 779.256 - Conditions for enterprise coverage of gasoline service establishments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false Conditions for enterprise coverage of gasoline service... Apply; Enterprise Coverage The Gasoline Service Establishment Enterprise § 779.256 Conditions for enterprise coverage of gasoline service establishments. (a) The requirement that the enterprise must be “an...

  11. 29 CFR 779.256 - Conditions for enterprise coverage of gasoline service establishments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false Conditions for enterprise coverage of gasoline service... Apply; Enterprise Coverage The Gasoline Service Establishment Enterprise § 779.256 Conditions for enterprise coverage of gasoline service establishments. (a) The requirement that the enterprise must be “an...

  12. 29 CFR 779.256 - Conditions for enterprise coverage of gasoline service establishments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Conditions for enterprise coverage of gasoline service... Apply; Enterprise Coverage The Gasoline Service Establishment Enterprise § 779.256 Conditions for enterprise coverage of gasoline service establishments. (a) The requirement that the enterprise must be “an...

  13. 29 CFR 779.256 - Conditions for enterprise coverage of gasoline service establishments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false Conditions for enterprise coverage of gasoline service... Apply; Enterprise Coverage The Gasoline Service Establishment Enterprise § 779.256 Conditions for enterprise coverage of gasoline service establishments. (a) The requirement that the enterprise must be “an...

  14. 29 CFR 779.256 - Conditions for enterprise coverage of gasoline service establishments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false Conditions for enterprise coverage of gasoline service... Apply; Enterprise Coverage The Gasoline Service Establishment Enterprise § 779.256 Conditions for enterprise coverage of gasoline service establishments. (a) The requirement that the enterprise must be “an...

  15. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    PubMed

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service-oriented computing to meet the interoperability demands of web applications. In the Internet era, the exponential growth of web services makes "quality of service" an essential parameter for discriminating between them. In this paper, a user-preference-based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework also allows the user to provide feedback about the composite service, which improves the reputation of the services.
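The abstract does not give UPWSR's actual scoring formula; the following is only a minimal sketch of the general idea of preference-weighted QoS ranking. All service names, QoS attributes, and weights are hypothetical.

```python
# Hypothetical sketch of preference-weighted QoS ranking (not the authors'
# actual UPWSR algorithm): normalize each QoS attribute across candidates,
# then score by a user-preference-weighted sum.

def rank_services(services, weights):
    """services: dict name -> {attr: raw value} (higher is better).
    weights: dict attr -> user preference weight. Returns names best-first."""
    attrs = list(weights)
    lo = {a: min(s[a] for s in services.values()) for a in attrs}
    hi = {a: max(s[a] for s in services.values()) for a in attrs}

    def norm(a, v):
        # Min-max normalization; a constant attribute contributes nothing.
        return 0.0 if hi[a] == lo[a] else (v - lo[a]) / (hi[a] - lo[a])

    scores = {
        name: sum(weights[a] * norm(a, qos[a]) for a in attrs)
        for name, qos in services.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

candidates = {
    "svcA": {"availability": 0.99, "throughput": 120.0},
    "svcB": {"availability": 0.95, "throughput": 300.0},
    "svcC": {"availability": 0.90, "throughput": 80.0},
}
ranking = rank_services(candidates, {"availability": 0.7, "throughput": 0.3})
```

With availability weighted heavily, svcA ranks first despite svcB's higher throughput, which is the kind of preference-driven discrimination the paper describes.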

  16. GMZ: A GML Compression Model for WebGIS

    NASA Astrophysics Data System (ADS)

    Khandelwal, A.; Rajan, K. S.

    2017-09-01

    Geography Markup Language (GML) is an XML specification for expressing geographical features. Defined by the Open Geospatial Consortium (OGC), it is widely used for the storage and transmission of maps over the Internet. XML schemas make it convenient to define custom feature profiles in GML for specific needs, as seen in the widely popular CityGML, the simple features profile, coverages, etc. The simple features profile (SFP) is a simpler subset of GML with support for point, line and polygon geometries, constructed to cover the most commonly used GML geometries. The Web Feature Service (WFS) serves query results in SFP by default. But SFP falls short of being an ideal choice due to its high verbosity and size-heavy nature, which leaves immense scope for compression. GMZ is a lossless compression model developed to work on SFP-compliant GML files. Our experiments indicate that GMZ achieves reasonably good compression ratios and can be useful in WebGIS-based applications.
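The abstract does not describe GMZ's internal model, but the verbosity argument is easy to demonstrate: tag-heavy GML compresses well even under a general-purpose codec. The sketch below uses zlib, not GMZ itself, and the toy fragment is invented for illustration.

```python
# Illustration only: repeated tags dominate the byte count of an SFP-style
# GML fragment, so even generic zlib compression (not GMZ) shrinks it a lot.
import zlib

points = "".join(
    f"<gml:Point><gml:pos>{x}.0 {x + 1}.0</gml:pos></gml:Point>"
    for x in range(200)
)
gml = f"<gml:MultiPoint>{points}</gml:MultiPoint>".encode()

compressed = zlib.compress(gml, level=9)
ratio = len(gml) / len(compressed)  # > 1 means the payload shrank
```

A dedicated lossless model like GMZ can exploit the known SFP schema rather than rediscovering the redundancy byte-by-byte, which is where further gains over generic codecs would come from.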

  17. Can EO afford big data - an assessment of the temporal and monetary costs of existing and emerging big data workflows

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2014-05-01

    The cost of working with extremely large data sets is an increasingly important issue within the Earth Observation community. From global coverage data at any resolution to small coverage data at extremely high resolution, the community has always produced big data. This will only increase as new sensors are deployed and their data made available. Over time, standard workflows have emerged, facilitated by the production and adoption of standard technologies. Groups such as the International Organisation for Standardisation (ISO) and the Open Geospatial Consortium (OGC) have been a driving force in this area for many years. The production of standard protocols and interfaces such as OPeNDAP, Web Coverage Service (WCS), Web Processing Service (WPS) and newer emerging standards such as Web Coverage Processing Service (WCPS) has helped to galvanise these workflows. As an example of a traditional workflow, assume a researcher wants to assess the temporal trend in chlorophyll concentration. This would involve a discovery phase, an acquisition phase, a processing phase and finally a derived product or analysis phase. Each element of this workflow has an associated temporal and monetary cost. Firstly, the researcher would require a high-bandwidth connection, or the acquisition phase would take too long. Secondly, the researcher must have their own expensive equipment for use in the processing phase. Both of these elements cost money and time. This can make the whole process prohibitive for scientists from the developing world or "citizen scientists" who do not have the necessary processing infrastructure. The use of emerging technologies can help improve both the monetary and time costs associated with these existing workflows. By utilising a WPS that is hosted at the same location as the data, a user is able to apply processing to the data without needing their own processing infrastructure.
This, however, limits the user to predefined processes made available by the data provider. The emerging OGC WCPS standard combined with big data analytics engines may provide a mechanism to improve this situation. The technology allows users to create their own queries using an SQL-like query language and apply them over the available large data archive, once again at the data provider's end. This not only removes the processing cost while still allowing user-defined processes; it also reduces the bandwidth required, as only the final analysis or derived product needs to be downloaded. The maturity of the new technologies is at a stage where their use should be justified by a quantitative assessment rather than simply by the fact that they are new developments. We will present a study of the time and cost requirements for a selection of existing workflows and then show how new and emerging standards and technologies can help to reduce the cost to the user by shifting processing to the data, and to reduce the bandwidth required for analysing large datasets, making analysis of big-data archives possible for a greater and more diverse audience.
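The bandwidth argument can be made concrete with a back-of-envelope calculation. All the numbers below (grid size, archive length) and the coverage name in the query are hypothetical, and the WCPS request is schematic rather than taken from the paper.

```python
# Back-of-envelope sketch: downloading a chlorophyll archive versus fetching
# only a server-side-derived result. Grid dimensions, archive length, and the
# coverage name "chl_archive" are invented for illustration.

width, height = 8640, 4320      # ~4 km global grid (hypothetical)
days = 10 * 365                 # ~10 years of daily fields
bytes_per_value = 4             # 32-bit floats

archive_bytes = width * height * days * bytes_per_value   # full download
timeseries_bytes = days * bytes_per_value                 # one value per day

# Schematic WCPS-style request that would produce the small result at the
# data provider's end instead of shipping the whole archive:
wcps = 'for $c in (chl_archive) return encode(avg($c), "csv")'

saving_factor = archive_bytes / timeseries_bytes  # equals width * height here
```

Under these assumptions the user downloads tens of millions of times less data than in the acquire-then-process workflow, which is the core of the cost argument above.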

  18. Influenza during pregnancy: Incidence, vaccination coverage and attitudes toward vaccination in the French web-based cohort G-GrippeNet.

    PubMed

    Loubet, Paul; Guerrisi, Caroline; Turbelin, Clément; Blondel, Béatrice; Launay, Odile; Bardou, Marc; Goffinet, François; Colizza, Vittoria; Hanslik, Thomas; Kernéis, Solen

    2016-04-29

    Pregnancy is a risk factor for severe influenza. However, data on influenza incidence during pregnancy are scarce. Likewise, no data are available on influenza vaccine coverage in France since the national recommendation in 2012. We aimed to assess these points using a novel nationwide web-based surveillance system, G-GrippeNet. During the 2014/2015 influenza season, pregnant women living in metropolitan France were enrolled through a web platform (https://www.grippenet.fr/). Throughout the season, participants were asked to report, on a weekly basis, whether they had experienced symptoms of influenza-like illness (ILI). Reported ILI episodes were used to calculate incidence density rates based on each participant's period of participation. Vaccination coverage was estimated after weighting by age and education level from national data on pregnant women. Factors associated with higher vaccination coverage were obtained through a logistic regression with odds ratios (OR) corrected with the Zhang and Yu method. A total of 153 women were enrolled. The ILI incidence density rate was 1.8 per 100 person-weeks (95% CI, 1.5-2.1). This rate was higher in women older than 40 years (RR = 3.0, 95% CI [1.1-8.3], p = 0.03) and during the first/second trimesters compared to the third trimester (RR = 4.0, 95% CI [1.4-12.0], p = 0.01). Crude vaccination coverage was 39% (95% CI, 31-47) and weighted vaccination coverage was estimated at 26% (95% CI, 20-34). Health care provider recommendation for vaccination (corrected OR = 7.8; 95% CI [3.0-17.1]) and non-smoking status (cOR = 2.1; 95% CI [1.2-6.9]) were associated with higher vaccine uptake. This original web-based longitudinal surveillance study design proved feasible in a population of pregnant women. The first results are of interest and underline that public health policies should emphasize promotion of vaccination through health care providers. Copyright © 2016 Elsevier Ltd. All rights reserved.
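An incidence density rate of "1.8 per 100 person-weeks" is simply episodes divided by accumulated follow-up time. The counts below are invented (chosen to reproduce the reported rate), not the study's raw data.

```python
# How an incidence density rate per 100 person-weeks is computed. The episode
# and follow-up counts are hypothetical, picked to reproduce the reported 1.8.

ili_episodes = 36      # hypothetical ILI episodes reported over the season
person_weeks = 2000    # hypothetical total follow-up summed over participants

rate_per_100_pw = 100 * ili_episodes / person_weeks
```

Using person-time in the denominator is what lets the study combine participants who joined or left the platform at different points in the season.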

  19. Flexible Web services integration: a novel personalised social approach

    NASA Astrophysics Data System (ADS)

    Metrouh, Abdelmalek; Mokhati, Farid

    2018-05-01

    Dynamic composition or integration remains one of the key objectives of Web services technology. This paper proposes an innovative approach to dynamic Web services composition based on functional and non-functional attributes and individual preferences. In this approach, social networks of Web services are used to maintain interactions between Web services in order to select and compose the Web services most closely related to the user's preferences. We use the concept of a Web services community in a social network of Web services to considerably reduce their search space. These communities are created by the direct involvement of Web services providers.

  20. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm

    PubMed Central

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service-oriented computing to meet the interoperability demands of web applications. In the Internet era, the exponential growth of web services makes “quality of service” an essential parameter for discriminating between them. In this paper, a user-preference-based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework also allows the user to provide feedback about the composite service, which improves the reputation of the services. PMID:26504894

  1. SIMAP—the database of all-against-all protein sequence similarities and annotations with new interfaces and increased coverage

    PubMed Central

    Arnold, Roland; Goldenberg, Florian; Mewes, Hans-Werner; Rattei, Thomas

    2014-01-01

    The Similarity Matrix of Proteins (SIMAP, http://mips.gsf.de/simap/) database has been designed to massively accelerate computationally expensive protein sequence analysis tasks in bioinformatics. It provides pre-calculated sequence similarities interconnecting the entire known protein sequence universe, complemented by pre-calculated protein features and domains, similarity clusters and functional annotations. SIMAP covers all major public protein databases as well as many consistently re-annotated metagenomes from different repositories. As of September 2013, SIMAP contains >163 million proteins corresponding to ∼70 million non-redundant sequences. SIMAP uses the sensitive FASTA search heuristics, the Smith–Waterman alignment algorithm, the InterPro database of protein domain models and the BLAST2GO functional annotation algorithm. SIMAP assists biologists by facilitating the interactive exploration of the protein sequence universe. Web-Service and DAS interfaces allow connecting SIMAP with any other bioinformatic tool and resource. All-against-all protein sequence similarity matrices of project-specific protein collections are generated on request. Recent improvements allow SIMAP to cover the rapidly growing protein sequence universe. New Web-Service interfaces enhance the connectivity of SIMAP. Novel tools for interactive extraction of protein similarity networks have been added. Open access to SIMAP is provided through the web portal; the portal also contains instructions and links for software access and flat file downloads. PMID:24165881

  2. 29 CFR 779.245 - Conditions for coverage of retail or service enterprises.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false Conditions for coverage of retail or service enterprises... Apply; Enterprise Coverage Covered Retail Enterprise § 779.245 Conditions for coverage of retail or service enterprises. (a) Retail or service enterprises may be covered under section 3(s)(1) of the prior...

  3. 29 CFR 779.245 - Conditions for coverage of retail or service enterprises.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false Conditions for coverage of retail or service enterprises... Apply; Enterprise Coverage Covered Retail Enterprise § 779.245 Conditions for coverage of retail or service enterprises. (a) Retail or service enterprises may be covered under section 3(s)(1) of the prior...

  4. 29 CFR 779.245 - Conditions for coverage of retail or service enterprises.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false Conditions for coverage of retail or service enterprises... Apply; Enterprise Coverage Covered Retail Enterprise § 779.245 Conditions for coverage of retail or service enterprises. (a) Retail or service enterprises may be covered under section 3(s)(1) of the prior...

  5. 29 CFR 779.245 - Conditions for coverage of retail or service enterprises.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false Conditions for coverage of retail or service enterprises... Apply; Enterprise Coverage Covered Retail Enterprise § 779.245 Conditions for coverage of retail or service enterprises. (a) Retail or service enterprises may be covered under section 3(s)(1) of the prior...

  6. Proposal for a Web Encoding Service (WES) for Spatial Data Transactions

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    Web service utilization in Spatial Data Infrastructure (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial data for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  7. Opinion Integration and Summarization

    ERIC Educational Resources Information Center

    Lu, Yue

    2011-01-01

    As Web 2.0 applications become increasingly popular, more and more people express their opinions on the Web in various ways in real time. Such wide coverage of topics and abundance of users make the Web an extremely valuable source for mining people's opinions about all kinds of topics. However, since the opinions are usually expressed as…

  8. An Interactive, Web-Based Approach to Metadata Authoring

    NASA Technical Reports Server (NTRS)

    Pollack, Janine; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    NASA's Global Change Master Directory (GCMD) serves a growing number of users by assisting the scientific community in the discovery of and linkage to Earth science data sets and related services. The GCMD holds over 8000 data set descriptions in Directory Interchange Format (DIF) and 200 data service descriptions in Service Entry Resource Format (SERF), encompassing the disciplines of geology, hydrology, oceanography, meteorology, and ecology. Data descriptions also contain geographic coverage information, thus allowing researchers to discover data pertaining to a particular geographic location, as well as subject of interest. The GCMD strives to be the preeminent data locator for world-wide directory level metadata. In this vein, scientists and data providers must have access to intuitive and efficient metadata authoring tools. Existing GCMD tools are not currently attracting widespread usage. With usage being the prime indicator of utility, it has become apparent that current tools must be improved. As a result, the GCMD has released a new suite of web-based authoring tools that enable a user to create new data and service entries, as well as modify existing data entries. With these tools, a more interactive approach to metadata authoring is taken, as they feature a visual "checklist" of data/service fields that automatically updates when a field is completed. In this way, the user can quickly gauge which of the required and optional fields have not been populated. With the release of these tools, the Earth science community will be further assisted in efficiently creating quality data and services metadata. Keywords: metadata, Earth science, metadata authoring tools

  9. Dynamic selection mechanism for quality of service aware web services

    NASA Astrophysics Data System (ADS)

    D'Mello, Demian Antony; Ananthanarayana, V. S.

    2010-02-01

    A web service is an interface of a software component that can be accessed by standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates the use of tools and techniques to search the suitable services available over the Web. UDDI (universal description, discovery and integration) is the first initiative to find suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects like quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format for requesters to specify their complex demands on QoS for web service selection. The authors define a tree structure called the quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties having varied preferences. The paper proposes a QoS-broker-based architecture for web service selection, which facilitates requesters in specifying their QoS requirements to select a qualitatively optimal web service. A web service selection algorithm is presented, which ranks the functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS-broker-based system to prove the correctness of the proposed web service selection mechanism.
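The paper's actual QCT encoding is not reproduced in this abstract; the sketch below only illustrates the general shape of such a structure, with AND/OR nodes over leaf constraints on individual QoS properties. The tree layout, property names, and thresholds are all hypothetical.

```python
# Minimal sketch of evaluating a quality-constraint tree (QCT): internal
# nodes combine child results with AND/OR; leaves test one QoS property.
# Node encoding, property names, and bounds are hypothetical.

def satisfies(node, qos):
    kind = node[0]
    if kind == "leaf":                      # ("leaf", property, op, bound)
        _, prop, op, bound = node
        value = qos[prop]
        return value >= bound if op == ">=" else value <= bound
    children = node[1]                      # ("and" | "or", [children...])
    results = (satisfies(c, qos) for c in children)
    return all(results) if kind == "and" else any(results)

# "availability >= 0.99 AND (latency <= 50 OR throughput >= 200)"
qct = ("and", [
    ("leaf", "availability", ">=", 0.99),
    ("or", [
        ("leaf", "latency", "<=", 50),
        ("leaf", "throughput", ">=", 200),
    ]),
])

ok = satisfies(qct, {"availability": 0.995, "latency": 80, "throughput": 250})
```

A broker could apply such a predicate to filter functionally similar candidates before ranking them by degree of QoS satisfaction, as the selection algorithm in the paper does.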

  10. Feasibility of using global system for mobile communication (GSM)-based tracking for vaccinators to improve oral poliomyelitis vaccine campaign coverage in rural Pakistan.

    PubMed

    Chandir, Subhash; Dharma, Vijay Kumar; Siddiqi, Danya Arif; Khan, Aamir Javed

    2017-09-05

    Despite multiple rounds of immunization campaigns, it has not been possible to achieve optimum immunization coverage for poliovirus in Pakistan. Supplementary activities to improve coverage of immunization, such as door-to-door campaigns, are constrained by several factors including inaccurate hand-drawn maps and a lack of means to objectively monitor field teams in real time, resulting in suboptimal vaccine coverage during campaigns. Global System for Mobile Communications (GSM)-based tracking of vaccinators' mobile subscriber identity modules (SIMs) provides a low-cost solution to identify missed areas and ensure effective immunization coverage. We conducted a pilot study to investigate the feasibility of using GSM technology to track vaccinators through observing indicators including acceptability, ease of implementation, costs and scalability as well as the likelihood of ownership by District Health Officials. The real-time location of the field teams was displayed on a GSM tracking web dashboard accessible by supervisors and managers for effective monitoring of workforce attendance including 'time in-time out', and discerning if all target areas - specifically remote and high-risk locations - had been reached. Direct access to this information by supervisors eliminated the possibility of data fudging and inaccurate reporting by workers regarding their mobility. The tracking cost per vaccinator was USD 0.26/month. Our study shows that GSM-based tracking is potentially a cost-efficient approach, results in better monitoring and accountability, is scalable and provides the potential for improved geographic coverage of health services. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Increasing Coverage of Hepatitis B Vaccination in China

    PubMed Central

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-01-01

    This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- and Chinese-language studies carried out in mainland China, using PubMed, the Cochrane databases, Web of Knowledge, China National Knowledge Infrastructure, Wanfang data, and other relevant databases. Nineteen articles about the effectiveness and impact of interventions on improving the coverage of hepatitis B vaccine were included. Strong or moderate evidence showed that reinforcing health education, training and supervision, providing subsidies for facility birth, strengthening the coordination among health care providers, and using out-of-cold-chain storage for vaccines were all important to improving vaccination coverage. We found evidence that community education was the most commonly used intervention, and that outreach programs such as the out-of-cold-chain strategy were more effective in increasing the coverage of vaccination in remote areas where the facility birth rate was relatively low. The essential impact factors were found to be strong government commitment and the cooperation of the different government departments. Public interventions relying on basic health care systems combined with outreach care services were critical elements in improving the hepatitis B vaccination rate in China. This success could not have occurred without exceptional national commitment. PMID:27175710

  12. Increasing Coverage of Hepatitis B Vaccination in China: A Systematic Review of Interventions and Implementation Experiences.

    PubMed

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-05-01

    This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- and Chinese-language studies carried out in mainland China, using PubMed, the Cochrane databases, Web of Knowledge, China National Knowledge Infrastructure, Wanfang data, and other relevant databases. Nineteen articles about the effectiveness and impact of interventions on improving the coverage of hepatitis B vaccine were included. Strong or moderate evidence showed that reinforcing health education, training and supervision, providing subsidies for facility birth, strengthening the coordination among health care providers, and using out-of-cold-chain storage for vaccines were all important to improving vaccination coverage. We found evidence that community education was the most commonly used intervention, and that outreach programs such as the out-of-cold-chain strategy were more effective in increasing the coverage of vaccination in remote areas where the facility birth rate was relatively low. The essential impact factors were found to be strong government commitment and the cooperation of the different government departments. Public interventions relying on basic health care systems combined with outreach care services were critical elements in improving the hepatitis B vaccination rate in China. This success could not have occurred without exceptional national commitment.

  13. Coverage of Certain Preventive Services Under the Affordable Care Act. Final rules.

    PubMed

    2015-07-14

    This document contains final regulations regarding coverage of certain preventive services under section 2713 of the Public Health Service Act (PHS Act), added by the Patient Protection and Affordable Care Act, as amended, and incorporated into the Employee Retirement Income Security Act of 1974 and the Internal Revenue Code. Section 2713 of the PHS Act requires coverage without cost sharing of certain preventive health services by non-grandfathered group health plans and health insurance coverage. These regulations finalize provisions from three rulemaking actions: Interim final regulations issued in July 2010 related to coverage of preventive services, interim final regulations issued in August 2014 related to the process an eligible organization uses to provide notice of its religious objection to the coverage of contraceptive services, and proposed regulations issued in August 2014 related to the definition of "eligible organization,'' which would expand the set of entities that may avail themselves of an accommodation with respect to the coverage of contraceptive services.

  14. Towards universal health coverage: the role of within-country wealth-related inequality in 28 countries in sub-Saharan Africa.

    PubMed

    Hosseinpoor, Ahmad Reza; Victora, Cesar G; Bergen, Nicole; Barros, Aluisio J D; Boerma, Ties

    2011-12-01

    To measure within-country wealth-related inequality in the health service coverage gap of maternal and child health indicators in sub-Saharan Africa and quantify its contribution to the national health service coverage gap. Coverage data for child and maternal health services in 28 sub-Saharan African countries were obtained from the 2000-2008 Demographic Health Survey. For each country, the national coverage gap was determined for an overall health service coverage index and select individual health service indicators. The data were then additively broken down into the coverage gap in the wealthiest quintile (i.e. the proportion of the quintile lacking a required health service) and the population attributable risk (an absolute measure of within-country wealth-related inequality). In 26 countries, within-country wealth-related inequality accounted for more than one quarter of the national overall coverage gap. Reducing such inequality could lower this gap by 16% to 56%, depending on the country. Regarding select individual health service indicators, wealth-related inequality was more common in services such as skilled birth attendance and antenatal care, and less so in family planning, measles immunization, receipt of a third dose of vaccine against diphtheria, pertussis and tetanus and treatment of acute respiratory infections in children under 5 years of age. The contribution of wealth-related inequality to the child and maternal health service coverage gap differs by country and type of health service, warranting case-specific interventions. Targeted policies are most appropriate where high within-country wealth-related inequality exists, and whole-population approaches, where the health-service coverage gap is high in all quintiles.
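The additive decomposition described above can be shown with a small calculation: the national coverage gap splits into the gap that persists even in the wealthiest quintile plus the population attributable risk (PAR), so PAR divided by the national gap is the share of the gap attributable to within-country wealth-related inequality. The numbers below are illustrative, not the survey's.

```python
# Additive decomposition of a national health service coverage gap, with
# invented illustrative numbers (not DHS data):
#   national_gap = richest_quintile_gap + PAR

national_gap = 0.40          # 40% of the population lacks needed services
richest_quintile_gap = 0.22  # gap remaining even among the wealthiest quintile

par = national_gap - richest_quintile_gap   # within-country inequality term
inequality_share = par / national_gap       # share of the gap it explains
```

In this example inequality accounts for about 45% of the gap, squarely inside the 16% to 56% range the study reports across countries; eliminating wealth-related inequality would shrink the national gap to the richest-quintile level.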

  15. PHL7/441: Fixing a Broken Line between the Perceived "Anarchy" of the Web and a Process-Comfortable Pharmaceutical Company

    PubMed Central

    Vercellesi, L

    1999-01-01

    Introduction: In 1998 a pharmaceutical company published its Web site to provide: an institutional presence; multifunctional information to primary customers and the general public; a new way of access to the company; a link to existing company-sponsored sites; and a platform for future projects. Since publication, some significant integrations have been added, in particular a primary interactive service addressed to a selected audience. The need has been felt to foster new projects and to establish the idea of routinely considering the site as a potential tool in the marketing mix, to provide advanced services to customers. Methods: Re-assessment of the site against its objectives; assessment of its perception among the company's potential suppliers. Results: The issue of "web use" was discussed in various management meetings; the trend of Internet use among the primary customers was known; the major concerns expressed were about staffing and return on investment for activities run on the Web. These perceptions are being addressed by making the company more comfortable: running the site through a detailed process and clear procedures, defining a new process for maintenance of the site involving representatives of all the functions; procedures and guidelines; a master file of approved answers and company contacts; categories of activities (information, promotion, education, information to investors, general services, target-specific services); and measures for all the activities run on the Web site. Specifically for the Web site, a concise periodical report is being assessed, covering: statistics about hits and mails, compared to the corporate data; indication of new items published; description by the "supplier" of new or ongoing innovative projects, to transfer best practice; basic figures on the Italian trend in Internet use, specifically in the pharmaceutical and medical fields; comments on a few competitor sites; and examples of potential uses deriving from other Web sites.
Discussion: The comparatively low use of the Internet in Italy has affected the systematic professional exploitation of the company site. The label "anarchic" commonly attached to the Web by local media has led to an attempt to "master" and "normalize" the site with a stricter approach than usual: most procedures and guidelines had to be designed from scratch, as none were available for similar activities traditionally run. A short set of information has been requested for inclusion in the report: its wide coverage will help to give a flavour of the global parallel new world developing on the net. Hopefully this approach will help to create a comfortable attitude towards the medium in the whole organisation and to build working experience with the net.

  16. Developing Toolsets for AirBorne Data (TAD): Overview of Design Concept

    NASA Astrophysics Data System (ADS)

    Parker, L.; Perez, J.; Chen, G.; Benson, A.; Peeters, M. C.

    2013-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gas and aerosol properties. Even though the spatial and temporal coverage is limited, the aircraft data offer high resolution and comprehensive simultaneous coverage of many variables, e.g. ozone precursors, intermediate photochemical species, and photochemical products. The recent NASA Earth Venture Program has generated an unprecedented amount of aircraft observations in terms of the sheer number of measurements and data volume. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community's needs for aircraft data for scientific research on issues relevant to climate change and air quality, particularly to: 1) Provide timely access to a broad user community, 2) Provide an intuitive user interface to facilitate quick discovery of the variables and data, 3) Provide data products and tools to facilitate model assessment activities, e.g., merge files and data subsetting capabilities, 4) Provide simple utility 'calculators', e.g., unit conversion and aerosol size distribution processing, and 5) Provide Web Coverage Service capable tools to enhance the data usability. The general strategy and design of TAD will be presented.
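The utility "calculators" and subsetting capabilities described above can be sketched in a few lines. This is a hypothetical illustration, not TAD's actual API; the function names, the mixing-ratio conversion, and the sample flight data are assumptions.

```python
# Hypothetical sketch of TAD-style utilities: a unit-conversion
# "calculator" and a time-window subsetter. Names and data are
# illustrative assumptions, not part of the real TAD toolset.

def ppmv_to_ppbv(x):
    """Convert a mixing ratio from parts-per-million to parts-per-billion."""
    return x * 1000.0

def subset_by_time(records, start, end):
    """Keep only (time, value) samples whose time falls in [start, end]."""
    return [(t, v) for t, v in records if start <= t <= end]

# Fabricated ozone samples: (seconds since takeoff, O3 in ppbv).
flight = [(0, 41.2), (60, 39.8), (120, 44.1), (180, 47.5)]
window = subset_by_time(flight, 60, 120)
```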

  17. Conducting Web-Based Surveys. ERIC Digest.

    ERIC Educational Resources Information Center

    Solomon, David J.

    Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…

  18. 42 CFR 486.106 - Condition for coverage: Referral for service and preservation of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... SPECIALIZED SERVICES FURNISHED BY SUPPLIERS Conditions for Coverage: Portable X-Ray Services § 486.106 Condition for coverage: Referral for service and preservation of records. All portable X-ray services... are properly preserved. (a) Standard—referral by a physician. Portable X-ray examinations are...

  19. 5 CFR 1650.42 - How to obtain a financial hardship withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... form or use the TSP Web site to initiate a request. A participant's ability to complete a financial hardship withdrawal on the Web will depend on his or her retirement system coverage and marital status. (b...

  20. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  1. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  2. 5 CFR 1650.24 - How to obtain a post-employment withdrawal.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request form or use the TSP Web site to initiate a request. (A participant's ability to complete a post-employment withdrawal on the Web will depend on his or her retirement system coverage, withdrawal election...

  3. PlanetServer: Innovative approaches for the online analysis of hyperspectral satellite data from Mars

    NASA Astrophysics Data System (ADS)

    Oosthoek, J. H. P.; Flahaut, J.; Rossi, A. P.; Baumann, P.; Misev, D.; Campalani, P.; Unnithan, V.

    2014-06-01

    PlanetServer is a WebGIS system, currently under development, enabling the online analysis of Compact Reconnaissance Imaging Spectrometer (CRISM) hyperspectral data from Mars. It is part of the EarthServer project, which builds infrastructure for online access and analysis of huge Earth Science datasets. Core functionality consists of the rasdaman Array Database Management System (DBMS) for storage, and the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) for data querying. Various WCPS queries have been designed to access spatial and spectral subsets of the CRISM data. The client WebGIS, consisting mainly of the OpenLayers JavaScript library, uses these queries to enable online spatial and spectral analysis. Currently the PlanetServer demonstration consists of two CRISM Full Resolution Target (FRT) observations surrounding the NASA Curiosity rover landing site. A detailed analysis of one of these observations is performed in the Case Study section. The current PlanetServer functionality is described step by step, and is tested by focusing on detecting mineralogical evidence described in earlier Gale crater studies. Both the PlanetServer methodology and its possible use for mineralogical studies will be further discussed. Future work includes batch ingestion of CRISM data and further development of the WebGIS and analysis tools.
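A WCPS query of the kind PlanetServer issues can be sent to rasdaman's petascope endpoint as an ordinary HTTP request. The sketch below only builds such a request; the endpoint URL, coverage name, and axis labels are hypothetical assumptions, not taken from PlanetServer itself.

```python
from urllib.parse import urlencode

# Hypothetical petascope endpoint; PlanetServer's real URL may differ.
ENDPOINT = "http://planetserver.example.org/rasdaman/ows"

# A WCPS expression extracting one spectral band over a spatial subset
# of a (hypothetical) CRISM coverage and encoding the result as PNG.
query = (
    'for c in (frt0000b6f1_trr3) '
    'return encode(c[E(100:400), N(100:400), band(60)], "png")'
)

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",  # the WCPS request operation
    "query": query,
}
url = ENDPOINT + "?" + urlencode(params)
```

Fetching `url` (e.g. with `urllib.request.urlopen`) would return the encoded subset if such a server and coverage existed.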

  4. Data Discovery and Access via the Heliophysics Events Knowledgebase (HEK)

    NASA Astrophysics Data System (ADS)

    Somani, A.; Hurlburt, N. E.; Schrijver, C. J.; Cheung, M.; Freeland, S.; Slater, G. L.; Seguin, R.; Timmons, R.; Green, S.; Chang, L.; Kobashi, A.; Jaffey, A.

    2011-12-01

    The HEK is an integrated system which helps direct scientists to solar events and data from a variety of providers. The system is fully operational, and adoption of HEK has been growing since the launch of NASA's SDO mission. In this presentation we describe the different components that comprise HEK. The Heliophysics Events Registry (HER) and Heliophysics Coverage Registry (HCR) form the two major databases behind the system. The HCR allows the user to search on coverage metadata for a variety of instruments, while the HER allows the user to search on annotated event metadata. Both the HCR and HER are accessible via a web API which can return search results in machine-readable formats (e.g. XML and JSON). A variety of SolarSoft services are also provided to allow users to search the HEK as well as obtain and manipulate data. Other components include: the Event Detection System (EDS), which continually runs feature-finding algorithms on SDO data to populate the HER with relevant events; a web form for users to request SDO data cutouts for multiple AIA channels as well as HMI line-of-sight magnetograms; iSolSearch, which allows a user to browse events in the HER and search for specific events over a specific time interval, all within a graphical web page; Panorama, the software tool used for rapid visualization of large volumes of solar image data in multiple channels/wavelengths, from which the user can also easily create WYSIWYG movies and launch the Annotator tool to describe events and features; and EVACS, which provides a JOGL-powered client for the HER and HCR, displaying the searched-for events on a full-disk magnetogram of the Sun while showing more detailed information for events.
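A machine-readable HER search can be expressed as a plain HTTP query string. The sketch below builds one; the parameter names follow commonly documented HEK client usage but should be treated as assumptions here rather than an authoritative API reference.

```python
from urllib.parse import urlencode

# Public HER endpoint cited in HEK client documentation.
HEK_URL = "http://www.lmsal.com/hek/her"

# Search for flare events in a one-day window over the full disk.
# Parameter names/values are assumptions based on common client usage.
params = {
    "cmd": "search",
    "type": "column",
    "result_limit": 10,
    "event_type": "FL",                       # FL = solar flare
    "event_starttime": "2011-08-09T00:00:00",
    "event_endtime": "2011-08-10T00:00:00",
    "event_coordsys": "helioprojective",
    "x1": -1200, "x2": 1200,                  # arcsec bounding box
    "y1": -1200, "y2": 1200,
    "cosec": 2,                               # request JSON output
}
search_url = HEK_URL + "?" + urlencode(params)
```

Fetching `search_url` would return a JSON list of matching event records.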

  5. Metadata Authoring with Versatility and Extensibility

    NASA Technical Reports Server (NTRS)

    Pollack, Janine; Olsen, Lola

    2004-01-01

    NASA's Global Change Master Directory (GCMD) assists the scientific community in the discovery of and linkage to Earth science data sets and related services. The GCMD holds over 13,800 data set descriptions in Directory Interchange Format (DIF) and 700 data service descriptions in Service Entry Resource Format (SERF), encompassing the disciplines of geology, hydrology, oceanography, meteorology, and ecology. Data descriptions also contain geographic coverage information and direct links to the data, thus allowing researchers to discover data pertaining to a geographic location of interest, then quickly acquire those data. The GCMD strives to be the preferred data locator for world-wide directory-level metadata. In this vein, scientists and data providers must have access to intuitive and efficient metadata authoring tools. Existing GCMD tools are attracting widespread usage; however, a need for tools that are portable, customizable and versatile still exists. With tool usage directly influencing metadata population, it has become apparent that new tools are needed to fill these voids. As a result, the GCMD has released a new authoring tool allowing for both web-based and stand-alone authoring of descriptions. Furthermore, this tool incorporates the ability to plug-and-play the metadata format of choice, offering users options of DIF, SERF, FGDC, ISO or any other defined standard. Allowing data holders to work with their preferred format, as well as the option of a stand-alone application or web-based environment, docBUILDER will assist the scientific community in efficiently creating quality data and services metadata.
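A DIF record of the kind docBUILDER produces is an XML document of directory-level fields. The fragment below is a minimal, non-validated sketch; beyond the well-known `Entry_ID`/`Entry_Title` fields, the element names are illustrative assumptions rather than a complete DIF schema.

```python
import xml.etree.ElementTree as ET

# Minimal DIF-style skeleton, built programmatically. This is an
# illustrative sketch, not a schema-valid GCMD record.
dif = ET.Element("DIF")
ET.SubElement(dif, "Entry_ID").text = "EXAMPLE_DATASET_V1"     # hypothetical id
ET.SubElement(dif, "Entry_Title").text = "Example Earth Science Data Set"
summary = ET.SubElement(dif, "Summary")
ET.SubElement(summary, "Abstract").text = "Directory-level description of the data set."

xml_text = ET.tostring(dif, encoding="unicode")
```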

  6. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. 
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.
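The "call a standard WCS server" step in such a workflow reduces to building a key-value-pair GetCoverage request. The sketch below constructs a WCS 1.0.0-style request; the server URL and coverage identifier are hypothetical assumptions, not real SciFlo endpoints.

```python
from urllib.parse import urlencode

# Hypothetical WCS endpoint and coverage id for illustration only.
WCS_SERVER = "http://datacenter.example.gov/wcs"

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "AIRS_surface_temp",   # hypothetical coverage name
    "crs": "EPSG:4326",
    "bbox": "-125,25,-65,50",          # minx,miny,maxx,maxy (degrees)
    "time": "2007-07-01",
    "width": 600,
    "height": 300,
    "format": "GeoTIFF",
}
getcov_url = WCS_SERVER + "?" + urlencode(params)
```

A workflow operator would fetch this URL and hand the returned GeoTIFF to the next operator in the flow.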

  7. Personalization of Rule-based Web Services.

    PubMed

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wish to receive personalized services. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide features supporting this, such as personalization of services and intelligent matchmaking. In this research, a flexible, personalized Rule-based Web Services System is proposed to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents. This system performs matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks search results. A prototype of efficient Web Services search and construction for the suggested system is developed based on the current work.
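The matchmaking-and-ranking idea can be illustrated with a toy scorer: each user preference is treated as a rule, services are scored by how many rules they satisfy, and results are ranked by score. This is a generic sketch, not the paper's actual rule engine; all names and attributes are made up.

```python
# Toy rule-based matchmaking: rank candidate services by how many of the
# user's preference rules they satisfy. Purely illustrative.

services = [
    {"name": "MapA", "category": "map", "cost": "free", "lang": "en"},
    {"name": "MapB", "category": "map", "cost": "paid", "lang": "en"},
    {"name": "GeoC", "category": "geocode", "cost": "free", "lang": "ko"},
]

def match_score(service, rules):
    """Count how many preference rules this service description satisfies."""
    return sum(1 for key, want in rules.items() if service.get(key) == want)

prefs = {"category": "map", "cost": "free"}       # the user's rules
ranked = sorted(services, key=lambda s: match_score(s, prefs), reverse=True)
best = ranked[0]["name"]
```

Real systems would add rule weights and semantic matching on ontology terms rather than exact string equality.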

  8. 24 CFR 242.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    .... Debt service coverage ratio is a measure of a hospital's ability to pay interest and principal with cash generated from current operations. Debt service ratio is calculated as follows: Debt Service Coverage Ratio (total debt service coverage on all long-term capital debt) equals the excess of revenues...
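The regulation excerpt above is truncated, so the exact numerator cannot be recovered here. As an illustrative paraphrase only, a common formulation divides cash available for debt service (excess of revenues over expenses plus interest, depreciation, and amortization) by total debt service:

```python
# Illustrative debt service coverage ratio arithmetic. The CFR text above
# is truncated; this numerator is a common formulation, assumed here,
# not a quotation of 24 CFR 242.1.

def debt_service_coverage_ratio(excess_revenue, interest, depreciation,
                                amortization, total_debt_service):
    cash_available = excess_revenue + interest + depreciation + amortization
    return cash_available / total_debt_service

# Made-up hospital figures, in dollars.
ratio = debt_service_coverage_ratio(2_000_000, 500_000, 300_000,
                                    200_000, 1_500_000)
```

A ratio above 1.0 indicates current operations generate enough cash to cover long-term capital debt payments.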

  9. 24 CFR 242.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    .... Debt service coverage ratio is a measure of a hospital's ability to pay interest and principal with cash generated from current operations. Debt service ratio is calculated as follows: Debt Service Coverage Ratio (total debt service coverage on all long-term capital debt) equals the excess of revenues...

  10. Distributed spatial information integration based on web service

    NASA Astrophysics Data System (ADS)

    Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng

    2008-10-01

    Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. As a result, many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
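The dispatcher pattern described above (invoke all map services asynchronously, wait for every result, then overlay) can be sketched with coroutines. The services and the "overlay" step below are simulated stand-ins, not the paper's implementation.

```python
import asyncio

# Toy dispatcher: invoke several (simulated) web map services
# concurrently, then combine the results once all have returned.

async def fetch_layer(name, delay):
    await asyncio.sleep(delay)      # stands in for a WMS GetMap call
    return f"{name}-image"

async def dispatch(layers):
    tasks = [fetch_layer(n, d) for n, d in layers]
    images = await asyncio.gather(*tasks)   # all services run in parallel
    return " + ".join(images)               # stands in for image overlay

composite = asyncio.run(dispatch([("roads", 0.02), ("rivers", 0.01)]))
```

`asyncio.gather` preserves submission order, so the composite is deterministic even though "rivers" returns first.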

  11. Distributed spatial information integration based on web service

    NASA Astrophysics Data System (ADS)

    Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng

    2009-10-01

    Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. As a result, many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.

  12. Enhancing Access to Drought Information Using the CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Schreuders, K. A.; Tarboton, D. G.; Horsburgh, J. S.; Sen Gupta, A.; Reeder, S.

    2011-12-01

    The National Drought Information System (NIDIS) Upper Colorado River Basin pilot study is investigating and establishing capabilities for better dissemination of drought information for early warning and management. As part of this study we are using and extending functionality from the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) to provide better access to drought-related data in the Upper Colorado River Basin. The CUAHSI HIS is a federated system for sharing hydrologic data. It comprises multiple data servers, referred to as HydroServers, that publish data in a standard XML format called Water Markup Language (WaterML), using web services referred to as WaterOneFlow web services. HydroServers can also publish geospatial data using Open Geospatial Consortium (OGC) web map, feature and coverage services and are capable of hosting web and map applications that combine geospatial datasets with observational data served via web services. HIS also includes a centralized metadata catalog that indexes data from registered HydroServers and a data access client referred to as HydroDesktop. For NIDIS, we have established a HydroServer to publish drought index values as well as the input data used in drought index calculations. Primary input data required for drought index calculation include streamflow, precipitation, reservoir storages, snow water equivalent, and soil moisture. We have developed procedures to redistribute the input data to the time and space scales chosen for drought index calculation, namely half-monthly time intervals for HUC 10 subwatersheds. 
The spatial redistribution approaches used for each input parameter are dependent on the spatial linkages for that parameter, i.e., the redistribution procedure for streamflow is dependent on the upstream/downstream connectivity of the stream network, and the precipitation redistribution procedure is dependent on elevation to account for orographic effects. A set of drought indices is then calculated from the redistributed data. We have created automated data and metadata harvesters that periodically scan and harvest new data from each of the input databases and calculate extensions to the resulting derived data sets, ensuring that the data available on the drought server are kept up to date. This paper will describe this system, showing how it facilitates the integration of data from multiple sources to inform the planning and management of water resources during drought. The system may be accessed at http://drought.usu.edu.
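The half-monthly temporal aggregation step can be sketched as binning daily samples into two periods per month and averaging. The split at day 15 is an assumption for illustration; the study's exact binning rule is not given in the abstract.

```python
from datetime import date
from collections import defaultdict

# Sketch of half-monthly aggregation: bin daily (date, value) samples
# into two periods per month (days 1-15 and 16-end, an assumed rule)
# and average each bin.

def half_month_key(d):
    return (d.year, d.month, 1 if d.day <= 15 else 2)

def aggregate_half_monthly(samples):
    bins = defaultdict(list)
    for d, v in samples:
        bins[half_month_key(d)].append(v)
    return {k: sum(vs) / len(vs) for k, vs in bins.items()}

# Fabricated daily precipitation values (mm) for one subwatershed.
daily = [(date(2011, 7, 10), 2.0), (date(2011, 7, 14), 4.0),
         (date(2011, 7, 20), 6.0)]
means = aggregate_half_monthly(daily)
```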

  13. Providing Multi-Page Data Extraction Services with XWRAPComposer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ling; Zhang, Jianjun; Han, Wei

    2008-04-30

    Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  14. Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories

    NASA Astrophysics Data System (ADS)

    Scharl, Arno

    International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.

  15. An Automated End-To-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    NASA Astrophysics Data System (ADS)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduction of business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain. Geospatial Web services address this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
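Quality-driven selection among functionally equivalent services can be sketched as a weighted scoring of non-functional properties. The properties, weights, and candidate values below are illustrative assumptions, not the paper's agent architecture.

```python
# Toy QoS-based selection: combine each candidate's non-functional
# properties into one weighted score and pick the best-fit service.
# All properties, weights, and values are made up for illustration.

candidates = {
    "WMS-A": {"availability": 0.99, "response_ms": 120, "cost": 0.0},
    "WMS-B": {"availability": 0.95, "response_ms": 40,  "cost": 0.1},
}

def qos_score(q, weights):
    # Higher availability is better; lower response time and cost are
    # better, so those terms enter negatively.
    return (weights["availability"] * q["availability"]
            - weights["response_ms"] * q["response_ms"]
            - weights["cost"] * q["cost"])

weights = {"availability": 1.0, "response_ms": 0.001, "cost": 0.5}
best_fit = max(candidates, key=lambda n: qos_score(candidates[n], weights))
```

A real selector would normalize each property to a common scale before weighting, so that units do not dominate the score.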

  16. Effective coverage of primary care services in eight high-mortality countries

    PubMed Central

    Malata, Address; Ndiaye, Youssoupha; Kruk, Margaret E

    2017-01-01

    Introduction Measurement of effective coverage (quality-corrected coverage) of essential health services is critical to monitoring progress towards the Sustainable Development Goal for health. We combine facility and household surveys from eight low-income and middle-income countries to examine effective coverage of maternal and child health services. Methods We developed indices of essential clinical actions for antenatal care, family planning and care for sick children from existing guidelines and used data from direct observations of clinical visits conducted in Haiti, Kenya, Malawi, Namibia, Rwanda, Senegal, Tanzania and Uganda between 2007 and 2015 to measure quality of care delivered. We calculated healthcare coverage for each service from nationally representative household surveys and combined quality with utilisation estimates at the subnational level to quantify effective coverage. Results Health facility and household surveys yielded over 40 000 direct clinical observations and over 100 000 individual reports of healthcare utilisation. Coverage varied between services, with much greater use of any antenatal care than family planning or sick-child care, as well as within countries. Quality of care was poor, with few regions demonstrating more than 60% average performance of basic clinical practices in any service. Effective coverage across all eight countries averaged 28% for antenatal care, 26% for family planning and 21% for sick-child care. Coverage and quality were not strongly correlated at the subnational level; effective coverage varied by as much as 20% between regions within a country. Conclusion Effective coverage of three primary care services for women and children in eight countries was substantially lower than crude service coverage due to major deficiencies in care quality. Better performing regions can serve as examples for improvement. 
Systematic increases in the quality of care delivered—not just utilisation gains—will be necessary to progress towards truly beneficial universal health coverage. PMID:29632704
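The paper's central quantity, effective (quality-corrected) coverage, combines crude utilisation coverage with observed quality of care. A minimal version of that combination, with made-up regional proportions, is:

```python
# Effective coverage = crude coverage corrected by quality of care.
# Regional numbers below are fabricated for illustration only.

def effective_coverage(crude_coverage, quality):
    """Both inputs are proportions in [0, 1]."""
    return crude_coverage * quality

regions = {
    "north": (0.90, 0.50),   # (crude coverage, avg. quality score)
    "south": (0.60, 0.40),
}
effective = {r: effective_coverage(c, q) for r, (c, q) in regions.items()}
```

This is why effective coverage (21-28% in the study) sits far below crude coverage: even near-universal utilisation is discounted by low quality scores.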

  17. Comparing Unique Title Coverage of Web of Science and Scopus in Earth and Atmospheric Sciences

    ERIC Educational Resources Information Center

    Barnett, Philip; Lascar, Claudia

    2012-01-01

    The current journal titles in earth and atmospheric sciences, that are unique to each of two databases, Web of Science and Scopus, were identified using different methods. Comparing by subject category shows that Scopus has hundreds of unique titles, and Web of Science just 16. The titles unique to each database have low SCImago Journal Rank…

  18. The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.

    2010-12-01

    Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loose model coupling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to sub-sets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). 
TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
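Supplying analysis zones "as an OGC standard Web Feature Service", as described above, amounts to issuing a GetFeature request for the zone geometry. The sketch below builds a WFS 1.1.0-style request; the server URL and feature type name are hypothetical assumptions.

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint serving a user's modeling units.
WFS_SERVER = "http://gdp.example.usgs.gov/wfs"

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "gdp:watersheds",   # hypothetical feature type
    "outputFormat": "GML2",
}
wfs_url = WFS_SERVER + "?" + urlencode(params)
```

The GDP would fetch this URL to obtain the zone polygons, then summarize gridded data over each feature.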

  19. Datacube as a Service to Exploit the Full Potential of Data Cloudy Distributed

    NASA Astrophysics Data System (ADS)

    Mantovani, S.; Natali, S.; Barboni, D.; Hogan, P.; Baumann, P.; Clements, O.

    2017-12-01

    For almost half a century, satellite platforms devoted to Earth Observation have allowed creating a complete description of the global environment, generating hundreds of Petabytes of data. The continuous increase of data availability (and respective data volume), together with the raised awareness of climate change issues, has made people of all kinds (from citizens to decision makers to scientists) sensitive to environmental threats, improving their inclination to invest in monitoring and mitigation activities. Recently, the term "datacube" has received increasing attention for its potential to simplify the provision of "Big Earth Data" services, serving massive spatio-temporal data in an analysis-ready way. A number of datacube-ready platforms have emerged to enable a new collaborative approach to analyse the vast amount of satellite imagery and other Earth Observation data, making it quicker and easier to explore a time series of images stored in global or regional datacubes. In this context, the European Space Agency and European Commission H2020-funded projects ([1], [2]) are bringing together multiple organisations in Europe, Australia and the United States to allow federated data holdings to be analysed using web-based access to petabytes of multidimensional geospatial datasets. In this study, we provide an overview of the existing datacubes (EarthServer-2 datacubes, Sentinel Datacube, European and Australian Landsat Datacubes, …), how the regional datacube structures differ from each other, how datacube interoperability is achieved through OpenSearch and Web Coverage Service (WCS) standards, and finally how the datacube contents can be visualized on a virtual globe (ESA-NASA WebWorldWind) based on a WC(P)S query, and how data can be manipulated on the fly through web-based interfaces, such as Jupyter notebook. The current study is co-financed by the European Space Agency under the MaaS project (ESRIN Contract No. 
4000114186/15/I-LG) and the European Union's Horizon 2020 research and innovation programme under the EarthServer-2 project (Grant Agreement No. 654367). [1] MEA as a Service (http://eodatacube.eu) [2] EarthServer-2 (http://www.earthserver.eu)
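    The WC(P)S access pattern described above can be sketched with a small request builder. This is a minimal illustration, not code from the project: the endpoint URL is a placeholder, and the coverage name and axis labels are illustrative assumptions rather than confirmed federation holdings.

```python
from urllib.parse import urlencode

# Placeholder endpoint -- not a real federation member.
ENDPOINT = "https://example.org/rasdaman/ows"

def getcoverage_url(coverage_id, subsets, fmt="image/tiff"):
    """Build a WCS 2.0 GetCoverage request with one subset parameter per axis."""
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("format", fmt),
    ]
    for axis, lo, hi in subsets:
        params.append(("subset", f"{axis}({lo},{hi})"))
    return ENDPOINT + "?" + urlencode(params)

# A WCPS query slicing a 3-D datacube at one time step and encoding it as PNG.
# Coverage name and axis label are illustrative.
wcps = (
    'for $c in (AvgLandTemp) '
    'return encode($c[ansi("2014-07")], "image/png")'
)

url = getcoverage_url("AvgLandTemp",
                      [("Lat", 35, 45), ("Long", 5, 15),
                       ("ansi", '"2014-07-01"', '"2014-07-31"')])
print(url)
```

    A client such as a Jupyter notebook would send this URL (or POST the WCPS query) to the server and receive the encoded array back for display or further processing.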

  20. MedlinePlus Connect: Web Service

    MedlinePlus

    ... https://medlineplus.gov/connect/service.html MedlinePlus Connect: Web Service To use the sharing features on this ... if you implement MedlinePlus Connect by contacting us . Web Service Overview The parameters for the Web service ...

  1. Catch the A-Train from the NASA GIBS/Worldview Platform

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Alarcon, C.; Baynes, K.; Boller, R. A.; Cechini, M. F.; De Cesare, C.; De Luca, A. P.; Gunnoe, T.; King, B. A.; King, J.; Pressley, N. N.; Roberts, J. T.; Rodriguez, J.; Thompson, C. K.; Wong, M. M.

    2016-12-01

    The satellites and instruments of the Afternoon Train are providing an unprecedented combination of nearly simultaneous measurements. One of the challenges for researchers and applications users is to sift through these combinations to find particular sets of data that correspond to their interests. Visualization of the data is one way to explore these combinations. NASA's Worldview tool is designed to do just that - to interactively browse full-resolution satellite imagery. Worldview (https://worldview.earthdata.nasa.gov/) is web-based and developed using open libraries and standards (OpenLayers, JavaScript, CSS, HTML) for cross-platform compatibility. It addresses growing user demands for access to full-resolution imagery by providing a responsive, interactive interface with global coverage and no artificial boundaries. In addition to science data imagery, Worldview provides ancillary datasets such as coastlines and borders, socio-economic layers, and satellite orbit tracks. Worldview interacts with the Earthdata Search Client to provide download of the data files associated with the imagery being viewed. The imagery used by Worldview is provided by NASA's Global Imagery Browse Services (GIBS - https://earthdata.nasa.gov/gibs), which provide highly responsive, highly scalable imagery services. Requests are made via the OGC Web Map Tile Service (WMTS) standard. In addition to Worldview, other clients can be developed using a variety of web-based libraries, desktop and mobile app libraries, and GDAL script-based access. GIBS currently includes more than 106 science data sets from seven instruments aboard three of the A-Train satellites, and new data sets are being added as part of the President's Big Earth Data Initiative (BEDI). Efforts are underway to include new imagery types, such as vectors and curtains, in Worldview/GIBS, which will be used to visualize additional A-Train science parameters.
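    The WMTS access mentioned above can be illustrated with a small REST-style tile URL builder. The URL pattern follows GIBS documentation, but the layer name, date, and tile-matrix-set below are illustrative assumptions, not guaranteed to correspond to a live layer.

```python
def gibs_tile_url(layer, date, zoom, row, col,
                  projection="epsg4326", matrix_set="250m", ext="jpg"):
    """Build a REST-style WMTS GetTile URL following the GIBS tile pattern.

    Layer name, date, and tile-matrix-set are illustrative; check the GIBS
    capabilities document for the layers actually available.
    """
    return ("https://gibs.earthdata.nasa.gov/wmts/"
            f"{projection}/best/{layer}/default/{date}/"
            f"{matrix_set}/{zoom}/{row}/{col}.{ext}")

# One true-color tile for a given day; any WMTS client (or GDAL) can
# request tiles in this form and stitch them into a browse image.
url = gibs_tile_url("MODIS_Terra_CorrectedReflectance_TrueColor",
                    "2016-08-01", 2, 1, 2)
print(url)
```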

  2. Focused Crawling of the Deep Web Using Service Class Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  3. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  4. 47 CFR 22.951 - Minimum coverage requirement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Minimum coverage requirement. 22.951 Section 22.951 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.951 Minimum coverage requirement. Applications for...

  5. The interoperability skill of the Geographic Portal of the ISPRA - Geological Survey of Italy

    NASA Astrophysics Data System (ADS)

    Pia Congi, Maria; Campo, Valentina; Cipolloni, Carlo; Delogu, Daniela; Ventura, Renato; Battaglini, Loredana

    2010-05-01

    The Geographic Portal of the Geological Survey of Italy (ISPRA), available at http://serviziogeologico.apat.it/Portal, was planned according to the standard criteria of the INSPIRE directive. ArcIMS services and, at the same time, WMS and WFS services have been realized to satisfy the different clients. For each database and web service, the metadata have been written in agreement with ISO 19115. The management architecture of the portal allows it to encode client input and output requests both in ArcXML and in GML. The web applications and web services have been realized for each database owned by the Land Protection and Georesources Department, concerning the geological map at the scale 1:50.000 (CARG Project) and 1:100.000, the IFFI landslide inventory, the boreholes due to Law 464/84, the large-scale geological map and all the raster format maps. The portal published thus far is at the experimental stage, but through the development of a new graphical interface it will reach its final version. The WMS and WFS services, including metadata, will be re-designed. The validity of the methodology and the applied standards allows us to look ahead to further developments. In addition, it must be borne in mind that the capacity of the new geological standard language (GeoSciML), which is already incorporated in the deployed web services, will allow a better display and query of geological data in an interoperable way. The characteristics of geological data demand specific libraries of cartographic symbols that are not yet available in a WMS service; this is another aspect concerning the standardization of geological information. 
    Therefore, the following have been carried out so far: - a library of geological symbols to be used for printing, with a sketch of system colors, and a library for displaying data on screen, which almost completely solves the problems of point and area coverage data (also oriented) but still presents problems for linear data (solutions: ArcIMS services from ArcMap projects or a specific SLD implementation for WMS services); - an update of the "Guidelines for the supply of geological data", which will be published shortly; - official involvement of the Geological Survey of Italy in the IUGS-CGI working group for the development and experimentation of the new GeoSciML language with WMS/WFS services. Geographic information is made available through metadata that can be distributed online so that search engines can find it through specialized searches. The metadata collected in catalogs are structured according to a standard (ISO 19135). The catalogs are a 'common' interface to locate, view and query data and metadata services, web services and other resources. Working in a growing sector of environmental knowledge, the focus is on gathering the participation of other subjects that can enrich the available informative content, so as to arrive at a real portal of national interest, especially for disaster management.

  6. AirNow Information Management System - Global Earth Observation System of Systems Data Processor for Real-Time Air Quality Data Products

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Dye, T. S.; White, J. E.; Dickerson, P.; Pasch, A. N.; Miller, D. S.; Chan, A. C.

    2012-12-01

    Built upon the success of the U.S. Environmental Protection Agency's (EPA) AirNow program (www.AirNow.gov), the AirNow-International (AirNow-I) system contains an enhanced suite of software programs that process and quality control real-time air quality and environmental data and distribute customized maps, files, and data feeds. The goals of the AirNow-I program are similar to those of the successful U.S. program and include fostering the exchange of environmental data; making advances in air quality knowledge and applications; and building a community of people, organizations, and decision makers in environmental management. In 2010, Shanghai became the first city in China to run this state-of-the-art air quality data management and notification system. AirNow-I consists of a suite of modules (software programs and schedulers) centered on a database. One such module is the Information Management System (IMS), which can automatically produce maps and other data products through the use of GIS software to provide the most current air quality information to the public. Developed with Global Earth Observation System of Systems (GEOSS) interoperability in mind, IMS is based on non-proprietary standards, with preference to formal international standards. The system depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. In particular, the specifications include standards for service-oriented architecture and web-based interfaces, such as a web mapping service (WMS), web coverage service (WCS), web feature service (WFS), sensor web services, and Really Simple Syndication (RSS) feeds. IMS is flexible, open, redundant, and modular. It also allows the merging of data grids to create complex grids that show comprehensive air quality conditions. 
For example, the AirNow Satellite Data Processor (ASDP) was recently developed to merge PM2.5 estimates from National Aeronautics and Space Administration (NASA) satellite data and AirNow observational data, creating more precise maps and gridded data products for under-monitored areas. The ASDP can easily incorporate other data feeds, including fire and smoke locations, to build enhanced real-time air quality data products. In this presentation, we provide an overview of the features and functions of IMS, an explanation of how data moves through IMS, the rationale of the system architecture, and highlights of the ASDP as an example of the modularity and scalability of IMS.

  7. 42 CFR 486.106 - Condition for coverage: Referral for service and preservation of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition for coverage: Referral for service and preservation of records. 486.106 Section 486.106 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF...

  8. 42 CFR 416.46 - Condition for coverage-Nursing services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 3 2012-10-01 2012-10-01 false Condition for coverage-Nursing services. 416.46... Coverage § 416.46 Condition for coverage—Nursing services. The nursing services of the ASC must be directed and staffed to assure that the nursing needs of all patients are met. (a) Standard: Organization and...

  9. 42 CFR 416.46 - Condition for coverage-Nursing services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 3 2014-10-01 2014-10-01 false Condition for coverage-Nursing services. 416.46... Coverage § 416.46 Condition for coverage—Nursing services. The nursing services of the ASC must be directed and staffed to assure that the nursing needs of all patients are met. (a) Standard: Organization and...

  10. 42 CFR 416.46 - Condition for coverage-Nursing services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 3 2013-10-01 2013-10-01 false Condition for coverage-Nursing services. 416.46... Coverage § 416.46 Condition for coverage—Nursing services. The nursing services of the ASC must be directed and staffed to assure that the nursing needs of all patients are met. (a) Standard: Organization and...

  11. Process model-based atomic service discovery and composition of composite semantic web services using web ontology language for services (OWL-S)

    NASA Astrophysics Data System (ADS)

    Paulraj, D.; Swamynathan, S.; Madhaiyan, M.

    2012-11-01

    Web Service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest to support business-to-business (B2B) or enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Web Ontology Language for Services (OWL-S). The profile of the service is a derived and concise description but not a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to prove that the process model is a better choice than the service profile for service discovery; second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method which uses process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works carried out in this area have proposed solutions only for the composition of atomic services, whereas this article proposes a solution for the composition of composite semantic web services.

  12. Automatic geospatial information Web service composition based on ontology interface matching

    NASA Astrophysics Data System (ADS)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the isolation of information in the geospatial information sharing field. Geospatial information web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, thus plays the key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of the algorithm and its great promise given the emerging demand for geospatial information web service composition.
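    The interface-matching idea above can be sketched in a few lines. This is a toy illustration: a tiny synonym table stands in for WordNet path distances, and the service names, parameter names, and similarity threshold are invented for the example, not taken from the paper.

```python
# Synonym sets standing in for WordNet (illustrative only).
SYNONYMS = {
    "coordinates": {"location", "position", "coordinates"},
    "placename": {"placename", "address"},
}

def similar(a, b):
    """1.0 if the two terms match or share a synonym set, else 0.0
    (a stand-in for a WordNet-based semantic distance)."""
    if a == b:
        return 1.0
    for syns in SYNONYMS.values():
        if a in syns and b in syns:
            return 1.0
    return 0.0

def chain(services, start, goal):
    """Greedily link services whose input matches the data produced so far."""
    current, path = start, []
    for name, inp, out in services:
        if similar(current, inp) >= 1.0:
            path.append(name)
            current = out
            if similar(current, goal) >= 1.0:
                return path
    return None

# Hypothetical candidate services as (name, input, output) triples.
services = [
    ("Geocoder", "address", "position"),   # placename -> coordinates
    ("Buffer", "location", "geometry"),    # coordinates -> geometry
]
print(chain(services, "placename", "geometry"))
```

    A real implementation would rank candidates by graded semantic distance rather than a binary match, but the chaining logic is the same.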

  13. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 11 2011-01-01 2011-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area. (b...

  14. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area. (b...

  15. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area. (b...

  16. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 11 2012-01-01 2012-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area. (b...

  17. 7 CFR 1737.31 - Area Coverage Survey (ACS).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 11 2014-01-01 2014-01-01 false Area Coverage Survey (ACS). 1737.31 Section 1737.31... Studies-Area Coverage Survey and Loan Design § 1737.31 Area Coverage Survey (ACS). (a) The Area Coverage Survey (ACS) is a market forecast of service requirements of subscribers in a proposed service area. (b...

  18. Graph-Based Semantic Web Service Composition for Healthcare Data Integration.

    PubMed

    Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena

    2017-01-01

    Among the numerous and heterogeneous web services offered by different sources, automatic web service composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. Current solutions for functional web service composition lack autonomous querying of semantic matches between the parameters of web services, which is necessary for composing large sets of related services. In this paper, we propose a graph-based Semantic Web Service composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation, in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and a nonredundant web service composition for a user's query using a graph-based search algorithm. The proposed approach was applied to healthcare data integration across different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
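    The dependency-graph search described above can be sketched as a breadth-first search over sets of available data items. This is a minimal illustration of the idea, not the paper's system: the service names and their input/output parameters are invented for the example.

```python
from collections import deque

# Hypothetical services as name -> (required inputs, produced outputs).
SERVICES = {
    "PatientLookup": ({"patient_id"}, {"record_id"}),
    "RecordFetch":   ({"record_id"}, {"diagnosis", "medication"}),
    "DrugInfo":      ({"medication"}, {"interactions"}),
}

def compose(available, goal):
    """Return the shortest non-redundant service sequence producing `goal`
    from `available`, or None if no composition exists."""
    queue = deque([(frozenset(available), [])])
    seen = {frozenset(available)}
    while queue:
        have, path = queue.popleft()
        if goal <= have:          # all goal items are now available
            return path
        for name, (needs, gives) in SERVICES.items():
            if needs <= have and name not in path:
                nxt = frozenset(have | gives)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [name]))
    return None

print(compose({"patient_id"}, {"interactions"}))
```

    Because the search is breadth-first and never revisits a data state, the first composition found is the shortest, which corresponds to the non-redundancy goal described in the abstract.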

  19. Graph-Based Semantic Web Service Composition for Healthcare Data Integration

    PubMed Central

    2017-01-01

    Among the numerous and heterogeneous web services offered by different sources, automatic web service composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. Current solutions for functional web service composition lack autonomous querying of semantic matches between the parameters of web services, which is necessary for composing large sets of related services. In this paper, we propose a graph-based Semantic Web Service composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation, in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and a nonredundant web service composition for a user's query using a graph-based search algorithm. The proposed approach was applied to healthcare data integration across different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement. PMID:29065602

  20. BioSWR – Semantic Web Services Registry for Bioinformatics

    PubMed Central

    Repchevsky, Dmitry; Gelpi, Josep Ll.

    2014-01-01

    Despite the variety of available Web services registries specifically aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. WSDL 2.0 descriptions have since gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118

  1. BioSWR--semantic web services registry for bioinformatics.

    PubMed

    Repchevsky, Dmitry; Gelpi, Josep Ll

    2014-01-01

    Despite the variety of available Web services registries specifically aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. WSDL 2.0 descriptions have since gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.

  2. A systematic review of nursing research priorities on health system and services in the Americas.

    PubMed

    Garcia, Alessandra Bassalobre; Cassiani, Silvia Helena De Bortoli; Reveiz, Ludovic

    2015-03-01

    To systematically review literature on priorities in nursing research on health systems and services in the Region of the Americas as a step toward developing a nursing research agenda that will advance the Regional Strategy for Universal Access to Health and Universal Health Coverage. This was a systematic review of the literature available from the following databases: Web of Science, PubMed, LILACS, and Google. Documents considered were published in 2008-2014; in English, Spanish, or Portuguese; and addressed the topic in the Region of the Americas. The documents selected had their priority-setting process evaluated according to the "nine common themes for good practice in health research priorities." A content analysis collected all study questions and topics, and sorted them by category and subcategory. Of 185 full-text articles/documents that were assessed for eligibility, 23 were selected: 12 were from peer-reviewed journals; 6 from nursing publications; 4 from Ministries of Health; and 1 from an international organization. Journal publications had stronger methodological rigor; the majority did not present a clear implementation or evaluation plan. After compiling the documents' 444 study questions and topics, the content analysis resulted in a document with 5 categories and 16 subcategories regarding nursing research priorities on health systems and services. Research priority-setting is a highly important process for health services improvement and resource optimization, but implementation and evaluation plans are rarely included. The resulting document will serve as the basis for the development of a new nursing research agenda focused on health systems and services, shaped to advance universal health coverage and universal access to health.

  3. On the comparability of knowledge transfer activities - a case study at the German Baltic Sea Coast focusing regional climate services

    NASA Astrophysics Data System (ADS)

    Meinke, Insa

    2017-06-01

    In this article the comparability of knowledge transfer activities is discussed by accounting for external impacts. It is shown that factors which are neither part of the knowledge transfer activity nor part of the participating institution may have significant impact on the potential usefulness of knowledge transfer activities. Differences in potential usefulness lead to different initial conditions of the knowledge transfer activities. This needs to be taken into account when comparing different knowledge transfer activities, e.g., in program evaluations. This study focuses on regional climate services at the German Baltic Sea coast. It is based on two surveys and experience with two identical web tools applied to two regions with different spatial coverage. The results show that comparability among science-based knowledge transfer activities is strongly limited by several external impacts. The potential usefulness, and thus the initial condition of a particular knowledge transfer activity, strongly depends on (1) the perceived priority of the topic in focus, (2) the information channels used, (3) the conformity between the research agenda of service-providing institutions and information demands in the public, and (4) the spatial coverage of a service. It is suggested that evaluations of knowledge transfer activities account for the described external impacts. The results show that the comparability of knowledge transfer activities is limited and challenge the adequacy of quantitative measures in this context. Moreover, as shown in this case study, regional climate services in particular should be evaluated individually, from a long-term perspective, by potential user groups and/or by their real users. It is further suggested that evaluation criteria should be co-developed with these stakeholder groups.

  4. Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition

    PubMed Central

    Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composite Web service (TCWS) is formalized by CPN properties. To identify the best services by QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve the dominant Web services. This overcomes the significant information loss caused by reducing individual scores to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431

  5. Reliable execution based on CPN and skyline optimization for Web service composition.

    PubMed

    Chen, Liping; Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web service composition is widely used in business environments. Given the inherent autonomy and heterogeneity of component web services, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composite Web service (TCWS) is formalized by CPN properties. To identify the best services by QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve the dominant Web services. This overcomes the significant information loss caused by reducing individual scores to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets.
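    The skyline step mentioned above is standard Pareto-dominance filtering over QoS vectors. The sketch below illustrates it under simple assumptions: the candidate services and their QoS values are invented, and both attributes (response time, cost) are treated as lower-is-better.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every attribute and strictly
    better in at least one (lower is better here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(candidates):
    """Keep only services not dominated by any other candidate."""
    return {
        name: qos for name, qos in candidates.items()
        if not any(dominates(other, qos)
                   for o, other in candidates.items() if o != name)
    }

# Illustrative QoS vectors: (response time in ms, cost).
candidates = {
    "S1": (120, 0.9),
    "S2": (200, 0.5),
    "S3": (250, 0.8),   # dominated by S2: slower and more expensive
}
print(sorted(skyline(candidates)))
```

    Keeping the whole skyline, instead of collapsing each vector to one weighted score, preserves trade-off information (here S1 is fastest while S2 is cheapest), which is the information-loss point the abstract makes.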

  6. The Use of RESTful Web Services in Medical Informatics and Clinical Research and Its Implementation in Europe.

    PubMed

    Aerts, Jozef

    2017-01-01

    RESTful web services are nowadays state-of-the-art for business transactions over the internet. However, they are not widely used in medical informatics or in clinical research, especially not in Europe. The aims were to make an inventory of RESTful web services that can be used in medical informatics and clinical research, including those that can help in patient empowerment in the DACH region and in Europe, and to develop some new RESTful web services for use in clinical research and regulatory review. A literature search on available RESTful web services was performed, and new RESTful web services were developed on an application server using the Java language. Most of the web services found originate from institutes and organizations in the USA, whereas no similar web services made available by European organizations could be found. New RESTful web services were developed for LOINC code lookup, for UCUM conversions, and for use with CDISC standards. A comparison is made between "top down" and "bottom up" web services, the latter meant to answer concrete questions immediately. The lack of RESTful web services made available by European organizations in healthcare and medical informatics is striking. RESTful web services may soon play a major role in medical informatics and, when localized for German and other European languages, can considerably facilitate patient empowerment. This, however, requires an EU equivalent of the US National Library of Medicine.

  7. Virtual Sensors in a Web 2.0 Digital Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D. J.; Marini, L.; Kooper, R.; Rodriguez, A.; Myers, J. D.

    2008-12-01

    The lack of rainfall data in many watersheds is one of the major barriers to modeling and studying many environmental and hydrological processes and to supporting decision making: there are simply not enough rain gages on the ground. To overcome this data scarcity, a Web 2.0 digital watershed was developed at NCSA (National Center for Supercomputing Applications), where users can point and click on a web-based Google Maps interface and create new precipitation virtual sensors at any location within the coverage region of a NEXRAD station. A set of scientific workflows performs spatial, temporal and thematic transformations on the near-real-time NEXRAD Level II data. These workflows can be triggered by users' actions and generate either rainfall-rate or rainfall-accumulation streaming data at a user-specified time interval. We will discuss the underlying components of this digital watershed, which consists of a semantic content management middleware, a semantically enhanced streaming data toolkit, virtual sensor management functionality, and a RESTful (REpresentational State Transfer) web service that can trigger workflow execution. This loosely coupled architecture presents a generic framework for constructing a Web 2.0 style digital watershed. An implementation of this architecture for the Upper Illinois River Basin will be presented. We will also discuss the implications of the virtual sensor concept for the broader environmental observatory community and how this concept can help us move toward a participatory digital watershed.

  8. Did Equity of Reproductive and Maternal Health Service Coverage Increase during the MDG Era? An Analysis of Trends and Determinants across 74 Low- and Middle-Income Countries

    PubMed Central

    Sharma, Suneeta

    2015-01-01

    Introduction Despite widespread gains toward the 5th Millennium Development Goal (MDG), pro-rich inequalities in reproductive health (RH) and maternal health (MH) are pervasive throughout the world. As countries enter the post-MDG era and strive toward universal health coverage (UHC), it will be important to monitor the extent to which countries are achieving equity of RH and MH service coverage. This study examines how equity of service coverage differs across countries, and explores what policy factors are associated with a country’s progress, or lack thereof, toward more equitable RH and MH service coverage. Methods We used RH and MH service coverage data from Demographic and Health Surveys (DHS) for 74 countries to examine trends in equity between countries and over time from 1990 to 2014. We examined trends in both relative and absolute equity, and measured relative equity using a concentration index of coverage data grouped by wealth quintile. Through multivariate analysis we examined the relative importance of policy factors, such as political commitment to health, governance, and the level of prepayment, in determining countries’ progress toward greater equity in RH and MH service coverage. Results Relative equity in the coverage of RH and MH services has increased continually across all countries over the past quarter century; however, inequities in coverage persist, in some countries more than others. Multivariate analysis shows that higher education and greater political commitment (measured as the share of government spending allocated to health) were significantly associated with higher equity of service coverage. Neither country income (GDP per capita) nor better governance was significantly associated with equity. Conclusion Equity in RH and MH service coverage has improved but varies considerably across countries and over time. Even among the subset of countries that are close to achieving the MDGs, progress on equity varies considerably. 
Enduring disparities in access and outcomes underpin mounting support for targeted reforms within the broader context of UHC. PMID:26331846

  9. Estimating the coverage of mental health programmes: a systematic review.

    PubMed

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-04-01

    The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and changes in coverage over time, are needed. We conducted a systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. We identified 15 735 unique records, of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys.
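    The low-cost estimation strategy the authors suggest, routine utilization data over survey-derived population in need, is a simple ratio. A sketch with invented figures:

```python
# Contact coverage = people in contact with services / people in need.
# Numerator from routine clinical records; denominator from a national
# prevalence survey. All figures below are invented for illustration.

adult_population = 1_000_000
prevalence = 0.04            # survey estimate: 4% have the disorder
treated_per_year = 12_000    # routine service-utilization records

in_need = adult_population * prevalence          # 40,000 people in need
contact_coverage = treated_per_year / in_need
print(f"{contact_coverage:.1%}")  # → 30.0%
```

Effective coverage would further multiply the numerator by the fraction receiving appropriate and effective care, which is exactly the quantity the review found almost never measured.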

  10. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing have emerged and reached maturity: the Web and the Grid. Both have proven suitable for building Distributed Computing Infrastructures (DCIs) that support the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS) most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies may be a valuable approach for enabling advanced ESS applications. 
Currently both geo-information and Grid technologies have reached a high level of maturity, making it possible to build such an integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely deployed in Europe and beyond, has proven highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization some obstacles must be overcome. Firstly, a semantic mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and without embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs. 
Just to mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European and international projects investigated different aspects of this integration, developing demonstrators and proofs-of-concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Projects Consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives, G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, OGF). 
In its first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of the other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Proofs-of-concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans and ways to involve the ESSI community. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010

  11. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. 
In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.
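    The OGC WCS services these workflows consume are driven by plain key-value-pair URLs; a stdlib-only sketch of assembling a WCS 1.0.0 GetCoverage request (the endpoint and coverage name below are placeholders, not actual GENESIS services):

```python
# Build a WCS 1.0.0 GetCoverage request as a key-value-pair (KVP) URL.
# The base endpoint and coverage identifier are placeholders.
from urllib.parse import urlencode

def getcoverage_url(base, coverage, bbox, out_format="GeoTIFF",
                    crs="EPSG:4326", width=512, height=512):
    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": out_format,
    }
    return f"{base}?{urlencode(params)}"

url = getcoverage_url("http://example.org/wcs", "AIRS_temperature",
                      bbox=(-125.0, 25.0, -65.0, 50.0))
print(url)
```

Because each request is just a URL, such calls compose naturally into the dataflow documents described above: each operator emits the URL for the subset it needs and passes the fetched coverage downstream.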

  12. Assessment of population coverage of hypertension screening in Thailand based on the effective coverage framework.

    PubMed

    Charoendee, Kulpimol; Sriratanaban, Jiruth; Aekplakorn, Wichai; Hanvoravongchai, Piya

    2018-03-27

    Hypertension (HT) is a major risk factor for cardiovascular disease, and accessible and effective HT screening services are necessary. The effective coverage framework is an assessment tool that can be used to assess health service performance by considering the target population who need and receive quality services. The aim of this study is to measure the effective coverage of hypertension screening services at the provincial level in Thailand. Over 40 million individual health service records from 2013 were acquired. Data on blood pressure measurement, risk assessment, HT diagnosis and follow-up were analyzed. The effectiveness of the services was assessed against a set of quality criteria for pre-HT, suspected HT, and confirmed HT cases. Effective coverage of HT services for the entire non-HT Thai population aged 15 or over was estimated for each province and for Thailand as a whole. Population coverage of HT screening was 54.6%, varying significantly across provinces. Among those screened, 28.9% were considered pre-HT, and another 6.0% were suspected HT cases. The average provincial effective coverage was 49.9%. Around four-fifths (82.6%) of the pre-HT group received HT and cardiovascular disease (CVD) risk assessment. Among the suspected HT cases, less than half (38.0%) got a follow-up blood pressure measurement within 60 days of the screening date. Around 9.2% of the suspected cases were diagnosed as having HT, and only one-third of them (36.5%) received treatment within 6 months. Within this group, 21.8% obtained CVD risk assessment; half of those had their blood pressure under control (50.8%), and less than 1% (0.7%) managed to get their CVD risk reduced. Our findings suggest that hypertension screening coverage, post-screening service quality, and effective coverage of HT screening in Thailand were still low and varied greatly across provinces. It is imperative that service coverage and its effectiveness are assessed, and both need improvement. 
Despite some limitations, measurement of effective coverage could be done with existing data, and it can serve as a useful tool for performance measurement of public health services.

  13. Web service module for access to g-Lite

    NASA Astrophysics Data System (ADS)

    Goranova, R.; Goranov, G.

    2012-10-01

    G-Lite is a lightweight grid middleware for grid computing installed on all clusters of the European Grid Infrastructure (EGI). The middleware is partially service-oriented and does not provide well-defined Web services for job management. The existing Web services in the environment cannot be directly used by grid users for building service compositions in the EGI. In this article we present a module of well-defined Web services for job management in the EGI. We describe the architecture of the module and the design of the developed Web services. The presented Web services are composable and can participate in service compositions (workflows). An example of usage of the module with tools for service compositions in g-Lite is shown.

  14. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered with BOWS can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
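    The front-end/back-end division of labour described above can be sketched as a minimal in-memory job broker; the class, method names and fields are illustrative assumptions, not the BOWS API:

```python
# Minimal sketch of the BOWS-style pattern: a front-end service accepts
# jobs from user clients, while a back-end worker on the HPC side polls
# for them and posts results. This in-memory Broker stands in for the
# real web-service layer.
import itertools

class Broker:
    def __init__(self):
        self._ids = itertools.count(1)
        self.pending = []        # jobs waiting for a back-end worker
        self.results = {}        # job_id -> result

    # --- front-end service: called by the user's client ---
    def submit(self, tool, params):
        job = {"id": next(self._ids), "tool": tool, "params": params}
        self.pending.append(job)
        return job["id"]

    def read_result(self, job_id):
        return self.results.get(job_id)   # None until the worker is done

    # --- back-end service: called by the HPC-side worker ---
    def check_for_jobs(self):
        return self.pending.pop(0) if self.pending else None

    def post_result(self, job_id, result):
        self.results[job_id] = result

broker = Broker()
jid = broker.submit("blast", {"query": "ATGC"})
job = broker.check_for_jobs()              # worker picks the job up...
broker.post_result(job["id"], "hit-list")  # ...and posts its output
print(broker.read_result(jid))  # → hit-list
```

The polling design matters here: the HPC cluster initiates all back-end calls outward, so no inbound connections to the cluster are required.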

  15. Web Services--A Buzz Word with Potentials

    Treesearch

    János T. Füstös

    2006-01-01

    The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications. The web API relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...

  16. BioServices: a common Python package to access biological Web Services programmatically.

    PubMed

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-12-15

    Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services, based on either Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies, is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
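    The object-oriented wrapping described above, where each remote service becomes a class whose methods assemble the corresponding HTTP requests, follows a pattern like this stdlib-only sketch. The base URL, class and method names are placeholders for illustration, not BioServices code:

```python
# Sketch of the object-oriented wrapper pattern for REST services:
# a shared base class handles URL construction, and each service
# subclass adds domain-specific methods. Host and fields are placeholders.
from urllib.parse import urlencode

class RESTService:
    def __init__(self, base_url):
        self.base_url = base_url

    def build_url(self, path, **params):
        # A real wrapper would fetch this URL; here we only build it.
        query = f"?{urlencode(params)}" if params else ""
        return f"{self.base_url}/{path}{query}"

class UniProtClient(RESTService):
    def __init__(self):
        super().__init__("https://example.org/uniprot")  # placeholder host

    def search_url(self, query, fmt="tab"):
        return self.build_url("search", query=query, format=fmt)

u = UniProtClient()
print(u.search_url("zap70"))
```

Adding a wrapper for a new REST service then reduces to subclassing the base class and naming its endpoints, which is the ease-of-extension point the abstract highlights.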

  17. Research on the development and preliminary application of Beijing agricultural sci-tech service hotline WebApp in agricultural consulting services

    NASA Astrophysics Data System (ADS)

    Yu, Weishui; Luo, Changshou; Zheng, Yaming; Wei, Qingfeng; Cao, Chengzhong

    2017-09-01

    To deal with the “last kilometer” problem in agricultural science and technology information services, we analyzed the feasibility, necessity and advantages of WebApps applied to agricultural information services and discussed the modes of WebApp use in such services based on a requirements analysis and the functions of WebApps. To overcome existing apps’ defects of difficult installation and weak compatibility across mobile operating systems, the Beijing Agricultural Sci-tech Service Hotline WebApp was developed based on HTML and Java technology. The WebApp has greater compatibility and simpler operation than a native app; moreover, it can be linked to the WeChat public platform, making it easy to distribute and able to run directly without an installation process. The WebApp was used to provide agricultural expert consulting services and agricultural information push, and achieved good preliminary application results. Finally, we summarize the creative application of WebApps in agricultural consulting services and discuss prospects for WebApps in agricultural information services.

  18. Brokering technologies to realize the hydrology scenario in NSF BCube

    NASA Astrophysics Data System (ADS)

    Boldrini, Enrico; Easton, Zachary; Fuka, Daniel; Pearlman, Jay; Nativi, Stefano

    2015-04-01

    In the National Science Foundation (NSF) BCube project an international team composed of cyberinfrastructure experts, geoscientists, social scientists and educators is working together to explore the use of brokering technologies, initially focusing on four domains: hydrology, oceans, polar, and weather. In the hydrology domain, environmental models are fundamental to understanding the behaviour of hydrological systems. A specific model usually requires datasets from different disciplines for its initialization (e.g. elevation models from Earth observation, weather data from atmospheric sciences, etc.). Scientific datasets are usually available on heterogeneous publishing services, such as inventory and access services (e.g. OGC Web Coverage Service, THREDDS Data Server, etc.). Indeed, datasets are published according to different protocols; moreover, they usually come in different formats, resolutions and Coordinate Reference Systems (CRSs): in short, different grid environments depending on the original data and the publishing service's processing capabilities. Scientists can thus be impeded by the burden of discovering, accessing and normalizing the desired datasets to the grid environment required by the model. These technological tasks divert scientists from their main, scientific goals. The GI-axe brokering framework was tested in a hydrology scenario where scientists needed to compare a particular hydrological model with two different input datasets (digital elevation models): - the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) dataset, v.2. - the Shuttle Radar Topography Mission (SRTM) dataset, v.3. These datasets were published by means of Hyrax Server technology, which can provide NetCDF files at their original resolution and CRS. 
Scientists had their model running in ArcGIS, so the main goal was to import the datasets, using the available ArcPy library, into EPSG:4326 on a common resolution grid, so that model outputs could be compared. ArcPy, however, is only able to access GeoTIFF datasets published by an OGC Web Coverage Service (WCS). The GI-axe broker was therefore deployed between the client application and the data providers. It was configured to broker the two Hyrax service endpoints and republish their data content through a WCS interface for use by the ArcPy library. Finally, scientists were able to easily run the model and to concentrate on comparing the different results obtained with each input dataset. The use of a third-party broker to perform such technological tasks has also been shown to have the potential advantage of increasing the repeatability of a study among different researchers.

  19. The Impact of the Affordable Care Act's Dependent Coverage Mandate on Use of Dental Treatments and Preventive Services.

    PubMed

    Shane, Dan M; Wehby, George L

    2017-09-01

    Oral health problems are the leading chronic conditions among children and younger adults. Lack of dental coverage is thought to be an important barrier to care, but little empirical evidence exists on the causal effect of private dental coverage on use of dental services. We explore the relationship between dental coverage and dental services utilization by analyzing a natural experiment: the increase in private dental coverage stemming from the Affordable Care Act's (ACA) dependent coverage mandate. Our aim is to evaluate whether increased private dental insurance, a spillover effect of the ACA dependent coverage health insurance mandate, affected utilization of dental services among a group of affected young adults. We use data from the 2006-2013 Medical Expenditure Panel Surveys. We used a difference-in-differences regression approach comparing changes in dental care utilization for 25-year-olds affected by the policy to unaffected 27-year-olds, evaluating effects on dental treatments and preventive services. Compared to 27-year-olds, 25-year-olds were 8 percentage points more likely to have private dental coverage in the 3 years following the mandate. We do not find compelling evidence that young adults increased their use of preventive dental services in response to gaining insurance. We do find a nearly 5 percentage point increase in the likelihood of dental treatments among 25-year-olds following the mandate, an effect that appears concentrated among women. Increases in private dental coverage due to the ACA's dependent coverage mandate do not appear to be driving significant changes in overall preventive dental services utilization, but there is evidence of an increase in restorative care.

  20. Conducting Retrospective Ontological Clinical Trials in ICD-9-CM in the Age of ICD-10-CM.

    PubMed

    Venepalli, Neeta K; Shergill, Ardaman; Dorestani, Parvaneh; Boyd, Andrew D

    2014-01-01

    To quantify the impact of the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) transition in cancer clinical trials by comparing coding accuracy and data discontinuity in backward ICD-10-CM to ICD-9-CM mapping via two tools, and to develop a standard ICD-9-CM and ICD-10-CM bridging methodology for retrospective analyses. While the transition to ICD-10-CM has been delayed until October 2015, its impact on cancer-related studies utilizing ICD-9-CM diagnoses has been inadequately explored. Three high-impact journals with broad national and international readerships were reviewed for cancer-related studies utilizing ICD-9-CM diagnosis codes in study design, methods, or results. Forward ICD-9-CM to ICD-10-CM mapping was performed using a translational methodology with the Motif web portal ICD-9-CM conversion tool. Backward mapping from ICD-10-CM to ICD-9-CM was performed using both the Centers for Medicare and Medicaid Services (CMS) general equivalence mappings (GEMs) files and the Motif web portal tool. Generated ICD-9-CM codes were compared with the original ICD-9-CM codes to assess data accuracy and discontinuity. While both methods yielded additional ICD-9-CM codes, the CMS GEMs method provided incomplete coverage, with 16 of the original ICD-9-CM codes missing, whereas the Motif web portal method provided complete coverage. Of these 16 codes, 12 ICD-9-CM codes were present in 2010 Illinois Medicaid data, and accounted for 0.52% of patient encounters and 0.35% of total Medicaid reimbursements. Extraneous ICD-9-CM codes from both methods (CMS GEMs, n = 161; Motif web portal, n = 246) in excess of the original ICD-9-CM codes accounted for 2.1% and 2.3% of total patient encounters and 3.4% and 4.1% of total Medicaid reimbursements in the 2010 Illinois Medicaid data. 
Longitudinal data analyses post-ICD-10-CM transition will require backward ICD-10-CM to ICD-9-CM coding, and data comparison for accuracy. Researchers must be aware that all methods for backward coding are not comparable in yielding original ICD-9-CM codes. The mandated delay is an opportunity for organizations to better understand areas of financial risk with regards to data management via backward coding. Our methodology is relevant for all healthcare-related coding data, and can be replicated by organizations as a strategy to mitigate financial risk.

  1. 42 CFR 486.110 - Condition for coverage: Inspection of equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition for coverage: Inspection of equipment. 486.110 Section 486.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... SERVICES FURNISHED BY SUPPLIERS Conditions for Coverage: Portable X-Ray Services § 486.110 Condition for...

  2. 76 FR 61245 - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...--Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International... of September 28, 2011 Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Memorandum for the Secretary of Transportation By the authority...

  3. Extending Medicare coverage to medically necessary dental care.

    PubMed

    Patton, L L; White, B A; Field, M J

    2001-09-01

    Periodically, Congress considers expanding Medicare coverage to include some currently excluded health care services. In 1999 and 2000, an Institute of Medicine committee studied the issues related to coverage for certain services, including "medically necessary dental services." The committee conducted a literature search for dental care studies in five areas: head and neck cancer, leukemia, lymphoma, organ transplantation, and heart valve repair or replacement. The committee examined evidence to support Medicare coverage for dental services related to these conditions and estimated the cost to Medicare of such coverage. Evidence supported Medicare coverage for preventive dental care before jaw radiation therapy for head or neck cancer and coverage for treatment to prevent or eliminate acute oral infections for patients with leukemia before chemotherapy. Insufficient evidence supported dental coverage for patients with lymphoma or organ transplants and for patients who had undergone heart valve repair or replacement. The committee suggested that Congress update statutory language to permit Medicare coverage of effective dental services needed in conjunction with surgery, chemotherapy, radiation therapy or pharmacological treatment for life-threatening medical conditions. Dental care is important for members of all age groups. More direct, research-based evidence on the efficacy of medically necessary dental care is needed both to guide treatment and to support Medicare payment policy.

  4. Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service

    PubMed Central

    Hatano, Kenji; Ohe, Kazuhiko

    2003-01-01

    An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Services are a new distributed-processing approach built on standard Internet technologies. Through the seamless remote method invocation that XML Web Services provide, users can obtain the latest disease-code master information from their rich desktop applications or Internet web sites that refer to this service. PMID:14728364
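
    As a rough illustration of the kind of XML message such a service exchanges, the sketch below builds a SOAP 1.1 request envelope with the Python standard library. The operation name, namespace, and parameter are invented for the example and are not the actual service interface:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/disease-master"   # hypothetical service namespace

def build_request(disease_name):
    """Build a SOAP 1.1 envelope for a hypothetical getDiseaseCode operation."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    operation = ET.SubElement(body, f"{{{SVC_NS}}}getDiseaseCode")
    ET.SubElement(operation, f"{{{SVC_NS}}}name").text = disease_name
    return ET.tostring(envelope, encoding="unicode")

request = build_request("diabetes mellitus")
print(request)
```

    In practice a WSDL-aware client library generates such envelopes automatically from the service description.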

  5. Camp Insurance 101: Understanding the Fundamentals of a Camp Insurance Program.

    ERIC Educational Resources Information Center

    Garner, Ian

    2001-01-01

    This short course on insurance for camps discusses coverage, including the various types of liability, property, and other types of coverage; the difference between direct writers, brokers, agents, and captive agents; choosing an insurance company; and checking on the financial stability of recommended carriers. Three Web sites are given for…

  6. 22 CFR 126.5 - Canadian exemptions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... such persons publicly available through the Internet Web site of the Directorate of Defense Trade... coverage area on the surface of the earth less than 200 nautical miles in diameter, where “coverage area” is defined as that area on the surface of the earth that is illuminated by the main beam width of the...

  7. Lexical Coverage of TED Talks: Implications for Vocabulary Instruction

    ERIC Educational Resources Information Center

    Nurmukhamedov, Ulugbek

    2017-01-01

    Teachers of English are often in search of authentic audio and video materials that promote learners' listening comprehension and vocabulary development. TED Talks, a set of freely available web presentations, could be a useful resource to promote vocabulary instruction. The present replication study examines the lexical coverage of TED Talks by…

  8. Examining public knowledge and preferences for adult preventive services coverage.

    PubMed

    Williams, Jessica A R; Ortiz, Selena E

    2017-01-01

    To examine (1) what individuals know about the existing adult preventive service coverage provisions of the Affordable Care Act (ACA), and (2) which preventive services individuals think should be covered without cost sharing. An online panel from Survey Monkey was used to obtain a sample of 2,990 adults age 18 and older in March 2015, analyzed 2015-2017. A 17-item survey instrument was designed and used to evaluate respondents' knowledge of the adult preventive services provision of the ACA. Additionally, we asked whether various preventive services should be covered. The data include age, sex, race/ethnicity, and educational attainment as well as measures of political ideology, previous insurance status, the number of chronic conditions, and usual source of care. Respondents correctly answered 38.6% of the questions about existing coverage under the ACA, while on average respondents thought 12.1 of 15 preventive services should be covered (SD 3.5). Respondents were more knowledgeable about coverage for routine screenings, such as blood pressure (63.4% correct) than potentially stigmatizing screenings, such as for alcohol misuse (28.8% correct). Blood pressure screening received the highest support of coverage (89.8%) while coverage of gym memberships received the lowest support (59.4%). Individuals with conservative ideologies thought fewer services on average should be covered, but the difference was small-around one service less than those with liberal ideologies. Overwhelmingly, individuals think that most preventive services should be covered without cost sharing. Despite several years of coverage for preventive services, there is still confusion and lack of knowledge about which services are covered.

  9. Examining public knowledge and preferences for adult preventive services coverage

    PubMed Central

    Ortiz, Selena E.

    2017-01-01

    Introduction To examine (1) what individuals know about the existing adult preventive service coverage provisions of the Affordable Care Act (ACA), and (2) which preventive services individuals think should be covered without cost sharing. Methods An online panel from Survey Monkey was used to obtain a sample of 2,990 adults age 18 and older in March 2015, analyzed 2015–2017. A 17-item survey instrument was designed and used to evaluate respondents’ knowledge of the adult preventive services provision of the ACA. Additionally, we asked whether various preventive services should be covered. The data include age, sex, race/ethnicity, and educational attainment as well as measures of political ideology, previous insurance status, the number of chronic conditions, and usual source of care. Results Respondents correctly answered 38.6% of the questions about existing coverage under the ACA, while on average respondents thought 12.1 of 15 preventive services should be covered (SD 3.5). Respondents were more knowledgeable about coverage for routine screenings, such as blood pressure (63.4% correct) than potentially stigmatizing screenings, such as for alcohol misuse (28.8% correct). Blood pressure screening received the highest support of coverage (89.8%) while coverage of gym memberships received the lowest support (59.4%). Individuals with conservative ideologies thought fewer services on average should be covered, but the difference was small—around one service less than those with liberal ideologies. Conclusions Overwhelmingly, individuals think that most preventive services should be covered without cost sharing. Despite several years of coverage for preventive services, there is still confusion and lack of knowledge about which services are covered. PMID:29261757

  10. Estimating the coverage of mental health programmes: a systematic review

    PubMed Central

    De Silva, Mary J; Lee, Lucy; Fuhr, Daniela C; Rathod, Sujit; Chisholm, Dan; Schellenberg, Joanna; Patel, Vikram

    2014-01-01

    Background The large treatment gap for people suffering from mental disorders has led to initiatives to scale up mental health services. In order to track progress, estimates of programme coverage, and changes in coverage over time, are needed. Methods Systematic review of mental health programme evaluations that assess coverage, measured either as the proportion of the target population in contact with services (contact coverage) or as the proportion of the target population who receive appropriate and effective care (effective coverage). We performed a search of electronic databases and grey literature up to March 2013 and contacted experts in the field. Methods to estimate the numerator (service utilization) and the denominator (target population) were reviewed to explore methods which could be used in programme evaluations. Results We identified 15 735 unique records of which only seven met the inclusion criteria. All studies reported contact coverage. No study explicitly measured effective coverage, but it was possible to estimate this for one study. In six studies the numerator of coverage, service utilization, was estimated using routine clinical information, whereas one study used a national community survey. The methods for estimating the denominator, the population in need of services, were more varied and included national prevalence surveys, case registers, and estimates from the literature. Conclusions Very few coverage estimates are available. Coverage could be estimated at low cost by combining routine programme data with population prevalence estimates from national surveys. PMID:24760874
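
    Contact coverage as defined above is service utilization divided by the estimated target population; a minimal Python sketch with invented figures:

```python
def contact_coverage(n_in_contact, population, prevalence):
    """Proportion of the target population in contact with services.

    Numerator: service utilization (e.g. from routine clinical records).
    Denominator: target population, estimated as prevalence * population.
    """
    return n_in_contact / (population * prevalence)

# Hypothetical figures: 12,000 service users, a population of 1,000,000,
# and a 4% prevalence estimate from a national survey.
coverage = contact_coverage(12_000, 1_000_000, 0.04)
print(f"{coverage:.1%}")  # -> 30.0%
```

    This is the low-cost approach the review suggests: routine programme data in the numerator, national-survey prevalence in the denominator.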

  11. 42 CFR 486.108 - Condition for coverage: Safety standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition for coverage: Safety standards. 486.108 Section 486.108 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... BY SUPPLIERS Conditions for Coverage: Portable X-Ray Services § 486.108 Condition for coverage...

  12. 5 CFR 317.301 - Conversion coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Conversion coverage. 317.301 Section 317... THE SENIOR EXECUTIVE SERVICE Conversion to the Senior Executive Service § 317.301 Conversion coverage... statutory action extending coverage under 5 U.S.C. 3132(a)(1) to that agency. Except as otherwise provided...

  13. 5 CFR 300.702 - Coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Bar to Appointment of Persons Who Fail To Register Under Selective Service Law § 300.702 Coverage. Appointments in the competitive service, the excepted service, the Senior Executive Service, or any other civil...

  14. 5 CFR 300.702 - Coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Bar to Appointment of Persons Who Fail To Register Under Selective Service Law § 300.702 Coverage. Appointments in the competitive service, the excepted service, the Senior Executive Service, or any other civil...

  15. Persistence and availability of Web services in computational biology.

    PubMed

    Schultheiss, Sebastian J; Münch, Marc-Christian; Andreeva, Gergana D; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses; only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository.

  16. Persistence and Availability of Web Services in Computational Biology

    PubMed Central

    Schultheiss, Sebastian J.; Münch, Marc-Christian; Andreeva, Gergana D.; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses; only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository. PMID:21966383

  17. The impact of the Brazilian Family Health Strategy on selected primary care sensitive conditions: A systematic review

    PubMed Central

    Menzies, Dick; Hone, Thomas; Dehghani, Kianoush; Trajman, Anete

    2017-01-01

    Background Brazil has the largest public health system in the world, with 120 million people covered by its free primary care services. The Family Health Strategy (FHS) is the main primary care model, but there is no consensus on its impact on health outcomes. We systematically reviewed published evidence regarding the impact of the Brazilian FHS on selected primary care sensitive conditions (PCSC). Methods We searched Medline, Web of Science and Lilacs in May 2016 using key words in Portuguese and English, without language restriction. We included studies if the intervention was the FHS; the comparison was either different levels of FHS coverage or other primary health care service models; the outcomes were the selected PCSC; and results were adjusted for relevant sanitary and socioeconomic variables, including the national conditional cash transfer program (Bolsa Familia). Due to differences in methods and outcomes reported, pooling of results was not possible. Results Of 1831 records found, 31 met our inclusion criteria. Of these, 25 were ecological studies. Twenty-one employed longitudinal quasi-experimental methods, 27 compared different levels of FHS coverage, whilst four compared the FHS versus other models of primary care. Fourteen studies found an association between higher FHS coverage and lower post-neonatal and child mortality. When the effect of Bolsa Familia was accounted for, the effect of the FHS on child mortality was greater. In 13 studies about hospitalizations due to PCSC, no clear pattern of association was found. In four studies, there was no effect on child or elderly vaccination or on low birth weight. No included studies addressed breast-feeding, dengue, HIV/AIDS and other neglected infectious diseases. Conclusions Among these ecological studies with limited quality evidence, increasing coverage by the FHS was consistently associated with improvements in child mortality. Scarce evidence on other health outcomes, hospitalization and synergies with cash transfers was found. PMID:28786997

  18. Space Physics Data Facility Web Services

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provide a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.

  19. The EMBRACE web service collection

    PubMed Central

    Pettifer, Steve; Ison, Jon; Kalaš, Matúš; Thorne, Dave; McDermott, Philip; Jonassen, Inge; Liaquat, Ali; Fernández, José M.; Rodriguez, Jose M.; Partners, INB-; Pisano, David G.; Blanchet, Christophe; Uludag, Mahmut; Rice, Peter; Bartaseviciute, Edita; Rapacki, Kristoffer; Hekkelman, Maarten; Sand, Olivier; Stockinger, Heinz; Clegg, Andrew B.; Bongcam-Rudloff, Erik; Salzemann, Jean; Breton, Vincent; Attwood, Teresa K.; Cameron, Graham; Vriend, Gert

    2010-01-01

    The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection and its associated recommendations and standards definitions. PMID:20462862

  20. 29 CFR 2590.715-2713 - Coverage of preventive health services.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 9 2011-07-01 2011-07-01 false Coverage of preventive health services. 2590.715-2713... ADMINISTRATION, DEPARTMENT OF LABOR GROUP HEALTH PLANS RULES AND REGULATIONS FOR GROUP HEALTH PLANS Other Requirements § 2590.715-2713 Coverage of preventive health services. (a) Services—(1) In general. Beginning at...

  1. 29 CFR 2590.715-2713 - Coverage of preventive health services.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 9 2014-07-01 2014-07-01 false Coverage of preventive health services. 2590.715-2713... ADMINISTRATION, DEPARTMENT OF LABOR GROUP HEALTH PLANS RULES AND REGULATIONS FOR GROUP HEALTH PLANS Other Requirements § 2590.715-2713 Coverage of preventive health services. (a) Services—(1) In general. Beginning at...

  2. Enhancing UCSF Chimera through web services

    PubMed Central

    Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.

    2014-01-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624

  3. Recent El Niño brought downpour of media coverage

    NASA Astrophysics Data System (ADS)

    Hare, Steven R.

    Media coverage of the 1997-1998 tropical ocean warming event made the term "El Niño" a household word. So pervasive was coverage of El Niño that it became the fodder of late night talk show monologues and an oft-invoked gremlin responsible for many of society's ailments. As a fisheries biologist studying climate impacts on marine resources, I followed the event very closely and created an El Niño Web site (http://www.iphc.washington.edu/PAGES/IPHC/Staff/hare/html/1997ENSO/1997ENSO.html) in the spring of 1997 when the magnitude of the event was becoming obvious. As part of my daily routine in updating the Web page, I began tracking El Niño media coverage over the Internet. Between June 1997 and July 1998, I accumulated links to stories about El Niño. I attempted to maintain a constant level of effort so that the number of stories accurately reflected the level of coverage given to the event as it progressed. In fisheries lingo, this is known as a Catch Per Unit Effort (CPUE) index. Because Internet content is often removed after a period of time, a retrospective accumulation of daily stories would not yield as accurate a count as the contemporary CPUE index I maintained.
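
    The CPUE index described above is simply a count normalized by constant search effort; a toy Python sketch with invented tallies:

```python
def cpue(n_stories, n_search_sessions):
    """Catch Per Unit Effort: stories found per constant-effort search session."""
    return n_stories / n_search_sessions

# Hypothetical weekly tallies: (stories found, search sessions that week).
weekly = [(35, 7), (60, 6), (90, 7)]
index = [cpue(stories, sessions) for stories, sessions in weekly]
print([round(x, 1) for x in index])  # -> [5.0, 10.0, 12.9]
```

    Holding effort constant lets the per-session rate, rather than the raw count, track the rise and fall of coverage.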

  4. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is important for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacks real-time data retrieval and sharing/interoperation capability. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed for this purpose. The real-time GIS data model manages real-time data, and the Sensor Web service platform supports the realization of that model based on Sensor Web technologies. A Sensor Web service platform implementing the model is presented, in which real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total processing times in the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that integrating a real-time GIS data model with a Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
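
    Sensor Web data of the kind described is commonly retrieved through OGC Sensor Observation Service (SOS) key-value-pair requests; a small Python sketch (the endpoint and identifiers are placeholders, not the platform's actual services):

```python
from urllib.parse import urlencode

def get_observation_url(endpoint, offering, observed_property):
    """Build an OGC SOS 2.0 GetObservation request in key-value-pair form."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return f"{endpoint}?{urlencode(params)}"

# Endpoint and identifiers are placeholders for illustration only.
url = get_observation_url("http://example.org/sos",
                          "air_quality_stations", "PM2.5")
print(url)
```

    A client issues such a URL to pull the latest observations, which a real-time data model can then ingest.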

  5. Contraception and abortion coverage: What do primary care physicians think?

    PubMed

    Chuang, Cynthia H; Martenis, Melissa E; Parisi, Sara M; Delano, Rachel E; Sobota, Mindy; Nothnagle, Melissa; Schwarz, Eleanor Bimla

    2012-08-01

    Insurance coverage for family planning services has been a highly controversial element of the US health care reform debate. Whether primary care providers (PCPs) support public and private health insurance coverage for family planning services is unknown. PCPs in three states were surveyed regarding their opinions on health plan coverage and tax dollar use for contraception and abortion services. Almost all PCPs supported health plan coverage for contraception (96%) and use of tax dollars to cover contraception for low-income women (94%). A smaller majority supported health plan coverage for abortions (61%) and use of tax dollars to cover abortions for low-income women (63%). In adjusted models, support of health plan coverage for abortions was associated with female gender and internal medicine specialty, and support of using tax dollars for abortions for low-income women was associated with older age and internal medicine specialty. The majority of PCPs support health insurance coverage of contraception and abortion, as well as tax dollar subsidization of contraception and abortion services for low-income women. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. EarthServer: a Summary of Achievements in Technology, Services, and Standards

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2015-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, mostly are made up of coverage data, according to ISO and OGC defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The transatlantic EarthServer initiative, running from 2011 through 2014, has united 11 partners to establish Big Earth Data Analytics. A key ingredient has been flexibility for users to ask whatever they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level, standards-based query languages which unify data and metadata search in a simple, yet powerful way. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built on rasdaman, the pioneer and leading Array DBMS for any-size multi-dimensional raster data, extended with support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); and the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. 
Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS), which defines a high-level coverage query language. Reviewers have attested that, "With no doubt the project has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond". We present the project approach, its outcomes and impact on standardization and Big Data technology, and vistas for the future.
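
    A WCPS request is a textual query shipped to the server; the Python sketch below builds one via the WCS ProcessCoverages operation. The coverage name, axis labels, and endpoint are illustrative, not EarthServer's actual deployment; real axis names depend on the coverage's coordinate reference system:

```python
from urllib.parse import urlencode

def wcps_request(endpoint, coverage, lat, lon, t0, t1):
    """Build a WCS ProcessCoverages (WCPS) request that extracts a point
    time series from a coverage and encodes it as CSV."""
    query = (
        f'for c in ({coverage}) '
        f'return encode(c[Lat({lat}), Long({lon}), ansi("{t0}":"{t1}")], "csv")'
    )
    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    }
    return f"{endpoint}?{urlencode(params)}"

# Illustrative coverage name, coordinates, and endpoint.
url = wcps_request("http://example.org/rasdaman/ows",
                   "AvgLandTemp", 53.08, 8.80, "2014-01", "2014-12")
print(url)
```

    The query itself, not the client, expresses the server-side processing, which is exactly how WCPS moves computation to the data.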

  7. Air Quality uFIND: User-oriented Tool Set for Air Quality Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Hoijarvi, K.; Robinson, E. M.; Husar, R. B.; Falke, S. R.; Schultz, M. G.; Keating, T. J.

    2012-12-01

    Historically, there have been major impediments to seamless and effective data usage encountered by both data providers and users. Over the last five years, the international Air Quality (AQ) Community has worked through forums such as the Group on Earth Observations AQ Community of Practice, the ESIP AQ Working Group, and the Task Force on Hemispheric Transport of Air Pollution to converge on data format standards (e.g., netCDF), data access standards (e.g., Open Geospatial Consortium Web Coverage Services), metadata standards (e.g., ISO 19115), as well as other conventions (e.g., the CF Naming Convention) in order to build an Air Quality Data Network. The centerpiece of the AQ Data Network is the web service-based tool set uFIND (user-oriented Filtering and Identification of Networked Data). The purpose of uFIND is to provide rich and powerful facilities for the user to: a) discover and choose a desired dataset by navigating the multi-dimensional metadata space using faceted search, b) seamlessly access and browse datasets, and c) use uFIND's facilities as a web service for mashups with other AQ applications and portals. In a user-centric information system such as uFIND, the user experience is improved by metadata that includes the general fields for discovery as well as community-specific metadata to narrow the search beyond space, time, and generic keyword searches. Even with the community-specific additions, the ISO 19115 records were formed in compliance with the standard, so that other standards-based search interfaces could leverage this additional information. To identify the fields necessary for metadata discovery, we started with the ISO 19115 Core Metadata fields and the fields needed for a Catalog Service for the Web (CSW) Record. This fulfilled two goals: creating valid ISO 19115 records, and making the records retrievable through a Catalog Service for the Web query. 
Beyond the required set of fields, the AQ Community added additional fields using a combination of keywords and ISO 19115 fields. These extensions allow discovery by measurement platform or observed phenomena. Beyond discovery metadata, the AQ records include service identification objects that allow standards-based clients, such as some brokers, to access the data found via OGC WCS or WMS data access protocols. uFIND is one such smart client; this combination of discovery and access metadata allows the user to preview each registered dataset through spatial and temporal views, observe the data access and usage pattern, and find links to dataset-specific metadata directly in uFIND. The AQ data providers also benefit from this architecture, since their data products are easier to find and re-use, enhancing the relevance and importance of their products. Finally, the earth science community at large benefits from the Service Oriented Architecture of uFIND: since uFIND is itself a service, it allows service-based interfacing with providers and users of the metadata, so that uFIND facets can be further refined for a particular AQ application or completely repurposed for other Earth Science domains that use the same set of data access and metadata standards.

  8. Similarity Based Semantic Web Service Match

    NASA Astrophysics Data System (ADS)

    Peng, Hui; Niu, Wenjia; Huang, Ronghuai

Semantic web service discovery aims at returning the advertised services that best match a request by comparing the semantics of the requested service with those of the advertised services. The semantics of a web service are described in terms of inputs, outputs, preconditions and results in the Ontology Web Language for Services (OWL-S), which is formalized by the W3C. In this paper we propose an algorithm to calculate the semantic similarity of two services by a weighted average of their input and output similarities. Case studies and applications show the effectiveness of our algorithm in service matching.
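The weighted-averaging idea can be sketched as follows; the toy concept-similarity measure, the equal weights, and the concept names are illustrative assumptions, not the authors' actual formulation:

```python
# Sketch: match two OWL-S service profiles by weighted averaging of input and
# output similarities. The pairwise concept similarity is a placeholder
# (token overlap); a real matcher would use an ontology-based measure.

def concept_similarity(a: str, b: str) -> float:
    """Toy similarity between two concept names (hypothetical measure)."""
    if a == b:
        return 1.0
    ta, tb = set(a.lower().split("_")), set(b.lower().split("_"))
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def set_similarity(request: list, advertised: list) -> float:
    """Average best-match similarity of request concepts vs. advertised ones."""
    if not request:
        return 1.0
    if not advertised:
        return 0.0
    return sum(max(concept_similarity(r, a) for a in advertised)
               for r in request) / len(request)

def service_similarity(req_in, req_out, adv_in, adv_out, w_in=0.5, w_out=0.5):
    """Weighted average of the input and output similarities."""
    return w_in * set_similarity(req_in, adv_in) + w_out * set_similarity(req_out, adv_out)

sim = service_similarity(["city_name"], ["weather_report"],
                         ["city_name"], ["weather_forecast"])
```

A request whose inputs match exactly but whose output concept only partially overlaps thus receives an intermediate score rather than a hard reject.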

  9. The Effectiveness of Web Search Engines to Index New Sites from Different Countries

    ERIC Educational Resources Information Center

    Pirkola, Ari

    2009-01-01

    Introduction: Investigates how effectively Web search engines index new sites from different countries. The primary interest is whether new sites are indexed equally or whether search engines are biased towards certain countries. If major search engines show biased coverage it can be considered a significant economic and political problem because…

  10. Boverhof's App Earns Honorable Mention in Amazon's Web Services

    Science.gov Websites

    Boverhof's app earned an honorable mention in a competition held by Amazon Web Services (AWS). Amazon officially announced the winners of its EC2 Spotathon on Monday.

  11. Biological Web Service Repositories Review

    PubMed Central

    Urdidiales-Nieto, David; Navas-Delgado, Ismael

    2016-01-01

    Abstract Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to changes in the repositories (services registered or unregistered) and at the service level (annotation changes). Thus, users, software clients and workflow-based approaches lack relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of a repository could be a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the most well-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness, introduced in this paper, is used as the measure of the dynamism of these repositories. PMID:27783459

  12. A Method for Transforming Existing Web Service Descriptions into an Enhanced Semantic Web Service Framework

    NASA Astrophysics Data System (ADS)

    Du, Xiaofeng; Song, William; Munro, Malcolm

    Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles this problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+ based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.

  13. 2009-2010 Seasonal Influenza Vaccination Coverage among College Students from 8 Universities in North Carolina

    ERIC Educational Resources Information Center

    Poehling, Katherine A.; Blocker, Jill; Ip, Edward H.; Peters, Timothy R.; Wolfson, Mark

    2012-01-01

    Objective: The authors sought to describe the 2009-2010 seasonal influenza vaccine coverage of college students. Participants: A total of 4,090 college students from 8 North Carolina universities participated in a confidential, Web-based survey in October-November 2009. Methods: Associations between self-reported 2009-2010 seasonal influenza…

  14. Recent proposals to limit Medigap coverage and modify Medicare cost sharing.

    PubMed

    Linehan, Kathryn

    2012-02-24

    As policymakers look for savings from the Medicare program, some have proposed eliminating or discouraging "first-dollar coverage" available through privately purchased Medigap policies. Medigap coverage, which beneficiaries obtain to protect themselves from Medicare's cost-sharing requirements and its lack of a cap on out-of-pocket spending, may discourage the judicious use of medical services by reducing or eliminating beneficiary cost sharing. It is estimated that eliminating such coverage, which has been shown to be associated with higher Medicare spending, and requiring some cost sharing would encourage beneficiaries to reduce their service use and thus reduce pro­gram spending. However, eliminating first-dollar coverage could cause some beneficiaries to incur higher spending or forego necessary services. Some policy proposals to eliminate first-dollar coverage would also modify Medicare's cost sharing and add an out-of-pocket spending cap for fee-for-service Medicare. This paper discusses Medicare's current cost-sharing requirements, Medigap insurance, and proposals to modify Medicare's cost sharing and eliminate first-dollar coverage in Medigap plans. It reviews the evidence on the effects of first-dollar coverage on spending, some objections to eliminating first-dollar coverage, and results of research that has modeled the impact of eliminating first-dollar coverage, modifying Medicare's cost-sharing requirements, and adding an out-of-pocket limit on beneficiaries' spending.

  15. Measuring effective coverage of curative child health services in rural Burkina Faso: a cross-sectional study

    PubMed Central

    Koulidiati, Jean-Louis; Nesbitt, Robin C; Ouedraogo, Nobila; Hien, Hervé; Robyn, Paul Jacob; Compaoré, Philippe; Souares, Aurélia; Brenner, Stephan

    2018-01-01

    Objective To estimate both crude and effective curative health services coverage provided by rural health facilities to under 5-year-old (U5YO) children in Burkina Faso. Methods We surveyed 1298 child health providers and 1681 clinical cases across 494 primary-level health facilities, as well as 12,497 U5YO children across 7,347 households in the facilities’ catchment areas. Facilities were scored based on a set of indicators along three quality-of-care dimensions: management of common childhood diseases, management of severe childhood diseases and general service readiness. Linking service quality to service utilisation, we estimated both crude and effective coverage of U5YO children by these selected curative services. Results Measured performance quality among facilities was generally low with only 12.7% of facilities surveyed reaching our definition of high and 57.1% our definition of intermediate quality of care. The crude coverage was 69.5% while the effective coverages indicated that 5.3% and 44.6% of children reporting an illness episode received services of only high or high and intermediate quality, respectively. Conclusion Our study showed that the quality of U5YO child health services provided by primary-level health facilities in Burkina Faso was low, resulting in relatively ineffective population coverage. Poor adherence to clinical treatment guidelines combined with the lack of equipment and qualified clinical staff that performed U5YO consultations seemed to be contributors to the gap between crude and effective coverage. PMID:29858415

  16. Enhancing the AliEn Web Service Authentication

    NASA Astrophysics Data System (ADS)

    Zhu, Jianlin; Saiz, Pablo; Carminati, Federico; Betev, Latchezar; Zhou, Daicui; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Grigoras, Costin; Furano, Fabrizio; Schreiner, Steffen; Vladimirovna Datskova, Olga; Sankar Banerjee, Subho; Zhang, Guoping

    2011-12-01

    Web Services are an XML-based technology that allows applications to communicate with each other across disparate systems, and they are becoming the de facto standard for interoperability between heterogeneous processes and systems. AliEn2 is a grid environment based on web services. The AliEn2 services can be divided into three categories: central services, deployed once per organization; site services, deployed at each of the participating centers; and Job Agents, running automatically on the worker nodes. A security model to protect these services is essential for the whole system. Current implementations of web servers, such as Apache, are not suitable for use within the grid environment: Apache with mod_ssl and OpenSSL supports only X.509 certificates, but in the grid environment the common credential is the proxy certificate, used to provide restricted proxies and delegation. An authentication framework was developed for the AliEn2 web services to add to the Apache web server the ability to accept both X.509 certificates and proxy certificates from the client side. The authentication framework also allows the generation of access control policies to limit access to the AliEn2 web services.

  17. The impact of web services at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Trabant, C. M.; Ahern, T. K.; Stults, M.; Suleiman, Y. Y.; Van Fossen, M.; Weertman, B.

    2015-12-01

    The IRIS Data Management Center (DMC) has served the seismological community for nearly 25 years. In that time we have offered data and information from our archive using a variety of mechanisms ranging from email-based to desktop applications to web applications and web services. Of these, web services have quickly become the primary method for data extraction at the DMC. In 2011, the first full year of operation, web services accounted for over 40% of the data shipped from the DMC. In 2014, over 450 TB of data was delivered directly to users through web services, representing nearly 70% of all shipments from the DMC that year. In addition to handling requests directly from users, the DMC switched all data extraction methods to use web services in 2014. On average the DMC now handles between 10 and 20 million requests per day submitted to web service interfaces. The rapid adoption of web services is attributed to the many advantages they bring. For users, they provide on-demand data using an interface technology, HTTP, that is widely supported in nearly every computing environment and language. These characteristics, combined with human-readable documentation and existing tools, make integration of data access into existing workflows relatively easy. For the DMC, the web services provide an abstraction layer to internal repositories allowing for concentrated optimization of extraction workflow and easier evolution of those repositories. Lending further support to the DMC's push in this direction, the core web services for station metadata, timeseries data and event parameters were adopted as standards by the International Federation of Digital Seismograph Networks (FDSN). We expect to continue enhancing existing services and building new capabilities for this platform. For example, the DMC has created a federation system and tools allowing researchers to discover and collect seismic data from data centers running the FDSN-standardized services. 
A future capability will leverage the DMC's MUSTANG project to select data based on data quality measurements. Within five years, the DMC's web services have proven to be a robust and flexible platform that enables continued growth for the DMC. We expect continued enhancements and adoption of web services.
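The FDSN-standardized services mentioned above are plain HTTP interfaces, so a request can be assembled with nothing more than a query string. A minimal sketch follows; the base URL and parameter values are illustrative, while the path and parameter names follow the FDSN web service specification:

```python
# Sketch: assemble a query URL for an FDSN-standard station web service.
# Any HTTP client can then fetch the URL; with format=text the service
# returns a delimited station listing.
from urllib.parse import urlencode

def fdsn_station_url(base: str, **params) -> str:
    """Build a query URL for the fdsnws-station service (version 1)."""
    return f"{base}/fdsnws/station/1/query?{urlencode(params)}"

url = fdsn_station_url("http://service.iris.edu",
                       net="IU", sta="ANMO",
                       starttime="2014-01-01", endtime="2014-12-31",
                       level="station", format="text")
```

The same pattern applies to the companion fdsnws-dataselect (timeseries) and fdsnws-event services, which is what makes federated discovery across data centers practical.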

  18. Rail-RNA: scalable analysis of RNA-seq splicing and coverage.

    PubMed

    Nellore, Abhinav; Collado-Torres, Leonardo; Jaffe, Andrew E; Alquicira-Hernández, José; Wilks, Christopher; Pritt, Jacob; Morton, James; Leek, Jeffrey T; Langmead, Ben

    2017-12-15

    RNA sequencing (RNA-seq) experiments now span hundreds to thousands of samples. Current spliced alignment software is designed to analyze each sample separately. Consequently, no information is gained from analyzing multiple samples together, and it requires extra work to obtain analysis products that incorporate data from across samples. We describe Rail-RNA, a cloud-enabled spliced aligner that analyzes many samples at once. Rail-RNA eliminates redundant work across samples, making it more efficient as samples are added. For many samples, Rail-RNA is also more accurate than annotation-assisted aligners. We use Rail-RNA to align 667 RNA-seq samples from the GEUVADIS project on Amazon Web Services in under 16 h for US$0.91 per sample. Rail-RNA outputs alignments in SAM/BAM format, but it also outputs (i) base-level coverage bigWigs for each sample; (ii) coverage bigWigs encoding normalized mean and median coverages at each base across the samples analyzed; and (iii) exon-exon splice junctions and indels (features) in columnar formats that juxtapose coverages in the samples in which a given feature is found. Supplementary outputs are ready for use with downstream packages for reproducible statistical analysis. We use Rail-RNA to identify expressed regions in the GEUVADIS samples and show that both annotated and unannotated (novel) expressed regions exhibit consistent patterns of variation across populations and with respect to known confounding variables. Rail-RNA is open-source software available at http://rail.bio. Contact: anellore@gmail.com or langmea@cs.jhu.edu. Supplementary data are available at Bioinformatics online.

  19. 5 CFR 300.702 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Statutory Bar to Appointment of Persons Who Fail To Register Under Selective Service Law § 300.702 Coverage. Appointments in the competitive service, the excepted service, the Senior Executive Service, or any other civil...

  20. 5 CFR 300.702 - Coverage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Statutory Bar to Appointment of Persons Who Fail To Register Under Selective Service Law § 300.702 Coverage. Appointments in the competitive service, the excepted service, the Senior Executive Service, or any other civil...

  1. Web service discovery among large service pools utilising semantic similarity and clustering

    NASA Astrophysics Data System (ADS)

    Chen, Fuzan; Li, Minqiang; Wu, Harris; Xie, Lingli

    2017-03-01

    With the rapid development of electronic business, Web services have attracted much attention in recent years. Enterprises can combine individual Web services to provide new value-added services. An emerging challenge is the timely discovery of close matches to service requests among large service pools. In this study, we first define a new semantic similarity measure combining functional similarity and process similarity. We then present a service discovery mechanism that utilises the new semantic similarity measure for service matching. All the published Web services are pre-grouped into functional clusters prior to the matching process. For a user's service request, the discovery mechanism first identifies matching services clusters and then identifies the best matching Web services within these matching clusters. Experimental results show that the proposed semantic discovery mechanism performs better than a conventional lexical similarity-based mechanism.
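The two-stage discovery idea (route the request to the nearest functional cluster, then rank only within it) can be sketched as follows; the feature vectors and cosine measure are simple stand-ins for the paper's semantic similarity:

```python
# Sketch: cluster-then-match service discovery. Services are pre-grouped
# into functional clusters; a request is first routed to the cluster whose
# centroid is most similar, and only that cluster's services are ranked.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def discover(request_vec, clusters):
    """clusters: {centroid_tuple: [(service_name, feature_vector), ...]}"""
    # Stage 1: identify the best-matching functional cluster.
    centroid = max(clusters, key=lambda c: cosine(request_vec, c))
    # Stage 2: rank the services inside that cluster only.
    return sorted(clusters[centroid],
                  key=lambda s: cosine(request_vec, s[1]), reverse=True)

clusters = {
    (1.0, 0.0): [("payment-a", (0.9, 0.1)), ("payment-b", (0.8, 0.0))],
    (0.0, 1.0): [("weather-a", (0.1, 0.9))],
}
ranked = discover((1.0, 0.2), clusters)
```

The payoff is that matching cost scales with the size of one cluster rather than the whole service pool, which is the point of pre-grouping large pools.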

  2. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the entire business process of an enterprise can be built. Many standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. A key problem with most realistic approaches to service composition is the verification of the composed web services, which must rely on formal verification methods to ensure correctness. A few research works in the literature address the verification of web services for deterministic systems, but the existing models do not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Web services are first composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results show better performance in finding dead transitions and deadlocks compared with existing models.

  3. Analyzing CRISM hyperspectral imagery using PlanetServer.

    NASA Astrophysics Data System (ADS)

    Figuera, Ramiro Marco; Pham Huu, Bang; Minin, Mikhail; Flahaut, Jessica; Halder, Anik; Rossi, Angelo Pio

    2017-04-01

    Mineral characterization of planetary surfaces bears great importance for space exploration, and orbital hyperspectral imagery is widely used to perform it. In our research we use Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) [1] TRDR L observations with a spectral range of 1 to 4 µm. PlanetServer comprises a server, a web client and a Python client/API. The server side uses the Array DataBase Management System (DBMS) Raster Data Manager (rasdaman) Community Edition [2]. OGC standards such as the Web Coverage Processing Service (WCPS) [3], an SQL-like language capable of querying information along the image cube, are implemented in the Petascope component [4]. The client side uses NASA's Web World Wind [5], allowing the user to access the data in an intuitive way. The client consists of a globe where all cubes are deployed, a main menu where projections, base maps and RGB combinations are provided, and a plot dock where the spectral information is shown. The RGB combinator tool allows band combinations such as the CRISM products [6] using WCPS. The spectral information is retrieved using WCPS and shown in the plot dock/widget. The USGS splib06a library [7] is available to compare CRISM spectra against laboratory spectra. The Python API provides an environment to create RGB combinations that can be embedded into existing pipelines. All employed libraries and tools are open source and can easily be adapted to other datasets. PlanetServer stands as a promising tool for spectral analysis of planetary bodies. M3/Moon and OMEGA datasets will soon be available. [1] S. Murchie et al., "Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on Mars Reconnaissance Orbiter (MRO)," J. Geophys. Res. E Planets, 2007. [2] P. Baumann, A. Dehmel, P. Furtado, R. Ritsch, and N. Widmann, "The multidimensional database system RasDaMan," ACM SIGMOD Rec., vol. 27, no. 2, pp. 575-577, Jun. 1998. [3] P. Baumann, "The OGC web coverage processing service (WCPS) standard," Geoinformatica, vol. 14, no. 4, Jul. 2010. [4] A. Aiordǎchioaie and P. Baumann, "PetaScope: An open-source implementation of the OGC WCS Geo service standards suite," Lect. Notes Comput. Sci., vol. 6187 LNCS, pp. 160-168, Jun. 2010. [5] P. Hogan, C. Maxwell, R. Kim, and T. Gaskins, "World Wind 3D Earth Viewing," Apr. 2007. [6] C. E. Viviano-Beck et al., "Revised CRISM spectral parameters and summary products based on the currently detected mineral diversity on Mars," J. Geophys. Res. E Planets, vol. 119, no. 6, pp. 1403-1431, Jun. 2014. [7] R. N. Clark et al., "USGS digital spectral library splib06a: U.S. Geological Survey, Digital Data Series 231," 2007. [Online]. Available: http://speclab.cr.usgs.gov/spectral.lib06.
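As an illustration of the kind of server-side processing WCPS enables, the sketch below composes a query that averages a spectral subset of a coverage; the coverage name and axis label are hypothetical, since actual identifiers depend on how the cubes were ingested:

```python
# Sketch: build a WCPS query that computes the average over a band range
# of a coverage server-side, so only the scalar result travels to the
# client. The "for ... in (...) return ..." form and the axis-trim
# subscript follow the WCPS query syntax.

def wcps_band_average(coverage: str, axis: str, lo: int, hi: int) -> str:
    """Return a WCPS query averaging one spectral slice of a coverage."""
    return (f"for c in ({coverage}) "
            f"return avg(c[{axis}({lo}:{hi})])")

query = wcps_band_average("frt00003e12_07_if166l_trr3", "band", 100, 120)
```

A client would typically submit this string as the query body of a ProcessCoverages request to the server's WCPS endpoint.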

  4. Pragmatic Computing - A Semiotic Perspective to Web Services

    NASA Astrophysics Data System (ADS)

    Liu, Kecheng

    The web seems to have evolved from a syntactic web through a semantic web to a pragmatic web. This evolution conforms to the study of information and technology in the theory of semiotics. Pragmatics, concerned with the use of information in relation to context and intended purpose, is extremely important in web services and applications. Much research in pragmatics has been carried out, but at the same time, attempts and solutions have led to further questions. After reviewing current work on the pragmatic web, the paper presents a semiotic approach to web services, particularly on request decomposition and service aggregation.

  5. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  6. Management intensity and vegetation complexity affect web-building spiders and their prey.

    PubMed

    Diehl, Eva; Mader, Viktoria L; Wolters, Volkmar; Birkhofer, Klaus

    2013-10-01

    Agricultural management and vegetation complexity affect arthropod diversity and may alter trophic interactions between predators and their prey. Web-building spiders are abundant generalist predators and important natural enemies of pests. We analyzed how management intensity (tillage, cutting of the vegetation, grazing by cattle, and synthetic and organic inputs) and vegetation complexity (plant species richness, vegetation height, coverage, and density) affect rarefied richness and composition of web-building spiders and their prey with respect to prey availability and aphid predation in 12 habitats, ranging from an uncut fallow to a conventionally managed maize field. Spiders and prey from webs were collected manually and the potential prey were quantified using sticky traps. The species richness of web-building spiders and the order richness of prey increased with plant diversity and vegetation coverage. Prey order richness was lower at tilled compared to no-till sites. Hemipterans (primarily aphids) were overrepresented, while dipterans, hymenopterans, and thysanopterans were underrepresented in webs compared to sticky traps. The per spider capture efficiency for aphids was higher at tilled than at no-till sites and decreased with vegetation complexity. After accounting for local densities, 1.8 times more aphids were captured at uncut compared to cut sites. Our results emphasize the functional role of web-building spiders in aphid predation, but suggest negative effects of cutting or harvesting. We conclude that reduced management intensity and increased vegetation complexity help to conserve local invertebrate diversity, and that web-building spiders at sites under low management intensity (e.g., semi-natural habitats) contribute to aphid suppression at the landscape scale.

  7. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service composition (WB-WSC) is one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the Business Process Execution Language (BPEL) are the main techniques in this category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be assessed. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
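For intuition on QoS aggregation, consider the simplest case of a purely sequential composition: response times add, while reliabilities (success probabilities) multiply. A minimal sketch with made-up service names and numbers, not the paper's CPN-based method:

```python
# Sketch: aggregate QoS for a sequential workflow of web services.
# Each service is (name, response_time_seconds, reliability); the
# composite response time is the sum and the composite reliability
# is the product, assuming independent failures.

def sequential_qos(services):
    """Return (total response time, overall reliability) of a chain."""
    total_time = sum(t for _, t, _ in services)
    total_reliability = 1.0
    for _, _, r in services:
        total_reliability *= r
    return total_time, total_reliability

time_s, rel = sequential_qos([("auth", 0.2, 0.999),
                              ("inventory", 0.5, 0.99),
                              ("billing", 0.3, 0.995)])
```

Parallel, conditional and looping constructs need different aggregation rules, which is why explicit routing constructs matter in the analytical model.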

  8. Enhancing UCSF Chimera through web services.

    PubMed

    Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E

    2014-07-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
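The job submission, monitoring and retrieval cycle that Chimera streamlines follows a generic submit-poll-retrieve pattern. The sketch below illustrates only that control flow; the `JobService` stub is hypothetical and is not the Opal API:

```python
# Sketch: the submit / poll status / fetch result loop that desktop
# clients wrap around remote job services. JobService is a stand-in
# stub whose jobs finish after two status polls.
import time

class JobService:
    def __init__(self):
        self._polls = {}

    def submit(self, payload) -> str:
        job_id = f"job-{len(self._polls)}"
        self._polls[job_id] = 0
        return job_id

    def status(self, job_id: str) -> str:
        self._polls[job_id] += 1
        return "DONE" if self._polls[job_id] >= 2 else "RUNNING"

    def result(self, job_id: str) -> str:
        return f"result-of-{job_id}"

def run_job(service, payload, poll_interval=0.0):
    """Submit a job, poll until it finishes, then retrieve its result."""
    job_id = service.submit(payload)
    while service.status(job_id) != "DONE":
        time.sleep(poll_interval)
    return service.result(job_id)

output = run_job(JobService(), {"sequence": "MKTAYIAK"})
```

Hiding this loop behind one call is what lets an interactive application keep the user focused on the science rather than on job bookkeeping.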

  9. 47 CFR 1.946 - Construction and coverage requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and... requirements. For each of the Wireless Radio Services, requirements for construction and commencement of... service requirements. In certain Wireless Radio Services, licensees must comply with geographic coverage...

  10. 47 CFR 1.946 - Construction and coverage requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and... requirements. For each of the Wireless Radio Services, requirements for construction and commencement of... service requirements. In certain Wireless Radio Services, licensees must comply with geographic coverage...

  11. 47 CFR 1.946 - Construction and coverage requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and... requirements. For each of the Wireless Radio Services, requirements for construction and commencement of... service requirements. In certain Wireless Radio Services, licensees must comply with geographic coverage...

  12. A secure mobile crowdsensing (MCS) location tracker for elderly in smart city

    NASA Astrophysics Data System (ADS)

    Shien, Lau Khai; Singh, Manmeet Mahinderjit

    2017-10-01

    According to the United Nations' projection, Malaysia will reach ageing-population status by 2030. The challenge of the growing ageing population is health and social care services. As the population lives longer, the cost of institutional care rises, and many elderly are not able to live independently in their own homes without caregivers. Moreover, this restricts their activity area, safety and freedom in daily life. Hence, a tracking system is valuable for caregivers to track their real-time location efficiently. Current tracking and monitoring systems are unable to satisfy the needs of the community. Hence, the Indoor-Outdoor Elderly Secure and Tracking care system (IOET) is proposed to track and monitor the elderly. This Mobile Crowdsensing system uses indoor and outdoor positioning to locate the elderly, utilizing RFID, NFC, a biometric system and GPS to ensure their safety in indoor and outdoor environments. A mobile application and a web-based application are designed for this system. The system provides real-time tracking by combining GPS and NFC for outdoor coverage, ideally in a smart city. For indoor coverage, the system uses active RFID to track elderly movement. The system notifies the caregiver of any elderly movement or request through a real-time notification service. The caregiver can also review the places visited by the elderly and trace back their movements.

  13. 29 CFR 552.99 - Basis for coverage of domestic service employees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... coverage of domestic service employees. Congress in section 2(a) of the Act specifically found that the... v. McClung, 379 U.S. 294 (1964)),” and concluded “that coverage of domestic employees is a vital...

  14. 29 CFR 552.99 - Basis for coverage of domestic service employees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... coverage of domestic service employees. Congress in section 2(a) of the Act specifically found that the... v. McClung, 379 U.S. 294 (1964)),” and concluded “that coverage of domestic employees is a vital...

  15. 29 CFR 552.99 - Basis for coverage of domestic service employees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... coverage of domestic service employees. Congress in section 2(a) of the Act specifically found that the... v. McClung, 379 U.S. 294 (1964)),” and concluded “that coverage of domestic employees is a vital...

  16. 29 CFR 552.99 - Basis for coverage of domestic service employees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... coverage of domestic service employees. Congress in section 2(a) of the Act specifically found that the... v. McClung, 379 U.S. 294 (1964)),” and concluded “that coverage of domestic employees is a vital...

  17. 29 CFR 552.99 - Basis for coverage of domestic service employees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... coverage of domestic service employees. Congress in section 2(a) of the Act specifically found that the... v. McClung, 379 U.S. 294 (1964)),” and concluded “that coverage of domestic employees is a vital...

  18. 5 CFR 359.701 - Coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Coverage. 359.701 Section 359.701 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REMOVAL FROM THE SENIOR EXECUTIVE SERVICE; GUARANTEED PLACEMENT IN OTHER PERSONNEL SYSTEMS Guaranteed Placement § 359.701 Coverage...

  19. 5 CFR 359.701 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Coverage. 359.701 Section 359.701 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REMOVAL FROM THE SENIOR EXECUTIVE SERVICE; GUARANTEED PLACEMENT IN OTHER PERSONNEL SYSTEMS Guaranteed Placement § 359.701 Coverage...

  20. 5 CFR 359.701 - Coverage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Coverage. 359.701 Section 359.701 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REMOVAL FROM THE SENIOR EXECUTIVE SERVICE; GUARANTEED PLACEMENT IN OTHER PERSONNEL SYSTEMS Guaranteed Placement § 359.701 Coverage...

  1. 5 CFR 359.701 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Coverage. 359.701 Section 359.701 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REMOVAL FROM THE SENIOR EXECUTIVE SERVICE; GUARANTEED PLACEMENT IN OTHER PERSONNEL SYSTEMS Guaranteed Placement § 359.701 Coverage...

  2. 5 CFR 359.701 - Coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Coverage. 359.701 Section 359.701 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REMOVAL FROM THE SENIOR EXECUTIVE SERVICE; GUARANTEED PLACEMENT IN OTHER PERSONNEL SYSTEMS Guaranteed Placement § 359.701 Coverage...

  3. User Needs of Digital Service Web Portals: A Case Study

    ERIC Educational Resources Information Center

    Heo, Misook; Song, Jung-Sook; Seol, Moon-Won

    2013-01-01

    The authors examined the needs of digital information service web portal users. More specifically, the needs of Korean cultural portal users were examined as a case study. The conceptual framework of a web-based portal is that it is a complex, web-based service application with characteristics of information systems and service agents. In…

  4. Compression-based aggregation model for medical web services.

    PubMed

    Al-Shammary, Dhiah; Khalil, Ibrahim

    2010-01-01

    Many organizations, such as hospitals, have adopted Cloud Web services for their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), an XML-based protocol, is the basic communication protocol of Cloud Web services. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead, and the massive load on Cloud Web services from the large volume of client requests compounds the problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed to aggregate medical Web messages and achieve higher message size reduction.
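    The abstract's core observation, that verbose XML overhead inflates SOAP traffic and that aggregating similar messages before compression helps, can be illustrated with a minimal sketch. This is not the paper's actual aggregation model; it only shows, using standard gzip, that batching structurally similar SOAP envelopes compresses far better than compressing each one alone. The envelope payload is hypothetical.

    ```python
    import gzip

    # A minimal SOAP envelope; the body payload is hypothetical, for illustration.
    soap_message = b"""<?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
      <soap:Body>
        <getPatientRecord><patientId>12345</patientId></getPatientRecord>
      </soap:Body>
    </soap:Envelope>"""

    # Aggregate several similar messages and compress them together: the shared
    # XML structure compresses far better in one stream than message-by-message.
    batch = b"".join(soap_message for _ in range(10))
    compressed_batch = gzip.compress(batch)
    compressed_single = gzip.compress(soap_message)

    print(len(batch), len(compressed_batch), 10 * len(compressed_single))
    ```

    The ratio between the last two numbers is the payoff of aggregation: one compressed stream of ten messages is much smaller than ten individually compressed messages.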

  5. A snapshot of 3649 Web-based services published between 1994 and 2017 shows a decrease in availability after 2 years.

    PubMed

    Osz, Ágnes; Pongor, Lorinc Sándor; Szirmai, Danuta; Gyorffy, Balázs

    2017-12-08

    The long-term availability of online Web services is of utmost importance to ensure reproducibility of analytical results. However, because of lack of maintenance following acceptance, many servers become unavailable after a short period of time. Our aim was to monitor the accessibility and the decay rate of published Web services, and to determine the factors underlying these trends. We searched PubMed to identify publications containing Web server-related terms published between 1994 and 2017. Automatic and manual screening was used to check the status of each Web service. Kruskal-Wallis, Mann-Whitney and Chi-square tests were used to evaluate various parameters, including availability, accessibility, platform, origin of authors, citation, journal impact factor and publication year. We identified 3649 publications in 375 journals, of which 2522 (69%) were currently active. Over 95% of sites were running in the first 2 years, but this rate dropped to 84% in the third year and gradually sank afterwards (P < 1e-16). The mean half-life of Web services is 10.39 years. Working Web services were published in journals with higher impact factors (P = 4.8e-04). Services published before the year 2000 received minimal attention. Offline services were cited less than those still online (P = 0.022). The majority of Web services provide analytical tools, and the proportion of databases is slowly decreasing. In conclusion, almost one-third of the Web services published to date have gone out of service. We recommend continued support of Web-based services to increase the reproducibility of published results. © The Author 2017. Published by Oxford University Press.

  6. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
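    The abstract mentions dbfetch as the EMBL-EBI entry-retrieval service. A small sketch of how a client might build a dbfetch-style request URL is below; the base URL follows the pattern published on the EMBL-EBI site, but the database and identifier values are examples, and the exact parameter set should be checked against the current EMBL-EBI documentation. No network call is made here.

    ```python
    from urllib.parse import urlencode

    # dbfetch-style base URL (check current EMBL-EBI docs for supported params).
    DBFETCH = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"

    def dbfetch_url(db: str, ids: str, fmt: str = "fasta", style: str = "raw") -> str:
        """Build a dbfetch request URL for one or more entry identifiers."""
        query = urlencode({"db": db, "id": ids, "format": fmt, "style": style})
        return DBFETCH + "?" + query

    # Example: fetch a UniProtKB entry as FASTA (identifier is illustrative).
    url = dbfetch_url("uniprotkb", "P12345")
    print(url)
    ```

    The resulting URL could then be fetched with any HTTP client; the sample clients the abstract refers to wrap exactly this kind of request for each supported language.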

  7. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  8. Biological Web Service Repositories Review.

    PubMed

    Urdidiales-Nieto, David; Navas-Delgado, Ismael; Aldana-Montes, José F

    2017-05-01

    Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to changes in the repositories (services registered or unregistered) and at the service level (annotation changes). Thus, users, software clients and workflow-based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of a repository could serve as a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the best-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness is introduced in this paper and is used as the measure of the dynamism of these repositories. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  9. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed

    Halub, L P

    1999-07-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services.

  10. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed Central

    Halub, L P

    1999-01-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services. PMID:10427423

  11. 5 CFR 1650.41 - How to obtain an age-based withdrawal.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... record keeper a properly completed paper TSP age-based withdrawal request form or use the TSP Web site to initiate a request. A participant's ability to complete an age-based withdrawal on the Web will depend on his or her retirement system coverage, marital status, and whether or not part or all of the...

  12. Indexing Aids at Corporate Websites: The Use of Robots.txt and META Tags.

    ERIC Educational Resources Information Center

    Drott, M. Carl

    2002-01-01

    This study examined 60 corporate Web sites to see if they provided support for automatic indexing, particularly the use of robots.txt and Meta tags for keywords and descriptions. It also discusses the use of Java and cookies and suggests that an increase in indexing aids would improve overall index coverage of the Web. (Author/LRW)
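    The robots.txt mechanism the study examines can be demonstrated with Python's standard-library parser. The robots.txt content below is a hypothetical corporate example of the kind surveyed: it steers crawlers away from one directory while leaving the rest of the site indexable. No network access is needed, since the parser accepts the file's lines directly.

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical corporate robots.txt: block one directory for all crawlers.
    robots_txt = """\
    User-agent: *
    Disallow: /internal/
    """

    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # A compliant indexing robot checks each URL before fetching it.
    print(rp.can_fetch("*", "https://example.com/products.html"))    # True
    print(rp.can_fetch("*", "https://example.com/internal/hr.html")) # False
    ```

    The Meta tags the study also counts are ordinary HTML, e.g. `<meta name="keywords" content="...">` and `<meta name="description" content="...">` in a page's `<head>`, which early search engines read when building their indexes.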

  13. Coverage gap in maternal and child health services in India: assessing trends and regional deprivation during 1992-2006.

    PubMed

    Kumar, Chandan; Singh, Prashant Kumar; Rai, Rajesh Kumar

    2013-12-01

    Increasing the coverage of key maternal, newborn and child health interventions is essential if India is to attain Millennium Development Goals 4 and 5. This study assesses the coverage gap in maternal and child health services across states in India during 1992-2006, emphasizing rural-urban disparities. Additionally, the association between the coverage gap and the under-5 mortality rate across states is illustrated. The three waves of the National Family Health Survey (NFHS) conducted during 1992-1993 (NFHS-1), 1998-1999 (NFHS-2) and 2005-2006 (NFHS-3) were used to construct a composite index of the coverage gap in four areas of health-care interventions: family planning, maternal and newborn care, immunization and treatment of sick children. The central, eastern and northeastern regions of India reported a higher coverage gap in maternal and child health care services during 1992-2006, while the rural-urban difference in the coverage gap increased in Gujarat, Haryana, Rajasthan and Kerala over the period. The analysis also shows a significant positive relationship between the coverage gap index and the under-five mortality rate across states. A region- or area-specific focus on increasing the coverage of maternal and child health care services in India should be the priority of policy-makers and programme executors.

  14. MedlinePlus Connect: How it Works

    MedlinePlus

    ... it looks depends on how it is implemented. Web Application The Web application returns a formatted response ... for more examples of Web Application response pages. Web Service The MedlinePlus Connect REST-based Web service ...

  15. Racial and Ethnic Disparities in Services and the Patient Protection and Affordable Care Act

    PubMed Central

    Abdus, Salam; Mistry, Kamila B.

    2015-01-01

    Objectives. We examined prereform patterns in insurance coverage, access to care, and preventive services use by race/ethnicity in adults targeted by the coverage expansions of the Patient Protection and Affordable Care Act (ACA). Methods. We used pre-ACA household data from the Medical Expenditure Panel Survey to identify groups targeted by the coverage provisions of the Act (Medicaid expansions and subsidized Marketplace coverage). We examined racial/ethnic differences in coverage, access to care, and preventive service use, across and within ACA relevant subgroups from 2005 to 2010. The study took place at the Agency for Healthcare Research and Quality in Rockville, Maryland. Results. Minorities were disproportionately represented among those targeted by the coverage provisions of the ACA. Targeted groups had lower rates of coverage, access to care, and preventive services use, and racial/ethnic disparities were, in some cases, widest within these targeted groups. Conclusions. Our findings highlighted the opportunity of the ACA not only to improve coverage, access, and use for all racial/ethnic groups, but also to narrow racial/ethnic disparities in these outcomes. Our results might have particular importance for states that are deciding whether to implement the ACA Medicaid expansions. PMID:26447920

  16. Racial and Ethnic Disparities in Services and the Patient Protection and Affordable Care Act.

    PubMed

    Abdus, Salam; Mistry, Kamila B; Selden, Thomas M

    2015-11-01

    We examined prereform patterns in insurance coverage, access to care, and preventive services use by race/ethnicity in adults targeted by the coverage expansions of the Patient Protection and Affordable Care Act (ACA). We used pre-ACA household data from the Medical Expenditure Panel Survey to identify groups targeted by the coverage provisions of the Act (Medicaid expansions and subsidized Marketplace coverage). We examined racial/ethnic differences in coverage, access to care, and preventive service use, across and within ACA relevant subgroups from 2005 to 2010. The study took place at the Agency for Healthcare Research and Quality in Rockville, Maryland. Minorities were disproportionately represented among those targeted by the coverage provisions of the ACA. Targeted groups had lower rates of coverage, access to care, and preventive services use, and racial/ethnic disparities were, in some cases, widest within these targeted groups. Our findings highlighted the opportunity of the ACA not only to improve coverage, access, and use for all racial/ethnic groups, but also to narrow racial/ethnic disparities in these outcomes. Our results might have particular importance for states that are deciding whether to implement the ACA Medicaid expansions.

  17. Unifying Access to National Hydrologic Data Repositories via Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.

    2006-12-01

    The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e. system components are represented as web services with well-defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (stream flow, groundwater and water quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC and others). Different repositories of hydrologic data use different vocabularies and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To achieve a uniform web services API, data repositories are modeled following the CUAHSI Observation Data Model. The web service responses are document-based and use an XML schema to express the semantics in a standard format. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method executed against national data repositories. The service is implemented in ASP.Net, and other providers are implementing WaterOneFlow services in Java. A reference implementation of WaterOneFlow web services is available. More information about the ongoing development of CUAHSI HIS is available from http://www.cuahsi.org/his/.
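    A GetValues response in this design is an XML document describing a time series. The sketch below parses a simplified, WaterML-style response with Python's standard library; the element names and values here are illustrative, not an actual NWIS payload, so a real client should follow the WaterML schema the services publish.

    ```python
    import xml.etree.ElementTree as ET

    # Simplified, WaterML-style GetValues response (structure and values are
    # illustrative, not an actual repository payload).
    response = """<timeSeriesResponse>
      <timeSeries>
        <variable><variableName>Streamflow</variableName></variable>
        <values>
          <value dateTime="2006-10-01T00:00:00">3.2</value>
          <value dateTime="2006-10-02T00:00:00">3.5</value>
        </values>
      </timeSeries>
    </timeSeriesResponse>"""

    root = ET.fromstring(response)
    variable = root.findtext(".//variableName")
    # Each observation carries its timestamp as an attribute and its value as text.
    series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
    print(variable, series)
    ```

    The document-based response format is what lets one generic GetValues signature serve NWIS, DAYMET and the other wrapped repositories: clients parse one schema regardless of the backing archive.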

  18. 75 FR 58414 - Medicare Program; Meeting of the Medicare Evidence Development and Coverage Advisory Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... focus on issues specific to the list of topics that we have proposed to the Committee. The list of research topics to be discussed at the meeting will be available on the following Web site prior to the... topic, including panel materials, is available at http://www.cms.hhs.gov/center/coverage.asp . We...

  19. A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms

    NASA Astrophysics Data System (ADS)

    Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin

    With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for Web services collaboration. It utilizes roles to specify the privacy privileges of services, and considers the impact of a service's historic experience in playing roles on its reputation degree. Compared to traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus efficiently protecting consumers' privacy.
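    The combination the abstract describes, role-scoped privileges gated by a reputation score, can be sketched minimally as follows. This is not the paper's framework; the role names, privilege sets and the reputation threshold are all hypothetical, chosen only to show how the two checks compose.

    ```python
    # Hypothetical role -> privacy privileges a service holds in that role.
    ROLE_PRIVILEGES = {
        "billing-service":  {"name", "insurance_id"},
        "delivery-service": {"name", "address"},
    }

    # Hypothetical reputation degrees earned from past behaviour in roles.
    REPUTATION = {"billing-service": 0.9, "delivery-service": 0.4}
    MIN_REPUTATION = 0.5  # services with a poor track record lose access

    def may_access(role: str, attribute: str) -> bool:
        """Grant access only if the role holds the privilege AND is reputable."""
        return (attribute in ROLE_PRIVILEGES.get(role, set())
                and REPUTATION.get(role, 0.0) >= MIN_REPUTATION)

    print(may_access("billing-service", "insurance_id"))  # True
    print(may_access("delivery-service", "address"))      # False: low reputation
    ```

    The second check is what makes the authorization fine-grained in the abstract's sense: holding a role is necessary but not sufficient, since a service's history can revoke privileges the role would otherwise grant.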

  20. 5 CFR 317.301 - Conversion coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Conversion coverage. 317.301 Section 317... THE SENIOR EXECUTIVE SERVICE Conversion to the Senior Executive Service § 317.301 Conversion coverage. (a) When applicable. These conversion provisions apply in the following circumstances. (1) The...

  1. 5 CFR 317.301 - Conversion coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Conversion coverage. 317.301 Section 317... THE SENIOR EXECUTIVE SERVICE Conversion to the Senior Executive Service § 317.301 Conversion coverage. (a) When applicable. These conversion provisions apply in the following circumstances. (1) The...

  2. 5 CFR 300.502 - Coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Coverage. 300.502 Section 300.502 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.502 Coverage. (a) These regulations apply to the competitive service and...

  3. 42 CFR 440.335 - Benchmark-equivalent health benefits coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has an aggregate... planning services and supplies and other appropriate preventive services, as designated by the Secretary... State for purposes of comparison in establishing the aggregate actuarial value of the benchmark...

  4. 5 CFR 300.402 - Coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Coverage. 300.402 Section 300.402 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Commercial Recruiting Firms and Nonprofit Employment Services § 300.402 Coverage. This part applies to...

  5. 5 CFR 300.502 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Coverage. 300.502 Section 300.502 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.502 Coverage. (a) These regulations apply to the competitive service and...

  6. 5 CFR 300.502 - Coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Coverage. 300.502 Section 300.502 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.502 Coverage. (a) These regulations apply to the competitive service and...

  7. 5 CFR 752.601 - Coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Coverage. 752.601 Section 752.601 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) ADVERSE... Service § 752.601 Coverage. (a) Adverse actions covered. This subpart applies to suspensions for more than...

  8. 5 CFR 300.502 - Coverage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Coverage. 300.502 Section 300.502 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.502 Coverage. (a) These regulations apply to the competitive service and...

  9. 5 CFR 300.402 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Coverage. 300.402 Section 300.402 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Commercial Recruiting Firms and Nonprofit Employment Services § 300.402 Coverage. This part applies to...

  10. 5 CFR 300.402 - Coverage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Coverage. 300.402 Section 300.402 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Commercial Recruiting Firms and Nonprofit Employment Services § 300.402 Coverage. This part applies to...

  11. 5 CFR 300.402 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Coverage. 300.402 Section 300.402 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Commercial Recruiting Firms and Nonprofit Employment Services § 300.402 Coverage. This part applies to...

  12. 5 CFR 300.702 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Coverage. 300.702 Section 300.702 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Statutory Bar to Appointment of Persons Who Fail To Register Under Selective Service Law § 300.702 Coverage...

  13. 5 CFR 300.402 - Coverage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Coverage. 300.402 Section 300.402 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Commercial Recruiting Firms and Nonprofit Employment Services § 300.402 Coverage. This part applies to...

  14. 5 CFR 300.502 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Coverage. 300.502 Section 300.502 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.502 Coverage. (a) These regulations apply to the competitive service and...

  15. 38 CFR 71.45 - Revocation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the Family Caregiver in transitioning to alternative health care coverage and with mental health... individual with transitioning to alternative health care coverage and with mental health services, unless one... health care coverage and with mental health services. If revocation is due to improvement in the eligible...

  16. 38 CFR 71.45 - Revocation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... the Family Caregiver in transitioning to alternative health care coverage and with mental health... individual with transitioning to alternative health care coverage and with mental health services, unless one... health care coverage and with mental health services. If revocation is due to improvement in the eligible...

  17. Bridging the gap between Hydrologic and Atmospheric communities through a standard based framework

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Salas, F.; Maidment, D. R.; Mazzetti, P.; Santoro, M.; Nativi, S.; Domenico, B.

    2012-04-01

    Data interoperability in the study of Earth sciences is essential to performing interdisciplinary, multi-scale, multi-dimensional analyses (e.g. hydrologic impacts of global warming, regional urbanization, global population growth, etc.). This research aims to bridge the existing gap between the hydrologic and atmospheric communities at both the semantic and technological levels. Within the context of hydrology, scientists are usually concerned with data organized as time series: a time series can be seen as a variable measured at a particular point in space over a period of time (e.g. the stream flow values periodically measured by a buoy sensor in a river). Atmospheric scientists instead usually organize their data as coverages: a coverage can be seen as a multidimensional data array (e.g. satellite images acquired through time). These differences make the setup of a common framework for data discovery and access non-trivial. A set of web service specifications and implementations is already in place in both scientific communities to allow data discovery and access in their respective domains. The CUAHSI Hydrologic Information System (HIS) service stack lists several service types and implementations:
    - a metacatalog (implemented as a CSW) used to discover metadata services by distributing the query to a set of catalogs
    - time series catalogs (implemented as CSW) used to discover datasets published by the feature services
    - feature services (implemented as WFS) containing features with data access links
    - sensor observation services (implemented as SOS) enabling access to the stream of acquisitions
    Within the Unidata framework there lies a similar service stack for atmospheric data:
    - the broker service (implemented as a CSW) distributes a user query to a set of heterogeneous services (i.e. catalog services, but also inventory and access services)
    - the catalog service (implemented as a CSW) harvests the available metadata offered by THREDDS services and executes complex queries against them
    - the inventory service (implemented as THREDDS) hierarchically organizes and publishes a local collection of multi-dimensional arrays (e.g. NetCDF, GRIB files), as well as auxiliary standard services for actual data access and visualization (e.g. WCS, OPeNDAP, WMS)
    The approach followed in this research is to build on top of the existing standards and implementations by setting up a standards-aware interoperable framework able to deal with the existing heterogeneity in an organic way. As a methodology, interoperability tests against real services were performed; existing problems were thus highlighted and, where possible, solved. The use of flexible tools able to deal with heterogeneity in a smart way proved successful; in particular, experiments were carried out with both the GI-cat broker and the ESRI GeoPortal frameworks. The GI-cat discovery broker proved successful at implementing the CSW interface, as well as at federating heterogeneous resources such as the THREDDS and WCS services published by Unidata and the HydroServer, WFS and SOS services published by CUAHSI. Experiments with the ESRI GeoPortal were also successful: the GeoPortal was used to deploy a web interface able to distribute searches among catalog implementations from both the hydrologic and atmospheric communities, including HydroServers and GI-cat, combining results from both domains in a seamless way.
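    The CSW-based discovery described above boils down to issuing GetRecords queries against catalog endpoints. A minimal sketch of building such a request follows, assuming a hypothetical broker URL; a real GI-cat or CUAHSI deployment would publish its own endpoint and may support additional filter languages.

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, keyword, max_records=10):
    # Build a KVP GetRecords request against a CSW 2.0.2 catalog,
    # filtering records whose text mentions the given keyword.
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "maxRecords": str(max_records),
        "constraintLanguage": "CQL_TEXT",
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint for illustration only.
url = csw_getrecords_url("https://example.org/csw", "streamflow")
```

    The same request shape works whether the target is a plain catalog or a broker that fans the query out to federated resources.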

  18. Universal coverage and its impact on reproductive health services in Thailand.

    PubMed

    Tangcharoensathien, Viroj; Tantivess, Sripen; Teerawattananon, Yot; Auamkul, Nanta; Jongudoumsuk, Pongpisut

    2002-11-01

    Thailand has recently introduced universal health care coverage for 45 million of its people, financed by general tax revenue. A capitation contract model was adopted to purchase ambulatory and hospital care, and preventive care and promotion, including reproductive health services, from public and private service providers. This paper describes the health financing system prior to universal coverage, and the extent to which Thailand has achieved reproductive health objectives prior to this reform. It then analyses the potential impact of universal coverage on reproductive health services. Whether there are positive or negative effects on reproductive health services will depend on the interaction between three key aspects: awareness of entitlement on the part of intended beneficiaries of services, the response of health care providers to capitation, and the capacity of purchasers to monitor and enforce contracts. In rural areas, the district public health system is the sole service provider and the contractual relationship requires trust and positive engagement with purchasers. We recommend an evidence-based approach to fine-tune the reproductive health services benefits package under universal coverage, as well as improved institutional capacity for purchasers and the active participation of civil society and other partners to empower beneficiaries.

  19. SIDECACHE: Information access, management and dissemination framework for web services.

    PubMed

    Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A

    2011-06-14

    Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services, including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
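    The proxy-access, rate-control and caching services mentioned above can be illustrated with a toy in-memory proxy. This is a sketch of the general pattern only; the class name, fields and fetch callback are invented and are not SideCache's actual API.

```python
import time

class CachingRateLimitedProxy:
    # Serve repeated requests from a local cache; throttle upstream
    # traffic to at most one request per min_interval seconds.
    def __init__(self, fetch, min_interval=0.0, ttl=3600.0):
        self.fetch = fetch                # callable that hits the upstream source
        self.min_interval = min_interval  # seconds enforced between upstream hits
        self.ttl = ttl                    # seconds a cached entry stays fresh
        self._cache = {}                  # url -> (fetch_time, payload)
        self._last = 0.0

    def get(self, url):
        now = time.monotonic()
        hit = self._cache.get(url)
        if hit is not None and now - hit[0] < self.ttl:
            return hit[1]                 # cache hit: no upstream request made
        wait = self.min_interval - (now - self._last)
        if wait > 0:
            time.sleep(wait)              # rate control before going upstream
        payload = self.fetch(url)
        self._last = time.monotonic()
        self._cache[url] = (self._last, payload)
        return payload
```

    A production version would also persist the cache and refresh entries in the background when the upstream repository publishes an update.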

  20. Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures, and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and conduct case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata services on the one hand and, on the other, the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services. In conclusion, integrating the SDI with the Sensor Web is of importance to geospatial studies. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
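    A Sensor Observation Service of the kind discussed above is typically queried with a GetObservation request. A minimal KVP sketch follows; the endpoint and the offering/property identifiers are placeholders, not SENSDI resources.

```python
from urllib.parse import urlencode

def sos_getobservation_url(endpoint, offering, observed_property, begin, end):
    # KVP GetObservation request for an OGC SOS 2.0 endpoint,
    # restricted to a phenomenon-time interval [begin, end].
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "temporalFilter": f"om:phenomenonTime,{begin}/{end}",
    }
    return endpoint + "?" + urlencode(params)

# Placeholder endpoint and identifiers for illustration only.
url = sos_getobservation_url(
    "https://example.org/sos", "gauge_station_1", "water_level",
    "2016-01-01T00:00:00Z", "2016-01-02T00:00:00Z")
```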

  1. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were verified against ArcGIS 9.3.1, a geographic information system (GIS) software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
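    One of the spatial statistics such Web processing services compute is an area-weighted mean of gridded values over a polygon. A minimal sketch of that calculation is shown below; the input layout (parallel lists of cell values and of the areas of each cell's intersection with the polygon) is an assumption for illustration.

```python
def area_weighted_mean(values, areas):
    # Weight each grid-cell value by the area of that cell lying
    # inside the polygon, then normalize by the total intersected area.
    if not values or len(values) != len(areas):
        raise ValueError("values and areas must be equal-length and non-empty")
    return sum(v * a for v, a in zip(values, areas)) / sum(areas)
```

    Cells that barely overlap the polygon thus contribute proportionally less than cells fully inside it.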

  2. Dynamic Generation of Reduced Ontologies to Support Resource Constraints of Mobile Devices

    ERIC Educational Resources Information Center

    Schrimpsher, Dan

    2011-01-01

    As Web Services and the Semantic Web become more important, enabling technologies such as web service ontologies will grow larger. At the same time, use of mobile devices to access web services has doubled in the last year. The ability of these resource constrained devices to download and reason across these ontologies to support service discovery…

  3. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environments, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.), and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is serviced, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, and provides a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example will be presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
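    The BMI functions referred to above (initialize, update, get_value, and so on) can be illustrated with a toy model. This is a sketch in the spirit of BMI, not CSDMS code, and the driver loop at the end stands in for what an EMELI-style framework would do, locally or over the web.

```python
class BmiLikeModel:
    # Toy storage model exposing BMI-style control and query functions.
    def __init__(self):
        self.time = 0.0
        self.storage = 0.0

    def initialize(self, config):
        # Read the time step and a constant inflow from a config dict.
        self.dt = config.get("dt", 1.0)
        self.inflow = config.get("inflow", 2.0)

    def update(self):
        # Advance the model one time step.
        self.storage += self.inflow * self.dt
        self.time += self.dt

    def get_value(self, name):
        if name == "storage":
            return self.storage
        raise KeyError(name)

    def get_current_time(self):
        return self.time

    def finalize(self):
        pass

# A coupling framework drives any BMI-enabled component the same way:
model = BmiLikeModel()
model.initialize({"dt": 1.0, "inflow": 2.0})
while model.get_current_time() < 3.0:
    model.update()
model.finalize()
```

    Because every component answers the same calls, the framework can chain models without knowing their internals, which is exactly the self-describing property the abstract emphasizes.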

  4. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door to such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY.
This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections either locally or remotely by extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability of specifying how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready accessibility for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or raw data arrays using Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
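    The decoupled indexing service described above amounts to a spatio-temporal metadata lookup. A minimal sketch follows; the record layout (bounding box, timestamp, file path) is an assumption for illustration, not GSKY's actual index schema.

```python
def query_index(records, bbox, t0, t1):
    # Select records whose bounding box intersects the query box and
    # whose timestamp falls inside [t0, t1]; return their file paths.
    qx0, qy0, qx1, qy1 = bbox
    hits = []
    for rec in records:
        x0, y0, x1, y1 = rec["bbox"]
        intersects = x0 <= qx1 and qx0 <= x1 and y0 <= qy1 and qy0 <= y1
        if intersects and t0 <= rec["time"] <= t1:
            hits.append(rec["path"])
    return hits
```

    With the index kept separate from the server, a WMS or WCS request only needs this lookup to decide which files to read and process on the fly.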

  5. Managing the Web-Enhanced Geographic Information Service.

    ERIC Educational Resources Information Center

    Stephens, Denise

    1997-01-01

    Examines key management issues involved in delivering geographic information services on the World Wide Web, using the Geographic Information Center (GIC) program at the University of Virginia Library as a reference. Highlights include integrating the Web into services; building collections for Web delivery; and evaluating spatial information…

  6. Women's Preventive Services Guidelines Affordable Care Act Expands Prevention Coverage for Women's Health and Well-Being

    MedlinePlus

    ... 2012. Type of Preventive Service HHS Guideline for Health Insurance Coverage Frequency Well-woman visits. Well-woman preventive ... established or maintained by an objecting organization, or health insurance coverage offered or arranged by an objecting organization, ...

  7. 7 CFR 1735.11 - Area coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 11 2013-01-01 2013-01-01 false Area coverage. 1735.11 Section 1735.11 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE... Policies § 1735.11 Area coverage. Borrowers must make adequate telephone service available to the widest...

  8. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis on a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach that uses AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with a conditional planning method, more precisely represent the situations of nonconformity with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
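    Planning-based composition can be sketched as a naive forward search that chains services whose inputs, including data-quality predicates, are already satisfied. A conditional planner as in the paper would additionally branch on runtime quality checks; the service descriptions below are illustrative, not from the paper.

```python
def compose(services, have, goal):
    # services: name -> (set of required inputs, set of produced outputs).
    # Greedily add any service whose inputs are satisfied until the
    # goal product is available or no further progress can be made.
    plan, have = [], set(have)
    progress = True
    while goal not in have and progress:
        progress = False
        for name, (inputs, outputs) in services.items():
            if name not in plan and inputs <= have:
                plan.append(name)
                have |= outputs
                progress = True
                break
    return plan if goal in have else None

# Illustrative service descriptions with a quality predicate (crs_ok).
services = {
    "validate": ({"raw"}, {"crs_ok"}),
    "reproject": ({"raw", "crs_ok"}, {"projected"}),
    "overlay": ({"projected"}, {"map"}),
}
plan = compose(services, {"raw"}, "map")
```

    The quality rule shows up as an ordinary precondition: "reproject" cannot be scheduled until "validate" has asserted that the coordinate reference system conforms.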

  9. BioCatalogue: a universal catalogue of web services for the life sciences

    PubMed Central

    Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A.

    2010-01-01

    The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable ‘Web 2.0’-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community. PMID:20484378
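    The faceted search and monitoring-based filtering described above can be sketched over a local list of service records. The record fields here (name, categories, alive) are illustrative stand-ins, not BioCatalogue's actual schema or API.

```python
def search_registry(records, category=None, require_alive=True):
    # Narrow a list of service records by category and, optionally,
    # drop entries whose last monitoring check failed.
    hits = []
    for rec in records:
        if category is not None and category not in rec["categories"]:
            continue
        if require_alive and not rec["alive"]:
            continue
        hits.append(rec["name"])
    return hits
```

    Real registries expose the same idea through query parameters on a REST interface rather than an in-process list.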

  10. BioCatalogue: a universal catalogue of web services for the life sciences.

    PubMed

    Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A

    2010-07-01

    The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable 'Web 2.0'-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community.

  11. Data Publishing and Sharing Via the THREDDS Data Repository

    NASA Astrophysics Data System (ADS)

    Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.

    2007-12-01

    The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied together by some intellectual challenge, but often located in different organizations and locations. A critical component of these endeavors is publishing and sharing content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fashion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client-initiated, on-demand, location-transparent storage for data of any type, which can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to:
    * securely store and "own" data files and associated metadata
    * upload files via HTTP and GridFTP
    * upload a collection of data as a single file
    * modify and restructure repository contents
    * incorporate metadata provided by the user
    * generate additional metadata programmatically
    * edit individual metadata elements
    The TDR can exist separately from a TDS, serving content via HTTP. It can also work in conjunction with the TDS, which includes functionality to provide:
    * access to data in a variety of formats via OPeNDAP, the OGC Web Coverage Service (for gridded datasets), and bulk HTTP file transfer
    * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats
    * serving of very large volume datasets, such as NEXRAD radar
    * aggregation into virtual datasets
    * subsetting via OPeNDAP and NetCDF subsetting services
    This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.
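    Subsetting via OPeNDAP, as listed above, works by appending a constraint expression to the dataset URL. A minimal sketch using the standard [lo:hi] hyperslab syntax follows; the dataset URL and variable name are placeholders, not a real TDS deployment.

```python
def opendap_subset_url(dataset_url, variable, ranges):
    # Build an OPeNDAP data-request URL with one [lo:hi] hyperslab
    # per array dimension of the requested variable.
    hyperslab = "".join(f"[{lo}:{hi}]" for lo, hi in ranges)
    return f"{dataset_url}.dods?{variable}{hyperslab}"

url = opendap_subset_url(
    "https://example.org/thredds/dodsC/demo", "temperature", [(0, 10), (20, 40)])
```

    Only the requested slab travels over the wire, which is what makes serving very large volume datasets practical.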

  12. 20 CFR 404.133 - When we give you quarters of coverage based on military service to establish a period of disability.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false When we give you quarters of coverage based on military service to establish a period of disability. 404.133 Section 404.133 Employees' Benefits... Status and Quarters of Coverage Disability Insured Status § 404.133 When we give you quarters of coverage...

  13. 20 CFR 404.133 - When we give you quarters of coverage based on military service to establish a period of disability.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false When we give you quarters of coverage based on military service to establish a period of disability. 404.133 Section 404.133 Employees' Benefits... Status and Quarters of Coverage Disability Insured Status § 404.133 When we give you quarters of coverage...

  14. 20 CFR 404.133 - When we give you quarters of coverage based on military service to establish a period of disability.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false When we give you quarters of coverage based on military service to establish a period of disability. 404.133 Section 404.133 Employees' Benefits... Status and Quarters of Coverage Disability Insured Status § 404.133 When we give you quarters of coverage...

  15. 20 CFR 404.133 - When we give you quarters of coverage based on military service to establish a period of disability.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false When we give you quarters of coverage based on military service to establish a period of disability. 404.133 Section 404.133 Employees' Benefits... Status and Quarters of Coverage Disability Insured Status § 404.133 When we give you quarters of coverage...

  16. Societal Implications of Health Insurance Coverage for Medically Necessary Services in the U.S. Transgender Population: A Cost-Effectiveness Analysis.

    PubMed

    Padula, William V; Heru, Shiona; Campbell, Jonathan D

    2016-04-01

    Recently, the Massachusetts Group Insurance Commission (GIC) prioritized research on the implications of a clause expressly prohibiting the denial of health insurance coverage for transgender-related services. These medically necessary services include primary and preventive care as well as transitional therapy. To analyze the cost-effectiveness of insurance coverage for medically necessary transgender-related services. Markov model with 5- and 10-year time horizons from a U.S. societal perspective, discounted at 3% (USD 2013). Data on outcomes were abstracted from the 2011 National Transgender Discrimination Survey (NTDS). U.S. transgender population starting before transitional therapy. No health benefits compared to health insurance coverage for medically necessary services. This coverage can lead to hormone replacement therapy, sex reassignment surgery, or both. Cost per quality-adjusted life year (QALY) for successful transition or negative outcomes (e.g. HIV, depression, suicidality, drug abuse, mortality) dependent on insurance coverage or no health benefit at a willingness-to-pay threshold of $100,000/QALY. Budget impact interpreted as the U.S. per-member-per-month cost. Compared to no health benefits for transgender patients ($23,619; 6.49 QALYs), insurance coverage for medically necessary services came at a greater cost with greater effectiveness ($31,816; 7.37 QALYs), with an incremental cost-effectiveness ratio (ICER) of $9314/QALY. The budget impact of this coverage is approximately $0.016 per member per month. Although the cost of transitions is $10,000-22,000 and the cost of provider coverage is $2175/year, these additional expenses hold good value for reducing the risk of negative endpoints: HIV, depression, suicidality, and drug abuse. Results were robust to uncertainty. The probabilistic sensitivity analysis showed that provider coverage was cost-effective in 85% of simulations. Health insurance coverage for the U.S.
transgender population is affordable and cost-effective, and has a low budget impact on U.S. society. Organizations such as the GIC should consider these results when examining policies regarding coverage exclusions.

  17. Customer Decision Making in Web Services with an Integrated P6 Model

    NASA Astrophysics Data System (ADS)

    Sun, Zhaohao; Sun, Junqing; Meredith, Grant

    Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the decision-making process. The proposed approach will facilitate the research and development of web services and decision support systems.

  18. 39 CFR 601.103 - Applicability and coverage.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: INTELLECTUAL PROPERTY RIGHTS OTHER THAN PATENTS PURCHASING OF PROPERTY AND SERVICES § 601.103 Applicability and coverage. The regulations contained in this part apply to all Postal Service acquisition of property (except real property) and services. ...

  19. 39 CFR 601.103 - Applicability and coverage.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: INTELLECTUAL PROPERTY RIGHTS OTHER THAN PATENTS PURCHASING OF PROPERTY AND SERVICES § 601.103 Applicability and coverage. The regulations contained in this part apply to all Postal Service acquisition of property (except real property) and services. ...

  20. Coverage of, and compliance with, mass drug administration under the programme to eliminate lymphatic filariasis in India: a systematic review.

    PubMed

    Babu, Bontha V; Babu, Gopalan R

    2014-09-01

    India's mass drug administration (MDA) programme to eliminate lymphatic filariasis (PELF) covers all 250 endemic districts, but compliance with treatment is not adequate for the programme to succeed in eradicating this neglected tropical disease. The objective of our study was to systematically review published studies on the coverage of and compliance with MDA under the PELF in India. We searched several databases-PubMed/Medline, Google Scholar, CINAHL/EBSCO, Web of Knowledge (including Web of Science) and OVID-and by applying selection criteria identified a total of 36 papers to include in the review. Overall MDA coverage rates varied between 48.8% and 98.8%, while compliance rates ranged from 20.8% to 93.7%. The coverage-compliance gap is large in many MDA programmes. The effective level of compliance, ≥65%, was reported in only 10 of a total of 31 MDAs (5 of 20 MDAs in rural areas and 2 of 12 MDAs in urban areas). The review has identified a gap between coverage and compliance, and potentially correctable causes of this gap. These causes need to be addressed if the Indian programme is to advance towards elimination of lymphatic filariasis. © The Author 2014. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Ubiquitous Computing Services Discovery and Execution Using a Novel Intelligent Web Services Algorithm

    PubMed Central

    Choi, Okkyung; Han, SangYong

    2007-01-01

    Ubiquitous computing makes it possible to determine in real time the location and situation of service requesters in a web service environment, as it enables access to computers at any time and in any place. Though research on various aspects of ubiquitous commerce is progressing at enterprises and research centers, both domestically and overseas, analysis of a customer's personal preferences based on the semantic web and on rule-based services using semantics is not currently being conducted. This paper proposes a Ubiquitous Computing Services System that enables rule-based as well as semantics-based search, so that the electronic space and the physical space can be combined into one, and real-time search for web services and the construction of efficient web services become possible.

  2. Some Programs Should Not Run on Laptops - Providing Programmatic Access to Applications Via Web Services

    NASA Astrophysics Data System (ADS)

    Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.

    2003-12-01

    Modern laptop computers, and personal computers, can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture.
We have hosted these Web Services as part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that do not support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that do not have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
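    The abstract above describes SOAP-based services such as the Coordinate Conversion Web Service. As a minimal, hedged sketch of what a client-side SOAP 1.1 request for such a service might look like: the operation name, namespace, and parameter names below are invented for illustration and do not reflect the actual SCEC/CME interface.

```python
# Build a hypothetical SOAP 1.1 request envelope for a coordinate-conversion
# call. The "ConvertLatLonToUTM" operation and its namespace are assumptions
# made for this sketch, not the real service definition.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
APP_NS = "http://example.org/coordsvc"  # placeholder application namespace

def build_convert_request(lat: float, lon: float) -> bytes:
    """Serialize a ConvertLatLonToUTM call as a SOAP envelope."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{APP_NS}}}ConvertLatLonToUTM")
    ET.SubElement(call, f"{{{APP_NS}}}latitude").text = str(lat)
    ET.SubElement(call, f"{{{APP_NS}}}longitude").text = str(lon)
    return ET.tostring(env, xml_declaration=True, encoding="utf-8")

# In a real client these bytes would be POSTed to the service endpoint
# with Content-Type: text/xml and a SOAPAction header.
request = build_convert_request(34.05, -118.25)
```

    In a WSDL-described service, a toolkit would generate this envelope automatically; the sketch only makes visible the XML layer the abstract says incurs overhead for complex parameter types.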

  3. Prognocean Plus: the Science-Oriented Sea Level Prediction System as a Tool for Public Stakeholders

    NASA Astrophysics Data System (ADS)

    Świerczyńska, M. G.; Miziński, B.; Niedzielski, T.

    2015-12-01

    The novel real-time system for sea level prediction, known as Prognocean Plus, has been developed as a new-generation service available through the Polish supercomputing grid infrastructure. Researchers can access the service at https://prognocean.plgrid.pl/. Although the system is science-oriented, we wish to discuss herein its potential to enhance ocean management studies carried out routinely by public stakeholders. The system produces short- and medium-term predictions of global altimetric gridded Sea Level Anomaly (SLA) time series, updated daily. The spatial resolution of the SLA forecasts is 1/4° x 1/4°, while the temporal resolution of the prognoses is 1 day. The system computes the predictions of time-variable ocean topography using five data-based models, which are not computationally demanding, enabling us to compare their skill with that of the physically based approaches commonly used by other sea level prediction systems. However, the aim of the system is not only to compute predictions for science purposes, but primarily to build a user-oriented platform that serves the prognoses and their statistics to a broader community. Thus, we deliver the SLA forecasts as a rapid service available online. In order to provide potential users with access to the science results, a Web Map Service (WMS) for Prognocean Plus has been designed. We regularly publish the forecasts, both in the interactive graphical WMS service, available from the browser, and through the Web Coverage Service (WCS) standard. The Prognocean Plus system, as an early-response system, may be of interest to public stakeholders. It may be used for marine navigation as well as for climate risk management (delineating areas vulnerable to local sea level rise), marine management (advice offered for offshore activities) and coastal management (early warnings against coastal flooding).
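    A browser-based client of a WMS endpoint like the one described above ultimately issues a GetMap request. As a hedged sketch, assuming a placeholder endpoint and layer name (not the real Prognocean Plus configuration), a WMS 1.3.0 request for a global forecast grid could be built like this:

```python
# Construct a WMS 1.3.0 GetMap URL. The base URL and layer name are
# placeholders for illustration only.
from urllib.parse import urlencode

def wms_getmap_url(base_url: str, layer: str, bbox: tuple,
                   width: int, height: int) -> str:
    """Return a GetMap request URL for one layer over a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",  # note: WMS 1.3.0 uses lat,lon axis order here
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# A global 1/4-degree SLA grid maps naturally onto a 1440 x 720 image.
url = wms_getmap_url("https://example.org/wms", "sla_forecast",
                     (-90, -180, 90, 180), 1440, 720)
```

    The WCS path mentioned in the abstract differs mainly in returning the underlying coverage values rather than a rendered image.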

  4. Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services

    NASA Astrophysics Data System (ADS)

    Neiers, M.; Dwyer, J.; Neiers, S.

    2008-12-01

    The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.
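    The "dynamic generation of KML" mentioned above can be sketched concretely. The scene identifier and footprint coordinates below are invented for the demonstration; only the KML 2.2 element structure is standard.

```python
# Generate a minimal KML document outlining one scene footprint as a
# polygon, roughly as a virtual-globe search client would consume it.
def scene_placemark(scene_id: str, ring: list) -> str:
    """Return a KML Placemark whose polygon outlines a scene footprint.

    `ring` is a list of (lon, lat) tuples; KML expects lon,lat[,alt]
    triplets and a closed ring (first point repeated last).
    """
    coords = " ".join(f"{lon},{lat},0" for lon, lat in ring)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        f"  <Placemark><name>{scene_id}</name>\n"
        "    <Polygon><outerBoundaryIs><LinearRing>\n"
        f"      <coordinates>{coords}</coordinates>\n"
        "    </LinearRing></outerBoundaryIs></Polygon>\n"
        "  </Placemark>\n"
        "</kml>\n"
    )

# Hypothetical scene id and footprint, closed by repeating the first vertex.
ring = [(-98.2, 43.1), (-96.0, 42.8), (-96.4, 41.2),
        (-98.5, 41.5), (-98.2, 43.1)]
kml = scene_placemark("LC08_L1T_029030", ring)
```

    A production service would add links to browse imagery and downloads inside the Placemark, which is how KML indexing by search engines (also mentioned above) becomes possible.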

  5. The Organizational Role of Web Services

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2011-01-01

    The workload of Web librarians is already split between Web-related and other library tasks. But today's technological environment has created new implications for existing services and new demands for staff time. It is time to reconsider how libraries can best allocate resources to provide effective Web services. Delivering high-quality services…

  6. 78 FR 60303 - Agency Information Collection Activities: Online Survey of Web Services Employers; New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-01

    ...-NEW] Agency Information Collection Activities: Online Survey of Web Services Employers; New... Web site at http://www.Regulations.gov under e-Docket ID number USCIS-2013- 0003. When submitting... information collection. (2) Title of the Form/Collection: Online Survey of Web Services Employers. (3) Agency...

  7. 78 FR 42537 - Agency Information Collection Activities: Online Survey of Web Services Employers; New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-16

    ...-NEW] Agency Information Collection Activities: Online Survey of Web Services Employers; New... Information Collection: New information collection. (2) Title of the Form/Collection: Online Survey of Web... sector. It is necessary that USCIS obtains data on the E-Verify Program Web Services. Gaining an...

  8. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, showing to be 100% effective in stopping attacks, non-intrusive and very easy to use.
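    The paper's mechanism works transparently at the service boundary; at the application level, the standard complementary defense against the SQL Injection class it targets is to bind user input as parameters rather than concatenate it into the query text. A minimal sketch in Python with sqlite3 (the table, data, and payload are invented for the demo):

```python
# Contrast a vulnerable string-built query with a parameterized one.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "' OR '1'='1"  # classic injection payload

# Vulnerable: the payload changes the query's structure, so the WHERE
# clause becomes a tautology and every row is returned.
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{payload}'").fetchall()

# Safe: the driver binds the payload as a literal value; since no user
# is actually named "' OR '1'='1", nothing matches.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (payload,)).fetchall()
```

    The attraction of the paper's transparent approach is precisely that it protects deployed services whose code was not written this way.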

  9. Comparison of three web-scale discovery services for health sciences research.

    PubMed

    Hanneke, Rosie; O'Brien, Kelly K

    2016-04-01

    The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris's Primo, and ProQuest's Summon. Each WSD tool was evaluated on its ability to retrieve relevant results and on its coverage of MEDLINE content. All WSD tools returned between 50% and 60% relevant results. Primo returned a higher number of duplicate results than the other two WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all three WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. None of the three WSD products studied was overwhelmingly more effective in returning relevant results. While it is difficult to place the figure of 50%-60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools' value as a supplement to traditional resources for health sciences researchers.

  10. A coastal information system to propel emerging science and ...

    EPA Pesticide Factsheets

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data, aimed at promoting research and aiding environmental management. The graphical user interface allows users to select and subset data based on their spatial and temporal interests, giving them easy access to visualize, retrieve, and save data for further analysis. Data are accessible across estuarine systems of the Atlantic, Gulf of Mexico and Pacific regions of the United States and include: (1) time series data, including tidal, hydrologic, and weather data, (2) water and sediment quality, (3) atmospheric deposition, (4) habitat, (5) coastal exposure indices, (6) historic and projected land-use and population, (7) historic and projected nitrogen and phosphorus sources and load summaries. EDM issues Web Coverage Service Interface Standard queries (WCS; simple, standard one-line text strings) to a public web service to quickly obtain data subsets by variable, for a date-time range and area selected by the user. EDM is continuously being enhanced with updated data and new options. Recent additions include a comprehensive suite of nitrogen source and loading data, and inputs for supporting a modeling approach for seagrass habitat. Additions planned for the near future include 1) support for Integrated Water Resources Management cost-benefit analysis, specifically the Watershed Management Optimization Support Tool, and 2) visualization of the combined effects of climate change, land-use a
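    The "simple, standard one-line text string" style of WCS query the abstract refers to can be illustrated. The sketch below follows WCS 2.0 key-value-pair conventions, but the endpoint, coverage identifier, and axis labels are placeholders, not EDM's actual service configuration.

```python
# Build a WCS 2.0 GetCoverage KVP request that subsets a coverage by
# longitude, latitude, and a date-time range.
from urllib.parse import urlencode

def wcs_getcoverage_url(base, coverage, bbox, t0, t1):
    """Return a one-line GetCoverage URL subset in space and time."""
    lon_min, lat_min, lon_max, lat_max = bbox
    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", coverage),
        # one SUBSET key per trimmed axis
        ("SUBSET", f"Long({lon_min},{lon_max})"),
        ("SUBSET", f"Lat({lat_min},{lat_max})"),
        ("SUBSET", f'ansi("{t0}","{t1}")'),
        ("FORMAT", "application/netcdf"),
    ]
    return base + "?" + urlencode(params)

# Hypothetical request: one variable over a Chesapeake-sized box, one month.
url = wcs_getcoverage_url("https://example.org/wcs", "sediment_quality",
                          (-77.5, 36.8, -75.5, 39.6),
                          "2015-06-01", "2015-06-30")
```

    Because the whole request is one URL, a GUI client like EDM can assemble it directly from the user's on-screen selections.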

  11. Comparison of three web-scale discovery services for health sciences research*

    PubMed Central

    Hanneke, Rosie; O'Brien, Kelly K.

    2016-01-01

    Objective The purpose of this study was to investigate the relative effectiveness of three web-scale discovery (WSD) tools in answering health sciences search queries. Methods Simple keyword searches, based on topics from six health sciences disciplines, were run at multiple real-world implementations of EBSCO Discovery Service (EDS), Ex Libris's Primo, and ProQuest's Summon. Each WSD tool was evaluated on its ability to retrieve relevant results and on its coverage of MEDLINE content. Results All WSD tools returned between 50% and 60% relevant results. Primo returned a higher number of duplicate results than the other 2 WSD products. Summon results were more relevant when search terms were automatically mapped to controlled vocabulary. EDS indexed the largest number of MEDLINE citations, followed closely by Summon. Additionally, keyword searches in all 3 WSD tools retrieved relevant material that was not found with precision (Medical Subject Headings) searches in MEDLINE. Conclusions None of the 3 WSD products studied was overwhelmingly more effective in returning relevant results. While it is difficult to place the figure of 50%–60% relevance in context, it implies a strong likelihood that the average user would be able to find satisfactory sources on the first page of search results using a rudimentary keyword search. The discovery of additional relevant material beyond that retrieved from MEDLINE indicates WSD tools' value as a supplement to traditional resources for health sciences researchers. PMID:27076797

  12. Estuary Data Mapper: A coastal information system to propel ...

    EPA Pesticide Factsheets

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data, aimed at promoting research and aiding environmental management. The graphical user interface allows users to select and subset data based on their spatial and temporal interests, giving them easy access to visualize, retrieve, and save data for further analysis. Data are accessible across estuarine systems of the Atlantic, Gulf of Mexico and Pacific regions of the United States and include: (1) time series data, including tidal, hydrologic, and weather data, (2) water and sediment quality, (3) atmospheric deposition, (4) habitat, (5) coastal exposure indices, (6) historic and projected land-use and population, (7) historic and projected nitrogen and phosphorus sources and load summaries. EDM issues Web Coverage Service Interface Standard queries (WCS; simple, standard one-line text strings) to a public web service to quickly obtain data subsets by variable, for a date-time range and area selected by the user. EDM is continuously being enhanced with updated data and new options. Recent additions include a comprehensive suite of nitrogen source and loading data, and inputs for supporting a modeling approach for seagrass habitat. Additions planned for the near future include 1) support for Integrated Water Resources Management cost-benefit analysis, specifically the Watershed Management Optimization Support Tool, and 2) visualization of the combined effects of climate change, land-use a

  13. The application of geography markup language (GML) to the geological sciences

    NASA Astrophysics Data System (ADS)

    Lake, Ron

    2005-11-01

    GML 3.0 became an adopted specification of the Open Geospatial Consortium (OGC) in January 2003, and is rapidly emerging as the world standard for the encoding, transport and storage of all forms of geographic information. This paper looks at the application of GML to one of the more challenging areas of automated geography, namely the geological sciences. Specific features of GML of interest to geologists are discussed and then illustrated through a series of geological case studies. We conclude the paper with a discussion of anticipated geological web services that GML will enable. GML is written in XML and makes use of XML Schema for extensibility. It can be used both to represent or model geographic objects and to transport them across the Internet. In this way it serves as the foundation for all manner of geographic web services. Unlike vertical application grammars such as LandXML, GML was intended to define geographic application languages, and hence is applicable to any geographic domain including forestry, environmental sciences, geology and oceanography. This paper provides a review of the basic features of GML that are fundamental to the geological sciences including geometry, coverages, observations, reference systems and temporality. These constructs are then employed in a series of simple geological case studies including structural geological description, surficial geology, representation of geological time scales, mineral occurrences, geohazards and geochemical reconnaissance.
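    One of the geological case studies above, mineral occurrences, gives a sense of what a GML-encoded feature looks like. In the sketch below only the `gml:Point`/`gml:pos` geometry is standard GML 3; the `geo:` feature type and property names are invented for illustration, standing in for a domain application schema of the kind the paper describes.

```python
# Parse a small hand-written GML fragment describing a hypothetical
# mineral occurrence, and recover its coordinates.
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"

sample_gml = f"""<geo:MineralOccurrence xmlns:geo="http://example.org/geo"
    xmlns:gml="{GML}">
  <geo:commodity>Cu</geo:commodity>
  <geo:location>
    <gml:Point srsName="urn:ogc:def:crs:EPSG::4326">
      <gml:pos>49.25 -123.10</gml:pos>
    </gml:Point>
  </geo:location>
</geo:MineralOccurrence>"""

root = ET.fromstring(sample_gml)
pos = root.find(f".//{{{GML}}}pos").text.split()
lat, lon = float(pos[0]), float(pos[1])
```

    The separation visible here, generic GML geometry inside domain-specific feature properties, is exactly what lets GML serve forestry, oceanography, and geology from one grammar.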

  14. A Web service substitution method based on service cluster nets

    NASA Astrophysics Data System (ADS)

    Du, YuYue; Gai, JunJing; Zhou, MengChu

    2017-11-01

    Service substitution is an important research topic in the fields of Web services and service-oriented computing. This work presents a novel method to analyse and substitute Web services. A new concept, called a Service Cluster Net Unit, is proposed based on Web service clusters. A service cluster is converted into a Service Cluster Net Unit. Then it is used to analyse whether the services in the cluster can satisfy some service requests. Meanwhile, the substitution methods of an atomic service and a composite service are proposed. The correctness of the proposed method is proved, and the effectiveness is shown and compared with the state-of-the-art method via an experiment. It can be readily applied to e-commerce service substitution to meet the business automation needs.

  15. Insurers' policies on coverage for behavior management services and the impact of the Affordable Care Act.

    PubMed

    Edelstein, Burton L

    2014-01-01

    The impact of the Affordable Care Act (ACA) on dental insurance coverage for behavior management services depends upon the child's source of insurance (Medicaid, CHIP, private commercial) and the policies that govern each such source. This contribution describes historical and projected sources of pediatric dental coverage, catalogues the seven behavior codes used by dentists, compares how often they are billed by pediatric and general dentists, assesses payment policies and practices for behavioral services across coverage sources, and describes how ACA coverage policies may impact each source. Differences between Congressional intent to ensure comprehensive oral health services with meaningful consumer protections for all legal-resident children and regulatory action by the Departments of Treasury and Health and Human Services are explored to explain how regulations fail to meet Congressional intent as of 2014. The ACA may additionally impact pediatric dentistry practice, including dentists' behavior management services, by expanding pediatric dental training and safety net delivery sites and by stimulating the evolution of novel payment and delivery systems designed to move provider incentives away from procedure-based payments and toward health outcome-based payments.

  16. 47 CFR 80.771 - Method of computing coverage.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Method of computing coverage. 80.771 Section 80.771 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Standards for Computing Public Coast Station VHF Coverage § 80.771 Method...

  17. 47 CFR 22.951 - Minimum coverage requirement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MOBILE SERVICES Cellular Radiotelephone Service § 22.951 Minimum coverage requirement. Applications for authority to operate a new cellular system in an unserved area, other than those filed by the licensee of an... toward the minimum coverage requirement. Applications for authority to operate a new cellular system in...

  18. 29 CFR 9.3 - Coverage.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Coverage. 9.3 Section 9.3 Labor Office of the Secretary of Labor NONDISPLACEMENT OF QUALIFIED WORKERS UNDER SERVICE CONTRACTS (effective date pending) General § 9.3 Coverage. This part applies to all service contracts and their solicitations, except those...

  19. Determining the effective coverage of maternal and child health services in Kenya, using demographic and health survey data sets: tracking progress towards universal health coverage.

    PubMed

    Nguhiu, Peter K; Barasa, Edwine W; Chuma, Jane

    2017-04-01

    Effective coverage (EC) is a measure of health systems' performance that combines need, use and quality indicators. This study aimed to assess the extent to which the Kenyan health system provides effective and equitable maternal and child health services, as a means of tracking the country's progress towards universal health coverage. The Demographic Health Surveys (2003, 2008-2009 and 2014) and Service Provision Assessment surveys (2004, 2010) were the main sources of data. Indicators of need, use and quality for eight maternal and child health interventions were aggregated across interventions and economic quintiles to compute EC. EC has increased from 26.7% in 2003 to 50.9% in 2014, but remains low for the majority of interventions. There is a reduction in economic inequalities in EC with the highest to lowest wealth quintile ratio decreasing from 2.41 in 2003 to 1.65 in 2014, but maternal health services remain highly inequitable. Effective coverage of key maternal and child health services remains low, indicating that individuals are not receiving the maximum possible health gain from existing health services. There is an urgent need to focus on the quality and reach of maternal and child health services in Kenya to achieve the goals of universal health coverage. © 2017 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.

  20. Web Services as Public Services: Are We Supporting Our Busiest Service Point?

    ERIC Educational Resources Information Center

    Riley-Huff, Debra A.

    2009-01-01

    This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)

  1. 47 CFR 1.946 - Construction and coverage requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Wireless Radio Services, requirements for construction and commencement of service or commencement of... certain Wireless Radio Services, licensees must comply with geographic coverage requirements or... Section 1.946 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Wireless...

  2. 47 CFR 1.946 - Construction and coverage requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Wireless Radio Services, requirements for construction and commencement of service or commencement of... certain Wireless Radio Services, licensees must comply with geographic coverage requirements or... Section 1.946 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Wireless...

  3. Impact of Medicare on the Use of Medical Services by Disabled Beneficiaries, 1972-1974

    PubMed Central

    Deacon, Ronald W.

    1979-01-01

    The extension of Medicare coverage in 1973 to disabled persons receiving cash benefits under the Social Security Act provided an opportunity to examine the impact of health insurance coverage on utilization and expenses for Part B services. Data on medical services used both before and after coverage, collected through the Current Medicare Survey, were analyzed. Results indicate that access to care (as measured by the number of persons using services) increased slightly, while the rate of use did not. The large increase in the number of persons eligible for Medicare reflected the large increase in the number of cash beneficiaries. Significant increases also were found in the amount charged for medical services. The absence of large increases in access and service use may be attributed, in part, to the already existing source of third party payment available to disabled cash beneficiaries in 1972, before Medicare coverage. PMID:10316939

  4. The Increased Effectiveness of HIV Preventive Intervention among Men Who Have Sex with Men and of Follow-Up Care for People Living with HIV after ‘Task-Shifting’ to Community-Based Organizations: A ‘Cash on Service Delivery’ Model in China

    PubMed Central

    Yan, Hongjing; Zhang, Min; Zhao, Jinkou; Huan, Xiping; Ding, Jianping; Wu, Susu; Wang, Chenchen; Xu, Yuanyuan; Liu, Li; Xu, Fei; Yang, Haitao

    2014-01-01

    Background A large number of men who have sex with men (MSM) and people living with HIV/AIDS (PLHA) are underserved despite increased service availability from government facilities while many community based organizations (CBOs) are not involved. We aimed to assess the feasibility and effectiveness of the task shifting from government facilities to CBOs in China. Methods HIV preventive intervention for MSM and follow-up care for PLHA were shifted from government facilities to CBOs. Based on ‘cash on service delivery’ model, 10 USD per MSM tested for HIV with results notified, 82 USD per newly HIV cases diagnosed, and 50 USD per PLHA received a defined package of follow-up care services, were paid to the CBOs. Cash payments were made biannually based on the verified results in the national web-based HIV/AIDS information system. Findings After task shifting, CBOs gradually assumed preventive intervention for MSM and follow-up care for PLHA from 2008 to 2012. HIV testing coverage among MSM increased from 4.1% in 2008 to 22.7% in 2012. The baseline median CD4 counts of newly diagnosed HIV positive MSM increased from 309 to 397 cells/µL. HIV tests among MSM by CBOs accounted for less than 1% of the total HIV tests in Nanjing but the share of HIV cases detected by CBOs was 12.4% in 2008 and 43.6% in 2012. Unit cost per HIV case detected by CBOs was 47 times lower than that by government facilities. The coverage of CD4 tests and antiretroviral therapy increased from 71.1% and 78.6% in 2008 to 86.0% and 90.1% in 2012, respectively. Conclusion It is feasible to shift essential HIV services from government facilities to CBOs, and to verify independently service results to adopt ‘cash on service delivery’ model. Services provided by CBOs are cost-effective, as compared with that by government facilities. PMID:25050797

  5. 42 CFR 410.105 - Requirements for coverage of CORF services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Requirements for coverage of CORF services. 410.105 Section 410.105 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES MEDICARE PROGRAM SUPPLEMENTARY MEDICAL INSURANCE (SMI) BENEFITS Comprehensive Outpatient...

  6. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programmatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
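    The catalogue-to-geometry mapping the servlet performs can be sketched: positions become a VRML97 PointSet. The function and the mapping of input columns to axes are assumptions for the demonstration, not VOlume's actual pipeline.

```python
# Render (x, y, z) triples as a minimal VRML97 PointSet node, the kind
# of output a browser plug-in such as Cortona or FreeWRL can display.
def points_to_vrml(points):
    """Return a VRML97 document containing one PointSet of the points."""
    coords = ", ".join(f"{x} {y} {z}" for x, y, z in points)
    return (
        "#VRML V2.0 utf8\n"
        "Shape {\n"
        "  geometry PointSet {\n"
        f"    coord Coordinate {{ point [ {coords} ] }}\n"
        "  }\n"
        "}\n"
    )

# e.g. three hypothetical sources, with catalogue columns (such as RA,
# Dec, redshift) already mapped by the user onto the three axes
vrml = points_to_vrml([(0.1, 0.2, 0.3), (1.0, -0.5, 0.2), (-0.4, 0.9, 0.7)])
```

    In the described architecture this string would be produced server-side from a streamed VOTable, so the client never downloads or converts the raw catalogue itself.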

  7. Beyond spinal manipulation: should Medicare expand coverage for chiropractic services? A review and commentary on the challenges for policy makers

    PubMed Central

    Whedon, James M.; Goertz, Christine M.; Lurie, Jon D.; Stason, William B.

    2013-01-01

    Objectives Private insurance plans typically reimburse doctors of chiropractic for a range of clinical services, but Medicare reimbursements are restricted to spinal manipulation procedures. Medicare pays for evaluations performed by medical and osteopathic physicians, nurse practitioners, physician assistants, podiatrists, physical therapists, and occupational therapists; however, it does not reimburse the same services provided by chiropractic physicians. Advocates for expanded coverage of chiropractic services under Medicare cite clinical effectiveness and patient satisfaction, whereas critics point to unnecessary services, inadequate clinical documentation, and projected cost increases. To further inform this debate, the purpose of this commentary is to address the following questions: (1) What are the barriers to expand coverage for chiropractic services? (2) What could potentially be done to address these issues? (3) Is there a rationale for Centers for Medicare and Medicaid Services to expand coverage for chiropractic services? Methods A literature search was conducted of Google and PubMed for peer-reviewed articles and US government reports relevant to the provision of chiropractic care under Medicare. We reviewed relevant articles and reports to identify key issues concerning the expansion of coverage for chiropractic under Medicare, including identification of barriers and rationale for expanded coverage. Results The literature search yielded 29 peer-reviewed articles and 7 federal government reports. Our review of these documents revealed 3 key barriers to full coverage of chiropractic services under Medicare: inadequate documentation of chiropractic claims, possible provision of unnecessary preventive care services, and the uncertain costs of expanded coverage. 
Our recommendations to address these barriers include the following: individual chiropractic physicians, as well as state and national chiropractic organizations, should continue to strengthen efforts to improve claims and documentation practices; and additional rigorous efficacy/effectiveness research and clinical studies for chiropractic services need to be performed. Research of chiropractic services should target the triple aim of high-quality care, affordability, and improved health. Conclusions The barriers that were identified in this study can be addressed. To overcome these barriers, the chiropractic profession and individual physicians must assume responsibility for correcting deficiencies in compliance and documentation; further research needs to be done to evaluate chiropractic services; and effectiveness of extended episodes of preventive chiropractic care should be rigorously evaluated. Centers for Medicare and Medicaid Services policies related to chiropractic reimbursement should be reexamined using the same standards applicable to other health care providers. The integration of chiropractic physicians as fully engaged Medicare providers has the potential to enhance the capacity of the Medicare workforce to care for the growing population. We recommend that Medicare policy makers consider limited expansion of Medicare coverage to include, at a minimum, reimbursement for evaluation and management services by chiropractic physicians. PMID:25067927

  8. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
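
    The abstract above reports that an interoperable data dictionary reduced format misalignment and left only about 1% of values needing human curation. A minimal sketch of that harmonization step might look like the following; all field names, value sets, and aliases here are hypothetical illustrations, not taken from the paper:

    ```python
    # Hypothetical sketch: map site-local record values onto a shared data
    # dictionary, flagging unmappable values for human curation (the "1%" case).

    DATA_DICTIONARY = {
        "sex": {"male", "female", "unknown"},
        "ventilation_mode": {"conventional", "hfov", "none"},
    }

    SITE_ALIASES = {
        "sex": {"M": "male", "F": "female", "U": "unknown"},
        "ventilation_mode": {"CMV": "conventional", "HFOV": "hfov"},
    }

    def harmonize(record):
        """Return (clean_record, needs_curation) for one site record."""
        clean, needs_curation = {}, []
        for field, value in record.items():
            mapped = SITE_ALIASES.get(field, {}).get(value, value)
            if mapped in DATA_DICTIONARY.get(field, set()):
                clean[field] = mapped          # consistent with the dictionary
            else:
                needs_curation.append((field, value))  # route to human review
        return clean, needs_curation

    clean, pending = harmonize({"sex": "M", "ventilation_mode": "ECMO"})
    ```

    In this sketch, `"M"` maps cleanly to `"male"`, while the unrecognized `"ECMO"` value is queued for curation rather than silently transferred.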

  9. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses.

    PubMed

    Falagas, Matthew E; Pitsouni, Eleni I; Malietzis, George A; Pappas, Georgios

    2008-02-01

    The evolution of the electronic age has led to the development of numerous medical databases on the World Wide Web, offering search facilities on a particular subject and the ability to perform citation analysis. We compared the content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar. The official Web pages of the databases were used to extract information on the range of journals covered, search facilities and restrictions, and update frequency. We used the example of a keyword search to evaluate the usefulness of these databases in biomedical information retrieval and a specific published article to evaluate their utility in performing citation analysis. All databases were practical in use and offered numerous search facilities. PubMed and Google Scholar are accessed for free. The keyword search with PubMed offers optimal update frequency and includes online early articles; other databases can rate articles by number of citations, as an index of importance. For citation analysis, Scopus offers about 20% more coverage than Web of Science, whereas Google Scholar offers results of inconsistent accuracy. PubMed remains an optimal tool in biomedical electronic research. Scopus covers a wider journal range, of help both in keyword searching and citation analysis, but it is currently limited to recent articles (published after 1995) compared with Web of Science. Google Scholar, as for the Web in general, can help in the retrieval of even the most obscure information but its use is marred by inadequate, less often updated, citation information.

  10. Evolution of the Data Access Protocol in Response to Community Needs

    NASA Astrophysics Data System (ADS)

    Gallagher, J.; Caron, J. L.; Davis, E.; Fulker, D.; Heimbigner, D.; Holloway, D.; Howe, B.; Moe, S.; Potter, N.

    2012-12-01

    Under the aegis of the OPULS (OPeNDAP-Unidata Linked Servers) Project, funded by NOAA, version 2 of OPeNDAP's Data Access Protocol (DAP2) is being updated to version 4. DAP4 is the first major upgrade in almost two decades and will embody three main areas of advancement. First, the data-model extensions developed by the OPULS team focus on three areas: better support for coverages, access to HDF5 files, and access to relational databases. DAP2 support for coverages (defined as sampled functions) was limited to simple rectangular coverages that work well for (some) model outputs and processed satellite data but that cannot represent trajectories or satellite swath data, for example. We have extended the coverage concept in DAP4 to remove these limitations. These changes are informed by work at Unidata on the Common Data Model and also by the OGC's abstract coverages specification. In a similar vein, we have extended DAP2's support for relations by including the concept of foreign keys, so that tables can be explicitly related to one another. Second, the web interfaces (web services) that provide access to data via DAP will be more clearly defined and will use other, orthogonal standards where appropriate. An important case is the XML interface, which provides a cleaner way to build other response media types such as JSON and RDF (for metadata) and to build support for Atom, thus simplifying the integration of DAP servers with tools that support OpenSearch. Input from the ESIP federation and work performed with IOOS have informed our choices here. Last, DAP4-compliant servers will support richer data-processing capabilities than DAP2, enabling a wider array of server functions that manipulate data before returning values. Two projects are currently exploring what can be done even with DAP2's server-function model: the MIIC project at LARC and OPULS itself (with work performed at the University of Washington). 
Both projects have demonstrated that server functions can be used to perform operations on large volumes of data and return results that are far smaller than would be required to achieve the same outcomes via client-side processing. We are using information from these efforts to inform the design of server functions in DAP4. Each of the three areas of DAP4 advancement is being guided by input from a number of community members, including an OPULS Advisory Committee.
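
    The payoff described above is that a server function reduces data before transfer. A toy illustration of that idea, in the DAP server-function spirit (the function name and data layout are invented for illustration and are not part of the DAP specification), might be:

    ```python
    # Illustrative sketch only: a server-side reduction that collapses the time
    # axis of a 3-D grid (time x lat x lon) into a 2-D mean field, so the
    # client receives ny*nx values instead of nt*ny*nx.

    def mean_over_time(grid):
        """Average a list of 2-D time slices into one 2-D field."""
        nt = len(grid)
        ny, nx = len(grid[0]), len(grid[0][0])
        return [[sum(grid[t][j][i] for t in range(nt)) / nt
                 for i in range(nx)] for j in range(ny)]

    # Two time steps over a 1x2 spatial grid: the response is half the size.
    grid = [[[1.0, 2.0]], [[3.0, 4.0]]]
    result = mean_over_time(grid)
    ```

    A client requesting the reduced field would transfer `ny*nx` values rather than `nt*ny*nx`, which is the kind of volume reduction the server-function experiments described above exploit.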

  11. An Architecture for Autonomic Web Service Process Planning

    NASA Astrophysics Data System (ADS)

    Moore, Colm; Xue Wang, Ming; Pahl, Claus

    Web service composition is a technology that has received considerable attention in recent years. Languages and tools to aid in the process of creating composite Web services have received specific attention. Web service composition is the process of linking single Web services together in order to accomplish more complex tasks. One area of Web service composition that has not received as much attention is dynamic error handling and re-planning, which enable autonomic composition. Given a repository of service descriptions and a task to complete, it is possible for AI planners to automatically create a plan that will achieve this goal. If, however, a service in the plan is unavailable or erroneous, the plan will fail. Motivated by this problem, this paper suggests autonomous re-planning as a means to overcome dynamic problems. Our solution involves automatically recovering from faults and creating a context-dependent alternate plan. We present an architecture that serves as a basis for the central activities of autonomous composition, monitoring, and fault handling.
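
    The fault-and-replan cycle motivating this architecture can be sketched as a loop that plans from a service repository, monitors execution, and replans around a failed service instead of aborting. All names and the trivial one-step planner below are hypothetical illustrations, not the paper's implementation:

    ```python
    # Minimal sketch of an autonomic plan/monitor/replan loop (names invented).

    def make_plan(repository, goal, excluded=frozenset()):
        """Naive planner: pick the first available service advertising the goal."""
        for name, svc in repository.items():
            if goal in svc["provides"] and name not in excluded:
                return [name]
        return None

    def execute_with_replanning(repository, goal):
        excluded = set()
        while True:
            plan = make_plan(repository, goal, excluded)
            if plan is None:
                return None                  # no alternative service remains
            try:
                return [repository[s]["invoke"]() for s in plan]
            except RuntimeError:             # monitored fault: exclude and replan
                excluded.update(plan)

    def broken():
        raise RuntimeError("service unavailable")

    repo = {
        "svcA": {"provides": {"geocode"}, "invoke": broken},
        "svcB": {"provides": {"geocode"}, "invoke": lambda: "ok"},
    }
    outcome = execute_with_replanning(repo, "geocode")
    ```

    Here the first plan (svcA) fails at runtime, the monitor records the fault, and the replanner produces a context-dependent alternative (svcB) that completes the task.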

  12. Socioeconomic inequalities in the use of dental care services in Europe: what is the role of public coverage?

    PubMed

    Palència, Laia; Espelt, Albert; Cornejo-Ovalle, Marco; Borrell, Carme

    2014-04-01

    The aim of this study was to analyse inequalities in the use of dental care services according to socioeconomic position (SEP) in individuals aged ≥50 years in European countries in 2006, to examine the association between the degree of public coverage of dental services and the extent of inequalities, and specifically to determine whether countries with higher public health coverage show lower inequalities. We carried out a cross-sectional study of 12 364 men and 14 692 women aged ≥50 years from 11 European countries. Data were extracted from the second wave of the Survey of Health, Ageing and Retirement in Europe (SHARE 2006). The dependent variable was use of dental care services within the previous year, and the independent variables were education level as a measure of SEP, whether services were covered to some degree by the country's public health system, and chewing ability as a marker of individuals' need for dental services. Age-standardized prevalence of the use of dental care as a function of SEP was calculated, and age-adjusted indices of relative inequality (RII) were computed for each type of dental coverage, sex and chewing ability. Socioeconomic inequalities in the use of dental care services were higher in countries where no public dental care cover was provided than in countries where there was some degree of public coverage. For example, men with chewing ability from countries with dental care coverage had a RII of 1.39 (95%CI: 1.29-1.51), while those from countries without coverage had a RII of 1.96 (95%CI: 1.72-2.23). Women without chewing ability from countries with dental care coverage had a RII of 2.15 (95%CI: 1.82-2.52), while those from countries without coverage had a RII of 3.02 (95%CI: 2.47-3.69). Dental systems relying on public coverage seem to show lower inequalities in their use, thus confirming the potential benefits of such systems. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. SOCIO-ECONOMIC INEQUALITIES IN THE USE OF DENTAL CARE SERVICES IN EUROPE: WHAT IS THE ROLE OF PUBLIC COVERAGE?

    PubMed Central

    Palència, Laia; Espelt, Albert; Cornejo-Ovalle, Marco; Borrell, Carme

    2013-01-01

    Objectives The aim of this study was to analyse inequalities in the use of dental care services according to socio-economic position (SEP) in individuals aged ≥50 years in European countries in 2006, and to examine the association between the degree of public coverage of dental services and the extent of inequalities, and specifically to determine whether countries with higher public health coverage show lower inequalities. Methods We carried out a cross-sectional study of 12,364 men and 14,692 women aged ≥50 years from 11 European countries. Data were extracted from the second wave of the Survey of Health, Ageing and Retirement in Europe (SHARE 2006). The dependent variable was use of dental care services within the previous year, and the independent variables were education level as a measure of SEP, whether services were covered to some degree by the country’s public health system, and chewing ability as a marker of individuals’ need for dental services. Age-standardised prevalence of the use of dental care as a function of SEP was calculated, and age-adjusted indices of relative inequality (RII) were computed for each type of dental coverage, sex, and chewing ability. Results SEP inequalities in the use of dental care services were higher in countries where no public dental care cover was provided than in countries where there was some degree of public coverage. For example, men with chewing ability from countries with dental care coverage had a RII of 1.39 (95%CI:1.29–1.51), while those from countries without coverage had a RII of 1.96 (95%CI:1.72–2.23). Women without chewing ability from countries with dental care coverage had a RII of 2.15 (95%CI:1.82–2.52), while those from countries without coverage had a RII of 3.02 (95%CI:2.47–3.69). Conclusions Dental systems relying on public coverage seem to show lower inequalities in their use, thus confirming the potential benefits of such systems. PMID:23786417

  14. 3 CFR - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 3 The President 1 2014-01-01 2014-01-01 false Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Presidential Documents Other Presidential Documents Memorandum of December 27, 2013 Provision of Aviation Insurance Coverage for Commercial...

  15. 3 CFR - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 3 The President 1 2011-01-01 2011-01-01 false Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Presidential Documents Other Presidential Documents Memorandum of September 29, 2010 Provision of Aviation Insurance Coverage for Commercial...

  16. 3 CFR - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 3 The President 1 2012-01-01 2012-01-01 false Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Presidential Documents Other Presidential Documents Memorandum of September 28, 2011 Provision of Aviation Insurance Coverage for Commercial...

  17. 3 CFR - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 3 The President 1 2013-01-01 2013-01-01 false Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Presidential Documents Other Presidential Documents Memorandum of September 27, 2012 Provision of Aviation Insurance Coverage for Commercial...

  18. 3 CFR - Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 3 The President 1 2010-01-01 2010-01-01 false Provision of Aviation Insurance Coverage for Commercial Air Carrier Service in Domestic and International Operations Presidential Documents Other Presidential Documents Memorandum of August 21, 2009 Provision of Aviation Insurance Coverage for Commercial...

  19. 42 CFR 486.102 - Condition for coverage: Supervision by a qualified physician.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Condition for coverage: Supervision by a qualified physician. 486.102 Section 486.102 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF SPECIALIZED...

  20. Domain-specific Web Service Discovery with Service Class Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocco, D; Caverlee, J; Liu, L

    2005-02-14

    This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.
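
    The matching step described above - filtering crawled candidates against a domain-specific service class description - could be sketched as follows. The class description, scoring rule, and threshold are illustrative assumptions, not DynaBot's actual algorithm:

    ```python
    # Hypothetical sketch: score a crawled service description against a
    # domain-specific service class (here, a BLAST-like sequence-search class).

    SERVICE_CLASS = {
        "required_inputs": {"sequence", "database"},
        "keywords": {"blast", "alignment", "sequence"},
    }

    def matches(service, cls, threshold=0.5):
        """Accept a candidate if it takes the class's required inputs and its
        description shares enough keywords with the class description."""
        if not cls["required_inputs"] <= set(service["inputs"]):
            return False
        words = set(service["description"].lower().split())
        overlap = len(words & cls["keywords"]) / len(cls["keywords"])
        return overlap >= threshold

    candidate = {
        "inputs": ["sequence", "database", "e_value"],
        "description": "BLAST sequence alignment search",
    }
    accepted = matches(candidate, SERVICE_CLASS)
    ```

    A real filter would of course use richer signals (parameter types, probing responses), but the structure - required interface check plus a relevance score - mirrors the service-class idea.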

  1. Immunization coverage in India for areas served by the Integrated Child Development Services programme. The Integrated Child Development Services Consultants.

    PubMed

    Tandon, B N; Gandhi, N

    1992-01-01

    The Integrated Child Development Services (ICDS) programme was launched by the Indian government in October 1975 to provide a package of health, nutrition and informal educational services to mothers and children. In 1988 we studied the impact of ICDS on the immunization coverage of children aged 12-24 months and of mothers of infants in 19 rural, 8 tribal, and 9 urban ICDS projects that had been operational for more than 5 years. Complete coverage with BCG, diphtheria-pertussis-tetanus (DPT) and poliomyelitis vaccines was recorded for 65%, 63%, and 64% of children, respectively, in the ICDS population. By comparison, the coverage in the non-ICDS group was only 22% for BCG, 28% for DPT, and 27% for poliomyelitis. Complete immunization with tetanus toxoid was recorded for 68% of the mothers in the ICDS group and for 40% in the non-ICDS group. Coverage was greater in the urban and lower in the tribal projects. Scheduled castes, scheduled tribes, backward communities, and minorities (groups that have a high priority for social services) had immunization coverages in ICDS projects that were similar to those of higher castes.

  2. Available, intuitive and free! Building e-learning modules using web 2.0 services.

    PubMed

    Tam, Chun Wah Michael; Eastwood, Anne

    2012-01-01

    E-learning is part of the mainstream in medical education and often provides the most efficient and effective means of engaging learners in a particular topic. However, translating design and content ideas into a usable product can be technically challenging, especially in the absence of information technology (IT) support. There is little published literature on the use of web 2.0 services to build e-learning activities. To describe the web 2.0 tools and solutions employed to build the GP Synergy evidence-based medicine and critical appraisal online course. We used and integrated a number of free web 2.0 services, including: Prezi, a web-based presentation platform; YouTube, a video-sharing service; Google Docs, an online document platform; Tiny.cc, a URL-shortening service; and Wordpress, a blogging platform. The course, consisting of five multimedia-rich, tutorial-like modules, was built without IT specialist assistance or specialised software. The web 2.0 services used were free. The course can be accessed with a modern web browser. Modern web 2.0 services remove many of the technical barriers to creating and sharing content on the internet. When used synergistically, these services can be a flexible and low-cost platform for building e-learning activities. They were a pragmatic solution in our context.

  3. 76 FR 28439 - Submission for OMB Review; Comment Request; NCI Cancer Genetics Services Directory Web-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ...; Comment Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer... currently valid OMB control number. Proposed Collection: Title: NCI Cancer Genetics Services Directory Web... application form and the Web-based update mailer is to collect information about genetics professionals to be...

  4. General Practitioners' Attitudes Toward a Web-Based Mental Health Service for Adolescents: Implications for Service Design and Delivery.

    PubMed

    Subotic-Kerry, Mirjana; King, Catherine; O'Moore, Kathleen; Achilles, Melinda; O'Dea, Bridianne

    2018-03-23

    Anxiety disorders and depression are prevalent among youth. General practitioners (GPs) are often the first point of professional contact for treating health problems in young people. A Web-based mental health service delivered in partnership with schools may facilitate increased access to psychological care among adolescents. However, for such a model to be implemented successfully, GPs' views need to be measured. This study aimed to examine the needs and attitudes of GPs toward a Web-based mental health service for adolescents, and to identify the factors that may affect the provision of this type of service and the likelihood of integration. Findings will inform the content and overall service design. GPs were interviewed individually about the proposed Web-based service. Qualitative analysis of transcripts was performed using thematic coding. A short follow-up questionnaire was delivered to assess background characteristics, level of acceptability, and likelihood of integration of the Web-based mental health service. A total of 13 GPs participated in the interview and 11 completed a follow-up online questionnaire. Findings suggest strong support for the proposed Web-based mental health service. A wide range of factors were found to influence the likelihood of GPs integrating a Web-based service into their clinical practice. Coordinated collaboration with parents, students, school counselors, and other mental health care professionals was considered important by nearly all GPs. Confidence in Web-based care, noncompliance of adolescents and GPs, accessibility, privacy, and confidentiality were identified as potential barriers to adopting the proposed Web-based service. GPs were open to a proposed Web-based service for the monitoring and management of anxiety and depression in adolescents, provided that a collaborative approach to care is used, the feedback regarding the client is clear, and privacy and security provisions are assured. 
©Mirjana Subotic-Kerry, Catherine King, Kathleen O'Moore, Melinda Achilles, Bridianne O'Dea. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 23.03.2018.

  5. PaaS for web applications with OpenShift Origin

    NASA Astrophysics Data System (ADS)

    Lossent, A.; Rodriguez Peon, A.; Wagner, A.

    2017-10-01

    The CERN Web Frameworks team has deployed OpenShift Origin to facilitate the deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.

  6. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aimed at capturing the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. 
On the contrary, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, its architecture must satisfy all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all geospatial resources must be accessed through the same interface; moreover, according to the REST style, this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). By restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformations are candidate functionalities for a uniform interface. However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really assume the role of generic interface operations. Concerning point a), identification of resources, it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This makes it possible to cite and re-use resources simply by providing the URI. 
OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for this. Concerning point b), manipulation of resources through representations, the Geospatial Web poses several issues. While the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured according to several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is, in fact, what the Web designers did in choosing to define a common format for hypermedia (HTML), even though the underlying protocol is generic. Concerning point c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not be a major issue, considering the effort put into geospatial metadata models and specifications in recent years. Point d), hypermedia as the engine of application state, is where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, by contrast, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources, but also the nature of that link. The implementation of a Geospatial Web would make it possible to build an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. 
This would lower the barrier to accessing geospatial applications for non-specialists (e.g. the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated or replicated in the Geospatial Web. The main drawbacks would be the following: • The Uniform Interface simplifies the overall system architecture (e.g. no service registry or service descriptors required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it ends up being very simple, and complex interactions would therefore require several transfers. • In the geospatial domain, some of the most valuable resources are processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account these advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use cases rather than covering all possible applications. The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
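
    The uniform-interface idea in this record - every geospatial resource addressed by a URI and manipulated through one small verb set, with generic operations such as subsetting - can be sketched in a few lines. The verb names follow HTTP, but the in-memory dispatcher, resource layout, and `bbox` parameter are illustrative assumptions:

    ```python
    # Sketch of a uniform interface over geospatial resources (REST spirit):
    # one small set of operations applies to every resource, identified by URI.

    resources = {}   # URI -> representation (here, cell -> value mappings)

    def handle(verb, uri, body=None, bbox=None):
        """A single interface shared by all resources."""
        if verb == "PUT":                    # create/update via a representation
            resources[uri] = body
            return body
        if verb == "GET":
            rep = resources[uri]
            if bbox is not None:             # generic geospatial subsetting
                return {k: v for k, v in rep.items() if k in bbox}
            return rep
        if verb == "DELETE":
            return resources.pop(uri)
        raise ValueError("not part of the uniform interface")

    handle("PUT", "/coverages/sst", {(0, 0): 14.1, (0, 1): 14.3, (5, 5): 17.0})
    subset = handle("GET", "/coverages/sst", bbox={(0, 0), (0, 1)})
    ```

    The design trade-off noted above is visible even in this toy: the interface stays tiny and generic, but all domain complexity (data models, subsetting semantics) migrates into the representations and parameters.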

  7. 76 FR 14034 - Proposed Collection; Comment Request; NCI Cancer Genetics Services Directory Web-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer Summary: In... Cancer Genetics Services Directory Web-based Application Form and Update Mailer. [[Page 14035

  8. 42 CFR 410.110 - Requirements for coverage of partial hospitalization services by CMHCs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Community Mental Health Centers (CMHCs) Providing Partial Hospitalization Services § 410.110 Requirements... 42 Public Health 2 2010-10-01 2010-10-01 false Requirements for coverage of partial hospitalization services by CMHCs. 410.110 Section 410.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES...

  9. 42 CFR 410.110 - Requirements for coverage of partial hospitalization services by CMHCs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Community Mental Health Centers (CMHCs) Providing Partial Hospitalization Services § 410.110 Requirements... 42 Public Health 2 2012-10-01 2012-10-01 false Requirements for coverage of partial hospitalization services by CMHCs. 410.110 Section 410.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES...

  10. 42 CFR 410.110 - Requirements for coverage of partial hospitalization services by CMHCs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Community Mental Health Centers (CMHCs) Providing Partial Hospitalization Services § 410.110 Requirements... 42 Public Health 2 2013-10-01 2013-10-01 false Requirements for coverage of partial hospitalization services by CMHCs. 410.110 Section 410.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES...

  11. 42 CFR 410.110 - Requirements for coverage of partial hospitalization services by CMHCs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Community Mental Health Centers (CMHCs) Providing Partial Hospitalization Services § 410.110 Requirements... 42 Public Health 2 2011-10-01 2011-10-01 false Requirements for coverage of partial hospitalization services by CMHCs. 410.110 Section 410.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES...

  12. 42 CFR 410.110 - Requirements for coverage of partial hospitalization services by CMHCs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Community Mental Health Centers (CMHCs) Providing Partial Hospitalization Services § 410.110 Requirements... 42 Public Health 2 2014-10-01 2014-10-01 false Requirements for coverage of partial hospitalization services by CMHCs. 410.110 Section 410.110 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES...

  13. 42 CFR 440.330 - Benchmark health benefits coverage.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...

  14. 42 CFR 440.330 - Benchmark health benefits coverage.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...

  15. 42 CFR 440.330 - Benchmark health benefits coverage.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...

  16. 42 CFR 440.330 - Benchmark health benefits coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...

  17. Optimizing Distribution of Pandemic Influenza Antiviral Drugs

    PubMed Central

    Huang, Hsin-Chan; Morton, David P.; Johnson, Gregory P.; Gutfraind, Alexander; Galvani, Alison P.; Clements, Bruce; Meyers, Lauren A.

    2015-01-01

    We provide a data-driven method for optimizing pharmacy-based distribution of antiviral drugs during an influenza pandemic in terms of overall access for a target population and apply it to the state of Texas, USA. We found that during the 2009 influenza pandemic, the Texas Department of State Health Services achieved an estimated statewide access of 88% (proportion of population willing to travel to the nearest dispensing point). However, access reached only 34.5% of US postal code (ZIP code) areas containing <1,000 underinsured persons. Optimized distribution networks increased expected access to 91% overall and 60% in hard-to-reach regions, and 2 or 3 major pharmacy chains achieved near maximal coverage in well-populated areas. Independent pharmacies were essential for reaching ZIP code areas containing <1,000 underinsured persons. This model was developed during a collaboration between academic researchers and public health officials and is available as a decision support tool for Texas Department of State Health Services at a Web-based interface. PMID:25625858
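
At its core, optimizing pharmacy-based distribution for overall access is a maximum-coverage facility-location problem. A minimal sketch of the standard greedy heuristic for such problems (the authors' actual model and data are not reproduced here; site names, ZIP codes, and populations below are illustrative):

```python
# Greedy maximum-coverage heuristic: pick dispensing sites one at a time,
# each time choosing the site that covers the most not-yet-covered people.
def greedy_coverage(sites, population, k):
    """Select up to k sites maximizing total covered population."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(sites, key=lambda s: sum(population[z] for z in sites[s] - covered))
        gain = sum(population[z] for z in sites[best] - covered)
        if gain == 0:
            break  # nothing left to cover
        chosen.append(best)
        covered |= sites[best]
    return chosen, sum(population[z] for z in covered)

# Illustrative data: site -> set of ZIP areas it can serve, population per ZIP.
sites = {"chain_A": {"75001", "75002"},
         "chain_B": {"75002", "75003"},
         "indep_C": {"79901"}}  # independent pharmacy reaching a remote ZIP
population = {"75001": 900, "75002": 5000, "75003": 1200, "79901": 800}
chosen, total = greedy_coverage(sites, population, k=2)
```

The greedy choice favors large chains in well-populated areas first, mirroring the paper's finding that a few major chains achieve near-maximal coverage there while independent pharmacies matter for hard-to-reach ZIP areas.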

  18. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
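
The asynchrony pattern examined here, in which a client initiates a request, resumes its own processing, and collects the result later, can be sketched with a simple job-submission interface. This is a generic polling illustration, not the OWS-6 or WS-BPEL implementation itself:

```python
import threading
import time
import uuid

class AsyncWorkflowService:
    """Minimal asynchronous service: submit() returns immediately with a
    job id; the client polls for the result instead of blocking."""
    def __init__(self):
        self._jobs = {}

    def submit(self, func, *args):
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"state": "running", "result": None}
        def run():
            result = func(*args)  # long-running geoprocessing step
            self._jobs[job_id] = {"state": "done", "result": result}
        threading.Thread(target=run, daemon=True).start()
        return job_id  # client resumes its own processing here

    def status(self, job_id):
        return self._jobs[job_id]

service = AsyncWorkflowService()
job = service.submit(lambda x: x * 2, 21)  # stand-in for an analysis operation
while service.status(job)["state"] != "done":  # polling; callbacks are the alternative
    time.sleep(0.01)
result = service.status(job)["result"]
```

The same contract (submit now, fetch later) is what WS-BPEL correlation and asynchronous message exchange provide for service chains, at the cost of the orchestration complexity the paper discusses.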

  19. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. 
These data can be accessed through the above web services and through special NCEDC web pages.
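
REST-style services like these are driven by simple key-value query URLs. A sketch of composing such a request with the Python standard library (the host and path follow the FDSN web-service conventions the NCEDC mentions aligning with, but are illustrative rather than the NCEDC's exact endpoints):

```python
from urllib.parse import urlencode

def build_query(base_url, **params):
    """Compose a REST-style web-service query URL from keyword parameters."""
    return base_url + "?" + urlencode(params)

# Hypothetical station-inventory request in FDSN key-value style:
url = build_query("https://service.example.org/fdsnws/station/1/query",
                  network="BK", station="CMB", level="response",
                  format="xml")
```

A browser or any HTTP client can issue such a query directly and receive, for example, StationXML in the response body, which is what makes these services usable without batch or email-based request handling.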

  20. [Potential coverage and real coverage of ambulatory health care services in the state of Mexico. The case of 3 marginal communities in Atenco and Chalco].

    PubMed

    Nájera-Aguilar, P; Infante-Castañeda, C

    1990-01-01

    Less than a third of the non-insured population studied through a sample in the State of Mexico was covered by the Institute of Health of the State of Mexico. This low coverage was observed in spite of the fact that health services were available within a 2-kilometer radius. Thirty-three percent of the non-insured preferred to utilize other services within their own community, and 24 percent of them traveled to bigger localities to receive care. These results suggest that to attain adequate coverage, utilization patterns should be investigated so that health services can meet the needs of the target population.

  1. Earth Observation-Supported Service Platform for the Development and Provision of Thematic Information on the Built Environment - the Tep-Urban Project

    NASA Astrophysics Data System (ADS)

    Esch, T.; Asamer, H.; Boettcher, M.; Brito, F.; Hirner, A.; Marconcini, M.; Mathot, E.; Metz, A.; Permana, H.; Soukop, T.; Stanek, F.; Kuchar, S.; Zeidler, J.; Balhar, J.

    2016-06-01

    The Sentinel fleet will provide unprecedented coverage with Earth observation data and, with it, new opportunities for implementing methodologies that generate innovative geo-information products and services. It is here that the TEP Urban project aims to initiate a step change by providing an open and participatory platform, based on modern ICT technologies and services, that enables any interested user to easily exploit Earth observation data pools, in particular those of the Sentinel missions, and to derive thematic information on the status and development of the built environment from these data. A key component of the TEP Urban project is the implementation of a web-based platform employing distributed high-level computing infrastructures and providing key functionalities for i) high-performance access to satellite imagery and derived thematic data, ii) modular and generic state-of-the-art pre-processing, analysis, and visualization techniques, iii) customized development and dissemination of algorithms, products and services, and iv) networking and communication. This contribution introduces the main facts about the TEP Urban project, including a description of the general objectives, the platform system design and functionalities, and the preliminary portfolio of products and services available on the TEP Urban platform.

  2. Income Disparities in the Use of Health Screening Services Among University Students in Korea: A Cross-Sectional Study of 2479 Participants in a University.

    PubMed

    Lee, Su Hyun; Joh, Hee-Kyung; Kim, Soojin; Oh, Seung-Won; Lee, Cheol Min; Kwon, Hyuktae

    2016-05-01

    Public health insurance coverage for preventive care in young adults is incomplete in Korea. Few studies have focused on young adults' socioeconomic disparities in preventive care utilization. We aimed to explore household income disparities in the use of different types of health screening services among university students in Korea. This cross-sectional study used a web-based self-administered survey of students at a university in Korea from January to February 2013. To examine the associations between household income levels and health screening service use within the past 2 years, odds ratios (ORs) and 95% confidence intervals (CIs) were estimated using logistic regression with adjustment for various covariables. Of 2479 participants, 45.5% reported using health screening services within 2 years (university-provided screening 32.9%, private sector screening 16.7%, and both 4.1%). Household income levels were not significantly associated with overall rates of health screening service use with a multivariable-adjusted OR (95% CI) in the lowest versus highest income group of 1.12 (0.87-1.45, Ptrend = 0.35). However, we found significantly different associations in specific types of utilized screening services by household income levels. The multivariable-adjusted OR (95% CI) of university-provided health screening service use in the lowest versus highest income level was 1.74 (1.30-2.34; Ptrend < 0.001), whereas the multivariable-adjusted OR (95% CI) of private sector service use in the lowest versus highest income level was 0.45 (0.31-0.66; Ptrend < 0.001). This study demonstrated significant disparities in the types of utilized health screening services by income groups among university students in Korea, although overall rates of health screening service use were similar across income levels. Low-income students were more likely to use university-provided health screening services, and less likely to use private sector screening services. 
To ensure appropriate preventive care delivery for young adults and to address disparities in disadvantaged groups, the expansion of medical insurance coverage for preventive health care, establishment of a usual source of care, focusing on vulnerable groups, and the development of evidence-based standardized health screening guidelines for young adults are needed.

  3. Income Disparities in the Use of Health Screening Services Among University Students in Korea

    PubMed Central

    Lee, Su Hyun; Joh, Hee-Kyung; Kim, Soojin; Oh, Seung-Won; Lee, Cheol Min; Kwon, Hyuktae

    2016-01-01

    Abstract Public health insurance coverage for preventive care in young adults is incomplete in Korea. Few studies have focused on young adults’ socioeconomic disparities in preventive care utilization. We aimed to explore household income disparities in the use of different types of health screening services among university students in Korea. This cross-sectional study used a web-based self-administered survey of students at a university in Korea from January to February 2013. To examine the associations between household income levels and health screening service use within the past 2 years, odds ratios (ORs) and 95% confidence intervals (CIs) were estimated using logistic regression with adjustment for various covariables. Of 2479 participants, 45.5% reported using health screening services within 2 years (university-provided screening 32.9%, private sector screening 16.7%, and both 4.1%). Household income levels were not significantly associated with overall rates of health screening service use with a multivariable-adjusted OR (95% CI) in the lowest versus highest income group of 1.12 (0.87–1.45, Ptrend = 0.35). However, we found significantly different associations in specific types of utilized screening services by household income levels. The multivariable-adjusted OR (95% CI) of university-provided health screening service use in the lowest versus highest income level was 1.74 (1.30–2.34; Ptrend < 0.001), whereas the multivariable-adjusted OR (95% CI) of private sector service use in the lowest versus highest income level was 0.45 (0.31–0.66; Ptrend < 0.001). This study demonstrated significant disparities in the types of utilized health screening services by income groups among university students in Korea, although overall rates of health screening service use were similar across income levels. Low-income students were more likely to use university-provided health screening services, and less likely to use private sector screening services. 
To ensure appropriate preventive care delivery for young adults and to address disparities in disadvantaged groups, the expansion of medical insurance coverage for preventive health care, establishment of a usual source of care, focusing on vulnerable groups, and the development of evidence-based standardized health screening guidelines for young adults are needed. PMID:27196475

  4. Research of three level match method about semantic web service based on ontology

    NASA Astrophysics Data System (ADS)

    Xiao, Jie; Cai, Fang

    2011-10-01

    An important step in Web service application is the discovery of useful services. Keywords are used for service discovery in traditional technologies like UDDI and WSDL, with the disadvantages of user intervention, lack of semantic description, and low accuracy. To cope with these problems, OWL-S is introduced and extended with QoS attributes to describe the attributes and functions of Web services. A three-level service matching algorithm based on ontology and QoS is proposed in this paper. Our algorithm matches Web services by utilizing the service profile and QoS parameters together with the inputs and outputs of the service. Simulation results show that it greatly enhances the speed of service matching while high accuracy is also guaranteed.
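
The three matching levels (service profile, QoS, and input/output signature) can be combined into a single ranking score. A minimal sketch under assumed weights and similarity measures (the paper's actual formulas are not reproduced here; all names and values are illustrative):

```python
def jaccard(a, b):
    """Set overlap in [0, 1], used here for both keywords and I/O types."""
    return len(a & b) / len(a | b) if a | b else 1.0

def match_score(request, service, weights=(0.3, 0.3, 0.4)):
    """Level 1: profile keywords; level 2: QoS; level 3: I/O signature."""
    profile = jaccard(request["keywords"], service["keywords"])
    # QoS: fraction of requested attributes the service meets or exceeds.
    qos = sum(service["qos"].get(k, 0) >= v for k, v in request["qos"].items())
    qos /= len(request["qos"]) if request["qos"] else 1
    io = (jaccard(request["inputs"], service["inputs"])
          + jaccard(request["outputs"], service["outputs"])) / 2
    w1, w2, w3 = weights
    return w1 * profile + w2 * qos + w3 * io

request = {"keywords": {"weather", "map"}, "qos": {"availability": 0.9},
           "inputs": {"bbox"}, "outputs": {"png"}}
service = {"keywords": {"weather", "map", "forecast"},
           "qos": {"availability": 0.99},
           "inputs": {"bbox"}, "outputs": {"png"}}
score = match_score(request, service)
```

Candidates can then be ranked by this score, with the cheap keyword level usable as an early filter before the more detailed QoS and I/O comparisons.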

  5. Interoperability And Value Added To Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
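
A WMS map request is itself just a key-value URL, which is why such services are easy for web clients like mapshup to consume. A sketch of composing a GetMap call (parameter names follow the WMS 1.3.0 standard; the server host and layer name are hypothetical):

```python
from urllib.parse import urlencode

# Standard WMS 1.3.0 GetMap parameters; only host and layer are invented here.
params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "flood_extent",       # hypothetical disaster-management layer
    "STYLES": "", "CRS": "EPSG:4326",
    "BBOX": "40.0,-10.0,50.0,5.0",  # bounding box (axis order per CRS)
    "WIDTH": "800", "HEIGHT": "600", "FORMAT": "image/png",
}
getmap_url = "https://maps.example.org/wms?" + urlencode(params)
```

Any client that can fetch a URL can render the returned image, which is what lets heterogeneous Earth Observation servers interoperate with a single web interface.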

  6. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services

    PubMed Central

    Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-01-01

    Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. 
SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs. PMID:19775460

  7. 39 CFR 3001.12 - Service of documents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... or presiding officer has determined is unable to receive service through the Commission's Web site... presiding officer has determined is unable to receive service through the Commission Web site shall be by... service list for each current proceeding will be available on the Commission's Web site http://www.prc.gov...

  8. ChemCalc: a building block for tomorrow's chemical infrastructure.

    PubMed

    Patiny, Luc; Borel, Alain

    2013-05-24

    Web services, as an aspect of cloud computing, are becoming an important part of the general IT infrastructure, and scientific computing is no exception to this trend. We propose a simple approach to develop chemical Web services, through which servers could expose the essential data manipulation functionality that students and researchers need for chemical calculations. These services return their results as JSON (JavaScript Object Notation) objects, which facilitates their use for Web applications. The ChemCalc project http://www.chemcalc.org demonstrates this approach: we present three Web services related with mass spectrometry, namely isotopic distribution simulation, peptide fragmentation simulation, and molecular formula determination. We also developed a complete Web application based on these three Web services, taking advantage of modern HTML5 and JavaScript libraries (ChemDoodle and jQuery).
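
Returning results as JSON means any client can consume them with a standard parser. A sketch of handling such a response (the payload shape shown is invented for illustration and is not ChemCalc's actual schema):

```python
import json

# Hypothetical JSON body as a mass-spectrometry web service might return it.
response_body = """{
  "mf": "C6H12O6",
  "monoisotopicMass": 180.0634,
  "isotopicDistribution": [[180.0634, 100.0], [181.0667, 6.9]]
}"""

data = json.loads(response_body)
peaks = data["isotopicDistribution"]        # [mass, relative intensity] pairs
base_peak_mass = max(peaks, key=lambda p: p[1])[0]  # most intense isotopologue
```

In a browser the same object deserializes directly into a JavaScript value, which is what makes JSON-returning services convenient building blocks for HTML5 web applications.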

  9. 76 FR 43988 - Procurement List; Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    ... Sill, OK. NPAs: Professional Contract Services, Inc., Austin, TX (Prime Contractor); Work Services.... Contracting Activity: General Services Administration, Fort Worth, TX Coverage: A-List for the Total...: Department of the Army Research, Development, & Engineering Command, Natick, MA. Coverage: C-List for 100% of...

  10. Can They Plan to Teach with Web 2.0? Future Teachers' Potential Use of the Emerging Web

    ERIC Educational Resources Information Center

    Kale, Ugur

    2014-01-01

    This study examined pre-service teachers' potential use of Web 2.0 technologies for teaching. A coding scheme incorporating the Technological Pedagogical Content Knowledge (TPACK) framework guided the analysis of pre-service teachers' Web 2.0-enhanced learning activity descriptions. The results indicated that while pre-service teachers were able…

  11. 26 CFR 54.9815-2713A - Accommodations in connection with coverage of preventive health services.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) The organization opposes providing coverage for some or all of any contraceptive services required to... of ERISA. (b) Contraceptive coverage—self-insured group health plans. (1) A group health plan... contraceptive coverage if all of the requirements of this paragraph (b)(1) of this section are satisfied: (i...

  12. 75 FR 41787 - Requirement for Group Health Plans and Health Insurance Issuers To Provide Coverage of Preventive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-19

    ... Requirement for Group Health Plans and Health Insurance Issuers To Provide Coverage of Preventive Services... Insurance Oversight of the U.S. Department of Health and Human Services are issuing substantially similar interim final regulations with respect to group health plans and health insurance coverage offered in...

  13. 29 CFR 9.3 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Coverage. 9.3 Section 9.3 Labor Office of the Secretary of Labor NONDISPLACEMENT OF QUALIFIED WORKERS UNDER SERVICE CONTRACTS General § 9.3 Coverage. This part applies to all service contracts and their solicitations, except those excluded by § 9.4 of this part...

  14. 29 CFR 9.3 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Coverage. 9.3 Section 9.3 Labor Office of the Secretary of Labor NONDISPLACEMENT OF QUALIFIED WORKERS UNDER SERVICE CONTRACTS General § 9.3 Coverage. This part applies to all service contracts and their solicitations, except those excluded by § 9.4 of this part...

  15. Mobile Cloud Computing with SOAP and REST Web Services

    NASA Astrophysics Data System (ADS)

    Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid

    2018-05-01

    Mobile computing in conjunction with mobile Web services offers a promising approach to tackling the limitations of mobile devices. Mobile Web services are based on two technologies, SOAP and REST, which work with existing protocols to develop Web services. Both approaches have their own distinct features, yet given the resource constraints of mobile devices, the better of the two is the one that minimizes computation and transmission overhead during offloading. Transferring load from a mobile device to remote servers for execution is called computational offloading. Although numerous approaches make computational offloading a viable solution for overcoming the resource constraints of mobile devices, a dynamic method of computational offloading is always required for a smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach that does not engage mobile resources for long periods. We use Web services to delegate computationally intensive tasks for remote execution. We tested both the SOAP and REST Web service approaches for mobile computing, considering two parameters in our lab experiments: execution time and energy consumption. The results show that RESTful Web service execution is far better than executing the same application with the SOAP approach in terms of both execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time was about 200% better than the SOAP approach; in terms of energy consumption, REST execution was about 250% better.
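
Part of the overhead difference is simply message size: a SOAP call wraps its arguments in an XML envelope, while a REST call can send bare JSON. A rough illustration with equivalent payloads for a matrix-multiplication request (the envelope body and field names are invented, not the authors' prototype):

```python
import json

# The same logical request -- multiply two 2x2 matrices -- in both styles.
rest_payload = json.dumps({"a": [[1, 2], [3, 4]], "b": [[5, 6], [7, 8]]})

soap_payload = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Body>
    <multiply>
      <a>[[1, 2], [3, 4]]</a>
      <b>[[5, 6], [7, 8]]</b>
    </multiply>
  </soap:Body>
</soap:Envelope>"""

overhead = len(soap_payload) - len(rest_payload)  # bytes of extra framing
```

On a bandwidth- and battery-constrained device, this framing overhead (plus the XML parsing it requires) accumulates over every offloading round trip, which is consistent with the REST advantage the experiments report.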

  16. Adopting and adapting a commercial view of web services for the Navy

    NASA Astrophysics Data System (ADS)

    Warner, Elizabeth; Ladner, Roy; Katikaneni, Uday; Petry, Fred

    2005-05-01

    Web Services are being adopted as the enabling technology to provide net-centric capabilities for many Department of Defense operations. The Navy Enterprise Portal, for example, is Web Services-based, and the Department of the Navy is promulgating guidance for developing Web Services. Web Services, however, only constitute a baseline specification that provides the foundation on which users, under current approaches, write specialized applications in order to retrieve data over the Internet. Application development may increase dramatically as the number of different available Web Services increases. Reasons for specialized application development include XML schema versioning differences, adoption/use of diverse business rules, security access issues, and time/parameter naming constraints, among others. We are currently developing for the US Navy a system which will improve delivery of timely and relevant meteorological and oceanographic (MetOc) data to the warfighter. Our objective is to develop an Advanced MetOc Broker (AMB) that leverages Web Services technology to identify, retrieve and integrate relevant MetOc data in an automated manner. The AMB will utilize a Mediator, which will be developed by applying ontological research and schema matching techniques to MetOc forms of data. The AMB, using the Mediator, will support a new, advanced approach to the use of Web Services; namely, the automated identification, retrieval and integration of MetOc data. Systems based on this approach will then not require extensive end-user application development for each Web Service from which data can be retrieved. Users anywhere on the globe will be able to receive timely environmental data that fits their particular needs.

  17. The CLIMB Geoportal - A web-based dissemination and documentation platform for hydrological modelling data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer

    2015-04-01

    Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are basically meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free and open-source based geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu) that serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU-FP7-framework and coordinated at the LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling outcome - validated by the CLIMB partners - is offered to the public in forms of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators such as discharge, drought index as well as uncertainty measures including their changes over time were used in different spatial resolution. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB-project. The implementation of the CLIMB Geoportal is finally based on version 2.0c5 of the open source geospatial content management system GeoNode. 
It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS-client based on GeoExt (GeoExplorer). PostgreSQL enhanced by PostGIS in versions 9.2.1/2.0.1 serves as database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster-format are stored file-based as GeoTIFFs. Due to the high number of model outputs, the generation of metadata (xml) and graphical rendering instructions (sld) associated with each single layer of the WMS has been done automatically using the statistical software R. Additional applications that have been programmed during the project period include a Java-based interface for comfortable download of climate data that was initially needed as input data in hydrological modeling as well as a tool for displaying time series of selected risk indicators which is directly integrated into the portal structure implemented using Python (Django) and JavaScript. The presented CLIMB Geoportal shows that relevant results of even large international research projects involving many partners and varying national standards in data handling, can be effectively disseminated to stakeholders, policy makers and other interested parties. Thus, it is a successful example of using free and open-source software for providing long-term visibility and access to data produced within a particular (environmental) research project.
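
Registered users fetch the underlying rasters via WCS, which, like WMS, can be driven by key-value requests. A sketch of a WCS 2.0 GetCoverage call against a GeoServer-style endpoint (the host and coverage identifier are hypothetical):

```python
from urllib.parse import urlencode

# WCS 2.0 key-value-pair GetCoverage request; host and coverageId invented.
params = {
    "service": "WCS", "version": "2.0.1", "request": "GetCoverage",
    "coverageId": "climb__drought_index",  # hypothetical coverage identifier
    "format": "image/tiff",                # download as GeoTIFF
}
getcoverage_url = "https://example.org/geoserver/wcs?" + urlencode(params)
```

Where WMS returns a rendered picture, GetCoverage returns the data values themselves, which is why the portal restricts WCS downloads to registered users while leaving WMS views public.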

  18. AdaFF: Adaptive Failure-Handling Framework for Composite Web Services

    NASA Astrophysics Data System (ADS)

    Kim, Yuna; Lee, Wan Yeon; Kim, Kyong Hoon; Kim, Jong

    In this paper, we propose a novel Web service composition framework which dynamically accommodates various failure recovery requirements. In the proposed framework, called Adaptive Failure-handling Framework (AdaFF), failure-handling submodules are prepared during the design of a composite service, and some of them are systematically selected and automatically combined with the composite Web service at service instantiation in accordance with the requirements of individual users. In contrast, existing frameworks cannot adapt their failure-handling behaviors to users' requirements. AdaFF rapidly delivers a composite service supporting requirement-matched failure handling without manual development, and contributes to flexible composite Web service design in that service architects need not concern themselves with failure handling or the variable requirements of users. As a proof of concept, we implement a prototype system of AdaFF, which automatically generates a composite service instance in Web Services Business Process Execution Language (WS-BPEL) according to user requirements specified in XML format and executes the generated instance on the ActiveBPEL engine.
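
Selecting a failure-handling submodule at instantiation time can be pictured as a lookup from a user requirement to a prepared handler. A generic sketch of the idea (the handler names and selection mechanism are invented for illustration, not AdaFF's actual modules):

```python
def retry(call, attempts=3):
    """Retry submodule: re-invoke the service a few times before giving up."""
    for i in range(attempts):
        try:
            return call()
        except RuntimeError:
            if i == attempts - 1:
                raise

def fail_fast(call):
    """Fail-fast submodule: propagate the first failure unchanged."""
    return call()

SUBMODULES = {"retry": retry, "fail_fast": fail_fast}

def instantiate(requirement):
    """Pick the failure-handling submodule when the service is instantiated."""
    return SUBMODULES[requirement]

handler = instantiate("retry")
attempts = {"n": 0}
def flaky_service():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient fault")
    return "ok"
result = handler(flaky_service)
```

The point of the framework is that this selection happens per user and per instantiation, so the same composite service design serves users with different recovery requirements.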

  19. Coverage of Community-Based Management of Severe Acute Malnutrition Programmes in Twenty-One Countries, 2012-2013

    PubMed Central

    Rogers, Eleanor; Myatt, Mark; Woodhead, Sophie; Guerrero, Saul; Alvarez, Jose Luis

    2015-01-01

    Objective This paper reviews coverage data from programmes treating severe acute malnutrition (SAM) collected between July 2012 and June 2013. Design This is a descriptive study of coverage levels and barriers to coverage collected by coverage assessments of community-based SAM treatment programmes in 21 countries that were supported by the Coverage Monitoring Network. Data from 44 coverage assessments are reviewed. Setting These assessments analyse malnourished populations aged 6 to 59 months in order to understand the accessibility and coverage of services for the treatment of acute malnutrition. The majority of assessments are from sub-Saharan Africa. Results Most of the programmes (33 of 44) failed to meet context-specific, internationally agreed minimum standards for coverage. The mean level of estimated coverage achieved by the programmes in this analysis was 38.3%. The most frequently reported barriers to access were lack of awareness of malnutrition, lack of awareness of the programme, high opportunity costs, inter-programme interface problems, and previous rejection. Conclusions This study shows that coverage of CMAM is lower than that found in previous analyses of early CTC programmes, thereby reducing programme impact. Barriers to access need to be addressed in order to improve coverage, by paying greater attention to activities such as community sensitisation. Because barriers are interconnected, focusing on specific activities, such as decentralising services to satellite sites, is likely to significantly increase utilisation of nutrition services. Programmes need to ensure that barriers are continuously monitored so that they can be removed promptly and coverage increased. PMID:26042827

  20. A New Approach for Semantic Web Matching

    NASA Astrophysics Data System (ADS)

    Zamanifar, Kamran; Heidary, Golsa; Nematbakhsh, Naser; Mardukhi, Farhad

    In this work we propose a new approach to semantic web matching that improves the performance of Web Service replacement. Because automatic systems should ensure self-healing, self-configuration, self-optimization and self-management, all services should always be available, and if one of them crashes it should be replaced with the most similar one. Candidate services are advertised in Universal Description, Discovery and Integration (UDDI), all in Web Ontology Language (OWL). With the help of a bipartite graph, we match the crashed service against each candidate, then choose the best service, i.e. the one with the maximum matching rate. In effect, we compare two services' functionalities and capabilities to see how closely they match. We found that the best way to match two web services is to compare their functionalities.
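
    The matching idea can be sketched with a toy similarity matrix between the crashed service's functionalities and one candidate's. The scores below are invented, and a brute-force search over assignments stands in for a proper bipartite-matching algorithm (e.g. the Hungarian method) that a real matcher would use:

```python
# Toy sketch of one-to-one functionality matching on a bipartite graph.
# Similarity scores are hypothetical.
from itertools import permutations

sim = [            # rows: crashed service's functionalities
    [0.9, 0.1, 0.3],   # columns: a candidate's functionalities
    [0.2, 0.8, 0.4],
    [0.1, 0.5, 0.7],
]

def match_rate(sim):
    """Best total similarity over all one-to-one assignments, normalised
    to [0, 1]. Brute force; fine for a toy-sized graph."""
    n = len(sim)
    best = max(sum(sim[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
    return best / n

rate = match_rate(sim)   # the diagonal assignment wins here
```

    Repeating this for every advertised candidate and picking the one with the maximum rate is the replacement-selection step the abstract describes.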

  1. Climatological Data Option in My Weather Impacts Decision Aid (MyWIDA) Overview

    DTIC Science & Technology

    2017-07-18

    rules. It consists of 2 databases, a data service server, a collection of web services, and web applications that show weather impacts on selected... Contents include: ClimoDB; Data Service (Data Requestor, Data Decoder, Post Processor, Job Scheduler); Web Service; Additional Data Option; Impact Overlay Web Service; Graphical User Interface; References; List of Symbols, Abbreviations, and...

  2. 42 CFR 403.766 - Requirements for coverage and payment of RNHCI home services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Requirements for coverage and payment of RNHCI home services. 403.766 Section 403.766 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PROVISIONS SPECIAL PROGRAMS AND PROJECTS Religious Nonmedical Health...

  3. 42 CFR 416.46 - Condition for coverage-Nursing services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 3 2011-10-01 2011-10-01 false Condition for coverage-Nursing services. 416.46....46 Condition for coverage—Nursing services. The nursing services of the ASC must be directed and staffed to assure that the nursing needs of all patients are met. (a) Standard: Organization and staffing...

  4. 42 CFR 416.46 - Condition for coverage-Nursing services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Condition for coverage-Nursing services. 416.46....46 Condition for coverage—Nursing services. The nursing services of the ASC must be directed and staffed to assure that the nursing needs of all patients are met. (a) Standard: Organization and staffing...

  5. Finding Citations to Social Work Literature: The Relative Benefits of Using "Web of Science," "Scopus," or "Google Scholar"

    ERIC Educational Resources Information Center

    Bergman, Elaine M. Lasda

    2012-01-01

    Past studies of citation coverage of "Web of Science," "Scopus," and "Google Scholar" do not demonstrate a consistent pattern that can be applied to the interdisciplinary mix of resources used in social work research. To determine the utility of these tools to social work researchers, an analysis of citing references to well-known social work…

  6. Design and implementation of CUAHSI WaterML and WaterOneFlow Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Zaslavsky, I.; Whitenack, T.; Maidment, D.

    2007-12-01

    WaterOneFlow is a term for a group of web services created by and for the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) community. CUAHSI web services facilitate the retrieval of hydrologic observations information from online data sources using the SOAP protocol. CUAHSI Water Markup Language (below referred to as WaterML) is an XML schema defining the format of messages returned by the WaterOneFlow web services.
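
    To make the schema idea concrete, here is a drastically simplified, WaterML-flavoured response fragment and the kind of parsing a client performs. The element names and values are illustrative, not the complete CUAHSI WaterML schema:

```python
# Schematic only: parse a simplified, WaterML-flavoured time-series
# response with the standard library.
import xml.etree.ElementTree as ET

response = """
<timeSeriesResponse>
  <timeSeries>
    <variable><variableName>Discharge</variableName></variable>
    <values>
      <value dateTime="2007-12-01T00:00:00">3.2</value>
      <value dateTime="2007-12-01T01:00:00">3.5</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

root = ET.fromstring(response)
series = [(v.get("dateTime"), float(v.text))
          for v in root.findall(".//value")]
```

    A WaterOneFlow client wraps such a payload in a SOAP envelope; the XML body is what carries the observation values.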

  7. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
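
    As a rough illustration of the interchange formats discussed here, the sketch below assembles a simplified, BioC-flavoured document with the standard library. It shows only the collection/document/passage/annotation nesting; it is not a complete or validated BioC file, and the document text and annotation are invented:

```python
# Simplified, BioC-flavoured document structure (illustrative only).
import xml.etree.ElementTree as ET

collection = ET.Element("collection")
doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "PMC0000001"
passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "Glucose is phosphorylated by hexokinase."
ann = ET.SubElement(passage, "annotation", id="T1")
ET.SubElement(ann, "infon", key="type").text = "chemical"
ET.SubElement(ann, "location", offset="0", length="7")
ET.SubElement(ann, "text").text = "Glucose"

xml_bytes = ET.tostring(collection)
```

    A format-aware Web service, as in Argo, would accept such a collection as input and return it with additional annotations attached.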

  8. Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.

    PubMed

    Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell

    2011-07-26

    Biological databases and computational biology tools are provided by research groups around the world and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has commonly been addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie bioinformatics resources together, namely Web Services. In the last decade Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence makes it possible to execute in silico experiments on a genome scale using standard SOAP Web Services and workflows. As a proof of principle we annotated an RNA-seq dataset using a plain BPEL workflow engine.
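
    The partitioning pattern can be sketched in a few lines: rather than posting one huge payload, the orchestrator sends fixed-size chunks and stitches the partial results together. The "service" here is a local stand-in function, not a real SOAP endpoint, and the chunk size and data are invented:

```python
# Sketch of data partitioning between an orchestrator and a service.
def annotate_service(batch):
    """Stand-in for a remote annotation operation on a batch of sequences."""
    return [seq.upper() for seq in batch]

def partitioned_call(service, items, chunk_size):
    """Invoke the service once per chunk and concatenate the results."""
    results = []
    for start in range(0, len(items), chunk_size):
        results.extend(service(items[start:start + chunk_size]))
    return results

sequences = ["acgt", "ttga", "ccag", "gatc", "aaat"]
annotated = partitioned_call(annotate_service, sequences, chunk_size=2)
```

    Because each call carries only a bounded payload, the service's memory demand stays flat regardless of the total dataset size, which is the effect the abstract reports.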

  9. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  10. 31 CFR 515.578 - Exportation of certain services incident to Internet-based communications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Internet, such as instant messaging, chat and email, social networking, sharing of photos and movies, web... direct or indirect exportation of web-hosting services that are for purposes other than personal communications (e.g., web-hosting services for commercial endeavors) or of domain name registration services. (4...

  11. 31 CFR 515.578 - Exportation of certain services incident to Internet-based communications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Internet, such as instant messaging, chat and email, social networking, sharing of photos and movies, web... direct or indirect exportation of web-hosting services that are for purposes other than personal communications (e.g., web-hosting services for commercial endeavors) or of domain name registration services. (4...

  12. Coverage of genetic technologies under national health reform.

    PubMed Central

    Mehlman, M. J.; Botkin, J. R.; Scarrow, A.; Woodhall, A.; Kass, J.; Siebenschuh, E.

    1994-01-01

    This article examines the extent to which the technologies expected to emerge from genetic research are likely to be covered under Government-mandated health insurance programs such as those being proposed by advocates of national health reform. Genetic technologies are divided into three broad categories: genetic information services, including screening, testing, and counseling; experimental technologies; and gene therapy. This article concludes that coverage of these technologies under national health reform is uncertain. The basic benefits packages provided for in the major health reform plans are likely to provide partial coverage of experimental technologies; relatively broad coverage of information services; and varying coverage of gene therapies, on the basis of an evaluation of their costs, benefits, and the degree to which they raise objections on political and religious grounds. Genetic services that are not included in the basic benefits package will be available only to those who can purchase supplemental insurance or to those who can purchase the services with personal funds. The resulting multitiered system of access to genetic services raises serious questions of fairness. PMID:7977343

  13. REMORA: a pilot in the ocean of BioMoby web-services.

    PubMed

    Carrere, Sébastien; Gouzy, Jérôme

    2006-04-01

    Emerging web-services technology allows interoperability between multiple distributed architectures. Here, we present REMORA, a web server implemented according to the BioMoby web-service specifications, providing life science researchers with an easy-to-use workflow generator and launcher, a repository of predefined workflows and a survey system. The REMORA web server is freely available at http://bioinfo.genopole-toulouse.prd.fr/remora; sources are available upon request from the authors. Contact: Jerome.Gouzy@toulouse.inra.fr

  14. jORCA: easily integrating bioinformatics Web Services.

    PubMed

    Martín-Requena, Victoria; Ríos, Javier; García, Maximiliano; Ramírez, Sergio; Trelles, Oswaldo

    2010-02-15

    Web services technology is becoming the option of choice to deploy bioinformatics tools that are universally available. One of the major strengths of this approach is that it supports machine-to-machine interoperability over a network. However, a weakness of this approach is that various Web Services differ in their definition and invocation protocols, as well as in their communication and data formats, and this presents a barrier to service interoperability. jORCA is a desktop client aimed at facilitating seamless integration of Web Services. It does so by making a uniform representation of the different web resources, supporting scalable service discovery, and automatic composition of workflows. Usability is at the top of the jORCA agenda; thus it is a highly customizable and extensible application that accommodates a broad range of user skills, featuring double-click invocation of services in conjunction with advanced execution control, on-the-fly data standardization, extensibility of viewer plug-ins, drag-and-drop editing capabilities, plus a file-based browsing style and organization of favourite tools. The integration of bioinformatics Web Services is thereby made easier, supporting a wider range of users.

  15. National Centers for Environmental Prediction

    Science.gov Websites

    Website excerpt: MISSION Web Page ("Verification" section); HRRR Verification at NOAA ESRL; HRRR Web Verification Web Page; NOAA / National Weather Service, National Centers for Environmental Prediction.

  16. Breast Health Services: Accuracy of Benefit Coverage Information in the Individual Insurance Marketplace.

    PubMed

    Hamid, Mariam S; Kolenic, Giselle E; Dozier, Jessica; Dalton, Vanessa K; Carlos, Ruth C

    2017-04-01

    The aim of this study was to determine if breast health coverage information provided by customer service representatives employed by insurers offering plans in the 2015 federal and state health insurance marketplaces is consistent with Patient Protection and Affordable Care Act (ACA) and state-specific legislation. One hundred fifty-eight unique customer service numbers were identified for insurers offering plans through the federal marketplace, augmented with four additional numbers representing the Connecticut state-run exchange. Using a standardized patient biography and the mystery-shopper technique, a single investigator posed as a purchaser and contacted each number, requesting information on breast health services coverage. Consistency of information provided by the representative with the ACA mandates (BRCA testing in high-risk women) or state-specific legislation (screening ultrasound in women with dense breasts) was determined. Insurer representatives gave BRCA test coverage information that was not consistent with the ACA mandate in 60.8% of cases, and 22.8% could not provide any information regarding coverage. Nearly half (48.1%) of insurer representatives gave coverage information about ultrasound screening for dense breasts that was not consistent with state-specific legislation, and 18.5% could not provide any information. Insurance customer service representatives in the federal and state marketplaces frequently provide inaccurate coverage information about breast health services that should be covered under the ACA and state-specific legislation. Misinformation can inadvertently lead to the purchase of a plan that does not meet the needs of the insured. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  17. Component, Context, and Manufacturing Model Library (C2M2L)

    DTIC Science & Technology

    2012-11-01

    Excerpts: 5.1 MML Population and Web Service Interface... Table 41. Relevant Questions with Associated Web Services... the models, and implementing web services that provide semantically aware programmatic access to the models, including implementing the MS&T

  18. MaNIDA: Integration of marine expedition information, data and publications: Data Portal of German Marine Research

    NASA Astrophysics Data System (ADS)

    Koppe, Roland; Scientific MaNIDA-Team

    2013-04-01

    The Marine Network for Integrated Data Access (MaNIDA) aims to build a sustainable e-infrastructure to support discovery and re-use of marine data from distinct data providers in Germany (see related abstracts in session ESSI 1.2). In order to provide users with integrated access to and retrieval of expedition or cruise metadata, data, services and publications, as well as the relationships among the various objects, we are developing (web) applications based on state-of-the-art technologies: the Data Portal of German Marine Research. Since the content providers in the distributed German network have distinct objectives and mandates for storing digital objects (e.g. long-term data preservation, near-real-time data, publication repositories), we have to cope with metadata that are heterogeneous in syntax and semantics, with diverse data types and formats, and with different access solutions. We have defined a set of core metadata elements which are common to our content providers and therefore useful for discovery and for building relationships among objects. Existing catalogues for various types of vocabularies are used to ensure mapping to terms in community-wide use. We distinguish between expedition metadata and continuously harvestable metadata objects from distinct data providers.
• Existing expedition metadata from distinct sources are integrated and validated in order to create an expedition metadata catalogue, which is used as the authoritative source for expedition-related content. The web application allows browsing by e.g. research vessel and date, exploring expeditions and research gaps by tracklines, and viewing expedition details (begin/end, ports, platforms, chief scientists, events, etc.). Expedition-related objects from harvesting are also dynamically associated with expedition information and presented to the user. Hence we will also provide web services exposing detailed expedition information.
• Other harvestable content is separated into four categories: archived data and data products, near-real-time data, publications and reports. Reports are a special case of publication, describing cruise planning, cruise reports or popular accounts of expeditions, and are orthogonal to e.g. peer-reviewed articles. Each object's metadata contains at least: identifier(s) (e.g. DOI/handle), title, author(s), date, expedition(s) and platform(s) (e.g. research vessel Polarstern). Furthermore, project(s), parameter(s), device(s) and e.g. geographic coverage are of interest. An international gazetteer resolves geographic coverage to region names, which are annotated to the object metadata.
Information is presented homogeneously to the user, independent of the underlying format, but adaptable to specific disciplines, e.g. bathymetry. Data access and dissemination information is also available to the user, as a data download link or as web services (e.g. WFS, WMS). Based on relationship metadata we dynamically build graphs of objects to help the user find potentially relevant associated objects. Technically, metadata are based on ISO/OGC standards or provider specifications. Metadata are harvested via OAI-PMH or OGC CSW and indexed with Apache Lucene. This enables powerful full-text search, geographic and temporal search, as well as faceting. In this presentation we will illustrate the architecture and the current implementation of our integrated approach.
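
    The relationship-building step can be sketched as linking harvested metadata objects through shared expedition identifiers. The records and identifiers below are invented for illustration; the portal's actual harvesting and indexing stack (OAI-PMH, CSW, Lucene) is not shown:

```python
# Toy sketch: link harvested objects (datasets, reports, publications)
# into a relationship graph via shared expedition identifiers.
from collections import defaultdict

records = [
    {"id": "doi:10.1594/x1", "kind": "dataset",     "expeditions": ["PS82"]},
    {"id": "hdl:10013/r7",   "kind": "report",      "expeditions": ["PS82"]},
    {"id": "doi:10.1000/p9", "kind": "publication", "expeditions": ["PS82", "MSM31"]},
]

by_expedition = defaultdict(list)
for rec in records:
    for exp in rec["expeditions"]:
        by_expedition[exp].append(rec["id"])

def related(object_id):
    """All objects sharing at least one expedition with the given object."""
    out = set()
    for ids in by_expedition.values():
        if object_id in ids:
            out.update(ids)
    out.discard(object_id)
    return out
```

    Traversing such links is what lets the portal suggest, for a given dataset, the cruise report and publications from the same expedition.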

  19. Coverage Gains After the Affordable Care Act Among the Uninsured in Minnesota.

    PubMed

    Call, Kathleen Thiede; Lukanen, Elizabeth; Spencer, Donna; Alarcón, Giovann; Kemmick Pintor, Jessie; Baines Simon, Alisha; Gildemeister, Stefan

    2015-11-01

    We determined whether and how Minnesotans who were uninsured in 2013 gained health insurance coverage in 2014, 1 year after the Affordable Care Act (ACA) expanded Medicaid coverage and enrollment. Insurance status and enrollment experiences came from the Minnesota Health Insurance Transitions Study (MH-HITS), a follow-up telephone survey of children and adults in Minnesota who had no health insurance in the fall of 2013. ACA had a tempered success in Minnesota. Outreach and enrollment efforts were effective; one half of those previously uninsured gained coverage, although many reported difficulty signing up (nearly 62%). Of the previously uninsured who gained coverage, 44% obtained their coverage through MNsure, Minnesota's insurance marketplace. Most of those who remained uninsured heard of MNsure and went to the Web site. Many still struggled with the enrollment process or reported being deterred by the cost of coverage. Targeting outreach, simplifying the enrollment process, focusing on affordability, and continuing funding for in-person assistance will be important in the future.

  20. Coverage Gains After the Affordable Care Act Among the Uninsured in Minnesota

    PubMed Central

    Lukanen, Elizabeth; Spencer, Donna; Alarcón, Giovann; Kemmick Pintor, Jessie; Baines Simon, Alisha; Gildemeister, Stefan

    2015-01-01

    Objectives. We determined whether and how Minnesotans who were uninsured in 2013 gained health insurance coverage in 2014, 1 year after the Affordable Care Act (ACA) expanded Medicaid coverage and enrollment. Methods. Insurance status and enrollment experiences came from the Minnesota Health Insurance Transitions Study (MH-HITS), a follow-up telephone survey of children and adults in Minnesota who had no health insurance in the fall of 2013. Results. ACA had a tempered success in Minnesota. Outreach and enrollment efforts were effective; one half of those previously uninsured gained coverage, although many reported difficulty signing up (nearly 62%). Of the previously uninsured who gained coverage, 44% obtained their coverage through MNsure, Minnesota’s insurance marketplace. Most of those who remained uninsured heard of MNsure and went to the Web site. Many still struggled with the enrollment process or reported being deterred by the cost of coverage. Conclusions. Targeting outreach, simplifying the enrollment process, focusing on affordability, and continuing funding for in-person assistance will be important in the future. PMID:26447912

  1. WebGLORE: a web service for Grid LOgistic REgression.

    PubMed

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-12-15

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation.
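
    The core aggregation idea behind GLORE-style distributed logistic regression can be sketched as follows: each site sends only its local gradient and Hessian, never raw data, and the server sums them and takes a Newton step. This is a one-coefficient toy with invented data; the real WebGLORE protocol, transport security and API are not shown:

```python
# Sketch of privacy-preserving aggregation for logistic regression:
# sites exchange local sufficient statistics, not patient-level data.
import math

sites = [  # (x, y) pairs held privately at each site (invented)
    [(0.5, 0), (1.2, 1), (2.0, 1)],
    [(0.1, 0), (1.5, 1), (0.3, 0)],
]

def local_stats(data, beta):
    """Gradient and Hessian of the local log-likelihood at beta."""
    g = h = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-beta * x))
        g += (y - p) * x
        h += p * (1.0 - p) * x * x
    return g, h

beta = 0.0
for _ in range(25):                               # Newton-Raphson iterations
    stats = [local_stats(d, beta) for d in sites] # computed at the sites
    g = sum(s[0] for s in stats)                  # server aggregates...
    h = sum(s[1] for s in stats)
    beta += g / h                                 # ...and updates the model
```

    Because the sums of gradients and Hessians equal those computed on the pooled data, the global model matches a centralised fit while each site's records stay local.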

  2. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation

    PubMed Central

    2011-01-01

    Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447

  3. The Semantic Automated Discovery and Integration (SADI) Web service Design-Pattern, API and Reference Implementation.

    PubMed

    Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke

    2011-10-24

    The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. 
Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies.

  4. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    NASA Astrophysics Data System (ADS)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of its potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open-source initiative that provides a public evaluation and certification of multiple frameworks on common, industrially relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for its potential practical use. The book is also suitable for advanced-level students in computer science.

  5. An open platform for promoting interoperability in solar system sciences

    NASA Astrophysics Data System (ADS)

    Csillaghy, André; Aboudarham, Jean; Berghmans, David; Jacquey, Christian

    2013-04-01

    The European coordination project CASSIS is promoting the creation of an integrated data space that will facilitate science across community boundaries in solar system sciences. Many disciplines may need to use the same data set to support scientific research, although how the data are used may depend on the project and on the particular piece of science. Often, access is hindered by differences in the way the different communities describe and store their data, as well as in how they make them accessible. Working towards this goal, we have set up an open collaboration platform, www.explorespace.eu, that can serve as a hub for discovering and developing interoperability resources in the communities involved. The platform is independent of the project and will be maintained well after the end of the funding. As a first step, we have captured the descriptions of services already provided by the community. The openness of the collaboration platform should allow all stakeholders to discuss ways to make key types of metadata and derived products more complete and coherent, and thus more usable across domain boundaries. Furthermore, software resources and discussions should help facilitate the development of interoperable services. The platform, along with the database of services, addresses the following questions, which we consider crucial for promoting interoperability: • Current extent of the data space coverage: What part of the common data space is already covered by the existing interoperable services in terms of data access? In other words, what data, from catalogues as well as from raw data, can be reached by an application through standard protocols today? • Needed extension of the data space coverage: What would be needed to extend the data space coverage? In other words, how can the currently accessible data space be extended by adding services? • Missing services: What applications / services are still missing and need to be developed? 
This is not a trivial question, as the generation of the common data space itself creates new requirements on overarching applications that might be necessary to provide unified access to all the services. As an example, one particular aspect discussed on the platform is the design of web services. Today's applications are mainly human-centred, while interoperability must happen one level below, and the back ends (databases) must be generic, i.e. independent of the applications. We intend our effort to provide developers with resources that disentangle user interfaces from data services. Many activities are challenging and we hope they will be discussed on our platform. In particular, the quality of the services, the data space and the needs of interdisciplinary approaches are serious concerns for instruments such as ATST and EST or the ones onboard SDO and, in the future, Solar Orbiter. We believe that our platform might be useful as a kind of guide that would spare groups from having to reinvent the wheel for each new instrument.

  6. Web-services-based spatial decision support system to facilitate nuclear waste siting

    NASA Astrophysics Data System (ADS)

    Huang, L. Xinglai; Sheng, Grant

    2006-10-01

    The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before, and has subsequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, a web-services-based spatial decision support system (WSDSS) can provide a flexible problem-solving environment in which to explore a decision problem, understand and refine its definition, and generate and evaluate multiple alternatives for a decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is composed of distributed web services that either provide their own functions or serve different geospatial data, and that may reside on different computers and in different locations. The WSDSS includes six key components, namely: a database management system, a catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web-services-based spatial decision support systems are also discussed.

  7. EMODnet Physics in the EMODnet program phase 3

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Rickards, Lesley; Thijsse, Peter; Manzella, Giuseppe

    2017-04-01

    Access to marine data is of vital importance for marine research and a key issue for various studies, from climate change prediction to offshore engineering. Giving access to and harmonising marine data from different sources will help industry, public authorities and researchers find the data and make more effective use of them to develop new products and services and to improve our understanding of how the seas behave. The aim of EMODnet Physics is the provision of a combined array of services and functionalities (a facility for viewing and downloading, dashboard reporting, and machine-to-machine communication services) to obtain, free of charge, data, metadata and data products on the physical conditions of European sea basins and oceans from many different distributed databases. Moreover, the system provides full interoperability with third-party software through WMS services, Web services and Web catalogues in order to exchange data and products according to the most recent standards. This assures users access to data of consistent quality and format. The portal provides access to data and products of: wave height and period; temperature and salinity of the water column; wind speed and direction; horizontal velocity of the water column; light attenuation; sea ice coverage and sea level trends. EMODnet Physics is continuously enhancing the number and type of platforms in the system by unlocking and providing high-quality data from a growing network. The system now integrates information from more than 12,000 stations and includes two ready-to-use data products: Ice Map and Sea Level Trends. The final aim of EMODnet Physics is to federate different portals and be a portal of portals, to further extend the number and type of data (e.g. water noise, river data, etc.) and platforms (e.g. animal-borne instruments, etc.) feeding the system, and to improve the capacity of the system to produce data and products that match the market needs of current and potential new end and intermediate users.
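    The WMS interoperability mentioned above follows the OGC Web Map Service standard, in which a client fetches rendered map images through an HTTP GET request with standardized key-value parameters. A minimal sketch of building such a request in Python; the endpoint URL and layer name here are hypothetical placeholders, not EMODnet's actual service:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=600):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_lat, min_lon, max_lat, max_lon); WMS 1.3.0 uses the
    axis order defined by the CRS, which for EPSG:4326 is lat,lon.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", "sea_level_trends",
                     (30.0, -10.0, 65.0, 40.0))
```

    Because the request is a plain URL, the same map layer can be consumed by a browser, a GIS client, or another machine-to-machine service without format negotiation.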

  8. Towards Big Earth Data Analytics: The EarthServer Approach

    NASA Astrophysics Data System (ADS)

    Baumann, Peter

    2013-04-01

    Big Data in the Earth sciences, the Tera- to Exabyte archives, are mostly made up of coverage data, whereby the term "coverage", according to ISO and OGC, is defined as the digital representation of some space-time varying phenomenon. Common examples include 1-D sensor timeseries, 2-D remote sensing imagery, 3-D x/y/t image timeseries and x/y/z geology data, and 4-D x/y/z/t atmosphere and ocean data. Analytics on such data requires on-demand processing of sometimes significant complexity, such as getting the Fourier transform of satellite images. As network bandwidth limits prohibit the transfer of such Big Data, it is indispensable to devise protocols allowing clients to task flexible and fast processing on the server. The EarthServer initiative, funded by EU FP7 eInfrastructures, unites 11 partners from computer and earth sciences to establish Big Earth Data Analytics. One key ingredient is flexibility for users to ask what they want, not impeded and complicated by system internals. The EarthServer answer to this is to use high-level query languages; these have proven tremendously successful on tabular and XML data, and we extend them with a central geo data structure, multi-dimensional arrays. A second key ingredient is scalability. Without any doubt, scalability ultimately can only be achieved through parallelization. In the past, parallelizing code has been done at compile time and usually with manual intervention. The EarthServer approach is to perform a semantics-based dynamic distribution of query fragments based on network optimization and further criteria. The EarthServer platform is built around rasdaman, an Array DBMS enabling efficient storage and retrieval of any-size, any-type multi-dimensional raster data. 
In the project, rasdaman is being extended with several functionality and scalability features, including: support for irregular grids and general meshes; in-situ retrieval (evaluation of database queries on existing archive structures, avoiding data import and, hence, duplication); the aforementioned distributed query processing. Additionally, Web clients for multi-dimensional data visualization are being established. Client/server interfaces are strictly based on OGC and W3C standards, in particular the Web Coverage Processing Service (WCPS) which defines a high-level raster query language. We present the EarthServer project with its vision and approaches, relate it to the current state of standardization, and demonstrate it by way of large-scale data centers and their services using rasdaman.
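    The WCPS raster query language mentioned above lets a client phrase server-side array processing as a declarative query carried in an ordinary HTTP request. A minimal sketch from the client side in Python; the service URL and coverage name are hypothetical, and the query string is written in the style of the OGC WCPS standard:

```python
from urllib.parse import urlencode

def wcps_request_url(endpoint, query):
    """Encode a WCPS query as a key-value-pair GET request.

    A live WCPS server would evaluate the query and stream back the
    encoded result (here, a PNG image).
    """
    return endpoint + "?" + urlencode({"query": query})

# Hypothetical coverage name "AvgTemp": slice a 3-D x/y/t coverage at
# one time step and have the server encode the 2-D result as PNG,
# so only the derived image crosses the network, not the raw array.
query = 'for $c in (AvgTemp) return encode($c[ansi("2013-01")], "png")'
url = wcps_request_url("https://example.org/rasdaman/ows", query)
```

    The point of the protocol is visible in the sketch: the client ships a small query string to the data, rather than pulling Tera-scale arrays to the client for local processing.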

  9. Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco

    2014-05-01

    The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.

  10. Availability of the OGC geoprocessing standard: March 2011 reality check

    NASA Astrophysics Data System (ADS)

    Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier

    2012-10-01

    This paper presents an investigation about the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification gives support to standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied for finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
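    A focused crawler of the kind described can recognize a WPS server by issuing the standard OGC GetCapabilities probe and inspecting the response's root element. A minimal offline sketch in Python; the sample response below is a hand-written fragment for illustration, not output from a real server:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

def wps_probe_url(base_url):
    """Build the OGC WPS 1.0.0 GetCapabilities probe URL."""
    return base_url + "?" + urlencode(
        {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"})

def looks_like_wps(response_xml):
    """Heuristic a focused crawler might use: a WPS server answers a
    GetCapabilities probe with a wps:Capabilities root element."""
    try:
        root = ET.fromstring(response_xml)
    except ET.ParseError:
        return False
    # ElementTree reports namespaced tags as "{namespace-uri}localname".
    return root.tag.endswith("Capabilities") and "wps" in root.tag.lower()

# Hand-written sample fragment, for illustration only.
sample = ('<wps:Capabilities xmlns:wps="http://www.opengis.net/wps/1.0.0" '
          'service="WPS" version="1.0.0"/>')
```

    A crawler would send the probe URL to each candidate endpoint and apply the heuristic to whatever comes back, discarding HTML pages and non-WPS OGC services.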

  11. Seahawk: moving beyond HTML in Web-based bioinformatics analysis.

    PubMed

    Gordon, Paul M K; Sensen, Christoph W

    2007-06-18

    Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.

  12. Seahawk: moving beyond HTML in Web-based bioinformatics analysis

    PubMed Central

    Gordon, Paul MK; Sensen, Christoph W

    2007-01-01

    Background Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. Results We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. Conclusion As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer. PMID:17577405

  13. Maintenance and Exchange of Learning Objects in a Web Services Based e-Learning System

    ERIC Educational Resources Information Center

    Vossen, Gottfried; Westerkamp, Peter

    2004-01-01

    "Web services" enable partners to exploit applications via the Internet. Individual services can be composed to build new and more complex ones with additional and more comprehensive functionality. In this paper, we apply the Web service paradigm to electronic learning, and show how to exchange and maintain learning objects is a…

  14. 42 CFR 440.365 - Coverage of rural health clinic and federally qualified health center (FQHC) services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Coverage of rural health clinic and federally... clinic and federally qualified health center (FQHC) services. If a State provides benchmark or benchmark... otherwise, to rural health clinic services and FQHC services as defined in subparagraphs (B) and (C) of...

  15. "Perspectives on financing population-based health care towards Universal Health Coverage among employed individuals in Ghanzi district, Botswana: A qualitative study".

    PubMed

    Mbogo, Barnabas Africanus; McGill, Deborah

    2016-08-19

    Globally, about 150 million people experience catastrophic healthcare expenditure annually. Among low- and middle-income countries, out-of-pocket expenditure pushes about 100 million people into poverty annually. In Botswana, 83 % of the general population and 58 % of employed individuals do not have medical aid coverage. Moreover, inequitable allocation of financial resources between health services suggests marginalization of population-based health care services (i.e. disease prevention and health promotion). The purpose of the study is to explore the perspectives of employed individuals regarding the financing of population-based health care interventions towards Universal Health Coverage (UHC), in order to make recommendations to the Ministry of Health on health financing options to cover population-based health services. A qualitative design grounded in interpretivist epistemology, through a social constructivism lens, was critical for exploring the perspectives of employed individuals. Through purposive and snowball sampling techniques, a total of 15 respondents, including 8 males and 7 females, were recruited and interviewed using a semi-structured format. Their ages ranged from 23 to 59 years, with a median of 36 years. Data were analyzed using the Thematic Content Analysis technique. Use of the social constructivism lens enabled the classification of emerging themes into population coverage, health services coverage and financial protection issues. Despite a broad understanding of health coverage schemes among participants, knowledge appears insignificant in increasing enrolment. Participants indicated limited understanding of UHC concepts, but showed willingness to embrace UHC upon a brief description. 
Main thematic issues raised include: exclusion of population-based health services from the coverage scheme; disparity in financial protection and health services coverage among enrollees; inability to sustain contracted employees; and systematic exclusion of unemployed individuals and informal sector employees. Increasing enrolment in health coverage schemes requires a targeted campaign of information dissemination through myriad mass media, including social networks, TV, radio and others. Moreover, re-designing health insurance schemes is critical in order to include population-based interventions; expand uptake among unemployed individuals and informal sector employees; provide flexibility in monthly premium payment plans; and use technology to increase access to payment points. Further studies need to evaluate the content of health financing policy in Botswana measured against the World Health Organization Universal Health Coverage conceptual requirements for low- and middle-income countries.

  16. Developer Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-21

    NREL's Developer Network, developer.nrel.gov, provides data that users can access for their own analyses and mobile and web applications. Developers retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing the web services and APIs to be maintained and managed independently.
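    Key management of the kind described typically means every request carries an API key that the gateway validates, logs for analytics, and counts against a rate limit. A minimal sketch of the client side in Python; the endpoint path and parameter names are hypothetical illustrations, not NREL's actual API:

```python
from urllib.parse import urlencode

def api_request_url(base_url, path, api_key, **params):
    """Attach the caller's API key to a request; a gateway of this
    kind uses the key for authentication, analytics, and throttling."""
    query = {"api_key": api_key, **params}
    return f"{base_url}{path}?{urlencode(query)}"

# Hypothetical path and parameters, for illustration only.
url = api_request_url("https://developer.nrel.gov", "/api/example/v1.json",
                      api_key="DEMO_KEY", lat=39.7, lon=-105.2)
```

    Centralizing the key check in the gateway is what lets the individual data services behind it stay free of authentication code and be maintained independently.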

  17. API REST Web service and backend system Of Lecturer’s Assessment Information System on Politeknik Negeri Bali

    NASA Astrophysics Data System (ADS)

    Manuaba, I. B. P.; Rudiastini, E.

    2018-01-01

    Assessment of lecturers is a tool used to measure lecturer performance. A lecturer's assessment can be measured from three aspects: teaching activities, research, and community service. The broad range of aspects needed to measure the performance of lecturers requires a special framework, so that the system can be developed in a sustainable manner. The issue addressed by this research is the creation of an API web service data tool, so that the lecturer assessment system can be developed in various frameworks. The system was developed as a web service in the PHP programming language, with output as JSON data. The conclusion of this research is that the API web service application can be developed using several platforms, such as web and mobile applications.
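    A REST backend of this kind serializes each assessment record to JSON so that web and mobile clients can consume the same response body. A minimal sketch in Python rather than the paper's PHP; the field names and scoring scheme are hypothetical:

```python
import json

def assessment_to_json(lecturer_id, teaching, research, community_service):
    """Serialize one lecturer-assessment record, scored on the three
    aspects named in the abstract, as a JSON API response body."""
    record = {
        "lecturer_id": lecturer_id,
        "scores": {
            "teaching": teaching,
            "research": research,
            "community_service": community_service,
        },
        "total": teaching + research + community_service,
    }
    return json.dumps(record)

# Hypothetical record, for illustration only.
body = assessment_to_json("L-042", teaching=85, research=70,
                          community_service=90)
```

    Because the payload is plain JSON, any framework on any platform can parse it, which is exactly the portability argument the abstract makes.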

  18. 42 CFR 410.160 - Part B annual deductible.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... hepatitis b vaccines and their administration. (3) Federally qualified health center services. (4) ASC... services as described in § 410.34 (c) and (d). (6) Screening pelvic examinations as described in § 410.56... services identified for coverage through the national coverage determination (NCD) process. (c) Application...

  19. 42 CFR 410.160 - Part B annual deductible.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... hepatitis b vaccines and their administration. (3) Federally qualified health center services. (4) ASC... services as described in § 410.34 (c) and (d). (6) Screening pelvic examinations as described in § 410.56... services identified for coverage through the national coverage determination (NCD) process. (c) Application...

  20. Utilization of services in a randomized trial testing phone- and web-based interventions for smoking cessation.

    PubMed

    Zbikowski, Susan M; Jack, Lisa M; McClure, Jennifer B; Deprey, Mona; Javitz, Harold S; McAfee, Timothy A; Catz, Sheryl L; Richards, Julie; Bush, Terry; Swan, Gary E

    2011-05-01

    Phone counseling has become standard for behavioral smoking cessation treatment. Newer options include Web and integrated phone-Web treatment. No prior research, to our knowledge, has systematically compared the effectiveness of these three treatment modalities in a randomized trial. Understanding how utilization varies by mode, the impact of utilization on outcomes, and predictors of utilization across each mode could lead to improved treatments. One thousand two hundred and two participants were randomized to phone, Web, or combined phone-Web cessation treatment. Services varied by modality and were tracked using automated systems. All participants received 12 weeks of varenicline, printed guides, an orientation call, and access to a phone support line. Self-report data were collected at baseline and 6-month follow-up. Overall, participants utilized phone services more often than Web-based services. Among treatment groups with Web access, a significant proportion logged in only once (37% phone-Web, 41% Web), and those in the phone-Web group logged in less often than those in the Web group (mean = 2.4 vs. 3.7, p = .0001). Use of the phone was also correlated with increased use of the Web. In multivariate analyses, greater use of the phone- or Web-based services was associated with higher cessation rates. Finally, older age and the belief that certain treatments could improve success were consistent predictors of greater utilization across groups. Other predictors varied by treatment group. Opportunities for enhancing treatment utilization exist, particularly for Web-based programs. Increasing utilization more broadly could result in better overall treatment effectiveness for all intervention modalities.

  1. Implementation of Sensor Twitter Feed Web Service Server and Client

    DTIC Science & Technology

    2016-12-01

    ARL-TN-0807 ● DEC 2016 ● US Army Research Laboratory. Implementation of Sensor Twitter Feed Web Service Server and Client, by Bhagyashree V Kulkarni (University of Maryland) and Michael H Lee (Computational...)

  2. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows

    PubMed Central

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system that implements different approaches for the automatic composition of Web services and also executes the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted to the system, is an important factor in determining which approach to Web service composition can achieve the best performance in production. PMID:26068216
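    Selecting among candidate composition flows by QoS measured at execution time reduces, in the simplest case, to ranking the flows by an aggregate of their measured parameters. A minimal sketch in Python; the flow names, measurements, and weighting scheme are hypothetical illustrations, not the paper's actual algorithm:

```python
def best_flow(flows):
    """Pick the candidate composite-service flow with the best QoS.

    Each flow is (name, response_time_ms, availability); lower response
    time and higher availability are better. The score below is an
    illustrative weighting, not the AWSCS scoring function.
    """
    def score(flow):
        _, response_time_ms, availability = flow
        return response_time_ms / max(availability, 1e-9)
    return min(flows, key=score)

# Hypothetical measurements for three flows with the same functionality
# but different problem-solving strategies.
candidates = [
    ("sequential_flow", 420.0, 0.99),
    ("parallel_flow", 180.0, 0.95),
    ("cached_flow", 90.0, 0.80),
]
```

    The paper's observation that the applied load matters corresponds here to the measurements changing under load, which can flip the ranking between strategies.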

  3. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network.

    PubMed

    Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-11-01

    To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable, secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data transferred consistently using the data dictionary, while 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
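    The format-misalignment reduction reported above rests on validating every transferred record against a shared data dictionary and routing the non-conforming remainder to human curation. A minimal sketch in Python; the dictionary fields and sample records are hypothetical illustrations, not the network's actual schema:

```python
def validate(records, data_dictionary):
    """Split records into those consistent with the shared data
    dictionary and those needing human curation.

    data_dictionary maps field name -> expected Python type.
    """
    consistent, needs_curation = [], []
    for record in records:
        ok = all(isinstance(record.get(field), expected)
                 for field, expected in data_dictionary.items())
        (consistent if ok else needs_curation).append(record)
    return consistent, needs_curation

# Hypothetical dictionary and records, for illustration only.
dictionary = {"patient_id": str, "age_months": int, "weight_kg": float}
records = [
    {"patient_id": "P1", "age_months": 18, "weight_kg": 11.2},
    {"patient_id": "P2", "age_months": "eighteen", "weight_kg": 10.0},
]
good, bad = validate(records, dictionary)
```

    Running the same validation logic inside every site's VM is what makes the transfers syntactically consistent regardless of each hospital's source database.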

  4. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    PubMed

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system that implements different approaches for the automatic composition of Web services and also executes the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted to the system, is an important factor in determining which approach to Web service composition can achieve the best performance in production.

  5. Web services as applications' integration tool: QikProp case study.

    PubMed

    Laoui, Abdel; Polyakov, Valery R

    2011-07-15

    Web services are a new technology that enables the integration of applications running on different platforms, primarily by using XML to enable communication among different computers over the Internet. A large number of applications were designed as stand-alone systems before the concept of Web services was introduced, and it is a challenge to integrate them into larger computational networks. A generally applicable method of wrapping stand-alone applications into Web services was developed and is described. To test the technology, it was applied to QikProp for DOS (Windows). Although the performance of the application did not change when it was delivered as a Web service, this form of deployment offered several advantages, such as simplified and centralized maintenance, a smaller number of licenses, and practically no training for the end user. Because almost any legacy application can be wrapped as a Web service using the described approach, this form of delivery may be recommended as a global alternative to traditional deployment solutions. Copyright © 2011 Wiley Periodicals, Inc.
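    The wrapping idea generalizes: a stand-alone command-line program can be exposed as a service by a thin layer that spawns the process, feeds it the request payload, and returns its output. A minimal sketch in Python, using the POSIX `sort` utility as a stand-in for a legacy executable such as QikProp (assuming `sort` is available on the host):

```python
import subprocess

def wrap_legacy_tool(argv, request_text):
    """Run a stand-alone command-line tool as if it were a service
    call: stdin carries the request, stdout carries the response."""
    completed = subprocess.run(
        argv, input=request_text, capture_output=True, text=True, check=True)
    return completed.stdout

# POSIX `sort` stands in here for a wrapped legacy executable.
response = wrap_legacy_tool(["sort"], "banana\napple\ncherry\n")
```

    In a real deployment, a web-service front end would receive the request over HTTP, call a wrapper like this on the server hosting the single licensed copy, and return the stdout as the response, which is where the centralized-maintenance and licensing advantages come from.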

  6. Using USNO's API to Obtain Data

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.; Pozniak, Daniel; Punnoose, Tarun

    2015-01-01

    The U.S. Naval Observatory (USNO) is in the process of modernizing its publicly available web services into APIs (Application Programming Interfaces). Services configured as APIs offer greater flexibility to the user and allow greater usage. Depending on the particular service, users who implement our APIs will receive either a PNG (Portable Network Graphics) image or data in JSON (JavaScript Object Notation) format. This raw data can then be embedded in third-party web sites or in apps. Part of the USNO's mission is to provide astronomical and timing data to government agencies and the general public. To this end, the USNO provides accurate computations of astronomical phenomena such as dates of lunar phases, rise and set times of the Moon and Sun, and lunar and solar eclipse times. Users who navigate to our web site and select one of our 18 services are prompted to complete a web form, specifying parameters such as date, time, location, and object. Many of our services work for years between 1700 and 2100, meaning that past, present, and future events can be computed. Upon form submission, our web server processes the request, computes the data, and outputs it to the user. Over recent years, the use of the web by the general public has vastly changed. In response to this, the USNO is modernizing its web-based data services. This includes making our computed data easier to embed within third-party web sites as well as more easily queryable from apps running on tablets and smart phones. To facilitate this, the USNO has begun converting its services into APIs. In addition to the existing web forms for the various services, users are able to make direct URL requests that return either an image or numerical data. To date, four of our web services have been configured to run with APIs. Two are image-producing services: "Apparent Disk of a Solar System Object" and "Day and Night Across the Earth." 
Two API data services are "Complete Sun and Moon Data for One Day" and "Dates of Primary Phases of the Moon." Instructions for how to use our API services as well as examples of their use can be found on one of our explanatory web pages and will be discussed here.
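The direct-URL request pattern the abstract describes can be sketched in a few lines. The base URL and parameter names below are hypothetical placeholders, not the USNO's actual endpoints (those are documented on the USNO's explanatory pages); the sketch only illustrates the workflow of building a URL request and decoding a JSON response.

```python
import json
from urllib.parse import urlencode

# Hypothetical base URL and parameter names -- the real USNO API
# endpoints are documented on the USNO's explanatory web pages.
BASE_URL = "https://api.example.usno.navy.mil/moon/phase"

def build_request(date: str, numphases: int) -> str:
    """Construct a direct URL request of the kind API services accept."""
    query = urlencode({"date": date, "nump": numphases})
    return f"{BASE_URL}?{query}"

# A client would fetch this URL and decode the JSON response, e.g.:
sample_response = '{"phasedata": [{"phase": "Full Moon", "date": "2015-01-05"}]}'
data = json.loads(sample_response)

print(build_request("2015-01-01", 4))
print(data["phasedata"][0]["phase"])
```

Because the response is plain JSON rather than a rendered HTML page, the same request can serve a third-party web site, a tablet app, or a command-line script unchanged.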

  7. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.

    PubMed

    Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-09-23

SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remainder are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). 
SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantics, and semantic discovery) while addressing three technology limitations common in distributed service systems: i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel in establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, discovery servers to offer semantically rich search engines, clients to discover and invoke those resources, and providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs.

  8. WebGIS based on semantic grid model and web services

    NASA Astrophysics Data System (ADS)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS suffers from some prominent problems: for example, it cannot achieve interoperability among heterogeneous spatial databases, nor cross-platform data access. The appearance of Web Services and Grid technology has brought great change to the WebGIS field. Web Services provide interfaces that give sites the ability to share data and intercommunicate. The goal of Grid technology is to turn the Internet into one large supercomputer with which the overall sharing of computing resources, storage resources, data resources, information resources, knowledge resources, and expert resources can be efficiently realized. For WebGIS, however, this only achieves the physical connection of data and information, which is far from enough. Because they understand the world differently and follow different professional regulations, policies, and habits, experts in different fields reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity arises: the same concept can differ greatly across fields. A WebGIS that ignores this semantic heterogeneity will answer users' questions wrongly, or not at all. To solve this problem, this paper proposes and evaluates an effective method of combining the semantic grid and Web Services technology to develop WebGIS. 
We studied how to construct ontologies and how to combine Grid technology with Web Services and, through a detailed analysis of the computing characteristics and application model of distributed data, designed an ontology-driven WebGIS query system based on Grid technology and Web Services.

  9. Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems

    NASA Astrophysics Data System (ADS)

    Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn

    The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
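The core failure-analysis idea can be illustrated with a toy sketch. This is not the MoSCoE algorithm itself (which works over labeled transition systems); the sketch only shows the simpler intuition of abstracting each component service to the events it can supply, and reporting the first goal event no component covers as the cause of composition failure. The e-Library-style service names are hypothetical.

```python
from typing import Dict, Set, List, Optional

def find_failure_cause(goal: List[str],
                       components: Dict[str, Set[str]]) -> Optional[str]:
    """Return the first goal event that no component service provides,
    or None if every goal event is covered."""
    provided = set().union(*components.values()) if components else set()
    for event in goal:
        if event not in provided:
            return event
    return None

# Hypothetical e-Library example: the goal needs a 'deliver' event
# that neither component offers, so 'deliver' is the failure cause.
components = {"catalog": {"search", "reserve"}, "billing": {"charge"}}
cause = find_failure_cause(["search", "charge", "deliver"], components)
print(cause)  # -> deliver
```

Feedback of this kind lets the user either relax the goal (drop the uncovered event) or add a component that provides it, then iterate the composition, which is exactly the re-formulation loop the paper advocates.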

  10. WebGLORE: a Web service for Grid LOgistic REgression

    PubMed Central

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-01-01

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732
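The privacy-preserving principle behind WebGLORE — only aggregated local statistics leave each site — can be sketched as follows. This is a simplified stand-in, not WebGLORE's actual protocol (GLORE uses Newton-Raphson over exchanged gradient and Hessian summaries); the sketch uses plain gradient ascent so that each site contributes only a summed gradient vector, never its raw records.

```python
import math
from typing import List, Tuple

Site = List[Tuple[List[float], int]]  # (feature vector, 0/1 label) pairs

def local_gradient(beta: List[float], site: Site) -> List[float]:
    """Aggregate statistic computed locally: gradient of the logistic
    log-likelihood over this site's records only."""
    grad = [0.0] * len(beta)
    for x, y in site:
        p = 1.0 / (1.0 + math.exp(-sum(b * xi for b, xi in zip(beta, x))))
        for j, xi in enumerate(x):
            grad[j] += (y - p) * xi
    return grad

def fit_global(sites: List[Site], dim: int, lr=0.1, steps=200) -> List[float]:
    """Trusted server: sum the per-site aggregates and update the model."""
    beta = [0.0] * dim
    for _ in range(steps):
        total = [0.0] * dim
        for site in sites:            # raw data never leaves a site
            g = local_gradient(beta, site)
            total = [t + gi for t, gi in zip(total, g)]
        beta = [b + lr * t for b, t in zip(beta, total)]
    return beta

# Two sites with the same simple pattern: the label follows feature 2.
site_a = [([1.0, 0.0], 0), ([1.0, 1.0], 1)]
site_b = [([1.0, 0.0], 0), ([1.0, 1.0], 1)]
beta = fit_global([site_a, site_b], dim=2)
print(beta[1] > 0)  # positive weight learned for the predictive feature
```

In the real service these per-iteration aggregates would travel over HTTPS, which is what limits the trusted server's view to summary statistics rather than patient-level data.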

  11. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. 
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
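The density-surface step that such a geoprocessing service performs can be sketched minimally: geocoded case locations are binned onto a regular grid, yielding the raster from which a density map is rendered. This is an illustration of the general technique, not the paper's REST services, and the case coordinates below are hypothetical.

```python
from typing import List, Tuple

def density_grid(points: List[Tuple[float, float]],
                 xmin: float, ymin: float, xmax: float, ymax: float,
                 nx: int, ny: int) -> List[List[int]]:
    """Count cases per cell of an nx-by-ny grid over the bounding box."""
    grid = [[0] * nx for _ in range(ny)]
    dx, dy = (xmax - xmin) / nx, (ymax - ymin) / ny
    for x, y in points:
        # clamp points on the max edge into the last cell
        i = min(int((x - xmin) / dx), nx - 1)
        j = min(int((y - ymin) / dy), ny - 1)
        grid[j][i] += 1
    return grid

# Hypothetical geocoded cases in a unit square, binned on a 2x2 grid.
cases = [(0.1, 0.1), (0.2, 0.3), (0.8, 0.9)]
grid = density_grid(cases, 0.0, 0.0, 1.0, 1.0, nx=2, ny=2)
print(grid)  # -> [[2, 0], [0, 1]]
```

A production service would typically smooth these counts with a kernel and normalize by population or area before styling the raster for display; the REST endpoint then only needs to expose the bounding box and resolution as request parameters.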

  12. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. 
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  13. Modern Technologies aspects for Oceanographic Data Management and Dissemination : The HNODC Implementation

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.

    2009-04-01

The development of new technologies aimed at enhancing web applications with dynamic data access was also the starting point for the development of geospatial web applications. By means of these technologies, web applications gained the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as Web Services enables web applications to interoperate, i.e., to process requests from each other over a network. Throughout the oceanographic community in particular, modern geographical information systems based on geospatial Web Services are now being developed, or will be in the near future, with capabilities for managing the information entirely through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with Web Services built from open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centers (IOC/IODE), owns a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed storing more than 300,000 station data records covering physical, chemical, and biological oceanographic information. A modern web application that allows end users worldwide to explore and navigate the HNODC data through an interface capable of presenting geographical representations of the geo-information is today a fact. 
The application is built from state-of-the-art software components and tools: • Geospatial and non-spatial Web Services mechanisms. • Geospatial open source tools for the creation of dynamic geographical representations. • Communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application can interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web Services systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema because geospatial and non-geospatial data are kept separate. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial data and metadata via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating, and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat stand as the web service middle layers. • Apache Axis, with its embedded implementation of SOAP ("Simple Object Access Protocol"), acts as the non-spatial Web Services mechanism; these modules of the platform are still under development, but their implementation will be completed in the near future. • A new web user interface for the end user, based on an enhanced and customized version of the Mapbender GUI, a powerful Web Services client. 
For HNODC, the interoperability of Web Services is the big advantage of the developed platform, since it will be able to act in the future both as a provider and as a consumer of Web Services: • as a data-product provider for external SOA platforms, or • as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great example of data-management integration and dissemination through such technologies is the European Union's research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with Web Services. Furthermore, once the Web Processing Service (WPS) technology is mature enough and applicable for deployment, the derived data products will be able to carry any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
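A map client such as the one described issues standard OGC WMS GetMap requests against the map server (UMN MapServer or GeoServer). The host and layer name below are hypothetical, but the query parameters are the standard WMS 1.1.1 GetMap parameters, so the sketch shows the shape of the requests that interoperability rests on.

```python
from urllib.parse import urlencode

def getmap_url(base: str, layer: str, bbox: tuple,
               width: int, height: int) -> str:
    """Build a standard OGC WMS 1.1.1 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

# Hypothetical endpoint and layer; bbox roughly covers the Aegean Sea.
url = getmap_url("https://example.hnodc.gr/wms", "stations",
                 (19.0, 34.0, 30.0, 42.0), 800, 600)
print(url)
```

Because any WMS-compliant client can issue the same request, the rendered station layer is consumable by Mapbender, desktop GIS tools, or external SOA platforms alike.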

  14. 29 CFR 4.150 - Employee coverage, generally.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Office of the Secretary of Labor LABOR STANDARDS FOR FEDERAL SERVICE CONTRACTS Application of the McNamara-O'Hara Service Contract Act Employees Covered by the Act § 4.150 Employee coverage, generally. The Act, in section 2(b), makes it clear that its provisions apply generally to all service employees...

  15. 78 FR 8456 - Coverage of Certain Preventive Services Under the Affordable Care Act

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-06

    ... 2713 of the Public Health Service Act requires coverage without cost sharing of certain preventive... Requirement to Cover Contraceptive Services Without Cost Sharing Under Section 2713 of the Public Health..., non-stock, public benefit, and similar types of corporations. However, for this purpose an...

  16. 45 CFR 147.130 - Coverage of preventive health services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Coverage of preventive health services. 147.130 Section 147.130 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS HEALTH INSURANCE REFORM REQUIREMENTS FOR THE GROUP AND INDIVIDUAL HEALTH INSURANCE MARKETS...

  17. 45 CFR 147.130 - Coverage of preventive health services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Coverage of preventive health services. 147.130 Section 147.130 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS HEALTH INSURANCE REFORM REQUIREMENTS FOR THE GROUP AND INDIVIDUAL HEALTH INSURANCE MARKETS...

  18. 42 CFR 403.720 - Conditions for coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL... of Participation, and Payment § 403.720 Conditions for coverage. Medicare covers services furnished... 501(c)(3) of the Internal Revenue Code of 1986 and is exempt from taxes under section 501(a). (2) Is...

  19. 42 CFR 438.210 - Coverage and authorization of services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... requested, be made by a health care professional who has appropriate clinical expertise in treating the... 42 Public Health 4 2011-10-01 2011-10-01 false Coverage and authorization of services. 438.210 Section 438.210 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN...

  20. 42 CFR 438.210 - Coverage and authorization of services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... requested, be made by a health care professional who has appropriate clinical expertise in treating the... 42 Public Health 4 2010-10-01 2010-10-01 false Coverage and authorization of services. 438.210 Section 438.210 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN...

  1. 42 CFR 416.40 - Condition for coverage-Compliance with State licensure law.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... licensure law. 416.40 Section 416.40 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM AMBULATORY SURGICAL SERVICES Specific Conditions for Coverage § 416.40 Condition for coverage—Compliance with State licensure law. The ASC must comply...

  2. Access to Routine Immunization: A Comparative Analysis of Supply-Side Disparities between Northern and Southern Nigeria

    PubMed Central

    Eboreime, Ejemai; Abimbola, Seye; Bozzani, Fiammetta

    2015-01-01

Background The available data on routine immunization in Nigeria show a disparity in coverage between Northern and Southern Nigeria, with the former performing worse. The effect of socio-cultural differences on health-seeking behaviour has been identified in the literature as the main cause of the disparity. Our study analyses the role of supply-side determinants, particularly access to services, in causing these disparities. Methods Using routine government data, we compared supply-side determinants of access in two Northern states with two Southern states. The states were identified using criteria-based purposive selection such that the comparisons were made between a low-coverage state in the South and a low-coverage state in the North as well as between a high-coverage state in the South and a high-coverage state in the North. Results Human resources and commodities at routine immunization service delivery points were generally insufficient for service delivery in both geographical regions. While disparities were evident between individual states irrespective of regional location, compared to the South, residents in Northern Nigeria were more likely to have vaccination service delivery points located within a 5 km radius of their settlements. Conclusion Our findings suggest that regional supply-side disparities are not apparent, reinforcing the earlier reported socio-cultural explanations for disparities in routine immunization service uptake between Northern and Southern Nigeria. Nonetheless, improving routine immunization coverage requires available human resources and equitably distributed health facilities. PMID:26692215

  3. Access to Routine Immunization: A Comparative Analysis of Supply-Side Disparities between Northern and Southern Nigeria.

    PubMed

    Eboreime, Ejemai; Abimbola, Seye; Bozzani, Fiammetta

    2015-01-01

The available data on routine immunization in Nigeria show a disparity in coverage between Northern and Southern Nigeria, with the former performing worse. The effect of socio-cultural differences on health-seeking behaviour has been identified in the literature as the main cause of the disparity. Our study analyses the role of supply-side determinants, particularly access to services, in causing these disparities. Using routine government data, we compared supply-side determinants of access in two Northern states with two Southern states. The states were identified using criteria-based purposive selection such that the comparisons were made between a low-coverage state in the South and a low-coverage state in the North as well as between a high-coverage state in the South and a high-coverage state in the North. Human resources and commodities at routine immunization service delivery points were generally insufficient for service delivery in both geographical regions. While disparities were evident between individual states irrespective of regional location, compared to the South, residents in Northern Nigeria were more likely to have vaccination service delivery points located within a 5 km radius of their settlements. Our findings suggest that regional supply-side disparities are not apparent, reinforcing the earlier reported socio-cultural explanations for disparities in routine immunization service uptake between Northern and Southern Nigeria. Nonetheless, improving routine immunization coverage requires available human resources and equitably distributed health facilities.

  4. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.

  5. Technical Services and the World Wide Web.

    ERIC Educational Resources Information Center

    Scheschy, Virginia M.

    The World Wide Web and browsers such as Netscape and Mosaic have simplified access to electronic resources. Today, technical services librarians can share in the wealth of information available on the Web. One of the premier Web sites for acquisitions librarians is AcqWeb, a cousin of the AcqNet listserv. In addition to interesting news items,…

  6. The Roles of Technology in Primary HIV Prevention for Men Who Have Sex with Men.

    PubMed

    Sullivan, Patrick S; Jones, Jeb; Kishore, Nishant; Stephenson, Rob

    2015-12-01

    Men who have sex with men (MSM) are at disproportionate risk for HIV infection globally. The past 5 years have seen considerable advances in biomedical interventions to reduce the risk of HIV infection. To be impactful in reducing HIV incidence requires the rapid and expansive scale-up of prevention. One mechanism for achieving this is technology-based tools to improve knowledge, acceptability, and coverage of interventions and services. This review provides a summary of the current gap in coverage of primary prevention services, how technology-based interventions and services can address gaps in coverage, and the current trends in the development and availability of technology-based primary prevention tools for use by MSM. Results from agent-based models of HIV epidemics of MSM suggest that 40-50 % coverage of multiple primary HIV prevention interventions and services, including biomedical interventions like preexposure prophylaxis, will be needed to reduce HIV incidence among MSM. In the USA, current levels of coverage for all interventions, except HIV testing and condom distribution, fall well short of this target. Recent findings illustrate how technology-based HIV prevention tools can be used to provide certain kinds of services at much larger scale, with marginal incremental costs. A review of mobile apps for primary HIV prevention revealed that most are designed by nonacademic, nonpublic health developers, and only a small proportion of available mobile apps specifically address MSM populations. We are unlikely to reach the required scale of HIV prevention intervention coverage for MSM unless we can leverage technologies to bring key services to broad coverage for MSM. Despite an exciting pipeline of technology-based prevention tools, there are broader challenges with funding structures and sustainability that need to be addressed to realize the full potential of this emerging public health field.

  7. [Influenza vaccination coverage (2011-2014) in healthcare workers from two health departments of the Valencian Community and hospital services more vulnerable to the flu].

    PubMed

    Tuells, José; García-Román, Vicente; Duro-Torrijos, José Luis

    2018-04-05

Health care workers can transmit influenza to patients in health centers; their vaccination is therefore considered a first-order preventive measure. The objective of this study was to determine the coverage of vaccination against seasonal influenza among health professionals in two health departments of the Valencian Community (Torrevieja and Elx-Crevillent) in the 2011-12, 2012-13, and 2013-14 seasons. A cross-sectional descriptive study was carried out to determine influenza vaccination coverage through the Nominal Vaccine Registry (NVR) of the Conselleria de Sanitat de la Generalitat Valenciana. The services with the highest risk of contagion were identified through requests for PCR analysis in patients with suspected influenza during the 2013-14 season. A total of 2035 health professionals were included, who reached an average vaccination coverage of 27.2% in the 2013-14 season, showing an upward trend from the 2011-12 season. Significant differences were observed between professional categories, with practitioners presenting the lowest coverage. A total of 192 PCR requests were recorded in both departments. The services concentrating the greatest number of requests were internal medicine (n = 100), the emergency department (n = 37), the intensive care unit (n = 25), and pediatrics (n = 154); the influenza vaccination coverage of these services in the 2013-14 season was 27.0%, 32.3%, 34.3%, and 25.3%, respectively. Although it shows an upward trend, vaccination coverage is low among health care personnel. Nurses were the best-vaccinated group. It would be appropriate to implement immunization strategies aimed specifically at the services that, because of their activity, pose a greater risk to patients.

  8. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.
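geoknife itself is an R package that delegates this work to remote web processing servers; the Python sketch below only illustrates the core summarization it requests: averaging each timestep of a gridded dataset over the cells of an area of interest, weighted by overlap fraction. The grids and mask are made up for illustration.

```python
from typing import List

def area_weighted_mean(timesteps: List[List[List[float]]],
                       mask: List[List[float]]) -> List[float]:
    """Average each 2-D grid over cells, weighted by AOI overlap
    fraction (0.0 = outside the area of interest, 1.0 = fully inside)."""
    total_weight = sum(w for row in mask for w in row)
    series = []
    for grid in timesteps:
        acc = sum(v * w
                  for grow, mrow in zip(grid, mask)
                  for v, w in zip(grow, mrow))
        series.append(acc / total_weight)
    return series

# Two timesteps on a 2x2 grid; the AOI covers only the left column,
# so the right-column values (9.0) never enter the averages.
steps = [[[1.0, 9.0], [3.0, 9.0]], [[2.0, 9.0], [4.0, 9.0]]]
mask = [[1.0, 0.0], [1.0, 0.0]]
print(area_weighted_mean(steps, mask))  # -> [2.0, 3.0]
```

Performing this reduction server-side is what lets the workflow return a small time series instead of requiring the full gridded dataset to be downloaded and stored locally.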

  9. Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul

    Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.
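The scheduling idea can be sketched as a priority queue ordered by next-visit time, where a detected event promotes a page to an immediate revisit ahead of its static slot. This is a hypothetical illustration of the paper's approach; names and intervals are invented, and the MCRDR classifier is stood in for by a direct event call.

```python
import heapq

# Sketch: pages are revisited on a static schedule, but an event (e.g. one
# detected by an MCRDR-style document classifier) pushes the affected page
# to the front of the queue, overriding the schedule.

class MonitorScheduler:
    def __init__(self):
        self._queue = []  # heap of (next_visit_time, url)

    def schedule(self, url, when):
        """Static scheduling: revisit url at time `when`."""
        heapq.heappush(self._queue, (when, url))

    def on_event(self, url, now):
        """Event-driven override: revisit url immediately."""
        heapq.heappush(self._queue, (now, url))

    def next_due(self):
        """Pop the earliest-due (time, url) pair."""
        return heapq.heappop(self._queue)

sched = MonitorScheduler()
sched.schedule("http://example.com/a", when=60)   # static revisit in 60s
sched.schedule("http://example.com/b", when=120)  # static revisit in 120s
sched.on_event("http://example.com/b", now=0)     # event fires for page b
print(sched.next_due())  # (0, 'http://example.com/b') -- b jumps the queue
```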

  10. Innovations in communication technologies for measles supplemental immunization activities: lessons from Kenya measles vaccination campaign, November 2012

    PubMed Central

    Mbabazi, William B; Tabu, Collins W; Chemirmir, Caleb; Kisia, James; Ali, Nasra; Corkum, Melissa G; Bartley, Gene L

    2015-01-01

    Background To achieve a measles-free world, effective communication must be part of all elimination plans. The choice of communication approaches must be evidence-based, locally appropriate, interactive and community-owned. In this article, we document the innovative approach of using house visits supported by a web-enabled mobile phone application to create a real-time platform for adaptive management of supplemental measles immunization days in Kenya. Methods One thousand nine hundred and fifty-two Red Cross volunteers were recruited, trained and deployed to conduct house-to-house canvassing in 11 urban districts of Kenya. Three days before the campaigns, volunteers conducted house visits with a uniform approach and package of messages. All house visits were documented using a web-enabled mobile phone application (episurveyor®) that relayed the information collected in real time to all campaign management levels. During the campaigns, volunteers reported daily immunizations to their co-ordinators. Post-campaign house visits were also conducted within 4 days, to verify immunization of eligible children, assess information sources and detect adverse events following immunization. Results Fifty-six per cent of the 164 643 households visited said that they had heard about the planned 2012 measles vaccination campaign 1–3 days before the start dates. Twenty-five per cent of households would likely have missed the measles supplemental dose had they not been reassured by the house visit. Pre- and post-campaign reasons for refusal showed that targeted communication reduced misconceptions, fear of injections and trust in herbal remedies. Daily reporting of immunizations using mobile phones informed changes in service delivery plans for better immunization coverage. House visits were better remembered as information sources (70%) than traditional mass awareness channels such as megaphones (41%) and radio (37%).
Conclusions In high-density settlements, house-to-house visits are easier to conduct and reach further than traditional media approaches. Using mobile phones to document campaign processes and outputs provides real-time evidence for service delivery planning to improve immunization coverage. PMID:24920218

  11. 42 CFR 440.330 - Benchmark health benefits coverage.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Benchmark health benefits coverage. 440.330 Section 440.330 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...

  12. Health Care: Information on Coverage Choices for Servicemembers, Former Servicemembers, and Dependents

    DTIC Science & Technology

    2014-12-01

    drugs, rehabilitative and habilitative services and devices, laboratory services, preventive services and chronic disease management , and pediatric ...the Patient Protection and Affordable Care Act (PPACA) is based on age, income, or other factors. The Centers for Medicare & Medicaid Services (CMS...Services MEC minimum essential coverage PPACA Patient Protection and Affordable Care Act VA Department of Veterans Affairs This is a work of the U.S

  13. Dumbing Down the Net

    NASA Astrophysics Data System (ADS)

    Jamison, Mark A.; Hauge, Janice A.

    It is commonplace for sellers of goods and services to enhance the value of their products by paying extra for premium delivery service. For example, package delivery services such as Federal Express and the US Postal Service offer shippers a variety of delivery speeds and insurance programs. Web content providers such as Yahoo! and MSN Live Earth can purchase web-enhancing services from companies such as Akamai to speed the delivery of their web content to customers.

  14. Determinants of Corporate Web Services Adoption: A Survey of Companies in Korea

    ERIC Educational Resources Information Center

    Kim, Daekil

    2010-01-01

    Despite the growing interest and attention from Information Technology researchers and practitioners, empirical research on factors that influence an organization's likelihood of adoption of Web Services has been limited. This study identified the factors influencing Web Services adoption from the perspective of 151 South Korean firms. The…

  15. Web 2.0 Strategy in Libraries and Information Services

    ERIC Educational Resources Information Center

    Byrne, Alex

    2008-01-01

    Web 2.0 challenges libraries to change from their predominantly centralised service models with integrated library management systems at the hub. Implementation of Web 2.0 technologies and the accompanying attitudinal shifts will demand reconceptualisation of the nature of library and information service around a dynamic, ever changing, networked,…

  16. WIWS: a protein structure bioinformatics Web service collection.

    PubMed

    Hekkelman, M L; Te Beek, T A H; Pettifer, S R; Thorne, D; Attwood, T K; Vriend, G

    2010-07-01

    The WHAT IF molecular-modelling and drug design program is widely distributed in the world of protein structure bioinformatics. Although originally designed as an interactive application, its highly modular design and inbuilt control language have recently enabled its deployment as a collection of programmatically accessible web services. We report here a collection of WHAT IF-based protein structure bioinformatics web services: these relate to structure quality, the use of symmetry in crystal structures, structure correction and optimization, adding hydrogens and optimizing hydrogen bonds, and a series of geometric calculations. The freely accessible web services are based on the industry standard WS-I profile and the EMBRACE technical guidelines, and are available via both REST and SOAP paradigms. The web services run on a dedicated computational cluster; their function and availability are monitored daily.

  17. Using EMBL-EBI services via Web interface and programmatically via Web Services

    PubMed Central

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2015-01-01

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. PMID:25501941
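Programmatic access to analysis tools of this kind typically follows a submit/poll/fetch pattern: a job is submitted, its status is polled until it finishes, and the result is then retrieved. The sketch below shows that pattern in Python with the HTTP transport injected as a callable so the flow can be demonstrated offline; the endpoint paths and the fake transport are assumptions for illustration, not the documented EMBL-EBI API.

```python
import time

# Generic submit/poll/fetch flow for a REST job-dispatch service. The
# `/run`, `/status/`, and `/result/` paths are illustrative assumptions.

def run_job(fetch, base_url, params, poll_interval=1.0, max_polls=60):
    """Submit a job via `fetch`, poll until finished, return the result."""
    job_id = fetch("POST", base_url + "/run", params)
    for _ in range(max_polls):
        status = fetch("GET", base_url + "/status/" + job_id, None)
        if status == "FINISHED":
            return fetch("GET", base_url + "/result/" + job_id, None)
        if status in ("ERROR", "FAILURE", "NOT_FOUND"):
            raise RuntimeError("job %s ended with status %s" % (job_id, status))
        time.sleep(poll_interval)
    raise TimeoutError("job %s did not finish in time" % job_id)

# Offline demonstration: a fake transport that reports RUNNING twice
# before finishing, standing in for real HTTP requests.
def fake_fetch(method, url, params, _state={"polls": 0}):
    if url.endswith("/run"):
        return "job-1"
    if "/status/" in url:
        _state["polls"] += 1
        return "RUNNING" if _state["polls"] < 3 else "FINISHED"
    return "RESULT-DATA"

print(run_job(fake_fetch, "https://example.org/tools", {"sequence": "ACGT"},
              poll_interval=0.0))  # -> RESULT-DATA
```

In a real client, `fetch` would wrap an HTTP library call against the service's documented endpoints; the control flow is unchanged.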

  18. Improving Medicare coverage of psychological services for older Americans.

    PubMed

    Karlin, Bradley E; Humphreys, Keith

    2007-10-01

    Professional psychology's ability to meet older Americans' psychological needs and to simultaneously thrive as a profession will be closely tied to the federal Medicare program over the coming decades. Despite legislative changes in the 1980s providing professional autonomy to psychologists and expanding coverage for mental health services, Medicare coverage policies, reimbursement mechanisms, and organizational traditions continue to limit older Americans' access to psychological services. This article describes how psychologists can influence Medicare coverage policy. Specifically, the authors examine widely unrecognized policy processes and recent political developments and analyze the recent creation of a new Medicare counseling benefit, applying J. W. Kingdon's (1995) well-known model of policy change. These recent developments offer new opportunities for expanding Medicare coverage of psychological services, particularly in the areas of prevention, screening, and early intervention. The article provides an analysis to guide psychologists in engaging in strategic advocacy and incorporating psychological prevention and early intervention services into Medicare. As Medicare policy entrepreneurs, psychologists can improve the well-being of millions of Americans who rely on the national health insurance program and, in so doing, can help shape the future practice of psychology. Copyright 2007 APA, all rights reserved.

  19. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    PubMed

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
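The "multi-layered access" described above, from enumerating every term in an ontology down to the metadata of a single term, can be illustrated with a toy in-memory ontology; this is a conceptual sketch, not the NCBO Web services API, and the ontology content is invented.

```python
# Toy ontology: term id -> label and direct children. Invented for
# illustration of the two access layers (all terms vs. one term's metadata).

ontology = {
    "root":   {"label": "Entity", "children": ["cell", "organ"]},
    "cell":   {"label": "Cell",   "children": ["neuron"]},
    "organ":  {"label": "Organ",  "children": []},
    "neuron": {"label": "Neuron", "children": []},
}

def all_terms(onto, term_id="root"):
    """Coarse layer: depth-first walk yielding every reachable term id."""
    yield term_id
    for child in onto[term_id]["children"]:
        yield from all_terms(onto, child)

def term_metadata(onto, term_id):
    """Fine layer: metadata for one term (label and direct children)."""
    node = onto[term_id]
    return {"id": term_id, "label": node["label"], "children": node["children"]}

print(list(all_terms(ontology)))        # ['root', 'cell', 'neuron', 'organ']
print(term_metadata(ontology, "cell"))  # label and children of one term
```

A service-backed client would replace the dictionary lookups with calls to the ontology repository, but applications consume the same two shapes of response.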

  20. Volcview: A Web-Based Platform for Satellite Monitoring of Volcanic Activity and Eruption Response

    NASA Astrophysics Data System (ADS)

    Schneider, D. J.; Randall, M.; Parker, T.

    2014-12-01

    The U.S. Geological Survey (USGS), in cooperation with University and State partners, operates five volcano observatories that employ specialized software packages and computer systems to process and display real-time data coming from in-situ geophysical sensors and from near-real-time satellite sources. However, access to these systems, both inside and outside the observatory offices, is limited in some cases by factors such as software cost, network security, and bandwidth. Thus, a variety of Internet-based tools have been developed by the USGS Volcano Science Center to: 1) Improve accessibility to data sources for staff scientists across volcano monitoring disciplines; 2) Allow access for observatory partners and for after-hours, on-call duty scientists; 3) Provide situational awareness for emergency managers and the general public. Herein we describe VolcView (volcview.wr.usgs.gov), a freely available, web-based platform for display and analysis of near-real-time satellite data. Initial geographic coverage is of the volcanoes in Alaska, the Russian Far East, and the Commonwealth of the Northern Mariana Islands. Coverage of other volcanoes in the United States will be added in the future. Near-real-time satellite data from NOAA, NASA and JMA satellite systems are processed to create image products for detection of elevated surface temperatures and volcanic ash and SO2 clouds. VolcView uses HTML5 and the canvas element to provide image overlays (volcano location and alert status, annotation, and location information) and image products that can be queried to provide data values, location and measurement capabilities. Use over the past year during the eruptions of Pavlof, Veniaminof, and Cleveland volcanoes in Alaska by the Alaska Volcano Observatory, the National Weather Service, and the U.S. Air Force has reinforced the utility of shared situational awareness and has guided further development.
Planned enhancements include overlay of volcanic cloud trajectory and dispersion models, atmospheric temperature profiles, and incorporation of monitoring alerts from ground- and satellite-based algorithms. Challenges for future development include reducing the latency of satellite data reception and processing, and increasing the geographic coverage from polar-orbiting satellite platforms.
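The location-query capability described above reduces to georeferencing: mapping a clicked pixel in an image overlay back to geographic coordinates. A minimal sketch, assuming a simple equirectangular overlay with known bounding box (the projection and bounds are assumptions for illustration, not VolcView's actual implementation):

```python
# Map a pixel in a width x height overlay image to (lon, lat), assuming an
# equirectangular (plate carree) overlay with bounds west/south/east/north.
# Pixel (0, 0) is the top-left corner; north is the top edge.

def pixel_to_lonlat(px, py, width, height, west, south, east, north):
    lon = west + (px / width) * (east - west)
    lat = north - (py / height) * (north - south)
    return lon, lat

# A 100x100 overlay spanning 160E-170E, 50N-60N: the centre pixel maps
# to the centre of the bounding box.
print(pixel_to_lonlat(50, 50, 100, 100, 160.0, 50.0, 170.0, 60.0))  # (165.0, 55.0)
```

The inverse of this mapping places volcano markers and annotations onto the canvas; distance measurements then follow from the recovered coordinates.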
