Sample records for mapping services wms

  1. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate and available at 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
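
    Since the abstract describes access purely through the standard WMS interface, a minimal GetMap request illustrates how any client would pull the mosaic as an image. The endpoint URL and layer name below are placeholders, not the actual Global Mosaic service address.

    ```python
    # Minimal WMS 1.1.1 GetMap request sketched with the `requests` library.
    import requests

    WMS_URL = "https://example.org/wms"          # hypothetical endpoint
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "global_mosaic",               # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-180,-90,180,90",               # minx,miny,maxx,maxy
        "WIDTH": 1024,
        "HEIGHT": 512,
        "FORMAT": "image/jpeg",
    }

    resp = requests.get(WMS_URL, params=params, timeout=60)
    resp.raise_for_status()
    with open("mosaic.jpg", "wb") as f:
        f.write(resp.content)
    ```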

  2. eWaterCycle visualisation: combining the strengths of NetCDF and Web Map Service: ncWMS

    NASA Astrophysics Data System (ADS)

    Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.

    2016-12-01

    As a result of the eWaterCycle global hydrological forecast, we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server-side application capable of reading any NetCDF file written using the Climate and Forecasting (CF) conventions and making the data available as a Web Map Service (WMS). ncWMS automatically determines the available variables in a file and creates maps colored according to the data and a user-selected color scale. Cesium is a JavaScript 3D virtual globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the NetCDF data, to provide WMS image tiles, time series data and legend graphics to the Cesium-ncWMS web application. The user can simultaneously zoom in to the very high resolution forecast results anywhere in the world and get time series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with locally relevant information in any browser. See the visualisation live at forecast.ewatercycle.org
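
    As a rough illustration of the server side described above, the sketch below requests a time-stamped, colour-scaled map image from an ncWMS endpoint. The endpoint, layer name and colour-scale values are placeholders; COLORSCALERANGE, NUMCOLORBANDS and the boxfill style are ncWMS-specific extensions, used here on the assumption that the deployed server supports them.

    ```python
    # GetMap request against an ncWMS endpoint with the TIME dimension and
    # ncWMS-style colour-scale parameters (all values illustrative).
    import requests

    NCWMS_URL = "https://example.org/ncWMS/wms"   # hypothetical endpoint
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "discharge",                    # hypothetical CF variable
        "STYLES": "boxfill/rainbow",              # assumed ncWMS style name
        "SRS": "EPSG:4326",
        "BBOX": "-10,35,30,60",
        "WIDTH": 800, "HEIGHT": 500,
        "FORMAT": "image/png", "TRANSPARENT": "true",
        "TIME": "2016-12-01T00:00:00Z",           # forecast time step
        "COLORSCALERANGE": "0,5000",
        "NUMCOLORBANDS": 100,
    }
    png = requests.get(NCWMS_URL, params=params, timeout=60)
    png.raise_for_status()
    open("discharge.png", "wb").write(png.content)
    ```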

  3. KML Super Overlay to WMS Translator

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This translator is a server-based application that automatically generates the KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it can also generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
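
    A minimal sketch of the super-overlay idea follows: one KML node contains a GroundOverlay whose image is fetched through a WMS GetMap URL, plus region-activated NetworkLinks to four child quadrants that a KML client loads only when they become visible. The WMS endpoint, layer name and child file naming scheme are placeholders, not the translator's actual output.

    ```python
    def latlonaltbox(w, s, e, n):
        return (f"<LatLonAltBox><north>{n}</north><south>{s}</south>"
                f"<east>{e}</east><west>{w}</west></LatLonAltBox>")

    def kml_node(w, s, e, n, wms="https://example.org/wms", layer="mosaic"):
        getmap = (f"{wms}?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&LAYERS={layer}"
                  f"&STYLES=&SRS=EPSG:4326&BBOX={w},{s},{e},{n}"
                  f"&WIDTH=256&HEIGHT=256&FORMAT=image/jpeg")
        href = getmap.replace("&", "&amp;")          # escape for XML
        mx, my = (w + e) / 2, (s + n) / 2
        children = ""
        for cw, cs, ce, cn in [(w, s, mx, my), (mx, s, e, my),
                               (w, my, mx, n), (mx, my, e, n)]:
            # Each child link only activates once its Region is large enough
            # on screen, which is what keeps very large areas tractable.
            children += (
                "<NetworkLink>"
                f"<Region>{latlonaltbox(cw, cs, ce, cn)}"
                "<Lod><minLodPixels>192</minLodPixels></Lod></Region>"
                f"<Link><href>node_{cw}_{cs}_{ce}_{cn}.kml</href>"
                "<viewRefreshMode>onRegion</viewRefreshMode></Link></NetworkLink>")
        return ('<?xml version="1.0"?><kml xmlns="http://www.opengis.net/kml/2.2">'
                "<Document>"
                f"<Region>{latlonaltbox(w, s, e, n)}</Region>"
                "<GroundOverlay>"
                f"<Icon><href>{href}</href></Icon>"
                f"<LatLonBox><north>{n}</north><south>{s}</south>"
                f"<east>{e}</east><west>{w}</west></LatLonBox>"
                "</GroundOverlay>"
                f"{children}</Document></kml>")
    ```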

  4. Communicating and visualizing data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Roberts, Charles; Blower, Jon; Maso, Joan; Diaz, Daniel; Griffiths, Guy; Lewis, Jane

    2014-05-01

    The sharing and visualization of environmental data through OGC Web Map Services is becoming increasingly common. However, information about the quality of data is rarely presented. (In this presentation we consider mostly data uncertainty as a measure of quality, although we acknowledge that many other quality measures are relevant to the geoscience community.) In the context of the GeoViQua project (http://www.geoviqua.org) we have developed conventions and tools for using WMS to deliver data quality information. The "WMS-Q" convention describes how the WMS specification can be used to publish quality information at the level of datasets, variables and individual pixels (samples). WMS-Q requires no extensions to the WMS 1.3.0 specification, being entirely backward-compatible. (An earlier version of WMS-Q was published as OGC Engineering Report 12-160.) To complement the WMS-Q convention, we have also developed extensions to the OGC Symbology Encoding (SE) specification, enabling uncertain geoscience data to be portrayed using a variety of visualization techniques. These include contours, stippling, blackening, whitening, opacity, bivariate colour maps, confidence interval triangles and glyphs. There may also be more extensive applications of these methods beyond the visual representation of uncertainty. In this presentation we will briefly describe the scope of the WMS-Q and "extended SE" specifications and then demonstrate the innovations using open-source software based upon ncWMS (http://ncwms.sf.net). We apply the tools to a variety of datasets including Earth Observation data from the European Space Agency's Climate Change Initiative. The software allows uncertain raster data to be shared through Web Map Services, giving the user fine control over data visualization.

  5. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback

    PubMed Central

    Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed. There are approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users. End-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery. PMID:27861505
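
    The following sketch shows the general shape of such a query loop with scikit-learn: an SVM ranks candidate WMS layers by decision score, the user marks a few results as relevant or not, and the classifier is refit. The feature extraction from layer graphics (e.g. colour or texture histograms) and the feedback callback are assumed; this is not the authors' exact pipeline.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def feedback_search(features, seed_idx, seed_labels, ask_user, rounds=3, k=10):
        """features: (n_layers, n_dims) array; labels: 1 = relevant, 0 = not.
        seed_labels must contain both classes so the SVM can be fit."""
        labeled = dict(zip(seed_idx, seed_labels))
        for _ in range(rounds):
            X = features[list(labeled)]
            y = np.array(list(labeled.values()))
            clf = SVC(kernel="rbf").fit(X, y)
            scores = clf.decision_function(features)   # higher = more relevant
            ranking = np.argsort(-scores)
            top = [i for i in ranking if i not in labeled][:k]
            for i, is_relevant in ask_user(top):        # user feedback callback
                labeled[i] = int(is_relevant)
        return ranking
    ```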

  6. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback.

    PubMed

    Hu, Kai; Gui, Zhipeng; Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed. There are approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users. End-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery.

  7. High-Performance Tiled WMS and KML Web Server

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
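
    A sketch of the fixed request grid such a tiled WMS relies on: given the grid origin, the tile size in degrees and the requirement that a compliant GetMap BBOX covers exactly one cell, the bounding box maps directly to integer tile indices with no resampling. The origin and tile size below are illustrative, not the module's actual configuration.

    ```python
    def bbox_to_tile(minx, miny, maxx, maxy, origin=(-180.0, -90.0),
                     tile_deg=0.3515625):            # e.g. 360 deg / 1024 tiles
        ox, oy = origin
        col = round((minx - ox) / tile_deg)
        row = round((miny - oy) / tile_deg)
        # A grid-compliant request covers exactly one cell of the tile grid.
        assert abs((maxx - minx) - tile_deg) < 1e-9
        assert abs((maxy - miny) - tile_deg) < 1e-9
        return col, row

    print(bbox_to_tile(0.0, 0.0, 0.3515625, 0.3515625))   # -> (512, 256)
    ```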

  8. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
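
    As a small illustration of what the published MapServer WMS enables for downstream users, the sketch below connects to it with the OWSLib package, lists its layers and fetches a map image. OWSLib is not part of the tutorial described above (which uses MapLab), and the local endpoint URL is a placeholder.

    ```python
    from owslib.wms import WebMapService

    # Hypothetical URL of the locally published MapServer WMS.
    wms = WebMapService("http://localhost/cgi-bin/mapserv?map=health.map",
                        version="1.1.1")
    print(list(wms.contents))                        # layer names offered
    img = wms.getmap(layers=[list(wms.contents)[0]],
                     srs="EPSG:4326",
                     bbox=(-180, -90, 180, 90),
                     size=(800, 400),
                     format="image/png")
    open("health_map.png", "wb").write(img.read())
    ```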

  9. Web Based Rapid Mapping of Disaster Areas using Satellite Images, Web Processing Service, Web Mapping Service, Frequency Based Change Detection Algorithm and J-iView

    NASA Astrophysics Data System (ADS)

    Bandibas, J. C.; Takarada, S.

    2013-12-01

    Timely identification of areas affected by natural disasters is very important for successful rescue and effective emergency relief efforts. This research focuses on the development of a cost-effective and efficient system for identifying areas affected by natural disasters and efficiently distributing the information. The developed system is composed of three modules: the Web Processing Service (WPS), the Web Map Service (WMS) and the user interface provided by J-iView. WPS is an online system that provides computation, storage and data access services. In this study, the WPS module provides online access to the software implementing the developed frequency-based change detection algorithm for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data to be used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access to the satellite images which are used as inputs for land cover change detection. The user interface in this system is provided by J-iView, an online mapping system developed at the Geological Survey of Japan (GSJ). The three modules are seamlessly integrated into a single package using J-iView, which can rapidly generate a map of disaster areas that is instantaneously viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan. The developed system efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.
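
    The frequency-based algorithm itself is not described in the abstract, so the sketch below shows only a generic pixel-difference change map between two co-registered images retrieved over WMS (numpy and Pillow), as a stand-in for the change-detection step that the WPS module exposes. File names and the threshold are illustrative.

    ```python
    import numpy as np
    from PIL import Image

    def change_map(before_png, after_png, threshold=30):
        # Convert both scenes to greyscale and take the absolute difference.
        before = np.asarray(Image.open(before_png).convert("L"), dtype=np.int16)
        after = np.asarray(Image.open(after_png).convert("L"), dtype=np.int16)
        diff = np.abs(after - before)
        return (diff > threshold).astype(np.uint8) * 255   # 255 = changed pixel

    mask = change_map("pre_event.png", "post_event.png")
    Image.fromarray(mask).save("change_mask.png")
    ```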

  10. WMS Server 2.0

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian; Wood, James F.

    2012-01-01

    This software is a simple yet flexible server of raster map products, compliant with the Open Geospatial Consortium (OGC) Web Map Service (WMS) 1.1.1 protocol. The server is a full implementation of the OGC WMS 1.1.1 as a fastCGI client, using the Geospatial Data Abstraction Library (GDAL) for data access. The server can operate in a proxy mode, where all or part of the WMS requests are handled by a back server. The server has explicit support for a co-located tiled WMS, including rapid response to black (no-data) requests. It generates JPEG and PNG images, including 16-bit PNG. The GDAL back-end support allows great flexibility in data access. The server is a port to a Linux/GDAL platform from the original IRIX/IL platform. It is simpler to configure and use, and depending on the storage format used, it has better performance than other available implementations. The WMS Server 2.0 is a high-performance WMS implementation due to its fastCGI architecture. The use of a GDAL data back end allows for great flexibility. The configuration is relatively simple, based on a single XML file. It provides scaling and cropping, as well as blending of multiple layers based on layer transparency.

  11. Caching strategies for improving performance of web-based Geographic applications

    NASA Astrophysics Data System (ADS)

    Liu, M.; Brodzik, M.; Collins, J. A.; Lewis, S.; Oldenburg, J.

    2012-12-01

    The NASA Operation IceBridge mission collects airborne remote sensing measurements to bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) mission and the upcoming ICESat-2 mission. The IceBridge Data Portal from the National Snow and Ice Data Center provides an intuitive web interface for accessing IceBridge mission observations and measurements. Scientists and users usually do not have knowledge about the individual campaigns but are interested in data collected in a specific place. We have developed a high-performance map interface to allow users to quickly zoom to an area of interest and see any Operation IceBridge overflights. The map interface consists of two layers: a base map layer on which the user can pan and zoom, and a flight-line layer overlaid on the base layer that shows all the campaign missions that intersect the current map view. The user can click on the flight campaigns and download the data as needed. The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. Web Feature Service (WFS) provides an interface allowing requests for geographical features across the web using platform-independent calls. OpenLayers provides vector support (points, polylines and polygons) to build a WMS/WFS client for displaying both layers on the screen. MapServer, an open-source development environment for building spatially enabled internet applications, serves the WMS and WFS spatial data to OpenLayers. Early releases of the portal displayed unacceptably poor load-time performance for flight lines and the base map tiles. This issue was caused by long response times from the map server in generating all map tiles and flight-line vectors. We resolved the issue by implementing various caching strategies on top of the WMS and WFS services, including the use of Squid (www.squid-cache.org) to cache frequently used content. Our presentation includes the architectural design of the application and how we use OpenLayers, WMS and WFS with Squid to build a responsive web application capable of efficiently displaying geospatial data and allowing the user to quickly interact with the displayed information. We describe the design, implementation and performance improvement of our caching strategies, and the tools and techniques developed to assist our data caching strategies.
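
    The principle behind the caching layer can be sketched as follows: identical WMS GetMap tile requests are served from a local store instead of hitting MapServer again. The portal itself placed Squid in front of the services; this in-process file cache only illustrates the same idea.

    ```python
    import hashlib, os, requests

    CACHE_DIR = "tile_cache"
    os.makedirs(CACHE_DIR, exist_ok=True)

    def get_tile(wms_url, params):
        # Derive a stable cache key from the normalized request parameters.
        key = hashlib.sha1(repr(sorted(params.items())).encode()).hexdigest()
        path = os.path.join(CACHE_DIR, key + ".png")
        if os.path.exists(path):                      # cache hit
            return open(path, "rb").read()
        resp = requests.get(wms_url, params=params, timeout=30)
        resp.raise_for_status()
        with open(path, "wb") as f:                   # cache miss: store for reuse
            f.write(resp.content)
        return resp.content
    ```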

  12. BingEO: Enable Distributed Earth Observation Data for Environmental Research

    NASA Astrophysics Data System (ADS)

    Wu, H.; Yang, C.; Xu, Y.

    2010-12-01

    Our planet is facing great environmental challenges including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena in support of critical decision making. These models not only challenge our computing technology, but also challenge us to meet huge demands for earth observation data. Through various policies and programs, open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing of and access to these resources call for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences including environmental research. Based on the Microsoft Bing Search Engine and Bing Maps, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; and 4) provide a Bing Maps-based Web application to visualize the data on a high-quality and easy-to-manipulate map platform and enable users to select the best data layers online. Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently and economically in earth science applications.

  13. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: WMS requests are made for 256x256 tiles for fixed areas and zoom levels; a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; and the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or graphs and combined with the WMS pre-rendered images and text in a Flex application to provide a sophisticated, user impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.

  14. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services: improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards; achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications: Web Coverage Service (WCS) -- data; Web Map Service (WMS) -- pictures of data; Web Map Tile Service (WMTS) -- pictures of data tiles; Styled Layer Descriptors (SLD) -- rendered styles.

  15. Providing Internet Access to High-Resolution Mars Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  16. Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the use of the WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.

  17. The interoperability skill of the Geographic Portal of the ISPRA - Geological Survey of Italy

    NASA Astrophysics Data System (ADS)

    Pia Congi, Maria; Campo, Valentina; Cipolloni, Carlo; Delogu, Daniela; Ventura, Renato; Battaglini, Loredana

    2010-05-01

    The Geographic Portal of the Geological Survey of Italy (ISPRA), available at http://serviziogeologico.apat.it/Portal, was planned according to the standard criteria of the INSPIRE directive. ArcIMS services and, at the same time, WMS and WFS services have been realized to satisfy different clients. For each database and web service, the metadata have been written in agreement with ISO 19115. The management architecture of the portal allows it to encode client input and output requests both in ArcXML and in GML. The web applications and web services have been realized for each database owned by the Land Protection and Georesources Department, concerning the geological maps at the 1:50,000 (CARG Project) and 1:100,000 scales, the IFFI landslide inventory, the boreholes under Law 464/84, the large-scale geological map and all the raster-format maps. The portal published thus far is at the experimental stage but will reach its final version through the development of a new graphical interface. The WMS and WFS services, including metadata, will be re-designed. The validity of the methodology and of the applied standards allows us to look ahead to further developments. In addition, it must be borne in mind that the capacity of the new geological standard language (GeoSciML), which is already incorporated in the web services deployed, will allow a better display and query of the geological data according to interoperability. The characteristics of geological data demand, for cartographic mapping, specific libraries of symbols not yet available in a WMS service. This is another aspect regarding standards for geological information. Therefore, the following have been carried out so far: a library of geological symbols to be used for printing, with a sketch of system colors, and a library for displaying data on screen, which almost completely solves the problems of point and area coverage data (also directed) but still introduces problems for linear data (solutions: ArcIMS services from ArcMap projects or a specific SLD implementation for WMS services); an update of the "Guidelines for the supply of geological data", which will be published shortly; and the official involvement of the Geological Survey of Italy in the IUGS-CGI working group for the processing and experimentation of the new GeoSciML language with the WMS/WFS services. The availability of geographic information occurs through metadata that can be distributed online so that search engines can find them through specialized searches. The metadata collected in catalogs are structured according to a standard (ISO 19135). The catalogs are a 'common' interface to locate, view and query data and metadata services, web services and other resources. While working in a growing sector of environmental knowledge, the focus is to gather the participation of other subjects that can contribute to enriching the available informative content, so as to arrive at a real portal of national interest, especially in case of disaster management.

  18. Communicating data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin

    2013-04-01

    The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.

  19. Development of WMS Capabilities to Support NASA Disasters Applications and App Development

    NASA Astrophysics Data System (ADS)

    Bell, J. R.; Burks, J. E.; Molthan, A.; McGrath, K. M.

    2013-12-01

    During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both the Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  20. Development of WMS Capabilities to Support NASA Disasters Applications and App Development

    NASA Technical Reports Server (NTRS)

    Bell, Jordan R.; Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2013-01-01

    During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both the Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  1. The iMars WebGIS - Spatio-Temporal Data Queries and Single Image Map Web Services

    NASA Astrophysics Data System (ADS)

    Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Muller, Jan-Peter; van Gasselt, Stephan; Sidiropoulos, Panagiotis; Lanz-Kroechert, Julia

    2017-04-01

    Introduction: Web-based planetary image dissemination platforms usually show outline coverages of the data and offer querying for metadata as well as preview and download, e.g. the HRSC Mapserver (Walter & van Gasselt, 2014). Here we introduce a new approach for a system dedicated to change detection by simultaneous visualisation of single-image time series in a multi-temporal context. While the usual form of presenting multi-orbit datasets is to merge the data into a larger mosaic, we want to stay with the single image as an important snapshot of the planetary surface at a specific time. In the context of the EU FP-7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the set of high precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs. Additionally we make use of the existing bundle-adjusted HRSC single images available at the PDS archives. A prototype demonstrating the presented features is available at http://imars.planet.fu-berlin.de. Multi-temporal database: In order to locate multiple coverage of images and select images based on spatio-temporal queries, we converge available coverage catalogs for various NASA imaging missions into a relational database management system with geometry support. We harvest available metadata entries during our processing pipeline using the Integrated Software for Imagers and Spectrometers (ISIS) software. Currently, this database contains image outlines from the MGS/MOC, MRO/CTX and MO/THEMIS instruments with imaging dates ranging from 1996 to the present. For the MEx/HRSC data, we already maintain a database which we automatically update with custom software based on the VICAR environment. Web Map Service with time support: The MapServer software is connected to the database and provides Web Map Services (WMS) with time support based on the START_TIME image attribute. It allows temporal WMS GetMap requests by setting additional TIME parameter values in the request. The values for the parameter represent an interval defined by its lower and upper bounds. As the WMS time standard only supports one time variable, only the start times of the images are considered. If no time values are submitted with the request, the full time range of all images is assumed as the default. Dynamic single image WMS: To compare images from different acquisition times at sites of multiple coverage, we have to load every image as a single WMS layer. Due to the vast amount of single images we need a way to set up the layers dynamically - the map server does not know the images to be served beforehand. We use the MapScript interface to dynamically access MapServer's objects and configure the file name and path of the requested image in the map configuration. The layers are created on the fly, each representing only one single image. On the frontend side, the vendor-specific WMS request parameter (PRODUCTID) has to be appended to the regular set of WMS parameters. The request is then passed on to the MapScript instance. Web Map Tile Cache: In order to speed up access for the WMS requests, a MapCache instance has been integrated into the pipeline. As it is not aware of the available PDS product IDs which will be queried, the PRODUCTID parameter is configured as an additional dimension of the cache. The WMS request is received by the Apache webserver configured with the MapCache module. If the tile is available in the tile cache, it is immediately committed to the client. If not available, the tile request is forwarded to Apache and the MapScript module. The Python script intercepts the WMS request and extracts the product ID from the parameter chain. It loads the layer object from the map file and appends the file name and path of the inquired image. After some possible further image processing inside the script (stretching, color matching), the request is submitted to the MapServer backend which in turn delivers the response back to the MapCache instance. Web frontend: We have implemented a web-GIS frontend based on various OpenLayers components. The basemap is a global color-hillshaded HRSC bundle-adjusted DTM mosaic with a resolution of 50 m per pixel. The new bundle-block-adjusted quadrangle mosaics of the MC-11 quadrangle, both image and DTM, are included with opacity slider options. The layer user interface has been adapted on the basis of the ol3-layerswitcher and extended with foldable and switchable groups, layer sorting (by resolution, by time and alphabetically) and reordering (drag-and-drop). A collapsible time panel accommodates a time slider interface where the user can filter the visible data by a range of Mars or Earth dates and/or by solar longitudes. The visualisation of time series of single images is controlled by a specific toolbar enabling the workflow of image selection (by point or bounding box), dynamic image loading and playback of single images in a video player-like environment. During a stress-test campaign we could demonstrate that the system is capable of serving up to 10 simultaneous users on its current lightweight development hardware. It is planned to relocate the software to more powerful hardware by the time of this conference. Conclusions/Outlook: The iMars webGIS is an expert tool for the detection and visualization of surface changes. We demonstrate a technique to dynamically retrieve and display single images based on the time-series structure of the data. Together with the multi-temporal database and its MapServer/MapCache backend it provides a stable and high-performance environment for the dissemination of the various iMars products. Acknowledgements: This research has received funding from the EU's FP7 Programme under iMars 607379 and by the German Space Agency (DLR Bonn), grant 50 QM 1301 (HRSC on Mars Express).
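
    A rough sketch of the dynamic single-image idea with Python MapScript follows: the PRODUCTID request parameter (named in the abstract) is mapped to a raster path, written into a pre-defined layer of a template mapfile, and the map is rendered for that request only. The mapfile name, layer name, product-to-path lookup and example product ID are all assumptions, not the iMars implementation.

    ```python
    import mapscript

    def render_single_image(product_id, bbox, size=(256, 256),
                            mapfile="imars.map", layer_name="single_image"):
        mapobj = mapscript.mapObj(mapfile)            # template map configuration
        layer = mapobj.getLayerByName(layer_name)
        # Resolve the product ID to a raster path (lookup scheme is assumed).
        layer.data = f"/data/acro/{product_id}.tif"
        layer.status = mapscript.MS_ON
        mapobj.setSize(*size)
        mapobj.setExtent(*bbox)                       # minx, miny, maxx, maxy
        image = mapobj.draw()                         # render just this image
        return image.getBytes()                       # image bytes per mapfile format

    png = render_single_image("B05_011596_1845", (-10.0, 20.0, -9.5, 20.5))
    ```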

  2. TRMM Precipitation Application Examples Using Data Services at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, D.; Teng, W.; Kempler, S.; Greene, M.

    2012-01-01

    Data services to support precipitation applications are important for maximizing the societal benefits of the NASA TRMM (Tropical Rainfall Measuring Mission) and the future GPM (Global Precipitation Measurement) missions. TRMM application examples using data services at the NASA GES DISC, including samples from users around the world, will be presented in this poster. Precipitation applications often require near-real-time support. The GES DISC provides such support through: 1) providing near-real-time precipitation products through TOVAS; 2) maps of current conditions for monitoring precipitation and its anomalies around the world; 3) a user-friendly tool (TOVAS) to analyze and visualize near-real-time and historical precipitation products; and 4) the GES DISC Hurricane Portal that provides near-real-time monitoring services for the Atlantic basin. Since the launch of TRMM, the GES DISC has developed data services to support precipitation applications around the world. In addition to the near-real-time services, other services include: 1) the user-friendly TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/); 2) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at the GES DISC, designed to be fast and easy to learn; 3) data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/), which provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; and 4) the Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml). The WMS is an interface that allows the use of data and enables clients to build customized maps with data coming from different networks.

  3. Interoperability And Value Added To Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.

  4. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also trying to provide rich web applications for ease of access, built with modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. The NASA World Wind Java SDK is used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS, and the license is the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software developed by NASA/JPL, and its website distributes many spacecrafts' data. Nowadays, open source software is an indispensable tool for integrating DARTS services.

  5. Providing Internet Access to High-Resolution Lunar Images

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2008-01-01

    The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.

  6. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
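
    A sketch of a population query in the style described above: a user-drawn polygon is POSTed to the estimation service, which returns the population within it for a chosen year. The endpoint URL, request fields and response keys below are illustrative placeholders, not the documented PES API.

    ```python
    import requests

    PES_URL = "https://example.org/population/estimate"   # hypothetical endpoint
    polygon = {   # GeoJSON polygon drawn by the user (lon/lat pairs)
        "type": "Polygon",
        "coordinates": [[[-74.3, 40.5], [-73.7, 40.5], [-73.7, 40.9],
                         [-74.3, 40.9], [-74.3, 40.5]]],
    }
    resp = requests.post(PES_URL, json={"geometry": polygon, "year": 2015},
                         timeout=30)
    resp.raise_for_status()
    print(resp.json())    # e.g. {"population": ..., "land_area_km2": ...}
    ```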

  7. Design and implementation of a cartographic client application for mobile devices using SVG Tiny and J2ME

    NASA Astrophysics Data System (ADS)

    Hui, L.; Behr, F.-J.; Schröder, D.

    2006-10-01

    The dissemination of digital geospatial data is now available on mobile devices such as PDAs (personal digital assistants) and smart-phones. Mobile devices which support J2ME (Java 2 Micro Edition) offer users and developers an open interface, which they can use to develop or download software according to their own demands. Currently a WMS (Web Map Service) can deliver not only traditional raster images, but also vector images. SVGT (Scalable Vector Graphics Tiny) is a subset of SVG (Scalable Vector Graphics), and because of its precise vector information, original styling and small file size, the SVGT format is well suited to geographic mapping, especially for mobile devices with limited network bandwidth. This paper describes the development of a cartographic client for mobile devices using SVGT and J2ME technology. A mobile device is simulated on a desktop computer for a series of tests with a WMS, for example sending requests and receiving the response data from the WMS and then displaying both vector- and raster-format images. The analysis and design of the system structure, such as the user interface and code structure, are discussed; the limitations of mobile devices must be taken into consideration for this application. The parsing of the XML document received from the WMS after the GetCapabilities request and the visual rendering of SVGT and PNG (Portable Network Graphics) images are important issues in writing the code. Finally, the client was successfully tested on Nokia S40/60 mobile phones.
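
    The capabilities-parsing step mentioned above can be illustrated as follows. The paper's client is written in J2ME; Python is used here only to show the same step of parsing a WMS 1.1.1 GetCapabilities response and collecting the named layers that can then be requested as SVG Tiny or PNG maps. The endpoint URL is a placeholder.

    ```python
    import requests
    import xml.etree.ElementTree as ET

    def list_layers(wms_url):
        resp = requests.get(wms_url, params={"SERVICE": "WMS", "VERSION": "1.1.1",
                                             "REQUEST": "GetCapabilities"},
                            timeout=30)
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        # Named layers carry a <Name> child; group layers without one are skipped.
        return [name.text for name in root.findall(".//Layer/Name")]

    print(list_layers("https://example.org/wms"))          # hypothetical endpoint
    ```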

  8. Use of NASA Near Real-Time and Archived Satellite Data to Support Disaster Assessment

    NASA Technical Reports Server (NTRS)

    McGrath, Kevin M.; Molthan, Andrew L.; Burks, Jason E.

    2014-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) Center partners with the NWS to provide near real-time data in support of a variety of weather applications, including disasters. SPoRT supports NASA's Applied Sciences Program: Disasters focus area by developing techniques that will aid the disaster monitoring, response, and assessment communities. SPoRT has explored a variety of techniques for utilizing archived and near real-time NASA satellite data. An increasing number of end users - such as the NWS Damage Assessment Toolkit (DAT) - access geospatial data via a Web Mapping Service (WMS). SPoRT has begun developing open-standard Geographic Information Systems (GIS) data sets via WMS to respond to end-user needs.

  9. Tile prediction schemes for wide area motion imagery maps in GIS

    NASA Astrophysics Data System (ADS)

    Michael, Chris J.; Lin, Bruce Y.

    2017-11-01

    Wide-area surveillance, traffic monitoring, and emergency management are just several of many applications benefiting from the incorporation of Wide-Area Motion Imagery (WAMI) maps into geographic information systems. Though the use of motion imagery as a GIS base map via the Web Map Service (WMS) standard is not a new concept, effectively streaming imagery is particularly challenging due to its large scale and the multidimensionally interactive nature of clients that use WMS. Ineffective streaming from a server to one or more clients can unnecessarily overwhelm network bandwidth and cause frustratingly large amounts of latency in visualization to the user. Seamlessly streaming WAMI through GIS requires good prediction to accurately guess the tiles of the video that will be traversed in the near future. In this study, we present an experimental framework for such prediction schemes by presenting a stochastic interaction model that represents a human user's interaction with a GIS video map. We then propose several algorithms by which the tiles of the stream may be predicted. Results collected both within the experimental framework and using human analyst trajectories show that, though each algorithm thrives under certain constraints, the novel Markovian algorithm yields the best results overall. Furthermore, we make the argument that the proposed experimental framework is sufficient for the study of these prediction schemes.
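
    A minimal sketch of a first-order Markov predictor over tile-to-tile transitions follows: observed moves between tiles are counted, and the most frequent transitions from the current tile are prefetched next. This illustrates the general idea only, not the exact algorithm evaluated in the paper.

    ```python
    from collections import defaultdict

    class MarkovTilePredictor:
        def __init__(self):
            self.counts = defaultdict(lambda: defaultdict(int))
            self.prev = None

        def observe(self, tile):
            """tile: (zoom, col, row) actually requested by the client."""
            if self.prev is not None:
                self.counts[self.prev][tile] += 1
            self.prev = tile

        def predict(self, k=3):
            """Return up to k tiles most likely to be requested next."""
            nxt = self.counts.get(self.prev, {})
            return sorted(nxt, key=nxt.get, reverse=True)[:k]

    p = MarkovTilePredictor()
    for t in [(3, 4, 4), (3, 5, 4), (3, 5, 5), (3, 5, 4), (3, 5, 5)]:
        p.observe(t)
    print(p.predict())   # tiles to prefetch after the last observed request
    ```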

  10. Development of a Dynamic Web Mapping Service for Vegetation Productivity Using Earth Observation and in situ Sensors in a Sensor Web Based Approach

    PubMed Central

    Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources. PMID:22574019

  11. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and displaying map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In this report a complex web mapping system, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive, is presented. The GIS web client has three basic tiers: (1) a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; (2) a tier of JavaScript objects implementing methods for handling NetCDF metadata, the task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services; and (3) a graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object and the WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on remote servers, and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interface of such popular desktop GIS applications as uDig and QuantumGIS. The web mapping system developed has shown its effectiveness in the process of solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.

  12. Data visualization in interactive maps and time series

    NASA Astrophysics Data System (ADS)

    Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe

    2014-05-01

    State-of-the-art data visualization has little in common with the plots and maps we used only a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, for creating custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS) and then display interactive graphs with a custom library based on the Data Driven Documents JavaScript library (D3.js). This time series application provides dynamic functionality such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.
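
    The time-series path described above relies on the TDS NetCDF Subset Service. A hedged sketch of a point time-series request follows; the server URL, dataset path and variable name are hypothetical, and the parameters shown reflect the commonly documented NCSS point-subsetting interface rather than the exact calls made by the Global Carbon Atlas.

    ```python
    # Sketch of a THREDDS NetCDF Subset Service (NCSS) point time-series request.
    # URL, dataset path and variable name are hypothetical placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    NCSS_URL = "http://example.org/thredds/ncss/carbon/fluxes.nc"  # hypothetical

    params = {
        "var": "co2_flux",                       # hypothetical variable name
        "latitude": "48.85",
        "longitude": "2.35",
        "time_start": "2000-01-01T00:00:00Z",
        "time_end": "2010-12-31T23:59:59Z",
        "accept": "csv",                         # ask NCSS for a CSV time series
    }

    with urlopen(f"{NCSS_URL}?{urlencode(params)}") as resp:
        csv_text = resp.read().decode("utf-8")

    print("\n".join(csv_text.splitlines()[:5]))  # header plus first few records
    ```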

  13. Access High Quality Imagery from the NOAA View Portal

    NASA Astrophysics Data System (ADS)

    Pisut, D.; Powell, A. M.; Loomis, T.; Goel, V.; Mills, B.; Cowan, D.

    2013-12-01

    NOAA curates a vast treasure trove of environmental data, but one that is sometimes not easily accessed, especially for education, outreach, and media purposes. Traditional data portals in NOAA require extensive knowledge of the specific names of observation platforms, models, and analyses, along with the nomenclature for variable outputs. A new website and web mapping service (WMS) from NOAA attempts to remedy these issues. The NOAA View data imagery portal provides a seamless entry point into data from across the agency: satellite, models, in-situ analysis, etc. The system provides the user with the ability to browse, animate, and download high-resolution (e.g., 4,000 x 2,000 pixel) imagery, Google Earth files, and even proxy data files. The WMS architecture also allows the resources to be ingested into other software systems or applications.

  14. mobilityRERC state of the science conference: Considerations for developing an evidence base for wheeled mobility and seating service delivery.

    PubMed

    Cohen, Laura; Greer, Nancy; Berliner, Elise; Sprigle, Stephen

    2013-11-01

    This article, developed as background content for discussion during the Mobility Rehabilitation Engineering Research Center State of the Science Conference, reviews research surrounding wheeled mobility and seating (WMS) service delivery, discusses the challenges of improving clinical decision-making, and examines research approaches used to study and improve health services in other practice areas that might be leveraged to develop the evidence base for WMS. The study design was a narrative literature review. An overview of existing research found general agreement across models of WMS service delivery but little high-quality evidence to support the recommended approaches and few studies of the relationship between service delivery steps and individual patient outcomes. The definition of successful clinical decision-making is different for different stakeholders. Clinical decision-making should incorporate the best available evidence along with patient values, preferences, circumstances, and clinical expertise. To advance the evidence base for WMS service delivery, alternatives to randomized controlled trials should be considered and reliable and valid outcome measures developed. Technological advances offer tremendous opportunities for individuals with complex rehabilitation technology needs. However, with ongoing scrutiny of WMS service delivery there is an increased need for evidence to support the clinical decision-making process and to support evidence-based coverage policies for WMS services and technologies. An evidence base for wheeled mobility and seating services is an important component of the clinical decision-making process. At present, there is little evidence regarding essential components of the wheeled mobility and seating evaluation or the relationship between the evaluation process and patient outcomes. Many factors can confound this relationship and present challenges to research in this area. All stakeholders (i.e. clinicians, rehabilitation technology suppliers, manufacturers, researchers, payers, policy makers, and wheelchair users) need to work together to develop and support an evidence base for wheeled mobility and seating service delivery.

  15. Exploring NASA Satellite Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Shen, S.; Zhao, P.; Gerasimov, I. V.; Vollmer, B.; Vicente, G. A.; Pham, L.

    2013-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for interpreting extreme events such as volcanic eruptions and dust storms. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by providing satellite data as 'images' with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We will present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from the Ozone Monitoring Instrument (OMI), or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting. The portal interface will connect to the backend services with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources.
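
    To make the 'data subsetting and reformatting' step concrete, the sketch below shows a generic WCS 2.0.1 KVP GetCoverage request for a spatial subset. It is only an illustration of the protocol: the endpoint, coverage identifier and the availability of a GeoTIFF encoding are assumptions, not details of the GES DISC prototype.

    ```python
    # Hedged sketch of a WCS 2.0.1 KVP GetCoverage request for a spatial subset.
    # Endpoint, coverage id and output format are hypothetical assumptions.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    WCS_ENDPOINT = "http://example.org/wcs"              # hypothetical endpoint

    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", "OMI_NO2_ColumnAmount"),          # hypothetical coverage id
        ("SUBSET", "Lat(30,50)"),                        # spatial subset in degrees
        ("SUBSET", "Long(-110,-80)"),
        ("FORMAT", "image/tiff"),
    ]

    with urlopen(f"{WCS_ENDPOINT}?{urlencode(params)}") as resp:
        with open("no2_subset.tif", "wb") as out:
            out.write(resp.read())
    ```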

  16. Exploiting Aura OMI Level 2 Data with High Resolution Visualization

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.

    2014-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for interpreting extreme events such as volcanic eruptions and dust storms. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide data along with 'images', including accurate pixel-level (Level 2) information, pixel coverage area delineation, and science-team-recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) strives to support the user community for NASA Earth science data as well as possible, for instance through Software-as-a-Service (SaaS) offerings. Here we present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This new visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls in the backend infrastructure. The functionality of the service allows users to select data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), define area-of-interest and temporal extents, zoom, pan, overlay, slide, and subset and reformat data. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources (such as the Global Imagery Browse Services (GIBS)).

  17. Development of Web Mapping Service Capabilities to Support NASA Disasters Applications/App Development

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2014-01-01

    During the last year, several significant disasters occurred, including Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery in the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions for providing the data in open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed end users to easily integrate the data into existing Decision Support Systems (DSS). Both the Tile Mapping Service (TMS) and the Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
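
    Both TMS and the XYZ tiling convention used by many web clients address tiles by zoom level, column and row computed in Web Mercator. As background to the tiled delivery mentioned above, the sketch below shows the standard tile-index math (not SPoRT-specific code); note that TMS counts rows from the bottom of the grid, whereas the XYZ convention counts from the top.

    ```python
    # Standard Web Mercator tile-index math behind XYZ/TMS tiled delivery.
    import math

    def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int):
        """Return the (x, y) tile indices (XYZ convention) containing a point."""
        n = 2 ** zoom
        x = int((lon_deg + 180.0) / 360.0 * n)
        lat_rad = math.radians(lat_deg)
        y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
        return x, y

    # Example: tile containing Moore, Oklahoma (approx. 35.34N, 97.49W) at zoom 12
    x, y = lonlat_to_tile(-97.49, 35.34, 12)
    print("XYZ tile:", x, y, "| TMS row:", 2 ** 12 - 1 - y)
    ```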

  19. Pragmatic service development and customisation with the CEDA OGC Web Services framework

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Stephens, Ag; Lowe, Dominic

    2010-05-01

    The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: the Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications, from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover: (1) the use of the Climate Science Modeling Language (CSML) in complex-feature-aware WMS, WCS and WFS services; (2) extending WMS to support applications with features specific to earth system science; and (3) a cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface, where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general-purpose GIS tools and yet vital for atmospheric science applications.
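
    COWS layers OGC request handling on top of Pylons; the stand-in below uses only the Python standard library (plain WSGI) to illustrate the general idea of a thin dispatch layer that routes OGC KVP requests to handlers. It is a sketch of the pattern, not COWS code, and the responses are placeholders.

    ```python
    # Illustrative WSGI sketch of a thin OGC request-dispatch layer (not COWS itself).
    from urllib.parse import parse_qs
    from wsgiref.simple_server import make_server

    def ogc_app(environ, start_response):
        # OGC KVP parameter names are case-insensitive, so normalise to upper case.
        query = {k.upper(): v[0] for k, v in parse_qs(environ.get("QUERY_STRING", "")).items()}
        request = query.get("REQUEST", "").lower()
        if request == "getcapabilities":
            body = b"<WMS_Capabilities/>"   # placeholder capabilities document
            start_response("200 OK", [("Content-Type", "text/xml")])
        elif request == "getmap":
            body = b""                      # a real service would render a map image here
            start_response("200 OK", [("Content-Type", "image/png")])
        else:
            body = b"Unsupported or missing REQUEST parameter"
            start_response("400 Bad Request", [("Content-Type", "text/plain")])
        return [body]

    if __name__ == "__main__":
        make_server("localhost", 8080, ogc_app).serve_forever()
    ```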

  20. WMS and WFS Standards Implementation of Weather Data

    NASA Astrophysics Data System (ADS)

    Armstrong, M.

    2005-12-01

    CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards. Currently, both a Web Mapping Service (WMS) and a Web Feature Service (WFS) are supported by CustomWeather. Supporting open geospatial standards has led to a number of positive changes in CustomWeather's internal processes, as well as in those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw modeling and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they utilize interoperability standards with existing data. Insight will be given into applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.

  1. OneGeology Web Services and Portal as a global geological SDI - latest standards and technology

    NASA Astrophysics Data System (ADS)

    Duffy, Tim; Tellez-Arenas, Agnes

    2014-05-01

    The global coverage of the OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and the issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Mapping Service standard version 1.3, core ISO 19115 metadata additions, and Version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ available IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simple lithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation when querying their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standards-based WMS (Web Mapping Service) from an available WWW server. This may be hosted within the geological survey itself, or by a neighbouring, regional or other institution that offers to serve the data for them, i.e. acts as a 'buddy' by providing the web-serving IT infrastructure. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together; it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf). The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.

  2. The Use of NASA near Real-time and Archived Satellite Data to Support Disaster Assessment

    NASA Technical Reports Server (NTRS)

    McGrath, Kevin M.; Molthan, Andrew; Burks, Jason

    2014-01-01

    With support from NASA's Applied Sciences Program, the Short-term Prediction Research and Transition (SPoRT) Center has explored a variety of techniques for utilizing archived and near-real-time NASA satellite data to support disaster assessment activities. MODIS data from the NASA Land Atmosphere Near Real-time Capability for EOS (LANCE) currently provide true-color and other imagery for assessment and potential applications including, but not limited to, flooding, fires, and tornadoes. In May 2013, the SPoRT Center developed unique power outage composites using the VIIRS Day/Night Band to provide the first clear-sky view of the damage inflicted upon Moore and Oklahoma City, Oklahoma following the devastating EF-5 tornado that occurred on May 20. Pre-event imagery provided by the NASA-funded Web-Enabled Landsat Data project offers a basis of comparison for monitoring post-disaster recovery efforts. Techniques have also been developed to generate products from higher-resolution imagery from the recently available International Space Station SERVIR Environmental Research and Visualization System instrument. Of paramount importance is delivering these products to end users expeditiously and in formats compatible with Decision Support Systems (DSS). Delivery techniques include a Tile Map Service (TMS) and a Web Mapping Service (WMS). These mechanisms allow easy integration of satellite products into DSSs, including the National Weather Service's Damage Assessment Toolkit for use by personnel conducting damage surveys. This poster will present an overview of the developed techniques and products and compare the strengths and weaknesses of the TMS and WMS.

  3. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. Such map information has recently been distributed in accordance with web service standards such as the Web Map Service (WMS) and the Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers. The latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying WMTS layers in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format that is available in many kinds of Geographic Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should help create better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file format.
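
    The cutting function described above is built on GDAL. A hedged sketch of the underlying operation is shown below: once the source imagery is readable by GDAL (for example as a mosaicked GeoTIFF, or as a WMTS layer exposed through a GDAL service-description file), gdal_translate can clip a window with -projwin. File names and coordinates are placeholders, not the tool's actual configuration.

    ```python
    # Hedged sketch of clipping an area of interest with gdal_translate -projwin.
    # Input/output names and coordinates are placeholders.
    import subprocess

    ulx, uly, lrx, lry = 139.5, 36.0, 140.5, 35.0   # upper-left / lower-right (lon, lat)

    subprocess.run(
        [
            "gdal_translate",
            "-of", "GTiff",
            "-projwin", str(ulx), str(uly), str(lrx), str(lry),
            "seamless_geology_source.tif",           # placeholder GDAL-readable input
            "geology_clip.tif",
        ],
        check=True,
    )
    ```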

  4. Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.

    2012-12-01

    The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.

  5. Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services

    NASA Astrophysics Data System (ADS)

    Neiers, M.; Dwyer, J.; Neiers, S.

    2008-12-01

    The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.

  6. Vector-Based Data Services for NASA Earth Science

    NASA Astrophysics Data System (ADS)

    Rodriguez, J.; Roberts, J. T.; Ruvane, K.; Cechini, M. F.; Thompson, C. K.; Boller, R. A.; Baynes, K.

    2016-12-01

    Vector data sources offer opportunities for mapping and visualizing science data in a way that allows more customizable rendering and deeper data analysis than traditional raster images, and popular formats like GeoJSON and Mapbox Vector Tiles allow diverse types of geospatial data to be served in a high-performance, easily consumed package. Vector data is especially suited to highly dynamic mapping applications and the visualization of complex datasets, while growing support for vector formats and features in open-source mapping clients has made utilizing them easier and more powerful than ever. NASA's Global Imagery Browse Services (GIBS) is working to make NASA data more easily and conveniently accessible than ever by serving vector datasets via GeoJSON, Mapbox Vector Tiles, and raster images. This presentation will review these output formats; the services, including WFS, WMS, and WMTS, that can be used to access the data; and some ways in which vector sources can be utilized in popular open-source mapping clients like OpenLayers. Lessons learned from GIBS' recent move towards serving vector data will be discussed, as well as how to use GIBS open-source software to create, configure, and serve vector data sources using MapServer and the GIBS OnEarth Apache module.
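
    As a hedged illustration of how a client pulls vector features as GeoJSON, the sketch below issues a WFS 2.0 GetFeature request against a server that offers a JSON output format (as GeoServer-family servers commonly do). The endpoint and feature type name are hypothetical and are not specific GIBS identifiers.

    ```python
    # Hedged sketch of a WFS 2.0 GetFeature request returning GeoJSON.
    # Endpoint and feature type are hypothetical placeholders.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    WFS_ENDPOINT = "http://example.org/wfs"          # hypothetical endpoint

    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "fires:thermal_anomalies",      # hypothetical feature type
        "outputFormat": "application/json",          # GeoJSON, where the server offers it
        "count": "100",
    }

    with urlopen(f"{WFS_ENDPOINT}?{urlencode(params)}") as resp:
        collection = json.load(resp)

    print(len(collection.get("features", [])), "features received")
    ```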

  7. Publishing Platform for Aerial Orthophoto Maps, the Complete Stack

    NASA Astrophysics Data System (ADS)

    Čepický, J.; Čapek, L.

    2016-06-01

    When creating a set of orthophoto maps from mosaic compositions using airborne systems, such as popular drones, we need to publish the results of the work to users. Several steps need to be performed in order to get large-scale raster data published. As a first step, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service). For some applications OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application, so that they can be used and evaluated by non-technical users. In this talk, we present an automated pipeline of those steps, in which the user supplies an orthophoto image and, as a result, OGC Open Web Services are published along with a web mapping application containing the data. The web mapping application can serve as a standard presentation platform for this type of large raster data for the general user. The publishing platform, the Geosense online map information system, can also be used to combine data from various sources, to create unique map compositions, and as input for better interpretation of the photographed phenomena. The whole process has been successfully tested with an eBee drone at raster data resolutions of 1.5-4 cm/px over many areas, and the results are also used to create derived datasets, typically suited to property management: records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.

  8. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS), with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  9. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS), with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  10. Evolution of System Architectures: Where Do We Need to Fail Next?

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Alameh, Nadine; Percivall, George

    2013-04-01

    Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present an evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. The Common Architecture was a cross-thread theme to ensure that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction between the different service types: data services (e.g. WMS), application services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging the mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.

  11. Land User and Land Cover Maps of Europe: a Webgis Platform

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.

    2016-06-01

    This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built entirely on open-source infrastructure and open standards. The proposed architecture is based on a server-client model with GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activating/deactivating layers and ordering layers by drag-and-drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo-sharing services. This last functionality is provided to allow a visual assessment of the available land coverages based on other user-generated content available on the Internet. It is intended as a first step towards a calibration/validation service that will be made available in the future.

  12. rasdaman Array Database: current status

    NASA Astrophysics Data System (ADS)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    rasdaman (Raster Data Manager) is a Free Open Source Array Database Management System which provides functionality for storing and processing massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed with the help of SQL queries), scalability (rasdaman's distributed architecture enables it to seamlessly run on cloud infrastructures while offering an increase in performance with the increase of computation resources), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (legacy communication protocol replaced with a new one based on cutting edge technology - Google Protocol Buffers and ZeroMQ). Among the data with which the system works, we can count 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored only in the form of raw arrays, as the location information of the contents is also important for having a correct geoposition on Earth. This is defined by ISO 19123 as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman in order to provide support for the Geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service is an extension which provides zoom and pan navigation over images provided by a map server. Starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data. Support is also provided for several extensions of this service: Subsetting Extension, Scaling Extension, and, starting with version 9.1, Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting the specifics of the standard definitions of the requests. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering multi-dimensional raster coverages. rasdaman exposes this service through the WCS processing extension. Demonstrations are provided online via the Earthlook website (earthlook.org) which presents use-cases from a wide variety of application domains, using the rasdaman system as processing engine.
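
    To give a flavour of the on-the-fly processing language mentioned above, the sketch below sends a WCPS expression to a petascope-style endpoint through the WCS processing extension (ProcessCoverages). The endpoint, coverage name and axis label are placeholders, and the exact request syntax accepted depends on the deployed rasdaman/petascope version, so this should be read as an illustration rather than a definitive call.

    ```python
    # Hedged sketch of a WCPS query sent via the WCS processing extension.
    # Endpoint, coverage name and axis label are hypothetical placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    ENDPOINT = "http://example.org/rasdaman/ows"     # hypothetical petascope endpoint

    wcps_query = (
        'for c in (AvgLandTemp) '                    # hypothetical coverage name
        'return encode(c[ansi("2014-07")], "image/png")'
    )

    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": wcps_query,
    }

    with urlopen(f"{ENDPOINT}?{urlencode(params)}") as resp:
        with open("avg_land_temp_2014_07.png", "wb") as out:
            out.write(resp.read())
    ```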

  13. An Integrated Approach for Accessing Multiple Datasets through LANCE

    NASA Astrophysics Data System (ADS)

    Murphy, K. J.; Teague, M.; Conover, H.; Regner, K.; Beaumont, B.; Masuoka, E.; Vollmer, B.; Theobald, M.; Durbin, P.; Michael, K.; Boller, R. A.; Schmaltz, J. E.; Davies, D.; Horricks, K.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.

    2011-12-01

    The NASA/GSFC Land Atmospheres Near-real time Capability for EOS (LANCE) provides imagery for approximately 40 data products from MODIS, AIRS, AMSR-E and OMI to support the applications community in the study of a variety of phenomena. Thirty-six of these products are available within 2.5 hours of observation at the spacecraft. The data set includes the population density data provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). The purpose of this paper is to describe the variety of tools that have been developed by LANCE to support user access to the imagery. The long-standing Rapid Response system has been integrated into LANCE and is a major vehicle for the distribution of the imagery to end users. There are presently approximately 10,000 anonymous users per month accessing this imagery. The products are grouped into 14 application categories such as Smoke Plumes, Pollution, Fires and Agriculture, and selecting any category makes the relevant subsets of the 40 products available as overlays in an interactive web client utilizing the Web Mapping Service (WMS) to support user investigations (http://lance2.modaps.eosdis.nasa.gov/wms/). For example, selecting Severe Storms will include 6 products from MODIS, OMI, AIRS, and AMSR-E plus the SEDAC population density data. The client and WMS were developed using open-source technologies such as OpenLayers and MapServer and provide uniform, browser-based access to data products. All overlays are downloadable in PNG, JPEG, or GeoTIFF form up to 200 MB per request. The WMS was beta-tested with the user community and substantial performance improvements were made through the use of techniques such as tile caching. LANCE established a partnership with the Physical Oceanography Distributed Active Archive Center (PO DAAC) to develop an alternative presentation of the 40 data products known as the State of the Earth (SOTE). This provides a Google Earth-based interface to the products, grouped in the same fashion as in the WMS. The SOTE servers stream imagery and data in the OGC KML format, and these feeds can be visualized through the Google Earth browser plug-in. SOTE provides visualization through a virtual globe environment, allowing users to interact with the globe via zooming, rotating, and tilting. In addition, SOTE also allows adding custom KML feeds. LANCE also provides datacasting feeds to facilitate user access to imagery for the 40 products and the related HDF-EOS products (available in a variety of formats). These XML-based data feeds contain data attribute and geolocation information, and metadata including an identification of the related application category. Users can subscribe to any of these feeds through the LANCE web site and use the PO DAAC Feed Reader to filter and view the content. The WMS, SOTE, and datacasting tools can be accessed through http://lance.nasa.gov.

  14. Challenges in Visualizing Satellite Level 2 Atmospheric Data with GIS approach

    NASA Astrophysics Data System (ADS)

    Wei, J. C.; Yang, W.; Zhao, P.; Pham, L.; Meyer, D. J.

    2017-12-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide data along with 'images', including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged along "along-track" and "across-track" axes; (3) spatially and temporally continuous data chunked into granule files, i.e. data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolation to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subsetting and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows the inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.

  15. Challenges in Obtaining and Visualizing Satellite Level 2 Data in GIS

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Zhao, Peisheng; Pham, Long; Meyer, David J.

    2017-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand the satellite data is to provide data along with 'images', including accurate pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. However, there are challenges in visualizing remotely sensed non-gridded products: (1) different geodetics of space-borne instruments; (2) data often arranged along 'along-track' and 'across-track' axes; (3) spatially and temporally continuous data chunked into granule files, i.e. data for a portion (or all) of a satellite orbit; (4) no general rule for resampling or interpolation to a grid; (5) geophysical retrievals based only on pixel center locations, without shape information. In this presentation, we will unveil a new Goddard Earth Sciences Data and Information Services Center (GES DISC) Level 2 (L2) visualization on-demand service. The service's front end provides various visualization and data access capabilities, such as overlay and swipe of multiple variables and subsetting and download of data in different formats. The backend of the service consists of Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) components. The infrastructure allows the inclusion of outside data sources served via OGC-compliant protocols and allows other interoperable clients, such as ArcGIS clients, to connect to our L2 WCS/WMS.

  16. Arctic Research Mapping Application (ARMAP): 2D Maps and 3D Globes Support Arctic Science

    NASA Astrophysics Data System (ADS)

    Johnson, G.; Gaylord, A. G.; Brady, J. J.; Cody, R. P.; Aguilar, J. A.; Dover, M.; Garcia-Lavigne, D.; Manley, W.; Score, R.; Tweedie, C. E.

    2007-12-01

    The Arctic Research Mapping Application (ARMAP) is a suite of online services that support Arctic science. These services include a text-based online search utility, a 2D Internet Map Server (IMS), 3D globes, and Open Geospatial Consortium (OGC) Web Map Services (WMS). With ARMAP's 2D maps and 3D globes, users can navigate to areas of interest, view a variety of map layers, and explore U.S. federally funded research projects. Projects can be queried by location, year, funding program, discipline, and keyword. Links lead to specific information and other web sites associated with a particular research project. The Arctic Research Logistics Support Service (ARLSS) database is the foundation of ARMAP, covering US research funded by the National Science Foundation, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and the United States Geological Survey. Avoiding duplication of effort has been a primary objective of the ARMAP project, which incorporates best practices (e.g. Spatial Data Infrastructure, OGC standard web services and metadata) and off-the-shelf technologies where appropriate. The ARMAP suite provides tools for users of various levels of technical ability to interact with the data by importing the web services directly into their own GIS applications and virtual globes, performing advanced GIS queries, simply printing maps from a set of predefined images in the map gallery, browsing the layers in an IMS, or choosing to "fly to" sites using a 3D globe. With special emphasis on the International Polar Year (IPY), ARMAP has targeted science planners, scientists, educators, and the general public. In sum, ARMAP goes beyond a simple map display to enable the analysis, synthesis, and coordination of Arctic research. ARMAP may be accessed via the gateway web site at http://www.armap.org.

  17. Modern Data Center Services Supporting Science

    NASA Astrophysics Data System (ADS)

    Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.

    2011-12-01

    The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve the discovery and access to data. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications which allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards. NGDC continues to increase the amount of its data holdings that are accessible and is augmenting the capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.

  18. The OGC Sensor Web Enablement framework

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Botts, M.

    2006-12-01

    Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data, and a key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces and standard encodings for the messages transferred between services. The SWE interfaces include: the Sensor Observation Service (SOS) for parameterized observation requests (by observation time, feature of interest, property, sensor); the Sensor Planning Service (SPS) for tasking a sensor system to undertake future observations; and the Sensor Alert Service (SAS) for subscribing to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the next obtains the detail required to formulate a data request, and finally a data instance or stream is requested. These may be implemented in a stateless "REST" idiom, or using conventional web-services (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; the Sensor Model Language (SensorML), which describes sensor systems; the Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML application schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
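
    As a hedged illustration of the SOS request pattern described above, the sketch below builds an SOS 1.0.0 KVP GetObservation call. The endpoint, offering identifier, observed-property URI and response format are placeholders that vary between deployments, so the example shows the shape of the request rather than a specific service.

    ```python
    # Hedged sketch of an SOS 1.0.0 KVP GetObservation request.
    # Endpoint, offering and observedProperty values are placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    SOS_ENDPOINT = "http://example.org/sos"          # hypothetical endpoint

    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": "AIR_TEMPERATURE",               # placeholder offering id
        "observedProperty": "urn:ogc:def:property:temperature",  # placeholder URI
        "eventTime": "2006-09-01T00:00:00Z/2006-09-02T00:00:00Z",
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }

    with urlopen(f"{SOS_ENDPOINT}?{urlencode(params)}") as resp:
        observations_xml = resp.read().decode("utf-8")

    print(observations_xml[:200])                    # start of the O&M response
    ```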

  19. Coherent visualization of spatial data adapted to roles, tasks, and hardware

    NASA Astrophysics Data System (ADS)

    Wagner, Boris; Peinsipp-Byma, Elisabeth

    2012-06-01

    Modern crisis management requires users with different roles and computing environments to deal with a high volume of varied data from different sources. For this purpose, Fraunhofer IOSB has developed a geographic information system (GIS) which supports the user depending on the available data and the task to be solved. The system provides merging and visualization of spatial data from various civilian and military sources. It supports the most common spatial data standards (OGC, STANAG) as well as some proprietary interfaces, regardless of whether these are file-based or database-based. To set the visualization rules, generic Styled Layer Descriptors (SLDs), an Open Geospatial Consortium (OGC) standard, are used. SLDs allow specifying which data are shown, when, and how. The defined SLDs consider the users' roles and task requirements. In addition, it is possible to use different displays, and the visualization adapts to the individual resolution of each display. Excessively high or low information density is avoided. Our system also enables users with different roles to work together simultaneously using the same database. Every user is provided with appropriate and coherent spatial data depending on his current task. These refined spatial data are served via the OGC services Web Map Service (WMS: server-side rendered raster maps) or Web Map Tile Service (WMTS: pre-rendered and cached raster maps).

  20. USA National Phenology Network gridded products documentation

    USGS Publications Warehouse

    Crimmins, Theresa M.; Marsh, R. Lee; Switzer, Jeff R.; Crimmins, Michael A.; Gerst, Katharine L.; Rosemartin, Alyssa H.; Weltzin, Jake F.

    2017-02-23

    The goals of the USA National Phenology Network (USA-NPN, www.usanpn.org) are to advance science, inform decisions, and communicate and connect with the public regarding phenology and species’ responses to environmental variation and climate change. The USA-NPN seeks to facilitate informed ecosystem stewardship and management by providing phenological information freely and openly. One way the USA-NPN is endeavoring to accomplish these goals is by providing data and data products in a wide range of formats, including gridded real-time, short-term forecasted, and historical maps of phenological events, patterns and trends. This document describes the suite of gridded phenologically relevant data products produced and provided by the USA National Phenology Network, which can be accessed at www.usanpn.org/data/phenology_maps and also through web services at geoserver.usanpn.org/geoserver/wms?request=GetCapabilities.
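
    As a hedged illustration of accessing the web services mentioned above, the snippet below uses OWSLib to read the GetCapabilities document from the USA-NPN GeoServer endpoint given in the text and list the advertised layers; layer names change over time, so none are hard-coded.

      from owslib.wms import WebMapService

      # Endpoint taken from the record above; layers are discovered at run time.
      wms = WebMapService("http://geoserver.usanpn.org/geoserver/wms", version="1.3.0")

      for name, layer in wms.contents.items():
          print(name, "-", layer.title)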

  1. NASA Earth Observations (NEO): Data Imagery for Education and Visualization

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2008-12-01

    NASA Earth Observations (NEO) has dramatically simplified public access to georeferenced imagery of NASA remote sensing data. NEO targets the non-traditional data users who are currently underserved by the functionality and formats available from the existing data ordering systems. These users include formal and informal educators, museum and science center personnel, professional communicators, and citizen scientists. NEO currently serves imagery from 45 different datasets with daily, weekly, and/or monthly temporal resolutions, with more datasets currently under development. The imagery from these datasets is produced in coordination with several data partners who are affiliated either with the instrument science teams or with the respective data processing center. NEO is a system of three components -- website, WMS (Web Map Service), and FTP archive -- which together are able to meet the wide-ranging needs of our users. Some of these needs include the ability to: view and manipulate imagery using the NEO website -- e.g., applying color palettes, resizing, exporting to a variety of formats including PNG, JPEG, KMZ (Google Earth), GeoTIFF; access the NEO collection via a standards-based API (WMS); and create customized exports for select users (FTP archive) such as Science on a Sphere, NASA's Earth Observatory, and others.

  2. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application, or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER scenes or glacier outlines from 2002 only, or from autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
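
    A hedged sketch of the kind of vector retrieval the WFS interface above enables: a GetFeature request for glacier outlines inside a bounding box, returned as GML. The endpoint and feature type name are placeholders rather than the actual GLIMS identifiers.

      import requests

      # Hypothetical WFS endpoint and feature type, not the real GLIMS service.
      WFS_URL = "https://example.org/glims/wfs"

      params = {
          "service": "WFS",
          "version": "1.1.0",
          "request": "GetFeature",
          "typename": "glims:glacier_outlines",    # placeholder feature type
          "bbox": "7.5,45.8,8.2,46.2,EPSG:4326",    # lon/lat box over the Alps
          "outputFormat": "GML2",
      }

      gml = requests.get(WFS_URL, params=params, timeout=60).text
      print(gml[:300])   # GML feature collection, ready for conversion to shapefile/KML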

  3. Evaluating horizontal positional accuracy of low-cost UAV orthomosaics over forest terrain using ground control points extracted from different sources

    NASA Astrophysics Data System (ADS)

    Patias, Petros; Giagkas, Fotis; Georgiadis, Charalampos; Mallinis, Giorgos; Kaimaris, Dimitris; Tsioukas, Vassileios

    2017-09-01

    Within the field of forestry, forest road mapping and inventory plays an important role in management activities related to the wood harvesting industry, sediment and water run-off modelling, biodiversity distribution and ecological connectivity, recreation activities, future planning of forest road networks, and wildfire protection and fire-fighting. Especially in countries of the Mediterranean Rim, knowledge at regional and national scales regarding the distribution and the characteristics of the rural and forest road network is essential in order to ensure effective emergency management and rapid response of the fire-fighting mechanism. Yet the absence of accurate and updated geodatabases, the drawbacks of traditional cartographic methods arising from the forest environment settings, and the cost and effort needed, as thousands of meters need to be surveyed per site, trigger the need for new data sources and innovative mapping approaches. Monitoring the condition of unpaved forest roads with unmanned aerial vehicle technology is an attractive, objective alternative to laborious field surveys. Although photogrammetric processing of UAV imagery can achieve accuracies of 1-2 centimeters and dense point clouds, the process is commonly based on the establishment of control points. In the case of forest road networks, which are linear features, a great number of control points is needed. Our aim is to evaluate low-cost UAV orthoimages generated over forest areas with GCPs captured from existing national-scale aerial orthoimagery, satellite imagery available through a web mapping service (WMS), and field surveys using a Mobile Mapping System and a GNSS receiver. We also explored the direct georeferencing potential through the GNSS onboard the low-cost UAV. The results suggest that the GNSS approach proved to be the most accurate, while the positional accuracy derived using the WMS and the aerial orthoimagery datasets was deemed satisfactory for the specific task at hand. The direct georeferencing procedure seems to be insufficient unless an onboard GNSS with improved specifications or Real-Time Kinematic (RTK) capabilities is used.
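
    For readers unfamiliar with how such horizontal positional accuracy is typically summarised, the sketch below computes the root-mean-square error of check-point residuals; the numbers are purely illustrative, not values from the study.

      import numpy as np

      # (dx, dy) are easting/northing differences between orthomosaic and
      # reference coordinates at independent check points (illustrative values).
      dx = np.array([0.12, -0.08, 0.05, 0.20, -0.15])   # metres
      dy = np.array([-0.10, 0.06, 0.11, -0.04, 0.09])

      rmse_x = np.sqrt(np.mean(dx ** 2))
      rmse_y = np.sqrt(np.mean(dy ** 2))
      rmse_r = np.sqrt(np.mean(dx ** 2 + dy ** 2))       # combined horizontal RMSE

      print(f"RMSEx={rmse_x:.3f} m  RMSEy={rmse_y:.3f} m  RMSEr={rmse_r:.3f} m")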

  4. Prognocean Plus: the Science-Oriented Sea Level Prediction System as a Tool for Public Stakeholders

    NASA Astrophysics Data System (ADS)

    Świerczyńska, M. G.; Miziński, B.; Niedzielski, T.

    2015-12-01

    The novel real-time system for sea level prediction, known as Prognocean Plus, has been developed as a new-generation service available through the Polish supercomputing grid infrastructure. Researchers can access the service at https://prognocean.plgrid.pl/. Although the system is science-oriented, we wish to discuss herein its potential to enhance ocean management studies carried out routinely by public stakeholders. The system produces short- and medium-term predictions of global altimetric gridded Sea Level Anomaly (SLA) time series, updated daily. The spatial resolution of the SLA forecasts is 1/4° x 1/4°, while the temporal resolution of the prognoses is equal to 1 day. The system computes the predictions of time-variable ocean topography using five data-based models, which are not computationally demanding, enabling us to compare their skillfulness with respect to the physically-based approaches commonly used by other sea level prediction systems. However, the aim of the system is not only to compute the predictions for science purposes, but primarily to build a user-oriented platform that serves the prognoses and their statistics to a broader community. Thus, we deliver the SLA forecasts as a rapid service available online. In order to provide potential users with access to science results, a Web Map Service (WMS) for Prognocean Plus has been designed. We regularly publish the forecasts, both in the interactive graphical WMS service, available from the browser, as well as through the Web Coverage Service (WCS) standard. The Prognocean Plus system, as an early-response system, may be interesting for public stakeholders. It may be used for marine navigation as well as for climate risk management (delineating areas vulnerable to local sea level rise), marine management (advice offered for offshore activities) and coastal management (early warnings against coastal flooding).

  5. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide their mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for a mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  6. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users like pastoralists, land management specialists and land policy makers in the use of earth observation data for both day-to-day and seasonal planning requires both interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data-centres and visualizing outputs on web front-ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of open geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front-end. In this way, the complexities of data locality and compute execution are masked from the end user. On-the-fly computation of products such as NDVI, Leaf Area Index, vegetation cover and others from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage along with flexible front-ends allows the democratization of data dissemination and, we hope, better outcomes for the planet.
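
    As a simple illustration of the kind of on-the-fly product mentioned above, the sketch below computes NDVI from red and near-infrared reflectance arrays; the values are synthetic stand-ins for the bands a server like GSKY would read from the source archive.

      import numpy as np

      # Synthetic red and near-infrared reflectance tiles (placeholders for MODIS bands).
      red = np.array([[0.08, 0.10], [0.12, 0.30]])
      nir = np.array([[0.40, 0.45], [0.38, 0.32]])

      # NDVI = (NIR - Red) / (NIR + Red), masked where the denominator is zero.
      ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), np.nan)
      print(ndvi)   # values near +1 indicate dense green vegetation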

  7. Implementation of a near-real time cross-border web-mapping platform on airborne particulate matter (PM) concentration with open-source software

    NASA Astrophysics Data System (ADS)

    Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph

    2015-01-01

    Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine, located in the border region of Belgium, Germany and the Netherlands, this article describes the methodological and technical background to implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are made available for download. Furthermore, all geodata generated by the project are published under public domain and can be retrieved in various formats or integrated into desktop GIS as Web Map Services (WMS).

  8. The Use of LANCE Imagery Products to Investigate Hazards and Disasters

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Teague, M.; Conover, H.; Regner, K.; Masuoka, E.; Vollmer, B. E.; Durbin, P.; Murphy, K. J.; Boller, R. A.; Davies, D.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.; Rao, S.

    2011-12-01

    The NASA/GSFC Land, Atmosphere Near real-time Capability for EOS (LANCE) has endeavored to integrate a variety of products from the Terra, Aqua, and Aura missions to assist in meeting the needs of the applications user community. This community has a need for imagery products to support the investigation of a wide variety of phenomena including hazards and disasters. The Eyjafjallajökull eruption, the tsunami and flooding in Japan, and the Gulf of Mexico oil spill are recent examples of applications benefiting from the timely and synoptic view afforded by LANCE data. Working with the instrument science teams and the applications community, LANCE has identified 14 applications categories and the LANCE products that will support their investigation. The categories are: Smoke Plumes, Ash Plumes, Dust Storms, Pollution, Severe Storms, Shipping hazards, Fishery hazards, Land Transportation, Fires, Floods, Drought, Vegetation, Agriculture, and Oil Spills. Forty products from AMSR-E, MODIS, AIRS, and OMI have been identified to support analyses and investigations of these phenomena. In each case multiple products from two or more instruments are available, which gives a more complete picture of the evolving hazard or disaster. All Level 2 (L2) products are available within 2.5 hours of observation at the spacecraft, and the daily L3 products are updated incrementally as new data become available. LANCE provides user access to imagery using two systems: a Web Map Service (WMS) and a Google Earth-based interface known as the State of the Earth (SOTE). The latter has resulted from a partnership between LANCE and the Physical Oceanography Distributed Active Archive Center (PO DAAC). When the user selects one of the 14 categories, the relevant products are established within the WMS (http://lance2.modaps.eosdis.nasa.gov/wms/). For each application, population density data are available for densities in excess of 100 people/km² with user-defined opacity. These data are provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). Certain users may not want to be constrained by the pre-defined categories and related products, and all 40 products may be added as potential overlays. The most recent 10 days of near-real time data are available through the WMS. The SOTE provides an interface to the products grouped in the same fashion as the WMS. The SOTE servers stream imagery and data in the OGC KML format, and these feeds can be visualized through the Google Earth browser plug-in. SOTE provides visualization through a virtual globe environment by allowing users to interact with the globe via zooming, rotating, and tilting.
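
    Time-enabled WMS layers such as those described above are typically queried with the optional TIME dimension parameter. The sketch below requests a single observation day for a placeholder layer from a placeholder endpoint; real layer names and the rolling window of available dates come from the service's GetCapabilities document.

      import requests

      # Placeholder endpoint and layer name; only dates inside the near-real-time
      # window described above would be available from the actual service.
      params = {
          "service": "WMS",
          "version": "1.1.1",
          "request": "GetMap",
          "layers": "MODIS_Terra_CorrectedReflectance",   # placeholder layer name
          "styles": "",
          "srs": "EPSG:4326",
          "bbox": "-95,25,-85,35",          # Gulf of Mexico region, lon/lat order in WMS 1.1.1
          "width": "800",
          "height": "800",
          "format": "image/png",
          "time": "2011-06-15",             # WMS TIME dimension parameter
      }

      img = requests.get("https://example.org/lance/wms", params=params, timeout=60)
      open("gulf_20110615.png", "wb").write(img.content)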

  9. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    NASA Astrophysics Data System (ADS)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on an analysis of lithology/genesis categories from regional geologic maps covering the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned-map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) poorly captures the lithologic character of such units. A lithogenetic unit category scheme accessible as a GeoSciML-Portrayal-based OGC Styled Layer Descriptor resource is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.

  10. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying requirements driven by the government, private, public and academic communities. The ASDC is actively working to provide their mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS) and OGC Web Coverage Services (WCS), leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for a mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  11. Innovative Approaches for the Dissemination of Near Real-time Geostationary Satellite Data for Terrestrial and Space Weather Applications

    NASA Astrophysics Data System (ADS)

    Jedlovec, G.; McGrath, K.; Meyer, P. J.; Berndt, E.

    2017-12-01

    A GOES-R series receiving station has been installed at the NASA Marshall Space Flight Center (MSFC) to support GOES-16 transition-to-operations projects of NASA's Earth science program and provide a community portal for GOES-16 data access. This receiving station comprises a 6.5-meter dish; motor-driven positioners; a Quorum feed and demodulator; and three Linux workstations for ingest, processing, display, and subsequent product generation. The Community Satellite Processing Package (CSPP) is used to process GOES Rebroadcast data from the Advanced Baseline Imager (ABI), Geostationary Lightning Mapper (GLM), Solar Ultraviolet Imager (SUVI), Extreme Ultraviolet and X-ray Irradiance Sensors (EXIS), and Space Environment In-Situ Suite (SEISS) into Level 1b and Level 2 files. GeoTIFFs of the imagery from several of these instruments are ingested into an Esri ArcGIS Enterprise Web Map Service (WMS) server, with tiled imagery displayable through a web browser interface or by connecting directly to the WMS with Geographic Information System software. These data also drive a basic web interface where users can manually zoom to and animate regions of interest, or acquire similar results using a published Application Program Interface. While not as interactive as a WMS-driven interface, this system is much more expeditious at generating and distributing requested imagery. The legacy web capability established for the predecessor GOES Imager currently supports approximately 500,000 unique visitors each month. Dissemination capabilities have been refined to support a significantly larger number of anticipated users. The receiving station also supports NASA's Short-term Prediction, Research, and Transition Center (SPoRT) project activities to disseminate near real-time ABI RGB products to National Weather Service National Centers, including the Satellite Analysis Branch, National Hurricane Center, Ocean Prediction Center, and Weather Prediction Center, where they are displayed in N-AWIPS and AWIPS II. The many additional real-time data users include the U.S. Coast Guard, Federal Aviation Administration, and The Weather Company. A second antenna is being installed for the ingest, processing, and dissemination of GOES-S data.

  12. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There is a wealth of water information and tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice). - Modelling of floods using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS). The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data are translated into the OGC WaterML 2.0 time series data format and ingested into an SOS 2.0. The SOS data are visualized in an SOS client that is able to handle time series. The meteorological forecast data (under the supervision of an operator manipulating the WPS user interface), together with WaterML 2.0 time series and terrain data, are input to a flood modelling algorithm. The WPS is able to produce flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that is monitored by an emergency control response service. Acronyms AS: Alert Service ES: Event Service ICT: Information and Communication Technology NS: Notification Service OGC: Open Geospatial Consortium RIBASE: River Basin Standards Interoperability Pilot SOS: Sensor Observation Service WaterML: Water Markup Language WCS: Web Coverage Service WMS: Web Map Service WPS: Web Processing Service
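
    Once gauge observations have been translated into WaterML 2.0 as described above, a client can pull the time/value pairs straight out of the XML. The sketch below is a hedged illustration assuming a locally saved WaterML 2.0 document; the file name is a placeholder.

      from lxml import etree

      # WaterML 2.0 namespace used for MeasurementTVP time/value pairs.
      NS = {"wml2": "http://www.opengis.net/waterml/2.0"}

      tree = etree.parse("gauge_timeseries.xml")   # placeholder O&M/WaterML 2.0 document

      for tvp in tree.getroot().iter("{%s}MeasurementTVP" % NS["wml2"]):
          t = tvp.findtext("wml2:time", namespaces=NS)
          v = tvp.findtext("wml2:value", namespaces=NS)
          print(t, v)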

  13. GIS Services, Visualization Products, and Interoperability at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.

    2007-12-01

    The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as model, satellite and radar data. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, the NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be compliant with OGC specifications) also provide access.
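
    The region/time/variable subsetting described above is the kind of request a THREDDS NetCDF Subset Service answers. The sketch below is hedged: the endpoint, dataset path and variable name are placeholders, and the parameters follow the generic NCSS key-value conventions rather than a specific NCDC catalogue.

      import requests

      # Placeholder THREDDS NetCDF Subset Service endpoint and dataset path.
      NCSS_URL = "https://example.org/thredds/ncss/model/dataset.nc"

      params = {
          "var": "air_temperature",                 # placeholder variable name
          "north": "45.0", "south": "40.0", "east": "-75.0", "west": "-80.0",
          "time_start": "2007-07-01T00:00:00Z",
          "time_end": "2007-07-02T00:00:00Z",
          "accept": "netcdf",
      }

      nc_bytes = requests.get(NCSS_URL, params=params, timeout=120).content
      open("subset.nc", "wb").write(nc_bytes)   # gridded NetCDF, readable by ArcGIS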

  14. European Marine Observation Data Network - EMODnet Physics

    NASA Astrophysics Data System (ADS)

    Manzella, Giuseppe M. R.; Novellino, Antonio; D'Angelo, Paolo; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Loubrieu, Thomas; Rickards, Lesley

    2015-04-01

    The EMODnet-Physics portal (www.emodnet-physics.eu) makes layers of physical data and their metadata available for use and contributes towards the definition of an operational European Marine Observation and Data Network (EMODnet). It is based on a strong collaboration between EuroGOOS associates and its regional operational systems (ROOSs), and it brings together two very different marine communities: the "real time" ocean observing institutes and centres and the National Oceanographic Data Centres (NODCs) that are in charge of ocean data validation, quality checking and updating for marine environmental monitoring. EMODnet-Physics is a Marine Observation and Data Information System that provides a single point of access to near real time and historical archived data (www.emodnet-physics.eu/map). It is built on existing infrastructure, adding value and avoiding any unnecessary complexity; it provides data access to users and aims to attract new data holders and better and more data. With a long-term vision for the sustainability of a pan-European Ocean Observation System, EMODnet-Physics supports the coordination of the EuroGOOS regional components and the empowerment and improvement of their data management infrastructure. In turn, EMODnet-Physics has already implemented high-level interoperability features (WMS, web catalogue, web services, etc.) to facilitate connection and data exchange with the ROOSs and the institutes within the ROOSs (www.emodnet-physics.eu/services). The on-going EMODnet-Physics structure delivers environmental marine physical data from the whole of Europe (wave height and period, temperature of the water column, wind speed and direction, salinity of the water column, horizontal velocity of the water column, light attenuation, and sea level) as monitored by fixed stations, ARGO floats, drifting buoys, gliders, and ferry-boxes. It provides discovery of data sets (both NRT, near real time, and historical data sets), visualization and free download of data from more than 1500 platforms. The portal is composed mainly of three sections: the Map, the Selection List and the Station Info Panel. The Map is the core of the EMODnet-Physics system: here the user can access all available data, customize the map visualization and set different display layers. It is also possible to interact with all the information on the map using the filters provided by the service, which can be used to select the stations of interest depending on the type, physical parameters measured, the time period of the observations in the database of the system, country of origin, and the water basin of reference. It is also possible to browse the data in time by means of the slider in the lower part of the page, which allows the user to view the stations that recorded data in a particular time period. Finally, it is possible to change the standard map view with different layers that provide additional visual information on the status of the waters. The Station Info panel, available from the main map by clicking on a single platform, provides information on the measurements carried out by the station. Moreover, the system provides full interoperability with third-party software through a WMS service, web services and a web catalogue in order to exchange data and products according to the most recent interoperability standards.
    Further developments will ensure compatibility with the OGC SWE (Sensor Web Enablement) standards for the description of sensors and related observations using OpenGIS specifications (SensorML, O&M, SOS). The full list of services is available at www.emodnet-physics.eu/services. The result is an excellent example of innovative technologies providing open and free access to geo-referenced data for the creation of new advanced (operational) oceanography services.

  15. Development of ITSASGIS-5D: seeking interoperability between Marine GIS layers and scientific multidimensional data using open source tools and OGC services for multidisciplinary research.

    NASA Astrophysics Data System (ADS)

    Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.

    2012-04-01

    Since 2000, an intense effort has been made in AZTI's Marine Research Division to set up a data management system able to gather all the marine datasets produced by different in-house research projects. For that purpose, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalogue & search application and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, different species habitat maps, and human pressure and activity maps, were successfully gathered in this system. Very soon, it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or Lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (NetCDF following the CF conventions) and tools such as the THREDDS catalogue (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned before. This system includes, on the server side, PostgreSQL/PostGIS databases and GeoServer for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an online client is being developed to allow joint access, user configuration, data visualisation & query and data distribution. This client uses the MapFish, ExtJS/GeoExt and OpenLayers libraries. Through this presentation the elements of the first released version of this system will be described and shown, together with the new topics to be developed in future versions, which include, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, or the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive or the Habitats Directive.
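
    A hedged sketch of requesting a rendered FES field from an ncWMS endpoint such as the one described above, using the TIME and ELEVATION dimensions plus the ncWMS-specific COLORSCALERANGE extension; the endpoint and layer name are placeholders.

      import requests

      # Placeholder ncWMS endpoint and layer; dimensions come from GetCapabilities.
      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "ocean_model/sea_water_temperature",   # placeholder layer
          "styles": "boxfill/rainbow",                     # typical ncWMS style name
          "crs": "CRS:84",
          "bbox": "-4.0,43.0,0.0,46.0",                    # Bay of Biscay, lon/lat order
          "width": "512", "height": "512",
          "format": "image/png",
          "time": "2011-06-01T00:00:00Z",
          "elevation": "-10",                              # metres below the surface
          "colorscalerange": "10,22",                      # ncWMS vendor extension
      }

      tile = requests.get("https://example.org/ncWMS/wms", params=params, timeout=60)
      open("sst_layer.png", "wb").write(tile.content)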

  16. The Demonstrator for the European Plate Observing System (EPOS)

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Euteneuer, F.; Ulbricht, D.; Lauterjung, J.; Bailo, D.; Jeffery, K. G.

    2014-12-01

    An important outcome of the 4-year Preparatory Phase of the ESFRI project European Plate Observing System (EPOS) was the development and first implementation of the EPOS Demonstrator by the project's ICT Working Group 7. The Demonstrator implements the vertical integration of the three-layer architectural scheme for EPOS, connecting the Integrated Core Services (ICS), Thematic Core Services (TCS) and the National Research Infrastructures (NRI). The Demonstrator provides a single GUI with central key discovery and query functionalities, based on already existing services provided by the seismic, geologic and geodetic communities. More specifically, the seismic services of the Demonstrator utilize web services and APIs for discovery of and access to raw seismic data (FDSN web services by the EIDA Network), events (Geoportal by EMSC) and analytical data products (e.g., hazard maps by EFEHR via OGC WMS). For geologic services, the EPOS Demonstrator accesses OneGeology Europe, which serves the community with geologic maps and point information via OGC web services. The Demonstrator also provides access to raw geodetic data via a newly developed universal tool called GSAC. The Demonstrator itself resembles the future Integrated Core Service (ICS) and provides direct access for the end user. Its core functionality lies in a metadata catalogue, which serves as the central information hub and stores information about all RIs, related persons, projects, financial background and technical access information. The database schema of the catalogue is based on CERIF, which has been slightly adapted. Currently, the portal provides basic query functions as well as cross-domain search. [www.epos.cineca.it]

  17. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three-dimensional geological model provides important information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open-source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer plays the role of mapping horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions has been enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the WMS (Web Map Service) and WPS OGC standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram. The user can visualize them from various angles by mouse operation. WebGL is utilized for 3D visualization. WebGL is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded to incorporate the geologic structure into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
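
    A vertical cross section driven by two clicked points, as described above, maps naturally onto a WPS Execute request. The sketch below uses the WPS 1.0.0 key-value-pair encoding against a hypothetical PyWPS endpoint; the process identifier and input names are illustrative, not those of the actual system.

      import requests

      # Hypothetical PyWPS endpoint, process identifier and inputs.
      params = {
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "vertical_cross_section",              # hypothetical process id
          "datainputs": "x1=139.70;y1=35.65;x2=139.90;y2=35.80",  # start/end points
      }

      resp = requests.get("https://example.org/cgi-bin/pywps", params=params, timeout=120)
      print(resp.text[:400])   # WPS ExecuteResponse XML referencing the section image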

  18. Design of a High Resolution Open Access Global Snow Cover Web Map Service Using Ground and Satellite Observations

    NASA Astrophysics Data System (ADS)

    Kadlec, J.; Ames, D. P.

    2014-12-01

    The aim of the presented work is to create a freely accessible, dynamic and re-usable snow cover map of the world by combining snow extent and snow depth datasets from multiple sources. The examined data sources are: remote sensing datasets (MODIS, CryoLand), weather forecasting model outputs (OpenWeatherMap, forecast.io), ground observation networks (CUAHSI HIS, GSOD, GHCN, and selected national networks), and user-contributed snow reports on social networks (cross-country and backcountry skiing trip reports). For each type of dataset, an interface and an adapter are created. Each adapter supports queries by area, time range, or a combination of area and time range. The combined dataset is published as an online snow cover mapping service. This web service lowers the learning curve required to view, access, and analyze snow depth maps and snow time series. All data published by this service are licensed as open data, encouraging the re-use of the data in customized applications in climatology, hydrology, sports and other disciplines. The initial version of the interactive snow map is on the website snow.hydrodata.org. This website supports view by time and view by site. In view by time, the spatial distribution of snow for a selected area and time period is shown. In view by site, the time-series charts of snow depth at a selected location are displayed. All snow extent and snow depth map layers and time series are accessible and discoverable through internationally approved protocols including WMS, WFS, WCS, WaterOneFlow and WaterML. Therefore they can also be easily added to GIS software or third-party web map applications. The central hypothesis driving this research is that the integration of user-contributed data and/or social-network-derived snow data together with other open access data sources will result in more accurate and higher-resolution, and hence more useful, snow cover maps than satellite data or government agency produced data by themselves.
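
    A hedged sketch of the adapter pattern described above: every snow data source (remote sensing, forecast model, ground network, user report) implements the same query interface so the combined map service can treat them uniformly. Class and method names are illustrative, not those of the actual system.

      from abc import ABC, abstractmethod
      from datetime import datetime
      from typing import Dict, List, Tuple

      BBox = Tuple[float, float, float, float]          # (west, south, east, north)

      class SnowSourceAdapter(ABC):
          @abstractmethod
          def query(self, bbox: BBox, start: datetime, end: datetime) -> List[Dict]:
              """Return snow observations as dicts with lon, lat, time, depth_cm."""

      class GhcnAdapter(SnowSourceAdapter):
          def query(self, bbox, start, end):
              # A real implementation would call the GHCN web service here.
              return [{"lon": -111.6, "lat": 40.6, "time": start, "depth_cm": 35.0}]

      adapters: List[SnowSourceAdapter] = [GhcnAdapter()]
      combined = [obs for a in adapters
                  for obs in a.query((-112.0, 40.0, -111.0, 41.0),
                                     datetime(2014, 1, 1), datetime(2014, 1, 2))]
      print(combined)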

  19. Implementing a New Cloud Computing Library Management Service: A Symbiotic Approach

    ERIC Educational Resources Information Center

    Dula, Michael; Jacobsen, Lynne; Ferguson, Tyler; Ross, Rob

    2012-01-01

    This article presents the story of how Pepperdine University migrated its library management functions to the cloud using what is now known as OCLC's WorldShare Management Services (WMS). The story of implementing this new service is told from two vantage points: (1) that of the library; and (2) that of the service provider. The authors were the…

  20. Web access and dissemination of Andalusian coastal erosion rates: viewers and standard/filtered map services.

    NASA Astrophysics Data System (ADS)

    Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro

    2017-04-01

    The accessibility of environmental information via web viewers using map services (OGC or proprietary services) has become more common, since new information sources (orthophotos, LIDAR, GPS) are highly detailed and thus generate a great volume of data which can hardly be disseminated using either analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are concerned about the need to facilitate access to research results and to improve communication about natural hazards to citizens and stakeholders. This information, if adequately disseminated, is crucial in decision-making processes and risk management approaches, and could help to increase social awareness of environmental issues (particularly climate change impacts). To address this issue, two strategies for the wide dissemination and communication of the results achieved in the calculation of beach erosion for the 640 km length of the Andalusian coast (South Spain) using web viewer technology are presented. Each of them is oriented to different end users and thus based on a different methodology. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a National Research Project based on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) that are made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed to be used by the general public (citizens, politicians, etc.) by combining a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS), displayed within a user-friendly interface. Further, the use of WMS services (implemented on GeoServer) provides a detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent data) which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended to be used by technicians and specialists in the field, includes a geoviewer with an innovative profile (including visualization of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is implemented to guarantee all the functionalities existing in the basic geoviewer. Further to this, the viewer has been improved by i) the generation of services on request through the application of a filter in ECQL (Extended Common Query Language), using the vendor parameter CQL_FILTER from GeoServer. These dynamic filters allow the final user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the generation of real-time cartography; and ii) the use of the layer's WFS service, through which the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g. mean rates, length of eroded coast, etc.) and interactive graphs (via the HighCharts.js library) which help in the interpretation of beach erosion rates (representing trends and bar diagrams, among others). As a result, two web-based approaches for communicating scientific results to different audiences, with a complete dataset of geo-information, services and functionalities, are implemented.
    The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to, and interpretation of, the information.
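
    The on-request filtering described above relies on GeoServer's CQL_FILTER vendor parameter. The sketch below is a hedged illustration of such a request: the endpoint, layer name, attribute names and the ECQL predicate are placeholders standing in for the project's actual schema.

      import requests

      # Placeholder GeoServer endpoint, layer and attributes; the ECQL predicate
      # selects one period and coastline segments eroding faster than 0.5 m/yr.
      params = {
          "service": "WMS",
          "version": "1.1.1",
          "request": "GetMap",
          "layers": "coastal:erosion_rates",
          "styles": "",
          "srs": "EPSG:25830",
          "bbox": "100000,4000000,600000,4200000",
          "width": "1200", "height": "480",
          "format": "image/png",
          "CQL_FILTER": "period='1977-2011' AND rate_m_yr < -0.5",
      }

      img = requests.get("https://example.org/geoserver/wms", params=params, timeout=60)
      open("erosion_1977_2011.png", "wb").write(img.content)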

  1. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcanic eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in the OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.

  2. Exploring NASA OMI Level 2 Data With Visualization

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C.; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vincente, Gilbert

    2014-01-01

    Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for monitoring extreme events (such as volcanic eruptions and dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as images, with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in the OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data-map sources.

  3. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.

  4. The impact on the workload of the Ward Manager with the introduction of administrative assistants.

    PubMed

    Locke, Rachel; Leach, Camilla; Kitsell, Fleur; Griffith, Jacki

    2011-03-01

    To evaluate the impact on the workload of the Ward Manager (WM) of the introduction of administrative assistants into eight trusts in the South of England in a year-long pilot. Ward Managers are nurse leaders who are responsible for ward management and delivering expert clinical care to patients. They have traditionally been expected to achieve this role without administrative assistance. Meeting the workload demands of multiple roles, and the resulting overload, has meant that the leadership and clinical roles have suffered, presenting issues of low morale among existing WMs and issues of recruiting the next generation of WMs. Sixty qualitative interviews were carried out with 16 WMs, 12 Ward Manager Assistants (WMAs), and six senior nurse executives about the impact of the introduction of the WMA post. Quantitative data to measure change in WM workload and ward activity were supplied by 24 wards. Ward Managers reported spending less time on administrative tasks and having more time available to spend on the ward with patients and leading staff. With the introduction of WMAs, there was also improvement in key performance measures (the maintenance of quality under service pressures) and increased staff motivation. There was overwhelming support for the introduction of administrative assistants from participating WMs. The WMAs enabled WMs to spend more time with patients and, more widely, to provide greater support to ward teams. The success of the pilot is reflected in wards working hard to be able to extend the contracts of their WMAs; its extent is further reflected in wards that were not pilot participants, which, having observed the benefits of the post, worked to secure funding to recruit their own WMAs. The widespread introduction of administrative assistance could increase ward productivity and provide support for clinical leaders. Continuing professional development for WMs needs to incorporate training about management responsibilities and how best to use administrative support. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  5. DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim

    2010-05-01

    The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective to create a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Building on the upstream information flow, DEWS focuses on improving the downstream capacities of warning centres, especially information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7], spatial data are utilized to depict the situation picture and to integrate a simulation system via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other hazard types are going to follow, e.g. volcanic eruptions or landslides; multi-hazard functionality is therefore also conceivable in the future. The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard-specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI in conjunction with details of information logistics. The DEWS Wide Area Centre, connecting national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
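
    For readers unfamiliar with CAP, the sketch below assembles a minimal alert document with Python's ElementTree. The element names follow the CAP 1.1 schema referenced above, while the identifier, sender, and event values are purely illustrative and are not taken from DEWS.

        # Minimal, illustrative CAP 1.1 alert; values are placeholders, not real warning content.
        import xml.etree.ElementTree as ET
        from datetime import datetime, timezone

        CAP_NS = "urn:oasis:names:tc:emergency:cap:1.1"
        ET.register_namespace("", CAP_NS)                      # serialize with CAP as the default namespace

        def add(parent, tag, text):
            el = ET.SubElement(parent, f"{{{CAP_NS}}}{tag}")
            el.text = text
            return el

        alert = ET.Element(f"{{{CAP_NS}}}alert")
        add(alert, "identifier", "DEWS-DEMO-0001")              # hypothetical identifier
        add(alert, "sender", "warning-centre@example.org")      # hypothetical sender
        add(alert, "sent", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00"))
        add(alert, "status", "Exercise")
        add(alert, "msgType", "Alert")
        add(alert, "scope", "Public")

        info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
        add(info, "category", "Geo")
        add(info, "event", "Tsunami")
        add(info, "urgency", "Immediate")
        add(info, "severity", "Extreme")
        add(info, "certainty", "Observed")
        add(info, "headline", "Tsunami warning (exercise message)")

        print(ET.tostring(alert, encoding="unicode"))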

  6. Data Access Tools And Services At The Goddard Distributed Active Archive Center (GDAAC)

    NASA Technical Reports Server (NTRS)

    Pham, Long; Eng, Eunice; Sweatman, Paul

    2003-01-01

    As one of the largest providers of Earth Science data from the Earth Observing System, GDAAC provides the latest data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), and Solar Radiation and Climate Experiment (SORCE) data products via GDAAC's data pool (50 TB of disk cache). In order to make this huge volume of data more accessible to the public and science communities, the GDAAC offers multiple data access tools and services: Open Source Project for Network Data Access Protocol (OPeNDAP), Grid Analysis and Display System (GrADS/DODS) (GDS), Live Access Server (LAS), OpenGIS Web Map Server (WMS) and Near Archive Data Mining (NADM). The objective is to assist users in electronically retrieving a smaller, usable portion of data for further analysis. The OPeNDAP server, formerly known as the Distributed Oceanographic Data System (DODS), allows the user to retrieve data without worrying about the data format. OPeNDAP is capable of server-side subsetting of HDF, HDF-EOS, netCDF, JGOFS, ASCII, DSP, FITS and binary data formats. The GrADS/DODS server is capable of serving the same data formats as OPeNDAP. GDS has an additional feature of server-side analysis. Users can analyze the data on the server, thereby decreasing the computational load on their client system. The LAS is a flexible server that allows users to graphically visualize data on the fly, to request different file formats and to compare variables from distributed locations. Users of LAS have the option to use other available graphics viewers such as IDL, Matlab or GrADS. WMS is based on OPeNDAP for serving geospatial information. WMS supports the OpenGIS protocol to provide data in GIS-friendly formats for analysis and visualization. NADM is another access point to the GDAAC's data pool. NADM gives users the capability to use a browser to upload their C, FORTRAN or IDL algorithms, test the algorithms, and mine data in the data pool. With NADM, the GDAAC provides an environment physically close to the data source. NADM benefits users with data mining or data reduction algorithms by reducing large volumes of data before transmission over the network to the user.
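
    The server-side subsetting that OPeNDAP offers can be exercised with nothing more than a constrained URL, as in the hedged sketch below; the host, dataset path, variable name, and index ranges are hypothetical and only illustrate the start:stride:stop convention.

        # Minimal sketch of OPeNDAP server-side subsetting via a constraint expression.
        import requests

        DAP_URL = "https://example.org/opendap/AIRS/airs_granule.hdf"   # hypothetical dataset URL

        # variable[start:stride:stop] per dimension: one time step and a 50 x 50 lat/lon window
        constraint = "Temperature[0:1:0][100:1:149][200:1:249]"

        resp = requests.get(f"{DAP_URL}.ascii?{constraint}", timeout=60)
        resp.raise_for_status()
        print(resp.text[:500])    # beginning of the ASCII-encoded subset returned by the server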

  7. Tiled WMS/KML Server V2

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2012-01-01

    This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. Ping and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
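
    Because the module serves tiles only for requests whose bounding boxes fall exactly on the pre-generated tile grid, a client must compute grid-aligned bounding boxes. The sketch below shows one way to do that; the endpoint, layer name, and 0.5-degree tile size are assumptions for illustration, not parameters of the actual server.

        # Minimal sketch of addressing one tile of a tiled WMS dataset on a regular lon/lat grid.
        TILE_DEG = 0.5        # assumed tile size in degrees at this resolution level
        WMS = ("https://example.org/twms?request=GetMap&layers=global_mosaic"
               "&srs=EPSG:4326&format=image/jpeg&styles=&width=512&height=512")

        def tile_bbox(col, row, tile_deg=TILE_DEG):
            """Return minx,miny,maxx,maxy for the tile at (col,row) in a regular lon/lat grid."""
            minx = -180.0 + col * tile_deg
            miny = -90.0 + row * tile_deg
            return minx, miny, minx + tile_deg, miny + tile_deg

        # Tile whose lower-left corner is at (0 E, 0 N)
        minx, miny, maxx, maxy = tile_bbox(col=360, row=180)
        tile_url = f"{WMS}&bbox={minx},{miny},{maxx},{maxy}"
        print(tile_url)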

  8. Comparison of Wechsler Memory Scale-Fourth Edition (WMS-IV) and Third Edition (WMS-III) dimensional structures: improved ability to evaluate auditory and visual constructs.

    PubMed

    Hoelzle, James B; Nelson, Nathaniel W; Smith, Clifford A

    2011-03-01

    Dimensional structures underlying the Wechsler Memory Scale-Fourth Edition (WMS-IV) and Wechsler Memory Scale-Third Edition (WMS-III) were compared to determine whether the revised measure has a more coherent and clinically relevant factor structure. Principal component analyses were conducted in normative samples reported in the respective technical manuals. Empirically supported procedures guided retention of dimensions. An invariant two-dimensional WMS-IV structure reflecting constructs of auditory learning/memory and visual attention/memory (C1 = .97; C2 = .96) is more theoretically coherent than the replicable, heterogeneous WMS-III dimension (C1 = .97). This research suggests that the WMS-IV may have greater utility in identifying lateralized memory dysfunction.

  9. Pegasus Workflow Management System: Helping Applications From Earth and Space

    NASA Astrophysics Data System (ADS)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165 TB, out of which 11 TB of data was saved. Astrophysics: The Laser Interferometer Gravitational-Wave Observatory (LIGO) uses Pegasus WMS to search for binary inspiral gravitational waves. A month of LIGO data requires many thousands of jobs, running for days on hundreds of CPUs on the LIGO Data Grid (LDG) and Open Science Grid (OSG). Ocean Temperature Forecast: Researchers at the Jet Propulsion Laboratory are exploring Pegasus WMS to run ocean forecast ensembles of the California coastal region. These models produce a number of daily forecasts for water temperature, salinity, and other measures. Helioseismology: The Solar Dynamics Observatory (SDO) is NASA's most important solar physics mission of this coming decade. Pegasus WMS is being used to analyze the data from SDO, which will be predominantly used to learn about solar magnetic activity and to probe the internal structure and dynamics of the Sun with helioseismology. Bacterial RNA studies: SIPHT is an application in bacterial genomics, which predicts sRNA (small non-coding RNAs)-encoding genes in bacteria. This project currently provides a web-based interface using Pegasus WMS at the backend to facilitate large-scale execution of the workflows on varied resources and provide better notifications of task/workflow completion.
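
    To make the idea of a resource-agnostic, abstract workflow concrete, the sketch below represents jobs by logical file names only and lets a planner decide what can run next. This is a plain-Python analogy under assumed names, not the Pegasus DAX format or API.

        # Illustrative abstract workflow: no hosts or physical paths, only logical names and ordering.
        abstract_workflow = {
            "jobs": [
                {"id": "j1", "transformation": "extract",    "inputs": ["raw.dat"],     "outputs": ["fields.dat"]},
                {"id": "j2", "transformation": "regrid",     "inputs": ["fields.dat"],  "outputs": ["gridded.dat"]},
                {"id": "j3", "transformation": "statistics", "inputs": ["gridded.dat"], "outputs": ["stats.txt"]},
            ],
            "dependencies": [("j1", "j2"), ("j2", "j3")],    # parent -> child ordering constraints
        }

        def ready_jobs(workflow, done):
            """Jobs whose parents have all completed; a planner would map these onto concrete sites."""
            blocked = {child for parent, child in workflow["dependencies"] if parent not in done}
            return [j["id"] for j in workflow["jobs"] if j["id"] not in done and j["id"] not in blocked]

        print(ready_jobs(abstract_workflow, done=set()))      # ['j1']
        print(ready_jobs(abstract_workflow, done={"j1"}))     # ['j2']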

  10. Developing a GIS for CO2 analysis using lightweight, open source components

    NASA Astrophysics Data System (ADS)

    Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.

    2012-12-01

    There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
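
    The lightweight stack described above lends itself to a very small sketch: a Bottle web service querying a schema-less MongoDB collection of CO2 soundings by bounding box and returning JSON that a Leaflet map could consume. The database, collection, and field names below are assumptions rather than the project's actual schema, and a 2d geospatial index on the location field is presumed.

        # Minimal sketch of a Bottle + MongoDB bounding-box query service; names are hypothetical.
        from bottle import Bottle, request
        from pymongo import MongoClient

        app = Bottle()
        soundings = MongoClient()["co2"]["soundings"]    # assumes a local MongoDB with a 2d index on "loc"

        @app.route("/co2")
        def co2_in_bbox():
            """Return soundings inside ?west=&south=&east=&north= as a JSON document."""
            west, south = float(request.query.west), float(request.query.south)
            east, north = float(request.query.east), float(request.query.north)
            query = {"loc": {"$geoWithin": {"$box": [[west, south], [east, north]]}}}
            docs = soundings.find(query, {"_id": 0, "loc": 1, "xco2": 1, "time": 1})
            return {"results": list(docs)}               # Bottle serializes dicts to JSON

        if __name__ == "__main__":
            app.run(host="localhost", port=8080)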

  11. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    NASA Astrophysics Data System (ADS)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels, viz. satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence); geospatially derived or modelled information (Disturbance Index, Fragmentation, Biological Richness); and geospatially referenced field sample plots. The study provides information on areas of high disturbance and high biological richness, suggesting future management strategies and formulating action plans. The study has generated, for the first time, a baseline database for India which will be a valuable input towards climate change studies in the Indian Subcontinent. The spatial data generated during the study are organized as a central data repository in a Geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC WMS and WFS standards for the development of a web-based geo-information system using Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query and map output generation based on user request and response. This is a typical mashup-architecture geo-information system which allows access to remote web services like ISRO Bhuvan, OpenStreetMap, Google Maps etc., with overlay on the biodiversity data for effective study of bio-resources. Spatial queries and analysis with vector data are achieved through SQL queries on PostGIS and WFS-T operations. The most important challenge, however, is to develop a system for online raster-based geospatial analysis and processing based on a user-defined Area of Interest (AOI) for large raster datasets. The map data of this study comprise five layers of approximately 20 GB each. An attempt has been made to develop a system using Python, PostGIS and PHP for raster data analysis over the web for biodiversity conservation and prioritization. The developed system takes inputs from users as WKT, OpenLayers-based polygon geometry or a shapefile upload as the AOI to perform raster-based operations using Python and GDAL/OGR. The intermediate products are stored in temporary files and tables, which generate XML outputs for web representation. Raster operations like clip-zip-ship, class-wise area statistics, single- to multi-layer operations, diagrammatic representation and other geo-statistical analyses are performed. This is an indigenous geospatial data processing engine developed using an open system architecture for spatial analysis of biodiversity datasets in an Internet GIS environment. The performance of this application in a multi-user environment such as the Internet domain is another challenging task, which is addressed by fine-tuning the source code, server hardening, spatial indexing and running the process in load-balanced mode. The developed system is hosted in the Internet domain (http://bis.iirs.gov.in) for user access.
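
    The AOI-based clipping described above can be sketched with GDAL's Python bindings as shown below; the raster and AOI file names are placeholders, and the AOI is assumed to be a polygon shapefile in the same spatial reference as the raster.

        # Minimal sketch of a clip-zip-ship style operation: clip a large raster to a user AOI.
        from osgeo import gdal

        gdal.UseExceptions()

        clipped = gdal.Warp(
            "biological_richness_aoi.tif",     # output clipped raster (placeholder name)
            "biological_richness.tif",         # large input data layer (placeholder name)
            cutlineDSName="aoi.shp",           # user-uploaded AOI polygon
            cropToCutline=True,                # shrink the output extent to the AOI
            dstNodata=0,
        )
        band = clipped.GetRasterBand(1)
        stats = band.GetStatistics(0, 1)       # min, max, mean, std-dev inside the AOI
        print("AOI statistics (min, max, mean, stddev):", stats)
        clipped = None                         # flush and close the dataset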

  12. Application of open source standards and technologies in the http://climate4impact.eu/ portal

    NASA Astrophysics Data System (ADS)

    Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia

    2015-04-01

    This presentation will demonstrate how to calculate and visualize the climate index SU (number of summer days) on the climate4impact portal. The following topics will be covered during the demonstration: - Security: Login using OpenID for access to the Earth System Grid Federation (ESGF) data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-On (SSO) is used to make these websites and systems work together. - Discovery: Faceted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows for browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA). - Processing using Web Processing Services (WPS): Transform data, subset, export into other formats, and perform climate indices calculations using Web Processing Services implemented by PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 ICCLIM. - Visualization using Web Map Services (WMS): Visualize data from ESGF data nodes using ADAGUC Web Map Services. The aim of climate4impact is to enhance the use of climate research data and to enhance the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities and can be visited at http://climate4impact.eu/ The current work on climate4impact has two main objectives. The first is a web interface which automatically generates a graphical user interface on WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/ICCLIM on data stored in ESGF data nodes. Data are then transmitted from ESGF nodes over secured OpenDAP and become available in a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for the processing of climate indices will be developed in close collaboration with users. The second is to expose climate4impact services, so as to offer standardized services which can be used by other portals. This has the advantage of adding interoperability between portals, as well as enabling the design of specific portals aimed at different impact communities, either thematic or national, for example.
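
    For orientation, the SU index demonstrated on the portal counts the days in a year whose daily maximum temperature exceeds 25 °C. The sketch below computes it with NumPy on a synthetic series; it is only an illustration under that definition, not the ICCLIM or OpenClimateGIS implementation.

        # Minimal sketch of the SU (summer days) index on a made-up daily tasmax series in Kelvin.
        import numpy as np

        def summer_days(tasmax_kelvin, threshold_c=25.0):
            """Count days with daily maximum temperature above the threshold (degrees Celsius)."""
            tasmax_c = np.asarray(tasmax_kelvin) - 273.15
            return int(np.sum(tasmax_c > threshold_c))

        # One synthetic year of daily maxima: a seasonal cycle around 15 degC with +/- 12 degC amplitude
        days = np.arange(365)
        tasmax = 273.15 + 15.0 + 12.0 * np.sin(2 * np.pi * (days - 80) / 365.0)
        print("SU (summer days):", summer_days(tasmax))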

  13. Biowep: a workflow enactment portal for bioinformatics applications.

    PubMed

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling them to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics - LITBIO.

  14. Biowep: a workflow enactment portal for bioinformatics applications

    PubMed Central

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-01-01

    Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack these skills. A portal enabling them to profit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, and the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available for interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed in the sphere of the Laboratory of Interdisciplinary Technologies in Bioinformatics – LITBIO. PMID:17430563

  15. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.
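
    Programmatic access to a WMS layer of the kind GeoMapApp can load is sketched below using OWSLib; the endpoint URL, layer name, and bounding box are placeholders rather than a real GeoMapApp service.

        # Minimal sketch of fetching a WMS layer with OWSLib; endpoint and layer are hypothetical.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
        print(list(wms.contents))                      # layer names advertised by the server

        img = wms.getmap(
            layers=["bathymetry"],                     # hypothetical layer
            styles=[""],
            srs="EPSG:4326",
            bbox=(-75.0, 35.0, -60.0, 45.0),           # lon/lat bounding box
            size=(800, 533),
            format="image/png",
            transparent=True,
        )
        with open("bathymetry.png", "wb") as f:
            f.write(img.read())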

  16. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.

  17. CometQuest: A Rosetta Adventure

    NASA Technical Reports Server (NTRS)

    Leon, Nancy J.; Fisher, Diane K.; Novati, Alexander; Chmielewski, Artur B.; Fitzpatrick, Austin J.; Angrum, Andrea

    2012-01-01


  18. In vitro and in vivo biotransformation of WMS-1410, a potent GluN2B selective NMDA receptor antagonist.

    PubMed

    Falck, Evamaria; Begrow, Frank; Verspohl, Eugen J; Wünsch, Bernhard

    2014-06-01

    Structural modification of the GluN2B selective NMDA receptor antagonist ifenprodil led to the 3-benzazepine WMS-1410 with similar GluN2B affinity but higher receptor selectivity. Herein the in vitro and in vivo biotransformation of WMS-1410 is reported. Incubation of WMS-1410 with rat liver microsomes and different cofactors resulted in four hydroxylated phase I metabolites, two phase II metabolites and five combined phase I/II metabolites. With the exception of catechol 4, these metabolites were also identified in the urine of a rat treated with WMS-1410. However, the metabolites 7, 8 and 12 clearly show that the catechol metabolite 4 was also formed in vivo. As shown for ifenprodil, the phenol of WMS-1410 represents the metabolically most reactive structural element. The biotransformation of WMS-1410 is considerably slower than that of ifenprodil, indicating a higher metabolic stability. From the viewpoint of metabolic stability, the bioisosteric replacement of the phenol of WMS-1410 by a metabolically more stable moiety should be favourable. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Clinical utility of the Wechsler Memory Scale--Fourth Edition (WMS-IV) in predicting laterality of temporal lobe epilepsy among surgical candidates.

    PubMed

    Soble, Jason R; Eichstaedt, Katie E; Waseem, Hena; Mattingly, Michelle L; Benbadis, Selim R; Bozorg, Ali M; Vale, Fernando L; Schoenberg, Mike R

    2014-12-01

    This study evaluated the accuracy of the Wechsler Memory Scale--Fourth Edition (WMS-IV) in identifying functional cognitive deficits associated with seizure laterality in localization-related temporal lobe epilepsy (TLE) relative to a previously established measure, the Rey Auditory Verbal Learning Test (RAVLT). Emerging WMS-IV studies have highlighted psychometric improvements that may enhance its ability to identify lateralized memory deficits. Data from 57 patients with video-EEG-confirmed unilateral TLE who were administered the WMS-IV and RAVLT as part of a comprehensive presurgical neuropsychological evaluation for temporal resection were retrospectively reviewed. We examined the predictive accuracy of the WMS-IV not only in terms of verbal versus visual composite scores but also using individual subtests. A series of hierarchical logistic regression models were developed, including the RAVLT, WMS-IV delayed subtests (Logical Memory, Verbal Paired Associates, Designs, Visual Reproduction), and a WMS-IV verbal-visual memory difference score. Analyses showed that the RAVLT significantly predicted laterality with overall classification rates of 69.6% to 70.2%, whereas neither the individual WMS-IV subtests nor the verbal-visual memory difference score accounted for additional significant variance. Similar to previous versions of the WMS, findings cast doubt as to whether the WMS-IV offers significant incremental validity in discriminating seizure laterality in TLE beyond what can be obtained from the RAVLT. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Analysis of calibration-free wavelength-scanned wavelength modulation spectroscopy for practical gas sensing using tunable diode lasers

    NASA Astrophysics Data System (ADS)

    Sun, K.; Chao, X.; Sur, R.; Goldenstein, C. S.; Jeffries, J. B.; Hanson, R. K.

    2013-12-01

    A novel strategy has been developed for the analysis of wavelength-scanned, wavelength modulation spectroscopy (WMS) with tunable diode lasers (TDLs). The method simulates WMS signals and compares them with measurements to determine gas properties (e.g., temperature, pressure and concentration of the absorbing species). Injection-current-tuned TDLs have simultaneous wavelength and intensity variation, which severely complicates the Fourier expansion of the simulated WMS signal into harmonics of the modulation frequency (fm). The new method differs from previous WMS analysis strategies in two significant ways: (1) the measured laser intensity is used to simulate the transmitted laser intensity, and (2) digital lock-in and low-pass filter software is used to expand both simulated and measured transmitted laser intensities into harmonics of the modulation frequency, WMS-nfm (n = 1, 2, 3,…), avoiding the need for an analytic model of intensity modulation or a Fourier expansion of the simulated WMS harmonics. This analysis scheme is valid at any optical depth, any modulation index, and at all values of the scanned laser wavelength. The method is demonstrated and validated with WMS of H2O dilute in air (1 atm, 296 K, near 1392 nm). WMS-nfm harmonics for n = 1 to 6 are extracted, and the simulation and measurements are found to be in good agreement over the entire WMS lineshape. The use of 1f-normalization strategies to realize calibration-free wavelength-scanned WMS is also discussed.
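
    The digital lock-in and low-pass filtering step described above can be sketched in a few lines of NumPy: the transmitted intensity is multiplied by reference sinusoids at n*fm and smoothed to recover the WMS-nf magnitudes. The signal below is synthetic and the filter is a simple one-period boxcar, chosen only to illustrate the extraction, not to reproduce the authors' software.

        # Minimal sketch of digital lock-in extraction of WMS-nf harmonics from a synthetic signal.
        import numpy as np

        fs, fm = 2.0e6, 10.0e3                  # sample rate and modulation frequency (Hz)
        t = np.arange(0, 0.01, 1.0 / fs)
        # Synthetic "transmitted intensity": a DC level plus 1f and 2f components
        i_t = 1.0 + 0.20 * np.cos(2 * np.pi * fm * t) + 0.05 * np.cos(4 * np.pi * fm * t)

        def lockin_nf(signal, n, t, fm, fs):
            """Return the WMS-nf magnitude via X/Y demodulation and a boxcar low-pass."""
            ref_x = np.cos(2 * np.pi * n * fm * t)
            ref_y = np.sin(2 * np.pi * n * fm * t)
            width = int(round(fs / fm))          # boxcar spanning one modulation period acts as the low-pass
            box = np.ones(width) / width
            x = np.convolve(signal * ref_x, box, mode="same")
            y = np.convolve(signal * ref_y, box, mode="same")
            return 2.0 * np.sqrt(x ** 2 + y ** 2)   # factor 2 recovers the harmonic amplitude

        print("WMS-1f ~", lockin_nf(i_t, 1, t, fm, fs)[len(t) // 2])   # approx 0.20
        print("WMS-2f ~", lockin_nf(i_t, 2, t, fm, fs)[len(t) // 2])   # approx 0.05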

  1. Analysis of the cadastral data published in the Polish Spatial Data Infrastructure

    NASA Astrophysics Data System (ADS)

    Izdebski, Waldemar

    2017-12-01

    The cadastral data, including land parcels, are the basic reference data for presenting various objects collected in spatial databases. Easy access to up-to-date records is a very important matter for the individuals and institutions using spatial data infrastructure. The primary objective of the study was to check the current accessibility of cadastral data as well as to verify how current and complete they are. The author started researching this topic in 2007, i.e. from the moment the Team for National Spatial Data Infrastructure developed documentation concerning the standard of publishing cadastral data with the use of WMS. For ten years, the author has been monitoring the status of cadastral data publishing in various districts and has participated in data publishing in many of them. In 2017, when only half of the districts published WMS services from cadastral data, the questions arise: why is this so, and how can this unfavourable situation be changed? As a result of the tests performed, it was found that the status of publishing cadastral data is still far from perfect. The quality of the offered web services varies and, unfortunately, many services offer poor performance; moreover, there are plenty of services that do not operate at all.
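
    A monitoring check of the kind such a study relies on can be sketched as a GetCapabilities probe against each district endpoint, as below; the URLs are placeholders and the response test is a simplification (it looks only for the WMS 1.3.0 root element, while older services answer with WMT_MS_Capabilities).

        # Minimal sketch of probing WMS endpoints for availability via GetCapabilities.
        import requests

        endpoints = [
            "https://example-district-1.pl/wms",   # placeholder URLs, not real district services
            "https://example-district-2.pl/wms",
        ]

        def check_wms(url, timeout=10):
            """Return (is_up, detail) for a WMS GetCapabilities probe."""
            params = {"SERVICE": "WMS", "REQUEST": "GetCapabilities"}
            try:
                r = requests.get(url, params=params, timeout=timeout)
                ok = r.status_code == 200 and b"WMS_Capabilities" in r.content
                return ok, f"HTTP {r.status_code}, {len(r.content)} bytes, {r.elapsed.total_seconds():.1f} s"
            except requests.RequestException as exc:
                return False, str(exc)

        for url in endpoints:
            up, detail = check_wms(url)
            print(("UP  " if up else "DOWN"), url, "-", detail)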

  2. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For the comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on the usage of modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly in NetCDF and PostGIS formats) datasets as an integral part of the dedicated Virtual Research Environment for the comprehensive study of ongoing and possible future climate change, and analysis of their implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. This part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client, developed according to a "single page application" technology based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to the interface of such popular desktop GIS applications as uDig, QuantumGIS etc. The Boundless/OpenGeo architecture was used as a basis for Web GIS client development. According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers: • A tier of NetCDF metadata in JSON format • A middleware tier of JavaScript objects implementing methods to work with: o NetCDF metadata o the XML file of the selected calculation configuration (XML task) o WMS/WFS/WPS cartographical services • A graphical user interface tier of JavaScript objects realizing the general application business logic. The Web GIS developed provides the launching of computational processing services to support the solving of tasks in the area of environmental monitoring, as well as presenting calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in the process of solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
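
    The NetCDF-metadata-in-JSON tier mentioned above can be illustrated with the netCDF4 Python library: the sketch below scans a file and emits variable names, dimensions, and attributes as JSON that a browser client could consume. The file name is a placeholder, and the output layout is an assumption rather than the system's actual metadata schema.

        # Minimal sketch of exporting NetCDF metadata as JSON for a browser-based client.
        import json
        from netCDF4 import Dataset

        def netcdf_metadata(path):
            with Dataset(path) as ds:
                return {
                    "dimensions": {name: len(dim) for name, dim in ds.dimensions.items()},
                    "variables": {
                        name: {
                            "dimensions": list(var.dimensions),
                            "attributes": {att: str(var.getncattr(att)) for att in var.ncattrs()},
                        }
                        for name, var in ds.variables.items()
                    },
                }

        print(json.dumps(netcdf_metadata("tas_monthly.nc"), indent=2))   # placeholder file name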

  3. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are becoming increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To facilitate users in handling and visualising spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open source tools such as OpenLayers. The technology already exists within PostgreSQL databases to store searchable geometrical data in the form of the PostGIS extension. GeoServer is an existing open source map server; an instance of it has been deployed for the new PSA and uses the OGC standards to allow, among others, the sharing, processing and editing of data and spatial data through the Web Feature Service (WFS) standard, as well as serving georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary mission datasets. This can be facilitated through the GIS framework, offering interfaces (both a web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.

  4. Oregon OCS seafloor mapping: Selected lease blocks relevant to renewable energy

    USGS Publications Warehouse

    Cochrane, Guy R.; Hemery, Lenaïg G.; Henkel, Sarah K.

    2017-05-23

    In 2014 the U.S. Geological Survey (USGS) and the Bureau of Ocean Energy Management (BOEM) entered into Intra-agency agreement M13PG00037 to map an area of the Oregon Outer Continental Shelf (OCS) off of Coos Bay, Oregon, under consideration for development of a floating wind energy farm. The BOEM requires seafloor mapping and site characterization studies in order to evaluate the impact of seafloor and sub-seafloor conditions on the installation, operation, and structural integrity of proposed renewable energy projects, as well as to assess the potential effects of construction and operations on archaeological resources. The mission of the USGS is to provide geologic, topographic, and hydrologic information that contributes to the wise management of the Nation's natural resources and that promotes the health, safety, and well-being of the people. This information consists of maps, databases, and descriptions and analyses of the water, energy, and mineral resources, land surface, underlying geologic structure, and dynamic processes of the Earth. For the Oregon OCS study, the USGS acquired multibeam echo sounder and seafloor video data surrounding the proposed development site, which is 95 km2 in area and 15 miles offshore from Coos Bay. The development site had been surveyed by Solmar Hydro Inc. in 2013 under a contract with WindFloat Pacific. The USGS subsequently produced a bathymetry digital elevation model and a backscatter intensity grid that were merged with existing data collected by the contractor. The merged grids were published along with visual observations of benthic geo-habitat from the video data in an associated USGS data release (Cochrane and others, 2015). This report includes the results of analysis of the video data conducted by Oregon State University and the geo-habitat interpretation of the multibeam echo sounder (MBES) data conducted by the USGS. MBES data were published in Cochrane and others (2015). Interpretive data associated with this publication are published in Cochrane (2017). All the data are provided as geographic information system (GIS) files that contain either Esri ArcGIS geotiffs or shapefiles. For those who do not own the full suite of Esri GIS and mapping software, the data can be read using Esri ArcReader, a free viewer that is available at http://www.esri.com/software/arcgis/arcreader/index.html (last accessed August 29, 2016). Web services, which consist of standard implementations of the ArcGIS representational state transfer (REST) Service and Open Geospatial Consortium (OGC) GIS web map service (WMS), also are available for all published GIS data. Web services were created using an ArcGIS service definition file, resulting in data layers that are symbolized as shown on the associated report figures. Both the ArcGIS REST Service and OGC WMS Service include all the individual GIS layers. Data layers are bundled together in a map-area web service; however, each layer can be symbolized and accessed individually after the web service is ingested into a desktop application or web map. Web services enable users to download and view data, as well as to easily add data to their own workflows, using any browser-enabled, standalone or mobile device. Though the surficial substrate is dominated by combinations of mud and sand substrate, a diverse assortment of geomorphologic features are related to geologic processes: one anticlinal ridge where bedrock is exposed, a slump and associated scarps, and pockmarks. Pockmarks are seen in the form of fields of small pockmarks, a lineation of large pockmarks with methanogenic carbonates, and areas of large pockmarks that have merged into larger variously shaped depressions. The slump appears to have originated at the pockmark lineation. Video-supervised numerical analysis of the MBES backscatter intensity data and vector ruggedness derived from the MBES bathymetry data was used to produce a substrate model called a seafloor character raster for the study area. The seafloor character raster consists of three substrate classes: soft-flat areas, hard-flat areas, and hard-rugged areas. A Coastal and Marine Ecological Classification Standard (CMECS) geoform and substrate map was also produced using depth, slope, and benthic position index classes to delineate geoform boundaries. Seven geoforms were identified in this process, including ridges, slump scars, slump deposits, basins, and pockmarks. Statistical analysis of the video data for correlations between substrate, depth, and invertebrate assemblages resulted in the identification of seven biomes: three hard-bottom biomes and four soft-bottom biomes. A similar analysis of vertebrate observations produced a similar set of biomes. The biome between-group dissimilarity was very high or high. Invertebrates alone account for most of the structuring of the benthic community into different assemblages. A biotope map was generated using the seafloor character raster and the substrate and depth values of the biomes. Hard-substrate biotopes were small in size and were located primarily on the ridge and in pockmarks along the pockmark lineation. The soft-bottom biotopes consisted of large contiguous areas delimited by isobaths.

  5. Synthesis of wrinkled mesoporous silica and its reinforcing effect for dental resin composites.

    PubMed

    Wang, Ruili; Habib, Eric; Zhu, X X

    2017-10-01

    The aim of this work is to explore the reinforcing effect of wrinkled mesoporous silica (WMS), which should allow micromechanical resin matrix/filler interlocking in dental resin composites, and to investigate the effect of silica morphology, loading, and compositions on their mechanical properties. WMS (average diameter of 496 nm) was prepared through the self-assembly method and characterized by the use of electron microscopy, dynamic light scattering, and N2 adsorption-desorption measurements. The mechanical properties of resin composites containing silanized WMS and nonporous smaller silica were evaluated with a universal mechanical testing machine. Field-emission scanning electron microscopy was used to study the fracture morphology of dental composites. Resin composites including silanized silica particles (average diameter of 507 nm) served as the control group. Higher filler loading of silanized WMS substantially improved the mechanical properties of the neat resin matrix, over the composites loaded with regular silanized silica particles similar in size. The impregnation of smaller secondary silica particles with diameters of 90 and 190 nm, denoted respectively as Si90 and Si190, increased the filler loading of the bimodal WMS filler (WMS-Si90 or WMS-Si190) to 60 wt%, and the corresponding composites exhibited better mechanical properties than the control fillers made with regular silica particles. Among all composites, the optimal WMS-Si190-filled composite (mass ratio WMS:Si190 = 10:90, total filler loading 60 wt%) exhibited the best mechanical performance including flexural strength, flexural modulus, compressive strength and Vickers microhardness. The incorporation of WMS and its mixed bimodal fillers with smaller silica particles led to the design and formulation of dental resin composites with superior mechanical properties. Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  6. Gulf of Mexico Data Atlas: Digital Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Rose, K.

    2014-12-01

    The Gulf of Mexico Data Atlas is an online data discovery and access tool that allows users to browse a growing collection of ecosystem-related datasets visualized as map plates. Thematically, the Atlas includes updated long-term assessments of the physical, biological, environmental, economic and living marine resource characteristics that indicate baseline conditions of the Gulf of Mexico ecosystems. These data are crucial components of integrated ecosystem assessments and modeling and support restoration and monitoring efforts in the Gulf. A multi-agency executive steering committee including members from international, federal, state, and non-governmental organizations was established to guide Atlas development and to contribute data and expertise. The Atlas currently contains over 235 maps in 70 subject areas. Each map plate is accompanied by a descriptive summary authored by a subject matter expert and each data set is fully documented by metadata in Federal Geographic Data Committee (FGDC)-compliant standards. Source data are available in native formats and as web mapping services (WMS). Datasets are also searchable through an accompanying Map Catalog and RSS feed. The Gulf of Mexico Data Atlas is an operational example of the philosophy of leveraging resources among agencies and activities involved in geospatial data as outlined in the US Department of Interior and FGDC "Geospatial Platform Modernization Roadmap v4 - March 2011". We continue to update and add datasets through existing and new partnerships to ensure that the Atlas becomes a truly ecosystem-wide resource.

  7. Method for calibration-free scanned-wavelength modulation spectroscopy for gas sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanson, Ronald K.; Jeffries, Jay B.; Sun, Kai

    A method of calibration-free scanned-wavelength modulation spectroscopy (WMS) absorption sensing is provided by obtaining absorption lineshape measurements of a gas sample on a sensor using 1f-normalized WMS-2f, where an injection current to an injection-current-tunable diode laser (TDL) is modulated at a frequency f and a wavelength modulation and an intensity modulation of the TDL are simultaneously generated; extracting, using a numerical lock-in program and a low-pass filter of appropriate bandwidth, WMS-nf (n=1, 2, . . . ) signals, where the WMS-nf signals are harmonics of f; determining a physical property of the gas sample according to ratios of the WMS-nf signals; determining the zero-absorption background using scanned-wavelength WMS; and determining non-absorption losses using at least two of the harmonics, whereby the need for a non-absorption baseline measurement is removed in environments where collision broadening has blended transition linewidths, and calibration-free WMS measurements are enabled without knowledge of the transition linewidth.

  8. Data Access System for Hydrology

    NASA Astrophysics Data System (ADS)

    Whitenack, T.; Zaslavsky, I.; Valentine, D.; Djokic, D.

    2007-12-01

    As part of the CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc., Hydrologic Information System), the CUAHSI HIS team has developed the Data Access System for Hydrology, or DASH. DASH is based on commercial off-the-shelf technology, which has been developed in conjunction with a commercial partner, ESRI. DASH is a web-based user interface, developed in ASP.NET using ESRI ArcGIS Server 9.2, that provides a mapping, querying and data retrieval interface over observation and GIS databases and web services. This is the front-end application for the CUAHSI Hydrologic Information System Server. The HIS Server is a software stack that organizes observation databases, geographic data layers, data importing and management tools, and online user interfaces such as the DASH application into a flexible multi-tier application for serving both national-level and locally maintained observation data. The user interface of the DASH web application allows online users to query observation networks by location and attributes, selecting stations in a user-specified area where a particular variable was measured during a given time interval. Once one or more stations and variables are selected, the user can retrieve and download the observation data for further off-line analysis. The DASH application is highly configurable. The mapping interface can be configured to display map services from multiple sources in multiple formats, including ArcGIS Server, ArcIMS, and WMS. The observation network data are configured in an XML file that specifies the network's web service location and its corresponding map layer. Upon initial deployment, two national-level observation networks (USGS NWIS daily values and USGS NWIS instantaneous values) are already pre-configured. There is also an optional login page which can be used to restrict access as well as to provide an alternative to immediate downloads. For large requests, users are notified via email with a link to their data when it is ready.

  9. Modern Technologies aspects for Oceanographic Data Management and Dissemination : The HNODC Implementation

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.

    2009-04-01

    The development of new technologies aimed at enhancing web applications with dynamic data access was also the starting point for the development of geospatial web applications. By means of these technologies, web applications embed the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as Web Services enables web applications to interoperate, i.e. to process requests from each other via a network. Throughout the oceanographic community in particular, modern geographical information systems based on geospatial web services are now being developed, or will be developed in the near future, with capabilities for managing the information fully through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with web services and built with open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Centre (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centres (IOC/IODE), holds a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed storing over 300,000 station records of physical, chemical and biological oceanographic information. A modern web application that allows end users worldwide to explore and navigate HNODC data through an interface capable of presenting geographical representations of the geo-information is today a fact. The application is built from state-of-the-art software components and tools such as: • geospatial and non-spatial web service mechanisms; • geospatial open source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data as geographical layers, since it inherits the big advantage of interoperability between web service systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema because of the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms; it comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the web service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial web service mechanism. These modules of the platform are still under development, but their implementation will be completed in the near future. • A new web user interface for the end user is based on an enhanced and customized version of the Mapbender GUI, a powerful web services client. For HNODC, the interoperability of web services is the big advantage of the developed platform, since it will be able to act in the future as both a provider and a consumer of web services: • either as a data product provider for external SOA platforms, • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great example of data management integration and dissemination through such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating the large and diverse data sets and to enhance the currently existing infrastructures with web services. Furthermore, when the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
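
    A minimal sketch of the WFS access pattern described in this architecture: a standard GetFeature request returning features as GML. The endpoint, feature type name and bounding box are placeholders for illustration only, not the actual HNODC service; a real deployment advertises its type names in the WFS GetCapabilities response.

        import requests

        # Hypothetical WFS endpoint and feature type.
        WFS_URL = "https://example.org/hnodc/wfs"

        params = {
            "SERVICE": "WFS",
            "VERSION": "1.1.0",
            "REQUEST": "GetFeature",
            "TYPENAME": "hnodc:stations",        # hypothetical feature type
            "BBOX": "22.0,35.0,29.0,41.0",       # illustrative lon/lat bounding box
            "MAXFEATURES": "100",
        }

        r = requests.get(WFS_URL, params=params, timeout=60)
        r.raise_for_status()
        print(r.text[:500])   # start of the GML document describing matching features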

  10. glideinWMS - A generic pilot-based Workload Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sfiligoi, Igor; /Fermilab

    The Grid resources are distributed among hundreds of independent Grid sites, requiring a higher level Workload Management System (WMS) to be used efficiently. Pilot jobs have been used for this purpose by many communities, bringing increased reliability, global fair share and just in time resource matching. GlideinWMS is a WMS based on the Condor glidein concept, i.e. a regular Condor pool, with the Condor daemons (startds) being started by pilot jobs, and real jobs being vanilla, standard or MPI universe jobs. The glideinWMS is composed of a set of Glidein Factories, handling the submission of pilot jobs to a set of Grid sites, and a set of VO Frontends, requesting pilot submission based on the status of user jobs. This paper contains the structural overview of glideinWMS as well as a detailed description of the current implementation and the current scalability limits.

  11. Wavelength modulation diode laser absorption spectroscopy for high-pressure gas sensing

    NASA Astrophysics Data System (ADS)

    Sun, K.; Chao, X.; Sur, R.; Jeffries, J. B.; Hanson, R. K.

    2013-03-01

    A general model for 1f-normalized wavelength modulation absorption spectroscopy with nf detection (i.e., WMS-nf) is presented that considers the performance of injection-current-tuned diode lasers and the reflective interference produced by other optical components on the line-of-sight (LOS) transmission intensity. This model explores the optimization of sensitive detection of optical absorption by species with structured spectra at elevated pressures. Predictions have been validated by comparison with measurements of the 1f-normalized WMS-nf (for n = 2-6) lineshape of the R(11) transition in the first overtone band of CO near 2.3 μm at four different pressures ranging from 5 to 20 atm, all at room temperature. The CO mole fractions measured by 1f-normalized WMS-2f, 3f, and 4f techniques agree with calibrated mixtures within 2.0%. At conditions where absorption features are significantly broadened and large modulation depths are required, uncertainties in the WMS background signals due to reflective interference in the optical path can produce significant error in gas mole fraction measurements by 1f-normalized WMS-2f. However, such potential errors can be greatly reduced by using the higher harmonics, i.e., 1f-normalized WMS-nf with n > 2. In addition, less interference from pressure-broadened neighboring transitions has been observed for WMS with higher harmonics than for WMS-2f.

  12. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry - the LMMP Portal - by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). The data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT) - an open source data catalog, archive, file management and data grid framework; OpenSSO - an open source access management and federation platform; Solr - an open source enterprise search platform; Redmine - an open source project collaboration and management framework; GDAL - an open source geospatial data abstraction library; and others. The data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
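
    To illustrate how a standards-based client can discover what a portal's WMS offers, the sketch below fetches a GetCapabilities document and lists the advertised layer names. The service URL is a placeholder rather than the actual LMMP endpoint; any OGC-compliant WMS answers the same request.

        import requests
        import xml.etree.ElementTree as ET

        # Hypothetical WMS endpoint.
        url = "https://example.nasa.gov/lmmp/wms"
        params = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetCapabilities"}

        r = requests.get(url, params=params, timeout=60)
        r.raise_for_status()

        root = ET.fromstring(r.content)
        ns = {"wms": "http://www.opengis.net/wms"}   # WMS 1.3.0 XML namespace
        for layer in root.findall(".//wms:Layer/wms:Name", ns):
            print(layer.text)                        # names usable in GetMap LAYERS=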

  13. Web Services as Building Blocks for an Open Coastal Observing System

    NASA Astrophysics Data System (ADS)

    Breitbach, G.; Krasemann, H.

    2012-04-01

    In coastal observing systems it is necessary to integrate different observing methods, such as remote sensing, in-situ measurements, and models, into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic seas). Data from satellite and radar remote sensing and measurements from buoys, stations and FerryBoxes are the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This Web Feature Service holds the metadata necessary not only for finding data, but also the URLs of the web services used to view and download the data. To make the data from different sources comparable, a common vocabulary is needed. For COSYNA, the standard names from the CF conventions are stored within the metadata whenever possible. For the metadata, an INSPIRE- and ISO 19115-compatible data format is used. The WFS is fed from the metadata system using database views. Actual data are stored in two different formats: in NetCDF files for gridded data and in an RDBMS for time-series-like data. The web service URLs are mostly standards-based; the standards are mainly OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms. In this case, download of data is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download. The OGC CSW is used for accessing extended metadata. The concept of data management in COSYNA, which is independent of the specific services used in COSYNA, will be presented. This concept is parameter- and data-centric and might be useful for other observing systems.

  14. Newly Released TRMM Version 7 Products, Other Precipitation Datasets and Data Services at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, D.; Teng, W. L.; Trivedi, Bhagirath; Kempler, S.

    2012-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is home to global precipitation product archives, in particular, the Tropical Rainfall Measuring Mission (TRMM) products. TRMM is a joint U.S.-Japan satellite mission to monitor tropical and subtropical (40°S-40°N) precipitation and to estimate its associated latent heating. The TRMM satellite provides the first detailed and comprehensive dataset on the four dimensional distribution of rainfall and latent heating over vastly undersampled tropical and subtropical oceans and continents. The TRMM satellite was launched on November 27, 1997. TRMM data products are archived at and distributed by GES DISC. The newly released TRMM Version 7 consists of several changes including new parameters, new products, metadata, data structures, etc. For example, hydrometeor profiles in 2A12 now have 28 layers (14 in V6). New parameters have been added to several popular Level-3 products, such as 3B42 and 3B43. Version 2.2 of the Global Precipitation Climatology Project (GPCP) dataset has been added to the TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/), allowing online analysis and visualization without downloading data and software. The GPCP dataset extends back to 1979. Version 3 of the Global Precipitation Climatology Centre (GPCC) monitoring product has been updated in TOVAS as well. The product provides global gauge-based monthly rainfall along with the number of gauges per grid. The dataset begins in January 1986. To facilitate data and information access and support precipitation research and applications, we have developed a Precipitation Data and Information Services Center (PDISC; URL: http://disc.gsfc.nasa.gov/precipitation). In addition to TRMM, PDISC provides current and past observational precipitation data. Users can access precipitation data archives consisting of both remote sensing and in-situ observations. Users can use these data products to conduct a wide variety of activities, including case studies, model evaluation, uncertainty investigation, etc. To support Earth science applications, PDISC provides users with near-real-time precipitation products over the Internet. At PDISC, users can access tools and software. Documentation, FAQ and assistance are also available. Other capabilities include: 1) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Mirador is designed to be fast and easy to learn; 2) TOVAS; 3) NetCDF data download for the GIS community; 4) Data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/). OPeNDAP provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; 5) The Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml). The WMS is an interface that allows the use of data and enables clients to build customized maps with data coming from different networks.
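
    As a sketch of the OPeNDAP access route mentioned above, a gridded precipitation variable can be subset remotely without downloading whole files. The dataset URL and variable name here are placeholders rather than operative GES DISC endpoints, and real services may additionally require Earthdata authentication.

        import xarray as xr

        # Placeholder OPeNDAP URL; a real dataset URL would be found via Mirador
        # or the data center's OPeNDAP directory listing.
        url = "https://example.nasa.gov/opendap/precip/3B43_monthly.nc"

        ds = xr.open_dataset(url)              # lazy, remote access over OPeNDAP
        precip = ds["precipitation"]           # hypothetical variable name
        # Subset a region and time range; only the requested slice is transferred.
        subset = precip.sel(lat=slice(-10, 10), lon=slice(90, 150),
                            time=slice("1998-01-01", "1998-12-31"))
        print(subset.mean().values)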

  15. Newly Released TRMM Version 7 Products, GPCP Version 2.2 Precipitation Dataset and Data Services at NASA GES DISC

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Liu, Z.; Teng, W. L.; Trivedi, B.; Kempler, S.

    2011-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is home to global precipitation product archives, in particular, the Tropical Rainfall Measuring Mission (TRMM) products. TRMM is a joint U.S.-Japan satellite mission to monitor tropical and subtropical (40°S-40°N) precipitation and to estimate its associated latent heating. The TRMM satellite provides the first detailed and comprehensive dataset on the four dimensional distribution of rainfall and latent heating over vastly undersampled tropical and subtropical oceans and continents. The TRMM satellite was launched on November 27, 1997. TRMM data products are archived at and distributed by GES DISC. The newly released TRMM Version 7 consists of several changes including new parameters, new products, metadata, data structures, etc. For example, hydrometeor profiles in 2A12 now have 28 layers (14 in V6). New parameters have been added to several popular Level-3 products, such as 3B42 and 3B43. Version 2.2 of the Global Precipitation Climatology Project (GPCP) dataset has been added to the TRMM Online Visualization and Analysis System (TOVAS; URL: http://disc2.nascom.nasa.gov/Giovanni/tovas/), allowing online analysis and visualization without downloading data and software. The GPCP dataset extends back to 1979. Results of a basic intercomparison between the new and the previous versions of both TRMM and GPCP will be presented to help understand changes in data product characteristics. To facilitate data and information access and support precipitation research and applications, we have developed a Precipitation Data and Information Services Center (PDISC; URL: http://disc.gsfc.nasa.gov/precipitation). In addition to TRMM, PDISC provides current and past observational precipitation data. Users can access precipitation data archives consisting of both remote sensing and in-situ observations. Users can use these data products to conduct a wide variety of activities, including case studies, model evaluation, uncertainty investigation, etc. To support Earth science applications, PDISC provides users with near-real-time precipitation products over the Internet. At PDISC, users can access tools and software. Documentation, FAQ and assistance are also available. Other capabilities include: 1) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Mirador is designed to be fast and easy to learn; 2) TOVAS; 3) NetCDF data download for the GIS community; 4) Data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/). OPeNDAP provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; 5) The Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml). The WMS is an interface that allows the use of data and enables clients to build customized maps with data coming from different networks. More details along with examples will be presented.

  16. Comparison between wire mesh sensor and gamma densitometry void measurements in two-phase flows

    NASA Astrophysics Data System (ADS)

    Sharaf, S.; Da Silva, M.; Hampel, U.; Zippe, C.; Beyer, M.; Azzopardi, B.

    2011-10-01

    Wire mesh sensors (WMS) are fast imaging instruments that are used for gas-liquid and liquid-liquid two-phase flow measurements and experimental investigations. Experimental tests were conducted at Helmholtz-Zentrum Dresden-Rossendorf to test both the capacitance and conductance WMS against a gamma densitometer (GD). A small gas-liquid test facility was utilized. This consisted of a vertical round pipe approximately 1 m in length, and 50 mm internal diameter. A 16 × 16 WMS was used with high spatial and temporal resolutions. Air-deionized water was the two-phase mixture. The gas superficial velocity was varied between 0.05 m s-1 and 1.4 m s-1 at two liquid velocities of 0.2 and 0.7 m s-1. The GD consisted of a collimated source and a collimated detector. The GD was placed on a moving platform close to the plane of wires of the sensor, in order to align it accurately using a counter mechanism, with each of the wires of the WMS, and the platform could scan the full section of the pipe. The WMS was operated as a conductivity WMS for a half-plane with eight wires and as a capacitance WMS for the other half. For the cross-sectional void (time and space averaged), along each wire, there was good agreement between WMS and the GD chordal void fraction near the centre of the pipe.
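
    The comparison described above rests on reducing the WMS data to chordal, time-averaged void fractions along each wire, which are then set against the gamma densitometer's chordal values. A simplified, unweighted version of that reduction (with synthetic data standing in for real measurements) is sketched below; an actual analysis would weight each crossing point by its contribution to the pipe cross-section.

        import numpy as np

        # Synthetic stand-in for WMS output: local instantaneous void fraction
        # sampled on a 16 x 16 crossing-point grid over n_frames time steps.
        rng = np.random.default_rng(0)
        n_frames, n_wires = 10_000, 16
        void = rng.uniform(0.0, 1.0, size=(n_frames, n_wires, n_wires))

        # Time-average first, then average the crossing points along each wire to
        # obtain one chordal void fraction per wire.
        time_avg = void.mean(axis=0)            # (16, 16) time-averaged cross-section
        chordal = time_avg.mean(axis=1)         # (16,) one value per wire (chord)
        cross_section_avg = time_avg.mean()     # single time-and-space averaged value

        print(chordal.round(3), round(float(cross_section_avg), 3))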

  17. US National Geothermal Data System: Web feature services and system operations

    NASA Astrophysics Data System (ADS)

    Richard, Stephen; Clark, Ryan; Allison, M. Lee; Anderson, Arlene

    2013-04-01

    The US National Geothermal Data System is being developed with support from the US Department of Energy to reduce risk in geothermal energy development by providing online access to the body of geothermal data available in the US. The system is being implemented using Open Geospatial Consortium web services for catalog search (CSW), map browsing (WMS), and data access (WFS). The catalog now includes 2427 registered resources, mostly individual documents accessible via URL. 173 WMS and WFS services are registered, hosted by 4 NGDS system nodes as well as by 6 other state geological surveys. Simple feature schemas for interchange formats have been developed by an informal community process in which draft content models are developed based on the information actually available in most data providers' internal datasets. A template pattern is used for the content models so that commonly used content items have the same name and data type across models. Models are documented in Excel workbooks and posted for community review with a deadline for comment; at the end of the comment period a technical working group reviews and discusses comments and votes on adoption. When adopted, an XML schema is implemented for the content model. Our approach has been to keep the focus of each interchange schema narrow, such that simple-feature (flat file) XML schemas are sufficient to implement the content model. Keeping individual interchange formats simple, and allowing flexibility to introduce new content models as needed, have both assisted in adoption of the service architecture. One problem that remains to be solved is that off-the-shelf server packages (GeoServer, ArcGIS Server) do not permit configuration of a normative schema location to be bound with XML namespaces in instance documents. Such configuration is possible with GeoServer using a more complex deployment process. XML interchange format schema versions are indicated by the namespace URI; because of the schema location problems, namespace URIs are redirected to the normative schema location. An additional issue that needs consideration is the expected lifetime of a service instance. A service contract should be accessible online and discoverable as part of the metadata for each service instance; this contract should specify the policy for the service termination process, e.g. how notification will be made and whether there is an expected end-of-life date. Application developers must be aware of these lifetime limitations to avoid unexpected failures. The evolution of the service inventory to date has been driven primarily by data providers wishing to improve access to their data holdings. Focus is currently shifting towards improving tools for data consumer interaction: search, data inspection, and download. Long-term viability of the system depends on business interdependence between the data providers and data consumers.

  18. ERDDAP - An Easier Way for Diverse Clients to Access Scientific Data From Diverse Sources

    NASA Astrophysics Data System (ADS)

    Mendelssohn, R.; Simons, R. A.

    2008-12-01

    ERDDAP is a new open-source, web-based service that aggregates data from other web services: OPeNDAP grid servers (THREDDS), OPeNDAP sequence servers (Dapper), NOS SOAP service, SOS (IOOS, OOStethys), microWFS, DiGIR (OBIS, BMDE). Regardless of the data source, ERDDAP makes all datasets available to clients via standard (and enhanced) DAP requests and makes some datasets accessible via WMS. A client's request also specifies the desired format for the results, e.g., .asc, .csv, .das, .dds, .dods, htmlTable, XHTML, .mat, netCDF, .kml, .png, or .pdf (formats more directly useful to clients). ERDDAP interprets a client request, requests the data from the data source (in the appropriate way), reformats the data source's response, and sends the result to the client. Thus ERDDAP makes data from diverse sources available to diverse clients via standardized interfaces. Clients don't have to install libraries to get data from ERDDAP because ERDDAP is RESTful and resource-oriented: a URL completely defines a data request and the URL can be used in any application that can send a URL and receive a file. This also makes it easy to use ERDDAP in mashups with other web services. ERDDAP could be extended to support other protocols. ERDDAP's hub and spoke architecture simplifies adding support for new types of data sources and new types of clients. ERDDAP includes metadata management support, catalog services, and services to make graphs and maps.
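
    Because an ERDDAP request is just a URL, in which the file-type extension selects the output format, a client can be as simple as the sketch below. The server address, dataset ID and variable name are illustrative placeholders, not real ERDDAP datasets; the constraint syntax follows the general griddap pattern of variable followed by bracketed dimension ranges.

        import requests

        # Illustrative ERDDAP server and griddap dataset ID.
        server = "https://example.org/erddap"
        dataset = "sst_analysis"          # hypothetical dataset ID
        file_type = ".csv"                # .nc, .json, .png, ... also select formats

        # griddap constraint syntax: var[(time)][(latitude range)][(longitude range)]
        query = "sst[(2008-01-01T00:00:00Z)][(30.0):(40.0)][(-80.0):(-70.0)]"
        url = f"{server}/griddap/{dataset}{file_type}?{query}"

        r = requests.get(url, timeout=120)
        r.raise_for_status()
        print(r.text.splitlines()[:5])    # header rows of the CSV response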

  19. Temporal Stability of the Dutch Version of the Wechsler Memory Scale-Fourth Edition (WMS-IV-NL).

    PubMed

    Bouman, Zita; Hendriks, Marc P H; Aldenkamp, Albert P; Kessels, Roy P C

    2015-01-01

    The Wechsler Memory Scale-Fourth Edition (WMS-IV) is one of the most widely used memory batteries. We examined the test-retest reliability, practice effects, and standardized regression-based (SRB) change norms for the Dutch version of the WMS-IV (WMS-IV-NL) after both short and long retest intervals. The WMS-IV-NL was administered twice after either a short (M = 8.48 weeks, SD = 3.40 weeks, range = 3-16) or a long (M = 17.87 months, SD = 3.48, range = 12-24) retest interval in a sample of 234 healthy participants (M = 59.55 years, range = 16-90; 118 completed the Adult Battery; and 116 completed the Older Adult Battery). The test-retest reliability estimates varied across indexes. They were adequate to good after a short retest interval (ranging from .74 to .86), with the exception of the Visual Working Memory Index (r = .59), yet generally lower after a long retest interval (ranging from .56 to .77). Practice effects were only observed after a short retest interval (overall group mean gains up to 11 points), whereas no significant change in performance was found after a long retest interval. Furthermore, practice effect-adjusted SRB change norms were calculated for all WMS-IV-NL index scores. Overall, this study shows that the test-retest reliability of the WMS-IV-NL varied across indexes. Practice effects were observed after a short retest interval, but no evidence was found for practice effects after a long retest interval from one to two years. Finally, the SRB change norms were provided for the WMS-IV-NL.

  20. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Markert, Kel; Ashmall, William; Johnson, Gary; Saah, David; Mollicone, Danilo; Diaz, Alfonso Sanchez-Paus; Anderson, Eric; Flores, Africa; Griffin, Robert

    2017-01-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.
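
    GeoDash widgets are driven by Earth Engine queries of the kind sketched below, shown here with the Earth Engine Python API for illustration (GeoDash itself builds the equivalent requests on the server side). The plot location, date range and collection ID are illustrative assumptions, and the snippet presumes prior Earth Engine authentication.

        import ee

        ee.Initialize()   # assumes the account has already been authenticated

        # Illustrative plot location and date range for an interpretation sample.
        plot = ee.Geometry.Point([102.5, 17.9])

        # Landsat surface-reflectance collection; the collection ID is illustrative --
        # use whichever Landsat collection is current in the GEE data catalog.
        collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                      .filterBounds(plot)
                      .filterDate("2015-01-01", "2016-12-31"))

        print("Scenes over the plot:", collection.size().getInfo())

        # A quick-look composite of the kind a GeoDash-style map widget might display.
        composite = collection.median().select(["SR_B4", "SR_B3", "SR_B2"])
        thumb_url = composite.getThumbURL({
            "region": plot.buffer(5000).bounds(),
            "dimensions": 256,
            "min": 0, "max": 30000,
        })
        print(thumb_url)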

  1. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Markert, K. N.; Ashmall, W.; Johnson, G.; Saah, D. S.; Anderson, E.; Flores Cordova, A. I.; Díaz, A. S. P.; Mollicone, D.; Griffin, R.

    2017-12-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.

  2. Story Maps as an Effective Social Medium for Data Synthesis, Communication, and Dissemination

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Verrill, A.; Artz, M.; Deming, R.

    2014-12-01

    The story map is a new medium for sharing not only data, but also photos, videos, sounds, and maps, as a way to tell a specific and compelling story by way of that content. It is emerging as a popular and effective social medium. The user may employ some fairly sophisticated cartographic functionality without advanced training in cartography or GIS. Story maps are essentially web map applications built from web maps, which in turn are built from web-accessible data (including OGC WMS and WFS). This paper will emphasize the approaches and technologies of web-based GIS to tell "stories" about important connections among scientists, resource managers, and policy makers focused on oceans and coasts within the US, and how combining the new medium of "intelligent web maps" with text, multimedia content, and intuitive user experiences has great potential to synthesize the data and its primary interpretive message in order to inform, educate, and inspire about a wide variety of ocean science and policy issues.

  3. Test Review: Advanced Clinical Solutions for WAIS-IV and WMS-IV

    ERIC Educational Resources Information Center

    Chu, Yiting; Lai, Mark H. C.; Xu, Yining; Zhou, Yuanyuan

    2012-01-01

    The authors review the "Advanced Clinical Solutions for WAIS-IV and WMS-IV". The "Advanced Clinical Solutions (ACS) for the Wechsler Adult Intelligence Scale-Fourth Edition" (WAIS-IV; Wechsler, 2008) and the "Wechsler Memory Scale-Fourth Edition" (WMS-IV; Wechsler, 2009) was published by Pearson in 2009. It is a…

  4. NASA GES DISC support of CO2 Data from OCO-2, ACOS, and AIRS

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer C; Vollmer, Bruce E.; Savtchenko, Andrey K.; Hearty, Thomas J; Albayrak, Rustem Arif; Deshong, Barbara E.

    2013-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the data center assigned to archive and distribute current AIRS and ACOS data and data from the upcoming OCO-2 mission. The GES DISC archives and supports data containing information on CO2 as well as other atmospheric composition, atmospheric dynamics, modeling and precipitation data. Along with data stewardship, an important mission of the GES DISC is to facilitate access to and enhance the usability of the data, as well as to broaden the user base. The GES DISC strives to promote awareness of the science content and novelty of the data by working with Science Team members and releasing news articles as appropriate. Analysis of events that are of interest to the general public, and that help in understanding the goals of NASA Earth observing missions, has been among the most popular practices. Users have unrestricted access to a user-friendly search interface, Mirador, that allows temporal, spatial, keyword and event searches, as well as an ontology-driven drill-down. Variable subsetting, format conversion, quality screening, and quick browse are among the services available in Mirador. The majority of the GES DISC data are also accessible through OPeNDAP (Open-source Project for a Network Data Access Protocol) and WMS (Web Map Service). These services add more options for specialized subsetting, format conversion and image viewing, and contribute to data interoperability.

  5. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of, and access to, the datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualized as an open access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.

  6. Interoperability In The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly being used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example by the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris as well as other international institutions. The protocols and standards currently being supported by the new Planetary Science Archive at this time are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA consists of a Geoserver (an open-source map server), the goal of which is to support use cases such as the distribution of search results, sharing and processing data through a OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats like Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to be able to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.

  7. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.

  8. Utilizing Free and Open Source Software to access, view and compare in situ observations, EO products and model output data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators for a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geospatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geospatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecast parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, to allow modelers to quickly compare their results with available in situ and satellite observations. We have implemented the web-based GIS system using free and open source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards to communicate between the different services, and we use WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD. The main advantage we got from using FOSS was that we did not have to reinvent the wheel, but could use already existing code and functionality in our software for free. Of course, most of the software did not have to be open source for this, but in some cases we had to make minor modifications to make the different technologies work together. We could extract the parts of the code that we needed for a specific task. One example of this was using part of the code from ncWMS and Thredds to help our main application both read NetCDF files and present them in the browser. This presentation will focus on both the difficulties we had with, and the advantages we got from, developing this tool with FOSS.

  9. DOSoReMI.hu: collection of countrywide DSM products partly according to GSM.net specifications, partly driven by specific user demands

    NASA Astrophysics Data System (ADS)

    Pásztor, László; Laborczi, Annamária; Takács, Katalin; Szatmári, Gábor; Illés, Gábor; Bakacsi, Zsófia; Szabó, József

    2017-04-01

    Due to former soil surveys and mapping activities, a significant amount of soil information has accumulated in Hungary. In traditional soil mapping the creation of a new map was troublesome and laborious. As a consequence, robust maps were elaborated and the demands were rather fitted to the available map products. Until recently, spatial soil information demands have been serviced with the available datasets, either in their actual form or after certain specific and often enforced thematic and spatial inference. Considerable imperfection may occur in the accuracy and reliability of the map products, since there might be significant discrepancies between the available data and the expected information. The DOSoReMI.hu (Digital, Optimized, Soil Related Maps and Information in Hungary) project was started with the intention of renewing the national soil spatial infrastructure in Hungary. During our activities we have significantly extended the ways in which soil information requirements can be satisfied. Soil property, soil type as well as functional soil maps were targeted. The set of applied digital soil mapping techniques has been gradually broadened, incorporating and eventually integrating geostatistical, data mining and GIS tools. Soil property maps have been compiled partly according to GSM.net specifications and partly by slightly or more strictly changing some of their predefined parameters (depth intervals, pixel size, property, etc.) according to the specific demands on the final products. The elaborated primary maps were further processed, since DOSoReMI.hu also intended to take steps towards the regionalization of higher-level soil information (processes, functions, and services), involving crop models in the spatial modelling. The framework of DOSoReMI.hu also provides the opportunity for the elaboration of goal-specific soil maps, with the prescription of the parameters (theme, resolution, accuracy, reliability, etc.) characterizing the map product. As a result, unique digital soil map products (in a more general meaning) were elaborated, regionalizing specific soil-related features which were never mapped before, even nationally, with high (about 1 ha) spatial resolution. Based upon the collected experience, the full range of GSM.net products was also targeted. The web publishing of the results was also elaborated, creating a proper WMS environment. Our paper will present the resulting national maps, together with some conclusions drawn from the experience. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA) under Grant K105167 and AGRARKLÍMA.2 VKSZ_12-1-2013-0034.

  10. Clinical utility of the Wechsler Memory Scale - Fourth Edition (WMS-IV) in patients with intractable temporal lobe epilepsy.

    PubMed

    Bouman, Zita; Elhorst, Didi; Hendriks, Marc P H; Kessels, Roy P C; Aldenkamp, Albert P

    2016-02-01

    The Wechsler Memory Scale (WMS) is one of the most widely used test batteries to assess memory functions in patients with brain dysfunctions of different etiologies. This study examined the clinical validation of the Dutch Wechsler Memory Scale - Fourth Edition (WMS-IV-NL) in patients with temporal lobe epilepsy (TLE). The sample consisted of 75 patients with intractable TLE, who were eligible for epilepsy surgery, and 77 demographically matched healthy controls. All participants were examined with the WMS-IV-NL. Patients with TLE performed significantly worse than healthy controls on all WMS-IV-NL indices and subtests (p<.01), with the exception of the Visual Working Memory Index including its contributing subtests, as well as the subtests Logical Memory I, Verbal Paired Associates I, and Designs II. In addition, patients with mesiotemporal abnormalities performed significantly worse than patients with lateral temporal abnormalities on the subtests Logical Memory I and Designs II and all the indices (p<.05), with the exception of the Auditory Memory Index and Visual Working Memory Index. Patients with either a left or a right temporal focus performed equally on all WMS-IV-NL indices and subtests (F(15, 50)=.70, p=.78), as well as the Auditory-Visual discrepancy score (t(64)=-1.40, p=.17). The WMS-IV-NL is capable of detecting memory problems in patients with TLE, indicating that it is a sufficiently valid memory battery. Furthermore, the findings support previous research showing that the WMS-IV has limited value in identifying material-specific memory deficits in presurgical patients with TLE. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project presents: 1) development and deployment of an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation Framework data as listed in Table 1; 2) building a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by the OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integration of the OGC WFS with CEOSR's clearinghouse nodes; 4) establishment of a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) development of WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.

  12. glideinWMS—a generic pilot-based workload management system

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.

    2008-07-01

    The Grid resources are distributed among hundreds of independent Grid sites, requiring a higher level Workload Management System (WMS) to be used efficiently. Pilot jobs have been used for this purpose by many communities, bringing increased reliability, global fair share and just in time resource matching. glideinWMS is a WMS based on the Condor glidein concept, i.e. a regular Condor pool, with the Condor daemons (startds) being started by pilot jobs, and real jobs being vanilla, standard or MPI universe jobs. The glideinWMS is composed of a set of Glidein Factories, handling the submission of pilot jobs to a set of Grid sites, and a set of VO Frontends, requesting pilot submission based on the status of user jobs. This paper contains the structural overview of glideinWMS as well as a detailed description of the current implementation and the current scalability limits.

  13. An interoperable standard system for the automatic generation and publication of the fire risk maps based on Fire Weather Index (FWI)

    NASA Astrophysics Data System (ADS)

    Julià Selvas, Núria; Ninyerola Casals, Miquel

    2015-04-01

    An automatic system has been implemented to predict fire risk in the Principality of Andorra, a small country located in the eastern Pyrenees mountain range, bordered by Catalonia and France; due to its location, its landscape is a set of rugged mountains with an average elevation around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of different components, each measuring a different aspect of fire danger, calculated from the values of the weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) has a network of around 10 automatic meteorological stations, located in different places, peaks and valleys, that measure weather data such as relative humidity, wind direction and speed, surface temperature, rainfall and snow cover every ten minutes; these data are sent daily and automatically to the implemented system, where they are processed to filter incorrect measurements and to homogenize measurement units. The data are then used to calculate all components of the FWI at midday at the level of each station, creating a database with the values of the homogenized measurements and the FWI components for each weather station. In order to extend and model these data over the whole Andorran territory and to obtain a continuous map, an interpolation method based on a multiple regression with spline residual interpolation has been implemented. This interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation and sea distance. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) standard. Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
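
    The interpolation step described above (a multiple regression on terrain predictors plus spatial interpolation of the residuals) can be sketched as follows. The station coordinates, predictor values and FWI values below are synthetic, and a thin-plate RBF stands in for the spline residual interpolation actually used operationally.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        rng = np.random.default_rng(1)

        # Synthetic stations: coordinates plus predictors (altitude, radiation, ...).
        n = 10
        xy = rng.uniform(0, 30_000, size=(n, 2))          # station coordinates [m]
        predictors = np.column_stack([
            rng.uniform(1000, 2800, n),                   # altitude [m]
            rng.uniform(10, 30, n),                       # global solar radiation
        ])
        fwi = rng.uniform(0, 30, n)                       # FWI computed at the stations

        # 1) Multiple linear regression of FWI on the predictors.
        X = np.column_stack([np.ones(n), predictors])
        coef, *_ = np.linalg.lstsq(X, fwi, rcond=None)
        residuals = fwi - X @ coef

        # 2) Spatial interpolation of the regression residuals.
        rbf = RBFInterpolator(xy, residuals, kernel="thin_plate_spline")

        # 3) Prediction on a target cell = regression trend from its predictors
        #    plus the interpolated residual at its location.
        cell_xy = np.array([[15_000.0, 12_000.0]])
        cell_pred = np.array([[1.0, 2200.0, 22.0]])       # [1, altitude, radiation]
        fwi_cell = cell_pred @ coef + rbf(cell_xy)
        print(float(fwi_cell[0]))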

  14. Tropical Rainfall Measuring Mission (TRMM) Precipitation Data and Services for Research and Applications

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Teng, William; Kempler, Steven

    2012-01-01

    Precipitation is a critical component of the Earth's hydrological cycle. Launched on 27 November 1997, TRMM is a joint U.S.-Japan satellite mission to provide the first detailed and comprehensive data set of the four-dimensional distribution of rainfall and latent heating over vastly under-sampled tropical and subtropical oceans and continents (40°S-40°N). Over the past 14 years, TRMM has been a major data source for meteorological, hydrological and other research and application activities around the world. The purpose of this short article is to report that the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) provides TRMM archive and near-real-time precipitation data sets and services for research and applications. TRMM data consist of orbital data from TRMM instruments at the sensor's resolution, gridded data at a range of spatial and temporal resolutions, subsets, ground-based instrument data, and ancillary data. Data analysis, display, and delivery are facilitated by the following services: (1) Mirador (data search and access); (2) TOVAS (TRMM Online Visualization and Analysis System); (3) OPeNDAP (Open-source Project for a Network Data Access Protocol); (4) GrADS Data Server (GDS); and (5) Open Geospatial Consortium (OGC) Web Map Service (WMS) for the GIS community. Precipitation data application services are available to support a wide variety of applications around the world. Future plans include enhanced and new services to address data-related issues from the user community. Meanwhile, the GES DISC is preparing for the Global Precipitation Measurement (GPM) mission, which is scheduled for launch in 2014.

  15. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to traditional research scientists as well as the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages inter-disciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and on the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
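
    As a complement to WMS (which returns a rendered picture), a WCS GetCoverage request returns the underlying data values. A hedged sketch, with placeholder endpoint and coverage identifier:

      import requests

      params = {
          "service": "WCS",
          "version": "1.0.0",
          "request": "GetCoverage",
          "coverage": "MOD11_LST_Day",   # placeholder coverage identifier
          "crs": "EPSG:4326",
          "bbox": "-125,25,-66,50",      # conterminous United States
          "width": "1180",
          "height": "500",
          "format": "GeoTIFF",
      }
      r = requests.get("https://example.gov/wcs", params=params, timeout=120)
      r.raise_for_status()
      with open("lst_subset.tif", "wb") as f:
          f.write(r.content)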

  16. Models based on "out-of Kilter" algorithm

    NASA Astrophysics Data System (ADS)

    Adler, M. J.; Drobot, R.

    2012-04-01

    Where many water users are located along river stretches, it is very important, in case of low flows and drought periods, to develop an optimization model for water allocation that covers all needs under certain predefined constraints, depending on the Contingency Plan for drought management. Such a program was developed during the implementation of the WATMAN Project in Romania (WATMAN Project, 2005-2006, USTDA) for the Arges-Dambovita-Ialomita Basins water transfers. This good practice was proposed for the WATER CoRe Project - Good Practice Handbook for Drought Management (Interreg IVC, 2011), to be applied in the European Regions. Two types of simulation-optimization models based on an improved version of the out-of-kilter algorithm as the optimization technique have been developed and used in Romania: models supporting decisions on the short-term operation of a water management system (WMS), and models generically named SIMOPT that aim at the analysis of long-term WMS operation and have as their main results the statistical WMS functional parameters. A real WMS is modeled by an arcs-nodes network, so the real WMS operation problem becomes a problem of flows in networks. The nodes and oriented arcs, as well as their characteristics such as lower and upper limits and associated costs, are the direct analog of the physical and operational WMS characteristics. Arcs represent both physical and conventional elements of the WMS such as river branches, channels or pipes, water user demands or other water management requirements, tranches of water reservoir volumes, and water levels in channels or rivers; nodes are junctions of at least two arcs and stand for locations of lakes or water reservoirs and/or confluences of river branches, water withdrawal or wastewater discharge points, etc. Quantitative features of water resources, water users and water reservoirs or other water works are expressed as constraints of not violating the lower and upper limits assigned to arcs. Options of WMS functioning, i.e. water retention/discharge in/from the reservoirs or diversion of water from one part of the WMS to another in order to meet water demands, as well as the water user economic benefit or loss related to the degree to which water demand is met, are the defining elements of the objective function and are conventionally expressed by means of costs attached to the arcs. The problem of optimizing WMS operation is formulated as a flow-in-networks problem as follows: find the flow that minimizes the cost in the whole network while meeting the constraints of continuity at nodes and the constraints of not exceeding the lower and upper flow limits on arcs. Conversion of the WMS into the arcs-nodes network and the adequate choice of costs and limits on arcs are steps of a unitary process and depend on the goal of the respective model.
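
    The arcs-nodes formulation can be illustrated with a toy network solved by a generic minimum-cost-flow routine; the out-of-kilter algorithm is one such method, while the sketch below uses networkx instead and omits the lower limits on arcs, which out-of-kilter handles directly. All node demands, capacities and costs are invented:

      import networkx as nx

      G = nx.DiGraph()
      # supply at the reservoir node, net requirements at two users and downstream
      G.add_node("reservoir", demand=-10)   # negative demand = supply (10^6 m3)
      G.add_node("user_A", demand=4)
      G.add_node("user_B", demand=3)
      G.add_node("downstream", demand=3)    # residual flow leaving the system

      # arcs: capacity = upper limit, weight = unit cost (penalties favour
      # meeting user demands before spilling water directly downstream)
      G.add_edge("reservoir", "user_A", capacity=6, weight=1)
      G.add_edge("reservoir", "user_B", capacity=5, weight=1)
      G.add_edge("reservoir", "downstream", capacity=10, weight=5)
      G.add_edge("user_A", "downstream", capacity=6, weight=0)
      G.add_edge("user_B", "downstream", capacity=5, weight=0)

      # minimum-cost flow satisfying continuity at nodes and upper limits on arcs
      flow = nx.min_cost_flow(G)
      print(flow)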

  17. The CLIMB Geoportal - A web-based dissemination and documentation platform for hydrological modelling data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer

    2015-04-01

    Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services are basically meant for distributing, documenting and visualizing (spatial) data in a standardized manner; an important but challenging task, especially in large scientific projects with a high number of data suppliers and producers from various countries. This presentation focuses on introducing the free and open-source based geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu) that serves as the central platform for interchanging project-related spatial data and information. In this collaboration, financed by the EU-FP7 framework and coordinated at the LMU Munich, 21 partner institutions from nine European and non-European countries were involved. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data about the current state and future changes of the hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling outcomes - validated by the CLIMB partners - are offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data itself through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators such as discharge and a drought index, as well as uncertainty measures including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The implementation of the CLIMB Geoportal is based on version 2.0c5 of the open-source geospatial content management system GeoNode. It includes a GeoServer instance for providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS client based on GeoExt (GeoExplorer). PostgreSQL enhanced by PostGIS (versions 9.2.1/2.0.1) serves as the database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster format are stored file-based as GeoTIFFs. Due to the high number of model outputs, the generation of metadata (XML) and graphical rendering instructions (SLD) associated with each single layer of the WMS has been done automatically using the statistical software R. Additional applications that have been programmed during the project period include a Java-based interface for convenient download of the climate data initially needed as input for the hydrological modelling, as well as a tool for displaying time series of selected risk indicators, which is directly integrated into the portal structure and implemented using Python (Django) and JavaScript. The presented CLIMB Geoportal shows that relevant results of even large international research projects, involving many partners and varying national standards in data handling, can be effectively disseminated to stakeholders, policy makers and other interested parties. Thus, it is a successful example of using free and open-source software for providing long-term visibility of and access to data produced within a particular (environmental) research project.
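
    The per-layer rendering instructions mentioned above were generated automatically with R in the project; the same idea is sketched below in Python, writing a minimal SLD document for one raster layer (the layer name, class breaks and colours are invented examples):

      SLD_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
      <StyledLayerDescriptor version="1.0.0"
          xmlns="http://www.opengis.net/sld" xmlns:ogc="http://www.opengis.net/ogc">
        <NamedLayer>
          <Name>{layer}</Name>
          <UserStyle>
            <FeatureTypeStyle>
              <Rule>
                <RasterSymbolizer>
                  <ColorMap>
      {entries}
                  </ColorMap>
                </RasterSymbolizer>
              </Rule>
            </FeatureTypeStyle>
          </UserStyle>
        </NamedLayer>
      </StyledLayerDescriptor>
      """

      def write_sld(layer, breaks, colours, path):
          # one ColorMapEntry per class break, coloured according to `colours`
          entries = "\n".join(
              f'            <ColorMapEntry color="{c}" quantity="{q}"/>'
              for q, c in zip(breaks, colours)
          )
          with open(path, "w", encoding="utf-8") as f:
              f.write(SLD_TEMPLATE.format(layer=layer, entries=entries))

      write_sld("discharge_change_2050", [-50, 0, 50],
                ["#2166ac", "#f7f7f7", "#b2182b"], "discharge_change_2050.sld")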

  18. A National Crop Progress Monitoring System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.

    2011-12-01

    Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed, making intensive use of the data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratios of change relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with NASS' remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To facilitate the operational requirements and increase the accessibility of data and products by different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of the Open Geospatial Consortium Inc. Sensor observations and data are accessed through Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) where available. Products are also served through such open-specification-compliant services. For rendering and presentation, Web Map Service (WMS) is used. A Web-service-based system has been set up and deployed at dss.csiss.gmu.edu/NDVIDownload. Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
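
    A back-of-the-envelope version of the indices named above, using synthetic stand-ins for the MOD09 red and near-infrared reflectance and for a multi-year NDVI history:

      import numpy as np

      rng = np.random.default_rng(1)
      red = rng.uniform(0.03, 0.15, size=(4, 4))     # stand-in for MOD09 red band
      nir = rng.uniform(0.20, 0.45, size=(4, 4))     # stand-in for MOD09 NIR band

      # normalized difference vegetation index
      ndvi = (nir - red) / (nir + red)

      # multi-year NDVI stack for the same calendar period (10 synthetic years)
      history = rng.uniform(0.2, 0.8, size=(10, 4, 4))
      ndvi_min = history.min(axis=0)
      ndvi_max = history.max(axis=0)

      # Vegetation Condition Index: where current NDVI sits in the historical range
      vci = 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

      # ratio-to-multi-year-mean comparison, in the spirit of the MVCI above
      mvci_like = 100.0 * (ndvi - history.mean(axis=0)) / history.mean(axis=0)
      print(ndvi.round(2), vci.round(1), mvci_like.round(1), sep="\n\n")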

  19. OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    This presentation introduces OnEarth, a server side software package originally developed at the Jet Propulsion Laboratory (JPL), that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
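
    For illustration, an OGC WMTS KVP GetTile request of the kind such a tiled image server answers; the endpoint, layer identifier, tile matrix set and TIME dimension below are placeholders:

      import requests

      params = {
          "SERVICE": "WMTS",
          "REQUEST": "GetTile",
          "VERSION": "1.0.0",
          "LAYER": "MODIS_Terra_CorrectedReflectance_TrueColor",  # placeholder
          "STYLE": "default",
          "TILEMATRIXSET": "EPSG4326_250m",                       # placeholder
          "TILEMATRIX": "2",
          "TILEROW": "1",
          "TILECOL": "3",
          "FORMAT": "image/jpeg",
          "TIME": "2013-06-15",
      }
      r = requests.get("https://example.gov/wmts", params=params, timeout=60)
      r.raise_for_status()
      with open("tile.jpg", "wb") as f:
          f.write(r.content)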

  20. Understanding the unique biogeochemistry of the Mediterranean Sea: Insights from a coupled phosphorus and nitrogen model

    NASA Astrophysics Data System (ADS)

    Powley, Helen R.; Krom, Michael D.; Van Cappellen, Philippe

    2017-06-01

    The Mediterranean Sea (MS) is an oligotrophic basin whose offshore water column exhibits low dissolved inorganic phosphorus (P) and nitrogen (N) concentrations, unusually high nitrate (NO3) to phosphate (PO4) ratios, and distinct biogeochemical differences between the Western Mediterranean Sea (WMS) and Eastern Mediterranean Sea (EMS). A new mass balance model of P and N cycling in the WMS is coupled to a pre-existing EMS model to understand these biogeochemical features. Estimated land-derived inputs of reactive P and N to the WMS and EMS are similar per unit surface area, but marine inputs are 4 to 5 times greater for the WMS, which helps explain the approximately 3 times higher primary productivity of the WMS. The lateral inputs of marine sourced inorganic and organic P support significant fractions of new production in the WMS and EMS, similar to subtropical gyres. The mass balance calculations imply that the MS is net heterotrophic: dissolved organic P and N entering the WMS and EMS, primarily via the Straits of Gibraltar and Sicily, are mineralized to PO4 and NO3 and subsequently exported out of the basin by the prevailing anti-estuarine circulation. The high deepwater (DW) molar NO3:PO4 ratios reflect the high reactive N:P ratio of inputs to the WMS and EMS, combined with low denitrification rates. The lower DW NO3:PO4 ratio of the WMS (21) compared to the EMS (28) reflects lower reactive N:P ratios of inputs to the WMS, including the relatively low N:P ratio of Atlantic surface water flowing into the WMS.

    Plain Language Summary: The Mediterranean Sea (MS) is a marine desert: it exhibits extremely low biological productivity despite being almost entirely surrounded by land with high nutrient loadings from a large coastal population. To explain this paradox, we analyze the sources and fate of the two main nutrient elements that support the production of marine biomass, phosphorus (P) and nitrogen (N). We find that the main source of P and N to the MS is inflow of surface water from the Atlantic Ocean via the Strait of Gibraltar, not land-derived sources. This inflow is balanced by a return to the Atlantic Ocean of deeper Mediterranean water enriched in the biologically most active forms of P and N, phosphate and nitrate. The very low productivity of the MS therefore reflects a switch from less bioavailable chemical forms of P and N entering the MS to more bioavailable forms leaving the MS. Computer simulations reproduce these chemical differences when coupling the biological utilization and recycling of P and N to the circulation of the MS, which drives the water exchanges across the Strait of Gibraltar.
    These simulations also reproduce the differences in productivity and nutrient distributions between the western and eastern basins of the MS.

  21. Economies of density for on-site waste water treatment.

    PubMed

    Eggimann, Sven; Truffer, Bernhard; Maurer, Max

    2016-09-15

    Decentralised wastewater treatment is increasingly gaining interest as a means of responding to sustainability challenges. Cost comparisons are a crucial element of any sustainability assessment. While the cost characteristics of centralised waste water management systems (WMS) have been studied extensively, the economics of decentralised WMS are less understood. A key motivation for studying the costs of decentralised WMS is to compare the cost of centralised and decentralised WMS in order to decide on cost-efficient sanitation solutions. This paper outlines a model designed to assess those costs which depend on the spatial density of decentralised wastewater treatment plants in a region. Density-related costs are mostly linked to operation and maintenance activities which depend on transportation, like sludge removal or the visits of professionals to the plants for control, servicing or repairs. We first specify a modelled cost-density relationship for a region in a geometric two-dimensional space by means of heuristic routing algorithms that consider time and load-capacity restrictions. The generic model is then applied to a Swiss case study for which we specify a broad range of modelling parameters. As a result, we identify a 'hockey-stick'-shaped cost curve that is characterised by strong cost reductions at high density values which level out at around 1 to 1.5 plants per km². Variations in the cost curves are mostly due to differences in management approaches (scheduled or unscheduled emptying). In addition to the well-known diseconomies of scale in the case of centralised sanitation, we find a similar generic cost behaviour for decentralised sanitation due to economies of density. Low densities in sparsely populated regions thus result in higher costs for both centralised and decentralised systems. Policy implications are that efforts to introduce decentralised options in a region should consider the low-density/high-cost problem when comparing centralised and decentralised options. Copyright © 2016 Elsevier Ltd. All rights reserved.

  22. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    NASA Astrophysics Data System (ADS)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

    One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, Use Cases and best practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface with ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the global climate model results of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiments (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could be later integrated into the European Copernicus platform.

  23. Material-specific retroactive interference effects of the Wechsler Adult Intelligence Scale-Fourth Edition on the Wechsler Memory Scale-Fourth Edition in a nonclinical sample.

    PubMed

    Ingram, Nicolette S; Diakoumakos, Jessica V; Sinclair, Erin R; Crowe, Simon F

    2016-01-01

    This study investigated proactive and retroactive interference effects between the Wechsler Memory Scale-Fourth Edition (WMS-IV) using the flexible approach, and the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV). One hundred and eighty nonclinical participants were assigned to a four (visual interference, verbal interference, visual and verbal interference, vs. no interference) by two (retroactive vs. proactive) between-subjects design. The administration order of the tests was counterbalanced (i.e., administration of the WAIS-IV prior to the WMS-IV, and the WAIS-IV administered during the delay interval of the WMS-IV). The WAIS-IV produced significant retroactive interference effects on the WMS-IV; however, no proactive interference effect was observed. The retroactive interference effect was dependent on material specificity. The results indicate that material presented within the delay of the WMS-IV can have a significant effect on subsequent delayed recall. Clinicians should carefully consider the effects associated with carry-over effects of these tests when using them in combination.

  24. Climate Data Service in the FP7 EarthServer Project

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Grazia Veratelli, Maria

    2013-04-01

    EarthServer is a European Framework Program project that aims at developing and demonstrating the usability of open standards (OGC and W3C) in the management of multi-source, any-size, multi-dimensional spatio-temporal data - in short: "Big Earth Data Analytics". In order to demonstrate the feasibility of the approach, six thematic Lighthouse Applications (Cryospheric Science, Airborne Science, Atmospheric/Climate Science, Geology, Oceanography, and Planetary Science), each with 100+ TB, are implemented. The scope of the Atmospheric/Climate lighthouse application (Climate Data Service) is to implement a system containing global to regional 2D/3D/4D datasets retrieved from satellite observations, numerical modelling and in-situ observations. Data contained in the Climate Data Service comprise atmospheric profiles of temperature and humidity, aerosol content, AOT, and cloud properties provided by entities such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the Austrian Meteorological Service (Zentralanstalt für Meteorologie und Geodynamik - ZAMG), the Italian National Agency for new technologies, energies and sustainable development (ENEA), and Sweden's Meteorological and Hydrological Institute (Sveriges Meteorologiska och Hydrologiska Institut - SMHI). The system, through an easy-to-use web application, permits users to browse the loaded data, visualize their temporal evolution at a specific point through 2D graphs of a single field, compare different fields at the same point (e.g. temperatures from different models and satellite observations), and visualize maps of specific fields superimposed on high-resolution background maps. All data access and display operations are performed by means of OGC standard operations, namely WMS, WCS and WCPS. The EarthServer project has just started the second year of a 3-year development plan: at present the system contains subsets of the final database, with the scope of demonstrating I/O modules and visualization tools. At the end of the project all datasets will be available to the users.
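
    The point time-series extraction described above maps naturally onto a WCPS query. A sketch under the assumption of an invented coverage name, axis labels and service endpoint (one common KVP encoding is shown):

      import requests

      wcps = (
          'for c in (temperature_profile) '
          'return encode(c[Lat(45.0), Long(9.0), ansi("2010-01-01":"2010-12-31")], "csv")'
      )
      r = requests.get(
          "https://example.org/rasdaman/ows",          # placeholder endpoint
          params={"service": "WCS", "version": "2.0.1",
                  "request": "ProcessCoverages", "query": wcps},
          timeout=120,
      )
      r.raise_for_status()
      print(r.text[:200])   # comma-separated time series for the selected point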

  25. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various data sharing open sources, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE dramatically extends the awareness and visibility of scientific results. Using the embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
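
    One way such a platform can hand a WMS layer to Google Earth is to wrap a GetMap URL in a KML GroundOverlay; the service endpoint and layer name below are placeholders:

      # Write a minimal KML GroundOverlay whose image is fetched from a WMS GetMap URL.
      KML = """<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <GroundOverlay>
          <name>Surface air temperature (placeholder layer)</name>
          <Icon>
            <href>https://example.gov/wms?service=WMS&amp;version=1.1.1&amp;request=GetMap&amp;layers=air_temperature&amp;styles=&amp;srs=EPSG:4326&amp;bbox=-180,-90,180,90&amp;width=1024&amp;height=512&amp;format=image/png&amp;transparent=true</href>
          </Icon>
          <LatLonBox>
            <north>90</north><south>-90</south><east>180</east><west>-180</west>
          </LatLonBox>
        </GroundOverlay>
      </kml>
      """
      with open("air_temperature.kml", "w", encoding="utf-8") as f:
          f.write(KML)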

  26. Co-treatment of fruit and vegetable waste in sludge digesters: Chemical and spectroscopic investigation by fluorescence and Fourier transform infrared spectroscopy.

    PubMed

    Provenzano, Maria Rosaria; Cavallo, Ornella; Malerba, Anna Daniela; Di Maria, Francesco; Cucina, Mirko; Massaccesi, Luisa; Gigliotti, Giovanni

    2016-04-01

    In a previous work co-digestion of food waste and sewage sludge was performed in a pilot apparatus reproducing operating conditions of an existing full scale digester and processing waste mixed sludge (WMS) and fruit and vegetable waste (FVW) at different organic loading rates. An analysis of the relationship among bio-methane generation, process stability and digestate phytotoxicity was conducted. In this paper we considered humification parameters and spectroscopic analysis. Humification parameters indicated a higher not humified fraction (NH) and a lower degree of humification (DH) of FVW with respect to WMS (NH = 19.22 and 5.10%; DH = 36.65 and 61.94% for FVW and WMS, respectively) associated with their different chemical compositions and with the stabilization process previously undergone by sludge. FVW additions seemed to be favourable from an agronomical point of view since a lower percentage of organic carbon was lost. Fourier transform infrared spectra suggested consumption of aliphatics associated with rising bio-methane generation followed by accumulation of aliphatics and carboxylic acids when the biogas production dropped. The trend of peak ratios can be used as an indicator of the process efficiency. Fluorescence intensity of peak B associated with tryptophan-like substances and peak D associated with humic-like substances observed on tridimensional Excitation Emission Matrix maps increased up to the sample corresponding to the highest rate of biogas production. Overall, spectroscopic results provided evidence of different chemical pathways of anaerobic digestion associated with increasing amounts of FVW which led to different levels of biogas production. Copyright © 2016 Elsevier Ltd. All rights reserved.

  27. Development of a wheelchair mobility skills test for children and adolescents: combining evidence with clinical expertise.

    PubMed

    Sol, Marleen Elisabeth; Verschuren, Olaf; de Groot, Laura; de Groot, Janke Frederike

    2017-02-13

    Wheelchair mobility skills (WMS) training is regarded by children using a manual wheelchair and their parents as an important factor to improve participation and daily physical activity. Currently, there is no outcome measure available for the evaluation of WMS in children. Several wheelchair mobility outcome measures have been developed for adults, but none of these have been validated in children. Therefore the objective of this study is to develop a WMS outcome measure for children using the current knowledge from literature in combination with the clinical expertise of health care professionals, children and their parents. Mixed methods approach. Phase 1: Item identification of WMS items through a systematic review using the 'COnsensus-based Standards for the selection of health Measurement Instruments' (COSMIN) recommendations. Phase 2: Item selection and validation of relevant WMS items for children, using a focus group and interviews with children using a manual wheelchair, their parents and health care professionals. Phase 3: Feasibility of the newly developed Utrecht Pediatric Wheelchair Mobility Skills Test (UP-WMST) through pilot testing. Phase 1: Data analysis and synthesis of nine WMS-related outcome measures showed there is no widely used outcome measure with levels of evidence across all measurement properties. However, four outcome measures showed some levels of evidence on reliability and validity for adults. Twenty-two WMS items with the best clinimetric properties were selected for further analysis in phase 2. Phase 2: Fifteen items were deemed relevant for children, one item needed adaptation and six items were considered not relevant for assessing WMS in children. Phase 3: Two health care professionals administered the UP-WMST in eight children. The instructions of the UP-WMST were clear, but the scoring method of the height difference items needed adaptation. The outdoor items for rolling over soft surface and the side slope item were excluded in the final version of the UP-WMST due to logistic reasons. The newly developed 15-item UP-WMST is a validated outcome measure which is easy to administer in children using a manual wheelchair. More research regarding reliability, construct validity and responsiveness is warranted before the UP-WMST can be used in practice.

  28. AirNow Information Management System - Global Earth Observation System of Systems Data Processor for Real-Time Air Quality Data Products

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Dye, T. S.; White, J. E.; Dickerson, P.; Pasch, A. N.; Miller, D. S.; Chan, A. C.

    2012-12-01

    Built upon the success of the U.S. Environmental Protection Agency's (EPA) AirNow program (www.AirNow.gov), the AirNow-International (AirNow-I) system contains an enhanced suite of software programs that process and quality control real-time air quality and environmental data and distribute customized maps, files, and data feeds. The goals of the AirNow-I program are similar to those of the successful U.S. program and include fostering the exchange of environmental data; making advances in air quality knowledge and applications; and building a community of people, organizations, and decision makers in environmental management. In 2010, Shanghai became the first city in China to run this state-of-the-art air quality data management and notification system. AirNow-I consists of a suite of modules (software programs and schedulers) centered on a database. One such module is the Information Management System (IMS), which can automatically produce maps and other data products through the use of GIS software to provide the most current air quality information to the public. Developed with Global Earth Observation System of Systems (GEOSS) interoperability in mind, IMS is based on non-proprietary standards, with preference to formal international standards. The system depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. In particular, the specifications include standards for service-oriented architecture and web-based interfaces, such as a web mapping service (WMS), web coverage service (WCS), web feature service (WFS), sensor web services, and Really Simple Syndication (RSS) feeds. IMS is flexible, open, redundant, and modular. It also allows the merging of data grids to create complex grids that show comprehensive air quality conditions. For example, the AirNow Satellite Data Processor (ASDP) was recently developed to merge PM2.5 estimates from National Aeronautics and Space Administration (NASA) satellite data and AirNow observational data, creating more precise maps and gridded data products for under-monitored areas. The ASDP can easily incorporate other data feeds, including fire and smoke locations, to build enhanced real-time air quality data products. In this presentation, we provide an overview of the features and functions of IMS, an explanation of how data moves through IMS, the rationale of the system architecture, and highlights of the ASDP as an example of the modularity and scalability of IMS.

  29. The SOOS Data Portal, providing access to Southern Oceans data

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar

    2013-04-01

    The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Oceans that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools, and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web services protocols are used for serving of data via the internet. These include Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for the Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data: a Search link to the metadata catalogue enables search and discovery by simple text search, by geographic area, temporal extent, keyword, parameter, organisation, or by any combination of these, allowing users to gain access to further information and/or the data for download (searches can also be restricted to items which have either data to download, or attached map layers, or both); a Map interface for discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, and view animated time-series data streams; and data can be easily accessed and downloaded, including directly from OPeNDAP/THREDDS servers. The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process, and the initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems - with the ability to integrate layers between themes. The portal is in its infancy (pilot launched January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.
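
    A hedged example of the WFS download path mentioned above; the GeoServer endpoint, feature type name and CSV output format are placeholders:

      import requests

      params = {
          "service": "WFS",
          "version": "1.1.0",
          "request": "GetFeature",
          "typeName": "soos:sea_ice_extent",   # placeholder feature type
          "outputFormat": "text/csv",          # placeholder output format
          "maxFeatures": "100",
      }
      r = requests.get("https://example.org/geoserver/wfs", params=params, timeout=120)
      r.raise_for_status()
      with open("sea_ice_extent.csv", "wb") as f:
          f.write(r.content)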

  30. Implementing Health Policy: Lessons from the Scottish Well Men's Policy Initiative.

    PubMed

    Douglas, Flora; van Teijlingen, Edwin; Smith, Cairns; Moffat, Mandy

    2015-01-01

    Little is known about how health professionals translate national government health policy directives into action. This paper examines that process using the so-called Well Men's Services (WMS) policy initiative as a 'real world' case study. The WMS were launched by the Scottish Government to address men's health inequalities. Our analysis aimed to develop a deeper understanding of policy implementation as it naturally occurred, and used an analytical framework that was developed to reflect the 'rational planning' principles health professionals are commonly encouraged to use for implementation purposes. A mixed-methods qualitative enquiry using a data archive generated during the WMS policy evaluation was used to critically analyze (post hoc) the perspectives of national policy makers and local health and social care professionals about: (a) the 'policy problem', (b) the interventions intended to address the problem, and (c) the anticipated policy outcomes. This analysis revealed four key themes: (1) ambiguity regarding the policy problem and means of intervention; (2) behavioral framing of the policy problem and intervention; (3) uncertainty about the policy evidence base and outcomes; and (4) a focus on intervention as outcome. This study found that mechanistic planning heuristics (as a means of supporting implementation) fail to grapple with the indeterminate nature of population health problems. A new approach to planning and implementing public health interventions is required that recognises the complex and political nature of health problems, the inevitability of imperfect and contested evidence regarding intervention, and future associated uncertainties.

  31. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from the diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats across applications directly; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.
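
    A common pattern for the OPeNDAP access mentioned above is to open a remote granule lazily with xarray; the URL, variable name and dimension names below are placeholders:

      import xarray as xr

      # placeholder OPeNDAP URL; data are fetched lazily over the network
      url = "https://example.gov/opendap/hyrax/some_product/granule.nc"
      ds = xr.open_dataset(url)
      print(ds.data_vars)                # inspect the variables the granule offers

      # placeholder variable and dimension names; subset over a CONUS window
      subset = ds["aerosol_optical_depth"].sel(
          lat=slice(20, 50), lon=slice(-130, -60)
      )
      subset.to_netcdf("aod_conus_subset.nc")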
A simple and user-friendly approach with a visual-web-based interface is proposed, able (1) to incorporate geographical information on past events and on controlling factors, (2) to include administrative boundaries and official risk regulation maps, and(3) to integrate all modeling results obtained in the study area (already performed or in progress). The possibility to share information by means of web services offers a double utility: firstly it is a way to decrease the gap between scientific community's results and stakeholders' practical needs (simple interface, easy-to-use buttons in a generally user-friendly approach). Secondly the wide collection of diverse information (records of historical events, conditioning and triggering factors, information on elements at risk and their vulnerability, modeling results) in combination with the possibility of comparison among the data offers a great support in the decision-making process. As first case study, the Barcelonnette Basin (South French Alps) has been chosen for the pilot development of the interface. The objective is to organize, manage and share a wide range of information and calibrate a correct web-service solution. Several steps are planned to achieve this goal: the creation of a hierarchical GeoDB that includes all information available for the area (high resolution airborne and satellite imagery, various DEMs, geo-environmental factor maps, susceptibility and hazard maps, historical events and old photographs, maps of elements at risk, potential consequence maps, existing risk scenarios and risk maps) using different organizational folders (splitted in web-switches), the definition of an OpenSource Cartoweb web-platform (based on GeoDB structure) and finally the adjustment of a POSTGIS and POSTGRESQL environment to accomplish query actions, a metadata support system, and a WMS for external data connection and layer control.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16895854','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16895854"><span>Development of WAIS-III General Ability Index Minus WMS-III memory discrepancy scores.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lange, Rael T; Chelune, Gordon J; Tulsky, David S</p> <p>2006-09-01</p> <p>Analysis of the discrepancy between intellectual functioning and memory ability has received some support as a useful means for evaluating memory impairment. In recent additions to Wechlser scale interpretation, the WAIS-III General Ability Index (GAI) and the WMS-III Delayed Memory Index (DMI) were developed. The purpose of this investigation is to develop base rate data for GAI-IMI, GAI-GMI, and GAI-DMI discrepancy scores using data from the WAIS-III/WMS-III standardization sample (weighted N = 1250). Base rate tables were developed using the predicted-difference method and two simple-difference methods (i.e., stratified and non-stratified). 
These tables provide valuable data for clinical reference purposes to determine the frequency of GAI-IMI, GAI-GMI, and GAI-DMI discrepancy scores in the WAIS-III/WMS-III standardization sample.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26882178','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26882178"><span>Indicators of suboptimal performance embedded in the Wechsler Memory Scale-Fourth Edition (WMS-IV).</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Bouman, Zita; Hendriks, Marc P H; Schmand, Ben A; Kessels, Roy P C; Aldenkamp, Albert P</p> <p>2016-01-01</p> <p>Recognition and visual working memory tasks from the Wechsler Memory Scale-Fourth Edition (WMS-IV) have previously been documented as useful indicators for suboptimal performance. The present study examined the clinical utility of the Dutch version of the WMS-IV (WMS-IV-NL) for the identification of suboptimal performance using an analogue study design. The patient group consisted of 59 mixed-etiology patients; the experimental malingerers were 50 healthy individuals who were asked to simulate cognitive impairment as a result of a traumatic brain injury; the last group consisted of 50 healthy controls who were instructed to put forth full effort. Experimental malingerers performed significantly lower on all WMS-IV-NL tasks than did the patients and healthy controls. A binary logistic regression analysis was performed on the experimental malingerers and the patients. The first model contained the visual working memory subtests (Spatial Addition and Symbol Span) and the recognition tasks of the following subtests: Logical Memory, Verbal Paired Associates, Designs, Visual Reproduction. The results showed an overall classification rate of 78.4%, and only Spatial Addition explained a significant amount of variation (p < .001). Subsequent logistic regression analysis and receiver operating characteristic (ROC) analysis supported the discriminatory power of the subtest Spatial Addition. A scaled score cutoff of <4 produced 93% specificity and 52% sensitivity for detection of suboptimal performance. The WMS-IV-NL Spatial Addition subtest may provide clinically useful information for the detection of suboptimal performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010AGUFMIN31C..02L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010AGUFMIN31C..02L"><span>The GEOSS Clearinghouse based on the GeoNetwork opensource</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Liu, K.; Yang, C.; Wu, H.; Huang, Q.</p> <p>2010-12-01</p> <p>The Global Earth Observation System of Systems (GEOSS) is established to support the study of the Earth system in a global community. It provides services for social management, quick response, academic research, and education. The purpose of GEOSS is to achieve comprehensive, coordinated and sustained observations of the Earth system, improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. 
In 2009, GEO called for a competition for an official GEOSS clearinghouse to be selected as a source to consolidating catalogs for Earth observations. The Joint Center for Intelligent Spatial Computing at George Mason University worked with USGS to submit a solution based on the open-source platform - GeoNetwork. In the spring of 2010, the solution is selected as the product for GEOSS clearinghouse. The GEOSS Clearinghouse is a common search facility for the Intergovernmental Group on Ea rth Observation (GEO). By providing a list of harvesting functions in Business Logic, GEOSS clearinghouse can collect metadata from distributed catalogs including other GeoNetwork native nodes, webDAV/sitemap/WAF, catalog services for the web (CSW)2.0, GEOSS Component and Service Registry (http://geossregistries.info/), OGC Web Services (WCS, WFS, WMS and WPS), OAI Protocol for Metadata Harvesting 2.0, ArcSDE Server and Local File System. Metadata in GEOSS clearinghouse are managed in a database (MySQL, Postgresql, Oracle, or MckoiDB) and an index of the metadata is maintained through Lucene engine. Thus, EO data, services, and related resources can be discovered and accessed. It supports a variety of geospatial standards including CSW and SRU for search, FGDC and ISO metadata, and WMS related OGC standards for data access and visualization, as linked from the metadata.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/15512937','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/15512937"><span>Replacement of the Faces subtest by Visual Reproductions within Wechsler Memory Scale-Third Edition (WMS-III) visual memory indexes: implications for discrepancy analysis.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Hawkins, Keith A; Tulsky, David S</p> <p>2004-06-01</p> <p>Within discrepancy analysis differences between scores are examined for abnormality. Although larger differences are generally associated with rising impairment probabilities, the relationship between discrepancy size and abnormality varies across score pairs in relation to the correlation between the contrasted scores in normal subjects. Examinee ability level also affects the size of discrepancies observed normally. Wechsler Memory Scale-Third Edition (WMS-III) visual index scores correlate only modestly with other Wechsler Adult Intelligence Scale-Third Edition (WAIS-III) and WMS-III index scores; consequently, differences between these scores and others have to be very large before they become unusual, especially for subjects of higher intelligence. The substitution of the Faces subtest by Visual Reproductions within visual memory indexes formed by the combination of WMS-III visual subtests (creating immediate recall, delayed recall, and combined immediate and delayed index scores) results in higher correlation coefficients, and a decline in the discrepancy size required to surpass base rate thresholds for probable impairment. This gain appears not to occur at the cost of a diminished sensitivity to diverse pathologies. 
New WMS-III discrepancy base rate data are supplied to complement those currently available to clinicians.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1812074P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1812074P"><span>Semantics-informed cartography: the case of Piemonte Geological Map</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico</p> <p>2016-04-01</p> <p>In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamical, interactive representation on WMS-WebGIS services, there is the need to provide, in an explicit form, the geological assumptions used for the design and compilation of the database of the Map, and to get a definition and/or adoption of semantic representation and taxonomies, in order to achieve a formal and interoperable representation of the geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural (e.g. different scientific disciplines) and/or physical barriers (e.g. administrative boundaries). Initiatives such as GeoScience Markup Language (last version is GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf (an operative simplification of GeoSciML, last version is 3.0 rc3, 2013), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG) have been promoting information exchange of the geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data referring to the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamical interactive map on the geoportal of ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits whose thousands instances (Mapped Features, polygons geometry) widely occur in Piemonte region, and each one is bounded by GeologicStructures (Mapped Features, line geometry). GeologicUnits and GeologicStructures have been spatially correlated through the whole region and described using the GeoSciML vocabularies. A hierarchical schema is provided for the Piemonte Geological Map that gives the parental relations between several orders of GeologicUnits referring to mostly recurring geological objects and main GeologicEvents, in a logical framework compliant with GeoSciML and INSPIRE data models. The classification criteria and the Hierarchy Schema used to define the GEOPiemonteMap Legend, as well as the intended meanings of the geological concepts used to achieve the overall classification schema, are explicitly described in several WikiGeo pages (implemented by "MediaWiki" open source software, https://www.mediawiki.org/wiki/MediaWiki). 
Moreover, a further step toward a formal classification of the contents (both data and interpretation) of the GEOPiemonteMap was taken by setting up an ontological framework, named "OntoGeonous", in order to achieve a thorough semantic characterization of the map.

  HELI-DEM portal for geo-processing services

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia

    2014-05-01

    HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of the Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013, whose major aim is to create a unified digital terrain model covering the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy; and the Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frames, resolutions and accuracies: the DHM at 25 m resolution from Swisstopo, the DTM at 20 m resolution from Lombardy Region, the DTM at 5 m resolution from Piedmont Region, and the DTM LiDAR PST-A at about 1 m resolution, which covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unified Italian-Swiss geoid with an accuracy of a few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, prototype of a transnational positioning service; and the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some analysis and processing capabilities through the Internet. With this talk, the authors present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM produced by the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined to provide access to geospatial functionalities for a wide, non-GIS-expert public. The system is entirely developed using only open standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-Project [2], GeoServer [3] and OpenLayers [4], and on the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features like profiling, contour extraction, watershed delineation and analysis, derivative calculation, data extraction and coordinate conversion, but it is evolving and it is planned to extend it to a series of environmental models that the IST developed in the past, such as dam break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins.
    [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed., 406 pp, Springer, New York, 2008.
    [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open WPS Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS), Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010.
    [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA, Reggia di Colorno, 15-18 novembre 2011.
    [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843.
    [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006.
    [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010.
    [7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.

  Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    NASA Astrophysics Data System (ADS)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication exposes an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds and direct data, and share this status with other users that need the same information and use different client vendor products, in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content).
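
    The HELI-DEM portal above exposes its processing functions through OGC WPS. As an illustration only, here is a sketch of a WPS 1.0.0 Execute request made with Python's requests library; the endpoint path, the process identifier and the input names are hypothetical placeholders (the real portal advertises its own in its GetCapabilities/DescribeProcess responses), and real servers may expect a slightly different DataInputs encoding.

        import requests

        # Hypothetical WPS endpoint and process name; treat these purely as placeholders.
        WPS_URL = "https://example.org/helidem/wps"

        params = {
            "Service": "WPS",
            "Version": "1.0.0",
            "Request": "Execute",
            "Identifier": "elevation_profile",   # hypothetical process id
            # WPS 1.0.0 KVP encoding packs the inputs into a single DataInputs parameter.
            "DataInputs": "start=8.95,46.00;end=9.05,46.10;samples=100",
            "RawDataOutput": "profile",
        }

        response = requests.get(WPS_URL, params=params, timeout=60)
        response.raise_for_status()
        # The response is either the raw output (if RawDataOutput is honored)
        # or an ExecuteResponse XML document describing the result.
        print(response.text[:500])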
A smarter map browser application, the MiraMon Map Browser, is able to write a context document and read it again to recover the context of the previous view, or to load a context generated by another application. The possibility to store direct links to files in OWS Context is particularly interesting for desktop GIS solutions. This communication also presents the development made in the MiraMon desktop GIS solution to include OWS Context. The MiraMon software is able to deal with local files, web services and database connections. As in any other GIS solution, the MiraMon team designed its own file format (MiraMon Map, MMM) for storing and sharing the status of a GIS session. The new OWS Context format is now adopted as an interoperable substitute for the MMM. The extensibility of the format makes it possible to map concepts in the MMM to current OWS Context elements (such as titles, data links, extent, etc.) and to generate new elements that are able to include all extra metadata not currently covered by OWS Context. These developments were done in the ninth edition of the OpenGIS Web Services Interoperability Experiment (OWS-9) and are demonstrated in this communication.
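
    To make the idea in the record above concrete, here is a simplified sketch that builds a plain Atom feed with a GeoRSS extent using Python's standard library. It is not the full OWS Context Atom encoding (which adds its own owc: elements); the layer URL is a made-up WMS GetMap request, included only as an illustration.

        import xml.etree.ElementTree as ET

        # Atom and GeoRSS namespaces are standard; everything else below is a placeholder.
        ATOM = "http://www.w3.org/2005/Atom"
        GEORSS = "http://www.georss.org/georss"
        ET.register_namespace("", ATOM)
        ET.register_namespace("georss", GEORSS)

        feed = ET.Element(f"{{{ATOM}}}feed")
        ET.SubElement(feed, f"{{{ATOM}}}title").text = "Shared map context (simplified sketch)"
        ET.SubElement(feed, f"{{{ATOM}}}updated").text = "2013-04-01T00:00:00Z"

        entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}title").text = "Background layer"
        # The entry links to the service that renders the layer; here a placeholder WMS request.
        ET.SubElement(entry, f"{{{ATOM}}}link", {
            "rel": "enclosure",
            "type": "image/png",
            "href": ("https://example.org/wms?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap"
                     "&LAYERS=landcover&CRS=CRS:84&BBOX=-10,35,20,60&WIDTH=800&HEIGHT=600"
                     "&FORMAT=image/png"),
        })
        # GeoRSS box: "minlat minlon maxlat maxlon", so generic feed readers and map
        # mash-ups can place the entry on a map.
        ET.SubElement(entry, f"{{{GEORSS}}}box").text = "35 -10 60 20"

        print(ET.tostring(feed, encoding="unicode"))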

  Improving data management and dissemination in web based information systems by semantic enrichment of descriptive data aspects

    NASA Astrophysics Data System (ADS)

    Gebhardt, Steffen; Wehrmann, Thilo; Klinger, Verena; Schettler, Ingo; Huth, Juliane; Künzer, Claudia; Dech, Stefan

    2010-10-01

    The German-Vietnamese water-related information system for the Mekong Delta (WISDOM) project supports business processes in Integrated Water Resources Management in Vietnam. Multiple disciplines bring together earth and ground based observation themes, such as environmental monitoring, water management, demographics, economy, information technology, and infrastructural systems. This paper introduces the components of the web-based WISDOM system, including the data, logic and presentation tiers. It focuses on the data models upon which the database management system is built, including techniques for tagging or linking metadata with the stored information. The model also uses ordered groupings of spatial, thematic and temporal reference objects to semantically tag datasets to enable fast data retrieval, such as finding all data in a specific administrative unit belonging to a specific theme. A spatial database extension is employed by the PostgreSQL database. This object-oriented database was chosen over a relational database to tag spatial objects to tabular data, improving the retrieval of census and observational data at regional, provincial, and local levels. While the spatial database hinders the processing of raster data, a "work-around" was built into WISDOM to permit efficient management of both raster and vector data. The data model also incorporates styling aspects of the spatial datasets through styled layer descriptors (SLD) and web map service (WMS) layer specifications, allowing retrieval of rendered maps. Metadata elements of the spatial data are based on the ISO 19115 standard. XML-structured information of the SLD and metadata is stored in an XML database. The data models and the data management system are robust for managing the large quantity of spatial objects, sensor observations, census and document data. The operational WISDOM information system prototype contains modules for data management, automatic data integration, and web services for data retrieval, analysis, and distribution. The graphical user interfaces facilitate metadata cataloguing, data warehousing, web sensor data analysis and thematic mapping.

  Evolution of the pilot infrastructure of CMS: towards a single glideinWMS pool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belforte, S.; Gutsche, O.; Letts, J.

    2014-01-01

    CMS production and analysis job submission is based largely on glideinWMS and pilot submissions. The transition from multiple different submission solutions like gLite WMS and HTCondor-based implementations was carried out over several years and is now coming to a conclusion. The historically separate glideinWMS pools for different types of production jobs and analysis jobs are being unified into a single global pool. This enables CMS to benefit from global prioritization and scheduling possibilities. It also presents the sites with only one kind of pilot and eliminates the need to make scheduling decisions at the CE level. This paper provides an analysis of the benefits of a unified resource pool, as well as a description of the resulting global policy.
It will explain the technical challenges moving forward and present solutions to some of them.

  Education-stratified base-rate information on discrepancy scores within and between the Wechsler Adult Intelligence Scale--Third Edition and the Wechsler Memory Scale--Third Edition

    PubMed

    Dori, Galit A; Chelune, Gordon J

    2004-06-01

    The Wechsler Adult Intelligence Scale--Third Edition (WAIS-III; D. Wechsler, 1997a) and the Wechsler Memory Scale--Third Edition (WMS-III; D. Wechsler, 1997b) are 2 of the most frequently used measures in psychology and neuropsychology. To facilitate the diagnostic use of these measures in the clinical decision-making process, this article provides information on education-stratified, directional prevalence rates (i.e., base rates) of discrepancy scores between the major index scores for the WAIS-III, the WMS-III, and between the WAIS-III and WMS-III. To illustrate how such base-rate data can be used clinically, this article reviews the relative risk (i.e., odds ratio) of empirically defined "rare" cognitive deficits in 2 of the clinical samples presented in the WAIS-III--WMS-III Technical Manual (The Psychological Corporation, 1997). ((c) 2004 APA, all rights reserved)

  Geospatial Modelling Approach for Interlinking of Rivers: A Case Study of Vamsadhara and Nagavali River Systems in Srikakulam, Andhra Pradesh

    NASA Astrophysics Data System (ADS)

    Swathi Lakshmi, A.; Saran, S.; Srivastav, S. K.; Krishna Murthy, Y. V. N.

    2014-11-01

    India is prone to several natural disasters such as floods, droughts, cyclones, landslides and earthquakes on account of its geoclimatic conditions, but the most frequent and prominent disasters are floods and droughts. To reduce the impact of floods and droughts in India, interlinking of rivers is one of the best solutions to transfer surplus flood waters to deficit/drought-prone areas. Geospatial modelling provides a holistic approach to generate probable interlinking routes of rivers based on existing geoinformatics tools and technologies. In the present study, SRTM DEM and AWiFS datasets, coupled with land-use/land-cover, geomorphology, soil and interpolated rainfall surface maps, have been used to identify the potential routes in the geospatial domain for interlinking the Vamsadhara and Nagavali river systems in Srikakulam district, Andhra Pradesh. The first-order derivatives are derived from the DEM, and the road, railway and drainage networks have been delineated using the satellite data. The inundation map has been prepared using the AWiFS-derived Normalized Difference Water Index (NDWI). The drought-prone areas were delineated on the satellite image as per the records declared by the Revenue Department, Srikakulam.
Majority Rule Based (MRB) aggregation is performed to optimize the resolution of the obtained data in order to retain the spatial variability of the classes. Analytical Hierarchy Process (AHP) based Multi-Criteria Decision Making (MCDM) is implemented to obtain the prioritization of parameters such as geomorphology, soil, DEM, slope, and land use/land cover. A likelihood grid has been generated and all the thematic layers are overlaid to identify the potential grids for routing optimization. To give a better routing map, an impedance map has been generated and several other constraints are considered; the implementation of canal construction needs extra cost in some areas. The developed routing map is published as OGC WMS services using the open source GeoServer, and the proposed routing service can be visualized over the Bhuvan portal (http://www.bhuvan.nrsc.gov.in/). The obtained routing map of proposed canals thus focuses on transferring surplus waters to drought-prone areas to solve the problem of water scarcity, to properly utilize the flood waters for irrigation purposes, and also to help in recharging groundwater. A similar methodology can be adopted for interlinking other river systems.
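
    The record above weights its thematic layers with AHP-based multi-criteria decision making. A minimal sketch of how AHP priorities can be derived from a pairwise comparison matrix with numpy follows; the matrix values are illustrative only, not the judgments used in the study.

        import numpy as np

        # Hypothetical pairwise comparison matrix for four criteria
        # (e.g. slope, soil, land use, geomorphology) on Saaty's 1-9 scale.
        A = np.array([
            [1.0, 3.0, 5.0, 7.0],
            [1/3, 1.0, 3.0, 5.0],
            [1/5, 1/3, 1.0, 3.0],
            [1/7, 1/5, 1/3, 1.0],
        ])

        # AHP weights: principal eigenvector of A, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()

        # Consistency ratio CR = CI / RI, with RI = 0.90 for a 4x4 matrix (Saaty's table).
        n = A.shape[0]
        ci = (eigvals.real[principal] - n) / (n - 1)
        cr = ci / 0.90
        print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))

        # A weighted overlay would then combine the reclassified raster layers, e.g.:
        # suitability = sum(w * layer for w, layer in zip(weights, raster_layers))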

  The NASA NEESPI Data Portal: Products, Information, and Services

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory; Loboda, Tatiana; Csiszar, Ivan; Romanov, Peter; Gerasimov, Irina

    2008-01-01

    Studies have indicated that land cover and land use changes in Northern Eurasia influence the global climate system. However, the processes are not fully understood, and it is challenging to understand the interactions between the land changes in this region and the global climate. Integrated data collections from multiple disciplines are important for studies of climate and environmental change. Remotely sensed and model data are particularly important due to sparse in situ measurements in many Eurasian regions, especially in Siberia. The NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) NEESPI data portal has developed the infrastructure to provide satellite remote sensing and numerical model data for the atmosphere, land surface, and cryosphere. Data searching, subsetting, and downloading functions are available. One useful tool is the web-based online data analysis and visualization system, Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which allows scientists to easily assess the state and dynamics of terrestrial ecosystems in Northern Eurasia and their interactions with the global climate system. Recently, we have created a metadata database prototype to expand the NASA NEESPI data portal, providing a venue for NEESPI scientists to find the desired data easily and to leverage data sharing within NEESPI projects. The database provides product-level information. The desired data can be found through navigation and free-text search and narrowed down by filtering with a number of constraints. In addition, we have developed a Web Map Service (WMS) prototype to allow access to data and images from different data sources.

  Wavelength-modulation-spectroscopy for real-time, in situ NO detection in combustion gases with a 5.2 μm quantum-cascade laser

    NASA Astrophysics Data System (ADS)

    Chao, X.; Jeffries, J. B.; Hanson, R. K.

    2012-03-01

    A mid-infrared absorption strategy with calibration-free wavelength-modulation spectroscopy (WMS) has been developed and demonstrated for real-time, in situ detection of nitric oxide in particulate-laden combustion-exhaust gases at temperatures up to 700 K. An external-cavity quantum-cascade laser (ECQCL) near 5.2 μm accessed the fundamental absorption band of NO, and a wavelength-scanned, 1f-normalized WMS with second-harmonic detection (WMS-2f/1f) strategy was developed. Due to the external-cavity laser architecture, large nonlinear intensity modulation (IM) was observed when the wavelength was modulated by injection-current modulation, and the IM indices were also found to be strongly wavelength-dependent as the center wavelength was scanned with piezoelectric tuning of the cavity. A quantitative model of the 1f-normalized WMS-2f signal was developed and validated under laboratory conditions. A sensor was subsequently designed, built and demonstrated for real-time, in situ measurements of NO across a 3 m path in the particulate-laden exhaust of a pulverized-coal-fired power plant boiler. The 1f-normalized WMS-2f method proved to have better immunity to non-absorption transmission noise than wavelength-scanned direct absorption. A 0.3 ppm-m detection limit was estimated using the R15.5 transition near 1927 cm-1 with 1 s averaging. Mid-infrared QCL-based NO absorption with 1f-normalized WMS-2f detection shows excellent promise for practical sensing in combustion exhaust.

  Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data.
In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
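
    Clients reach a server like the one described above through standard WMS requests. A minimal sketch of a WMS 1.3.0 GetMap call with Python's requests library follows; the endpoint, layer name and TIME value are hypothetical placeholders (a real server lists its layers, CRSs and dimensions in its GetCapabilities document).

        import requests

        # Hypothetical WMS endpoint and layer name, used only to illustrate the request shape.
        WMS_URL = "https://example.org/tops/wms"

        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": "gpp_monthly",          # hypothetical layer
            "STYLES": "",
            "CRS": "CRS:84",                  # lon/lat axis order in WMS 1.3.0
            "BBOX": "-125,32,-114,42",        # roughly California
            "WIDTH": "512",
            "HEIGHT": "512",
            "FORMAT": "image/png",
            "TIME": "2008-07-01",             # TIME is an optional WMS dimension
        }

        response = requests.get(WMS_URL, params=params, timeout=60)
        response.raise_for_status()
        with open("tops_gpp_200807.png", "wb") as f:
            f.write(response.content)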

  The GEON Integrated Data Viewer (IDV) and IRIS DMC Services Illustrate CyberInfrastructure Support for Seismic Data Visualization and Interpretation

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.; Ahern, T.; Casey, R.; Weertman, B.; Laughbon, C.

    2008-12-01

    UNAVCO and the IRIS DMC are data service partners for seismic visualization, particularly for hypocentral data and tomography. UNAVCO provides the GEON Integrated Data Viewer (IDV), an extension of the Unidata IDV: a free, interactive, research-level software display and analysis tool for data in 3D (latitude, longitude, depth) and 4D (with time), located on or inside the Earth. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data in the context of new remote and shared data sources. The GEON IDV supports data access from HTTP and FTP servers, OPeNDAP servers, THREDDS catalogs, RSS feeds, and WMS (web map) servers. The IRIS DMC (Data Management System) has developed web services providing earthquake hypocentral data and seismic tomography model grids. These services can be called by the GEON IDV to access data at IRIS without copying files. The IRIS Earthquake Browser (IEB) is a web-based query tool for hypocentral data. The IEB combines the DMC's database of more than 1,900,000 earthquakes with the Google Maps web interface. With the IEB you can quickly find earthquakes in any region of the globe and then import this information into the GEON Integrated Data Viewer, where the hypocenters may be visualized. You can select earthquakes by region, time, depth, and magnitude. The IEB gives the IDV a URL to the selected data. The IDV then shows the data as maps or 3D displays, with interactive control of vertical scale, area and map projection, and with symbol size and color controlled by magnitude or depth. The IDV can show progressive time animation of, for example, aftershocks filling a source region. The IRIS Tomoserver converts seismic tomography model output grids to NetCDF for use in the IDV. The Tomoserver accepts a tomographic model file as input from a user and provides an equivalent NetCDF file as output. The service supports the NA04, S3D, A1D and CUB input file formats, contributed by their respective creators. The NetCDF file is saved to a location on an IRIS server that can be referenced with a URL, and the URL is provided to the user. The user can download the data from IRIS, or copy the URL directly into the IDV for interpretation, and the IDV will access the data at IRIS. The Tomoserver conversion software was developed by Instrumental Software Technologies, Inc. Use cases with the GEON IDV and IRIS DMC data services will be shown.

  Ultra-sensitive probe of spectral line structure and detection of isotopic oxygen

    NASA Astrophysics Data System (ADS)

    Garner, Richard M.; Dharamsi, A. N.; Khan, M. Amir

    2018-01-01

    We discuss a new method of investigating and obtaining the quantitative behavior of higher-harmonic (>2f) wavelength modulation spectroscopy (WMS) signals based on their structure. It is shown that the spectral structure of higher-harmonic WMS signals, quantified by the number of zero crossings and turning points, can have increased sensitivity to ambient conditions or line-broadening effects from changes in temperature, pressure, or optical depth. The structure of WMS signals, characterized by combinations of signal magnitude and the spectral locations of turning points and zero crossings, provides a unique scale that quantifies lineshape parameters and is thus useful in the optimization of measurements obtained from multi-harmonic WMS signals. We demonstrate this by detecting weaker rotational-vibrational transitions of isotopic atmospheric oxygen (16O18O) in the near-infrared region, where higher-harmonic WMS signals are more sensitive, contrary to their signal-to-noise ratio considerations. The proposed approach based on spectral structure provides the ability to investigate and quantify signals not only at line center but also in the wing region of the absorption profile.
This formulation is particularly useful in tunable diode laser spectroscopy and ultra-precision laser-based sensors, where the absorption signal profile carries information about quantities of interest, e.g., concentration, velocity, or gas collision dynamics.

  Skin Microbiome Surveys Are Strongly Influenced by Experimental Design

    PubMed

    Meisel, Jacquelyn S; Hannigan, Geoffrey D; Tyldsley, Amanda S; SanMiguel, Adam J; Hodkinson, Brendan P; Zheng, Qi; Grice, Elizabeth A

    2016-05-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations. Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource and cost intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This study highlights the importance of experimental design for downstream results in skin microbiome surveys. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  Skin microbiome surveys are strongly influenced by experimental design

    PubMed Central

    Meisel, Jacquelyn S.; Hannigan, Geoffrey D.; Tyldsley, Amanda S.; SanMiguel, Adam J.; Hodkinson, Brendan P.; Zheng, Qi; Grice, Elizabeth A.

    2016-01-01

    Culture-independent studies to characterize skin microbiota are increasingly common, due in part to affordable and accessible sequencing and analysis platforms. Compared to culture-based techniques, DNA sequencing of the bacterial 16S ribosomal RNA (rRNA) gene or whole metagenome shotgun (WMS) sequencing provides more precise microbial community characterizations.
Most widely used protocols were developed to characterize microbiota of other habitats (i.e., gastrointestinal) and have not been systematically compared for their utility in skin microbiome surveys. Here we establish a resource for the cutaneous research community to guide experimental design in characterizing skin microbiota. We compare two widely sequenced regions of the 16S rRNA gene to WMS sequencing for recapitulating skin microbiome community composition, diversity, and genetic functional enrichment. We show that WMS sequencing most accurately recapitulates microbial communities, but sequencing of hypervariable regions 1-3 of the 16S rRNA gene provides highly similar results. Sequencing of hypervariable region 4 poorly captures skin commensal microbiota, especially Propionibacterium. WMS sequencing, which is resource- and cost-intensive, provides evidence of a community's functional potential; however, metagenome predictions based on 16S rRNA sequence tags closely approximate WMS genetic functional profiles. This work highlights the importance of experimental design for downstream results in skin microbiome surveys. PMID:26829039

  Extension of wavelength-modulation spectroscopy to large modulation depth for diode laser absorption measurements in high-pressure gases

    NASA Astrophysics Data System (ADS)

    Li, Hejie; Rieker, Gregory B.; Liu, Xiang; Jeffries, Jay B.; Hanson, Ronald K.

    2006-02-01

    Tunable diode laser absorption measurements at high pressures by use of wavelength-modulation spectroscopy (WMS) require large modulation depths for optimum detection of molecular absorption spectra blended by collisional broadening or dense spacing of the rovibrational transitions. Diode lasers have a large and nonlinear intensity modulation when the wavelength is modulated over a large range by injection-current tuning. In addition to this intensity modulation, other laser performance parameters are measured, including the phase shift between the frequency modulation and the intensity modulation. Following published theory, these parameters are incorporated into an improved model of the WMS signal. The influence of these nonideal laser effects is investigated by means of wavelength-scanned WMS measurements as a function of bath gas pressure on rovibrational transitions of water vapor near 1388 nm. Lock-in detection of the magnitude of the 2f signal is performed to remove the dependence on detection phase. We find good agreement between measurements and the improved model developed for the 2f component of the WMS signal. The effects of the nonideal performance parameters of commercial diode lasers are especially important away from the line center of discrete spectra, and these contributions become more pronounced for 2f signals with the large modulation depths needed for WMS at elevated pressures.
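
    Several of the records above rely on 2f (and 1f-normalized) wavelength-modulation spectroscopy. As a toy numerical sketch of the core idea only (it ignores laser intensity modulation and the calibration-free 2f/1f normalization treated in these papers), the following numpy code sweeps a modulated wavelength across a Lorentzian line and extracts the second harmonic with a digital lock-in. All parameter values are arbitrary illustrations.

        import numpy as np

        # Illustrative parameters (arbitrary units), not values from the papers above.
        nu0 = 0.0          # line-center detuning
        gamma = 0.5        # half-width of the Lorentzian line
        mod_depth = 1.0    # wavelength modulation amplitude
        f_mod = 1e4        # modulation frequency [Hz]
        fs = 1e6           # sampling rate [Hz]
        t = np.arange(0, 0.1, 1 / fs)

        def absorbance(nu):
            """Lorentzian absorption line with a peak absorbance of 0.05."""
            return 0.05 * gamma**2 / ((nu - nu0)**2 + gamma**2)

        # Slow scan across the line plus fast sinusoidal wavelength modulation.
        scan = np.linspace(-4, 4, t.size)
        nu = scan + mod_depth * np.cos(2 * np.pi * f_mod * t)
        transmitted = np.exp(-absorbance(nu))   # Beer-Lambert, constant laser intensity

        # Digital lock-in at 2f: multiply by the reference and low-pass with a moving average.
        ref_2f = np.cos(2 * np.pi * 2 * f_mod * t)
        window = int(fs / f_mod) * 10            # average over 10 modulation periods
        kernel = np.ones(window) / window
        wms_2f = np.convolve(transmitted * ref_2f, kernel, mode="same")

        # wms_2f traces the familiar second-derivative-like 2f lineshape as the slow
        # scan sweeps across the absorption feature.
        print("peak |2f| signal:", np.abs(wms_2f).max())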

  AWS-Glacier As A Storage Foundation For AWS-EC2 Hosted Scientific Data Services

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Potter, N.

    2016-12-01

    Using AWS Glacier as a base-level data store for a scientific data service presents new challenges for web-accessible data services, along with their software clients and human operators. All meaningful Glacier transactions take at least 4 hours to complete. This is in contrast to the various web APIs for data, such as WMS, WFS, WCS, DAP2, and NetCDF tools, which were all written on the premise that the response will be (nearly) immediate. Only DAP4 and WPS contain an explicit asynchronous component in their respective protocols which allows for "return later" behaviors. We were able to put Hyrax (a DAP4 server) in front of Glacier-held resources, but there were significant issues. Any kind of probing of the datasets happens at the cost of the Glacier retrieval period, 4 hours. A couple of crucial things fall out of this. The first is that the service must cache metadata, including coordinate map arrays, so that a client can have enough information available in the "immediate" time frame to make decisions about what to ask for from the dataset. This type of request planning is important because a data access request will take 4 hours to complete unless the data resource has been cached. The second is that clients need to change their behavior when accessing datasets in an asynchronous system, even if the metadata is cached. Commonly, client applications will request a number of data components from a DAP2 service in the course of "discovering" the dataset. This may not be a well-supported model of interaction with Glacier or any other high-latency data store.
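
    A minimal sketch of the "initiate, poll, fetch" pattern the record above describes, assuming the boto3 SDK; the vault name and archive id are hypothetical placeholders, and in a real service the polling loop would run in the background while metadata queries are answered from a local cache.

        import time
        import boto3

        glacier = boto3.client("glacier")
        VAULT = "science-archive"          # hypothetical vault name
        ARCHIVE_ID = "..."                 # archive id obtained at upload time (placeholder)

        # 1) Kick off an asynchronous retrieval job; Glacier typically completes it hours later.
        job = glacier.initiate_job(
            accountId="-",
            vaultName=VAULT,
            jobParameters={"Type": "archive-retrieval", "ArchiveId": ARCHIVE_ID},
        )
        job_id = job["jobId"]

        # 2) While the job runs, a front-end service would answer metadata requests from a
        #    cache (e.g. previously harvested coordinate arrays) so clients can plan requests.

        # 3) Poll until the job completes, then stream the archive body to disk.
        while True:
            status = glacier.describe_job(accountId="-", vaultName=VAULT, jobId=job_id)
            if status["Completed"]:
                break
            time.sleep(15 * 60)            # re-check every 15 minutes

        output = glacier.get_job_output(accountId="-", vaultName=VAULT, jobId=job_id)
        with open("retrieved_granule.nc", "wb") as f:
            f.write(output["body"].read())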

  NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems, such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.

  Compact TDLAS based optical sensor for ppb-level ethane detection by use of a 3.34 μm room-temperature CW interband cascade laser

    DOE PAGES

    Li, Chunguang; Dong, Lei; Zheng, Chuantao; ...

    2016-03-26

    A mid-infrared ethane (C2H6) sensor based on a wavelength modulation spectroscopy (WMS) technique was developed using a thermoelectrically cooled (TEC), continuous-wave (CW) interband cascade laser (ICL) emitting at 3.34 μm and a dense multi-pass gas cell (MPGC, 17 × 6.5 × 5.5 cm3) with a 54.6 m optical path length. A compact optical sensor system with a physical size of 35.5 × 18 × 12.5 cm3 was designed and constructed. The ICL was employed to target a strong C2H6 line at 2996.88 cm-1, at a gas pressure below 100 Torr, in the fundamental absorption band of C2H6. The sensor performance, including the minimum detection limit (MDL) and the stability, was improved by reducing the effect of laser power drift by means of the 2f/1f-WMS technique. An MDL of ~1.2 parts per billion (ppbv) for 2f-WMS and ~1.0 ppbv for 2f/1f-WMS was achieved, respectively, with a measurement time of 4 s.
The MDL was further improved from 299 pptv (at 108 s for 2f-WMS) to 239 pptv (at 208 s for 2f/1f-WMS), based on an Allan deviation analysis. The rise time (0 → 100 ppbv) and fall time (100 → 0 ppbv) were determined to be ~64 s and ~48 s, respectively, at a gas pressure below 100 Torr for the C2H6 sensor operation.

  A Novel Web Application to Analyze and Visualize Extreme Heat Events

    NASA Astrophysics Data System (ADS)

    Li, G.; Jones, H.; Trtanj, J.

    2016-12-01

    Extreme heat is the leading cause of weather-related deaths in the United States annually and is expected to increase with our warming climate. However, most of these deaths are preventable with proper tools and services to inform the public about heat waves. In this project, we have investigated the key indicators of a heat wave, the vulnerable populations, and the data visualization strategies through which those populations most effectively absorb heat wave data. A map-based web app has been created that allows users to search and visualize historical heat waves in the United States incorporating these strategies. This app utilizes daily maximum temperature data from the NOAA Global Historical Climatology Network, which contains about 2.7 million data points from over 7,000 stations per year.
The point data are spatially aggregated into county-level data using county geometry from the US Census Bureau and stored in a Postgres database with the PostGIS spatial extension. GeoServer, a powerful map server, is used to serve the image and data layers (WMS and WFS). The JavaScript-based web-mapping platform Leaflet is used to display the temperature layers. A number of functions have been implemented for search and display. Users can search for extreme heat events by county or by date. The "by date" option allows a user to select a date and a Tmax threshold, which then highlights all of the areas on the map that meet those date and temperature parameters. The "by county" option allows the user to select a county on the map, which then retrieves a list of heat wave dates and daily Tmax measurements. This visualization is clean, user-friendly, and novel because, while these time, space, and temperature measurements can be found by querying meteorological datasets, no existing tool neatly packages this information together in an easily accessible and non-technical manner, especially at a time when climate change urges a better understanding of heat waves.
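
    The county-level aggregation described above is essentially a point-in-polygon spatial join. A minimal sketch using psycopg2 and standard PostGIS functions follows; the table names, column names and connection string are hypothetical placeholders, not the application's actual schema.

        import psycopg2

        # Hypothetical schema: a "stations_tmax" point table (geom, obs_date, tmax) and a
        # "counties" polygon table (fips, name, geom).
        SQL = """
            SELECT c.fips,
                   c.name,
                   MAX(s.tmax) AS max_tmax,
                   COUNT(*)    AS n_obs
            FROM stations_tmax AS s
            JOIN counties      AS c
              ON ST_Contains(c.geom, s.geom)      -- point-in-polygon join in PostGIS
            WHERE s.obs_date = %s
              AND s.tmax >= %s
            GROUP BY c.fips, c.name
            ORDER BY max_tmax DESC;
        """

        conn = psycopg2.connect("dbname=heat user=demo")   # placeholder connection string
        with conn, conn.cursor() as cur:
            cur.execute(SQL, ("2012-07-06", 38.0))          # example date and Tmax threshold (deg C)
            for fips, name, max_tmax, n_obs in cur.fetchall():
                print(f"{name} ({fips}): Tmax {max_tmax:.1f} from {n_obs} stations")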

  The deegree framework - Spatial Data Infrastructure solution for end-users and developers

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Poth, Andreas

    2010-05-01

    The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDI) and to adhere to standards as strictly as possible. Although it is open source software (Lesser GNU Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees, while specialized companies offer customization, consulting and tailoring. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany, the Ministry of Housing, Spatial Planning and the Environment in the Netherlands, etc.) as well as for research and development projects, as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service and Catalogue Services, deegree also implements rather new standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers. The added value comes from a sophisticated set of client software and desktop and web environments. One focus lies on different client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions comprised of multiple standards and enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard, as defined by the German spatial planning authorities, is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure should be usable by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. Based on challenges faced within customer projects, the importance of easy-to-use software components is emphasized. SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.

  Architecture of the local spatial data infrastructure for regional climate change research

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny

    2013-04-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in the modeling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which might reach tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing the search of geoinformation resources (datasets and services) using metadata catalogs, producing geospatial data selections by their parameters (data access functionality), as well as managing services and applications for cartographical visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. Those circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented includes:
    1. A model for storing big sets of regional georeferenced data that is effective in terms of search, access, retrieval and subsequent statistical processing, allowing in particular frequently used values (such as monthly and annual climate change indices) to be stored, thus providing different temporal views of the datasets
    2. A general architecture of the corresponding software components handling geospatial datasets within the storage model
    3. A metadata catalog describing in detail, using the ISO 19115 and CF-convention standards, the datasets used in climate research, as a basic element of the spatial data infrastructure, as well as its publication according to the OGC CSW (Catalogue Service for the Web) specification
    4. Computational and mapping web services to work with geospatial datasets based on OWS (OGC Web Services) standards: WMS, WFS, WPS
    5. A geoportal as a key element of the thematic regional spatial data infrastructure, also providing a software framework for dedicated web application development

    To realize the web mapping services, the GeoServer software is used, since it provides a natural WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork Opensource (http://geonetwork-opensource.org) product is planned to be used, as it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software has been selected: (1) the OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser; and (2) the GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interface of such popular desktop GIS applications as uDig, QuantumGIS, etc. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.

  Memory testing in dementia: how much is enough?

    PubMed

    Derrer, D S; Howieson, D B; Mueller, E A; Camicioli, R M; Sexton, G; Kaye, J A

    2001-01-01

    Analyses of eight widely used memory measures (Word List Acquisition and Recall as used in the Alzheimer's Disease Assessment Scale and the Consortium to Establish a Registry for Alzheimer's Disease neuropsychology battery, Wechsler Memory Scale-Revised [WMS-R] Logical Memory I and II, WMS-R Visual Reproduction I and II, the memory scores from the Neurobehavioral Cognitive Status Examination [NCSE], and the memory scores from the Mini-Mental State Examination [MMSE]), and the MMSE total score, showed each to have moderate predictive power in differentiating between patients with mild dementia and healthy normal controls. When these instruments were combined in a logistic regression analysis, three of them had substantial predictive power. Together, Word List Acquisition, WMS-R Logical Memory II, and WMS-R Visual Reproduction II were 97.26% accurate (100% sensitive and 94.59% specific) in distinguishing these two groups. Word List Acquisition is a brief test that alone had high accuracy (92%). These memory tests are highly useful in the diagnosis of mild dementia.
  179. Memory testing in dementia: how much is enough?

    PubMed

    Derrer, D S; Howieson, D B; Mueller, E A; Camicioli, R M; Sexton, G; Kaye, J A

    2001-01-01

    Analyses of eight widely used memory measures (Word List Acquisition and Recall used in the Alzheimer's Disease Assessment Scale and the Consortium to Establish a Registry for Alzheimer's Disease neuropsychology battery, Wechsler Memory Scale-Revised [WMS-R] Logical Memory I and II, WMS-R Visual Reproduction I and II, the memory scores from the Neurobehavioral Cognitive Status Examination [NCSE], and the memory scores from the Mini-Mental State Examination [MMSE]) and of the MMSE total score showed each to have moderate predictive power in differentiating between patients with mild dementia and healthy normal controls. When these instruments were combined in a logistic regression analysis, three of them had substantial predictive power. Together, the Word List Acquisition, WMS-R Logical Memory II, and WMS-R Visual Reproduction II were 97.26% accurate (100% sensitive and 94.59% specific) in distinguishing these two groups. The Word List Acquisition is a brief test that alone had high accuracy (92%). These memory tests are highly useful in the diagnosis of mild dementia.

  180. Clinical significance of knowledge about the structure, function, and impairments of working memory

    PubMed Central

    Brodziak, Andrzej; Brewczyński, Adam; Bajor, Grzegorz

    2013-01-01

    A review of contemporary research on the working memory system (WMS) is important, both due to the need to focus the discussion on further necessary investigations of the structure and function of this key part of the human brain, and to share this knowledge with clinicians. In the introduction we try to clarify the current terminology and provide an intuitively understandable model for the basic cognitive operations: perception, recognition, imagery, and manipulation of recalled mental images. We emphasize the importance of knowledge of the structure and function of the WMS for the possibility of demonstrating links between genetic polymorphisms and the prevalence of some mental disorders. We also review current knowledge of working memory dysfunction in the most common diseases and in specific clinical situations such as maturation and aging. Finally, we briefly discuss methods for assessment of WMS capacity. This article establishes a kind of compendium of knowledge for clinicians who are not familiar with the structure and operation of the WMS. PMID:23645218

  181. Using the GlideinWMS System as a Common Resource Provisioning Layer in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balcas, J.; Belforte, S.; Bockelman, B.

    2015-12-23

    CMS will require access to more
than 125k processor cores for the beginning of Run 2 in 2015 to carry out its ambitious physics program with more events of higher complexity. During Run 1 these resources were predominantly provided by a mix of grid sites and local batch resources. During the long shutdown, cloud infrastructures, diverse opportunistic resources and HPC supercomputing centers were made available to CMS, which further complicated the operation of the submission infrastructure. In this presentation we will discuss the CMS effort to adopt and deploy the glideinWMS system as a common resource provisioning layer for grid, cloud, local batch, and opportunistic resources and sites. We will address the challenges associated with integrating the various types of resources and the efficiency gains and simplifications associated with using a common resource provisioning layer, and discuss the solutions found. We will finish with an outlook of future plans for how CMS is moving forward on resource provisioning for more heterogeneous architectures and services.

  182. Pushing HTCondor and glideinWMS to 200K+ Jobs in a Global Pool for CMS before Run 2

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Belforte, S.; Bockelman, B.; Gutsche, O.; Khan, F.; Larson, K.; Letts, J.; Mascheroni, M.; Mason, D.; McCrea, A.; Saiz-Santos, M.; Sfiligoi, I.

    2015-12-01

    The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch and pilot-based Grid provisioning system. So far we have been running several independent resource pools, but we are working on unifying them all to reduce the operational load and more effectively share resources between various activities in CMS. The major challenge of this unification activity is scale. The combined pool size is expected to reach 200K job slots, which is significantly bigger than any other multi-user HTCondor-based system currently in production. To get there we have studied scaling limitations in our existing pools, the biggest of which tops out at about 70K slots, providing valuable feedback to the development communities, who have responded by delivering improvements which have helped us reach higher and higher scales with more stability. We have also worked on improving the organization and support model for this critical service during Run 2 of the LHC. This contribution will present the results of the scale testing and experiences from the first months of running the Global Pool.

  183. The Wechsler Memory Scale: A Review of Research

    ERIC Educational Resources Information Center

    Ivison, David

    1990-01-01

    Research on the standardization, reliability, validity, factor structure, and subtests of the Wechsler Memory Scale (WMS) (1945) and its revised version (1987) is reviewed.
Much research relating to the WMS appears to be relevant to the revised version. Use of the instrument in Australia is discussed. (SLD)

  184. Data Visualization of Lunar Orbiter KAGUYA (SELENE) using web-based GIS

    NASA Astrophysics Data System (ADS)

    Okumura, H.; Sobue, S.; Yamamoto, A.; Fujita, T.

    2008-12-01

    The Japanese lunar orbiter KAGUYA (SELENE) was launched on Sep. 14, 2007, and started nominal observation on Dec. 21, 2007. KAGUYA has 15 ongoing observation missions and is obtaining various physical quantity data of the Moon, such as elemental abundance, mineralogical composition, geological features, magnetic field and gravity field. We are working on the visualization of these data and their application to web-based GIS. Our purpose in data visualization is the promotion of science as well as education and public outreach (EPO). For scientific usage and public outreach, we have already constructed a KAGUYA Web Map Server (WMS) at the JAXA Sagamihara Campus and begun to test it within the KAGUYA project. The KAGUYA science team plans integrated science using the data of multiple instruments, with the aim of obtaining new findings on the origin and evolution of the Moon. In the study of the integrated science, scientists have to access, compare and analyze various types of data with different resolutions. Web-based GIS will allow users to map, overlay and share the data and information easily, so it will be the best way to progress such a study, and we are developing the KAGUYA WMS as a platform for the KAGUYA integrated science. For the purpose of EPO, we are customizing NASA World Wind (NWW) JAVA for KAGUYA, supported by the NWW project. Users will be able to search and view many images and movies of KAGUYA on NWW JAVA in an easy and attractive way. In addition, we are considering applying KAGUYA images to Google Moon in KML format and adding KAGUYA movies to Google/YouTube.

  185. Rapid, Time-Division Multiplexed, Direct Absorption- and Wavelength Modulation-Spectroscopy

    PubMed Central

    Klein, Alexander; Witzel, Oliver; Ebert, Volker

    2014-01-01

    We present a tunable diode laser spectrometer with a novel, rapid time-multiplexed direct absorption- and wavelength modulation-spectroscopy operation mode. The new technique allows enhancing the precision and dynamic range of a tunable diode laser absorption spectrometer without sacrificing accuracy. The spectroscopic technique combines the benefits of absolute concentration measurements using calibration-free direct tunable diode laser absorption spectroscopy (dTDLAS) with the enhanced noise rejection of wavelength modulation spectroscopy (WMS).
In this work we demonstrate for the first time a 125 Hz time-division multiplexed (TDM-dTDLAS-WMS) spectroscopic scheme, achieved by alternating the modulation of a DFB laser between a triangle ramp (dTDLAS) and an additional 20 kHz sinusoidal modulation (WMS). The absolute concentration measurement via the dTDLAS technique allows one to simultaneously calibrate the normalized 2f/1f signal of the WMS technique. A dTDLAS/WMS spectrometer at 1.37 μm for H2O detection was built for experimental validation of the multiplexing scheme over a concentration range from 50 to 3000 ppmV (0.1 MPa, 293 K). A precision of 190 ppbV was achieved with an absorption length of 12.7 cm and an averaging time of two seconds. Our results show a five-fold improvement in precision over the entire concentration range and a significantly decreased averaging time of the spectrometer. PMID:25405508
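The 2f/1f signal referred to above is obtained by lock-in demodulation of the detector signal at the first and second harmonics of the modulation frequency and taking their ratio, which largely cancels laser intensity variations. The short numpy sketch below illustrates that demodulation step on a synthetic detector trace; the waveform parameters and the simple moving-average low-pass filter are illustrative assumptions, not the processing chain of the instrument described in the record.

    import numpy as np

    fs = 1_000_000          # sample rate (Hz), illustrative
    f_mod = 20_000          # sinusoidal modulation frequency (Hz), as in the record
    t = np.arange(0, 0.01, 1 / fs)

    # Synthetic detector signal: laser intensity carries residual amplitude
    # modulation at f_mod, while the wavelength modulation sweeps across a
    # weak absorption feature (a nonlinear function of the modulation).
    ramp = 1.0 + 0.05 * t / t[-1]
    wavelength = np.sin(2 * np.pi * f_mod * t)
    absorption = 0.02 * np.exp(-((0.3 * wavelength - 0.1) ** 2) / 0.05)
    intensity = ramp * (1.0 + 0.1 * np.sin(2 * np.pi * f_mod * t))
    signal = intensity * (1.0 - absorption)

    def lockin(sig, f_ref, fs, n_avg=2000):
        """Demodulate sig at f_ref: mix with quadrature references and low-pass
        with a moving average; return the demodulated magnitude."""
        tt = np.arange(len(sig)) / fs
        i = sig * np.cos(2 * np.pi * f_ref * tt)
        q = sig * np.sin(2 * np.pi * f_ref * tt)
        kernel = np.ones(n_avg) / n_avg
        i_f = np.convolve(i, kernel, mode="same")
        q_f = np.convolve(q, kernel, mode="same")
        return 2.0 * np.hypot(i_f, q_f)

    r1f = lockin(signal, f_mod, fs)        # 1f component (tracks laser intensity)
    r2f = lockin(signal, 2 * f_mod, fs)    # 2f component (tracks absorption curvature)
    wms_2f_1f = r2f / r1f                  # intensity-normalised WMS-2f/1f signal

    print(wms_2f_1f[len(wms_2f_1f) // 2])

In an actual spectrometer the demodulation is performed by a hardware or software lock-in with proper filtering, and the 2f/1f line shape is then fitted against a spectral model to retrieve the concentration.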
  186. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increase access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" consists of the geoscience researchers who work separately from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster, et al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster, et al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into cloud computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., the OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. Several recent examples show progress in applying SaaS to geosciences: - NASA's Earth Data Coherent Web has a goal to improve the science user experience, using web services (e.g. W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW]. - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS). - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web-service-based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS. [UAF] Tools to access these SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability using the web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV and LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in recent years toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g. defining data schemas that allow for data fusion, and addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web will address this large and currently underserved user community.

  187. A Compact, Tunable Near-UV Source for Quantitative Microgravity Combustion Diagnostics

    NASA Technical Reports Server (NTRS)

    Peterson, K. A.; Oh, D. B.

    1999-01-01

    There is a need for improved optical diagnostic methods for use in microgravity combustion research. Spectroscopic methods with fast time response that can provide absolute concentrations and concentration profiles of important chemical species in flames are needed to facilitate the understanding of combustion kinetics in microgravity. Although a variety of sophisticated laser-based diagnostics (such as planar laser-induced fluorescence, degenerate four-wave mixing and coherent Raman methods) have been applied to the study of combustion in laboratory flames, the instrumentation associated with these methods is not well suited to microgravity drop-tower or space station platforms. Important attributes of diagnostic systems for such applications include compact size, low power consumption, ruggedness, and reliability. We describe a diode laser-based near-UV source designed with the constraints of microgravity research in mind. Coherent light near 420 nm is generated by frequency doubling in a nonlinear crystal. This light source is single mode with a very narrow bandwidth suitable for gas-phase diagnostics, can be tuned over several cm-1 and can be wavelength modulated at up to MHz frequencies. We demonstrate the usefulness of this source for combustion diagnostics by measuring CH radical concentration profiles in an atmospheric pressure laboratory flame. The radical concentrations are measured using wavelength modulation spectroscopy (WMS) to obtain the line-of-sight integrated absorption for different paths through the flame.
Laser-induced fluorescence (LIF) measurements are also demonstrated with this instrument, showing the feasibility of simultaneous WMS absorption and LIF measurements with the same light source. LIF detection perpendicular to the laser beam can be used to map relative species densities along the line of sight, while the integrated absorption available through WMS provides a mathematical constraint on the extraction of quantitative information from the LIF data. Combining absorption with LIF, especially if the measurements are made simultaneously with the same excitation beam, may allow the elimination of geometrical factors and the effects of intensity fluctuations (common difficulties in the analysis of LIF data) from the analysis.

  188. Developing of operational hydro-meteorological simulating and displaying system

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Shih, D.; Chen, C.

    2010-12-01

    Hydrological hazards, which often occur in conjunction with extreme precipitation events, are the most frequent type of natural disaster in Taiwan. Hence, researchers at the Taiwan Typhoon and Flood Research Institute (TTFRI) are devoted to analyzing and gaining a better understanding of the causes and effects of natural disasters, in particular typhoons and floods. The long-term goal of the TTFRI is to develop a unified weather-hydrological-oceanic model suitable for simulations with local parameterizations in Taiwan. The development of a fully coupled weather-hydrology interaction model is not yet completed, but some operational hydro-meteorological simulations are presented as a step in that direction. The predicted rainfall from the Weather Research and Forecasting (WRF) model is used as the meteorological forcing for watershed modeling. The hydrology and hydraulic modeling are conducted with the WASH123D numerical model, and the coupled WRF/WASH123D system is applied to simulate floods during typhoon landfall periods. The daily operational runs start at 04 UTC, 10 UTC, 16 UTC and 22 UTC, about 4 hours after data are downloaded from the NCEP GFS. This system will execute 72-hour weather forecasts, and the WASH123D simulation will be triggered sequentially after the WRF rainfall data are received. This study presents the preliminary framework for establishing this system, and our goal is to build this early warning system to alert the public to danger. The simulation results are further displayed by a 3D GIS web service system. This system is established following the Open Geospatial Consortium (OGC) standards for GIS web services, such as the Web Map Service (WMS) and Web Feature Service (WFS). Traditional 2D GIS data, such as high-resolution aerial photomaps and satellite images, are integrated into a 3D landscape model. The simulated flooding and inundation area can be dynamically mapped onto the Web 3D world. The final goal of this system is to forecast floods in real time, with the results visually displayed on the virtual catchment.
Policymakers can then easily gain real-time visual information for decision making at any site through the Internet.
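Where the record above mentions WFS alongside WMS, the difference is that WFS returns the underlying vector features (for example, simulated inundation polygons) rather than rendered images. As an illustration only, and assuming a hypothetical endpoint and feature type name, a WFS 2.0.0 GetFeature request for features intersecting a bounding box could look like the following Python sketch.

    import requests

    # Hypothetical WFS endpoint and feature type; real names come from the
    # server's GetCapabilities response.
    WFS_URL = "https://example.org/geoserver/flood/wfs"

    params = {
        "SERVICE": "WFS",
        "VERSION": "2.0.0",
        "REQUEST": "GetFeature",
        "TYPENAMES": "flood:inundation_area",        # hypothetical feature type
        "BBOX": "23.5,120.0,24.5,121.5,EPSG:4326",   # area of interest
        "OUTPUTFORMAT": "application/json",          # GeoJSON, if the server supports it
        "COUNT": "100",
    }

    resp = requests.get(WFS_URL, params=params, timeout=60)
    resp.raise_for_status()
    features = resp.json()["features"]

    for feat in features[:5]:
        # Each GeoJSON feature carries a geometry plus attribute properties.
        print(feat["properties"])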
  189. PAVICS: A platform for the Analysis and Visualization of Climate Science - adopting a workflow-based analysis method for dealing with a multitude of climate data sources

    NASA Astrophysics Data System (ADS)

    Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

    2017-12-01

    As the number of scientific studies and policy decisions requiring tailored climate information continues to increase, the demand for support from climate service centers to provide the latest information in the format most helpful to the end user is also on the rise. Ouranos, one such organization, based in Montreal, has partnered with the Centre de recherche informatique de Montreal (CRIM) to develop a platform that will offer the climate data products identified as most useful for users through years of consultation. The platform is built as modular components that target the various requirements of climate data analysis. The data components host and catalog NetCDF data as well as geographical and political delimitations. The analysis components are made available as atomic operations through the Web Processing Service (WPS) or as workflows, whereby the operations are chained through a simple JSON structure and executed on a distributed network of computing resources. The visualization components range from Web Map Service (WMS) layers to a complete frontend for searching the data, launching workflows and interacting with maps of the results. Each component can easily be deployed and executed as an independent service through the use of Docker technology, and a proxy is available to regulate user workspaces and access permissions. PAVICS includes various components from birdhouse, a collection of WPS initially developed by the German Climate Research Center (DKRZ) and the Institut Pierre Simon Laplace (IPSL), and is designed to be highly interoperable with other WPS as well as many Open Geospatial Consortium (OGC) standards. Further connectivity is made with Earth System Grid Federation (ESGF) nodes, and local results are made searchable using the same API terminology. Other projects conducted by CRIM that integrate with PAVICS include the OGC Testbed 13 Innovation Program (IP) initiative, which will enhance advanced cloud capabilities and application packaging and deployment processes, as well as enabling Earth Observation (EO) processes relevant to climate. As part of its experimental agenda, working implementations of scalable machine learning on big climate data with Spark and SciSpark were delivered.
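WPS exposes each analysis operation through a small set of standard requests (GetCapabilities, DescribeProcess, Execute). Purely as an illustration of that interaction pattern, and with a hypothetical endpoint and process identifier rather than PAVICS's actual process names, a synchronous WPS 1.0.0 Execute call via key-value parameters might look like this in Python.

    import requests

    # Hypothetical WPS endpoint and process identifier; DescribeProcess lists
    # the real identifiers and their expected inputs.
    WPS_URL = "https://example.org/wps"

    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "subset_bbox",                      # hypothetical process
        # DataInputs is a semicolon-separated list of key=value pairs.
        "DataInputs": "resource=https://example.org/thredds/dodsC/tasmax.nc;"
                      "lat0=45.0;lat1=50.0;lon0=-75.0;lon1=-70.0",
    }

    resp = requests.get(WPS_URL, params=params, timeout=120)
    resp.raise_for_status()

    # The response is an XML ExecuteResponse document; for long-running jobs a
    # server can instead return a status URL to poll (storeExecuteResponse/status).
    print(resp.text[:500])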
  190. WAIS-III FSIQ and GAI in ability-memory discrepancy analysis

    PubMed

    Glass, Laura A; Bartels, Jared M; Ryan, Joseph J

    2009-01-01

    The present investigation compares WAIS-III FSIQ-WMS-III and GAI-WMS-III discrepancies in 135 male inpatients with suspected memory impairment. Full Scale IQ and GAI scores were highly correlated, r = .96, with mean values of 92.10 and 93.59, respectively. In additional analyses comparing the ability composites with each WMS-III index (IMI, GMI, and DMI), the GAI consistently produced larger difference scores than did the FSIQ; however, effect sizes were relatively small (ES = .12). Lastly, case-by-case analyses demonstrated concordance rates of 86% for the FSIQ-IMI and GAI-IMI comparisons, 85% for the FSIQ-GMI and GAI-GMI, and 82% for the FSIQ-DMI and GAI-DMI.

  191. Recent Space Shuttle crew compartment design improvements

    NASA Technical Reports Server (NTRS)

    Goodman, Jerry R.

    1986-01-01

    Significant design changes to the Space Shuttle waste management system (WMS) and its related personal hygiene support provisions (PHSP) have been made recently to improve overall operational performance and human factors interfaces. The WMS design improvements involve increased urinal flow, individual urinals, and provisions for manually compacting feces and cleanup materials to ensure adequate mission capacity. The basic arrangement and stowage of the PHSP used during waste management operations were extensively changed to better serve habitability concerns and operational needs, and to improve the hygiene of WMS operations. This paper describes these changes and their design, development, and flight test evaluation. In addition, provisions for an eighth crewmember and a new four-tier sleep station are described.

  192. Application of polar orbiter products in weather forecasting using open source tools and open standards

    NASA Astrophysics Data System (ADS)

    Plieger, Maarten; de Vreede, Ernst

    2015-04-01

    EUMETSAT disseminates data for a number of polar satellites. At KNMI these data are not fully used for operational weather forecasting, mainly because of the irregular coverage and the lack of tools for handling these different types of data and products. For weather forecasting there is a lot of interest in the application of products from these polar orbiters. One of the key aspects is the high resolution of these products, which can complement the information provided by numerical weather forecasts. Another advantage over geostationary satellites is the high coverage at higher latitudes and the lack of parallax. Products like the VIIRS day-night band offer many possibilities for this application. This presentation will describe a project that aims to make a number of products from polar satellites available to the forecasting operation. The goal of the project is to enable easy and timely access to polar orbiter products and to enable combined presentations of satellite imagery with model data. The system will be able to generate RGB composites ("false colour images") for operational use. The system will be built using open source components and open standards. Pytroll components are used for data handling, reprojection and derived product generation. For interactive presentation of imagery the browser-based ADAGUC WMS viewer component is used. Image generation is done by ADAGUC server components, which provide OGC WMS services. Polar satellite products are stored as true color RGBA data in the NetCDF file format; the satellite swaths are stored as regular grids with their own custom geographical projection. The ADAGUC WMS system is able to reproject, render and combine these data interactively in a web browser. Results and lessons learned will be presented at the conference.

  193. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand in the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 petabytes of nationally significant research data collections.
Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections, either locally or remotely, by extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready accessibility for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or, for raw data arrays, Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
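Of the three interfaces named above, WCS is the one that returns the underlying gridded data values rather than a rendered picture or a processing result. As a purely illustrative sketch, with a hypothetical endpoint and coverage identifier, a WCS 2.0.1 GetCoverage request that subsets a coverage by latitude and longitude and asks for the result as GeoTIFF could be issued as follows.

    import requests

    # Hypothetical WCS endpoint and coverage id; the real identifiers are
    # listed in the server's GetCapabilities/DescribeCoverage responses.
    WCS_URL = "https://example.org/ows"

    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", "landsat_nbar_australia"),   # hypothetical coverage
        # Each SUBSET parameter trims one axis; axis labels depend on the
        # coverage's coordinate reference system.
        ("SUBSET", "Lat(-35.5,-33.5)"),
        ("SUBSET", "Long(148.0,150.5)"),
        ("FORMAT", "image/tiff"),
    ]

    resp = requests.get(WCS_URL, params=params, timeout=120)
    resp.raise_for_status()

    with open("subset.tif", "wb") as f:
        f.write(resp.content)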
  194. Wavelength modulation spectroscopy coupled with an external-cavity quantum cascade laser operating between 7.5 and 8 µm

    NASA Astrophysics Data System (ADS)

    Maity, Abhijit; Pal, Mithun; Maithani, Sanchi; Dutta Banik, Gourab; Pradhan, Manik

    2018-04-01

    We demonstrate a mid-infrared detection strategy with 1f-normalized 2f-wavelength modulation spectroscopy (WMS-2f/1f) using a continuous-wave (CW) external-cavity quantum cascade laser (EC-QCL) operating between 7.5 and 8 µm. The detailed performance of the WMS-2f/1f detection method was evaluated by making rotationally resolved measurements in the (ν4 + ν5) combination band of acetylene (C2H2) at 1311.7600 cm-1. A noise-limited detection limit of three parts per billion (ppb) with an integration time of 110 s was achieved for C2H2 detection. The present high-resolution CW EC-QCL system coupled with the WMS-2f/1f strategy was further validated over an extended range of C2H2 concentrations of 0.1-1000 ppm, which shows excellent promise for real-life practical sensing applications. Finally, we utilized the WMS-2f/1f technique to measure the C2H2 concentration in the exhaled breath of smokers.

  195. Clinical validation of three short forms of the Dutch Wechsler Memory Scale-Fourth Edition (WMS-IV-NL) in a mixed clinical sample

    PubMed

    Bouman, Zita; Hendriks, Marc P H; Van Der Veld, William M; Aldenkamp, Albert P; Kessels, Roy P C

    2016-06-01

    The reliability and validity of three short forms of the Dutch version of the Wechsler Memory Scale-Fourth Edition (WMS-IV-NL) were evaluated in a mixed clinical sample of 235 patients. The short forms were based on the WMS-IV Flexible Approach, that is, a 3-subtest combination (Older Adult Battery for Adults) and two 2-subtest combinations (Logical Memory and Visual Reproduction, and Logical Memory and Designs), which can be used to estimate the Immediate, Delayed, Auditory and Visual Memory Indices. All short forms showed good reliability coefficients. As expected, for adults (16-69 years old) the 3-subtest short form was consistently more accurate (predictive accuracy ranged from 73% to 100%) than both 2-subtest short forms (range = 61%-80%). Furthermore, for older adults (65-90 years old), the predictive accuracy of the 2-subtest short form ranged from 75% to 100%. These results suggest that caution is warranted when using the WMS-IV-NL Flexible Approach short forms to estimate all four indices. © The Author(s) 2015.

  196. Structural characterisation of parotid and whole mouth salivary pellicles adsorbed onto DPI and QCMD hydroxyapatite sensors

    PubMed

    Ash, Anthony; Burnett, Gary R; Parker, Roger; Ridout, Mike J; Rigby, Neil M; Wilde, Peter J

    2014-04-01

    In this study we investigated the differences in the properties of pellicles formed from stimulated parotid saliva (PS), which contains little or no mucin, and stimulated whole mouth saliva (WMS), which contains mainly two types of mucin: MUC5B and MUC7. By contacting WMS and PS with quartz-crystal microbalance with dissipation monitoring (QCM-D) and dual polarisation interferometry (DPI) sensors coated with hydroxyapatite (the main component of enamel), we observed the formation and structure of the respective salivary pellicles. This was the first time that DPI hydroxyapatite sensors have been used to measure salivary pellicle adsorption; the combined techniques allowed us to measure not only the hydrated mass, dry mass, thickness and viscoelastic properties of the pellicle, but also the density of the PS- and WMS-formed pellicles. The PS pellicle was shown to form a denser layer than the WMS pellicle, which suggests that the proteins present in PS are responsible for forming the dense basal layer of the acquired enamel pellicle, whereas the proteins present in WMS are more likely to help form the softer outer layer of the pellicle.
The data presented help to further define the mechanisms leading to the multi-layered structure of the salivary pellicle and demonstrate that salivary composition has an important effect on the structural properties of the adsorbed pellicle. Copyright © 2013 Elsevier B.V. All rights reserved.

  197. Resistant starch and protein intake enhances fat oxidation and feelings of fullness in lean and overweight/obese women

    PubMed

    Gentile, Christopher L; Ward, Emery; Holst, Jens Juul; Astrup, Arne; Ormsbee, Michael J; Connelly, Scott; Arciero, Paul J

    2015-10-29

    Diets high in either resistant starch or protein have been shown to aid in weight management. We examined the effects of meals high in non-resistant or resistant starch, with and without elevated protein intake, on substrate utilization, energy expenditure, and satiety in lean and overweight/obese women. Women of varying levels of adiposity consumed one of four pancake test meals in a single-blind, randomized crossover design: 1) waxy maize (control) starch (WMS); 2) waxy maize starch and whey protein (WMS+WP); 3) resistant starch (RS); or 4) RS and whey protein (RS+WP). Total post-prandial energy expenditure did not differ following any of the four test meals (WMS = 197.9 ± 8.9; WMS+WP = 188 ± 8.1; RS = 191.9 ± 8.9; RS+WP = 195.8 ± 8.7 kcal/180 min), although the combination of RS+WP, but not either intervention alone, significantly increased (P < 0.01) fat oxidation (WMS = 89.5 ± 5.4; WMS+WP = 84.5 ± 7.2; RS = 97.4 ± 5.4; RS+WP = 107.8 ± 5.4 kcal/180 min). Measures of fullness increased (125% vs. 45%) and hunger decreased (55% vs. 16%) following the whey-protein-supplemented versus non-whey conditions (WMS+WP, RS+WP vs. WMS, RS), whereas circulating hunger and satiety factors were not different among any of the test meals. However, peptide YY (PYY) was significantly elevated at 180 min following the RS+WP meal. The combined consumption of dietary resistant starch and protein increases fat oxidation and PYY, and enhances feelings of satiety and fullness to levels that may be clinically relevant if maintained under chronic conditions. This trial was registered at clinicaltrials.gov as NCT02418429.

  198. Ppb-level mid-infrared ethane detection based on three measurement schemes using a 3.34-μm continuous-wave interband cascade laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunguang; Zheng, Chuantao; Dong, Lei

    A ppb-level mid-infrared ethane (C2H6) sensor was developed using a continuous-wave, thermoelectrically cooled, distributed feedback interband cascade laser emitting at 3.34 μm and a miniature dense-pattern multipass gas cell with a 54.6-m optical path length.
The performance of the sensor was investigated using two different techniques based on the tunable interband cascade laser: direct absorption spectroscopy (DAS) and second-harmonic wavelength modulation spectroscopy (2f-WMS). Three measurement schemes, DAS, WMS and quasi-simultaneous DAS and WMS, were realized with the same optical sensor core. A detection limit of ~7.92 ppbv with a precision of ±30 ppbv was achieved for the separate DAS scheme with an averaging time of 1 s, and a detection limit of ~1.19 ppbv with a precision of about ±4 ppbv for the separate WMS scheme with a 4-s averaging time. An Allan-Werle variance analysis indicated that the precisions can be further improved to 777 pptv @ 166 s for the separate DAS scheme and 269 pptv @ 108 s for the WMS scheme, respectively. For the quasi-simultaneous DAS and WMS scheme, both the 2f signal and the direct absorption signal were simultaneously extracted using a LabVIEW platform, and four C2H6 samples (0, 30, 60 and 90 ppbv, with nitrogen as the balance gas) were used as the target gases to assess the sensor performance. A detailed comparison of the three measurement schemes is reported. Atmospheric C2H6 measurements on the Rice University campus and a field test at a compressed natural gas station in Houston, TX, were conducted to evaluate the performance of the sensor system as a robust and reliable field-deployable sensor system.

  199. Ppb-level mid-infrared ethane detection based on three measurement schemes using a 3.34-μm continuous-wave interband cascade laser

    DOE PAGES

    Li, Chunguang; Zheng, Chuantao; Dong, Lei; ...

    2016-06-20

    A ppb-level mid-infrared ethane (C2H6) sensor was developed using a continuous-wave, thermoelectrically cooled, distributed feedback interband cascade laser emitting at 3.34 μm and a miniature dense-pattern multipass gas cell with a 54.6-m optical path length. The performance of the sensor was investigated using two different techniques based on the tunable interband cascade laser: direct absorption spectroscopy (DAS) and second-harmonic wavelength modulation spectroscopy (2f-WMS). Three measurement schemes, DAS, WMS and quasi-simultaneous DAS and WMS, were realized with the same optical sensor core. A detection limit of ~7.92 ppbv with a precision of ±30 ppbv was achieved for the separate DAS scheme with an averaging time of 1 s, and a detection limit of ~1.19 ppbv with a precision of about ±4 ppbv for the separate WMS scheme with a 4-s averaging time. An Allan-Werle variance analysis indicated that the precisions can be further improved to 777 pptv @ 166 s for the separate DAS scheme and 269 pptv @ 108 s for the WMS scheme, respectively.
For the quasi-simultaneous DAS and WMS scheme, both the 2f signal and the direct absorption signal were simultaneously extracted using a LabVIEW platform, and four C2H6 samples (0, 30, 60 and 90 ppbv, with nitrogen as the balance gas) were used as the target gases to assess the sensor performance. A detailed comparison of the three measurement schemes is reported. Atmospheric C2H6 measurements on the Rice University campus and a field test at a compressed natural gas station in Houston, TX, were conducted to evaluate the performance of the sensor system as a robust and reliable field-deployable sensor system.

  200. Mayo's Older Americans Normative Studies (MOANS): Factor Structure of a Core Battery

    ERIC Educational Resources Information Center

    Smith, Glenn E.; And Others

    1992-01-01

    Using the Mayo Older Americans Normative Studies (MOANS) group (526 adults aged 55 to 97 years), factor models were examined for the Wechsler Adult Intelligence Scale-Revised (WAIS-R); the Wechsler Memory Scale (WMS); and a core battery comprising the WAIS-R, the WMS, and the Rey Auditory-Verbal Learning Test. (SLD)

  201. Receiver Operating Characteristic Curve Analysis of Wechsler Memory Scale-Revised Scores in Epilepsy Surgery Candidates

    ERIC Educational Resources Information Center

    Barr, William B.

    1997-01-01

    Wechsler Memory Scale-Revised (WMS-R) scores were analyzed for 82 epilepsy surgery candidates and used in combination with receiver operating characteristic curves to classify patients
with left (LTL) and right (RTL) temporal lobe seizure onset. Results indicate that WMS-R scores used alone or in combination provide relatively poor discrimination…

  202. 75 FR 68402 - Georges Creek Railway, LLC-Operation Exemption-in Allegany County, MD

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    .... 25, 2005). By decision served December 14, 2005, WMS, LLC (WMS) was authorized to acquire the Line....27, and by decision served August 18, 2006, James Riffin was substituted as the acquiring entity in... acquired a rail line under the OFA process from transferring that line to any entity other than the...

  203. Diode laser detection of greenhouse gases in the near-infrared region by wavelength modulation spectroscopy: pressure dependence of the detection sensitivity

    PubMed

    Asakawa, Takashi; Kanno, Nozomu; Tonokura, Kenichi

    2010-01-01

    We have investigated the pressure dependence of the detection sensitivity of CO2, N2O and CH4 using wavelength modulation spectroscopy (WMS) with distributed feedback diode lasers in the near-infrared region. The spectral line shapes and the background noise of the second-harmonic (2f) detection of the WMS were analyzed theoretically. We determined the optimum pressure conditions for the detection of CO2, N2O and CH4 by taking into consideration the background noise in the WMS. At the optimum total pressure for the detection of CO2, N2O and CH4, the limits of detection of the present system were determined.

  204. On the value of satellite-based river discharge and river flood data

    NASA Astrophysics Data System (ADS)

    Kettner, A. J.; Brakenridge, R.; van Praag, E.; Borrero, S.; Slayback, D. A.; Young, C.; Cohen, S.; Prades, L.; de Groeve, T.

    2015-12-01

    Flooding is the most common natural hazard worldwide. According to the World Resources Institute, floods impact 21 million people every year and affect global GDP by $96 billion. Providing accurate flood maps in near-real time (NRT) is critical to their utility for first responders. Also, in times of flooding, river gauging stations on location, if any, are of less use for monitoring stage height as an approximation of water surface area, as the stations themselves often get washed out or peak water levels rise well beyond their design measuring capacity.
In a joint effort with the NASA Goddard Space Flight Center, the European Commission Joint Research Centre and the University of Alabama, the Dartmouth Flood Observatory (DFO) measures in NRT: 1) river discharges, and 2) water inundation extents, both with global coverage on a daily basis. Satellite-based passive microwave sensors and hydrological modeling are utilized to establish 'remote-sensing-based discharge stations'. Once calibrated, the daily discharge time series span from 1998 to the present. Also, the two MODIS instruments aboard the NASA Terra and Aqua satellites provide daily floodplain inundation extent with global coverage at a spatial resolution of 250 m. DFO's mission is to provide easy access to NRT river and flood data products. Apart from the DFO web portal, several water extent products can be ingested by utilizing a Web Map Service (WMS), such as the one established for the Latin America and the Caribbean (LAC) region through the GeoSUR program portal. This effort includes implementing over 100 satellite discharge stations showing in NRT whether a river is flooding, normal, or in low flow. New collaborative efforts have resulted in flood hazard maps which display flood extent as well as exceedance probabilities. The record length of our sensors allows mapping of the 1.5-year, 5-year and 25-year flood extents. These can provide key information to water management and disaster response entities.
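Daily NRT products such as these are typically published through a time-enabled WMS, where the optional TIME parameter selects the observation date. The Python sketch below shows the general shape of such a request; the endpoint and layer name are hypothetical placeholders rather than the actual GeoSUR or DFO service names.

    import requests

    # Hypothetical time-enabled WMS endpoint and flood-extent layer.
    WMS_URL = "https://example.org/wms"

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "flood_extent_daily",     # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",                 # WMS 1.1.1 uses SRS and lon/lat BBOX order
        "BBOX": "-75.0,-5.0,-60.0,10.0",    # minLon,minLat,maxLon,maxLat
        "WIDTH": "1024",
        "HEIGHT": "1024",
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
        "TIME": "2015-07-15",               # the date of interest for a daily product
    }

    resp = requests.get(WMS_URL, params=params, timeout=60)
    resp.raise_for_status()
    with open("flood_extent_2015-07-15.png", "wb") as f:
        f.write(resp.content)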
  205. Global Imagery Browse Services (GIBS) - Rapidly Serving NASA Imagery for Applications and Science Users

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.

    2012-12-01

    Expedited processing of imagery from NASA satellites for near-real-time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real-time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format that would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data are geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts it into image tiles, which are stored in a highly optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made in the non-standard but widely known TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization and browse clients for a diverse variety of specific audiences. Currently, EOSDIS is providing an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other clients can also be developed using tools such as Google Earth, the Google Earth browser plugin, ESRI's Adobe Flash/Flex client library, NASA World Wind, the Perceptive Pixel client, Esri's iOS client library, and OpenLayers for Mobile. The imagery browse capabilities of GIBS can be combined with other EOSDIS services (e.g., ECHO OpenSearch) via a client that ties them together to provide an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science-quality data from the entire data record of these EOS instruments.
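A WMTS client works differently from a WMS client: instead of free-form bounding boxes it requests pre-rendered tiles addressed by tile matrix (zoom level), row and column. The sketch below shows the generic shape of an OGC WMTS GetTile key-value request in Python; the endpoint, layer, style and tile matrix set names are illustrative assumptions and should be taken from the service's actual GetCapabilities document (for GIBS, see the URL given in the record).

    import requests

    # Illustrative WMTS endpoint; layer, style and tile matrix set identifiers
    # come from the service's GetCapabilities response.
    WMTS_URL = "https://example.org/wmts"

    params = {
        "SERVICE": "WMTS",
        "VERSION": "1.0.0",
        "REQUEST": "GetTile",
        "LAYER": "MODIS_Terra_CorrectedReflectance_TrueColor",  # assumed layer name
        "STYLE": "default",
        "TILEMATRIXSET": "EPSG4326_250m",   # assumed tile pyramid identifier
        "TILEMATRIX": "3",                  # zoom level within the pyramid
        "TILEROW": "2",
        "TILECOL": "5",
        "FORMAT": "image/jpeg",
        "TIME": "2012-07-09",               # daily imagery is usually addressed by date
    }

    resp = requests.get(WMTS_URL, params=params, timeout=30)
    resp.raise_for_status()
    with open("tile_z3_r2_c5.jpg", "wb") as f:
        f.write(resp.content)

The same tile can usually also be fetched through a RESTful URL template advertised in the capabilities document.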
  A New Look at Data Usage by Using Metadata Attributes as Indicators of Data Quality

    NASA Astrophysics Data System (ADS)

    Won, Y. I.; Wanchoo, L.; Behnke, J.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) stores and distributes data from EOS satellites, as well as ancillary, airborne, in-situ, and socio-economic data. Twelve EOSDIS data centers support different scientific disciplines by providing products and services tailored to specific science communities. Although discipline oriented, these data centers provide common data management functions of ingest, archive and distribution, as well as documentation of their data and services on their websites. The Earth Science Data and Information System (ESDIS) Project collects metrics from the EOSDIS data centers on a daily basis through a tool called the ESDIS Metrics System (EMS); these metrics are used in this study. The implementation of the Earthdata Login - formerly known as the User Registration System (URS) - across the various NASA data centers provides the EMS with additional information about users obtaining data products from EOSDIS data centers. These additional user attributes collected by the Earthdata Login, such as the user's primary area of study, can augment the understanding of data usage, which in turn can help the EOSDIS program better understand users' needs. This study will review the key metrics (users, distributed volume, and files) in multiple ways to gain an understanding of the significance of the metadata. Characterizing the usability of data by key metadata elements, such as discipline and study area, will assist in understanding how the users have evolved over time. The data usage pattern based on version numbers may also provide some insight into the level of data quality. In addition, the data metrics by various services, such as the Open-source Project for a Network Data Access Protocol (OPeNDAP), Web Map Service (WMS), Web Coverage Service (WCS), and subsets, will address how these services have extended the usage of data. Overall, this study will present the usage of data and metadata through metrics analyses and will assist data centers in better supporting the needs of their users.

  Pushing HTCondor and glideinWMS to 200K+ Jobs in a Global Pool for CMS before Run 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balcas, J.; Belforte, S.; Bockelman, B.

    2015-12-23

    The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch and pilot-based Grid provisioning system. So far we have been running several independent resource pools, but we are working on unifying them all to reduce the operational load and more effectively share resources between various activities in CMS. The major challenge of this unification activity is scale. The combined pool size is expected to reach 200K job slots, which is significantly bigger than any other multi-user HTCondor-based system currently in production. To get there we have studied scaling limitations in our existing pools, the biggest of which tops out at about 70K slots, providing valuable feedback to the development communities, who have responded by delivering improvements which have helped us reach higher and higher scales with more stability. We have also worked on improving the organization and support model for this critical service during Run 2 of the LHC. This contribution will present the results of the scale testing and experiences from the first months of running the Global Pool.

  Towards a National Hydrological Forecasting System for Canada: Lessons Learned from the Great Lakes and St. Lawrence Prediction System

    NASA Astrophysics Data System (ADS)

    Fortin, V.; Durnford, D.; Gaborit, E.; Davison, B.; Dimitrijevic, M.; Matte, P.

    2016-12-01

    Environment and Climate Change Canada has recently deployed a water cycle prediction system for the Great Lakes and St. Lawrence River. The model domain includes both the Canadian and US portions of the watershed. It provides 84-h forecasts of weather elements, lake level, lake ice cover and surface currents based on two-way coupling of the GEM numerical weather prediction (NWP) model with the NEMO ocean model. Streamflow of all the major tributaries of the Great Lakes and St.
    Lawrence River is estimated by the WATROUTE routing model, which routes the surface runoff forecast by GEM's land-surface scheme and assimilates streamflow observations where available. Streamflow forecasts are updated twice daily and are disseminated through an OGC-compliant Web Map Service (WMS) and a Web Feature Service (WFS). In this presentation, in addition to describing the system and documenting its forecast skill, we show how it is being used by clients for various environmental prediction applications. We then discuss the importance of two-way coupling, land-surface and hillslope modelling, and the impact of horizontal resolution on hydrological prediction skill. In the second portion of the talk, we discuss plans for implementing a similar system at the national scale, using what we have learned in the Great Lakes and St. Lawrence watershed. Early results obtained for the headwaters of the Saskatchewan River as well as for the whole Nelson-Churchill watershed are presented.
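Forecast streamflow published through an OGC WFS of the kind mentioned above can be pulled with a generic client such as OWSLib. A minimal sketch follows; the service URL and feature type name are placeholders, not the actual ECCC endpoints.

```python
from owslib.wfs import WebFeatureService

# Placeholder service URL and layer name; the real endpoints are not given in the abstract.
wfs = WebFeatureService(url="https://example.org/geoserver/wfs", version="1.1.0")

# List the feature types the service advertises.
print(list(wfs.contents))

# Request forecast streamflow features over an approximate Great Lakes bounding box (lon/lat).
response = wfs.getfeature(
    typename=["hydro:streamflow_forecast"],     # hypothetical feature type
    bbox=(-93.0, 41.0, -74.0, 50.0),
)
gml = response.read()
print(gml[:200])   # first bytes of the returned GML document
```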
  Microenvironmental Regulation by Fibrillin-1

    PubMed Central

    Sengle, Gerhard; Tsutsui, Ko; Keene, Douglas R.; Tufa, Sara F.; Carlson, Eric J.; Charbonneau, Noe L.; Ono, Robert N.; Sasaki, Takako; Wirtz, Mary K.; Samples, John R.; Fessler, Liselotte I.; Fessler, John H.; Sekiguchi, Kiyotoshi; Hayflick, Susan J.; Sakai, Lynn Y.

    2012-01-01

    Fibrillin-1 is a ubiquitous extracellular matrix molecule that sequesters latent growth factor complexes. A role for fibrillin-1 in specifying tissue microenvironments has not been elucidated, even though the concept that fibrillin-1 provides extracellular control of growth factor signaling is currently appreciated. Mutations in FBN1 are mainly responsible for the Marfan syndrome (MFS), recognized by its pleiotropic clinical features including tall stature and arachnodactyly, aortic dilatation and dissection, and ectopia lentis. Each of the many different mutations in FBN1 known to cause MFS must lead to similar clinical features through common mechanisms, proceeding principally through the activation of TGFβ signaling. Here we show that a novel FBN1 mutation in a family with Weill-Marchesani syndrome (WMS) causes thick skin, short stature, and brachydactyly when replicated in mice. WMS mice confirm that this mutation does not cause MFS. The mutation deletes three domains in fibrillin-1, abolishing a binding site utilized by ADAMTSLIKE-2, -3, -6, and papilin. Our results place these ADAMTSLIKE proteins in a molecular pathway involving fibrillin-1 and ADAMTS-10. Investigations of microfibril ultrastructure in WMS humans and mice demonstrate that modulation of the fibrillin microfibril scaffold can influence local tissue microenvironments and link fibrillin-1 function to skin homeostasis and the regulation of dermal collagen production. Hence, pathogenetic mechanisms caused by dysregulated WMS microenvironments diverge from Marfan pathogenetic mechanisms, which lead to broad activation of TGFβ signaling in multiple tissues. We conclude that local tissue-specific microenvironments, affected in WMS, are maintained by a fibrillin-1 microfibril scaffold, modulated by ADAMTSLIKE proteins in concert with ADAMTS enzymes. PMID:22242013

  Multi-criteria analysis towards the new end use of recycled water for household laundry: a case study in Sydney.

    PubMed

    Chen, Z; Ngo, H H; Guo, W S; Listowski, A; O'Halloran, K; Thompson, M; Muthukaruppan, M

    2012-11-01

    This paper aims to put forward several management alternatives regarding the application of recycled water for household laundry in Sydney. Based on different recycled water treatment techniques such as microfiltration (MF), granular activated carbon (GAC) or reverse osmosis (RO), and types of washing machines (WMs), five alternatives were proposed as follows: (1) do-nothing scenario; (2) MF + existing WMs; (3) MF + new WMs; (4) MF-GAC + existing WMs; and (5) MF-RO + existing WMs. Accordingly, a comprehensive quantitative assessment of the trade-offs among a variety of issues (e.g., engineering feasibility, initial cost, energy consumption, supply flexibility and water savings) was performed over the alternatives. This was achieved by a computer-based multi-criteria analysis (MCA) using rank order weight generation together with the preference ranking organization method for enrichment evaluation (PROMETHEE) outranking technique. In particular, the 10,000 combinations of weights generated via Monte Carlo simulation were able to significantly reduce the subjective error associated with a single fixed set of weights, because of their objectivity and efficiency. To illustrate the methodology, a case study on the Rouse Hill Development Area (RHDA), Sydney, Australia was carried out. The study concluded by highlighting the feasibility of using highly treated recycled water for existing and new washing machines. This could provide powerful guidance for sustainable water reuse management in the long term. However, more detailed field trials and investigations are still needed to effectively understand, predict and manage the impact of selected recycled water for new end use alternatives. Copyright © 2012 Elsevier B.V. All rights reserved.
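The Monte Carlo weight-generation step above can be sketched briefly. One standard way to draw random criterion weights that are non-negative and sum to one (assumed here for illustration; the paper's exact rank-order scheme may differ) is to sort uniform random numbers and take the gaps between them.

```python
import numpy as np

rng = np.random.default_rng(42)
n_criteria = 5        # e.g. feasibility, cost, energy, flexibility, water savings
n_samples = 10_000    # number of Monte Carlo weight combinations

# Sort uniform draws on [0, 1] and take the gaps between successive cut points;
# each row is a random weight vector that is non-negative and sums to exactly 1.
cuts = np.sort(rng.uniform(0.0, 1.0, size=(n_samples, n_criteria - 1)), axis=1)
bounds = np.hstack([np.zeros((n_samples, 1)), cuts, np.ones((n_samples, 1))])
weights = np.diff(bounds, axis=1)

assert np.allclose(weights.sum(axis=1), 1.0)
print(weights[:3])
```

Each sampled weight vector would then feed the PROMETHEE outranking step, and the 10,000 resulting rankings would be aggregated.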
  A Quantitative Measure of JS's Memory

    ERIC Educational Resources Information Center

    Ben Shalom, Dorit; Faran, Yifat; Boucher, Jill

    2010-01-01

    JS is a highly able, well-educated 37 year old man with Asperger syndrome. A recent qualitative paper (Boucher, 2007) described his self-report of verbal and visual memory difficulties. The present paper used the WMS-III to compare the memory profile of JS to that of the adults with HFA in the Williams et al. (2005) WMS-III paper. Results show…

  75 FR 68400 - Eighteen Thirty Group, LLC-Acquisition Exemption-in Allegany County, MD

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    .... 659X) (STB served Aug. 25, 2005). By decision served December 14, 2005, WMS, LLC (WMS) was authorized....C. 10904 and 49 CFR 1152.27, and by decision served August 18, 2006, James Riffin was substituted as... entity that has acquired a rail line under the OFA process from transferring that line to any entity...

  Memory assessment and depression: testing for factor structure and measurement invariance of the Wechsler Memory Scale-Fourth Edition across a clinical and matched control sample.

    PubMed

    Pauls, Franz; Petermann, Franz; Lepach, Anja Christina

    2013-01-01

    Between-group comparisons are permissible and meaningfully interpretable only if diagnostic instruments are proved to measure the same latent dimensions across different groups. Addressing this issue, the present study was carried out to provide a rigorous test of measurement invariance. Confirmatory factor analyses were used to determine which model solution could best explain memory performance as measured by the Wechsler Memory Scale-Fourth Edition (WMS-IV) in a clinical depression sample and in healthy controls. Multigroup confirmatory factor analysis was conducted to evaluate the evidence for measurement invariance. A three-factor model solution including the dimensions of auditory memory, visual memory, and visual working memory was identified as best fitting the data in both samples, and measurement invariance was partially satisfied. The results supported the clinical utility of the WMS-IV: auditory and visual memory performances of patients with depressive disorders are interpretable on the basis of the WMS-IV standardization data. However, possible differences in visual working memory functions between healthy and depressed individuals could restrict comparisons of the WMS-IV working memory index.

  Pressure-Dependent Detection of Carbon Monoxide Employing Wavelength Modulation Spectroscopy Using a Herriott-Type Cell.

    PubMed

    Li, Chuanliang; Wu, Yingfa; Qiu, Xuanbing; Wei, Jilin; Deng, Lunhua

    2017-05-01

    Wavelength modulation spectroscopy (WMS) combined with a multipass absorption cell has been used to measure a weak absorption line of carbon monoxide (CO) at 1.578 µm.
    A 0.95 m Herriott-type cell provides an effective absorption path length of 55.1 m. The WMS signals from the first and second harmonic outputs of a lock-in amplifier (WMS-1f and WMS-2f, respectively) agree with the Beer-Lambert law, especially at low concentrations. After boxcar averaging, the minimum detection limit achieved is 4.3 ppm for a measurement time of 0.125 s. The corresponding normalized detection limit is 84 ppm m Hz^(-1/2). If the integration time is increased to 88 s, the minimum detectable limit for CO can reach 0.29 ppm based on an Allan variance analysis. The pressure-dependent relationship is validated after accounting for the pressure factor in data processing. Finally, a linear correlation between the WMS-2f amplitudes and gas concentrations is obtained at concentration ratios less than 15.5%, and the accuracy is better than 92% at total pressures less than 62.7 Torr.
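The quoted detection limits rest on an Allan variance analysis of the concentration time series. A minimal sketch of how such a curve is computed from evenly sampled readings is shown below, on synthetic data rather than the authors' measurements.

```python
import numpy as np

def allan_deviation(x, dt, taus):
    """Simple (non-overlapping) Allan deviation of a series x sampled every dt seconds."""
    x = np.asarray(x, dtype=float)
    sigmas = []
    for tau in taus:
        m = int(round(tau / dt))            # samples per averaging bin
        n_bins = len(x) // m
        if n_bins < 2:
            sigmas.append(np.nan)
            continue
        means = x[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        sigmas.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(sigmas)

# Synthetic example: white noise around a 10 ppm CO reading sampled at 8 Hz (0.125 s).
rng = np.random.default_rng(0)
dt = 0.125
series = 10.0 + rng.normal(0.0, 4.3, size=80_000)
taus = np.array([0.125, 1, 8, 32, 88])
print(dict(zip(taus, allan_deviation(series, dt, taus))))
```

For purely white noise the deviation falls roughly as the square root of the averaging time, which is why longer integration (88 s in the abstract) yields a lower detectable limit until drift takes over.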
  Flood Simulation Using WMS Model in Small Watershed after Strong Earthquake - A Case Study of Longxihe Watershed, Sichuan Province, China

    NASA Astrophysics Data System (ADS)

    Guo, B.

    2017-12-01

    Mountain watersheds in Western China are prone to flash floods. The Wenchuan earthquake on May 12, 2008 led to the destruction of the land surface and to frequent landslides and debris flows, which further exacerbated the flash flood hazard. Two giant torrent and debris flows occurred due to heavy rainfall after the earthquake, one on August 13, 2010 and the other on August 18, 2010. Flash flood reduction and risk assessment are the key issues in post-disaster reconstruction. Hydrological prediction models are important and cost-efficient mitigation tools that are widely applied. In this paper, hydrological observations and simulations using remote sensing data and the WMS model are carried out in a typical flood-hit area, the Longxihe watershed, Dujiangyan City, Sichuan Province, China. The hydrological response of rainfall runoff is discussed. The results show that the WMS HEC-1 model can simulate the runoff process of small watersheds in mountainous areas well. This methodology can be used in other earthquake-affected areas for risk assessment and to predict the magnitude of flash floods. Key words: rainfall-runoff modeling, remote sensing, earthquake, WMS.

  Using Globe Browsing Systems in Planetariums to Take Audiences to Other Worlds

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2014-12-01

    For the last decade planetariums have been adding "full dome video" systems for both movie playback and interactive display. True scientific data visualization has now come to planetarium audiences as a means to display the actual three-dimensional layout of the universe, the time-based array of planets, minor bodies and spacecraft across the solar system, and now globe browsing systems to examine planetary bodies to the limits of the resolutions acquired. Additionally, such planetarium facilities can be networked for simultaneous display across the world, extending audience reach and giving access to authoritative scientist description and commentary. Data repositories such as NASA's Lunar Mapping and Modeling Project (LMMP), NASA GSFC's LANCE-MODIS, and others conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) protocol make geospatial data available to a growing number of dome-supporting globe visualization systems. The immersive surround graphics of full dome video replicate our visual system, creating authentic virtual scenes and effectively placing audiences on location, in some cases on other worlds mapped only robotically.

  Utilizing Satellite-derived Precipitation Products in Hydrometeorological Applications

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Ostrenga, D.; Teng, W. L.; Kempler, S. J.; Huffman, G. J.

    2012-12-01

    Each year droughts and floods happen around the world and can cause severe property damage and human casualties. Accurate measurement and forecasting are important for preparedness and mitigation efforts. Through multi-satellite blending techniques, significant progress has been made over the past decade in satellite-based precipitation product development, for example in the products' spatial and temporal resolutions as well as their timely availability. These new products are widely used in various research and applications. In particular, the TRMM Multi-satellite Precipitation Analysis (TMPA) products archived and distributed by the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) provide 3-hourly, daily and monthly near-global (50° N - 50° S) precipitation datasets for research and applications. Two versions of TMPA products are available: research (3B42, 3B43, rain gauge adjusted) and near-real-time (3B42RT). At GES DISC, we have developed precipitation data services to support hydrometeorological applications in order to maximize the TRMM mission's societal benefits. In this presentation, we will present examples of utilizing TMPA precipitation products in hydrometeorological applications, including: 1) monitoring global floods and droughts; 2) providing data services to support the USDA Crop Explorer; 3) supporting hurricane monitoring activities and research; and 4) retrospective analog-year analyses to improve USDA's world agricultural supply and demand estimates.
    We will also present precipitation data services that can be used to support hydrometeorological applications, including: 1) the user-friendly TRMM Online Visualization and Analysis System (TOVAS; http://disc2.nascom.nasa.gov/Giovanni/tovas/); 2) Mirador (http://mirador.gsfc.nasa.gov/), a simplified interface for searching, browsing, and ordering Earth science data at GES DISC; 3) the Simple Subset Wizard (http://disc.sci.gsfc.nasa.gov/SSW/) for data subsetting and format conversion; 4) data via OPeNDAP (http://disc.sci.gsfc.nasa.gov/services/opendap/), which provides remote access to individual variables within datasets in a form usable by many tools, such as IDV, McIDAS-V, Panoply, Ferret and GrADS; 5) the GrADS-DODS Data Server, or GDS (http://disc2.nascom.nasa.gov/dods/); 6) the Open Geospatial Consortium (OGC) Web Map Service (WMS) (http://disc.sci.gsfc.nasa.gov/services/wxs_ogc.shtml), which allows the use of data and enables clients to build customized maps with data coming from different networks; and 7) NASA gridded hydrological data access through CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc. - Hydrologic Information Systems).
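The OPeNDAP service in the list above allows remote, lazy access to individual variables. A small sketch using xarray follows; the granule URL, variable name and coordinate names are placeholders rather than actual GES DISC paths.

```python
import xarray as xr

# Placeholder OPeNDAP URL for a TMPA-style 3-hourly precipitation granule (illustrative only).
url = "https://example.gov/opendap/TRMM_3B42/3B42.20120813.00.7.HDF"

# xarray reads only the metadata up front; data values are fetched lazily over OPeNDAP.
ds = xr.open_dataset(url)
precip = ds["precipitation"]          # assumed variable name

# Subset a small lat-lon window (assumed coordinate names) before downloading any values.
window = precip.sel(lat=slice(25, 35), lon=slice(-95, -85))
print(window.mean().values)
```

Only the subsetted values cross the network, which is the main practical benefit of the protocol for large gridded archives.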
  Participatory GIS: Experimentations for a 3D Social Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Minghini, M.; Zamboni, G.

    2013-08-01

    The dawn of GeoWeb 2.0, the geographic extension of Web 2.0, has opened new possibilities in terms of online dissemination and sharing of geospatial content, thus laying the foundations for a fruitful development of Participatory GIS (PGIS). The purpose of the study is to investigate the extension of PGIS applications, which are quite mature in the traditional two-dimensional framework, into the third dimension. More specifically, the system should couple a powerful 3D visualization with an increase in public participation by means of a tool allowing data collection from mobile devices (e.g. smartphones and tablets). The PGIS application, built using the open source NASA World Wind virtual globe, is focussed on the cultural and tourism heritage of the city of Como, located in Northern Italy. An authentication mechanism was implemented, which allows users to create and manage customized projects through cartographic mash-ups of Web Map Service (WMS) layers. Saved projects populate a catalogue which is available to the entire community. Together with historical maps and the current cartography of the city, the system is also able to manage geo-tagged multimedia data, which come from user field surveys performed through mobile devices and report POIs (Points Of Interest). Each logged-in user can then contribute to POI characterization by adding textual and multimedia information (e.g. images, audio and video) directly on the globe. All in all, the resulting application allows users to create and share contributions as they usually do on social platforms, additionally providing a realistic 3D representation that enhances the expressive power of the data.

  Differential Effects of the Factor Structure of the Wechsler Memory Scale-Revised on the Cortical Thickness and Complexity of Patients Aged Over 75 Years in a Memory Clinic Setting.

    PubMed

    Kinno, Ryuta; Shiromaru, Azusa; Mori, Yukiko; Futamura, Akinori; Kuroda, Takeshi; Yano, Satoshi; Murakami, Hidetomo; Ono, Kenjiro

    2017-01-01

    The Wechsler Memory Scale-Revised (WMS-R) is one of the internationally well-known batteries for memory assessment in a general memory clinic setting. Several factor structures of the WMS-R for patients aged under 74 have been proposed. However, little is known about the factor structure of the WMS-R for patients aged over 75 years and its neurological significance. Thus, we conducted an exploratory factor analysis to determine the factor structure of the WMS-R for patients aged over 75 years in a memory clinic setting. Regional cerebral blood flow (rCBF) was calculated from single-photon emission computed tomography data. Cortical thickness and cortical fractal dimension, as a marker of cortical complexity, were calculated from high-resolution magnetic resonance imaging data. We found that a four-factor solution was the most appropriate for the model, with recognition memory, paired associate memory, visual-and-working memory, and attention as factors. Patients with mild cognitive impairment showed significantly higher factor scores for paired associate memory, visual-and-working memory, and attention than patients with Alzheimer's disease. Regarding the neuroimaging data, the factor scores for paired associate memory correlated positively with rCBF in the left pericallosal and hippocampal regions. Moreover, the factor score for paired associate memory showed the most robust correlations with cortical thickness in the limbic system, whereas the factor score for attention correlated with cortical thickness in the bilateral precuneus. Furthermore, each factor score correlated with the cortical fractal dimension in the bilateral frontotemporal regions. Interestingly, the factor scores for visual-and-working memory and attention selectively correlated with the cortical fractal dimension in the right posterior cingulate cortex and right precuneus cortex, respectively. These findings demonstrate that recognition memory, paired associate memory, visual-and-working memory, and attention can be crucial factors for interpreting the WMS-R results of elderly patients aged over 75 years in a memory clinic setting.
    Considering these findings, the WMS-R results of elderly patients aged over 75 years in a memory clinic setting should be interpreted cautiously.

  The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution.
    The Matsu Wheel allows many shared data services to be performed together, efficiently using resources for processing hyperspectral satellite image data and other large environmental datasets that may be analyzed for many purposes.
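A schematic sketch of the "wheel" idea described above, not the project's actual code: each newly arrived scene is read and preprocessed once, and every registered analytic is then applied to the same in-memory data. All function and field names below are hypothetical.

```python
from typing import Callable, Dict, List

# Registered analytics; each takes a preprocessed scene and returns a small report dict.
Analytic = Callable[[dict], dict]

def preprocess(raw: dict) -> dict:
    # Stand-in for the one-time geolocation / calibration step performed per scene.
    return {**raw, "threshold": 0.8}

def spectral_anomaly_detector(scene: dict) -> dict:
    return {"analytic": "anomaly", "flagged": scene["max_radiance"] > scene["threshold"]}

def flood_classifier(scene: dict) -> dict:
    return {"analytic": "flood", "water_fraction": scene["water_pixels"] / scene["n_pixels"]}

ANALYTICS: List[Analytic] = [spectral_anomaly_detector, flood_classifier]

def wheel_pass(new_scenes: List[dict]) -> Dict[str, List[dict]]:
    """One revolution of the wheel: each scene is loaded once, all analytics run on it."""
    reports: Dict[str, List[dict]] = {}
    for raw in new_scenes:
        scene = preprocess(raw)
        reports[scene["id"]] = [analytic(scene) for analytic in ANALYTICS]
    return reports

print(wheel_pass([{"id": "EO1-0001", "max_radiance": 0.9, "water_pixels": 120, "n_pixels": 1000}]))
```

The point of the pattern is that adding a third analytic costs no extra data access or preprocessing, which matches the efficiency argument made in the abstract.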
  NOAA Operational Tsunameter Support for Research

    NASA Astrophysics Data System (ADS)

    Bouchard, R.; Stroker, K.

    2008-12-01

    In March 2008, the National Oceanic and Atmospheric Administration's (NOAA) National Data Buoy Center (NDBC) completed the deployment of the last of the 39-station network of deep-sea tsunameters. As part of NOAA's effort to strengthen tsunami warning capabilities, NDBC expanded the network from 6 to 39 stations and upgraded all stations to the second-generation Deep-ocean Assessment and Reporting of Tsunamis technology (DART II). Consisting of a bottom pressure recorder (BPR) and a surface buoy, the tsunameters deliver water-column heights, estimated from pressure measurements at the sea floor, to Tsunami Warning Centers in less than 3 minutes. This network provides coastal communities in the Pacific, Atlantic, Caribbean, and the Gulf of Mexico with faster and more accurate tsunami warnings. In addition, both the coarse-resolution real-time data and the high-resolution (15-second) recorded data provide invaluable contributions to research, such as the detection of the 2004 Sumatran tsunami in the Northeast Pacific (Gower and González, 2006) and the experimental tsunami forecast system (Bernard et al., 2007). NDBC normally recovers the BPRs every 24 months and sends the recovered high-resolution data to NOAA's National Geophysical Data Center (NGDC) for archive and distribution. NGDC edits and processes this raw binary format to obtain research-quality data. NGDC provides access to retrospective BPR data from 1986 to the present. The DART database includes pressure and temperature data from the ocean floor, stored in a relational database, enabling data integration with the global tsunami and significant earthquake databases. All data are accessible via the Web as tables, reports, interactive maps, OGC Web Map Services (WMS), and Web Feature Services (WFS) to researchers around the world. References: Gower, J. and F. González, 2006. U.S. Warning System Detected the Sumatra Tsunami, Eos Trans. AGU, 87(10). Bernard, E. N., C. Meinig, and A. Hilton, 2007. Deep Ocean Tsunami Detection: Third Generation DART, Eos Trans. AGU, 88(52), Fall Meet. Suppl., Abstract S51C-03.

  Use of VIIRS DNB Data to Monitor Power Outages and Restoration for Significant Weather Events

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Molthan, Andrew

    2008-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) project operates from NASA's Marshall Space Flight Center in Huntsville, Alabama. The team provides unique satellite data to the National Weather Service (NWS) and other agencies and organizations for weather analysis. While much of its work is focused on improving short-term weather forecasting, the SPoRT team supported damage assessment and response to Superstorm Sandy by providing imagery that highlighted regions without power. The team used data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (Suomi NPP) satellite. The VIIRS low-light sensor, known as the day-night band (DNB), can detect nighttime light from wildfires, urban and rural communities, and other human activity which emits light. It can also detect moonlight reflected from clouds and surface features. Using real-time VIIRS data collected by our collaborative partner at the Space Science and Engineering Center of the University of Wisconsin, the SPoRT team created composite imagery to help detect power outages and restoration. This blackout imagery allowed emergency response teams from a variety of agencies to better plan and marshal resources for recovery efforts. The blackout product identified large-scale outages, offering a comprehensive perspective beyond the patchwork GIS mapping of outages that utility companies provide based on customer complaints. To support the relief efforts, the team provided its imagery to the USGS data portal, which the Federal Emergency Management Agency (FEMA) and other agencies used in their relief efforts. The team's product helped FEMA, the U.S. Army Corps of Engineers, and the U.S. Army monitor regions without power as part of their disaster response activities. Disaster responders used the images to identify possible outages and effectively distribute relief resources.
    An enhanced product is being developed and integrated into a Web Map Service (WMS) for dissemination and use by a broader end-user community.

  Rational Phosphorus Application Facilitates the Sustainability of the Wheat/Maize/Soybean Relay Strip Intercropping System

    PubMed Central

    Wang, Ke; Liu, Jing; Lu, Junyu; Xu, Kaiwei

    2015-01-01

    The wheat (Triticum aestivum L.)/maize (Zea mays L.)/soybean (Glycine max L.) relay strip intercropping (W/M/S) system is commonly used by smallholders in the Southwest of China. However, little is known about how to manage phosphorus (P) to enhance the P use efficiency of the W/M/S system and to mitigate P leaching, which is a major source of pollution. Field experiments were carried out in 2011, 2012, and 2013 to test the impact of five P application rates on the yield and P use efficiency of the W/M/S system. The study measured grain yield, shoot P uptake, apparent P recovery efficiency (PRE) and soil P content. A linear-plateau model was used to determine the critical P rate that maximizes gains in the indexes of system productivity. The results show that increasing P application rates increased shoot P uptake and crop yields up to threshold rates of 70 and 71.5 kg P ha-1, respectively. As P application rates increased, the PRE of the W/M/S system decreased from 35.9% to 12.3%, averaged over the three years. A rational P application rate, 72 kg P ha-1, or an appropriate soil Olsen-P level, 19.1 mg kg-1, drives the W/M/S system to maximize total grain yield while minimizing P surplus, with a PRE of up to 28.0%. We conclude that rational P application is an important approach for relay intercropping to produce high yields while mitigating P pollution, and that rational P application-based integrated P fertilizer management is vital for sustainable intensification of agriculture in the Southwest of China. PMID:26540207
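The critical P rate cited above comes from a linear-plateau (broken-stick) response model. A hedged sketch of fitting such a model with SciPy is shown below, using made-up yield data rather than the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(x, intercept, slope, x_critical):
    """Yield rises linearly with P rate up to x_critical, then stays on a plateau."""
    return np.where(x < x_critical,
                    intercept + slope * x,
                    intercept + slope * x_critical)

# Made-up grain yields (t/ha) at five P application rates (kg P/ha), for illustration only.
p_rate = np.array([0.0, 20.0, 40.0, 70.0, 100.0])
yield_t = np.array([6.1, 7.0, 7.9, 8.8, 8.9])

params, _ = curve_fit(linear_plateau, p_rate, yield_t, p0=[6.0, 0.05, 60.0])
intercept, slope, x_critical = params
print(f"critical P rate ~ {x_critical:.1f} kg P/ha, "
      f"plateau yield ~ {intercept + slope * x_critical:.2f} t/ha")
```

The fitted breakpoint plays the role of the ~70-72 kg P/ha threshold reported in the abstract: applying more P beyond it adds cost and surplus without a corresponding yield gain.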
  Atypical age-dependency of executive function and white matter microstructure in children and adolescents with autism spectrum disorders.

    PubMed

    Martínez, Kenia; Merchán-Naranjo, Jessica; Pina-Camacho, Laura; Alemán-Gómez, Yasser; Boada, Leticia; Fraguas, David; Moreno, Carmen; Arango, Celso; Janssen, Joost; Parellada, Mara

    2017-11-01

    Executive function (EF) performance is associated with measurements of white matter microstructure (WMS) in typical individuals. Impaired EF is a hallmark symptom of autism spectrum disorders (ASD), but it is unclear how impaired EF relates to variability in WMS. Twenty-one male youth (8-18 years) with ASD and without intellectual disability and twenty-one typical male participants (TP) matched for age, intelligence quotient, handedness, race and parental socioeconomic status were recruited. Five EF domains were assessed, and several DTI-based measurements of WMS [fractional anisotropy (FA), mean diffusivity (MD) and radial diffusivity (RD)] were estimated for eighteen white matter tracts. The ASD group had lower scores for attention (F = 8.37, p = 0.006) and response inhibition (F = 13.09, p = 0.001). Age-dependent changes of EF performance and WMS measurements were present in TP but attenuated in the ASD group. The strongest diagnosis-by-age effect was found for the forceps minor, left anterior thalamic radiation and left cingulum angular bundle (all p's ≤ 0.002). In these tracts, subjects with ASD tended to have equal or increased FA and/or reduced MD and/or RD at younger ages, while controls had increased FA and/or reduced MD and/or RD thereafter. Only for TP individuals, increased FA in the left anterior thalamic radiation was associated with better response inhibition, while reduced RD in the forceps minor and left cingulum angular bundle was related to better problem solving and working memory performance, respectively. These findings provide novel insight into the age-dependency of EF performance and WMS in ASD, which can be instructive for cognitive training programs.

  Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine" and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to the geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners in a joint geoscience project, say European Geological Surveys acting in a border region, means in particular the provision of IT technology to exchange spatially and possibly temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (MLs) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information.
    The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.

  EMODnet Physics in the EMODnet Program Phase 3

    NASA Astrophysics Data System (ADS)

    Novellino, Antonio; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Rickards, Lesley; Thijsse, Peter; Manzella, Giuseppe

    2017-04-01

    Access to marine data is of vital importance for marine research and a key issue for various studies, from climate change prediction to offshore engineering. Giving access to and harmonising marine data from different sources will help industry, public authorities and researchers find the data and make more effective use of them to develop new products and services and improve our understanding of how the seas behave. The aim of EMODnet Physics is the provision of a combined array of services and functionalities (facilities for viewing and downloading, dashboard reporting and machine-to-machine communication services) to obtain, free of charge, data, metadata and data products on the physical conditions of European sea basins and oceans from many different distributed databases. Moreover, the system provides full interoperability with third-party software through WMS services, Web Services and Web catalogues in order to exchange data and products according to the most recent standards. This assures the user access to data of consistent quality and format. The portal provides access to data and products on: wave height and period; temperature and salinity of the water column; wind speed and direction; horizontal velocity of the water column; light attenuation; sea ice coverage; and sea level trends. EMODnet Physics is continuously enhancing the number and type of platforms in the system by unlocking and providing high-quality data from a growing network. The system now integrates information from more than 12,000 stations and includes two ready-to-use data products: Ice Map and Sea Level Trends. The final aim of EMODnet Physics is to federate different portals and act as a portal of portals, to further extend the number and type of data (e.g. water noise, river data, etc.) and platforms (e.g.
    animal-borne instruments, etc.) feeding the system, and to improve the capacity of the system to produce data and products that match the market needs of current and potential new end users and intermediate users.

  The Role of EMODnet Chemistry in the European Challenge for Good Environmental Status

    NASA Astrophysics Data System (ADS)

    Vinci, Matteo; Giorgetti, Alessandra; Lipizer, Marina

    2017-02-01

    The European Union has set the ambitious objective of reaching Good Environmental Status by 2020. The Marine Strategy Framework Directive (European Commission, 2008) provides the legislative framework that drives member state efforts to reach it. The Integrated Maritime Policy supported the need to provide a European knowledge base able to drive sustainable development by launching, in 2009, a new European Marine Observation and Data Network (EMODnet). Through a stepwise approach, EMODnet Chemistry aims to provide high-quality marine environmental data and related products at the scale of the regions and sub-regions defined by the Marine Strategy Framework Directive. The chemistry lot takes advantage of and further develops the SeaDataNet pan-European infrastructure and its distributed approach, linking together a network of more than 100 National Oceanographic Data Centres providing data from more than 500 data originators. Close interaction with the EEA, RSCs, ICES and the EMODnet-MSFD coordination group facilitated the identification of the most appropriate set of information required for the MSFD process. EMODnet Chemistry provides aggregated and validated regional data collections for nutrients, dissolved gases, chlorophyll, and contaminants, visualized with OGC WMS and WPS viewing services. Concentration maps with a 10-year moving window from 1960 to 2014, by season and for selected vertical layers, are computed and made available.

  Modelling Noise Propagation Using Grid Resources: Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacity and computational power of Grid infrastructures for geospatial applications, while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces for web mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service).
    The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and for spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize Grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic, terrain, buildings and noise protection screens, as well as population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of the segments, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation. This computationally intensive calculation needs to be performed for a major part of the European landscape. A Linux version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-Grid computer network. Results and performance indicators will be presented. The presentation is an extension of last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project", which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid computing and Spatial Data Infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group developing the next major version of the WPS specification.
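A deliberately simplified sketch of the per-receiver summation described above: each source segment within the search radius contributes a level attenuated by geometric spreading, and contributions are summed energetically. The real LimA calculation additionally accounts for diffraction, screening and ground effects; all numbers and names below are hypothetical.

```python
import math
from typing import Iterable, Tuple

def segment_level_at_receiver(lw_db: float, distance_m: float) -> float:
    """Sound pressure level from one point-like segment, free-field spherical spreading only."""
    return lw_db - 20.0 * math.log10(max(distance_m, 1.0)) - 11.0

def total_level(receiver: Tuple[float, float],
                segments: Iterable[Tuple[float, float, float]],
                max_range_m: float = 2000.0) -> float:
    """Energetic sum over all source segments (x, y, Lw) within the search radius."""
    rx, ry = receiver
    energy = 0.0
    for sx, sy, lw_db in segments:
        d = math.hypot(sx - rx, sy - ry)
        if d <= max_range_m:
            energy += 10.0 ** (segment_level_at_receiver(lw_db, d) / 10.0)
    return 10.0 * math.log10(energy) if energy > 0.0 else float("-inf")

# Three road segments with identical per-segment source levels, one receiver grid point.
print(round(total_level((0.0, 0.0), [(50.0, 0.0, 95.0), (100.0, 5.0, 95.0), (500.0, 40.0, 95.0)]), 1))
```

Even this stripped-down loop over a 10 m receiver grid across a national road network makes clear why the full computation is a natural candidate for Grid resources.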
  Serum Dioxin and Memory Among Veterans of Operation Ranch Hand

    DTIC Science & Technology

    2007-09-01

    logical memory and visual reproductions subtests. In 1987, the WMS-R was published, expanding on the original WMS and creating a more thorough and...the Verbal Paired Associates subtest, the Logical Memory subtest (immediate and delayed recall), and the Visual Reproduction subtest (immediate and...Visual Reproduction subtest, designed to assess visual memory, the veteran was asked to draw from memory four simple geometric designs that were each

  Effects of heat-moisture treatment reaction conditions on the physicochemical and structural properties of maize starch: moisture and length of heating.

    PubMed

    Sui, Zhongquan; Yao, Tianming; Zhao, Yue; Ye, Xiaoting; Kong, Xiangli; Ai, Lianzhong

    2015-04-15

    Changes in the properties of normal maize starch (NMS) and waxy maize starch (WMS) after heat-moisture treatment (HMT) under various reaction conditions were investigated. NMS and WMS were adjusted to moisture levels of 20%, 25% and 30% and heated at 100 °C for 2, 4, 8 and 16 h. The results showed that moisture content was the most important factor in determining pasting properties for NMS, whereas the length of heating was more important for WMS. Swelling power decreased in NMS but increased in WMS, and while the solubility index decreased for both samples, the changes were largely determined by moisture content. The gelatinisation temperatures of both samples increased with increasing moisture content but remained unchanged with increasing length of heating. The Fourier transform infrared (FT-IR) absorbance ratio was affected to different extents by the moisture levels but remained constant with increasing length of heating. The X-ray intensities increased but relative crystallinity decreased to a greater extent with increasing moisture content. This study showed that the moisture content and the length of heating had significant impacts on the structural and physicochemical properties of normal and waxy maize starches, but to different extents. Copyright © 2014 Elsevier Ltd. All rights reserved.

  H2O absorption measurements in an engineering-scale high-pressure coal gasifier

    NASA Astrophysics Data System (ADS)

    Sun, Kai; Sur, Ritobrata; Jeffries, Jay B.; Hanson, Ronald K.; Clark, Tommy; Anthony, Justin; Machovec, Scott; Northington, John

    2014-10-01

    A real-time, in situ water vapor (H2O) sensor using a tunable diode laser near 1,352 nm was developed to continuously monitor water vapor in the synthesis gas of an engineering-scale high-pressure coal gasifier. Wavelength-scanned wavelength-modulation spectroscopy with second-harmonic detection (WMS-2f) was used to determine the absorption magnitude. The 1f-normalized WMS-2f signal (WMS-2f/1f) was insensitive to non-absorption transmission losses, including beam steering and light scattering by the particulate in the synthesis gas.
    A fitting strategy was used to simultaneously determine the water vapor mole fraction and the collisional-broadening width of the transition from the scanned 1f-normalized WMS-2f waveform at pressures up to 15 atm, and it can be used for large absorbance values. This strategy is analogous to the fitting strategy for wavelength-scanned direct absorption measurements. In a test campaign at the US National Carbon Capture Center, the sensor demonstrated a water vapor detection limit of ~800 ppm (25 Hz bandwidth) at conditions with more than 99.99% non-absorption transmission losses. Successful unattended monitoring was demonstrated over a 435 h period. Strong correlations between the sensor measurements and transient gasifier operating conditions were observed, demonstrating the capability of laser absorption to monitor the gasification process.

  The Live Access Server: Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established web application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data subsets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and Perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS - the workflow orchestration component), business logic converts the request into a series of Web Service requests invoked via SOAP. These "back-end" Web Services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine-grained" data access is performed by back-end services that may utilize JDBC for database access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or Perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used.
The Live Access Server Scientific Product Generation Through Workflow Orchestration

NASA Astrophysics Data System (ADS)

Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

2006-12-01

The Live Access Server (LAS) is a well-established Web application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.

Towards a Location-based Service for Early Mental Health Interventions in Disaster Response Using Minimalistic Tele-operated Android Robots Technology

NASA Astrophysics Data System (ADS)

Vahidi, H.; Mobasheri, A.; Alimardani, M.; Guan, Q.; Bakillah, M.

2014-04-01

Providing early mental health services during a disaster is a great challenge in the disaster response phase. Lack of access to adequate mental-health professionals in the early stages of large-scale disasters dramatically influences the trend of a successful mental health aid. In this paper, a conceptual framework is suggested for adopting cellphone-type tele-operated android robots in the early stages of disasters to provide early mental health services to disaster survivors, by developing a location-based and participatory approach. The techniques of enabling GI-services in a Peer-to-Peer (P2P) environment were studied to overcome the limitations of current centralized services. Therefore, the aim of this research study is to add more flexibility and autonomy to GI web services (WMS, WFS, WPS, etc.) and to alleviate to some degree the inherent limitations of these centralized systems. A P2P system architecture is presented for the location-based service using minimalistic tele-operated android robots, and some key techniques for implementing this service using BestPeer were studied in developing this framework.

Interference effects on commonly used memory tasks.

PubMed

Brophy, Linda M; Jackson, Martin; Crowe, Simon F

2009-02-01

This paper reports two studies which investigated the effect of interference on delayed recall scores of the WMS-III and other commonly used memory measures. In Study 1, participants completed the immediate and delayed components of the WMS-III, with or without the introduction of conceptually similar memory tasks between the recall trials. In Study 2, this order of administration was reversed, with the WMS-III subtests used as the interference items. The results indicated that the introduction of interference items during the delay negatively affected delayed recall performance on almost all sub-tests. In addition, equal effects of proactive and retroactive interference were demonstrated.
These findings raise concerns regarding the standardization process for memory tasks, highlight the need to consider interference effects in clinical practice, and stand as a caution against the use of memory-related materials during the delay interval in memory testing.

Ppb-level formaldehyde detection using a CW room-temperature interband cascade laser and a miniature dense pattern multipass gas cell

DOE Office of Scientific and Technical Information (OSTI.GOV)

Dong, Lei; Yu, Yajun; Li, Chunguang

A ppb-level formaldehyde (H2CO) sensor was developed using a thermoelectrically cooled (TEC), continuous-wave (CW) room-temperature interband cascade laser (ICL) emitting at 3.59 μm and a miniature dense pattern multipass gas cell with >50 m optical path length. Performance of the sensor was investigated with two measurement schemes: direct absorption (DAS) and wavelength modulation spectroscopy (WMS). With an integration time of less than 1.5 seconds, a detection limit of ~3 ppbv for H2CO measurement, with precisions of 1.25 ppbv for DAS and 0.58 ppbv for WMS, respectively, was achieved without zero-air-based background subtraction. An Allan-Werle variance analysis indicated that the precisions can be further improved to 0.26 ppbv @ 300 s for DAS and 69 pptv @ 90 s for WMS, respectively. Finally, a side-by-side comparison between the two measurement schemes is also discussed in detail.
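The Allan-Werle analysis mentioned in this record is a standard way of finding a sensor's optimal averaging time. The following is a minimal sketch of a non-overlapping Allan deviation computed from a simulated concentration time series; the sampling rate, noise level and drift are illustrative assumptions, not values from the record.

```python
import numpy as np

def allan_deviation(x, fs, taus):
    """Non-overlapping Allan deviation of a time series x sampled at fs (Hz),
    evaluated at the averaging times in `taus` (seconds)."""
    devs = []
    for tau in taus:
        m = int(round(tau * fs))              # samples per averaging bin
        n_bins = len(x) // m
        if m < 1 or n_bins < 2:
            devs.append(np.nan)
            continue
        means = x[:n_bins * m].reshape(n_bins, m).mean(axis=1)
        avar = 0.5 * np.mean(np.diff(means) ** 2)   # Allan variance
        devs.append(np.sqrt(avar))
    return np.array(devs)

# Illustrative use: white noise plus a slow drift, sampled at 1 Hz.
rng = np.random.default_rng(0)
fs = 1.0
t = np.arange(36000) / fs
concentration = 3.0 + 1.25 * rng.standard_normal(t.size) + 1e-5 * t   # ppbv
taus = np.logspace(0, 3, 20)
print(allan_deviation(concentration, fs, taus))
```

On such a plot the deviation first falls with averaging time (white noise averaging down) and then rises again once drift dominates, which is how optimal averaging times like the 300 s and 90 s quoted above are identified.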
The CREAM-CE: First experiences, results and requirements of the four LHC experiments

NASA Astrophysics Data System (ADS)

Mendez Lorenzo, Patricia; Santinelli, Roberto; Sciaba, Andrea; Thackray, Nick; Shiers, Jamie; Renshall, Harry; Sgaravatto, Massimo; Padhi, Sanjay

2010-04-01

In terms of the gLite middleware, the current LCG-CE used by the four LHC experiments is about to be deprecated. The new CREAM-CE service (Computing Resource Execution And Management) has been approved to replace the previous service. CREAM-CE is a lightweight service created to handle job management operations at the CE level. It is able to accept requests both via the gLite WMS service and via direct submission for transmission to the local batch system. This flexible duality gives the experiments a large degree of freedom to adapt the service to their own computing models, but at the same time it requires a careful follow-up of the experiments' requirements and tests to ensure that their needs are fulfilled before real data taking. In this paper we present the current testing results of the four LHC experiments concerning this new service. The operations procedures, which have been elaborated together with the experiment support teams, are discussed. Finally, the experiments' requirements and expectations, for both the sites and the service itself, are presented in detail.

The Climate-G Portal: a Grid Enabled Scientific Gateway for Climate Change

NASA Astrophysics Data System (ADS)

Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

2010-05-01

Grid portals are web gateways aiming at concealing the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous and seamless access to heterogeneous and geographically spread resources (i.e. storage, computational facilities, services, sensors, networks, databases). Ultimately they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways are able to introduce a revolution in the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that can be provided by such a web-based environment. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) that gives access to the entire infrastructure. The Climate-G Portal has to face important and critical challenges and to satisfy and address key requirements; the most relevant ones are presented and discussed in the following. Transparency: the portal has to provide transparent access to the underlying infrastructure, shielding users from low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit portal functionalities, and a wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment; a security infrastructure (based on X509v3 digital certificates) is strongly needed. Pervasivity and ubiquity: access to the system must be pervasive and ubiquitous, which follows naturally from the web-based approach. Usability and simplicity: the portal has to provide simple, high-level and user-friendly interfaces to ease access to and exploitation of the entire system. Coexistence of general-purpose and domain-oriented services: along with general-purpose services (file transfer, job submission, etc.), the portal has to provide domain-specific services and functionalities; subsetting of data, visualization of 2D maps around a virtual globe, and delivery of maps through OGC-compliant interfaces (i.e. Web Map Service - WMS) are just some examples. Since April 2009, about 70 users (85% coming from the climate change community) have been granted access to the portal.
A key challenge of this work is the idea of providing users with an integrated working environment, that is, a place where scientists can find huge amounts of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure, and advanced monitoring interfaces.

Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

NASA Astrophysics Data System (ADS)

Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

2009-12-01

Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another can be queried more successfully to optimize planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information? We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype uses semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and the OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms, called the local ontology. Human or machine users formulate their requests using a common ontology of metadata terms, called the global ontology. A CSW mediator rewrites the user's request into CSW requests over local CSWs using their own (local) ontologies, collects the results and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS, error management, and exception handling, enabling Smart Searches, and writing full documentation.
This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science to coastal decision-making.
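A CSW catalogue of the kind mediated in the ICAN prototype can be queried from a client with a standard OGC Catalogue Service request. The sketch below uses the third-party OWSLib library against a hypothetical endpoint; the URL, search term and record limit are illustrative assumptions, not details taken from the record.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical CSW endpoint of a coastal web atlas (illustrative URL).
csw = CatalogueServiceWeb("https://example.org/atlas/csw")

# Free-text search over all queryables, asking for ISO 19139 metadata back.
query = PropertyIsLike("csw:AnyText", "%shoreline%")
csw.getrecords2(constraints=[query],
                maxrecords=10,
                esn="full",
                outputschema="http://www.isotc211.org/2005/gmd")

# csw.records maps record identifiers to parsed metadata objects.
for rec_id, rec in csw.records.items():
    print(rec_id, getattr(rec, "identification", None))
```

In the mediated setup described above, the same kind of GetRecords request would be issued against the mediator using the global ontology's terms, and rewritten into per-atlas requests behind the scenes.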
Making large amounts of meteorological plots easily accessible to users

NASA Astrophysics Data System (ADS)

Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

2015-04-01

The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application for displaying and manipulating recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products and was entirely redesigned last year. It now shares the same infrastructure as ecCharts and can benefit from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed. In its first implementation, it presents the user's products in a single interface with fast access to the original product, and possibilities of synchronous animations between them. Its functionalities are being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF Web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user to interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.

Laser-Based Measurements of OH, Temperature, and Water Vapor Concentration in a Hydrocarbon-Fueled Scramjet (POSTPRINT)

DTIC Science & Technology

2008-07-01

hours. The detector signals are post-processed with a software lock-in amplifier to recover the WMS-1f and WMS-2f signals. The TDLAS sensor utilizes... [Figure 6: Schematic of TDLAS sensor for temperature and water vapor concentration.] ...within the combustor. Tunable diode laser-based absorption spectroscopy (TDLAS) is used to measure water vapor concentration and static temperature near

In situ H2O and temperature detection close to burning biomass pellets using calibration-free wavelength modulation spectroscopy

NASA Astrophysics Data System (ADS)

Qu, Zhechao; Schmidt, Florian M.

2015-04-01

The design and application of an H2O/temperature sensor based on scanned calibration-free wavelength modulation spectroscopy (CF-WMS) and a single tunable diode laser at 1.4 µm is presented. The sensor probes two H2O absorption peaks in a single scan and simultaneously retrieves H2O concentration and temperature by least-squares fitting simulated 1f-normalized 2f-WMS spectra to measured 2f/1f-WMS signals, with temperature, concentration and nonlinear modulation amplitude as fitting parameters.
Given a minimum detectable absorbance of 1.7 × 10^-5 cm^-1 Hz^-1/2, the system is applicable down to an H2O concentration of 0.1% at 1,000 K and 20 cm path length (200 ppm·m). The temperature in a water-seeded laboratory-scale reactor (670-1220 K at 4% H2O) was determined within an accuracy of 1% by comparison with the reactor thermocouple. The CF-WMS sensor was applied to real-time in situ measurements of H2O concentration and temperature time histories (0.25-s time resolution) in the hot gases 2-11 mm above biomass pellets during atmospheric combustion in the reactor. Temperatures between 1,200 and 1,600 K and H2O concentrations up to 40% were detected above the biofuels.

Dietary starch types affect liver nutrient metabolism of finishing pigs.

PubMed

Xie, Chen; Li, Yanjiao; Li, Jiaolong; Zhang, Lin; Zhou, Guanghong; Gao, Feng

2017-09-01

This study aimed to evaluate the effect of different starch types on liver nutrient metabolism of finishing pigs. In all, ninety barrows were randomly allocated to three diets with five replicates of six pigs, containing purified waxy maize starch (WMS), non-waxy maize starch (NMS) and pea starch (PS) (the amylose to amylopectin ratios were 0·07, 0·19 and 0·28, respectively). After 28 d of treatment, two pigs per pen (close to the average body weight of the pen) were weighed individually and slaughtered, and liver samples were collected. Compared with the WMS diet, the PS diet decreased the activities of glycogen phosphorylase and phosphoenolpyruvate carboxykinase and the expression of phosphoenolpyruvate carboxykinase 1 in liver (P<0·05). Compared with the WMS diet, the PS diet reduced the expressions of glutamate dehydrogenase and carbamoyl phosphate synthetase 1 in liver (P<0·05). The PS diet decreased the expression of the insulin receptor, and increased the expressions of mammalian target of rapamycin complex 1 and ribosomal protein S6 kinase β-1 in liver, compared with the WMS diet (P<0·05). These findings indicated that a diet with higher amylose content could down-regulate gluconeogenesis, and cause less fat deposition and more protein deposition, by affecting the insulin/PI3K/protein kinase B signalling pathway in the liver of finishing pigs.

Integrating NASA Satellite Data Into USDA World Agricultural Outlook Board Decision Making Environment To Improve Agricultural Estimates

NASA Technical Reports Server (NTRS)

Teng, William; Shannon, Harlan; deJeu, Richard; Kempler, Steve

2012-01-01

The USDA World Agricultural Outlook Board (WAOB) is responsible for monitoring weather and climate impacts on domestic and foreign crop development. One of WAOB's primary goals is to determine the net cumulative effect of weather and climate anomalies on final crop yields. To this end, a broad array of information is consulted.
The resulting agricultural weather assessments are published in the Weekly Weather and Crop Bulletin to keep farmers, policy makers, and commercial agricultural interests informed of weather and climate impacts on agriculture. The goal of the current project is to improve WAOB estimates by integrating NASA satellite precipitation and soil moisture observations into WAOB's decision making environment. Precipitation (Level 3 gridded) is from the TRMM Multi-satellite Precipitation Analysis (TMPA). Soil moisture (Level 2 swath and Level 3 gridded) is generated by the Land Parameter Retrieval Model (LPRM) and operationally produced by the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). A root zone soil moisture (RZSM) product is also generated, via assimilation of the Level 3 LPRM data by a land surface model (part of a related project). Data services to be available for these products include GeoTIFF, GDS (GrADS Data Server), WMS (Web Map Service), WCS (Web Coverage Service), and NASA Giovanni. Project benchmarking is based on retrospective analyses of WAOB analog year comparisons, i.e., comparisons between a given year and historical years with similar weather patterns and estimated crop yields. An analog index (AI) was developed to introduce a more rigorous, statistical approach for identifying analog years. Results thus far show that crop yield estimates derived from TMPA precipitation data are closer to measured yields than are estimates derived from surface-based precipitation measurements. Work is continuing to include LPRM surface soil moisture data and model-assimilated RZSM.

EarthServer: Visualisation and use of uncertainty as a data exploration tool

NASA Astrophysics Data System (ADS)

Walker, Peter; Clements, Oliver; Grant, Mike

2013-04-01

The Ocean Science/Earth Observation community generates huge datasets from satellite observation. Until recently it has been difficult to obtain matching uncertainty information for these datasets and to apply this to their processing. In order to make use of uncertainty information when analysing "Big Data", we need both the uncertainty itself (attached to the underlying data) and a means of working with the combined product without requiring the entire dataset to be downloaded. The European Commission FP7 project EarthServer (http://earthserver.eu) is addressing the problem of accessing and performing ad-hoc analysis of extreme-size Earth Science data using cutting-edge Array Database technology. The core software (Rasdaman) and web services wrapper (Petascope) allow huge datasets to be accessed using Open Geospatial Consortium (OGC) standard interfaces, including the well-established Web Coverage Service (WCS) and Web Map Service (WMS) standards as well as the emerging Web Coverage Processing Service (WCPS) standard. The WCPS standard allows the running of ad-hoc queries on any of the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets.
The ESA Ocean Colour - Climate Change Initiative (OC-CCI) project (http://www.esa-oceancolour-cci.org/) is producing high-resolution, global ocean colour datasets over the full time period (1998-2012) for which high-quality observations were available. This climate data record includes per-pixel uncertainty data for each variable, based on an analytic method that classifies how much and which types of water are present in a pixel, and assigns uncertainty based on robust comparisons to global in-situ validation datasets. These uncertainty values take two forms, Root Mean Square (RMS) and bias uncertainty, respectively representing the expected variability and the expected offset error. By combining the data produced through the OC-CCI project with the software from the EarthServer project, we can produce a novel data offering that allows the use of traditional exploration and access mechanisms such as WMS and WCS. However, the real benefits can be seen when utilising WCPS to explore the data. We will show two major benefits of this infrastructure. Firstly, we will show that the visualisation of the combined chlorophyll and uncertainty datasets through a web-based GIS portal gives users the ability to instantaneously assess the quality of the data they are exploring, using traditional web-based plotting techniques as well as novel web-based three-dimensional visualisation. Secondly, we will showcase the benefits available when combining these data with the WCPS standard. The uncertainty data can be utilised in queries using the standard WCPS query language. This allows selection of data, either for download or for use within the query, based on the respective uncertainty values, as well as the possibility of incorporating both the chlorophyll data and the uncertainty data into complex queries to produce additional novel data products. By filtering with uncertainty at the data source rather than the client, we can minimise traffic over the network, allowing huge datasets to be worked on with a minimal time penalty.
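An uncertainty-filtering query of the kind described in this record can be issued as a plain HTTP call against a rasdaman/Petascope endpoint. The sketch below keeps chlorophyll values only where a companion RMS-uncertainty coverage is below a threshold; the endpoint, coverage names, threshold and even the exact WCPS dialect accepted by a given server are illustrative assumptions, not details taken from the record.

```python
import requests

# Hypothetical Petascope endpoint and coverage names (illustrative only).
ENDPOINT = "https://example.org/rasdaman/ows"

# WCPS 1.0-style query: mask chlorophyll by its RMS uncertainty and return netCDF.
wcps_query = """
for chl in (CCI_CHLOR_A),
    rms in (CCI_CHLOR_A_RMSD)
return encode(chl * (rms < 0.3), "netcdf")
"""

resp = requests.get(ENDPOINT, params={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",   # WCS processing-extension operation carrying the WCPS query
    "query": wcps_query,
}, timeout=120)
resp.raise_for_status()
with open("chl_filtered.nc", "wb") as fh:
    fh.write(resp.content)
```

The point illustrated is the one made in the abstract: the filtering happens server-side, so only the already-masked result crosses the network.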
Disaster mitigation at drainage basin of Kuranji Padang City

NASA Astrophysics Data System (ADS)

Utama, L.; Yamin, M.

2017-06-01

Floods occur when the river discharge suddenly exceeds the channel's capacity, rapidly inundating low-lying areas in the river basin. A debris flow, locally known as galodo, struck the Kuranji river in Padang city in 2012. The flood disaster affected 19 sub-districts in 7 districts, with the districts of Pauh and Nanggalo hit hardest. The government reported losses of Rp 263.9 billion, while the Government of West Sumatera Province estimated losses at around forty billion Rupiah (Padang Ekspress, 28 July 2012), with damage to 878 houses, 15 houses of worship, 12 irrigation works, 6 bridges, 2 schools and 1 health post. Using rainfall data from 2003 to 2015, the design rainfall obtained with the Gumbel, Hasper and Wedwen methods is 310.00 mm, and the design flood obtained with the Melchior and Hasper methods is 1125.86 m³/second. Based on analysis of satellite (Citra) imagery, maps of the parameters causing floods, and the Watershed Modelling System (WMS) software, the region falls into two susceptibility classes, medium and low. The medium-susceptibility areas lie along the middle and downstream reaches of the river, where slopes are gentle; the low-susceptibility area lies along the middle reach. The headwaters have the greatest potential to generate floods because the terrain is hilly with steep slopes (45-55%). For flood disaster mitigation, three evacuation areas are identified and mapped: Kelurahan Limau Manis Sub-District in Pauh District, Surau Gadang Sub-District in Nanggalo District, and Lambung Bukik Sub-District in Pauh District.

Visual reproduction subtest of the Wechsler Memory Scale-Revised: analysis of construct validity.

PubMed

Williams, M A; Rich, M A; Reed, L K; Jackson, W T; LaMarche, J A; Boll, T J

1998-11-01

This study assessed the construct validity of Visual Reproduction (VR) Cards A (Flags) and B (Boxes) from the original Wechsler Memory Scale (WMS) compared to Flags and Boxes from the revised edition of the WMS (WMS-R). Independent raters scored Flags and Boxes using both the original and revised scoring criteria, and correlations were obtained with age, education, IQ, and four separate criterion memory measures. Results show that for Flags, there is a tendency for the revised scoring criteria to produce improved construct validity. For Boxes, however, there was a trend in the opposite direction, with the revised scoring criteria demonstrating worse construct validity. Factor analysis suggests that Flags are a more distinct measure of visual memory, whereas Boxes are more complex and significantly associated with conceptual reasoning abilities. Using the revised scoring criteria, Boxes were found to be more strongly related to IQ than Flags. This difference was not found using the original scoring criteria.

Ppb-level formaldehyde detection using a CW room-temperature interband cascade laser and a miniature dense pattern multipass gas cell

DOE PAGES

Dong, Lei; Yu, Yajun; Li, Chunguang; ...

2015-07-27

A ppb-level formaldehyde (H2CO) sensor was developed using a thermoelectrically cooled (TEC), continuous-wave (CW) room-temperature interband cascade laser (ICL) emitting at 3.59 μm and a miniature dense pattern multipass gas cell with >50 m optical path length. Performance of the sensor was investigated with two measurement schemes: direct absorption (DAS) and wavelength modulation spectroscopy (WMS).
With an integration time of less than 1.5 seconds, a detection limit of ~3 ppbv for H2CO measurement, with precisions of 1.25 ppbv for DAS and 0.58 ppbv for WMS, respectively, was achieved without zero-air-based background subtraction. An Allan-Werle variance analysis indicated that the precisions can be further improved to 0.26 ppbv @ 300 s for DAS and 69 pptv @ 90 s for WMS, respectively. Finally, a side-by-side comparison between the two measurement schemes is also discussed in detail.

[Absorption spectrum of quasi-continuous laser modulation demodulation method].

PubMed

Shao, Xin; Liu, Fu-Gui; Du, Zhen-Hui; Wang, Wei

2014-05-01

A software phase-locked amplifier demodulation method is proposed in order to properly demodulate the second harmonic (2f) signal of quasi-continuous laser wavelength modulation spectroscopy (WMS), based on an analysis of its signal characteristics. By judging the validity of the measurement data and applying filtering, phase-sensitive detection, digital filtering and other processing, the method can achieve sensitive detection of quasi-continuous signals. The method was verified using carbon dioxide detection experiments. The WMS-2f signals obtained by the software phase-locked amplifier and by a high-performance phase-locked amplifier (SR844) were compared simultaneously. The results show that the Allan variance of the WMS-2f signal demodulated by the software phase-locked amplifier is one order of magnitude smaller than that demodulated by the SR844, corresponding to a two-order-of-magnitude lower detection limit. The method is also able to solve the loss-of-lock problem caused by the small duty cycle of the quasi-continuous modulation signal, with only small signal waveform distortion.
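A software lock-in amplifier of the kind described in this record can be reduced, at its core, to multiplying the detector signal by reference sinusoids at 2f and low-pass filtering. The sketch below is a minimal digital version; it omits the data-validity judging and small-duty-cycle handling the record describes, and the sampling rate, modulation frequency and filter choice are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lockin_2f(detector, fs, f_mod):
    """Demodulate the 2f component of `detector` (sampled at fs, in Hz) for a
    modulation frequency f_mod, returning the phase-insensitive 2f magnitude."""
    t = np.arange(detector.size) / fs
    ref_x = np.cos(2 * np.pi * 2 * f_mod * t)      # in-phase 2f reference
    ref_y = np.sin(2 * np.pi * 2 * f_mod * t)      # quadrature 2f reference
    b, a = butter(4, 0.1 * f_mod / (fs / 2))       # low-pass well below f_mod
    x = filtfilt(b, a, detector * ref_x)
    y = filtfilt(b, a, detector * ref_y)
    return 2.0 * np.hypot(x, y)

# Illustrative use: a toy signal with components at f and 2f plus noise.
fs, f_mod = 100_000.0, 1_000.0
t = np.arange(int(fs)) / fs
sig = (0.5 * np.cos(2 * np.pi * f_mod * t)
       + 0.05 * np.cos(2 * np.pi * 2 * f_mod * t + 0.3)
       + 0.01 * np.random.default_rng(1).standard_normal(t.size))
print(lockin_2f(sig, fs, f_mod).mean())   # close to the 2f amplitude, 0.05
```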
Application of new WAIS-III/WMS-III discrepancy scores for evaluating memory functioning: relationship between intellectual and memory ability.

PubMed

Lange, Rael T; Chelune, Gordon J

2006-05-01

Analysis of the discrepancy between memory and intellectual ability has received some support as a means for evaluating memory impairment. Recently, comprehensive base rate tables for General Ability Index (GAI) minus memory discrepancy scores (i.e., GAI-memory) were developed using the WAIS-III/WMS-III standardization sample (Lange, Chelune, & Tulsky, in press). The purpose of this study was to evaluate the clinical utility of GAI-memory discrepancy scores for identifying memory impairment in 34 patients with Alzheimer's type dementia (DAT) versus a sample of 34 demographically matched healthy participants. On average, patients with DAT obtained significantly lower scores on all WAIS-III and WMS-III indexes and had larger GAI-memory discrepancy scores. Clinical outcome analyses revealed that GAI-memory scores were useful for identifying memory impairment in patients with DAT versus matched healthy participants. However, GAI-memory discrepancy scores failed to provide unique interpretive information beyond that which is gained from the memory indexes alone. Implications and future research directions are discussed.

Cognitive stimulation therapy (CST): neuropsychological mechanisms of change.

PubMed

Hall, Louise; Orrell, Martin; Stott, Joshua; Spector, Aimee

2013-03-01

Cognitive stimulation therapy (CST) is an evidence-based psychosocial intervention for people with dementia consisting of 14 group sessions aiming to stimulate various areas of cognition. This study examined the effects of CST on specific cognitive domains and explored the neuropsychological processes underpinning any effects. A total of 34 participants with mild to moderate dementia were included. A one-group pretest-posttest design was used. Participants completed a battery of neuropsychological tests in the week before and after the manualised seven-week CST programme. There were significant improvements from pre- to post-CST on measures of delayed verbal recall (WMS-III Logical Memory subtest - delayed), visual memory (WMS-III Visual Reproduction subtest - delayed), orientation (WMS-III Information and Orientation subscale), and auditory comprehension (Token Test). There were no significant changes on measures of naming (Boston Naming Test-2), attention (Trail Making Test A/Digit Span), executive function (DKEFS verbal fluency/Trail Making Test B), praxis (WMS-III Visual Reproduction - immediate) or on a general cognitive screen (MMSE). Memory, comprehension of syntax, and orientation appear to be the cognitive domains most impacted by CST. One hypothesis is that the language-based nature of CST enhances neural pathways responsible for processing of syntax, possibly also aiding verbal recall. Another is that the reduction in negative self-stereotypes due to the de-stigmatising effect of CST may impact on language and memory, the domains that are the primary focus of CST. Further research is required to substantiate these hypotheses.

High-Pressure Measurements of Temperature and CO2 Concentration Using Tunable Diode Lasers at 2 μm.

PubMed

Cai, Tingdong; Gao, Guangzhen; Wang, Minrui; Wang, Guishi; Liu, Ying; Gao, Xiaoming

2016-03-01

A sensor for simultaneous measurements of temperature and carbon dioxide (CO2) concentration at elevated pressure is developed using tunable diode lasers at 2 µm. Based on a set of selection rules, a CO2 line pair at 5006.140 and 5010.725 cm^-1 is selected for the TDL sensor. In order to ensure the accuracy and rapidity of the sensor, quasi-fixed-wavelength WMS is employed.
Normalization of the 2f signal with the 1f signal magnitude is used to remove the need for calibration and to correct for transmission variations due to beam steering, mechanical misalignments, soot, and window fouling. Temperatures are obtained by comparing the measured ratio of background-subtracted, 1f-normalized WMS-2f signals with a model of the 1f-normalized WMS-2f peak-value ratio. CO2 concentration is inferred from the 1f-normalized WMS-2f peak value of the CO2 transition at 5006.140 cm^-1. Measurements of temperature and CO2 concentration are carried out in static cell experiments (P = 1-10 atm, T = 500-1200 K) to validate the accuracy and capability of the sensor. The results show that the accuracy of the sensor is 1.66% for temperature and 3.1% for CO2 concentration. All the measurements show the potential utility of the sensor for combustion diagnostics at elevated pressure.
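Two-line ratio thermometry of the kind used in this record rests on the temperature dependence of the linestrength ratio of the selected line pair. The relations below are the standard textbook form with generic symbols, not necessarily the exact expressions used in the paper.

```latex
% Ratio of the linestrengths of two transitions with lower-state energies E''_1, E''_2
% (partition-function and stimulated-emission factors largely cancel and are neglected):
\[
  R(T) = \frac{S_1(T)}{S_2(T)}
       = \frac{S_1(T_0)}{S_2(T_0)}
         \exp\!\left[-\frac{hc}{k}\left(E''_1 - E''_2\right)
         \left(\frac{1}{T} - \frac{1}{T_0}\right)\right].
\]
% Inverting for temperature once R has been measured, e.g. from the ratio of
% 1f-normalized WMS-2f peak heights of the two transitions:
\[
  T = \frac{\dfrac{hc}{k}\left(E''_2 - E''_1\right)}
           {\ln R + \ln\dfrac{S_2(T_0)}{S_1(T_0)} + \dfrac{hc}{k}\,\dfrac{E''_2 - E''_1}{T_0}} .
\]
```

With the temperature known, the measured peak value of one transition then yields the concentration, which is the ordering the abstract describes.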
Spatial Data Web Services Pricing Model Infrastructure

NASA Astrophysics Data System (ADS)

Ozmus, L.; Erkek, B.; Colak, S.; Cankurt, I.; Bakıcı, S.

2013-08-01

The General Directorate of Land Registry and Cadastre (TKGM), the leading agency in the field of cartography, largely continues its missions, which are to keep and update the land registry and cadastre system of the country under the responsibility of the treasury, to perform transactions related to real estate, and to establish the Turkish national spatial information system. TKGM, a public agency, has completed many projects, such as Continuously Operating GPS Reference Stations (TUSAGA-Aktif), the Geo-Metadata Portal (HBB), orthophoto base-map production and web services, completion of the initial cadastre, the Cadastral Renovation Project (TKMP), the Land Registry and Cadastre Information System (TAKBIS), the Turkish National Spatial Data Infrastructure Project (TNSDI), and the Ottoman Land Registry Archive Information System (TARBIS). TKGM provides updated maps and map information not only to public institutions but also to society at large, in the name of social responsibility principles. Turkish National Spatial Data Infrastructure activities were started by the motivation of Circular No. 2003/48, declared by the Turkish Prime Ministry in 2003 within the context of the e-Transformation of Turkey Short-term Action Plan. Action No. 47 in the mentioned action plan states that "A feasibility study shall be made in order to establish the Turkish National Spatial Data Infrastructure", the responsibility for which was given to the General Directorate of Land Registry and Cadastre. The feasibility report of the NSDI was completed on 10 December 2010. After the decision of the Steering Committee, the feasibility report was sent to the Development Bank (formerly the State Planning Organization) for further evaluation. There are two main arrangements related to this project (feasibility report). First, there is now only one ministry, the Ministry of Environment and Urbanism, responsible for the establishment, operation and all national-level activities of the NSDI. The second arrangement concerns the institutional level. The most important legal arrangement related to the NSDI is the establishment of the General Directorate of Geographic Information Systems under the Ministry of Environment and Urbanism, charged with carrying out, or commissioning, the works and activities related to the establishment, use and improvement of the National Geographic Information System (NGIS). Outputs of these projects are served not only to the public administration but also to Turkish society. Today, for example, TAKBIS data (cadastre services) are shared with more than 50 institutions via web services, the TUSAGA-Aktif system has more than 3800 users who receive real-time GPS data corrections, and orthophoto WMS services have been available free of charge for two years. Today there is great discussion about data pricing among the institutions: some consider the price to reflect the cost of storing the data, others the value of the data itself, and there is no established rule about pricing. In this paper, firstly the pricing of data storage and then spatial data pricing models in different countries are investigated, in order to improve institutional understanding in Turkey.

Diagnostic efficiency of demographically corrected Wechsler Adult Intelligence Scale-III and Wechsler Memory Scale-III indices in moderate to severe traumatic brain injury and lower education levels.

PubMed

Walker, Alexandra J; Batchelor, Jennifer; Shores, E Arthur; Jones, Mike

2009-11-01

Despite the sensitivity of neuropsychological tests to educational level, improved diagnostic accuracy for demographically corrected scores has yet to be established. Diagnostic efficiency statistics of Wechsler Adult Intelligence Scale-III (WAIS-III) and Wechsler Memory Scale-III (WMS-III) indices that were corrected for education, sex, and age (demographically corrected) were compared with age-corrected indices in individuals aged 16 to 75 years with moderate to severe traumatic brain injury (TBI) and 12 years or less of education. TBI participants (n = 100) were consecutive referrals to an outpatient rehabilitation service and met careful selection criteria. Controls (n = 100) were obtained from the WAIS-III/WMS-III standardization sample. Demographically corrected indices did not provide higher diagnostic efficiency than age-corrected indices, and this result was supported by reanalysis of the TBI group against a larger and unmatched control group. The Processing Speed Index provided comparable diagnostic accuracy to that of combined indices. Demographically corrected indices were associated with higher cut-scores to maximize overall classification, reflecting the upward adjustment of those scores in a lower-education sample. This suggests that, in clinical practice, the test results of individuals with limited education may be more accurately interpreted with the application of demographic corrections.
Diagnostic efficiency statistics are presented, and future research directions are discussed.

SCHeMA web-based observation data information system

NASA Astrophysics Data System (ADS)

Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise

2016-04-01

It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives built upon international standards will contribute to simplifying data processing and dissemination, improve user accessibility (including through web browsers), facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download as well as interoperability with other projects, through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach ensures data accessibility in compliance with major European Directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as a SOS (Sensor Observation Service), provides standard interoperability and access to sensor observation systems through the O&M standard, as well as sensor descriptions encoded in Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, to describe data in an already harmonized and common standard, is a prerequisite for consistency and interoperability. Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms.
The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration; development of a Core Profile interface for data access via OGC standards; external services such as web services, WMS and WFS; and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open-sea settings.
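A Sensor Observation Service interface of the kind described in this record can be probed with a plain GetCapabilities request to list the observation offerings a server exposes. The sketch below is a minimal client; the endpoint URL is an illustrative assumption (the actual SCHeMA service address is not given here), and the element names assume a typical SOS 2.0 capabilities document.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical SOS endpoint (illustrative URL).
ENDPOINT = "https://example.org/schema-sos/service"

resp = requests.get(ENDPOINT, params={
    "service": "SOS",
    "request": "GetCapabilities",
    "AcceptVersions": "2.0.0",
}, timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
# List the observation offerings advertised in the capabilities document;
# namespace-wildcard matching avoids hard-coding prefix-to-URI bindings.
for offering in root.findall(".//{*}ObservationOffering"):
    ident = offering.find("{*}identifier")
    print(ident.text if ident is not None else "(unnamed offering)")
```

A follow-up GetObservation request against one of the listed offerings would then return the O&M-encoded observations mentioned in the abstract.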
National Geothermal Data System: State Geological Survey Contributions to Date

NASA Astrophysics Data System (ADS)

Patten, K.; Allison, M. L.; Richard, S. M.; Clark, R.; Love, D.; Coleman, C.; Caudill, C.; Matti, J.; Musil, L.; Day, J.; Chen, G.

2012-12-01

In collaboration with the Association of American State Geologists, the Arizona Geological Survey is leading the effort to bring legacy geothermal data into the U.S. Department of Energy's National Geothermal Data System (NGDS). NGDS is a national, sustainable, distributed, interoperable network of data and service (application) providers entering its final stages of development. Once completed, the geothermal industry, the public, and policy makers will have access to consistent and reliable data, which in turn reduces the amount of staff time devoted to finding, retrieving, integrating, and verifying information. With easier access to information, the high cost and risk of geothermal power projects (especially exploration drilling) is reduced. This presentation focuses on the scientific and data integration methodology as well as State Geological Survey contributions to date. The NGDS is built using the U.S. Geoscience Information Network (USGIN) data integration framework to promote interoperability across the Earth sciences community and with other emerging data integration and networking efforts. Core to the USGIN concept is that of data provenance, achieved by allowing data providers to maintain and house their own data. After concluding the second year of the project, we have nearly 800 datasets representing over 2 million data points from the state geological surveys. A new AASG-specific search catalog based on popular internet search formats enables end users to more easily find and identify geothermal resources in a specific region. Sixteen states, including a consortium of Great Basin states, have initiated new field data collection for submission to the NGDS. The new field data include data from at least 21 newly drilled thermal gradient holes in previously unexplored areas. Most of the datasets provided to the NGDS are being portrayed as Open Geospatial Consortium (OGC) Web Map Services (WMS) and Web Feature Services (WFS), meaning that the data are compatible with a variety of visualization software. Web services are ideal for the NGDS data for a number of reasons, including that they preserve data ownership (the services are read-only) and that new services can be deployed to meet new requirements without modifying existing applications.
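Layers published through OGC WMS endpoints such as those described in this record can be rendered with a standard GetMap request. The sketch below builds one by hand; the endpoint URL, layer name, bounding box and image size are all illustrative assumptions, not actual NGDS values.

```python
import requests

# Hypothetical WMS endpoint and layer for a thermal-gradient-hole dataset (illustrative).
ENDPOINT = "https://example.org/geothermal/wms"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "thermal_gradient_holes",
    "STYLES": "",
    "CRS": "EPSG:4326",
    # WMS 1.3.0 with EPSG:4326 orders the bounding box as min_lat,min_lon,max_lat,max_lon.
    "BBOX": "31.0,-115.0,37.5,-108.0",
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TRANSPARENT": "TRUE",
}

resp = requests.get(ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
with open("geothermal_layer.png", "wb") as fh:
    fh.write(resp.content)
```

Because the request is read-only and entirely parameter-driven, it illustrates the point made above: the provider keeps ownership of the data while any client can portray it.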
Improving Data Catalogs with Free and Open Source Software

NASA Astrophysics Data System (ADS)

Schweitzer, R.; Hankin, S.; O'Brien, K.

2013-12-01

The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous, or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will discuss the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a "clean" catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Map Service (WMS). We will also discuss how we are utilizing free and open source software and services to crawl, analyze and build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor which describes the quality of data catalogs. Using this rubric, in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
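Crawling a THREDDS catalog of the kind described in this record amounts to walking catalog.xml documents and collecting dataset and catalogRef entries. The sketch below is a minimal, depth-limited walker and is not the UAF tool itself; the starting URL is an illustrative assumption.

```python
import requests
import xml.etree.ElementTree as ET

THREDDS_NS = "{http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0}"
XLINK_NS = "{http://www.w3.org/1999/xlink}"

def crawl(catalog_url, depth=2):
    """Print accessible datasets in a THREDDS catalog, following catalogRef links to `depth`."""
    if depth < 0:
        return
    root = ET.fromstring(requests.get(catalog_url, timeout=30).content)
    for ds in root.iter(THREDDS_NS + "dataset"):
        # Datasets with a urlPath are directly accessible; containers merely group children.
        if ds.get("urlPath"):
            print(ds.get("name"), "->", ds.get("urlPath"))
    for ref in root.iter(THREDDS_NS + "catalogRef"):
        href = ref.get(XLINK_NS + "href")
        if href:
            # Nested catalog references are usually relative to the current catalog URL.
            crawl(requests.compat.urljoin(catalog_url, href), depth - 1)

# Illustrative starting point (hypothetical server).
crawl("https://example.org/thredds/catalog.xml")
```

A cleaning tool would then test each discovered dataset for the access services it advertises (OPeNDAP, WCS, WMS) before admitting it to the "clean" catalog, which is the step the abstract emphasises.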
258. Upgraded biogas from municipal solid waste for natural gas substitution and CO2 reduction--a case study of Austria, Italy, and Spain.

    PubMed

    Starr, Katherine; Villalba, Gara; Gabarrell, Xavier

    2015-04-01

    Biogas is rich in methane and can be further purified through biogas upgrading technologies, presenting a viable alternative to natural gas. Landfills and anaerobic digestors treating municipal solid waste are a large source of such biogas. They therefore offer an attractive opportunity to tap into this potential source of natural gas while at the same time minimizing the global warming impact of methane emissions in waste management schemes (WMS) and reducing fossil fuel consumption. This study looks at the current municipal solid waste flows of Spain, Italy, and Austria over one year (2009) in order to determine how much biogas is generated. It then examines how much natural gas could be substituted by using four different biogas upgrading technologies. Based on current waste generation rates, exploratory but realistic WMS were created for each country in order to maximize biogas production and the potential for natural gas substitution. It was found that the potential substitution of natural gas by biogas resulting from the current WMS seems rather insignificant: 0.2% for Austria, 0.6% for Italy and 0.3% for Spain. However, if the WMS is redesigned to maximize biogas production, these figures can increase to 0.7% for Austria, 1% for Italy and 2% for Spain. Furthermore, the potential CO2 reduction as a consequence of capturing the biogas and replacing fossil fuel can result in up to a 93% reduction of the annual national waste greenhouse gas emissions of Spain and Italy. Copyright © 2015 Elsevier Ltd. All rights reserved.

259. Eurogrid: a new glideinWMS based portal for CDF data analysis

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.

    2012-12-01

    The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb^-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals.
    At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user and requiring a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.

260. [Evaluation of memory in acquired brain injury: a comparison between the Wechsler Memory Scale and the Rivermead Behavioural Memory Test].

    PubMed

    Guinea-Hidalgo, A; Luna-Lario, P; Tirapu-Ustárroz, J

    Learning processes and memory are frequently compromised in acquired brain injury (ABI), while at the same time such involvement is often heterogeneous and a source of deficits in other cognitive capacities and of significant functional limitations. A good neuropsychological evaluation of memory is designed to study not only the type, intensity and nature of the problems, but also the way they manifest in daily life. This study examines the correlation between a traditional memory test, the Wechsler Memory Scale-III (WMS-III), and a memory test that is considered to be functional, the Rivermead Behavioural Memory Test (RBMT), in a sample of 60 patients with ABI. All the correlations that were observed were moderate. Greater correlations were found among the verbal memory subtests than among the visual memory tests. An important number of subjects with below-normal scalar scores on the WMS-III correctly performed (either fully or partially) the corresponding test in the RBMT. The joint use of the WMS-III and RBMT in evaluation can provide a more comprehensive analysis of memory deficits and their rehabilitation. The lower scores obtained on the WMS-III compared to those of the RBMT indicate greater sensitivity of the former. Nevertheless, further testing needs to be carried out in the future to compare performance in the tests after the patients and those around them have subjectively assessed their functional limitations.
    This would make it possible to determine which of the two tests offers the better balance between sensitivity and specificity, as well as the higher predictive value.

261. Design and implementation of a laser-based absorption spectroscopy sensor for in situ monitoring of biomass gasification

    NASA Astrophysics Data System (ADS)

    Viveros Salazar, David; Goldenstein, Christopher S.; Jeffries, Jay B.; Seiser, Reinhard; Cattolica, Robert J.; Hanson, Ronald K.

    2017-12-01

    Research to demonstrate in situ laser-absorption-based sensing of H2O, CH4, CO2, and CO mole fraction is reported for the product gas line of a biomass gasifier. Spectral simulations were used to select candidate sensor wavelengths that optimize sensitive monitoring of the target species while minimizing interference from other species in the gas stream. A prototype sensor was constructed and measurements were performed in the laboratory at Stanford to validate performance. Field measurements were then demonstrated in a pilot-scale biomass gasifier at West Biofuels in Woodland, CA. The performance of the prototype sensor was compared for two sensor strategies: wavelength-scanned direct absorption (DA) and wavelength-scanned wavelength modulation spectroscopy (WMS). The lasers used had markedly different wavelength tuning responses to injection current, and modern distributed feedback (DFB) lasers with nearly linear tuning response to injection current were shown to be superior, leading to guidelines for laser selection for sensor fabrication. Non-absorption loss in the transmitted laser intensity from particulate scattering and window fouling encouraged the use of normalized WMS measurement schemes. The complications of using normalized WMS for relatively large values of absorbance, and their mitigation, are discussed. A method for reducing the adverse effects of a time-varying WMS background signal on sensor performance is also presented.
    The laser absorption sensor provided measurements with the sub-second time resolution needed for gasifier control and, more importantly, provided precise measurements of H2O in the gasification products, which can be problematic for the gas chromatography sensors typically used by industry.

262. Circulation and oxygen cycling in the Mediterranean Sea: Sensitivity to future climate change

    NASA Astrophysics Data System (ADS)

    Powley, Helen R.; Krom, Michael D.; Van Cappellen, Philippe

    2016-11-01

    Climate change is expected to increase temperatures and decrease precipitation in the Mediterranean Sea (MS) basin, causing substantial changes in the thermohaline circulation (THC) of both the Western Mediterranean Sea (WMS) and the Eastern Mediterranean Sea (EMS). The exact nature of future circulation changes remains highly uncertain, however, with forecasts varying from a weakening to a strengthening of the THC. Here we assess the sensitivity of dissolved oxygen (O2) distributions in the WMS and EMS to THC changes using a mass balance model, which represents the exchanges of O2 between surface, intermediate, and deep water reservoirs, and through the Straits of Sicily and Gibraltar. Perturbations spanning the ranges in O2 solubility, aerobic respiration kinetics, and THC changes projected for the year 2100 are imposed on the O2 model. In all scenarios tested, the entire MS remains fully oxygenated after 100 years; depending on the THC regime, average deep water O2 concentrations fall in the ranges 151-205 and 160-219 µM in the WMS and EMS, respectively. On longer timescales (>1000 years), the scenario with the largest (>74%) decline in deep water formation rate leads to deep water hypoxia in the EMS but, even then, the WMS deep water remains oxygenated. In addition, a weakening of the THC may result in a negative feedback on O2 consumption as the supply of labile dissolved organic carbon to deep water decreases. Thus, it appears unlikely that climate-driven changes in THC will cause severe O2 depletion of the deep water masses of the MS in the foreseeable future.

263. Tai Chi Chuan and Baduanjin Increase Grey Matter Volume in Older Adults: A Brain Imaging Study.

    PubMed

    Tao, Jing; Liu, Jiao; Liu, Weilin; Huang, Jia; Xue, Xiehua; Chen, Xiangli; Wu, Jinsong; Zheng, Guohua; Chen, Bai; Li, Ming; Sun, Sharon; Jorgenson, Kristen; Lang, Courtney; Hu, Kun; Chen, Shanjia; Chen, Lidian; Kong, Jian

    2017-01-01

    The aim of this study is to investigate and compare how 12 weeks of Tai Chi Chuan and Baduanjin exercise can modulate brain structure and memory function in older adults. Magnetic resonance imaging and memory function measurements (Wechsler Memory Scale-Chinese revised, WMS-CR) were applied at both the beginning and the end of the study.
    Results showed that both Tai Chi Chuan and Baduanjin could significantly increase grey matter volume (GMV) in the insula, medial temporal lobe, and putamen after 12 weeks of exercise. No significant differences in GMV were observed between the Tai Chi Chuan and Baduanjin groups. We also found that, compared to healthy controls, Tai Chi Chuan and Baduanjin significantly improved visual reproduction subscores on the WMS-CR. Baduanjin also improved mental control, recognition, touch, and comprehension memory subscores of the WMS-CR compared to the control group. Memory quotient and visual reproduction subscores were both associated with GMV increases in the putamen and hippocampus. Our results demonstrate the potential of Tai Chi Chuan and Baduanjin exercise for the prevention of memory deficits in older adults.

264. Wilderness Medical Society practice guidelines for the use of epinephrine in outdoor education and wilderness settings: 2014 update.

    PubMed

    Gaudio, Flavio G; Lemery, Jay; Johnson, David E

    2014-12-01

    The Epinephrine Roundtable took place on July 27, 2008, during the 25th Annual Meeting of the Wilderness Medical Society (WMS) in Snowmass, CO. The WMS convened this roundtable to explore areas of consensus and uncertainty in the field treatment of anaphylaxis. Panelists were selected on the basis of their relevant academic or professional experience. There is a paucity of data that address the treatment of anaphylaxis in the wilderness. Anaphylaxis is a rare disease with a sudden onset and drastic course that does not lend itself to study in randomized, controlled trials. Therefore, the panel endorsed the following position based on the limited available evidence, a review of published articles, and expert consensus. The position represents the consensus of the panelists and is endorsed by the WMS. In 2014, the authors reviewed relevant articles published since the Epinephrine Roundtable. The following is an updated version of the original guidelines published in Wilderness & Environmental Medicine 2010;21(4):185-187. Copyright © 2014 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.

265. Neuropsychological correlates of sustained attention in schizophrenia.

    PubMed

    Chen, E Y; Lam, L C; Chen, R Y; Nguyen, D G; Chan, C K; Wilkins, A J

    1997-04-11

    We employed a simple and relatively undemanding task of monotone counting for the assessment of sustained attention in schizophrenic patients. The monotone counting task has been validated neuropsychologically and is particularly sensitive to right prefrontal lesions. We compared the performance of schizophrenic patients with that of age- and education-matched controls.
    We then explored the extent to which a range of neuropsychological tasks commonly employed in schizophrenia research are related to attentional impairment as measured in this way. Monotone counting performance was found to be correlated with digit span (WAIS-R-HK), information (WAIS-R-HK), comprehension (WAIS-R-HK), logical memory (immediate recall) (Wechsler Memory Scale, WMS), and visual reproduction (WMS). Multiple regression analysis also identified visual reproduction, digit span and comprehension as significant predictors of attention performance. In contrast, logical memory (delayed recall) (WMS), similarity (WAIS-R-HK), semantic fluency, and the Wisconsin Card Sorting Test (perseverative errors) were not correlated with attention. In addition, no significant correlation between sustained attention and symptoms was found. These findings are discussed in the context of a weakly modular cognitive system in which attentional impairment may contribute selectively to a range of other cognitive deficits.

266. Wavelength modulation spectroscopy--digital detection of gas absorption harmonics based on Fourier analysis.

    PubMed

    Mei, Liang; Svanberg, Sune

    2015-03-20

    This work presents a detailed study of the theoretical aspects of the Fourier analysis method, which has been utilized for gas absorption harmonic detection in wavelength modulation spectroscopy (WMS). The lock-in detection of the harmonic signal is accomplished by studying the phase term of the inverse Fourier transform of the Fourier spectrum that corresponds to the harmonic signal. The mathematics and the corresponding simulation results are given for each procedure when applying the Fourier analysis method. The present work provides a detailed view of the WMS technique when applying the Fourier analysis method.

267. Slug to churn transition analysis using wire-mesh sensor

    NASA Astrophysics Data System (ADS)

    H. F. Velasco, P.; Ortiz-Vidal, L. E.; Rocha, D. M.; Rodriguez, O. M. H.

    2016-06-01

    A comparison between several theoretical slug-to-churn flow-pattern transition models and experimental data is performed. The flow-pattern database considers vertical upward air-water flow at standard temperature and pressure in 50 mm and 32 mm ID pipes. A brief description of the models and their phenomenology is presented. In general, the performance of the transition models is poor. We found that new experimental studies objectively describing both stable and unstable slug flow patterns are required. In this sense, the Wire Mesh Sensor (WMS) can assist in that aim.
    The potential of the WMS is outlined.

268. Lunar Mapping and Modeling On-the-Go: A mobile framework for viewing and interacting with large geospatial datasets

    NASA Astrophysics Data System (ADS)

    Chang, G.; Kim, R.; Bui, B.; Sadaqathullah, S.; Law, E.; Malhotra, S.

    2012-12-01

    The Lunar Mapping and Modeling Portal (LMMP, https://www.lmmp.nasa.gov/) is a collaboration between four NASA centers, JPL, Marshall, Goddard, and Ames, along with the USGS and the US Army, to provide a centralized geospatial repository for storing processed lunar data, from data collected during the Apollo missions to the latest data acquired by the Lunar Reconnaissance Orbiter (LRO). We offer various scientific and visualization tools to analyze rock and crater densities, lighting maps, thermal measurements, mineral concentrations, slope hazards, and digital elevation maps, with the intention of serving not only scientists and lunar mission planners but also the general public. The project has pioneered leveraging new technologies and embracing new computing paradigms to create a system that is sophisticated, secure, robust, and scalable while remaining easy to use, streamlined, and modular. We have led innovations through the use of a hybrid cloud infrastructure, authentication through various sources, and an in-house GIS framework, TWMS (Tiled WMS), as well as the commercial ArcGIS product from ESRI. On the client end, we also provide a Flash GUI framework as well as REST web services to interact with the portal. We have also developed a visualization framework on mobile devices, specifically Apple's iOS, which allows anyone from anywhere to interact with LMMP. At the most basic level, the framework allows users to browse LMMP's entire catalog of over 600 data imagery products, ranging from global basemaps to LRO Narrow Angle Camera (NAC) images that provide details of up to 0.5 meters/pixel. Users are able to view map metadata and can zoom in and out as well as pan around the entire lunar surface with the appropriate basemap. They can arbitrarily stack the maps and images on top of each other to show a layered view of the surface, with layer transparency adjusted to suit the user's desired look. Once the user has selected a combination of layers, he can also bookmark those layers for quick access in subsequent sessions. A search tool is also provided to allow users to quickly find points of interest on the moon and to view the auxiliary data associated with each feature. More advanced features include the ability to interact with the data. Using the services provided by the portal, users will be able to log in and access the same scientific analysis tools provided on the web site, including measuring between two points, generating subsets, and running other analysis tools, all through a customized touch interface that is immediately familiar to users of these smart mobile devices. Users can also access their own storage on the portal and view or send the data to other users. Finally, there are features that utilize functionality that can only be enabled by mobile devices. This includes the use of gyroscopes and motion sensors to provide a haptic interface for visualizing lunar data in 3D, on the device as well as potentially on a large screen. The mobile framework that we have developed for LMMP provides a glimpse of what is possible in visualizing and manipulating large geospatial data on small portable devices. While the framework is currently tuned to our portal, we hope to generalize the tool to use data sources from any type of GIS service.
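    Layers served through TWMS/WMS, as in the portal described above, are normally fetched with an ordinary OGC WMS GetMap request. The sketch below builds such a request with the standard WMS 1.3.0 parameters; the endpoint and layer name are placeholders, not actual LMMP identifiers.

```python
# Minimal sketch of an OGC WMS 1.3.0 GetMap request for a basemap image.
# The endpoint and layer name below are placeholders, not actual LMMP layers.
import requests

WMS_ENDPOINT = "https://example.org/wms"   # placeholder service URL
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "lunar_basemap",             # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-30,-60,30,60",               # minLat,minLon,maxLat,maxLon (1.3.0 axis order)
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

resp = requests.get(WMS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
with open("tile.png", "wb") as f:
    f.write(resp.content)                  # save the rendered map image
```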
269. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. Server-side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface for requesting geographic features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geodata, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is based solely on free and open source software and leans on the JavaScript API WebGL, which allows the interactive rendering of 2D and 3D graphics by means of GPU-accelerated physics and image processing as part of the web page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will ease spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. It will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in the decision processes (e-Governance). Yet, the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
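    A client discovers what a GeoServer instance such as the one above publishes by reading its WMS GetCapabilities document. A short sketch using the OWSLib package follows; the service URL is a placeholder rather than the actual GDI-BB endpoint, and the calls assume OWSLib's WebMapService interface behaves as sketched.

```python
# Sketch of WMS layer discovery with OWSLib (pip install OWSLib).
# The URL is a placeholder, not the actual GDI-BB / B3D endpoint.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/ows", version="1.3.0")

print(wms.identification.title)             # service title from GetCapabilities
for name, layer in wms.contents.items():    # every advertised layer
    print(name, layer.title, layer.boundingBoxWGS84)

# Image formats the server is willing to render for GetMap requests
print(wms.getOperationByName("GetMap").formatOptions)
```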
270. Effects of Delay Duration on the WMS Logical Memory Performance of Older Adults with Probable Alzheimer's Disease, Probable Vascular Dementia, and Normal Cognition.

    PubMed

    Montgomery, Valencia; Harris, Katie; Stabler, Anthony; Lu, Lisa H

    2017-05-01

    To examine how the duration of the time delay between Wechsler Memory Scale (WMS) Logical Memory I and Logical Memory II (LM) affected participants' recall performance. There are 46,146 total Logical Memory administrations to participants diagnosed with either Alzheimer's disease (AD), vascular dementia (VaD), or normal cognition in the National Alzheimer's Disease Coordinating Center's Uniform Data Set. Only 50% of the sample was administered the standard 20-35 min of delay specified by the WMS-R and WMS-III. We found a significant effect of delay duration on the proportion of information retained for the VaD group compared to its control group, which remained after adding the LM I raw score as a covariate. There was poorer retention of information with longer delays for this group. This association was not as strong for the AD and cognitively normal groups. A 24.5-min delay was most optimal for differentiating AD from VaD participants (47.7% classification accuracy), an 18.5-min delay was most optimal for differentiating AD from normal participants (51.7% classification accuracy), and a 22.5-min delay was most optimal for differentiating VaD from normal participants (52.9% classification accuracy). Considering the diagnostic implications, our findings suggest that test administration should incorporate precise tracking of delay periods. We recommend a 20-min delay with an 18-25-min range. The poor classification accuracy based on LM data alone is a reminder that story memory performance is only one piece of data contributing to complex clinical decisions. However, strict adherence to the recommended range yields optimal data for diagnostic decisions. © The Author 2017. Published by Oxford University Press. All rights reserved.
    For permissions, please e-mail: journals.permissions@oup.com.

271. Digital, phase-sensitive detection for in situ diode-laser spectroscopy under rapidly changing transmission conditions

    NASA Astrophysics Data System (ADS)

    Fernholz, T.; Teichert, H.; Ebert, V.

    A new harmonic detection scheme for fully digital, fast-scanning wavelength-modulation spectroscopy (DFS-WMS) is presented. DFS-WMS is especially suited to in situ absorption measurements in combustion environments under rapidly fluctuating transmission conditions and is demonstrated for the first time by open-path monitoring of ambient oxygen using a distributed-feedback diode laser, which is doubly modulated with a fast linear 1 kHz scan and a sinusoidal 300 kHz modulation. After an analog high-pass filter, the detector signal is digitized with a 5 megasample/s 12-bit AD-converter card plugged into a PC and subsequently - unlike standard lock-ins - filtered further by co-adding 100 scans to generate a narrowband comb filter. All further filtering and the demodulation are performed completely digitally on a PC with the help of discrete Fourier transforms (DFT). Both the 1f- and 2f-signals are simultaneously extracted from the detector signal using one ADC input channel. For the 2f-signal, a linearity of 2% and a minimum detectable absorption of 10^-4 could be verified experimentally, with the sensitivity to date being limited only by insufficient gain on the 2f-frequency channel. Using the offset in the 1f-signal as a transmission 'probe', we could show that the 2f-signal can be transmission-corrected by a simple division by the 1f-background, proving that DFS-WMS provides the possibility of compensating for transmission fluctuations. With its inherent suppression of additive noise, DFS-WMS seems well suited for quantitative in situ absorption spectroscopy in large combustion systems. This assumption is supported by first measurements of oxygen in a high-pressure combustor at 12 bar.

272. Absorption sensor for CO in combustion gases using 2.3 µm tunable diode lasers

    NASA Astrophysics Data System (ADS)

    Chao, X.; Jeffries, J. B.; Hanson, R. K.

    2009-11-01

    Tunable diode laser absorption spectroscopy of CO was studied in the controlled laboratory environments of a heated cell and a combustion exhaust rig. Two absorption lines, R(10) and R(11) in the first overtone band of CO near 2.3 µm, were selected from a HITRAN simulation to minimize interference from water vapor at a representative combustion exhaust temperature (~1200 K). The linestrengths and collision broadening coefficients for these lines were measured in a heated static cell. This database was then used in a comparative study of direct absorption and wavelength-modulation absorption.
    CO concentration measurements using scanned-wavelength direct absorption (DA) and wavelength modulation with the second-harmonic signal normalized by the first-harmonic signal (WMS-2f/1f) both agreed with those measured by a conventional gas sampling analyzer over the range from <10 ppm to 2.3%. As expected, water vapor was found to be the dominant source of background interference for CO detection in combustion flows at high temperatures. Water absorption was measured at high spectral resolution within the wavelength region 4295-4301 cm^-1 at 1100 K, and shown to produce <10 ppm-level interference for CO detection in combustion exhausts at temperatures up to 1200 K. We found that the WMS-2f/1f strategy avoids the need for WMS calibration measurements but requires characterization of the wavelength and injection-current intensity modulation of the specific diode laser. We conclude that WMS-2f/1f using the selected R(10) or R(11) transitions in the CO overtone band holds good promise for sensitive in situ detection of ppm-level CO in combustion flows, with high resistance to interference absorption from H2O.
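    For readers unfamiliar with the 1f-normalization used in the two spectroscopy records above, the following is a simplified, generic sketch of why the 2f/1f ratio is insensitive to transmission losses; the notation is illustrative and is not taken from either paper.

```latex
% Simplified, generic sketch of 1f-normalized WMS-2f (illustrative notation).
% The laser is tuned about line center \nu_0 with modulation depth a at
% frequency f, so the transmission can be expanded in harmonics:
\[
  \tau\bigl(\nu_0 + a\cos 2\pi f t\bigr)
  = \sum_{k\ge 0} H_k(\nu_0,a)\,\cos(2\pi k f t).
\]
% A slowly varying transmission factor T (particulate scattering, window
% fouling) and the mean laser intensity \bar{I}_0 multiply every detected
% harmonic in the same way,
\[
  S_{2f}\propto T\,\bar{I}_0\,H_2, \qquad
  S_{1f}\propto T\,\bar{I}_0\,(i_1 + \dotsb),
\]
% where i_1 is the laser's linear intensity-modulation amplitude. The ratio
% S_{2f}/S_{1f} therefore cancels T and \bar{I}_0 to first order, which is
% why the normalized scheme tolerates beam attenuation without recalibration.
```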
273. A phase I study of low-pressure hyperbaric oxygen therapy for blast-induced post-concussion syndrome and post-traumatic stress disorder.

    PubMed

    Harch, Paul G; Andrews, Susan R; Fogarty, Edward F; Amen, Daniel; Pezzullo, John C; Lucarini, Juliette; Aubrey, Claire; Taylor, Derek V; Staab, Paul K; Van Meter, Keith W

    2012-01-01

    This is a preliminary report on the safety and efficacy of 1.5 ATA hyperbaric oxygen therapy (HBOT) in military subjects with chronic blast-induced mild to moderate traumatic brain injury (TBI)/post-concussion syndrome (PCS) and post-traumatic stress disorder (PTSD). Sixteen military subjects received 40 1.5 ATA/60 min HBOT sessions in 30 days. Symptoms, physical and neurological exams, SPECT brain imaging, and neuropsychological and psychological testing were completed before and within 1 week after treatment. Subjects experienced reversible middle ear barotrauma (5), transient deterioration in symptoms (4), and reversible bronchospasm (1); one subject withdrew. Post-treatment testing demonstrated significant improvement in: symptoms, neurological exam, full-scale IQ (+14.8 points; p<0.001), WMS-IV Delayed Memory (p=0.026), WMS-IV Working Memory (p=0.003), Stroop Test (p<0.001), TOVA Impulsivity (p=0.041), TOVA Variability (p=0.045), Grooved Pegboard (p=0.028), PCS symptoms (Rivermead PCSQ: p=0.0002), PTSD symptoms (PCL-M: p<0.001), depression (PHQ-9: p<0.001), anxiety (GAD-7: p=0.007), quality of life (MPQoL: p=0.003), and self-report of percent of normal (p<0.001); SPECT coefficient of variation in all white matter and some gray matter ROIs after the first HBOT, and in half of white matter ROIs after 40 HBOT sessions; and SPECT statistical parametric mapping analysis (diffuse improvements in regional cerebral blood flow after 1 and 40 HBOT sessions). Forty 1.5 ATA HBOT sessions in 1 month was safe in a military cohort with chronic blast-induced PCS and PTSD. Significant improvements occurred in symptoms, abnormal physical exam findings, cognitive testing, and quality-of-life measurements, with concomitant significant improvements in SPECT.

274. Proteus - A Free and Open Source Sensor Observation Service (SOS) Client

    NASA Astrophysics Data System (ADS)

    Henriksson, J.; Satapathy, G.; Bermudez, L. E.

    2013-12-01

    The Earth's 'electronic skin' is becoming ever more sophisticated, with a growing number of sensors measuring everything from seawater salinity levels to atmospheric pressure. To further the scientific application of this data collection effort, it is important to make the data easily available to anyone who wants to use it. Making Earth Science data readily available will allow the data to be used in new and potentially groundbreaking ways. The US National Science and Technology Council made this clear in its most recent National Strategy for Civil Earth Observations report, when it remarked that Earth observations 'are often found to be useful for additional purposes not foreseen during the development of the observation system'. On the road to this goal the Open Geospatial Consortium (OGC) is defining uniform data formats and service interfaces to facilitate the discovery and access of sensor data. This is being done through the Sensor Web Enablement (SWE) stack of standards, which include the Sensor Observation Service (SOS), Sensor Model Language (SensorML), Observations & Measurements (O&M) and Catalog Service for the Web (CSW). End users do not have to use these standards directly, but can use smart tools that leverage and implement them. We have developed such a tool, named Proteus. Proteus is an open-source sensor data discovery client. The goal of Proteus is to be a general-purpose client that can be used by anyone for discovering and accessing sensor data via OGC-based services. Proteus is a desktop client and supports a straightforward workflow for finding sensor data. The workflow takes the user through the process of selecting appropriate services, bounding boxes, observed properties, time periods and other search facets. NASA World Wind is used to display the matching sensor offerings on a map. Data from any sensor offering can be previewed in a time series. The user can download data from a single sensor offering, or download data in bulk from all matching sensor offerings. Proteus leverages NASA World Wind's WMS capabilities and allows overlaying sensor offerings on top of any map. Specific search criteria (i.e. user discoveries) can be saved and later restored. Proteus supports two user types: 1) the researcher/scientist interested in discovering and downloading specific sensor data as input to research processes, and 2) the data manager responsible for maintaining sensor data services (e.g. SOSs) who wants to ensure proper data and metadata delivery, verify sensor data, and receive sensor data alerts. Proteus has a Web-based companion product named the Community Hub that is used to generate sensor data alerts. Alerts can be received via an RSS feed, viewed in a Web browser or displayed directly in Proteus via a Web-based API. To advance the vision of making Earth Science data easily discoverable and accessible to end users, professional or laymen, Proteus is available as open source on GitHub (https://github.com/intelligentautomation/proteus).
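    Clients such as Proteus ultimately issue OGC SOS requests against the services they discover. The sketch below shows a plain SOS 2.0 KVP GetObservation call; the endpoint, offering and observed property are placeholders, and the response would be Observations & Measurements XML that a real client parses.

```python
# Hedged sketch of a Sensor Observation Service (SOS) 2.0 KVP GetObservation
# request, the kind of call an SOS client issues under the hood.
# The endpoint, offering and observed property below are placeholders.
import requests

SOS_ENDPOINT = "https://example.org/sos/kvp"         # placeholder service URL
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "sea_water_temperature_offering",     # hypothetical offering id
    "observedProperty": "sea_water_temperature",      # hypothetical property name
    "temporalFilter": "om:phenomenonTime,2013-07-01T00:00:00Z/2013-07-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

resp = requests.get(SOS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])   # O&M XML; a real client would parse observations from it
```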
275. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of the general architecture of a virtual research environment (VRE) for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that implements the statistical processing functionality and thus provides analysis of large datasets, with the results visualized and exported into files of standard formats (XML, binary, etc.). Some cartographical web services have been developed in a prototype of the system to provide capabilities for working with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.

276. Waste Management System overview for future spacecraft.

    NASA Technical Reports Server (NTRS)

    Ingelfinger, A. L.; Murray, R. W.

    1973-01-01

    Waste Management Systems (WMS) for post-Apollo spacecraft will be significantly more sophisticated and earthlike in user procedures. Some of the features of the advanced WMS will be accommodation of both males and females, automatic operation, either tissue wipe or anal wash, measurement and sampling of urine, feces and vomitus for medical analysis, water recovery, and solids disposal. This paper presents an overview of the major problems of and approaches to waste management for future spacecraft.
    Some of the processes discussed are liquid/gas separation, the Dry-John, the Hydro-John, automated sampling, vapor compression distillation, vacuum distillation-catalytic oxidation, incineration, and the integration of the above into complete systems.

277. Tai Chi Chuan and Baduanjin increase grey matter volume in older adults: a brain imaging study

    PubMed Central

    Tao, Jing; Liu, Jiao; Liu, Weilin; Huang, Jia; Xue, Xiehua; Chen, Xiangli; Wu, Jinsong; Zheng, Guohua; Chen, Bai; Li, Ming; Sun, Sharon; Jorgenson, Kristen; Lang, Courtney; Hu, Kun; Chen, Shanjia; Chen, Lidian; Kong, Jian

    2017-01-01

    The aim of this study is to investigate and compare how 12 weeks of Tai Chi Chuan and Baduanjin exercise can modulate brain structure and memory function in older adults. Magnetic resonance imaging (MRI) and memory function measurements (Wechsler Memory Scale-Chinese revised, WMS-CR) were applied at both the beginning and the end of the study. Results showed that both Tai Chi Chuan and Baduanjin could significantly increase grey matter volume (GMV) in the insula, medial temporal lobe (MTL), and putamen after 12 weeks of exercise. No significant differences in GMV were observed between the Tai Chi Chuan and Baduanjin groups. We also found that, compared to healthy controls, Tai Chi Chuan and Baduanjin significantly improved visual reproduction subscores on the WMS-CR. Baduanjin also improved mental control, recognition, touch and comprehension memory subscores of the WMS-CR compared to the control group. Memory quotient (MQ) and visual reproduction subscores were both associated with GMV increases in the putamen and hippocampus. Our results demonstrate the potential of Tai Chi Chuan and Baduanjin exercise for the prevention of memory deficits in older adults. PMID: 28869478

278. Evaluating the accuracy of the Wechsler Memory Scale-Fourth Edition (WMS-IV) logical memory embedded validity index for detecting invalid test performance.

    PubMed

    Soble, Jason R; Bain, Kathleen M; Bailey, K Chase; Kirton, Joshua W; Marceaux, Janice C; Critchfield, Edan A; McCoy, Karin J M; O'Rourke, Justin J F

    2018-01-08

    Embedded performance validity tests (PVTs) allow for continuous assessment of invalid performance throughout neuropsychological test batteries. This study evaluated the utility of the Wechsler Memory Scale-Fourth Edition (WMS-IV) Logical Memory (LM) Recognition score as an embedded PVT using the Advanced Clinical Solutions (ACS) for WAIS-IV/WMS-IV Effort System. This mixed clinical sample was comprised of 97 total participants, 71 of whom were classified as valid and 26 as invalid based on three well-validated, freestanding criterion PVTs.
    Overall, the LM embedded PVT demonstrated poor concordance with the criterion PVTs and unacceptable psychometric properties using ACS validity base rates (42% sensitivity/79% specificity). Moreover, 15-39% of participants obtained an invalid ACS base rate despite having a normatively intact age-corrected LM Recognition total score. Receiver operating characteristic curve analysis revealed that a Recognition total score cutoff of <61% correct improved specificity (92%) while sensitivity remained weak (31%). Thus, results indicated the LM Recognition embedded PVT is not appropriate for use from an evidence-based perspective, and that clinicians may be faced with reconciling how a normatively intact cognitive performance on the Recognition subtest could simultaneously reflect invalid performance validity.

279. Automating Mapping Production for the Enterprise: from Contract to Delivery

    NASA Astrophysics Data System (ADS)

    Uebbing, R.; Xie, C.; Beshah, B.; Welter, J.

    2012-07-01

    The ever increasing volume and quality of geospatial data has created new challenges for mapping companies. Due to increased image resolution, the fusion of different data sources and more frequent data update requirements, mapping production is forced to streamline its work flow to meet client deadlines. But data volume alone is not the only barrier to an efficient production work flow. Processing geospatial information traditionally uses domain- and vendor-specific applications that do not interface with each other, often leading to data duplication and therefore creating sources of error. It also creates isolation between different departments within a mapping company, resulting in additional communication barriers. North West Geomatics has designed and implemented a data-centric enterprise solution for the flight acquisition and production work flow to combat the above challenges. A central data repository has been deployed at the company, containing not only geospatial data in the strictest sense, such as images, vector layers and 3D point clouds, but also other information such as product specifications, client requirements, flight acquisition data, production resource usage and much more. As there is only one instance of the database, shared throughout the whole organization, all employees, given they have been granted the appropriate permission, can view the current status of any project through a graphical and table-based interface across its life cycle from sales through flight acquisition, production and product delivery. Not only can users track the progress and status of various work flow steps, but the system also allows users and applications to actively schedule or start specific production steps such as data ingestion and triangulation, with many other steps (orthorectification, mosaicking, accounting, etc.) in the planning stages. While the complete system is exposed to users through a web interface, thereby also allowing outside customers to view their data, much of the design and development focused on work flow automation, scalability and security. Ideally, users interact with the system to retrieve a specific project status and summaries, while the work flow processes are triggered automatically by modeling their dependencies. The enterprise system is built using open source technologies (PostGIS, Hibernate, OpenLayers, GWT and others) and adheres to OGC web services for data delivery (WMS/WFS/WCS) to third-party applications.
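    The dependency-driven triggering mentioned in the record above (a production step starts automatically once the steps it depends on have completed) can be illustrated with a minimal sketch. It is not the North West Geomatics implementation, and the step names are hypothetical.

```python
# Illustrative sketch (not the actual production system) of dependency-driven
# triggering: a step becomes runnable once every step it depends on is done.
from collections import deque

# step -> set of steps it depends on (hypothetical names)
DEPENDENCIES = {
    "ingest": set(),
    "triangulation": {"ingest"},
    "orthorectification": {"triangulation"},
    "mosaicking": {"orthorectification"},
    "accounting": {"ingest"},
    "delivery": {"mosaicking", "accounting"},
}

def run_workflow(deps):
    done = set()
    queue = deque(step for step, d in deps.items() if not d)   # steps with no prerequisites
    while queue:
        step = queue.popleft()
        print(f"running {step}")                 # placeholder for the real job launcher
        done.add(step)
        for s, d in deps.items():                # release steps whose dependencies are met
            if s not in done and s not in queue and d <= done:
                queue.append(s)
    return done

run_workflow(DEPENDENCIES)
```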
280. Common Web Mapping and Mobile Device Framework for Display of NASA Real-time Data

    NASA Astrophysics Data System (ADS)

    Burks, J. E.

    2013-12-01

    Scientists have strategic goals to deliver their unique datasets and research both to collaborative partners and, more broadly, to the public. These datasets can have a significant impact locally and globally, as has been shown by the success of the NASA Short-term Prediction Research and Transition (SPoRT) Center and SERVIR programs at Marshall Space Flight Center. Each of these organizations provides near real-time data at the best resolution possible to address concerns of the operational weather forecasting community (SPoRT) and to support environmental monitoring and disaster assessment (SERVIR). However, one of the biggest struggles in delivering the data to these and other Earth science community partners is formatting the product to fit into an end user's Decision Support System (DSS). The problem of delivering the data to the end user's DSS can be a significant impediment to transitioning research to operational environments, especially for disaster response where delivery time is critical. The decision makers, in addition to the DSS, need seamless access to these same datasets from a web browser or a mobile phone for support when they are away from their DSS or for personnel out in the field. A framework has been developed for the MSFC Earth Science program that can be used to easily enable seamless delivery of scientific data to end users in multiple formats. The first format is an open geospatial format, Web Mapping Service (WMS), which is easily integrated into most DSSs. The second format is a web browser display, which can be embedded within any MSFC Science web page with just a few lines of web page coding.
The framework developed has reduced the level of effort needed to bring new and existing NASA datasets to each of these end user platforms and help extend the reach of science data.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li class="active"><span>14</span></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_14 --> <div id="page_15" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li class="active"><span>15</span></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="281"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20140006477','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20140006477"><span>Common Web Mapping and Mobile Device Framework for Display of NASA Real-time Data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Burks, Jason</p> <p>2013-01-01</p> <p>Scientists have strategic goals to deliver their unique datasets and research to both collaborative partners and more broadly to the public. These datasets can have a significant impact locally and globally as has been shown by the success of the NASA Short-term Prediction Research and Transition (SPoRT) Center and SERVIR programs at Marshall Space Flight Center. Each of these respective organizations provides near real-time data at the best resolution possible to address concerns of the operational weather forecasting community (SPoRT) and to support environmental monitoring and disaster assessment (SERVIR). However, one of the biggest struggles to delivering the data to these and other Earth science community partners is formatting the product to fit into an end user's Decision Support System (DSS). The problem of delivering the data to the end-user's DSS can be a significant impediment to transitioning research to operational environments especially for disaster response where the deliver time is critical. The decision makers, in addition to the DSS, need seamless access to these same datasets from a web browser or a mobile phone for support when they are away from their DSS or for personnel out in the field. A framework has been developed for MSFC Earth Science program that can be used to easily enable seamless delivery of scientific data to end users in multiple formats. The first format is an open geospatial format, Web Mapping Service (WMS), which is easily integrated into most DSSs. The second format is a web browser display, which can be embedded within any MSFC Science web page with just a few lines of web page coding. 
The third format is a set of native iOS and Android mobile applications that can be downloaded from an "app store". The framework has reduced the level of effort needed to bring new and existing NASA datasets to each of these end-user platforms and helps extend the reach of science data.

  282. Use of LIDAR Data in the 3D/4D Analyses of the Krakow Fortress Objects

    NASA Astrophysics Data System (ADS)

    Glowienka, Ewa; Michalowska, Krystyna; Opalinski, Piotr; Hejmanowska, Beata; Mikrut, Slawomir; Kramarczyk, Piotr

    2017-10-01

    The article presents partial results of studies within the framework of the international project "Cultural Heritage Through Time" (CHT2). The subject of the study was the forts of the Krakow Fortress, built by the Austrians between 1849 and 1914 to provide defence against the Russians. The research aimed at identifying architectural changes occurring in different time periods in selected objects of the Krakow Fortress. For the analysis, the following LIDAR (Light Detection and Ranging) data were used: Digital Terrain Models (DTM) and Digital Surface Models (DSM), as well as cartographic data: maps and orthophotomaps. All spatial data were obtained from the Polish Main Office of Geodesy and Cartography (Główny Urząd Geodezji i Kartografii, GUGIK). The majority of the cartographic data are available in the form of Web Map Services (WMS) on Geoportal (www.geoportal.gov.pl). The archival data were made available by the Historical Museum of the City of Krakow or obtained from private collections. In order to conduct a thorough analysis of objects of the Krakow Fortress, DTM and DSM data were obtained either in ASCII format or in the source *.las (LIDAR) format. On the basis of the DTM and DSM, the degree of destruction of selected fortress objects was determined, resulting from the demolition of those objects in the interwar period (1920-1939) and in the 1950s. The research was based on all available cartographic materials, both archival (plans, maps, photos) and current (topographic maps, orthophotomaps, etc.). Verification of archival maps and plans was carried out by comparing current digital images of the existing forms of fortifications with the designs developed by the Austrians. As a result, it was possible to identify the differences between the original designs and the current state of the objects concerned. The analyses also allowed verification of the siting of the forts in terms of object visibility from the enemy's side (foreground), the presence and number of "dead fields" in the foreground, and the effectiveness of concealing characteristic military forms by means of masks formed from tree rows and shrubs. Furthermore, the analyses examined the impact of erosion resulting from the natural silting of the forts' drains, as well as the processes of obliteration of the slopes, sliding of the scarps, and flooding of moats and caponieres.
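    The degree-of-destruction analysis described above reduces to raster algebra between co-registered elevation models. Purely as an illustration of that kind of computation (not the authors' workflow), the sketch below differences a hypothetical DSM and DTM for one fort to estimate how much above-ground volume survives; the file names, grid assumptions and 0.5 m threshold are placeholders.

        import numpy as np
        import rasterio

        # Hypothetical, co-registered rasters covering one fort (same grid and CRS).
        with rasterio.open("fort_dsm.tif") as src:
            dsm = src.read(1).astype(float)
            cell_area = abs(src.transform.a * src.transform.e)  # pixel area in map units squared

        with rasterio.open("fort_dtm.tif") as src:
            dtm = src.read(1).astype(float)

        # Normalized surface model: height of structures (and vegetation) above the terrain.
        nsm = np.clip(dsm - dtm, 0.0, None)

        # Crude indicator of the built volume still standing above a 0.5 m threshold.
        surviving_volume = float(nsm[nsm > 0.5].sum() * cell_area)
        print(f"Above-ground volume within the fort outline: {surviving_volume:.0f} cubic metres")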
  283. A mid-infrared laser absorption sensor for carbon monoxide and temperature measurements

    NASA Astrophysics Data System (ADS)

    Vanderover, Jeremy

    A mid-infrared (mid-IR) absorption sensor based on quantum cascade laser (QCL) technology has been developed and demonstrated for high-temperature thermometry and carbon monoxide (CO) measurements in combustion environments. The sensor probes the high-intensity fundamental CO ro-vibrational band at 4.6 µm, enabling sensitive measurement of CO and temperature at kHz acquisition rates. Because the sensor operates in the mid-IR CO fundamental band, it is several orders of magnitude more sensitive than most previously developed CO combustion sensors, which utilized absorption in the near-IR overtone bands and mature traditional telecommunications-based diode lasers. The sensor has been demonstrated and validated in both scanned-wavelength direct absorption and wavelength-modulation spectroscopy (WMS) modes in room-temperature gas cell and high-temperature shock tube experiments with known and specified gas conditions. The sensor has also been demonstrated for CO and temperature measurements in an atmospheric premixed ethylene/air McKenna burner flat flame for a range of equivalence ratios (φ = 0.7-1.4). Demonstration of the sensor in scanned-wavelength direct absorption operation was performed in a room-temperature gas cell (297 K and 0.001-1 atm), allowing validation of the line strengths and line shapes predicted by the HITRAN 2004 spectroscopic database. Application of the sensor in scanned-wavelength mode, at 1-2 kHz acquisition bandwidths, to specified high-temperature shock-heated gases (950-3400 K, 1 atm) provided validation of the sensor for measurements under the high-temperature conditions found in combustion devices. The scanned-wavelength shock tube measurements yielded temperature determinations that deviated from the reflected-shock temperatures by only ±1.2% (1-sigma deviation) and CO mole fraction determinations that deviated from the specified CO mole fraction by only ±1.5% (1-sigma deviation). These deviations are in fact smaller than the estimated uncertainties of 2.5-3% in both sensor-determined temperature and CO. Enhancement of the sensor sensitivity can be achieved through the use of wavelength-modulation spectroscopy (WMS). Under WMS operation the sensor was similarly applied to room-temperature gas cell measurements (297 K, 0.001-1 atm), which indicate that the sensor sensitivity in WMS operation is approximately an order of magnitude greater than that achieved in scanned-wavelength mode, and to high-temperature shock-heated gases (850-3400 K, 1 atm), which validate the sensor for sensitive thermometry at combustion temperatures. In WMS mode the temperature measurements show a 1-sigma deviation of ±1.9% from the reflected-shock conditions.
High-temperature CO concentration measurements require calibration to scale the measured WMS-2f peak height to a simulated WMS-2f line shape. However, using a single-point calibration for each CO-containing mixture studied resulted in fairly good agreement (1-sigma deviation of ±4.2%) between measured and simulated WMS-2f peak heights. In other words, CO mole fraction determinations (proportional to peak height) were achieved with a deviation of ±4.2% from the specified CO mole fraction. Sensor measurements made at a 1 kHz acquisition bandwidth in an atmospheric-pressure ethylene/air flat flame produced by a McKenna burner, for equivalence ratios from 0.7 to 1.4, were in excellent accord with thermocouple measurements and with chemical equilibrium predictions for CO based on the thermocouple temperatures at rich conditions. At lean conditions, sensor temperature determinations are lower than thermocouple determinations by around 150 K due to the cool flame edge, and sensor CO measurements are greater than those predicted by chemical equilibrium due to super-equilibrium CO in the cool flame edge. The CO sensor developed and described herein, validated in room-temperature cell, high-temperature shock tube, and flat-flame burner measurements, has potential for a vast array of measurements in combustion, energy, and industrial gas-sensing applications. It has unsurpassed sensitivity due to the use of the fundamental CO band at 4.6 µm and provides the kHz acquisition bandwidths necessary for high-speed measurements in these systems. This research was directed by Professor Matt Oehlschlaeger and supported by the Office of Naval Research (ONR).
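    Temperature in scanned direct-absorption measurements of this kind is commonly inferred from the ratio of integrated absorbances of two transitions with different lower-state energies (two-line thermometry). The sketch below illustrates that general relation only; the reference line strengths and lower-state energies are illustrative placeholders, not the CO transitions used in this work, and partition-function and stimulated-emission corrections are neglected.

        import numpy as np

        C2 = 1.4388  # second radiation constant hc/k in cm*K

        # Placeholder spectroscopic constants for two transitions.
        T0 = 296.0                         # reference temperature, K
        S1_T0, S2_T0 = 2.0e-20, 5.0e-21    # line strengths at T0 (illustrative values)
        E1, E2 = 806.4, 1901.3             # lower-state energies, cm^-1 (illustrative)

        def temperature_from_ratio(R):
            """Infer T from the measured ratio R = A1/A2 of integrated absorbances."""
            R0 = S1_T0 / S2_T0
            inv_T = 1.0 / T0 - np.log(R / R0) / (C2 * (E1 - E2))
            return 1.0 / inv_T

        # Example: a measured absorbance ratio of 1.8 corresponds to one temperature.
        print(f"T = {temperature_from_ratio(1.8):.0f} K")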
  284. The GridPP DIRAC project - DIRAC for non-LHC communities

    NASA Astrophysics Data System (ADS)

    Bauer, D.; Colling, D.; Currie, R.; Fayer, S.; Huffman, A.; Martyniak, J.; Rand, D.; Richards, A.

    2015-12-01

    The GridPP consortium in the UK is currently testing a multi-VO DIRAC service aimed at non-LHC VOs. These VOs (Virtual Organisations) are typically small and generally do not have a dedicated computing support post. The majority of them represent particle physics experiments (e.g. NA62 and COMET), although the scope of the DIRAC service is not limited to this field. A few VOs have designed bespoke tools around the EMI-WMS and LFC, while others have so far eschewed distributed resources because they perceive the overhead of accessing them to be too high. The aim of the GridPP DIRAC project is to provide an easily adaptable toolkit for such VOs in order to lower the threshold for access to distributed resources such as Grid and cloud computing. As well as hosting a centrally run DIRAC service, we will also publish our changes and additions to the upstream DIRAC codebase under an open-source license. We report on the current status of this project and show increasing adoption of DIRAC within the non-LHC communities.

  285. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, the system is not intended to be used for welding process control.

  286. Beebook: light field mapping app

    NASA Astrophysics Data System (ADS)

    De Donatis, Mauro; Di Pietro, Gianfranco; Rinnone, Fabio

    2014-05-01

    In the last decade, mobile systems for digital field mapping have been developed (see Wikipedia under "Digital geologic mapping"), despite the scepticism of many traditional geologists. Until now, hardware was often heavy (tablet PCs) and software sometimes difficult even for expert GIS users. The advent of light tablets and applications now makes things easier, but we are still far from a complete solution for a complex survey such as geological mapping, where information, hypotheses, data and interpretations must all be managed. Beebook, a new app for Android devices, has been developed to address this problem and allow fast and easy mapping work in the field. The main features are:
    • off-line raster management, using GeoTIFF and other raster formats;
    • on-line map visualisation (Google Maps, OSM, WMS, WFS);
    • spatial reference management and conversion using PROJ.4;
    • vector file mash-up (KML and SQLite formats);
    • editing of vector data on the map (lines, points, polygons);
    • augmented reality using the "Mixare" platform;
    • export of vector data in KML, CSV, and SQLite (SpatiaLite) formats;
    • notes: GPS or manually inserted points linked to other application files (pictures, spreadsheets, etc.);
    • forms: creation, editing and filling of customized forms;
    • GPS: status control, tracking and positioning on the map;
    • sharing: synchronization and sharing of data, forms, positions and other information among users.
    The input methods range from the digital keyboard to finger touch, voice recording and the stylus.
In particular, the most efficient way of inserting information is the stylus (or pen): field geologists are familiar with annotations and sketches. We therefore suggest the use of devices with a stylus. The main point is that Beebook is the first "transparent" mobile GIS for tablets and smartphones, deriving from previous experience with traditional mapping and with the design and development of earlier digital mapping software (MapIT, BeeGIS, Geopaparazzi). Building on those experiences, we developed a tool that is easy to use and applicable not only to geology but to any field survey.

  287. Community-Based Services that Facilitate Interoperability and Intercomparison of Precipitation Datasets from Multiple Sources

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Kempler, Steven; Teng, William; Leptoukh, Gregory; Ostrenga, Dana

    2010-01-01

    Over the past 12 years, large volumes of precipitation data have been generated from space-based observatories (e.g., TRMM), merging of data products (e.g., gridded 3B42), models (e.g., GMAO), climatologies (e.g., Chang SSM/I-derived rain indices), field campaigns, and ground-based measuring stations. The science research, applications, and education communities have greatly benefited from the unrestricted availability of these data from the Goddard Earth Sciences Data and Information Services Center (GES DISC) and, in particular, from the services tailored toward precipitation data access and usability. In addition, tools and services responsive to the expressed and evolving needs of the precipitation data user communities have been developed at the Precipitation Data and Information Services Center (PDISC) (http://disc.gsfc.nasa.gov/precipitation or google NASA PDISC), located at the GES DISC, to provide users with quick data exploration and access capabilities. In recent years, data management and access services have become increasingly sophisticated, such that they now afford researchers, particularly those interested in multi-dataset science analysis and/or data validation, the ability to homogenize data sets in order to apply multivariate, comparison, and evaluation functions. Included in these services is the ability to capture data quality and data provenance. These interoperability services can be directly applied to future data sets, such as those from the Global Precipitation Measurement (GPM) mission. This presentation describes the data sets and services at the PDISC that are currently used by precipitation science and applications researchers and that will be enhanced in preparation for GPM and associated multi-sensor data research. Specifically, the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) will be illustrated. Giovanni enables scientific exploration of Earth science data without researchers having to perform complicated data access and match-up processes. In addition, PDISC tool and service capabilities being adapted for GPM data will be described, including the Google-like Mirador data search and access engine; semantic technology to help manage large amounts of multi-sensor data and their relationships; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion to various formats (e.g., netCDF, HDF, KML for Google Earth); visualization and analysis of Level 2 data profiles and maps; parameter and spatial subsetting; time and temporal aggregation; regridding; data version control and provenance; continuous archive verification; and expertise in data-related standards and interoperability. The goal of providing these services is to further progress towards a common framework by which data analysis and validation can be more easily accomplished.
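    Several of the access services listed in the preceding record, OPeNDAP in particular, allow clients to pull analysis-ready subsets without downloading whole files. The sketch below shows that general pattern with xarray against a hypothetical OPeNDAP endpoint serving a CF-style gridded precipitation product; the URL and variable name are placeholders, not an actual GES DISC address.

        import xarray as xr

        # Hypothetical OPeNDAP endpoint for a gridded daily precipitation product.
        URL = "https://example.gov/opendap/precip/daily_precip.nc"

        ds = xr.open_dataset(URL)  # lazy open over OPeNDAP; no bulk download
        subset = ds["precipitation"].sel(
            lat=slice(-10, 10), lon=slice(30, 60),          # assumes ascending coordinates
            time=slice("2009-06-01", "2009-08-31"),
        )
        seasonal_total = subset.sum("time")                 # accumulate over the season
        seasonal_total.to_netcdf("jja_total_precip.nc")     # save the small regional result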
  288. The wire-mesh sensor as a two-phase flow meter

    NASA Astrophysics Data System (ADS)

    Shaban, H.; Tavoularis, S.

    2015-01-01

    A novel gas and liquid flow rate measurement method is proposed for use in vertical upward and downward gas-liquid pipe flows. This method is based on the analysis of the time history of area-averaged void fraction measured using a conductivity wire-mesh sensor (WMS). WMS measurements were collected in vertical upward and downward air-water flows in a pipe with an internal diameter of 32.5 mm at nearly atmospheric pressure. The relative frequencies and the power spectral density of the area-averaged void fraction were calculated and used as representative properties. Independent features, extracted from these properties using Principal Component Analysis and Independent Component Analysis, were used as inputs to artificial neural networks, which were trained to give the gas and liquid flow rates as outputs. The present method was shown to be accurate for all four encountered flow regimes and for a wide range of flow conditions. Besides providing accurate predictions for steady flows, the method was also tested successfully in three flows with transient liquid flow rates. The method was augmented by the use of the cross-correlation function of area-averaged void fraction determined from the output of a dual WMS unit as an additional representative property, which was found to improve the accuracy of flow rate prediction.
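    The processing chain in this record (area-averaged void-fraction time history, then spectral and statistical features, then a neural-network regression of the two flow rates) can be sketched compactly. The example below is only a simplified illustration with assumed array shapes and random placeholder data; the histogram bin count, PCA dimension and network size are arbitrary, not the authors' settings.

        import numpy as np
        from scipy.signal import welch
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor

        def features(void_frames, fs=1000.0, n_bins=20):
            """void_frames: (n_samples, 16, 16) local void fractions from one WMS record.
            Builds a feature vector from the area-averaged void-fraction signal."""
            alpha = void_frames.mean(axis=(1, 2))                 # area-averaged void fraction
            hist, _ = np.histogram(alpha, bins=n_bins, range=(0, 1), density=True)
            _, psd = welch(alpha - alpha.mean(), fs=fs, nperseg=256)
            return np.concatenate([hist, psd])

        # One feature row per recorded flow condition; targets are the known flow rates.
        rng = np.random.default_rng(0)
        X_raw = np.vstack([features(rng.random((4096, 16, 16))) for _ in range(40)])
        y = rng.random((40, 2))                                   # placeholder (gas, liquid) rates

        X = PCA(n_components=8).fit_transform(X_raw)              # compact, decorrelated inputs
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)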
  289. Characterizing the correlations between local phase fractions of gas-liquid two-phase flow with wire-mesh sensor

    PubMed

    Tan, C; Liu, W L; Dong, F

    2016-06-28

    Understanding of flow patterns and their transitions is significant for uncovering the flow mechanics of two-phase flow. The local phase distribution and its fluctuations contain rich information regarding the flow structures. A wire-mesh sensor (WMS) was used to study the local phase fluctuations of horizontal gas-liquid two-phase flow, which was verified by comparing the reconstructed three-dimensional flow structure with photographs taken during the experiments. Each crossing point of the WMS is treated as a node, so the measurement at each node is the phase fraction in that local area. An undirected and unweighted flow pattern network was established based on connections formed by cross-correlating the time series of each node under different flow patterns. The structure of the flow pattern network reveals the relationship of the phase fluctuations at each node during flow pattern transition, which is then quantified by introducing the topological index of the complex network. The proposed analysis method using the WMS not only provides three-dimensional visualizations of the gas-liquid two-phase flow, but also offers a thorough analysis of the structure of flow patterns and the characteristics of flow pattern transition. This article is part of the themed issue 'Supersensing through industrial process tomography'. © 2016 The Author(s).

  290. Improvement in QEPAS system utilizing a second harmonic based wavelength calibration technique

    NASA Astrophysics Data System (ADS)

    Zhang, Qinduan; Chang, Jun; Wang, Fupeng; Wang, Zongliang; Xie, Yulei; Gong, Weihua

    2018-05-01

    A simple laser wavelength calibration technique, based on the second harmonic signal, is demonstrated in this paper to improve the performance of a quartz-enhanced photoacoustic spectroscopy (QEPAS) gas sensing system, e.g. its signal-to-noise ratio (SNR), detection limit and long-term stability. A constant current corresponding to the gas absorption line, combined with a sinusoidal signal at frequency f/2, is used to drive the laser (constant driving mode), and a software-based real-time wavelength calibration technique is developed to eliminate wavelength drift due to ambient fluctuations. Compared to conventional wavelength modulation spectroscopy (WMS), this method allows a lower filtering bandwidth and an averaging algorithm to be applied to the QEPAS system, improving the SNR and detection limit. In addition, the real-time wavelength calibration technique guarantees that the laser output is modulated steadily at the gas absorption line. Water vapor is chosen as the target gas to evaluate the system's performance against the constant driving mode and the conventional WMS system. The water vapor sensor was made insensitive to incoherent external acoustic noise by the numerical averaging technique. As a result, the SNR increases 12.87-fold in the wavelength-calibration-based system compared to the conventional WMS system.
The new system achieved a better linear response (R² = 0.9995) over the concentration range from 300 to 2000 ppmv and a minimum detection limit (MDL) of 630 ppbv.

  291. Portuguese version of Wechsler Memory Scale-3rd edition's utility with demented elderly adults

    PubMed

    Gonçalves, Cátia; Pinho, Maria S; Cruz, Vítor; Gens, Helena; Oliveira, Fátima; Pais, Joana; Rente, José; Santana, Isabel; Santos, José M

    2017-01-01

    The purpose of this study is to analyze the utility of the Portuguese version of the Wechsler Memory Scale-3rd edition (WMS-III) with demented elderly people, namely its capacity to detect and discriminate between subcortical vascular dementia (SVD) and Alzheimer's disease (AD). We assessed patients with early dementia (SVD = 16; AD = 36), aged 65 or older, who were compared to a control group (n = 40). Both clinical groups were adequately matched in terms of disease severity, overall cognitive functioning, depressive symptomatology, and pre-morbid intelligence. Between-group differences were evaluated using Quade's rank analysis of covariance. We also computed optimal cut-off scores for the indexes and subtests, and the corresponding sensitivity, specificity, and positive and negative predictive values, which were able to successfully discriminate between patients and healthy subjects. The SVD patients had a better overall memory performance than the AD patients on the majority of the indexes and the delayed-condition subtests of the WMS-III. The AD patients only showed a better performance on the digit span subtest. Several measures discriminated patients from healthy subjects. This study offers some recommendations for the diagnostic accuracy of the Portuguese version of the WMS-III in dementia and for the differential diagnosis between SVD and AD.

  292. Characterizing the correlations between local phase fractions of gas-liquid two-phase flow with wire-mesh sensor

    PubMed Central

    Liu, W. L.; Dong, F.

    2016-01-01

    Understanding of flow patterns and their transitions is significant for uncovering the flow mechanics of two-phase flow. The local phase distribution and its fluctuations contain rich information regarding the flow structures. A wire-mesh sensor (WMS) was used to study the local phase fluctuations of horizontal gas-liquid two-phase flow, which was verified by comparing the reconstructed three-dimensional flow structure with photographs taken during the experiments. Each crossing point of the WMS is treated as a node, so the measurement at each node is the phase fraction in that local area. An undirected and unweighted flow pattern network was established based on connections formed by cross-correlating the time series of each node under different flow patterns.
The structure of the flow pattern network reveals the relationship of the phase fluctuations at each node during flow pattern transition, which is then quantified by introducing the topological index of the complex network. The proposed analysis method using the WMS not only provides three-dimensional visualizations of the gas-liquid two-phase flow, but also offers a thorough analysis of the structure of flow patterns and the characteristics of flow pattern transition. This article is part of the themed issue 'Supersensing through industrial process tomography'. PMID:27185959

  293. Design for Connecting Spatial Data Infrastructures with Sensor Web (SENSDI)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research aims to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web, the "Geospatial Web" or "Geosemantic Web", by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata on one side and, on the other, the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services. In conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  294. Developmental Acclimation of Drosophila suzukii (Diptera: Drosophilidae) and Its Effect on Diapause and Winter Stress Tolerance

    PubMed

    Wallingford, Anna K; Loeb, Gregory M

    2016-08-01

    We investigated the influence of developmental conditions on adult morphology, reproductive arrest, and winter stress tolerance of the invasive pest of small fruit, Drosophila suzukii (Matsumura) (Diptera: Drosophilidae).
Cooler rearing temperatures (15 °C) resulted in larger, darker "winter morph" (WM) adults than the "summer morph" flies reared at optimal temperatures (25 °C). Abdominal pigmentation scores and body size measurements of laboratory-reared WMs were similar to those of D. suzukii females captured in late autumn in Geneva, NY. We evaluated reproductive diapause and cold hardiness in live-captured D. suzukii WMs as well as in WMs reared in the laboratory from egg to adult under four developmental conditions: static cool temperatures (SWM; 15 °C, 12:12 h L:D), fluctuating temperatures (FWM; 20 °C L : 10 °C D, 12:12 h L:D), and static cool temperatures (15 °C, 12:12 h L:D) followed by post-eclosion chilling (CWM; 10 °C) under short-day (SD; 12:12 h L:D) or long-day (LD; 16:8 h L:D) photoperiods. Live-captured D. suzukii WMs and CWMs had longer preoviposition times than newly eclosed summer morph adults, indicating a reproductive diapause that was not observed in SWMs or FWMs. Additionally, recovery after acute freeze stress was not different between CWM-SD females and live-captured WM females. More 7-day-old CWMs survived 0, -1, or -3 °C freeze stress than summer morph adults, and more CWM-SD adults survived -3 °C freeze stress than CWM-LD adults. Survival after -3 °C freeze stress was significantly higher in diapausing CWMs than in non-diapausing SWMs and FWMs. © The Authors 2016. Published by Oxford University Press on behalf of the Entomological Society of America. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  295. Astronomical Site Survey for Mountain Wumingshan Area in Western Sichuan Based on GIS

    NASA Astrophysics Data System (ADS)

    Wu, N.; Liu, Y.; Zhao, H. M.

    2016-11-01

    In the Western-China Astronomical Site Survey project, we utilize a Geographic Information System (GIS) for the collection of long-term data, in order to investigate the Wumingshan (WMS) mountain and its surrounding areas in terms of geography, geology, climate, meteorology, and social and demographic trends. Data analysis shows that the WMS mountain is located in the eastern fold belt of the Tibet Plateau, a typical region of the Hengduan Mountains, which accounts for its high elevation, gently trending ridge, and stable geological structure. The highest altitude at the WMS is more than 5000 m above sea level, but there are population settlements nearby at altitudes of only 2000-3000 m, which are important for keeping logistics costs low for a future headquarters or logistics base. Earthquakes, landslides and other geological disasters have rarely been recorded.
Other favourable factors include the dry and clean atmosphere, the sparse vegetation, the semi-arid land, the perennially prevailing southwest wind, the rain-less winter and the relatively short summer rainy season, the location in the heartland of the greater Shangri-La region, the absence of recorded dust storms and other inclement weather, low cloud coverage, the stability of the wind direction, the low wind speed, the high probability of clear sky, the large distance from the developed areas of Sichuan and Yunnan provinces and the Tibet Autonomous Region, the sparse population, the slowly developing economy, and the peaceful and stable social environment. In particular, in recent years, with the development of local tourism, transport conditions in Daocheng have improved significantly. With high-quality highway maintenance and daily air transport capacity, land and air transportation is rarely interrupted by snow, which is common in high-plateau regions. Therefore, the WMS area possesses the conditions needed to establish a future high-altitude observatory and is a very rare astronomical site resource.

  296. Quantitative ecological risk assessment of inhabitants exposed to polycyclic aromatic hydrocarbons in terrestrial soils of King George Island, Antarctica

    NASA Astrophysics Data System (ADS)

    Pongpiachan, S.; Hattayanone, M.; Pinyakong, O.; Viyakarn, V.; Chavanich, S. A.; Bo, C.; Khumsup, C.; Kittikoon, I.; Hirunyatrakul, P.

    2017-03-01

    This study conducts a quantitative ecological risk assessment of human exposure to polycyclic aromatic hydrocarbons (PAHs) in terrestrial soils of King George Island, Antarctica. Generally, the average PAH concentrations detected in King George Terrestrial Soils (KGS) were appreciably lower than those of World Marine Sediments (WMS) and World Terrestrial Soils (WTS), highlighting the fact that Antarctica is one of the most pristine continents in the world. The total concentrations of twelve probably carcinogenic PAHs (ΣPAHs: the sum of Phe, An, Fluo, Pyr, B[a]A, Chry, B[b]F, B[k]F, B[a]P, Ind, D[a,h]A and B[g,h,i]P) were 3.21 ± 1.62 ng g⁻¹, 5749 ± 4576 ng g⁻¹, and 257,496 ± 291,268 ng g⁻¹ for KGS, WMS and WTS, respectively. Although KGS has extremely low ΣPAHs in comparison with the others, the percentage contribution of Phe is exceedingly high, at 50%. Assuming that incidental ingestion and dermal contact are the two major exposure pathways responsible for adverse human health effects, the cancer and non-cancer risks from environmental exposure to PAHs were evaluated based on the "Role of the Baseline Risk Assessment in Superfund Remedy Selection Decisions" memorandum provided by the US EPA. The logarithms of the cancer risk levels of PAH contents in KGS varied from -11.1 to -7.18 with an average of -7.96 ± 7.73, which is 1790 times and 80,176 times lower than those of WMS and WTS, respectively. All cancer risk levels of PAH concentrations observed in KGS are significantly (p < 0.001) lower than those of WMS and WTS.
Despite the Comandante Ferraz Antarctic Station fire that occurred on February 25th, 2012, both the cancer and non-cancer risks of environmental exposure to PAHs were found to be at an "acceptable level".

  297. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data and limit data users' ability to validate and reuse data for science and application purposes. Geospatial Web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as the Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services in search catalogues and other spatial data tools publicizes the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download, and Google Earth visualizations of the data set are provided through SDAT as well. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a roughly ten-fold increase in downloads through the OGC Web service in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
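    The WCS interface used alongside WMS in this record returns the underlying data values rather than a rendered picture, which is what makes the downloads analysis-ready. Purely as an illustration, a minimal WCS 1.0.0 GetCoverage request in key-value form might look like the sketch below; the endpoint and coverage identifier are hypothetical placeholders.

        import requests

        # Hypothetical WCS 1.0.0 endpoint and coverage identifier.
        WCS_URL = "https://example.org/wcs"

        params = {
            "SERVICE": "WCS",
            "VERSION": "1.0.0",
            "REQUEST": "GetCoverage",
            "COVERAGE": "npp_1km_annual",      # hypothetical coverage name
            "CRS": "EPSG:4326",
            "BBOX": "-95.0,25.0,-80.0,35.0",   # minLon,minLat,maxLon,maxLat
            "WIDTH": "600",
            "HEIGHT": "400",
            "FORMAT": "GeoTIFF",
        }

        resp = requests.get(WCS_URL, params=params, timeout=60)
        resp.raise_for_status()
        with open("coverage.tif", "wb") as fh:
            fh.write(resp.content)  # actual data values, ready for analysis in a GIS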
  298. GeoNetwork powered GI-cat: a geoportal hybrid solution

    NASA Astrophysics Data System (ADS)

    Baldini, Alessio; Boldrini, Enrico; Santoro, Mattia; Mazzetti, Paolo

    2010-05-01

    In setting up a Spatial Data Infrastructure (SDI), the creation of a system for metadata management and discovery plays a fundamental role. An effective solution is the use of a geoportal (e.g. the FAO/ESA geoportal), which has the important benefit of being accessible from a web browser. In this work we present a solution based on integrating two of the available frameworks: GeoNetwork and GI-cat. GeoNetwork is open-source software designed to improve the accessibility of a wide variety of data, together with the associated ancillary information (metadata), at different scales and from multidisciplinary sources; data are organized and documented in a standard and consistent way. GeoNetwork implements both the Portal and Catalog components of a Spatial Data Infrastructure (SDI) as defined in the OGC Reference Architecture. It provides tools for managing and publishing metadata on spatial data and related services. GeoNetwork allows harvesting of various types of web data sources, e.g. OGC Web Services (CSW, WCS, WMS). GI-cat is a distributed catalog based on a service-oriented framework of modular components and can be customized and tailored to support different deployment scenarios. It can federate a multiplicity of catalog services, as well as inventory and access services, in order to discover and access heterogeneous ESS resources. The federated resources are exposed by GI-cat through several standard catalog interfaces (e.g. OGC CSW AP ISO, OpenSearch, etc.) and through the GI-cat extended interface. Specific components implement mediation services for interfacing heterogeneous service providers, each of which exposes a specific standard specification; such components are called accessors. These mediating components resolve the providers' data model multiplicity by mapping them onto the GI-cat internal data model, which implements the ISO 19115 Core profile. Accessors also implement the query protocol mapping: they translate the query requests expressed according to the interface protocols exposed by GI-cat into the multiple query dialects spoken by the resource service providers. Currently, a number of well-accepted catalog and inventory services are supported, including several OGC Web Services, the THREDDS Data Server, the SeaDataNet Common Data Index, GBIF and OpenSearch engines. A GeoNetwork-powered GI-cat has been developed in order to exploit the best of the two frameworks. The new system uses a modified version of the GeoNetwork web interface, adding the capability of querying the specified GI-cat catalog and not only the internal GeoNetwork database. The resulting system is a geoportal in which GI-cat plays the role of the search engine. This new system allows the query to be distributed over the different types of data sources linked to a GI-cat. The metadata results of the query are then visualized by the GeoNetwork web interface. This configuration was tested in the framework of GIIDA, a project of the Italian National Research Council (CNR) focused on data accessibility and interoperability. A second advantage of this solution is obtained by setting up a GeoNetwork catalog among the accessors of the GI-cat instance. Such a configuration in turn allows GI-cat to run the query against the internal GeoNetwork database. This makes both the harvesting and metadata-editing functionalities provided by GeoNetwork and the distributed search functionality of GI-cat available in a consistent way through the same web interface.
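    Clients typically discover records in catalogues such as GeoNetwork or GI-cat through the OGC CSW interface mentioned above. A minimal discovery query using the OWSLib library is sketched below; the catalogue URL is a placeholder and the full-text filter on "WMS" is only an example constraint.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        # Hypothetical CSW endpoint exposed by a GeoNetwork or GI-cat instance.
        csw = CatalogueServiceWeb("https://example.org/csw")

        # Full-text search for metadata records mentioning WMS.
        query = PropertyIsLike("csw:AnyText", "%WMS%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, rec in csw.records.items():
            print(rec_id, "-", rec.title)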
  299. ArcGIS Framework for Scientific Data Analysis and Serving

    NASA Astrophysics Data System (ADS)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

    ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volumes. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g. rasters or images) as a hypercube and enables ArcGIS to handle large-volume and near-real-time scientific data. Built on top of the geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows a slice or slices of the hypercube to be accessed through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes, and the parallel geoprocessing engine makes data ingestion fast and easy. A raster function, the definition of a raster-processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation on variables ingested into the MDMD: for example, aggregating monthly averages from daily data, computing the total rainfall of a year, calculating a heat index from forecast data, and identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis; the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation focuses on the MDMD model and raster processing templates. In addition, MODIS land cover, the NDFD weather service, and the HYCOM ocean model are used to illustrate how the ArcGIS platform and the MDMD model can facilitate scientific data visualization and analytics, and how the analysis results can be shared with a wider audience through ArcGIS Online and Portal.
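    The "aggregate monthly averages from daily data" operation named above is the kind of hypercube slicing and aggregation the MDMD is designed to serve. Outside ArcGIS, the same idea can be sketched with xarray on a CF-style NetCDF cube; the file and variable names below are hypothetical.

        import xarray as xr

        # Hypothetical daily cube with dimensions (time, lat, lon).
        ds = xr.open_dataset("daily_sst.nc")
        sst = ds["sst"]

        # Slice the hypercube: one spatial window, one year of daily fields.
        window = sst.sel(
            lat=slice(20, 40), lon=slice(-100, -70),        # assumes ascending coordinates
            time=slice("2014-01-01", "2014-12-31"),
        )

        # Aggregate along the time dimension: daily values become monthly means.
        monthly_mean = window.groupby("time.month").mean("time")
        monthly_mean.to_netcdf("monthly_mean_sst.nc")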
  300. [Event-related potentials P₃₀₀ with memory function and psychopathology in first-episode paranoid schizophrenia]

    PubMed

    Liu, Wei-bo; Chen, Qiao-zhen; Yin, Hou-min; Zheng, Lei-lei; Yu, Shao-hua; Chen, Yi-ping; Li, Hui-chun

    2011-11-01

    The study investigated the variability of the event-related potential P₃₀₀ and its relationship with memory function and psychopathology in patients with first-episode paranoid schizophrenia. Thirty patients with first-episode paranoid schizophrenia (patient group) and twenty healthy subjects (control group) were enrolled in the study. The auditory event-related potential P₃₀₀ at the scalp electrodes Cz and Pz and the Wechsler Memory Scale (WMS) were examined in both groups; the Positive And Negative Syndrome Scale (PANSS) was evaluated in the patient group. In comparison with the control group, patients had longer P₃₀₀ latencies [(390.6 ± 47.6) ms at Cz and (393.3 ± 50.1) ms at Pz] (P<0.01) and lower P₃₀₀ amplitudes [(7.7 ± 3.4) μV at Cz and (8.5 ± 3.9) μV at Pz] (P<0.05-0.01). The memory quotient (88.1 ± 10.0), short-term memory and immediate memory scores in the patient group were significantly impaired (P<0.05-0.01). In the patient group, P₃₀₀ latency correlated positively with PANSS scores and negatively with WMS scores (P<0.05-0.01). First-episode paranoid schizophrenia involves a memory deficit, which can be evaluated comprehensively by P₃₀₀ and the WMS.
The longer latency of P₃₀₀ might be associated with increased severity of first-episode paranoid schizophrenia.

  301. Cognitive impairment in neuromyelitis optica spectrum disorders: A comparison of the Wechsler Adult Intelligence Scale-III and the Wechsler Memory Scale Revised with the Rao Brief Repeatable Neuropsychological Battery

    PubMed

    Fujimori, Juichi; Nakashima, Ichiro; Baba, Toru; Meguro, Yuko; Ogawa, Ryo; Fujihara, Kazuo

    2017-12-01

    Approximately 55% of patients with neuromyelitis optica spectrum disorder (NMOSD) show cognitive impairment as evaluated using the Rao Brief Repeatable Neuropsychological Battery (BRBN), but this frequency appears to be higher than the frequency of specific brain lesions in NMOSD. We studied whether cognitive impairment could be observed in NMOSD patients with no or minor non-specific brain lesions. We evaluated cognitive function in 12 NMOSD and 14 MS patients using the Wechsler Adult Intelligence Scale-III (WAIS-III), the Wechsler Memory Scale-Revised (WMS-R), and the BRBN. Patients whose scores were 2 or more standard deviations below the average in 2 or more cognitive domains were judged to be cognitively impaired. Cognitive impairment was observed in 5 MS patients (35.7%) and in the only NMOSD patient (8.3%) with symptomatic brain lesions, but not in the other NMOSD patients, who had no or minor non-specific brain lesions. Meanwhile, 5 NMOSD (41.7%) and 4 MS (28.6%) patients who had normal cognition according to the WAIS-III and WMS-R were assessed as cognitively impaired by the BRBN (which is not standardized for age).
Cognitive function in NMOSD patients with no or mild non-specific brain lesions was preserved according to the WAIS-III and WMS-R.

  302. High-sensitivity in situ QCLAS-based ammonia concentration sensor for high-temperature applications

    NASA Astrophysics Data System (ADS)

    Peng, W. Y.; Sur, R.; Strand, C. L.; Spearrin, R. M.; Jeffries, J. B.; Hanson, R. K.

    2016-07-01

    A novel quantum cascade laser (QCL) absorption sensor is presented for high-sensitivity in situ measurements of ammonia (NH₃) in high-temperature environments, using scanned wavelength modulation spectroscopy (WMS) with first-harmonic-normalized second-harmonic detection (scanned WMS-2f/1f) to neutralize the effect of non-absorption losses in the harsh environment. The sensor utilized the sQ(9,9) transition of the fundamental symmetric stretch band of NH₃ at 10.39 µm and was sinusoidally modulated at 10 kHz and scanned across the peak of the absorption feature at 50 Hz, leading to a detection bandwidth of 100 Hz. A novel technique was used to select an optimal WMS modulation depth parameter that reduced the sensor's sensitivity to spectral interference from H₂O and CO₂ without significantly sacrificing signal-to-noise ratio. The sensor performance was validated by measuring known concentrations of NH₃ in a flowing gas cell. The sensor was then demonstrated in a laboratory-scale methane-air burner seeded with NH₃, achieving a detection limit of 2.8 ± 0.26 ppm NH₃ by mole at a path length of 179 cm, an equivalence ratio of 0.6, a pressure of 1 atm, and temperatures of up to 600 K.
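    Scanned WMS-2f/1f detection, as used in this record, recovers harmonics of the modulation frequency with a digital lock-in and normalizes the 2f magnitude by the 1f magnitude so that slowly varying intensity losses cancel. A generic numerical sketch of that demodulation step is shown below; the synthetic trace is only a stand-in for a measured detector signal, and the amplitudes are arbitrary.

        import numpy as np

        fs, fm = 2.0e6, 10.0e3            # sample rate and modulation frequency, Hz
        t = np.arange(int(fs / 50)) / fs  # one 50 Hz scan period of samples

        # Synthetic detector trace containing components at fm and 2*fm
        # (a real measurement would replace this array).
        signal = (1.0 + 0.30 * np.cos(2 * np.pi * fm * t)
                      + 0.02 * np.cos(4 * np.pi * fm * t + 0.4))

        def lockin(x, f):
            """Magnitude of the component of x at frequency f (digital lock-in)."""
            ref_i = np.cos(2 * np.pi * f * t)
            ref_q = np.sin(2 * np.pi * f * t)
            return 2.0 * np.hypot((x * ref_i).mean(), (x * ref_q).mean())

        s1f = lockin(signal, fm)          # first-harmonic magnitude
        s2f = lockin(signal, 2 * fm)      # second-harmonic magnitude
        print(f"WMS-2f/1f = {s2f / s1f:.4f}")  # about 0.02 / 0.30 for this synthetic trace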
  303. An approach to drought data web-dissemination

    NASA Astrophysics Data System (ADS)

    Angeluccetti, Irene; Perez, Francesca; Balbo, Simone; Cámaro, Walther; Boccardo, Piero

    2017-04-01

    Drought data dissemination has always been a challenge for the scientific community. Firstly, a variety of widely known datasets is currently being used to describe different aspects of this same phenomenon. Secondly, new indexes are constantly being produced by scientists trying to better capture drought events. The present work describes how the drought monitoring communication issue was addressed by the ITHACA team. The ITHACA drought monitoring system makes use of two indicators: the Standardized Precipitation Index (SPI) and the Seasonal Small Integral Deviation (SSID). The first is obtained from the 3-month cumulative rainfall derived from the TRMM dataset; the second is the percent deviation from the historical average of the integral of the NDVI function describing the vegetation season. The SPI and the SSID are gridded at 30 km and 5 km, respectively. The whole time series of these two indicators (from the year 2000 onwards), covering all of Africa, is published on a WebGIS platform (http://drought.ithacaweb.org). Although the SPI has been used for decades in different contexts and needs little explanation for an audience with a scientific background, the WebGIS platform provides a guide for its correct interpretation. Since the SSID is not commonly used in the field of vegetation analysis, the guide shown on the WebGIS platform is essential for visitors to understand the data. Recently, a new index has been created in order to synthesize, for a non-expert audience, the information provided by the indicators. It is aggregated by second-order administrative level and is calculated as follows: (i) a meteorological drought warning is issued when a negative SPI and no vegetative season are detected (a blue palette is used); (ii) a warning value is assigned if the SSID, the SPI, or both are negative (an amber-to-brown palette is used): where the vegetative season is ongoing and the SSID is negative, a negative SPI entails an agricultural drought warning, while a positive SPI implies a vegetation stress warning; (iii) a meteorological drought warning is issued when a negative SPI is detected during the vegetation season but vegetation stress effects are not (i.e. the SSID is positive). The latest available Drought Warning Index is also published on the WebGIS platform. The index is stored in a database table: a single value is calculated for each administrative level. A table view on the database contains fields describing the geometry of the administrative-level polygons and the respective index; this table view is published as a WMS service by associating the symbology previously described. The WMS service is then captured in order to generate a live map with a series of basic WebGIS functionalities. The integrated index is undoubtedly useful for a non-expert user to understand immediately whether a particular region is subject to drought stress. However, the simplification introduces uncertainty, as it implies several assumptions that could not be verified at a continental scale.
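    The per-district warning rules quoted in this record amount to a small decision table on the signs of the SPI and SSID and on whether a vegetative season is ongoing. The sketch below is one hedged reading of that logic; thresholds at exactly zero and the class labels are illustrative simplifications of the published index.

        def drought_warning(spi, ssid, season_ongoing):
            """Classify one administrative unit from its SPI, SSID and season flag.
            Simplified reading of the ITHACA rules; labels are illustrative."""
            if not season_ongoing:
                return "meteorological drought" if spi < 0 else "no warning"
            if ssid < 0:
                # Vegetation season ongoing and vegetation below its historical integral.
                return "agricultural drought" if spi < 0 else "vegetation stress"
            # Vegetation looks normal: only a rainfall deficit can trigger a warning.
            return "meteorological drought" if spi < 0 else "no warning"

        # Example: rainy-season district with below-normal rainfall but healthy vegetation.
        print(drought_warning(spi=-1.2, ssid=0.05, season_ongoing=True))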
  Home culture, science, school and science learning: is reconciliation possible?

    NASA Astrophysics Data System (ADS)

    Tan, Aik-Ling

    2011-09-01

    In response to Meyer and Crawford's article on how nature of science and authentic science inquiry strategies can be used to support the learning of science for underrepresented students, I explore the possibility of reconciliation between the cultures of school, science, school science, as well as home. Such reconciliation is only possible when science teachers are cognizant of the factors affecting the cultural values and belief systems of underrepresented students. Using my experience as an Asian learner of WMS, I suggest that open and honest dialogues in science classrooms will allow for greater clarity about the ideals that WMS professes and the cultural beliefs of underrepresented students. This in-depth understanding will eliminate guesswork and unrealistic expectations and, in the process, promote tolerance and acceptance of diversity in ways of knowing.

  Wilderness Medical Society practice guidelines for the prevention and treatment of lightning injuries: 2014 update.

    PubMed

    Davis, Chris; Engeln, Anna; Johnson, Eric L; McIntosh, Scott E; Zafren, Ken; Islas, Arthur A; McStay, Christopher; Smith, William R; Cushing, Tracy

    2014-12-01

    To provide guidance to clinicians about best practices, the Wilderness Medical Society (WMS) convened an expert panel to develop evidence-based guidelines for the treatment and prevention of lightning injuries. These guidelines include a review of the epidemiology of lightning and recommendations for the prevention of lightning strikes, along with treatment recommendations organized by organ system. Recommendations are graded on the basis of the quality of supporting evidence according to criteria put forth by the American College of Chest Physicians. This is an updated version of the original WMS Practice Guidelines for Prevention and Treatment of Lightning Injuries published in Wilderness & Environmental Medicine 2012;23(3):260-269. Copyright © 2014 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.
  The Infusion of Dust Model Outputs into Public Health Decision Making - an Examination of Differential Adoption of SOAP and Open Geospatial Consortium Service Products into Public Health Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.

    2008-12-01

    Since 2004 the Earth Data Analysis Center, in collaboration with researchers at the University of Arizona and George Mason University, with funding from NASA, has been developing a services oriented architecture (SOA) that acquires remote sensing, meteorological forecast, and observed ground level particulate data (EPA AirNow) from NASA, NOAA, and DataFed through a variety of standards-based service interfaces. These acquired data are used to initialize and set boundary conditions for the execution of the Dust Regional Atmospheric Model (DREAM) to generate daily 48-hour dust forecasts, which are then published via a combination of Open Geospatial Consortium (OGC) services (WMS and WCS), basic HTTP request-based services, and SOAP services. The goal of this work has been to develop services that can be integrated into existing public health decision support systems (DSS) to provide enhanced environmental data (i.e. ground surface particulate concentration estimates) for use in epidemiological analysis, public health warning systems, and syndromic surveillance systems. While the project has succeeded in deploying these products into the target systems, there has been differential adoption of the different service interface products, with the simple OGC and HTTP interfaces generating much greater interest among DSS developers and researchers than the more complex SOAP service interfaces. This paper reviews the SOA developed as part of this project and provides insights into how different service models may have a significant impact on the infusion of Earth science products into decision making processes and systems.

  Research and Practice of the News Map Compilation Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper investigates the news map compilation service: it conducts demand research on the service, designs and compiles a public-authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps that are timely, highly targeted and cross-regional in character, constructs a hot-news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, the paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.
  Eutrophication and contaminant data management for EU marine policies: the EMODnet Chemistry infrastructure.

    NASA Astrophysics Data System (ADS)

    Vinci, Matteo; Lipizer, Marina; Giorgetti, Alessandra

    2016-04-01

    The European Marine Observation and Data Network (EMODnet) initiative has the following purposes: to assemble marine metadata, data and products, to make these fragmented resources more easily available to public and private users, and to provide quality-assured, standardised and harmonised marine data. EMODnet Chemistry was launched by DG MARE in 2009 to support the Marine Strategy Framework Directive (MSFD) requirements for the assessment of eutrophication and contaminants, following INSPIRE Directive rules. The aim is twofold: the first task is to make available and reusable the large amount of fragmented and inaccessible data hosted by European research institutes and environmental agencies. The second objective is to develop visualization services useful for the tasks of the MSFD. The technical set-up is based on the principle of adopting and adapting the SeaDataNet infrastructure for ocean and marine data, which is managed by National Oceanographic Data Centres and relies on a distributed network of data centres. Data centres contribute to data harvesting and enrichment with the relevant metadata. Data are processed into interoperable formats (using agreed standards such as ISO XML and ODV) with the use of common vocabularies and standardized quality control procedures. Data quality control is a key issue when merging heterogeneous data coming from different sources, and a data validation loop has been agreed within the EMODnet Chemistry community and is routinely performed. After data quality control done by the regional coordinators of the EU marine basins (Atlantic, Baltic, North, Mediterranean and Black Sea), validated regional datasets are used to develop data products useful for the requirements of the MSFD. EMODnet Chemistry provides interpolated seasonal maps of nutrients and services for the visualization of time series and profiles of several chemical parameters. All visualization services are developed following OGC standards such as WMS and WPS. In order to test new strategies for data storage and reanalysis and to upgrade the infrastructure performance, EMODnet Chemistry has chosen the Cloud environment offered by Cineca (the consortium of Italian universities and research institutes), where both regional aggregated datasets and analysis and visualization services are hosted. Finally, beside the delivery of data and the visualization products, the results of the data harvesting provide a useful tool to identify data gaps where future monitoring efforts should be focused.
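    Because the EMODnet Chemistry viewing services are exposed as standard WMS layers, any client can fetch a rendered map with a plain GetMap request. The sketch below builds such a request with the Python standard library; the endpoint URL and layer name are placeholders, not the actual EMODnet service.

    from urllib.parse import urlencode

    # Hypothetical endpoint and layer name; substitute a real WMS 1.3.0 service.
    WMS_ENDPOINT = "https://example.org/wms"

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "nutrients_interpolated",   # placeholder layer name
        "STYLES": "",
        "CRS": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 orders the bounding box as minLat,minLon,maxLat,maxLon.
        "BBOX": "30,-10,46,37",
        "WIDTH": "800",
        "HEIGHT": "400",
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }

    url = WMS_ENDPOINT + "?" + urlencode(params)
    print(url)
    # Fetching `url` from a live server returns a PNG map tile ready for display.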
  An open source Java web application to build self-contained Web GIS sites

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.

    2014-12-01

    This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. This project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows for the display of 4D data stored using the NetCDF file format (widely used for storing environmental model datasets). Some of the features available on the sites built with OWGIS are: multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.
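    Because OWGIS talks to ncWMS endpoints, 4D NetCDF variables are exposed as WMS layers with TIME and ELEVATION dimensions. The snippet below extends the plain GetMap pattern with those dimension parameters; the endpoint, layer name, style and the COLORSCALERANGE vendor parameter (an ncWMS extension) are assumptions for illustration, not values from OWGIS itself.

    from urllib.parse import urlencode

    # Placeholder ncWMS endpoint and layer; a real deployment advertises the valid
    # TIME and ELEVATION values in its GetCapabilities document.
    NCWMS_ENDPOINT = "https://example.org/ncWMS/wms"

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "ocean_model/temperature",      # hypothetical NetCDF variable
        "STYLES": "boxfill/rainbow",              # typical ncWMS style naming (assumed)
        "CRS": "CRS:84",                          # lon/lat axis order
        "BBOX": "-100,10,-70,35",
        "WIDTH": "512",
        "HEIGHT": "512",
        "FORMAT": "image/png",
        "TIME": "2014-06-01T00:00:00.000Z",       # slice along the time dimension
        "ELEVATION": "-50",                        # depth in metres, negative downward
        "COLORSCALERANGE": "5,30",                 # ncWMS-specific vendor parameter
    }

    print(NCWMS_ENDPOINT + "?" + urlencode(params))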
  Upgraded biogas from municipal solid waste for natural gas substitution and CO2 reduction - A case study of Austria, Italy, and Spain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starr, Katherine; Villalba, Gara, E-mail: gara.villalba@uab.es; Sostenipra, Institute de Ciencia i Technologia Ambientals

    2015-04-15

    Highlights:
    • Biogas can be upgraded to create biomethane, a substitute for natural gas.
    • Biogas upgrading was applied to landfills and anaerobic digestors in 3 countries.
    • Up to 0.6% of a country's consumption of natural gas could be replaced by biomethane.
    • Italy could save 46% of the national CO2 emissions attributed to the waste sector.
    • Scenarios were created to increase biomethane production.

    Abstract: Biogas is rich in methane and can be further purified through biogas upgrading technologies, presenting a viable alternative to natural gas. Landfills and anaerobic digestors treating municipal solid waste are a large source of such biogas. They therefore offer an attractive opportunity to tap into this potential source of natural gas while at the same time minimizing the global warming impact resulting from methane emissions in waste management schemes (WMS) and reducing fossil fuel consumption. This study looks at the current municipal solid waste flows of Spain, Italy, and Austria over one year (2009), in order to determine how much biogas is generated. It then examines how much natural gas could be substituted by using four different biogas upgrading technologies. Based on current waste generation rates, exploratory but realistic WMS were created for each country in order to maximize biogas production and the potential for natural gas substitution. It was found that the potential substitution of natural gas by biogas resulting from the current WMS seems rather insignificant: 0.2% for Austria, 0.6% for Italy and 0.3% for Spain. However, if the WMS is redesigned to maximize biogas production, these figures can increase to 0.7% for Austria, 1% for Italy and 2% for Spain. Furthermore, the potential CO2 reduction as a consequence of capturing the biogas and replacing fossil fuel can result in up to a 93% reduction of the annual national waste greenhouse gas emissions of Spain and Italy.
  Usage of Wireless Sensor Networks in a service based spatial data infrastructure for Landslide Monitoring and Early Warning

    NASA Astrophysics Data System (ADS)

    Arnhardt, C.; Fernandez-Steeger, T. M.; Walter, K.; Kallash, A.; Niemeyer, F.; Azzam, R.; Bill, R.

    2007-12-01

    The joint project Sensor based Landslide Early Warning System (SLEWS) aims at the systematic development of a prototype alarm and early warning system for the detection of mass movements by application of an ad hoc wireless sensor network (WSN). Next to the development of suitable sensor setups, sensor fusion and network fusion are applied to enhance data quality and reduce false alarm rates. Of special interest is the data retrieval, processing and visualization in GI systems. Therefore a suitable service-based Spatial Data Infrastructure (SDI) will be developed with respect to existing and upcoming Open Geospatial Consortium (OGC) standards. The application of WSN provides a cheap and easy-to-set-up solution for special monitoring and data gathering in large areas. Measurement data from different low-cost transducers for deformation observation (acceleration, displacement, tilting) are collected by distributed sensor nodes (motes), which interact separately and connect to each other in a self-organizing manner. Data are collected and aggregated at the beacon (transmission station), where further operations like data pre-processing and compression can be performed. Besides energy efficiency, miniaturization, real-time monitoring and remote operation, the WSN concept also enables new monitoring strategies such as sensor and network fusion. Since more than one sensor can be integrated at a single mote, either cross-validation or redundant sensor setups are possible to enhance data quality. The planned monitoring and information system will include a mobile infrastructure (information technologies and communication components) as well as methods and models to estimate surface deformation parameters (positioning systems). The measurements result in heterogeneous observation sets that have to be integrated in a common adjustment and filtering approach. Reliable real-time information will be obtained using a range of sensor inputs and algorithms, from which early warnings and prognoses may be derived. Implementation of sensor algorithms is an important task to form the business logic. This will be represented in self-contained web-based processing services (WPS). In the future, different types of sensor networks can communicate via an infrastructure of OGC services in an interoperable way through standardized protocols such as the Sensor Markup Language (SensorML) and the Observations & Measurements schema (O&M). Synchronous and asynchronous information services such as the Sensor Alert Service (SAS) and the Web Notification Service (WNS) will provide defined users and user groups with time-critical readings from the observation site. Techniques using services for visualizing mapping data (WMS), metadata (CSW), vector data (WFS) and raster data (WCS) will range from highly detailed expert-based output to fuzzy graphical warning elements. The expected results will be an advancement over classical alarm and early warning systems, as WSNs are freely scalable, extensible and easy to install.
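    The alerting path sketched above (readings fused at the mote or beacon, then pushed to subscribers through SAS/WNS) can be illustrated with a toy threshold check. Everything in the sketch, including the reading format, the 2-degree tilt threshold and the notify stand-in, is invented for illustration and is not part of the SLEWS design.

    from statistics import median

    # Toy readings from three redundant tilt sensors on one mote (degrees).
    readings = {"mote-17": [0.4, 0.5, 3.9]}   # one outlier among redundant sensors

    TILT_THRESHOLD_DEG = 2.0   # illustrative alert threshold, not a SLEWS value

    def notify(subscriber, message):
        """Stand-in for a Sensor Alert Service / Web Notification Service push."""
        print(f"[to {subscriber}] {message}")

    for mote, values in readings.items():
        fused = median(values)   # simple fusion suppresses single-sensor outliers
        if fused > TILT_THRESHOLD_DEG:
            notify("duty-geologist", f"{mote}: fused tilt {fused:.1f} deg exceeds threshold")
        else:
            print(f"{mote}: fused tilt {fused:.1f} deg, no alert (raw max {max(values):.1f})")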
  NASA's Climate Data Services Initiative

    NASA Astrophysics Data System (ADS)

    McInerney, M.; Duffy, D.; Schnase, J. L.; Webster, W. P.

    2013-12-01

    Our understanding of the Earth's processes is based on a combination of observational data records and mathematical models. The size of NASA's space-based observational data sets is growing dramatically as new missions come online. However, a potentially bigger data challenge is posed by the work of climate scientists, whose models are regularly producing data sets of hundreds of terabytes or more. It is important to understand that the 'Big Data' challenge of climate science cannot be solved with a single technological approach or an ad hoc assemblage of technologies. It will require a multi-faceted, well-integrated suite of capabilities that includes cloud computing, large-scale compute-storage systems, high-performance analytics, scalable data management, and advanced deployment mechanisms, in addition to the existing, well-established array of mature information technologies. It will also require a coherent organizational effort that is able to focus on the specific and sometimes unique requirements of climate science. Given that it is the knowledge gained from data that is of ultimate benefit to society, data publication and data analytics will play a particularly important role. In an effort to accelerate scientific discovery and innovation through broader use of climate data, NASA Goddard Space Flight Center's Office of Computational and Information Sciences and Technology has embarked on a determined effort to build a comprehensive, integrated data publication and analysis capability for climate science. The Climate Data Services (CDS) Initiative integrates people, expertise, and technology into a highly-focused, next-generation, one-stop climate science information service. The CDS Initiative is providing the organizational framework, processes, and protocols needed to deploy existing information technologies quickly, using a combination of enterprise-level services and an expanding array of cloud services. Crucial to its effectiveness, the CDS Initiative is developing the technical expertise to move new information technologies from R&D into operational use. This combination enables full, end-to-end support for climate data publishing and data analytics, and affords the flexibility required to meet future and unanticipated needs. Current science efforts being supported by the CDS Initiative include IPCC, OBS4MIP, ANA4MIPS, MERRA II, the National Climate Assessment, the Ocean Data Assimilation project, NASA Earth Exchange (NEX), and the RECOVER Burned Area Emergency Response decision support system. Service offerings include an integrated suite of classic technologies (FTP, LAS, THREDDS, ESGF, GRaD-DODS, OPeNDAP, WMS, ArcGIS Server), emerging technologies (iRODS, UVCDAT), and advanced technologies (MERRA Analytic Services, MapReduce, Ontology Services, and the CDS API). This poster will describe the CDS Initiative, provide details about the Initiative's advanced offerings, and lay out the CDS Initiative's deployment roadmap.

  COPD and other health problems

    MedlinePlus

    ... 105. Global Initiative for Chronic Obstructive Lung Disease (GOLD) website. Global strategy for the diagnosis, management, and ... report. goldcopd.org/wp-content/uploads/2017/11/GOLD-2018-v6.0-FINAL-revised-20-Nov_WMS. ...

  Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
  Los Alamos Plutonium Facility Waste Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, K.; Montoya, A.; Wieneke, R.

    1997-02-01

    This paper describes the new computer-based transuranic (TRU) Waste Management System (WMS) being implemented at the Plutonium Facility at Los Alamos National Laboratory (LANL). The Waste Management System is a distributed computer processing system stored in a Sybase database and accessed by a graphical user interface (GUI) written in Omnis7. It resides on the local area network at the Plutonium Facility and is accessible by authorized TRU waste originators, count room personnel, radiation protection technicians (RPTs), quality assurance personnel, and waste management personnel for data input and verification. Future goals include bringing outside groups like the LANL Waste Management Facility on-line to participate in this streamlined system. The WMS is changing the TRU paper trail into a computer trail, saving time and eliminating errors and inconsistencies in the process.

  The Spanish version of Face-Name Associative Memory Exam (S-FNAME) performance is related to amyloid burden in Subjective Cognitive Decline.

    PubMed

    Sanabria, Angela; Alegret, Montserrat; Rodriguez-Gomez, Octavio; Valero, Sergi; Sotolongo-Grau, Oscar; Monté-Rubio, Gemma; Abdelnour, Carla; Espinosa, Ana; Ortega, Gemma; Perez-Cordon, Alba; Gailhajanet, Anna; Hernandez, Isabel; Rosende-Roca, Maitee; Vargas, Liliana; Mauleon, Ana; Sanchez, Domingo; Martin, Elvira; Rentz, Dorene M; Lomeña, Francisco; Ruiz, Agustin; Tarraga, Lluis; Boada, Merce

    2018-02-28

    The Face-Name Associative Memory Exam (FNAME) is a paired associative memory test created to detect memory deficits in individuals with preclinical Alzheimer's disease (AD). Worse performance on FNAME in cognitively healthy individuals was found to be related to higher amyloid beta (Aβ) burden measured with positron emission tomography using 11C-PiB (PiB-PET). We previously reported normative data for a Spanish version of FNAME (S-FNAME) in cognitively healthy Spanish-speaking subjects. The aim of the present study was to determine whether performance on S-FNAME was associated with Aβ burden in subjective cognitive decline (SCD) individuals. 200 SCD subjects received neurological and neuropsychological assessments, including the S-FNAME and the Word List task from the Wechsler Memory Scale-III (WMS-III). Moreover, they received an MRI and (18)F-Florbetaben positron emission tomography (FBB-PET) to measure Aβ burden. Three cognitive factor composites were derived for the episodic memory measures (face-name [SFN-N], face-occupation [SFN-O] and WMS-III) to determine whether episodic memory performance was related to Aβ deposition. Higher global Aβ deposition was significantly related to worse performance on SFN-N but not on SFN-O or the WMS-III composite. Moreover, worse SFN-N performance was significantly related to higher Aβ deposition in the bilateral posterior cingulate cortex. The S-FNAME may be a promising neuropsychological tool for detecting SCD individuals with preclinical AD.
  A longitudinal model for disease progression was developed and applied to multiple sclerosis

    PubMed Central

    Lawton, Michael; Tilling, Kate; Robertson, Neil; Tremlett, Helen; Zhu, Feng; Harding, Katharine; Oger, Joel; Ben-Shlomo, Yoav

    2015-01-01

    Objectives: To develop a model of disease progression using multiple sclerosis (MS) as an exemplar. Study Design and Settings: Two observational cohorts, the University of Wales MS (UoWMS) database, UK (1976), and the British Columbia MS (BCMS) database, Canada (1980), with longitudinal disability data [the Expanded Disability Status Scale (EDSS)] were used; individuals potentially eligible for MS disease-modifying drug treatments, but who were unexposed, were selected. Multilevel modeling was used to estimate the EDSS trajectory over time in one data set and validated in the other; challenges addressed included the choice and function of the time axis, complex observation-level variation, adjustments for MS relapses, and autocorrelation. Results: The best-fitting model for the UoWMS cohort (404 individuals and 2,290 EDSS observations) included a nonlinear function of time since onset. Measurement error decreased over time, and ad hoc methods reduced autocorrelation and the effect of relapse. Replication within the BCMS cohort (978 individuals and 7,335 EDSS observations) led to a model with similar time (years) coefficients: time [0.22 (95% confidence interval (CI): 0.19, 0.26), 0.16 (95% CI: 0.10, 0.22)] and log time [−0.13 (95% CI: −0.39, 0.14), −0.15 (95% CI: −0.70, 0.40)] for BCMS and UoWMS, respectively. Conclusion: It is possible to develop robust models of disability progression for chronic disease. However, explicit validation is important given the complex methodological challenges faced. PMID:26071892
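    To make the reported fixed effects concrete, the sketch below evaluates the population-level EDSS trajectory implied by the BCMS coefficients quoted above (0.22 per year for time, −0.13 for log time). The intercept and the random-effect and relapse terms are not reported in the abstract, so an arbitrary intercept is assumed purely for illustration.

    import numpy as np

    # Fixed-effect coefficients for the BCMS cohort, as quoted in the abstract.
    beta_time, beta_logtime = 0.22, -0.13
    intercept = 2.0   # assumed for illustration; not reported in the abstract

    years = np.array([1, 2, 5, 10, 20], dtype=float)   # years since MS onset
    edss = intercept + beta_time * years + beta_logtime * np.log(years)

    for t, y in zip(years, edss):
        print(f"{t:5.0f} years since onset -> predicted EDSS ~ {y:.2f}")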
  [Comparison of the Wechsler Memory Scale-III and the Spain-Complutense Verbal Learning Test in acquired brain injury: construct validity and ecological validity].

    PubMed

    Luna-Lario, P; Pena, J; Ojeda, N

    2017-04-16

    To perform an in-depth examination of the construct validity and the ecological validity of the Wechsler Memory Scale-III (WMS-III) and the Spain-Complutense Verbal Learning Test (TAVEC). The sample consists of 106 adults with acquired brain injury who were treated in the Area of Neuropsychology and Neuropsychiatry of the Complejo Hospitalario de Navarra and displayed memory deficit as the main sequela, as measured by specific memory tests. The construct validity is determined by examining the tasks required in each test against the basic theoretical models, comparing performance according to the parameters offered by the tests, contrasting the severity indices of each test and analysing their convergence. The external validity is explored through the correlation between the tests and by using regression models. According to the results obtained, both the WMS-III and the TAVEC have construct validity. The TAVEC is more sensitive and captures not only deficits in mnemonic consolidation, but also in the executive functions involved in memory. The working memory index of the WMS-III is useful for predicting the return to work two years after the acquired brain injury, but neither of the instruments anticipates disability and dependence at least six months after the injury. We reflect upon the construct validity of the tests and their insufficient capacity to predict functionality when the sequelae become chronic.
  Uncertainties in ecosystem service maps: a comparison on the European scale.

    PubMed

    Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H

    2014-01-01

    Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, there is little attention to the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights into the uncertainty of ecosystem service maps and to discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar, while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. The absence of suitable observed data on ecosystem service provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures of ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.
  Interpolate with DIVA and view the products in OceanBrowser: what's up?

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Barth, Alexander; Beckers, Jean-Marie; Troupin, Charles

    2017-04-01

    The Data-Interpolating Variational Analysis (DIVA) software is a statistical tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of the minimization of a cost function, allowing the choice of the analyzed field that best fits the data sets without presenting unrealistically strong variations. The problem is solved efficiently using a finite-element method. This method, equivalent to Optimal Interpolation, is particularly suited to dealing with irregularly-spaced observations and produces outputs on a regular grid (2D, 3D or 4D). The results are stored in NetCDF files, the most widespread format in the earth sciences community. OceanBrowser is a web service that allows one to visualize gridded fields on-line. Within the SeaDataNet and EMODNET (Chemical lot) projects, several national ocean data centres have created gridded climatologies of different ocean properties using the data analysis software DIVA. In order to give a common viewing service to those interpolated products, the GHER has developed OceanBrowser, which is based on open standards from the Open Geospatial Consortium (OGC), in particular Web Map Service (WMS) and Web Feature Service (WFS). These standards define a protocol for describing, requesting and querying two-dimensional maps at a given depth and time. DIVA and OceanBrowser are both software tools which are continuously upgraded and distributed for free through frequent version releases. The development is funded by the EMODnet and SeaDataNet projects and includes many discussions and feedback from the user community. Here, we present two recent major upgrades. First, we have implemented a "customization" of DIVA analyses following the sea bottom, using the bottom depth gradient as a new source of information. The weaker the slope of the ocean bottom, the larger the correlation length. This correlation length being associated with the propagation of information, it is therefore harder to interpolate through bottom topographic "barriers" such as the continental slope and easier to do so in the perpendicular direction. Although realistic for most applications, this behaviour can always be disabled by the user. Second, we have added some combined products in OceanBrowser, covering all European seas at once. Based on the analyses performed by the other EMODnet partners using DIVA on five zones (Atlantic, North Sea, Baltic Sea, Black Sea, Mediterranean Sea), we have computed a single European product for five variables: ammonium, chlorophyll-a, dissolved oxygen concentration, phosphate and silicate. At the boundaries, a smooth filter was used to remove possible discrepancies between regional analyses. Our European combined product is available for all seasons and several depths. This is the first step towards the use of a common reference field for all European seas when running DIVA.
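    The bathymetry-following behaviour described above (larger correlation length where the bottom slope is weak) can be written down as a simple scaling. The functional form and constants below are assumptions chosen only to illustrate the idea; DIVA's actual implementation is not reproduced here.

    import numpy as np

    # Toy 1-D depth profile across a shelf break (metres, positive down).
    x = np.linspace(0.0, 200e3, 201)                              # distance, m
    depth = 100.0 + 1900.0 / (1.0 + np.exp(-(x - 100e3) / 10e3))  # shelf -> deep basin

    slope = np.abs(np.gradient(depth, x))                         # bottom slope |dh/dx|

    # Assumed scaling: base correlation length L0, reduced where the slope is strong.
    L0, alpha = 50e3, 2.0e3                                       # metres; illustrative tuning constant
    corr_length = L0 / (1.0 + alpha * slope)

    print(f"correlation length on the flat shelf : {corr_length[0] / 1e3:6.1f} km")
    print(f"correlation length at the shelf break: {corr_length.min() / 1e3:6.1f} km")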
  The International Solid Earth Research Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Fox, G.; Pierce, M.; Rundle, J.; Donnellan, A.; Parker, J.; Granat, R.; Lyzenga, G.; McLeod, D.; Grant, L.

    2004-12-01

    We describe the architecture and initial implementation of the International Solid Earth Research Virtual Observatory (iSERVO). This has been prototyped within the USA as SERVOGrid, and expansion is planned to Australia, China, Japan and other countries. We base our design on a globally scalable distributed "cyber-infrastructure" or Grid built around a Web Services-based approach consistent with the extended Web Service Interoperability approach. The Solid Earth Science Working Group of NASA has identified several challenges for Earth Science research. In order to investigate these, we need to couple numerical simulation codes and data mining tools to observational data sets. These observational data are now available on-line in internet-accessible forms, and the quantity of these data is expected to grow explosively over the next decade. We architect iSERVO as a loosely federated Grid of Grids, with each country involved supporting a national Solid Earth Research Grid. The national Grid operations, possibly with dedicated control centers, are linked together to support iSERVO, where an international Grid control center may eventually be necessary. We address the difficult multi-administrative-domain security and ownership issues by exposing capabilities as services for which the risk of abuse is minimized. We support large scale simulations within a single domain using service-hosted tools (mesh generation, data repository and sensor access, GIS, visualization). Simulations typically involve sequential or parallel machines in a single domain supported by cross-continent services. We use Web Services to implement a Service Oriented Architecture (SOA), using WSDL for service description and SOAP for message formats. These are augmented by UDDI, WS-Security, WS-Notification/Eventing and WS-ReliableMessaging in the WS-I+ approach. Support for the latter two capabilities will be available over the next 6 months from the NaradaBrokering messaging system. We augment these specifications with the powerful portlet architecture using WSRP and JSR168, supported by such portal containers as uPortal, WebSphere, and Apache JetSpeed2. The latter portal aggregates component user interfaces for each iSERVO service, allowing flexible customization of the user interface. We exploit the portlets produced by the NSF NMI (Middleware Initiative) OGCE activity. iSERVO also uses specifications from the Open Geographical Information Systems (GIS) Consortium (OGC), which defines a number of standards for modeling earth surface feature data and services for interacting with these data. The data models are expressed in the XML-based Geography Markup Language (GML), and the OGC service framework is being adapted to use the Web Service model. The SERVO prototype includes a GIS Grid that currently includes the core WMS and WFS (Map and Feature) services. We will follow best practice in the Grid and Web Service field and will adapt our technology as appropriate. For example, we expect to support services built on WS-RF when it is finalized and to make use of the database interfaces OGSA-DAI and its WS-I+ versions. Finally, we review advances in Web Service scripting (such as HPSearch) and workflow systems (such as GCF) and their applications to iSERVO.
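    For the feature side of a GIS Grid like the one above (fault traces, station locations and similar vector data served over WFS), a client pulls data with a GetFeature request in the same key-value style as WMS. The endpoint and type name below are placeholders, not the actual SERVOGrid services, and the lon/lat bounding-box order is an assumption since axis-order conventions vary between servers.

    from urllib.parse import urlencode

    WFS_ENDPOINT = "https://example.org/wfs"    # placeholder WFS endpoint

    params = {
        "SERVICE": "WFS",
        "VERSION": "1.1.0",
        "REQUEST": "GetFeature",
        "TYPENAME": "servo:fault_traces",       # hypothetical feature type
        "SRSNAME": "EPSG:4326",
        "BBOX": "-125,32,-114,42",              # roughly California, lon/lat order assumed
        "MAXFEATURES": "100",
    }

    print(WFS_ENDPOINT + "?" + urlencode(params))
    # The response is a GML feature collection that any XML library can parse.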
  Leverage and Delegation in Developing an Information Model for Geology

    NASA Astrophysics Data System (ADS)

    Cox, S. J.

    2007-12-01

    GeoSciML is an information model and XML encoding developed by a group of primarily geological survey organizations under the auspices of the IUGS CGI. The scope of the core model broadly corresponds to information traditionally portrayed on a geologic map, viz. interpreted geology, some observations, the map legend and the accompanying memoir. The development of GeoSciML has followed the methodology specified for an Application Schema defined by the OGC and ISO 19100 series standards. This requires agreement within a community concerning their domain model, its formal representation using UML, and its documentation as a Feature Type Catalogue, with an XML Schema implementation generated from the model by applying a rule-based transformation. The framework and technology support a modular governance process. Standard datatypes and GI components (geometry, the feature and coverage metamodels, metadata) are imported from the ISO framework. The observation and sampling model (including boreholes) is imported from OGC. The scale used for most scalar literal values (terms, codes, measures) allows for localization where necessary. Wildcards and abstract base classes provide explicit extensibility points. Link attributes appear in a regular way in the encodings, allowing reference to external resources using URIs. The encoding is compatible with generic GI data-service interfaces (WFS, WMS, SOS). For maximum interoperability within a community, the interfaces may be specialised through domain-specified constraints (e.g. feature types, scale and vocabulary bindings, query models). Formalization using UML and XML allows the use of standard validation and processing tools. Use of upper-level elements defined for generic GI application reduces the development effort and governance responsibility, while maximising cross-domain interoperability. On the other hand, enabling specialization to be delegated in a controlled manner is essential for adoption across a range of subdisciplines and jurisdictions. The GeoSciML design team is responsible only for the part of the model that is unique to geology but for which general agreement can be reached within the domain. This paper is presented on behalf of the Interoperability Working Group of the IUGS Commission for Geoscience Information (CGI) - follow the web link for details of the membership.
  DECADE web portal: toward the integration of MaGa, EarthChem and VOTW data systems to further the knowledge on Earth degassing

    NASA Astrophysics Data System (ADS)

    Cardellini, Carlo; Frigeri, Alessandro; Lehnert, Kerstin; Ash, Jason; McCormick, Brendan; Chiodini, Giovanni; Fischer, Tobias; Cottrell, Elizabeth

    2015-04-01

    The release of volatiles from the Earth's interior takes place in both volcanic and non-volcanic areas of the planet. The comprehension of such a complex process, and the improvement of current estimates of global carbon emissions, will greatly benefit from the integration of geochemical, petrological and volcanological data. At present, major online data repositories relevant to studies of degassing are not linked and interoperable. In the framework of the Deep Earth Carbon Degassing (DECADE) initiative of the Deep Carbon Observatory (DCO), we are developing interoperability between three data systems that will make their data accessible via the DECADE portal: (1) the Smithsonian Institution's Global Volcanism Program database (VOTW) of volcanic activity data, (2) the EarthChem databases for geochemical and geochronological data of rocks and melt inclusions, and (3) the MaGa database (Mapping Gas emissions), which contains compositional and flux data of gases released at volcanic and non-volcanic degassing sites. The DECADE web portal will create a powerful search engine over these databases from a single entry point and will return comprehensive multi-component datasets. A user will be able, for example, to obtain data relating to the compositions of emitted gases, the compositions and age of the erupted products, and coincident activity, for a specific volcano. This level of capability requires complete synergy between the databases, including the availability of standards-based web services (WMS, WFS) at all data systems. Data and metadata can thus be extracted from each system without interfering with each database's local schema or being replicated to achieve integration at the DECADE web portal. The DECADE portal will enable new synoptic perspectives on the Earth degassing process, allowing exploration of Earth degassing related datasets over previously unexplored spatial or temporal ranges.
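    The single-entry-point search described above amounts to fanning one query out to the three systems and merging what comes back. The sketch below shows only that shape; the lookup tables, field names and the volcano record are all invented, since the portal's real interfaces are not described in the abstract (in practice these would be web-service calls to VOTW, EarthChem and MaGa rather than local dictionaries).

    # Hypothetical per-system lookups standing in for the three data systems.
    VOTW = {"Etna": {"activity": "ongoing eruption"}}
    EARTHCHEM = {"Etna": {"rock_samples": 2, "melt_inclusions": 1}}
    MAGA = {"Etna": {"co2_flux_records": 3, "gas_compositions": 5}}

    def decade_query(volcano_name):
        """Merge the three (hypothetical) data systems into one answer."""
        merged = {"volcano": volcano_name}
        for source_name, source in (("VOTW", VOTW),
                                    ("EarthChem", EARTHCHEM),
                                    ("MaGa", MAGA)):
            merged[source_name] = source.get(volcano_name, {})
        return merged

    print(decade_query("Etna"))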
  OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by the Grid technology, such as providing secure interoperability between the distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific geospatial complex functionality and the high-power computation and security of the Grid, high spatial model resolution and geographical area coverage, and flexible combination and interoperability of the geographical models. In accordance with the Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each of the main functionalities is visible from the enviroGRIDS Portal and consequently from the end-user applications, such as Decision Maker/Citizen oriented applications. The enviroGRIDS Portal is the single way for the user to get into the system, and the portal presents a uniform style of graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
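    Whatever the back-end (Grid-enabled or not), each of the OGC service types named above is discovered the same way, through a GetCapabilities request. The loop below just prints request URLs for a placeholder endpoint; version-negotiation parameters are omitted and the minimal service-plus-request form shown is the common denominator across these services.

    from urllib.parse import urlencode

    BASE = "https://example.org/ows"   # placeholder endpoint exposing several OGC services

    for service in ("WMS", "WFS", "WCS", "WPS", "CSW"):
        params = {"SERVICE": service, "REQUEST": "GetCapabilities"}
        print(f"{service:3s} -> {BASE}?{urlencode(params)}")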
  The influence of IQ stratification on WAIS-III/WMS-III FSIQ-general memory index discrepancy base-rates in the standardization sample.

    PubMed

    Hawkins, K A; Tulsky, D S

    2001-11-01

    Since memory performance expectations may be IQ-based, unidirectional base rate data for IQ-memory score discrepancies are provided in the WAIS-III/WMS-III Technical Manual. The utility of these data partially rests on the assumption that discrepancy base rates do not vary across ability levels. FSIQ-stratified base rate data generated from the standardization sample, however, demonstrate substantial variability across the IQ spectrum. A superiority of memory score over FSIQ is typical at lower IQ levels, whereas the converse is true at higher IQ levels. These data indicate that the use of unstratified IQ-memory score "simple difference" tables could lead to erroneous conclusions for clients with low or high IQ. IQ-stratified standardization base rate data are provided as a complement to the "predicted difference" method detailed in the Technical Manual.

  Testing the limits: cautions and concerns regarding the new Wechsler IQ and Memory scales.

    PubMed

    Loring, David W; Bauer, Russell M

    2010-02-23

    The Wechsler Adult Intelligence Scale (WAIS) and the Wechsler Memory Scale (WMS) are 2 of the most common psychological tests used in clinical care and research in neurology. Newly revised versions of both instruments (WAIS-IV and WMS-IV) have recently been published and are increasingly being adopted by the neuropsychology community. There have been significant changes in the structure and content of both scales, leading to the potential for inaccurate patient classification if algorithms developed using their predecessors are employed. There are presently insufficient clinical data in neurologic populations to insure their appropriate application to neuropsychological evaluations. We provide a perspective on these important new neuropsychological instruments, comment on the pressures to adopt these tests in the absence of an appropriate evidence base supporting their incremental validity, and describe the potential negative impact on both patient care and continuing research applications.
This paper describes the specific developments and deployment of CMS Connect beyond the CI-Connect platform in order to integrate the service with CMS-specific needs, including site-specific submission, job accounting, and automated reporting to standard CMS monitoring resources in an effortless way for its users.

Adapting the CUAHSI Hydrologic Information System to OGC standards

NASA Astrophysics Data System (ADS)

Valentine, D. W.; Whitenack, T.; Zaslavsky, I.

2010-12-01

The CUAHSI Hydrologic Information System (HIS) provides web and desktop client access to hydrologic observations via water data web services using an XML schema called "WaterML". The WaterML 1.x specification and the corresponding Water Data Services have been the backbone of the HIS service-oriented architecture (SOA) and have been adopted for serving hydrologic data by several federal agencies and many academic groups. The central discovery service, HIS Central, is based on a metadata catalog that references 4.7 billion observations, organized as 23 million data series from 1.5 million sites from 51 organizations. Observations data are published using HydroServer nodes that have been deployed at 18 organizations. Usage of HIS increased eightfold from 2008 to 2010, and doubled from 1600 data series a day in 2009 to 3600 data series a day in the first half of 2010. The HIS Central metadata catalog currently harvests information from 56 Water Data Services. We collaborate on the catalog updates with two federal partners, USGS and US EPA: their data series are periodically reloaded into the HIS metadata catalog. We are pursuing two main development directions in the HIS project: cloud-based computing, and further compliance with Open Geospatial Consortium (OGC) standards. The goal of moving to cloud computing is to provide a scalable collaborative system with simpler deployment and less dependence on hardware maintenance and staff. This move requires re-architecting the information models underlying the metadata catalog and the Water Data Services to be independent of the underlying relational database model, allowing for implementation on both relational databases and cloud-based processing systems. Cloud-based HIS Central resources can be managed collaboratively; partners share responsibility for their metadata by publishing data series information into the centralized catalog. Publishing data series will use REST-based service interfaces, like OData, as the basis for ingesting data series information into a cloud-hosted catalog. The future HIS services involve providing information via OGC standards that will allow for observational data access from commercial GIS applications. Use of standards will allow tools to access observational data from other projects using standards, such as the Ocean Observatories Initiative, and tools from such projects to be integrated into the HIS toolset. With international collaborators, we have been developing a water information exchange language called "WaterML 2.0" which will be used to deliver observations data over OGC Sensor Observation Services (SOS). A software stack of OGC standard services will provide access to HIS information. In addition to SOS, Web Mapping and Feature Services (WMS and WFS) will provide access to location information. Catalog Services for the Web (CSW) will provide a catalog for water information that is both centralized and distributed. We intend the OGC standards to supplement the existing HIS service interfaces rather than replace them. The ultimate goal of this development is to expand access to hydrologic observations data and to create an environment where these data can be seamlessly integrated with standards-compliant data resources.
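Since the record above plans to expose HIS location information through WMS and WFS, a concrete request may help make the interface tangible. The sketch below issues a standard WMS 1.3.0 GetMap request with the Python requests library; the endpoint URL and layer name are placeholders, not actual HydroServer values.

    # Hedged sketch of a WMS 1.3.0 GetMap request; endpoint and layer name are hypothetical.
    import requests

    WMS_ENDPOINT = "https://example.org/hydroserver/wms"   # placeholder endpoint

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "observation_sites",     # hypothetical layer name
        "STYLES": "",
        "CRS": "EPSG:4326",                # WMS 1.3.0 uses CRS (older 1.1.1 used SRS)
        "BBOX": "30.0,-100.0,40.0,-90.0",  # lat,lon axis order for EPSG:4326 in WMS 1.3.0
        "WIDTH": "800",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }

    response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    with open("sites.png", "wb") as f:
        f.write(response.content)

A GetCapabilities request (REQUEST=GetCapabilities) against the same endpoint lists the layers, styles, and coordinate reference systems the server actually offers.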
Near-infrared diode laser absorption sensor for rapid measurements of temperature and water vapor in a shock tube

NASA Astrophysics Data System (ADS)

Li, H.; Farooq, A.; Jeffries, J. B.; Hanson, R. K.

2007-11-01

A fast-response (100 kHz) tunable diode laser absorption sensor is developed for measurements of temperature and H2O concentration in shock tubes, e.g. for studies of combustion chemistry. Gas temperature is determined from the ratio of fixed-wavelength laser absorption of two H2O transitions near 7185.60 cm^-1 and 7154.35 cm^-1, which are selected using design rules for the target temperature range of 1000-2000 K and pressure range of 1-2 atm. Wavelength modulation spectroscopy is employed with second-harmonic detection (WMS-2f) to improve the sensor sensitivity and accuracy. Normalization of the second-harmonic signal by the first-harmonic signal is used to remove the need for calibration and to minimize interference from emission, scattering, beam steering, and window fouling. The laser modulation depth for each H2O transition is optimized to maximize the WMS-2f signal for the target test conditions. The WMS-2f sensor is first validated in mixtures of H2O and Ar in a heated cell for the temperature range of 500-1200 K (P = 1 atm), yielding an accuracy of 1.9% for temperature and 1.4% for H2O concentration measurements. Shock wave tests with non-reactive H2O-Ar mixtures are then conducted to demonstrate the sensor accuracy (1.5% for temperature and 1.4% for H2O concentration) and response time at higher temperatures (1200-1700 K, P = 1.3-1.6 atm).
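For readers who want the relation behind "temperature from the ratio of two transitions", the standard two-line expression is sketched below in terms of the line strengths S and lower-state energies E''. This is only the idealised direct-absorption form; the WMS-2f/1f normalisation and calibration details specific to the sensor in the record above are not reproduced here.

    % Ratio of two absorption line strengths with lower-state energies E''_1 and E''_2,
    % referenced to a known temperature T_0:
    R(T) \;=\; \frac{S_1(T)}{S_2(T)}
         \;=\; \frac{S_1(T_0)}{S_2(T_0)}\,
               \exp\!\left[-\frac{hc}{k}\left(E''_1 - E''_2\right)
               \left(\frac{1}{T} - \frac{1}{T_0}\right)\right]

    % Inverting the measured ratio R for temperature:
    T \;=\; \frac{\frac{hc}{k}\left(E''_2 - E''_1\right)}
                 {\ln R \;-\; \ln\frac{S_1(T_0)}{S_2(T_0)} \;+\; \frac{hc}{k}\,\frac{E''_2 - E''_1}{T_0}}

Once T is known, the absolute absorbance of either line yields the water vapor mole fraction.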
Preparing Precipitation Data Access, Value-added Services and Scientific Exploration Tools for the Integrated Multi-satellitE Retrievals for GPM (IMERG)

NASA Astrophysics Data System (ADS)

Ostrenga, D.; Liu, Z.; Kempler, S. J.; Vollmer, B.; Teng, W. L.

2013-12-01

The Precipitation Data and Information Services Center (PDISC) (http://disc.gsfc.nasa.gov/precipitation or google: NASA PDISC), located at the NASA Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC), is home of the Tropical Rainfall Measuring Mission (TRMM) data archive. For over 15 years, the GES DISC has served not only TRMM, but also other space-based, airborne-based, field campaign and ground-based precipitation data products to the precipitation community and other disciplinary communities as well. The TRMM Multi-Satellite Precipitation Analysis (TMPA) products are the most popular products in the TRMM product family in terms of data download and access through Mirador, the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) and other services. The next generation of TMPA, the Integrated Multi-satellitE Retrievals for GPM (IMERG), to be released in 2014 after the launch of GPM, will be significantly improved in terms of spatial and temporal resolution. To better serve the user community, we are preparing data services; examples are listed below. To enable scientific exploration of Earth science data products without going through complicated and often time-consuming processes, such as data downloading and data processing, the GES DISC has developed Giovanni in consultation with members of the user community, who requested quick search, subset, analysis and display capabilities for their specific data of interest. For example, the TRMM Online Visualization and Analysis System (TOVAS, http://disc2.nascom.nasa.gov/Giovanni/tovas/) has proven extremely popular, especially as additional datasets have been added upon request. Giovanni will continue to evolve to accommodate GPM data and the multi-sensor data inter-comparisons that will be sure to follow. Additional PDISC tool and service capabilities being adapted for GPM data include: an online PDISC portal (including a user guide, etc.); data ingest, processing, and distribution from an online archive; the Google-like Mirador data search and access engine; electronic distribution and subscriptions; semantic technology to help manage large amounts of multi-sensor data and their relationships; data drill-down and search capabilities; data access through various web services, i.e., OPeNDAP, GDS, WMS, WCS; conversion into various formats, e.g., netCDF, HDF, KML (for Google Earth), ASCII; exploration, visualization and statistical online analysis through Giovanni; visualization and analysis of L2 data profiles and maps; generation of derived products, such as daily products; parameter and spatial subsetting; time and temporal aggregation; regridding; data version control and provenance; data stewardship (continuous archive verification); documentation; science support for proper data usage and a help desk; monitoring services for applications; and expertise in data-related standards and interoperability. This presentation will further describe the data services at the PDISC that are currently being utilized by precipitation science and application researchers, and the preparation plan for IMERG.
Comments and feedback are welcome.

Flood Hazard Mapping Assessment for Lebanon

NASA Astrophysics Data System (ADS)

Abdallah, Chadi; Darwich, Talal; Hamze, Mouin; Zaarour, Nathalie

2014-05-01

Of all natural disasters, floods affect the greatest number of people worldwide and have the greatest potential to cause damage. In fact, floods are responsible for over one third of people affected by natural disasters; almost 190 million people in more than 90 countries are exposed to catastrophic floods every year. Nowadays, with the emerging global warming phenomenon, this number is expected to increase; therefore, flood prediction and prevention have become a necessity in many places around the globe to decrease the damage caused by flooding. Available evidence hints at an increasing frequency of flooding disasters witnessed in the last 25 years in Lebanon. The consequences of such events are tragic, including annual financial losses of around 15 million dollars. In this work, a hydrologic-hydraulic modeling framework for flood hazard mapping over Lebanon, covering 19 watersheds, was introduced. Several empirical, statistical and stochastic methods were used to calculate the flood magnitude and its related return periods, since rainfall and river gauge data are neither continuous nor available on a long-term basis, and proper river sections are lacking, which leads to underestimated flows during flood events. TRMM weather satellite information, automated drainage networks, curve numbers and other geometrical characteristics for each basin were prepared using the WMS software and then exported into HMS files to implement the hydrologic (rainfall-runoff) modeling for a single design storm of uniformly distributed depth over each basin. The resulting flow hydrographs were used in the hydraulic model (HEC-RAS), where relative water surface profiles were calculated and flood plains delineated. The model was calibrated using the last flood event of January 2013, field investigation, and high-resolution satellite images. Flow results proved to have an accuracy ranging between 83% and 87% when compared to the computed statistical and stochastic methods. Results included the generation of 10-, 50- and 100-year recurrence flood plain and intensity maps, along with flood hazard maps for each watershed.
For this study to be effective, it is of utmost significance that the produced flood intensity and hazard maps be made available to decision-makers, planners and relevant community stakeholders.

Harvesting implementation for the GI-cat distributed catalog

NASA Astrophysics Data System (ADS)

Boldrini, Enrico; Papeschi, Fabrizio; Bigagli, Lorenzo; Mazzetti, Paolo

2010-05-01

The GI-cat framework implements a distributed catalog service supporting different international standards and interoperability arrangements in use by the geoscientific community. The distribution functionality, in conjunction with the mediation functionality, allows remote heterogeneous data sources to be queried seamlessly, including OGC Web Services (e.g. OGC CSW, WCS, WFS and WMS), community standards such as UNIDATA THREDDS/OPeNDAP, SeaDataNet CDI (Common Data Index), GBIF (Global Biodiversity Information Facility) services, and OpenSearch engines. In the GI-cat modular architecture, a distributor component carries out the distribution functionality by delegating queries to the mediator components (one for each different data source). Each of these mediator components is able to query a specific data source and convert the results back by mapping the foreign data model to the GI-cat internal one, based on ISO 19139. In order to cope with deployment scenarios in which local data is expected, a harvesting approach has been tested. The new strategy comes in addition to the consolidated distributed approach, allowing the user to switch between a remote and a local search at will for each federated resource; this extends GI-cat configuration possibilities. The harvesting strategy is built around a local cache component at the core of GI-cat, implemented as a native XML database based on eXist. The different heterogeneous sources are queried for the bulk of available data; this data is then injected into the cache component after being converted to the GI-cat data model. The query and conversion steps are performed by the mediator components that are part of the GI-cat framework. Afterwards, each new query can be run against the local data stored in the cache component. Considering the advantages and shortcomings of both the harvesting and query distribution approaches, user-driven tuning is required to get the best of them. This is often related to the specific user scenarios to be implemented. GI-cat proved to be a flexible framework to address user needs.
The GI-cat configurator tool was updated to make such tuning possible: each data source can be configured to use either the harvesting or the query distribution approach; in the former case, an appropriate harvesting interval can be set.

EnviroAtlas - Metrics for Austin, TX

EPA Pesticide Factsheets

This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). The layers in this web service depict ecosystem services at the census block group level for the community of Austin, Texas. These layers illustrate the ecosystems and natural resources that are associated with clean air (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanAir/MapServer); clean and plentiful water (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanPlentifulWater/MapServer); natural hazard mitigation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_NaturalHazardMitigation/MapServer); climate stabilization (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_ClimateStabilization/MapServer); food, fuel, and materials (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_FoodFuelMaterials/MapServer); recreation, culture, and aesthetics (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_RecreationCultureAesthetics/MapServer); and biodiversity conservation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_BiodiversityConservation/MapServer), and factors that place stress on those resources. EnviroAtlas allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the conterminous United States as well as de
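The layer URLs in the EnviroAtlas record above are standard ArcGIS REST MapServer endpoints, so the same services can be queried outside the web application. The sketch below uses the generic MapServer description and export operations; the bounding box and image size are arbitrary example values, and the layer list returned by the first request should be consulted before relying on any particular layer.

    # Hedged sketch: query one EnviroAtlas community service through the ArcGIS REST API.
    import requests

    SERVICE = ("https://enviroatlas.epa.gov/arcgis/rest/services/"
               "Communities/ESC_ATX_CleanAir/MapServer")   # URL taken from the record above

    # Service description (layer names, extent, spatial reference) as JSON.
    meta = requests.get(SERVICE, params={"f": "json"}, timeout=30).json()
    print([layer["name"] for layer in meta.get("layers", [])])

    # Render a map image for an example extent around Austin, TX.
    params = {
        "bbox": "-98.1,30.0,-97.4,30.6",   # example lon/lat extent
        "bboxSR": "4326",                  # spatial reference of the bbox
        "size": "800,600",
        "format": "png",
        "transparent": "true",
        "f": "image",
    }
    img = requests.get(SERVICE + "/export", params=params, timeout=60)
    img.raise_for_status()
    with open("enviroatlas_cleanair.png", "wb") as f:
        f.write(img.content)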
Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

NASA Astrophysics Data System (ADS)

Singh, R.; Bermudez, L. E.

2013-12-01

The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years two main trends have been disrupting geospatial applications: mobile and context sharing. People now have more and more mobile devices to support their work and personal life. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. And it builds upon a database container, SQLite, that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When a user is looking at an article or a product on the web, they can easily share this information with colleagues or friends via an email that includes URLs (links to web resources) and attachments (inline data). In the case of geospatial information, a user would like to share a map created from different OGC sources, which may include, for example, WMS and WFS links, and GML and KML annotations. The emerging OGC file format is called the OGC Web Services Context Document (OWS Context), which allows clients to reproduce a map previously created by someone else. Context sharing is important in a variety of domains, from emergency response, where fire, police and emergency medical personnel need to work off a common map, to multi-national military operations, where coalition forces need to share common data sources but have cartographic displays in different languages and symbology sets. OWS Contexts can be written in XML (building upon the Atom Syndication Format) or JSON. This presentation will provide an introduction to GeoPackage and OWS Context and how they can be used to advance sharing of Earth and Space Science information.
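Because a GeoPackage is, as the record above notes, just a set of SQLite tables, its table of contents can be inspected with nothing beyond the Python standard library. The sketch below assumes an existing file named example.gpkg and reads only the mandatory gpkg_contents table defined by the GeoPackage specification; full geometry decoding would normally be left to tooling such as GDAL/OGR.

    # Minimal sketch: list the layers declared in a GeoPackage's gpkg_contents table.
    # "example.gpkg" is a placeholder for any existing GeoPackage file.
    import sqlite3

    with sqlite3.connect("example.gpkg") as db:
        rows = db.execute(
            "SELECT table_name, data_type, identifier, min_x, min_y, max_x, max_y, srs_id "
            "FROM gpkg_contents"
        ).fetchall()

    for table_name, data_type, identifier, *bbox, srs_id in rows:
        # data_type is typically 'features' for vector layers or 'tiles' for tile matrices.
        print(f"{identifier}: {data_type} in table {table_name}, extent={bbox}, srs_id={srs_id}")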
Using OPeNDAP's Data-Services Framework to Lift Mash-Ups above Blind Dates

NASA Astrophysics Data System (ADS)

Gallagher, J. H. R.; Fulker, D. W.

2015-12-01

OPeNDAP's data-as-service framework (Hyrax) matches diverse sources with many end-user tools and contexts. Keys to its flexibility include:

- A data model embracing tabular data alongside n-dimensional arrays and other structures useful in geoinformatics.
- A REST-like protocol that supports, via suffix notation, a growing set of output forms (netCDF, XML, etc.) plus a query syntax for subsetting. Subsetting applies (via constraints on column values) to tabular data or (via constraints on indices or coordinates) to array-style data.
- A handler-style architecture that admits a growing set of input types. Community members may contribute handlers, making Hyrax effective as middleware, where N sources are mapped to M outputs with order N+M effort (not NxM).

Hyrax offers virtual aggregations of source data, enabling granularity aimed at users, not data collectors. OPeNDAP-access libraries exist in multiple languages, including Python, Java, and C++. Recent enhancements are increasing this framework's interoperability (i.e., its mash-up) potential. Extensions implemented as servlets, running adjacent to Hyrax, are enriching the forms of aggregation and enabling new protocols:

- User-specified aggregations, namely applying a query to (huge) lists of source granules and receiving one (large) table or zipped netCDF file.
- OGC (Open Geospatial Consortium) protocols, WMS and WCS.
- A Webification (W10n) protocol that returns JavaScript Object Notation (JSON).

Extensions to OPeNDAP's query language are reducing transfer volumes and enabling new forms of inspection. Advances underway include:

- Functions that, for triangular-mesh sources, return sub-meshes specified via geospatial bounding boxes.
- Functions that, for data from multiple satellite-borne sensors (with differing orbits), select observations based on coincidence.
- Calculations of means, histograms, etc. that greatly reduce output volumes.
- Paths for communities to contribute new server functions (in Python, e.g.) that data providers may incorporate into Hyrax via installation parameters.

One could say Hyrax itself is a mash-up, but we suggest it as an instrument for a mash-up artist's toolbox. This instrument can support mash-ups built on netCDF files, OGC protocols, JavaScript Web pages, and/or programs written in Python, Java, C or C++.
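The "REST-like protocol with suffix notation" mentioned in the record above is the DAP convention of appending an output-form suffix and an optional constraint expression to a dataset URL. The sketch below shows that pattern with the requests library; the server URL, dataset path, and variable name are all placeholders rather than a real Hyrax deployment.

    # Hedged sketch of DAP2-style access to a Hyrax-like server; all names are placeholders.
    import requests

    DATASET = "https://example.org/opendap/sst/monthly.nc"   # hypothetical dataset URL

    # Dataset structure (DDS) and attributes (DAS) come back as plain text.
    print(requests.get(DATASET + ".dds", timeout=30).text)
    print(requests.get(DATASET + ".das", timeout=30).text)

    # A constraint expression subsets on the server side before any data are transferred:
    # variable[start:stride:stop] per dimension, here one time step and a small
    # lat/lon window of a hypothetical "sst" array, returned in ASCII form.
    subset = requests.get(DATASET + ".ascii?sst[0:1:0][10:1:20][30:1:40]", timeout=60)
    print(subset.text)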
National Geothermal Data System (USA): an Exemplar of Open Access to Data

NASA Astrophysics Data System (ADS)

Allison, M. Lee; Richard, Stephen; Blackman, Harold; Anderson, Arlene; Patten, Kim

2014-05-01

The National Geothermal Data System's (NGDS - www.geothermaldata.org) formal launch in April 2014 will provide open access to millions of data records, sharing geothermal-relevant geoscience data and, in the longer term, land use data to propel geothermal development and production. NGDS serves information from all of the U.S. Department of Energy's sponsored development and research projects and geologic data from all 50 states, using free and open source software. This interactive online system is opening new exploration opportunities and potentially shortening project development by making data easily discoverable, accessible, and interoperable. We continue to populate our prototype functional data system with multiple data nodes and nationwide data online and available to the public. Data from state geological surveys and partners include more than 6 million records online, including 1.72 million well headers (oil and gas, water, geothermal), 670,000 well logs, and 497,000 borehole temperatures, and are growing rapidly. There are over 312 interoperable Web services and another 106 WMS (Web Map Services) registered in the system as of January 2014. Companion projects run by Southern Methodist University and the U.S. Geological Survey (USGS) are adding millions of additional data records. The DOE Geothermal Data Repository, currently hosted on OpenEI, is a system node and clearinghouse for data from hundreds of U.S. DOE-funded geothermal projects. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG). NGDS complies with the White House Executive Order of May 2013, requiring all federal agencies to make their data holdings publicly accessible online in open-source, interoperable formats with common core and extensible metadata. The National Geothermal Data System is being designed, built, deployed, and populated primarily with support from the US Department of Energy, Geothermal Technologies Office. Keeping this system operational after the original implementation will require four core elements: continued serving of data and applications by providers; maintenance of system operations; a governance structure; and an effective business model. Each of these presents a number of challenges currently under consideration.

Remotely Sensed Imagery from USGS: Update on Products and Portals

NASA Astrophysics Data System (ADS)

Lamb, R.; Lemig, K.

2016-12-01

The USGS Earth Resources Observation and Science (EROS) Center has recently implemented a number of additions and changes to its existing suite of products and user access systems. Together, these changes will enhance the accessibility, breadth, and usability of the remotely sensed image products and delivery mechanisms available from USGS. As of late 2016, several new image products are available for public download at no charge from the USGS EROS Center. These new products include: (1) global Level 1T (precision terrain-corrected) products from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), provided via NASA's Land Processes Distributed Active Archive Center (LP DAAC); and (2) Sentinel-2 Multispectral Instrument (MSI) products, available through a collaborative effort with the European Space Agency (ESA). Other new products are also planned to become available soon. In an effort to enable future scientific analysis of the full 40+ year Landsat archive, the USGS also introduced a new "Collection Management" strategy for all Landsat Level 1 products. This new archive and access schema involves quality-based tier designations that will support future time series analysis of the historic Landsat archive at the pixel level. Along with the quality tier designations, the USGS has also implemented a number of other Level 1 product improvements to support Landsat science applications, including enhanced metadata, improved geometric processing, refined quality assessment information, and angle coefficient files. The full USGS Landsat archive is now being reprocessed in accordance with the new "Collection 1" specifications. Several USGS data access and visualization systems have also seen major upgrades.
These user interfaces include a new version of the USGS LandsatLook Viewer, released in Fall 2017 to provide enhanced functionality and Sentinel-2 visualization and access support. A beta release of the USGS Global Visualization Tool ("GloVis Next") was also released in Fall 2017, with many new features including data visualization at full resolution. The USGS also introduced a time-enabled web mapping service (WMS) to support time-based access to the existing LandsatLook "natural color" full-resolution browse image services.

National Geothermal Data System: an Exemplar of Open Access to Data

NASA Astrophysics Data System (ADS)

Allison, M. L.; Richard, S. M.; Blackman, H.; Anderson, A.

2013-12-01

The National Geothermal Data System's (NGDS - www.geothermaldata.org) formal launch in 2014 will provide open access to millions of datasets, sharing technical geothermal-relevant data across the geosciences to propel geothermal development and production. With information from all of the Department of Energy's sponsored development and research projects and geologic data from all 50 states, this free, interactive tool is opening new exploration opportunities and shortening project development by making data easily discoverable and accessible. We continue to populate our prototype functional data system with multiple data nodes and nationwide data online and available to the public. Data from state geological surveys and partners include more than 5 million records online, including 1.48 million well headers (oil and gas, water, geothermal), 732,000 well logs, and 314,000 borehole temperatures, and are growing rapidly. There are over 250 Web services and another 138 WMS (Web Map Services) registered in the system as of August 2013. Companion projects run by Boise State University, Southern Methodist University, and USGS are adding millions of additional data records. The National Renewable Energy Laboratory is managing the Geothermal Data Repository, which will serve as a system node and clearinghouse for data from hundreds of DOE-funded geothermal projects. NGDS is built on the US Geoscience Information Network data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG). NGDS is fully compliant with the White House Executive Order of May 2013, requiring all federal agencies to make their data holdings publicly accessible online in open-source, interoperable formats with common core and extensible metadata. The National Geothermal Data System is being designed, built, deployed, and populated primarily with grants from the US Department of Energy, Geothermal Technologies Office. Keeping this operational system sustainable after the original implementation will require four core elements: continued serving of data and applications by providers; maintenance of system operations; a governance structure; and an effective business model.
Each of these presents a number of challenges currently under consideration.

Bringing Terra Science to the People: 10 years of education and public outreach

NASA Astrophysics Data System (ADS)

Riebeek, H.; Chambers, L. H.; Yuen, K.; Herring, D.

2009-12-01

The default image on Apple's iPhone is a blue, white, green and tan globe: the Blue Marble. The iconic image was produced using Terra data as part of the mission's education and public outreach efforts. As far-reaching and innovative as Terra science has been over the past decade, Terra education and public outreach efforts have been equally successful. This talk will provide an overview of Terra's crosscutting education and public outreach projects, which have reached into educational facilities (classrooms, museums, and science centers), across the Internet, and into everyday life. The Earth Observatory web site was the first web site designed for the public that told the unified story of what we can learn about our planet from all space-based platforms. Initially conceived as part of Terra mission outreach in 1999, the web site has won five Webby awards, the highest recognition a web site can receive. The Visible Earth image gallery is a catalogue of NASA Earth imagery that receives more than one million page views per month. The NEO (NASA Earth Observations) web site and WMS (web mapping service) tool serve global data sets to museums and science centers across the world. Terra educational products, including the My NASA Data web service and the Students' Cloud Observations Online (S'COOL) project, bring Terra data into the classroom. Both projects target multiple grade levels, ranging from elementary school to graduate school. S'COOL uses student observations of clouds to help validate Terra data. Students and their parents have puzzled over weekly "Where on Earth" geography quizzes published online.
Perhaps the most difficult group to reach is the large segment of the public that does not seek out science information online or in a science museum or classroom. To reach these people, EarthSky produced a series of podcasts and radio broadcasts that brought Terra science to more than 30 million people in 2009. Terra imagery, including the Blue Marble, has seen wide distribution in books like Our Changing Planet and films like An Inconvenient Truth. (Image: the Blue Marble, courtesy Reto Stockli and Rob Simmon, NASA's Earth Observatory.)

Owgis 2.0: Open Source Java Application that Builds Web GIS Interfaces for Desktop and Mobile Devices

NASA Astrophysics Data System (ADS)

Zavala Romero, O.; Chassignet, E.; Zavala-Hidalgo, J.; Pandav, H.; Velissariou, P.; Meyer-Baese, A.

2016-12-01

OWGIS is an open source Java and JavaScript application that builds easily configurable Web GIS sites for desktop and mobile devices. The current version of OWGIS generates mobile interfaces based on HTML5 technology and can be used to create mobile applications. The style of the generated websites can be modified using COMPASS, a well-known CSS authoring framework. In addition, OWGIS uses several Open Geospatial Consortium standards to request data from the most common map servers, such as GeoServer. It is also able to request data from ncWMS servers, allowing the websites to display 4D data from NetCDF files. This application is configured by XML files that define which layers (geographic datasets) are displayed on the Web GIS sites. Among other features, OWGIS allows for animations; streamlines from vector data; virtual globe display; vertical profiles and vertical transects; different color palettes; the ability to download data; and display of text in multiple languages. OWGIS users are mainly scientists in the oceanography, meteorology and climate fields.

NASA World Wind Near Real Time Data for Earth

NASA Astrophysics Data System (ADS)

Hogan, P.

2013-12-01

Innovation requires open standards for data exchange, not to mention access to data, so that the value-added information intelligence can be continually created and advanced by the larger community. Likewise, innovation by academia and entrepreneurial enterprise alike is greatly benefited by an open platform that provides the basic technology for access and visualization of that data. NASA World Wind Java, and now NASA World Wind iOS for the iPhone and iPad, provides that technology. Whether the interest is weather science or climate science, emergency response or supply chain, seeing spatial data in its native context of Earth accelerates understanding and improves decision-making.
NASA World Wind open source technology provides the basic elements for 4D visualization, using Open Geospatial Consortium (OGC) protocols, while allowing for customized access to any data, big or small, including support for NetCDF. NASA World Wind includes access to a suite of US Government WMS servers with near real time data. The larger community can readily capitalize on this technology, building their own value-added applications, either open or proprietary. (Images: night lights heat map; Glacier National Park.)

Void fraction development in gas-liquid flow after a U-bend in a vertically upwards serpentine-configuration large-diameter pipe

NASA Astrophysics Data System (ADS)

Almabrok, Almabrok A.; Aliyu, Aliyu M.; Baba, Yahaya D.; Lao, Liyun; Yeung, Hoi

2018-01-01

We investigate the effect of a return U-bend on flow behaviour in the vertical upward section of a large-diameter pipe. A wire mesh sensor was employed to study the void fraction distributions at axial distances of 5, 28 and 47 pipe diameters after the upstream bottom bend. The study found that the bottom bend has considerable impact on up-flow behaviour. In all conditions, centrifugal action causes appreciable misdistribution in the adjacent straight section. Plots from WMS measurements show that flow asymmetry is significantly reduced along the axis by L/D = 47. Regime maps generated from the three axial locations showed that, in addition to bubbly, intermittent and annular flows, oscillatory flow occurred, particularly when gas and liquid flow rates were relatively low. At this position, mean void fractions were in agreement with those from other large-pipe studies, and comparisons were made with existing void fraction correlations. Among the correlations surveyed, drift flux-type correlations were found to give the best predictive results.
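For readers unfamiliar with the term, the "drift flux-type correlations" cited in this record are built around the generic Zuber-Findlay form sketched below; the specific distribution parameters and drift velocities evaluated in the study are not reproduced here.

    % Generic drift-flux relation between the averaged void fraction \alpha,
    % the gas superficial velocity j_g, and the total superficial velocity j = j_g + j_l:
    \langle \alpha \rangle \;=\;
        \frac{\langle j_g \rangle}{C_0 \langle j \rangle + \bar{u}_{gj}}
    % C_0 is the distribution parameter and \bar{u}_{gj} the weighted mean drift velocity;
    % individual correlations differ in how C_0 and \bar{u}_{gj} are modelled.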
The National Map: New Viewer, Services, and Data Download

USGS Publications Warehouse

Dollison, Robert M.

2010-01-01

Managed by the U.S. Geological Survey's (USGS) National Geospatial Program, The National Map has transitioned data assets and viewer applications to a new visualization and product and service delivery environment, which includes an improved viewing platform, base map data and overlay services, and an integrated data download service. This new viewing solution expands upon the National Geospatial Intelligence Agency (NGA) Palanterra X3 viewer, providing a solid technology foundation for navigation and basic Web mapping functionality. Building upon the NGA viewer allows The National Map to focus on improving data services, functions, and data download capabilities. Initially released to the public at the 125th anniversary of mapping in the USGS on December 3, 2009, the viewer and services are now the primary distribution point for The National Map data. The National Map Viewer: http://viewer.nationalmap.gov

A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public

PubMed Central

Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin

2016-01-01

The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314
SuperB Simulation Production System

NASA Astrophysics Data System (ADS)

Tomassetti, L.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Del Prete, D.; Di Simone, A.; Donvito, G.; Fella, A.; Franchini, P.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Paolini, A.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.

2012-12-01

The SuperB asymmetric e+e- collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab^-1 and a peak luminosity of 10^36 cm^-2 s^-1. The SuperB Computing group is working on developing a simulation production framework capable of satisfying the experiment's needs. It provides access to distributed resources in order to support both the detector design definition and its performance evaluation studies. During the last year the framework has evolved in terms of job workflow, Grid service interfaces and technology adoption. A complete code refactoring and sub-component language porting now permit the framework to sustain distributed production involving resources from two continents and multiple Grid flavors. In this paper we will report a complete description of the state of the art of the production system, its evolution and its integration with Grid services; in particular, we will focus on the utilization of new Grid component features, such as those in LB and WMS version 3. Results from the last official SuperB production cycle will be reported.

Certification of production-quality gLite Job Management components

NASA Astrophysics Data System (ADS)

Andreetto, P.; Bertocco, S.; Capannini, F.; Cecchi, M.; Dorigo, A.; Frizziero, E.; Giacomini, F.; Gianelle, A.; Mezzadri, M.; Molinari, E.; Monforte, S.; Prelz, F.; Rebatto, D.; Sgaravatto, M.; Zangrando, L.

2011-12-01

With the advent of the recent European Union (EU) funded projects aimed at achieving an open, coordinated and proactive collaboration among the European communities that provide distributed computing services, stricter requirements and quality standards will be demanded of middleware providers. Such a highly competitive and dynamic environment, organized to comply with a business-oriented model, has already started pursuing quality criteria, which requires formally defining rigorous procedures, interfaces and roles for each step of the software life-cycle.
This will ensure quality-certified releases and updates of the Grid middleware. In the European Middleware Initiative (EMI), the release management for one or more components will be organized into Product Team (PT) units, fully responsible for delivering production-ready, quality-certified software and for coordinating with each other to contribute to the EMI release as a whole. This paper presents the certification process, with respect to integration, installation, configuration and testing, adopted at INFN by the Product Team responsible for the gLite Web-Service based Computing Element (CREAM CE) and for the Workload Management System (WMS). The resources used, the testbed layouts, the integration and deployment methods, and the certification steps taken to provide feedback to developers and to guarantee quality results are described.

A distinct adipose tissue gene expression response to caloric restriction predicts 6-mo weight maintenance in obese subjects

PubMed

Mutch, David M; Pers, Tune H; Temanni, M Ramzi; Pelloux, Veronique; Marquez-Quiñones, Adriana; Holst, Claus; Martinez, J Alfredo; Babalis, Dimitris; van Baak, Marleen A; Handjieva-Darlenska, Teodora; Walker, Celia G; Astrup, Arne; Saris, Wim H M; Langin, Dominique; Viguerie, Nathalie; Zucker, Jean-Daniel; Clément, Karine

2011-12-01

Weight loss has been shown to reduce risk factors associated with cardiovascular disease and diabetes; however, successful maintenance of weight loss continues to pose a challenge. The present study was designed to assess whether changes in subcutaneous adipose tissue (scAT) gene expression during a low-calorie diet (LCD) could be used to differentiate and predict subjects who experience successful short-term weight maintenance from subjects who experience weight regain. Forty white women followed a dietary protocol consisting of an 8-wk LCD phase followed by a 6-mo weight-maintenance phase. Participants were classified as weight maintainers (WMs; 0-10% weight regain) and weight regainers (WRs; 50-100% weight regain) by considering changes in body weight during the 2 phases. Anthropometric measurements, bioclinical variables, and scAT gene expression were studied in all individuals before and after the LCD. Energy intake was estimated by using 3-d dietary records. No differences in body weight and fasting insulin were observed between WMs and WRs at baseline or after the LCD period. The LCD resulted in significant decreases in body weight and in several plasma variables in both groups. WMs experienced a significant reduction in insulin secretion in response to an oral-glucose-tolerance test after the LCD; in contrast, no changes in insulin secretion were observed in WRs after the LCD. An ANOVA of scAT gene expression showed that genes regulating fatty acid metabolism, the citric acid cycle, oxidative phosphorylation, and apoptosis were regulated differently by the LCD in WM and WR subjects. This study suggests that LCD-induced changes in insulin secretion and scAT gene expression may have the potential to predict successful short-term weight maintenance.
This trial was registered at clinicaltrials.gov as NCT00390637.

The Present and Future of Whole Genome Sequencing (WGS) and Whole Metagenome Sequencing (WMS) for Surveillance of Antimicrobial Resistant Microorganisms and Antimicrobial Resistance Genes across the Food Chain

PubMed Central

Oniciuc, Elena A.; Likotrafiti, Eleni; Alvarez-Molina, Adrián; Alvarez-Ordóñez, Avelino

2018-01-01

Antimicrobial resistance (AMR) surveillance is a critical step within risk assessment schemes, as it is the basis for informing global strategies, monitoring the effectiveness of public health interventions, and detecting new trends and emerging threats linked to food. Surveillance of AMR is currently based on the isolation of indicator microorganisms and the phenotypic characterization of clinical, environmental and food strains isolated. However, this approach provides very limited information on the mechanisms driving AMR or on the presence or spread of AMR genes throughout the food chain. Whole-genome sequencing (WGS) of bacterial pathogens has shown potential for epidemiological surveillance, outbreak detection, and infection control. In addition, whole metagenome sequencing (WMS) allows for the culture-independent analysis of complex microbial communities, providing useful information on the occurrence of AMR genes. Both technologies can assist the tracking of AMR genes and mobile genetic elements, providing the necessary information for the implementation of quantitative risk assessments and allowing for the identification of hotspots and routes of transmission of AMR across the food chain. This review article summarizes the information currently available on the use of WGS and WMS for surveillance of AMR in foodborne pathogenic bacteria and food-related samples and discusses future needs that will have to be considered for the routine implementation of these next-generation sequencing methodologies with this aim. In particular, methodological constraints that impede the use of these high-throughput sequencing (HTS) technologies at a global scale are identified, and the standardization of methods and protocols is suggested as a measure to upgrade HTS-based AMR surveillance schemes. PMID:29789467

Map as a Service: A Framework for Visualising and Maximising Information Return from Multi-Modal Wireless Sensor Networks

PubMed Central

Hammoudeh, Mohammad; Newman, Robert; Dennett, Christopher; Mount, Sarah; Aldabbas, Omar

2015-01-01

This paper presents a distributed information extraction and visualisation service, called the mapping service, for maximising information return from large-scale wireless sensor networks.
Such a service would greatly simplify the production of higher-level, information-rich, representations suitable for informing other network services and the delivery of field information visualisations. The mapping service utilises a blend of inductive and deductive models to map sense data accurately using externally available knowledge. It utilises the special characteristics of the application domain to render visualisations in a map format that are a precise reflection of the concrete reality. This service is suitable for visualising an arbitrary number of sense modalities. It is capable of visualising from multiple independent types of the sense data to overcome the limitations of generating visualisations from a single type of sense modality. Furthermore, the mapping service responds dynamically to changes in the environmental conditions, which may affect the visualisation performance by continuously updating the application domain model in a distributed manner. Finally, a distributed self-adaptation function is proposed with the goal of saving more power and generating more accurate data visualisation. We conduct comprehensive experimentation to evaluate the performance of our mapping service and show that it achieves low communication overhead, produces maps of high fidelity, and further minimises the mapping predictive error dynamically through integrating the application domain model in the mapping service. PMID:26378539</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012AGUFMIN11D1475G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012AGUFMIN11D1475G"><span>Building a Snow Data Management System using Open Source Software (and IDL)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.</p> <p>2012-12-01</p> <p>At NASA's Jet Propulsion Laboratory free and open source software is used everyday to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main Points: - The Design of the Snow Data System (illustrate how the collection of sub-systems are combined to create a complete data processing pipeline) - Discuss the Challenges of moving from a single algorithm on a laptop, to running 100's of parallel algorithms on a cluster of servers (lesson's learned) - Code changes - Software license related challenges - Storage Requirements - System Evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission) Software in Use and their Software Licenses: IDL - Used for pre and post processing of data. Licensed under a proprietary software license held by Excelis. Apache OODT - Used for data management and workflow processing. Licensed under the Apache License Version 2. 
    - GDAL - geospatial data processing library, currently used for data re-projection. Licensed under the X/MIT license.
    - GeoServer - WMS server. Licensed under the General Public License Version 2.0.
    - Leaflet.js - JavaScript web mapping library. Licensed under the Berkeley Software Distribution License.
    - Python - glue code and miscellaneous data processing support. Licensed under the Python Software Foundation License.
    - Perl - script wrapper for running the SCAG algorithm. Licensed under the General Public License Version 3.
    - PHP - front-end web application programming. Licensed under the PHP License Version 3.01.
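As an aside on the re-projection step that the Snow Data System record above attributes to GDAL, the following is a minimal, illustrative sketch of such a step using GDAL's Python bindings. The file names, the target CRS and the resampling choice are assumptions for the example, not details taken from the record.

```python
# Minimal sketch of a GDAL-based re-projection step, of the kind the snow
# pipeline above uses GDAL for. Input/output file names and the target CRS
# (EPSG:4326, geographic WGS84) are illustrative assumptions.
from osgeo import gdal

gdal.UseExceptions()

src_path = "snow_cover_source.tif"   # hypothetical input raster
dst_path = "snow_cover_wgs84.tif"    # re-projected output raster

# gdal.Warp performs the re-projection; bilinear resampling is a common
# choice for continuous fields such as fractional snow-covered area.
gdal.Warp(dst_path, src_path, dstSRS="EPSG:4326", resampleAlg="bilinear")
print("Wrote", dst_path)
```

The resulting GeoTIFF could then be served as a layer by a WMS server such as the GeoServer instance listed in the record.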
  354. SeaDataNet II - EMODNet - building a pan-European infrastructure for marine and ocean data management

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Fichaut, Michele

    2014-05-01

    The second phase of the SeaDataNet project has been underway since October 2011 and is making good progress. The main objective is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via research cruises and monitoring activities in European marine waters and global oceans. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres, as well as the various SeaDataNet standards and tools. Recently the 1st Innovation Cycle has been completed, including upgrading of the CDI Data Discovery and Access service to ISO 19139 and making it fully INSPIRE compliant. The extensive SeaDataNet Vocabularies have been upgraded too and implemented for all SeaDataNet European metadata directories. SeaDataNet is setting and governing marine data standards, and exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139), OGC (WMS, WFS, CS-W and SWE), and OpenSearch. The population of directories has also increased considerably through cooperation and involvement in associated EU projects and initiatives. SeaDataNet now gives overview of and access to more than 1.4 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 90 connected data centres from 30 countries riparian to European seas. Access to marine data is also a key issue for the implementation of the EU Marine Strategy Framework Directive (MSFD). The EU communication 'Marine Knowledge 2020' underpins the importance of data availability and harmonising access to marine data from different sources. SeaDataNet qualified itself for leading the data management component of EMODNet (European Marine Observation and Data Network), which is promoted in the EU Communication. In the past 4 years EMODNet portals have been initiated for the marine data themes digital bathymetry, chemistry, physical oceanography, geology, biology, and seabed habitat mapping. These portals are now being expanded to all European seas in successor projects, which started mid 2013 from EU DG MARE. EMODNet encourages more data providers to come forward for data sharing and for participating in the process of making complete overviews and homogeneous data products. The EMODNet Bathymetry project is very illustrative of the synergy with SeaDataNet and of the added value of generating public data products. The project develops and publishes Digital Terrain Models (DTM) for the European seas. These are produced from survey and aggregated data sets. The portal provides a versatile DTM viewing service with many relevant map layers and functions for retrieval. A further refinement is taking place in the new phase. The presentation will give information on the present SeaDataNet infrastructure and services, highlight key achievements in SeaDataNet II so far, and give further insights into the EMODNet Bathymetry progress.

  355. Effective HTCondor-based monitoring system for CMS

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B. P.; Da Silva, J. M.; Hernandez, J.; Khan, F. A.; Letts, J.; Mascheroni, M.; Mason, D. A.; Perez-Calero Yzquierdo, A.; Vlimant, J.-R.; for the CMS Consortium

    2017-10-01

    The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch and pilot-based Grid provisioning systems, respectively. Given the scale of the global queue in CMS, the operators found it increasingly difficult to monitor the pool to find problems and fix them. The operators had to rely on several different web pages, with several different levels of information, and sift tirelessly through log files in order to monitor the pool completely. Therefore, coming up with a suitable monitoring system was one of the crucial items before the beginning of the LHC Run 2 in order to ensure early detection of issues and to give a good overview of the whole pool. Our new monitoring page utilizes the HTCondor ClassAd information to provide a complete picture of the whole submission infrastructure in CMS. The monitoring page includes useful information from HTCondor schedulers, central managers, the glideinWMS frontend, and factories. It also incorporates information about users and tasks, making it easy for operators to provide support and debug issues.
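To make the ClassAd-based approach in the preceding record concrete, here is a minimal sketch of the kind of pool query such a monitoring page could be built from, using the HTCondor Python bindings. The collector hostname and the selected attributes are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: query schedd ClassAds from a central collector with the
# HTCondor Python bindings, as a monitoring page like the one described
# above might do. The hostname below is a placeholder.
import htcondor

collector = htcondor.Collector("collector.example.org")  # hypothetical central manager

# Fetch a few summary attributes from every scheduler advertised in the pool.
schedd_ads = collector.query(
    htcondor.AdTypes.Schedd,
    projection=["Name", "TotalRunningJobs", "TotalIdleJobs", "TotalHeldJobs"],
)

for ad in schedd_ads:
    print(ad.get("Name"), ad.get("TotalRunningJobs", 0), ad.get("TotalIdleJobs", 0))
```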
  356. Stability and Scalability of the CMS Global Pool: Pushing HTCondor and GlideinWMS to New Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balcas, J.; Bockelman, B.; Hufnagel, D.

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  357. Disruption and molecule degradation of waxy maize starch granules during high pressure homogenization process

    PubMed

    Wei, Benxi; Cai, Canxin; Xu, Baoguo; Jin, Zhengyu; Tian, Yaoqi

    2018-02-01

    The mechanism underlying the fragmentation of waxy maize starch (WMS) granules during high-pressure homogenization (HPH) was studied and the results were interpreted in terms of granular and molecular aspects. The diameter of disrupted starch granules decreased exponentially with increasing HPH pressure, but decreased linearly with increasing number of HPH cycles. Scanning electron microscopy revealed a cone-like inside-out disruption pattern through the channels that resulted in separation of blocklet fragments or starch fragments. The Mw of amylopectin was reduced by about half following treatment at 150 MPa with two cycles, or at 100 MPa for eight cycles, and the decrease was in accordance with the disruption of starch granules. This indicated that amylopectin was "protected" by blocklets, and the disruption of WMS granules mainly occurred close to the linkage among blocklets. Increasing the HPH pressure appeared to be more effective for breaking starch granules than increasing the number of HPH cycles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  358. Clinical system for non-invasive in situ monitoring of gases in the human paranasal sinuses

    PubMed

    Lewander, Märta; Guan, Zuguang; Svanberg, Katarina; Svanberg, Sune; Svensson, Tomas

    2009-06-22

    We present a portable system for non-invasive, simultaneous sensing of molecular oxygen (O2) and water vapor (H2O) in the human paranasal cavities. The system is based on high-resolution tunable diode laser spectroscopy (TDLAS) and digital wavelength modulation spectroscopy (dWMS). Since optical interference and non-ideal tuning of the diode lasers render signal processing complex, we focus on Fourier analysis of dWMS signals and on procedures for removal of background signals. Clinical data are presented, and exhibit a significant improvement in signal-to-noise with respect to earlier work. The in situ detection limit, in terms of absorption fraction, is about 5×10⁻⁵ for oxygen and 5×10⁻⁴ for water vapor, but varies between patients due to differences in light attenuation. In addition, we discuss in detail the use of water vapor as a reference in quantification of in situ oxygen concentration. In particular, light propagation aspects are investigated by employing photon time-of-flight spectroscopy.

  359. DATA.KNMI.NL - Status & Future Challenges

    NASA Astrophysics Data System (ADS)

    van de Vegte, John; Som de Cerff, Wim; Verhoef, Hans; Plieger, Maarten; de Vreede, Ernst; van der Neut, Ian; Bos, Jeroen; Ha, Siu-Siu; Sluiter, Raymond; Willem Noteboom, Jan; Klein Baltink, Henk; Reijmerink, Mieke

    2015-04-01

    The Royal Netherlands Meteorological Institute (KNMI) has over 150 years of knowledge and gathered information related to weather, climate and seismology. A huge part of this information comes from numerical models, in situ sensor networks and remote sensing satellites. This digital collection is becoming more and more available in the newly developed KNMI Data Centre, which has now been operational for two years. The KNMI Data Centre project follows a user-driven development approach, with SCRUM chosen to get maximum user involvement in a relatively short timeframe. The system is built on open standards and proven open-source technology (which includes in-house developed software such as the ADAGUC WMS and Portal). The presentation will focus on the development of the initial KNMI Data Centre, the operational use over the last two years, and how a major release for the coming year will be realized. The new release will focus on a better user experience and on extending the technical data interfaces to the data centre. Keywords: Agile, Usage Statistics, Open Data, Inspire, DOI, WMS, WCS, OPeNDAP

  360. Profile of cognitive function in adults with Duchenne muscular dystrophy

    PubMed

    Ueda, Yukihiko; Suwazono, Shugo; Maedo, Sino; Higuchi, Itsuro

    2017-03-01

    Several studies have examined the intellectual functioning of boys with Duchenne muscular dystrophy (DMD). However, little is known about the remaining cognitive weaknesses in adults with DMD. The purpose of this study was to investigate the profile of cognitive functioning that is characteristic of adults with DMD. Twenty-four subscales from the Wechsler Adult Intelligence Scale III (WAIS-III), the Clinical Assessment for Attention (CAT), and the Wechsler Memory Scale Revised (WMS-R) were used to assess participants with DMD (N=15; mean age = 30.4 years).
    Scores for Picture Completion, Arithmetic, Matrix Reasoning, Symbol Search, Letter-Number Sequencing, and Digit Span of the WAIS-III; all CAT scores; and Logical Memory and Delayed Logical Memory from the WMS-R were significantly deficient in adults with DMD in comparison to the normal population. The ability to sequentially process auditory and visual information remains impaired in adults with DMD. Copyright © 2016 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  361. Stability and scalability of the CMS Global Pool: Pushing HTCondor and glideinWMS to new limits

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Aftab Khan, F.; Larson, K.; Letts, J.; Marra da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure.
    This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  362. EMODNet Hydrography - Seabed Mapping - Developing a higher resolution digital bathymetry for the European seas

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Moussat, Eric

    2013-04-01

    In December 2007 the European Parliament and Council adopted the Marine Strategy Framework Directive (MSFD), which aims to achieve environmentally healthy marine waters by 2020. This Directive includes an initiative for an overarching European Marine Observation and Data Network (EMODNet). The EMODNet Hydrography - Seabed Mapping projects made good progress in developing the EMODNet Hydrography portal to provide overview of and access to available bathymetric survey datasets and to generate a harmonised digital bathymetry for Europe's sea basins. Up to the end of 2012 more than 8400 bathymetric survey datasets, managed by 14 data centres from 9 countries and originating from 118 institutes, had been gathered and populated in the EMODNet Hydrography Data Discovery and Access service, adopting SeaDataNet standards. These datasets have been used as input for analysing and generating the EMODNet digital terrain model (DTM), so far for the following sea basins:
    - the Greater North Sea, including the Kattegat
    - the English Channel and Celtic Seas
    - Western and Central Mediterranean Sea and Ionian Sea
    - Bay of Biscay, Iberian coast and North-East Atlantic
    - Adriatic Sea
    - Aegean - Levantine Sea (Eastern Mediterranean)
    - Azores - Madeira EEZ
    The Hydrography Viewing service gives users wide functionality for viewing and downloading the EMODNet digital bathymetry:
    - water depth in gridded form on a DTM grid of a quarter of a minute of longitude and latitude
    - option to view QC parameters of individual DTM cells and references to source data
    - option to download DTM tiles in different formats: ESRI ASCII, XYZ, CSV, NetCDF (CF), GeoTiff and SD for the Fledermaus 3D viewer software
    - option for users to create their Personal Layer and to upload multibeam survey ASCII datasets for automatic processing into personal DTMs following the EMODNet standards
    The NetCDF (CF) DTM files are fit for use in a special 3D Viewer software package which is based on the existing open source NASA World Wind JSK application. It has been developed in the frame of the EU Geo-Seas project (another sibling of SeaDataNet for marine geological and geophysical data) and is freely available. The 3D viewer also supports the ingestion of WMS overlay maps. The EMODNet consortium is actively seeking cooperation with Hydrographic Offices, research institutes, authorities and private organisations for additional data sets (single and multibeam surveys, sounding tracks, composite products) to contribute to an even better geographical coverage. These datasets will be used for upgrading and extending the EMODNet regional Digital Terrain Models (DTM). The datasets themselves are not distributed but described in the metadata service, giving clear information about the background survey data used for the DTM, their access restrictions, originators and distributors, and facilitating requests by users to originators. This way the portal provides originators of bathymetric data sets an attractive shop window for promoting their data sets to potential users, without losing control. The EMODNet Hydrography Consortium consists of MARIS (NL), ATLIS (NL), IFREMER (FR), SHOM (FR), IEO (ES), GSI (IE), NERC-NOCS (UK), OGS (IT), HCMR (GR), and UNEP/GRID-Arendal (NO), with associate partners CNR-ISMAR (IT), OGS-RIMA (IT), IHPT (PT), and LNEG (PT). Website: http://www.emodnet-hydrography.eu

  363. Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.

    2012-08-01

    With the flourishing development of China's Internet market, all kinds of users' demand for map services is rising continually, and it carries tremendous commercial interest. Many Internet giants have become involved in the field of online map services and have defined it as an important strategic product of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model, and to analyse the problems according to the evaluation results. Some corresponding solving measures are then proposed, which provides theoretical and practical guidance for the future development of these fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation indicator system for online map service websites is built, covering functions, layout, interaction design, colour and positioning, combined with data indexes such as time efficiency, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy evaluation mathematical model, solving the difficulty of measuring map websites quantitatively.

  364. A Common Metadata System for Marine Data Portals

    NASA Astrophysics Data System (ADS)

    Wosniok, C.; Breitbach, G.; Lehfeldt, R.

    2012-04-01

    Processing and allocation of marine datasets depend on the nature of the data resulting from field campaigns, continuous monitoring and numerical modeling. Two research and development projects in northern Germany manage different types of marine data. Due to different data characteristics and institutional frameworks, separate data portals are required.
    This paper describes the integration of distributed marine data in Germany. The Marine Data Infrastructure of Germany (MDI-DE) supports public authorities in the German coastal zone with the implementation of European directives such as INSPIRE or the Marine Strategy Framework Directive. This is carried out through setting up standardized web services within a network of participating coastal agencies and the installation of a common data portal (http://www.mdi-de.org), which integrates distributed marine data concerning coastal engineering, coastal water protection and nature conservation in an interoperable and harmonized manner for administrative and scientific purposes as well as for the information of the general public. The Coastal Observation System for Northern and Arctic Seas (COSYNA) aims at developing and testing analysis systems for the operational synoptic description of the environmental status of the North Sea and of Arctic coastal waters. This is done by establishing a network of monitoring facilities and the provision of their data in near-real time. In situ measurements with poles, ferry boxes, and buoys, together with remote sensing measurements, and the assimilation of these data into simulation results enable COSYNA to provide pre-operational 'products' that are beyond the present routinely applied techniques in observation and modelling. The data allocation in near-real time requires thoroughly executed data validation, which is processed on the fly before data is passed on to the COSYNA portal (http://kofserver2.hzg.de/codm/). Both projects apply OGC standards such as Web Map Service (WMS), Web Feature Service (WFS) and Sensor Observation Service (SOS), which ensures interoperability and extensibility. In addition, metadata, a crucial component for searching and finding information in large data infrastructures, is provided via the Catalogue Web Service (CS-W). MDI-DE and COSYNA rely on the metadata information system for marine metadata NOKIS, which reflects a metadata profile tailored for marine data according to the specifications of German coastal authorities. In spite of this common software base, interoperability between the two data collections requires constant alignment of the diverse data processed by the two portals. While monitoring data in the MDI-DE is currently rather campaign-based, COSYNA has to fit constantly evolving time series into metadata sets. With all data following the same metadata profile, we now reach full interoperability between the different data collections. The distributed marine information system provides options to search, find and visualise the harmonised results from continuous monitoring, field campaigns, numerical modeling and other data in one web client.
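Since the record above leans on the OGC Catalogue Service (CS-W) for metadata discovery, a minimal sketch of such a catalogue query is given below, using the OWSLib library. The endpoint URL and the search term are placeholders, not details from the paper.

```python
# Minimal sketch: free-text metadata search against a (hypothetical) CSW
# endpoint with OWSLib, the kind of CS-W query a portal such as the ones
# described above exposes for metadata discovery.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://csw.example.org/csw")   # placeholder endpoint

# Search all metadata text fields for a keyword (placeholder term).
query = PropertyIsLike("csw:AnyText", "%water level%")
csw.getrecords2(constraints=[query], maxrecords=10)

for record_id, record in csw.records.items():
    print(record_id, "-", record.title)
```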
  365. Method and system for a network mapping service

    DOEpatents

    Bynum, Leo

    2017-10-17

    A method and system of publishing a map includes providing access to a plurality of map data files or mapping services between at least one publisher and at least one subscriber; defining a map in a map context comprising parameters and descriptors to substantially duplicate a map by reference to mutually accessible data or mapping services; publishing a map to a channel in a table file on a server; accessing the channel by at least one subscriber; transmitting the mapping context from the server to the at least one subscriber; executing the map context by the at least one subscriber; and generating the map on display software associated with the at least one subscriber by reconstituting the map from the references and other data in the mapping context.

  366. Composite Gypsum Binders with Silica-containing Additives

    NASA Astrophysics Data System (ADS)

    Chernysheva, N. V.; Lesovik, V. S.; Drebezgova, M. Yu; Shatalova, S. V.; Alaskhanov, A. H.

    2018-03-01

    New types of fine mineral additives are proposed for designing water-resistant Composite Gypsum Binders (CGB); these additives significantly differ from traditional quartz feed: wastes from wet magnetic separation of Banded Iron Formation (BIF WMS waste), nanodispersed silica powder (NSP), and chalk. The possibility of their combined use has been studied as well.

  367. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Ostrenga, D.; Vollmer, B.; Kempler, S.; Deshong, B.; Greene, M.

    2015-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is also home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 17 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently and soon to be available:
    - Level-1 GPM Microwave Imager (GMI) and partner radiometer products, and DPR products
    - Level-2 Goddard Profiling Algorithm (GPROF) GMI and partner products, and DPR products
    - Level-3 daily and monthly products, and DPR products
    - Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final)
    A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, and help desk; and monitoring services (e.g. Current Conditions) for applications. The Unified User Interface (UUI) is the next step in the evolution of the GES DISC web site. It attempts to provide seamless access to data, information and services through a single interface without sending the user to different applications or URLs (e.g., search, access, subset, Giovanni, documents).
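As an illustration of the OPeNDAP access route mentioned in the record above, here is a minimal sketch that opens a remote dataset with xarray. The URL is a placeholder, not a real GES DISC endpoint; actual endpoints, granule names and variable names are documented on the portal referenced in the record.

```python
# Minimal sketch: open a (hypothetical) OPeNDAP URL with xarray, one way to
# use the OPeNDAP service listed among the GES DISC data services above.
# The URL below is a placeholder.
import xarray as xr

url = "https://opendap.example.org/gpm/imerg/sample_granule"  # placeholder

ds = xr.open_dataset(url)      # lazily opens the remote dataset
print(list(ds.data_vars))      # inspect which variables the granule exposes

# Subsetting is then ordinary xarray indexing by coordinate ranges,
# once the variable and coordinate names are known.
```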
  368. How Much Higher Can HTCondor Fly?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fajardo, E. M.; Dost, J. M.; Holzman, B.

    The HTCondor high throughput computing system is heavily used in the high energy physics (HEP) community as the batch system for several Worldwide LHC Computing Grid (WLCG) resources. Moreover, it is the backbone of GlideinWMS, the pilot system used by the computing organization of the Compact Muon Solenoid (CMS) experiment. To prepare for LHC Run 2, we probed the scalability limits of new versions and configurations of HTCondor with a goal of reaching 200,000 simultaneous running jobs in a single internationally distributed dynamic pool. In this paper, we first describe how we created an opportunistic distributed testbed capable of exercising runs with 200,000 simultaneous jobs without impacting production. This testbed methodology is appropriate not only for scale testing HTCondor, but potentially for many other services. In addition to the test conditions and the testbed topology, we include the suggested configuration options used to obtain the scaling results, and describe some of the changes to HTCondor inspired by our testing that enabled sustained operations at scales well beyond previous limits.

  369. Operating a production pilot factory serving several scientific domains

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.; Würthwein, F.; Andrews, W.; Dost, J. M.; MacNeill, I.; McCrea, A.; Sheripon, E.; Murphy, C. W.

    2011-12-01

    Pilot infrastructures are becoming prominent players in the Grid environment. One of the major advantages is the reduced effort required by the user communities (also known as Virtual Organizations or VOs) due to the outsourcing of the Grid interfacing services, i.e. the pilot factory, to Grid experts. One such pilot factory, based on the glideinWMS pilot infrastructure, is being operated by the Open Science Grid at the University of California San Diego (UCSD). This pilot factory is serving multiple VOs from several scientific domains. Currently the three major clients are the analysis operations of the HEP experiment CMS, the community VO HCC, which serves mostly math, biology and computer science users, and the structural biology VO NEBioGrid. The UCSD glidein factory allows the served VOs to use Grid resources distributed over 150 sites in North and South America, in Europe, and in Asia. This paper presents the steps taken to create a production quality pilot factory, together with the challenges encountered along the road.

  370. DIY-style GIS service in mobile navigation system integrated with web and wireless GIS

    NASA Astrophysics Data System (ADS)

    Yan, Yongbin; Wu, Jianping; Fan, Caiyou; Wang, Minqi; Dai, Sheng

    2007-06-01

    A mobile navigation system based on a handheld device can not only provide basic GIS services, but can also provide these GIS services without location limits and with more immediate interaction between users and devices. However, most navigation systems still share common defects in user experience, such as limited map formats, few map resources, and no location sharing. To overcome these defects, we propose a DIY-style GIS service which provides users a freer software environment and allows users to customize their GIS services. These services include defining the geographical coordinate system of maps, which helps to hugely enlarge the map source; editing vector features, related property information and hotlink images; customizing the covered area of maps downloaded via General Packet Radio Service (GPRS); and sharing users' location information via SMS (Short Message Service), which establishes communication between users who need GIS services. The paper introduces the integration of web and wireless GIS services in a mobile navigation system and presents an implementation sample of a DIY-style GIS service in a mobile navigation system.

  371. Customised City Maps in Mobile Applications for Senior Citizens

    PubMed

    Reins, Frank; Berker, Frank; Heck, Helmut

    2017-01-01

    Map services should be used in mobile applications for senior citizens. Do the commonly used map services meet the needs of elderly people? Exemplarily, the contrast ratios of common maps in comparison to an optimized custom-rendered map are examined in the paper.

  372. Linking biophysical models and public preferences for ecosystem service assessments: a case study for the Southern Rocky Mountains

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Reed, James; Semmens, Darius J.; Sherrouse, Ben C.; Troy, Austin

    2016-01-01

    Through extensive research, ecosystem services have been mapped using both survey-based and biophysical approaches, but comparative mapping of public values and those quantified using models has been lacking. In this paper, we mapped hot and cold spots for perceived and modeled ecosystem services by synthesizing results from a social-values mapping study of residents living near the Pike–San Isabel National Forest (PSI), located in the Southern Rocky Mountains, with corresponding biophysically modeled ecosystem services. Social-value maps for the PSI were developed using the Social Values for Ecosystem Services tool, providing statistically modeled continuous value surfaces for 12 value types, including aesthetic, biodiversity, and life-sustaining values. Biophysically modeled maps of carbon sequestration and storage, scenic viewsheds, sediment regulation, and water yield were generated using the Artificial Intelligence for Ecosystem Services tool. Hotspots for both perceived and modeled services were disproportionately located within the PSI's wilderness areas. Additionally, we used regression analysis to evaluate spatial relationships between perceived biodiversity and cultural ecosystem services and corresponding biophysical model outputs. Our goal was to determine whether publicly valued locations for aesthetic, biodiversity, and life-sustaining values relate meaningfully to results from corresponding biophysical ecosystem service models. We found weak relationships between perceived and biophysically modeled services, indicating that public perception of ecosystem service provisioning regions is limited.
    We believe that biophysical and social approaches to ecosystem service mapping can serve as methodological complements that can advance ecosystem services-based resource management, benefitting resource managers by showing potential locations of synergy or conflict between areas supplying ecosystem services and those valued by the public.

  373. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS Framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface, or can be accessed directly through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets, represented by the NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of the Monitoring Atmospheric Composition and Climate (MACC) Collaborated Project, NOAA-CIRES Twentieth Century Global Reanalysis Version II, NCEP Climate Forecast System Reanalysis (CFSR), meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others, are available for processing by the system. The Web-GIS information-computational system for heterogeneous geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for integrated research of climate and ecosystem changes on global and regional scales. With its help, even a user unskilled in programming is able to process and visualize multidimensional observational and model data through a unified web interface using a common graphical web browser. This work is partially supported by SB RAS project VIII.80.2.1, RFBR grants #13-05-12034 and #14-05-00502, and SB RAS integrated project #131. References:
    1. Gordov E.P., Lykosov V.N., Krupchatnikov V.N., Okladnikov I.G., Titov A.G., Shulgina T.M. Computational and information technologies for monitoring and modeling of climate changes and their consequences. Novosibirsk: Nauka, Siberian Branch, 2013. 195 p. (in Russian)
    2. Frankel F., Reid R. Big data: Distilling meaning from data // Nature. Vol. 455. N. 7209. P. 30.
    3. Shulgina T.M., Gordov E.P., Okladnikov I.G., Titov A.G., Genina E.Yu., Gorbatenko N.P., Kuzhevskaya I.V., Akhmetshina A.S. Software complex for a regional climate change analysis // Vestnik NGU. Series: Information Technologies. 2013. Vol. 11. Issue 1. P. 124-131 (in Russian).
    4. Okladnikov I.G., Titov A.G., Shulgina T.M., Gordov E.P., Bogomolov V.Yu., Martynova Yu.V., Suschenko S.P., Skvortsov A.V. Software for analysis and visualization of climate change monitoring and forecasting data // Numerical Methods and Programming, 2013. Vol. 14. P. 123-131 (in Russian).
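Since the system in the record above exposes its analysis results through standard WMS and WFS interfaces, the following is a minimal sketch of how a client could pull one such WMS layer with OWSLib. The endpoint, layer name, bounding box and output file are placeholders, not values taken from the paper.

```python
# Minimal sketch: fetch a map image from a (hypothetical) WMS endpoint with
# OWSLib, the kind of GetMap request a web-GIS client issues for a WMS layer.
from owslib.wms import WebMapService

wms = WebMapService("https://wms.example.org/wms", version="1.3.0")  # placeholder endpoint

response = wms.getmap(
    layers=["air_temperature_anomaly"],      # placeholder layer name
    srs="EPSG:4326",
    bbox=(60.0, 40.0, 120.0, 75.0),          # placeholder lon/lat bounding box
    size=(800, 450),
    format="image/png",
    transparent=True,
)

with open("layer.png", "wb") as out:
    out.write(response.read())
```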
  374. Cognitive Tasks as Predictors of Behavioral Competencies in the Aged

    ERIC Educational Resources Information Center

    Cyr, J.; Stones, M. J.

    This paper discusses the quantitative and qualitative aspects of the relationship between cognitive abilities and behavioral competencies in elderly institutional residents. The former was assessed by an array of five cognitive measures: two Piagetian tasks, the Set Test, the WAIS Vocabulary and Digit Span subtests, and the WMS Associate Learning subtest;…

  375. Cost Implications of an Interim Storage Facility in the Waste Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarrell, Joshua J.; Joseph, III, Robert Anthony; Howard, Rob L

    2016-09-01

    This report provides an evaluation of the cost implications of incorporating a consolidated interim storage facility (ISF) into the waste management system (WMS). Specifically, the impacts of the timing of opening an ISF relative to opening a repository were analyzed to understand the potential effects on total system costs.

  376. Memory disorders in probable Alzheimer's disease: the role of hippocampal atrophy as shown with MRI

    PubMed Central

    Deweer, B; Lehéricy, S; Pillon, B; Baulac, M; Chiras, J; Marsault, C; Agid, Y; Dubois, B

    1995-01-01

    Magnetic resonance based volumetric measures of the hippocampal formation, amygdala (A), and caudate nucleus (CN), normalised for total intracranial volume (TIV), were analysed in relation to measures of cognitive deterioration and specific features of memory function in 18 patients with probable Alzheimer's disease. Neuropsychological examination included the mini mental state examination (MMSE), the Mattis dementia rating scale (DRS), tests of executive functions, assessment of language abilities and praxis, the Wechsler memory scale (WMS), the California verbal learning test (CVLT) and the Grober and Buschke test. The volume of the hippocampal formation (HF/TIV) was correlated with specific memory variables: memory quotient and paired associates of the WMS; intrusions and discriminability at recognition for the Grober and Buschke test. By contrast, except for intrusions, no correlations were found between memory variables and the volume of the amygdala (A/TIV). No correlations were found between the volume of the caudate nuclei (CN/TIV) and any neuropsychological score. The volume of the hippocampal formation was therefore selectively related to quantitative and qualitative aspects of memory performance in patients with probable Alzheimer's disease. PMID: 7745409

  377. Effects of dietary starch types on early postmortem muscle energy metabolism in finishing pigs

    PubMed

    Li, Y J; Gao, T; Li, J L; Zhang, L; Gao, F; Zhou, G H

    2017-11-01

    This study aimed to investigate the effects of different dietary starch types on early postmortem muscle energy metabolism in finishing pigs. Ninety barrows (68.0 ± 2.0 kg) were randomly allotted to three experimental diets with five replicates of six pigs, containing pure waxy maize starch (WMS), nonwaxy maize starch (NMS), and pea starch (PS) (amylose/amylopectin ratios of 0.07, 0.19 and 0.28, respectively). Compared with the WMS diet, pigs fed the PS diet exhibited greater creatine kinase activity, higher adenosine triphosphate and adenosine diphosphate contents, lower phosphocreatine (PCr), adenosine monophosphate and glycogen contents, and lower glycolytic potential (P<0.05). Moreover, the PS diet led to a reduced percentage of bound hexokinase activity, a decreased level of phosphorylated AKT (P<0.05) and an increased level of hypoxia-inducible factor-1α (P<0.05).
    In conclusion, a diet with high amylose content might promote PCr degradation and inhibit the rate of glycolysis, followed by attenuation of early postmortem glycolysis in finishing pigs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  378. Sensitivity and specificity of memory and naming tests for identifying left temporal-lobe epilepsy

    PubMed

    Umfleet, Laura Glass; Janecek, Julie K; Quasney, Erin; Sabsevitz, David S; Ryan, Joseph J; Binder, Jeffrey R; Swanson, Sara J

    2015-01-01

    The sensitivity and specificity of the Selective Reminding Test (SRT) Delayed Recall, Wechsler Memory Scale (WMS) Logical Memory, the Boston Naming Test (BNT), and two nonverbal memory measures for detecting lateralized dysfunction in association with side of seizure focus was examined in a sample of 143 patients with left or right temporal-lobe epilepsy (TLE). Scores on the SRT and BNT were statistically significantly lower in the left TLE group compared with the right TLE group, whereas no group differences emerged on the Logical Memory subtest. No significant group differences were found with nonverbal memory measures. When the SRT and BNT were both entered as predictors in a logistic regression, the BNT, although significant, added minimal value to the model beyond the variance accounted for by the SRT Delayed Recall. Both variables emerged as significant predictors of side of seizure focus when entered into separate regressions. Sensitivity and specificity of the SRT and BNT ranged from 56% to 65%. The WMS Logical Memory and nonverbal memory measures were not significant predictors of the side of seizure focus.

  379. Simultaneous measurements of multiple parameters at elevated temperature using a frequency-division multiplexing scheme with tunable diode lasers

    PubMed

    Cai, Tingdong; Gao, Guangzhen; Liu, Ying

    2012-10-01

    A multiplexed diode-laser sensor system based on second-harmonic detection of wavelength modulation spectroscopy (WMS) is developed for application at elevated temperatures, with two near-infrared diode lasers multiplexed using a frequency-division multiplexing scheme. One laser is tuned over an H2O line pair near 7079.176 and 7079.855 cm⁻¹, and another laser is tuned over a pair of CO2 and CO lines near 6361.250 and 6361.344 cm⁻¹. Temperature and the concentrations of H2O, CO2, and CO could be measured simultaneously by this system. In order to remove the need for calibration and to correct for transmission variation due to beam steering, mechanical misalignments, soot, and window fouling, the WMS-1f-normalized 2f method is used. Demonstration experiments are conducted in a heated static cell. The precision of temperature and of the concentrations of H2O, CO2, and CO are found to be 1.57%, 3.87%, 3.01%, and 3.58%, respectively. These results illustrate the potential of this sensor for applications at high temperatures.
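For readers unfamiliar with the 1f normalization named in the record above, the commonly used calibration-free formulation can, in its simplest form, be summarized as follows (a standard textbook form, not equations reproduced from the paper). With X_nf and Y_nf denoting the in-phase and quadrature lock-in outputs at the n-th harmonic of the modulation frequency, the harmonic magnitude and the 1f-normalized 2f signal are

\[
S_{1f} = \sqrt{X_{1f}^{2} + Y_{1f}^{2}}, \qquad
S_{2f/1f} = \sqrt{\left(\frac{X_{2f}}{S_{1f}}\right)^{2} + \left(\frac{Y_{2f}}{S_{1f}}\right)^{2}} .
\]

Because every harmonic scales with the same product of detector gain and received laser intensity, that common factor cancels in the ratio, which is why the normalized signal is insensitive to beam steering, window fouling and other broadband transmission losses.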
  380. What Do Pre-Service Physics Teachers Know and Think about Concept Mapping?

    ERIC Educational Resources Information Center

    Didis, Nilüfer; Özcan, Özgür; Azar, Ali

    2014-01-01

    In order to use concept maps in physics classes effectively, teachers' knowledge and ideas about concept mapping are as important as the physics knowledge used in mapping. For this reason, we aimed to examine pre-service physics teachers' knowledge on concept mapping, their ideas about the implementation of concept mapping in physics…

  381. Building Interoperable FHIR-Based Vocabulary Mapping Services: A Case Study of OHDSI Vocabularies and Mappings

    PubMed

    Jiang, Guoqian; Kiefer, Richard; Prud'hommeaux, Eric; Solbrig, Harold R

    2017-01-01

    The OHDSI Common Data Model (CDM) is a deep information model, in which its vocabulary component plays a critical role in enabling consistent coding and querying of clinical data. The objective of the study is to create methods and tools to expose the OHDSI vocabularies and mappings as vocabulary mapping services using two HL7 FHIR core terminology resources, ConceptMap and ValueSet. We discuss the benefits and challenges in building the FHIR-based terminology services.
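To give a feel for the FHIR ConceptMap resource named in the record above, here is a minimal, illustrative sketch that assembles one mapping entry as JSON. The system URIs and codes are placeholders; the actual OHDSI-derived content and service endpoints are described in the paper itself.

```python
# Minimal sketch: build a tiny FHIR (R4-style) ConceptMap with a single
# mapped concept. All systems and codes below are placeholders.
import json

concept_map = {
    "resourceType": "ConceptMap",
    "status": "draft",
    "group": [
        {
            "source": "http://example.org/source-vocabulary",   # placeholder source system
            "target": "http://example.org/target-vocabulary",   # placeholder target system
            "element": [
                {
                    "code": "SRC-001",                # placeholder source code
                    "display": "Example source concept",
                    "target": [
                        {
                            "code": "TGT-042",        # placeholder target code
                            "display": "Example target concept",
                            "equivalence": "equivalent",
                        }
                    ],
                }
            ],
        }
    ],
}

print(json.dumps(concept_map, indent=2))
```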
  382. Measurement of atmospheric carbon dioxide and water vapor in built-up urban areas in the Gandhinagar-Ahmedabad region in India using a portable tunable diode laser spectroscopy system.

    PubMed

    Roy, Anirban; Sharma, Neetesh Kumar; Chakraborty, Arup Lal; Upadhyay, Abhishek

    2017-11-01

    This paper reports open-path in situ measurements of atmospheric carbon dioxide at Gandhinagar (23.2156°N, 72.6369°E) and Ahmedabad (23.0225°N, 72.5714°E) in the heavily industrialized state of Gujarat in western India. Calibration-free second-harmonic wavelength modulation spectroscopy (2f WMS) is used to carry out accurate and fully automated measurements. The mean values of the mole fraction of carbon dioxide at four locations were 438 ppm, 495 ppm, 550 ppm, and 740 ppm, respectively. These values are much higher than the current global average of 406.67 ppm. A 1 mW, 2004-nm vertical-cavity surface-emitting laser is used to selectively interrogate the R16 transition of carbon dioxide at 2003.5 nm (4991.2585 cm⁻¹). The 2f WMS signal corresponding to the gas absorption line shape is simulated using spectroscopic parameters available in the HITRAN database and relevant laser parameters that are extracted in situ from the non-absorbing spectral wings of the harmonic signals. The mole fraction of carbon dioxide is extracted in real time by a MATLAB program from a least-squares fit of the simulated 2f WMS signal to the corresponding experimentally obtained signal. A 10-mW, 1392.54-nm distributed feedback laser is used at two of the locations to carry out water vapor measurements using direct absorption spectroscopy. This is the first instance of a portable tunable diode laser spectroscopy system being deployed in an urban location in India to measure atmospheric carbon dioxide and water vapor under varying traffic conditions. The measurements clearly demonstrate the need to adopt tunable diode laser spectroscopy for precise long-term monitoring of greenhouse gases in the Indian subcontinent.
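    The retrieval step described above, fitting a simulated 2f WMS signal to the measured one to recover the mole fraction, can be sketched with a generic least-squares fit. The line-shape model, scaling constant, and noise level below are stand-ins for the HITRAN-based simulation used by the authors, so treat this purely as an illustration of the fitting pattern.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative 2f-signal model: amplitude proportional to mole fraction times a
# symmetric line shape (a real model would come from a HITRAN-based simulation).
def wms_2f_model(nu, mole_fraction, center, width):
    peak = np.exp(-((nu - center) / width) ** 2)
    side = 0.5 * (np.exp(-((nu - center - 1.8 * width) / width) ** 2)
                  + np.exp(-((nu - center + 1.8 * width) / width) ** 2))
    return mole_fraction * 1.0e-2 * (peak - side)   # arbitrary scaling constant

nu = np.linspace(4991.0, 4991.5, 400)               # wavenumber axis (cm^-1)
true_x = 450e-6                                      # "true" mole fraction, 450 ppm
measured = (wms_2f_model(nu, true_x, 4991.2585, 0.05)
            + 2e-7 * np.random.randn(nu.size))      # synthetic measurement + noise

popt, pcov = curve_fit(wms_2f_model, nu, measured,
                       p0=[400e-6, 4991.25, 0.04])  # initial guesses
print(f"retrieved mole fraction: {popt[0] * 1e6:.1f} ppm")
```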
  383. 77 FR 15369 - Mobility Fund Phase I Auction GIS Data of Potentially Eligible Census Blocks

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-15

    ….fcc.gov/auctions/901/, are the following: downloadable shapefile, Web mapping service, MapBox map tiles… GIS software allows you to add this service as a layer to your session or project. 6. MapBox map tiles are cached map tiles of the data. With this open source software approach, these image tiles can be…

  384. User-driven generation of standard data services

    NASA Astrophysics Data System (ADS)

    Díaz, Laura; Granell, Carlos; Gould, Michael; Huerta, Joaquín

    2010-05-01

    Geospatial information systems are experiencing the shift from monolithic to distributed environments (Bernard, 2003). Current research trends for the discovery and access of geospatial resources in these distributed environments are being addressed by deploying interconnected Spatial Data Infrastructure (SDI) nodes at different scales to build a global spatial information infrastructure (Masser et al., 2008; Rajabifard et al., 2002). One of the challenges for implementing these global and multiscale SDIs is to agree on common standards in consideration of the heterogeneity of the various stakeholders (Masser, 2005). In Europe, the European Commission took the INSPIRE initiative to monitor the development of European SDIs. The INSPIRE Directive addresses the need for web services to discover, view, transform, invoke, and download geospatial resources, which enable various stakeholders to share resources in an interoperable manner (INSPIRE, 2007). Such web services require technical specifications for the interoperability and harmonization of their SDIs (INSPIRE, 2007). Moreover, interoperability is ensured by a number of specification efforts, in the geo domain most prominently by ISO/TC 211 and the OpenGIS Consortium (OGC) (Bernard, 2003). Other research challenges regarding SDIs are, on the one hand, how users in charge of maintaining SDIs can handle their growing complexity and, on the other hand, the fact that SDI maintenance and evolution should be guided (Béjar et al., 2009). There is therefore a motivation to improve the complex deployment mechanisms in SDIs, since expertise and time are needed to deploy resources and integrate them by means of standard services. In this context we present an architecture following the INSPIRE technical guidelines and therefore based on SDI principles. This architecture supports distributed applications and provides components to assist users in deploying and updating SDI resources. Mechanisms and components for the automatic generation and publication of standard geospatial services are therefore proposed. These mechanisms hide the underlying technology and let stakeholders wrap resources as standard services in order to share them in a transparent manner. These components are integrated in our architecture within the Service Framework node (module). Figure 1. Architecture components diagram. Figure 1 shows the components of the architecture: the Application Node provides the entry point for users to run distributed applications. This software component has the user interface and the application logic. The Service Connector component provides the ability to connect to the services available in the middleware layer of the SDI. This node acts as a socket to OGC Web Services.
    For instance, the WMS component implements the OGC WMS specification, which is the standard recommended by the INSPIRE implementing rules as the View Service type. The Service Framework node contains several components. The Service Framework's main functionality is to assist users in wrapping and sharing geospatial resources. It implements the proposed mechanisms to improve the availability and visibility of geospatial resources. The main components of this framework are the Data Wrapper, the Process Wrapper and the Service Publisher. The Data Wrapper and Process Wrapper components guide users in wrapping data and tools as standard services according to the INSPIRE implementing rules (availability). The Service Publisher component aims at creating service metadata and publishing it in catalogues (visibility). Roughly speaking, all of these components act as a service generator and publisher, i.e., they take a resource (data or process) and return an INSPIRE service that will be published in catalogue services. References: Béjar, R., Latre, M. Á., Nogueras-Iso, J., Muro-Medrano, P. R., Zarazaga-Soria, F. J. 2009. International Journal of Geographical Information Science, 23(3), 271-294. Bernard, L., Einspanier, U., Lutz, M., Portele, C. Interoperability in GI Service Chains: The Way Forward. In: M. Gould, R. Laurini & S. Coulondre (Eds.), 6th AGILE Conference on Geographic Information Science 2003, Lyon: 179-188. INSPIRE. Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (2007). Masser, I. GIS Worlds: Creating Spatial Data Infrastructures. Redlands, California: ESRI Press (2005). Masser, I., Rajabifard, A., Williamson, I. 2008. Spatially enabling governments through SDI implementation. International Journal of Geographical Information Science, 22(1), 5-20. Rajabifard, A., Feeney, M-E. F., Williamson, I. P. 2002. Future directions for SDI development. International Journal of Applied Earth Observation and Geoinformation, 4 (2002), 11-22.
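    To make the role of the WMS-based View Service concrete, the sketch below assembles standard WMS 1.3.0 GetCapabilities and GetMap requests from key-value parameters. The endpoint URL and layer name are hypothetical placeholders; the parameter names themselves come from the OGC WMS specification.

```python
from urllib.parse import urlencode

base_url = "https://example.org/geoserver/wms"   # hypothetical View Service endpoint

capabilities_url = base_url + "?" + urlencode({
    "SERVICE": "WMS",
    "REQUEST": "GetCapabilities",
    "VERSION": "1.3.0",
})

getmap_url = base_url + "?" + urlencode({
    "SERVICE": "WMS",
    "REQUEST": "GetMap",
    "VERSION": "1.3.0",
    "LAYERS": "example:landcover",      # hypothetical layer name
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "38.0,-2.0,41.0,1.0",       # minlat,minlon,maxlat,maxlon (WMS 1.3.0 axis order)
    "WIDTH": 800,
    "HEIGHT": 800,
    "FORMAT": "image/png",
})

print(capabilities_url)
print(getmap_url)
```

    Any resource wrapped as such a service becomes immediately usable in standard GIS clients, which is the interoperability benefit the architecture aims for.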
  385. NASA Tech Briefs, September 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Topics covered include: Rapid Fabrication of Carbide Matrix/Carbon Fiber Composites; Coating Thermoelectric Devices To Suppress Sublimation; Ultrahigh-Temperature Ceramics; Improved C/SiC Ceramic Composites Made Using PIP; Coating Carbon Fibers With Platinum; Two-Band, Low-Loss Microwave Window; MCM Polarimetric Radiometers for Planar Arrays; Aperture-Coupled Thin-Membrane L-Band Antenna; WGM-Based Photonic Local Oscillators and Modulators; Focal-Plane Arrays of Quantum-Dot Infrared Photodetectors; Laser Range and Bearing Finder With No Moving Parts; Microrectenna: A Terahertz Antenna and Rectifier on a Chip; Miniature L-Band Radar Transceiver; Robotic Vision-Based Localization in an Urban Environment; Programs for Testing an SSME-Monitoring System; Cathodoluminescent Source of Intense White Light; Displaying and Analyzing Antenna Radiation Patterns; Payload Operations Support Team Tools; Space-Shuttle Emulator Software; Soft Real-Time PID Control on a VME Computer; Analyzing Radio-Frequency Coverage for the ISS; Nanorod-Based Fast-Response Pressure-Sensitive Paints; Capacitors Would Help Protect Against Hypervelocity Impacts; Diaphragm Pump With Resonant Piezoelectric Drive; Improved Quick-Release Pin Mechanism; Designing Rolling-Element Bearings; Reverse-Tangent Injection in a Centrifugal Compressor; Inertial Measurements for Aero-assisted Navigation (IMAN); Analysis of Complex Valve and Feed Systems; Improved Path Planning Onboard the Mars Exploration Rovers; Robust, Flexible Motion Control for the Mars Explorer Rovers; Solar Sail Spaceflight Simulation; Fluorine-Based DRIE of Fused Silica; Mechanical Alloying for Making Thermoelectric Compounds; Process for High-Rate Fabrication of Alumina Nanotemplates; Electroform/Plasma-Spray Laminates for X-Ray Optics; An Automated Flying-Insect Detection System; Calligraphic Poling of Ferroelectric Material; Blackbody Cavity for Calibrations at 200 to 273 K; KML Super Overlay to WMS Translator; High-Performance Tiled WMS and KML Web Server; Modeling of Radiative Transfer in Protostellar Disks; Composite Pulse Tube; Photometric Calibration of Consumer Video Cameras; Criterion for Identifying Vortices in High-Pressure Flows; Amplified Thermionic Cooling Using Arrays of Nanowires; Delamination-Indicating Thermal Barrier Coatings; Preventing Raman Lasing in High-Q WGM Resonators; Procedures for Tuning a Multiresonator Photonic Filter; Robust Mapping of Incoherent Fiber-Optic Bundles; Extended-Range Ultrarefractive 1D Photonic Crystal Prisms; Rapid Analysis of Mass Distribution of Radiation Shielding; Modeling Magnetic Properties in EZTB; Deep Space Network Antenna Logic Controller; Modeling Carbon and Hydrocarbon Molecular Structures in EZTB; BigView Image Viewing on Tiled Displays; and Imaging Sensor Flight and Test Equipment Software.
href="http://adsabs.harvard.edu/abs/2012AGUFMIN33C1552S"><span>Global Near Real-Time Satellite-based Flood Monitoring and Product Dissemination</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Smith, M.; Slayback, D. A.; Policelli, F.; Brakenridge, G. R.; Tokay, M.</p> <p>2012-12-01</p> <p>Flooding is among the most destructive, frequent, and costly natural disasters faced by modern society, with several major events occurring each year. In the past few years, major floods have devastated parts of China, Thailand, Pakistan, Australia, and the Philippines, among others. The toll of these events, in financial costs, displacement of individuals, and deaths, is substantial and continues to rise as climate change generates more extreme weather events. When these events do occur, the disaster management community requires frequently updated and easily accessible information to better understand the extent of flooding and better coordinate response efforts. With funding from NASA's Applied Sciences program, we have developed, and are now operating, a near real-time global flood mapping system to help provide critical flood extent information within 24-48 hours after flooding events. The system applies a water detection algorithm to MODIS imagery received from the LANCE (Land Atmosphere Near real-time Capability for EOS) system at NASA Goddard. The LANCE system typically processes imagery in less than 3 hours after satellite overpass, and our flood mapping system can output flood products within ½ hour of acquiring the LANCE products. Using imagery from both the Terra (10:30 AM local time overpass) and Aqua (1:30 PM) platforms allows an initial assessment of flooding extent by late afternoon, every day, and more robust assessments after accumulating imagery over a longer period; the MODIS sensors are optical, so cloud cover remains an issue, which is partly overcome by using multiple looks over one or more days. Other issues include the relatively coarse scale of the MODIS imagery (250 meters), the difficulty of detecting flood waters in areas with continuous canopy cover, confusion of shadow (cloud or terrain) with water, and accurately identifying detected water as flood as opposed to normal water extents. We have made progress on some of these issues, and are working to develop higher resolution flood detection using alternate sensors, including Landsat and various radar sensors. Although these provide better spatial resolution, this comes at the cost of being less timely. As of late 2011, the system expanded to fully global daily flood monitoring, with free public access to the generated products. These include GIS-ready files of flood and normal water extent (KML, shapefile, raster), and small scale graphic maps (10 degrees square) showing regional flood extent. We are now expanding product distribution channels to include live web services (WMS, etc), allowing easier access via standalone apps. 
  387. Land and Atmosphere Near-Real-Time Capability for Earth Observing System

    NASA Technical Reports Server (NTRS)

    Murphy, Kevin J.

    2011-01-01

    The past decade has seen a rapid increase in the availability and usage of near-real-time data from satellite sensors. The EOSDIS (Earth Observing System Data and Information System) was not originally designed to provide data with sufficiently low latency to satisfy the requirements of near-real-time users. The EOS (Earth Observing System) instruments aboard the Terra, Aqua and Aura satellites make global measurements daily, which are processed into higher-level 'standard' products within 8-40 hours of observation and then made available to users, primarily earth science researchers. However, applications users, operational agencies, and even researchers desire EOS products in near-real-time to support research and applications, including numerical weather and climate prediction and forecasting, monitoring of natural hazards, ecological/invasive species, agriculture, air quality, disaster relief and homeland security. These users often need data much sooner than routine science processing allows, usually within 3 hours, and are willing to trade science product quality for timely access. While Direct Broadcast provides more timely access to data, it does not provide global coverage. In 2002, a joint initiative between NASA (National Aeronautics and Space Administration), NOAA (National Oceanic and Atmospheric Administration), and the DOD (Department of Defense) was undertaken to provide data from EOS instruments in near-real-time. The NRTPE (Near Real Time Processing Effort) provided products within 3 hours of observation on a best-effort basis. As the popularity of these near-real-time products and applications grew, multiple near-real-time systems began to spring up, such as the Rapid Response System. Recognizing the dependence of customers on these data and the need for highly reliable and timely data access, NASA's Earth Science Division sponsored the Earth Science Data and Information System (ESDIS) Project-led development of a new near-real-time system called LANCE (Land, Atmosphere Near-Real-Time Capability for EOS) in 2009. LANCE consists of special processing elements, co-located with selected EOSDIS data centers and processing facilities. A primary goal of LANCE is to bring multiple near-real-time systems under one umbrella, offering commonality in data access, quality control, and latency. LANCE now processes and distributes data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), Advanced Microwave Scanning Radiometer Earth Observing System (AMSR-E), Microwave Limb Sounder (MLS) and Ozone Monitoring Instrument (OMI) instruments within 3 hours of satellite observation. The Rapid Response System and the Fire Information for Resource Management System (FIRMS) capabilities will be incorporated into LANCE in 2011. LANCE maintains a central website to facilitate easy access to data and user services. LANCE products are extensively tested and compared with science products before being made available to users. Each element also plans to implement redundant network, power and server infrastructure to ensure high availability of data and services. Through the user registration system, users are informed of any data outages and when new products or services will be available for access. Building on a significant investment by NASA in developing science algorithms and products, LANCE creates products that have a demonstrated utility for applications requiring near-real-time data. From lower-level data products such as calibrated geolocated radiances to higher-level products such as sea ice extent, snow cover, and cloud cover, users have integrated LANCE data into forecast models and decision support systems. The table above shows the current near-real-time product categories by instrument. The ESDIS Project continues to improve the LANCE system and uses the experience gained through practice to seek adjustments that improve the quality and performance of the system. For example, an OGC-compliant Web Map Service (WMS) will be added shortly that will allow users to download geo-referenced MODIS images for arbitrary bounding boxes. Further, an OGC-compliant Web Coverage Service (WCS) will be added later this year that will expedite user access to arbitrary data subsets or re-formatted products. AIRS images are now served through WMS and available in multiple formats (PNG, GeoTIFF, KMZ). NASA has established a LANCE User Working Group to steer the development of the system and create a forum for sharing ideas and experiences that are expected to further improve the LANCE capabilities. The LANCE system has proved a success by satisfying the growing needs of the applications and operational communities for land and atmosphere data in near-real-time. NASA's Earth Sciences Division was able to leverage existing science research capabilities to provide the near-real-time community with products and imagery that support monitoring of disasters in a timely manner.
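    The abstract mentions an OGC-compliant Web Coverage Service (WCS) for retrieving arbitrary data subsets; the snippet below shows what such a subset request can look like in WCS 2.0.1 key-value encoding, fetched with the requests library. The endpoint, coverage identifier, and output format are hypothetical placeholders, not documented LANCE services.

```python
import requests

base_url = "https://example.nasa.gov/wcs"        # hypothetical endpoint
params = [
    ("SERVICE", "WCS"),
    ("VERSION", "2.0.1"),
    ("REQUEST", "GetCoverage"),
    ("COVERAGEID", "MODIS_Aqua_SurfaceReflectance"),   # hypothetical coverage id
    ("SUBSET", "Lat(20.0,30.0)"),                      # arbitrary bounding box
    ("SUBSET", "Long(70.0,80.0)"),
    ("FORMAT", "image/tiff"),
]

response = requests.get(base_url, params=params, timeout=60)
response.raise_for_status()
with open("subset.tif", "wb") as f:
    f.write(response.content)   # georeferenced subset returned by the server
```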
  388. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper addresses easily accessible, integrated web-based analysis of satellite images with plug-in based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software.
    Web-based open source software solutions are often a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is PC-based GIS and remote sensing software, comprising a complete package of image processing, spatial analysis and digital mapping, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software under the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins which convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on their machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by building plug-ins, and we outline our plans to implement other OGC standards, such as WCS and WPS, in the same context. Especially the latter can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even its execution in scalable and on-demand cloud computing environments.

  389. Development of Integration Framework for Sensor Network and Satellite Image based on OGC Web Services

    NASA Astrophysics Data System (ADS)

    Ninsawat, Sarawut; Yamamoto, Hirokazu; Kamei, Akihide; Nakamura, Ryosuke; Tsuchida, Satoshi; Maeda, Takahisa

    2010-05-01

    With the availability of network-enabled sensing devices, the volume of information being collected by networked sensors has increased dramatically in recent years. Over 100 physical, chemical and biological properties can be sensed using in-situ or remote sensing technology. A collection of these sensor nodes forms a sensor network, which is easily deployable to provide a high degree of visibility into real-world physical processes as events unfold.
    A sensor observation network allows diverse types of data to be gathered at greater spatial and temporal resolution through the use of wired or wireless network infrastructure; real-time or near-real-time data from such a network therefore allow researchers and decision-makers to respond speedily to events. However, in the case of environmental monitoring, the capability to acquire in-situ data periodically is not sufficient on its own; the management and proper utilization of the data also need careful consideration. This requires the implementation of database and IT solutions that are robust, scalable and able to interoperate between different and distributed stakeholders, so as to provide lucid, timely and accurate updates to researchers, planners and citizens. The GEO (Global Earth Observation) Grid primarily aims at providing an e-Science infrastructure for the earth science community. The GEO Grid is designed to integrate various kinds of earth observation data using grid technology, which is developed for sharing data, storage, and the computational power of high-performance computing, and is accessible as a set of services. A comprehensive web-based system for integrating field sensor data and satellite imagery based on various open standards of OGC (Open Geospatial Consortium) specifications has been developed. The Web Processing Service (WPS), which is most likely the future direction of Web-GIS, performs the computation of spatial data from distributed data sources and returns the outcome in a standard format. The interoperability capabilities and Service Oriented Architecture (SOA) of web services allow sensor network measurements available from a Sensor Observation Service (SOS) and satellite remote sensing data from a Web Mapping Service (WMS) to be combined as distributed data sources for the WPS. Various applications have been developed to demonstrate the efficacy of integrating heterogeneous data sources, for example the validation of the MODIS aerosol products (MOD08_D3, the Level-3 MODIS Atmosphere Daily Global Product) by ground-based measurements using the sunphotometer (skyradiometer, Prede POM-02) installed at Phenological Eyes Network (PEN) sites in Japan. Furthermore, a web-based framework system for studying the relationship between the Vegetation Index calculated from MODIS surface reflectance (MOD09GA, the Surface Reflectance Daily L2G Global 1 km and 500 m Product) and Gross Primary Production (GPP) field measurements at flux tower sites in Thailand and Japan has also been developed. The success of both applications will contribute to maximizing data utilization and improving the accuracy of information by validating MODIS satellite products against highly accurate, temporally dense field measurement data.
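    To illustrate how in-situ measurements are typically pulled from a Sensor Observation Service before being combined with WMS imagery in a WPS workflow, the sketch below issues an SOS 2.0 GetObservation request in key-value encoding. The service endpoint, offering, and observed-property URIs are hypothetical placeholders.

```python
import requests

sos_url = "https://example.org/sos/service"   # hypothetical SOS endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "http://example.org/offering/pen-site-1",                     # hypothetical
    "observedProperty": "http://example.org/property/aerosol-optical-thickness",
    "temporalFilter": "om:phenomenonTime,2010-05-01T00:00:00Z/2010-05-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(sos_url, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])   # O&M XML document carrying the requested observations
```

    A WPS process would then parse the returned observations, fetch the matching satellite pixels through WMS or WCS, and compute the comparison statistics.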
  390. Social Competence and Social Skills Training and Intervention for Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Cotugno, Albert J.

    2009-01-01

    This study examined the effectiveness of a 30-week social competence and social skills group intervention program with children, ages 7-11, diagnosed with Autism Spectrum Disorders (ASD). Eighteen children with ASD were assessed with pretreatment and posttreatment measures on the Walker-McConnell Scale (WMS) and the MGH YouthCare Social Competence…

  391. The US Wilderness Managers Survey: Charting a path for the future

    Treesearch

    Chad P. Dawson; Ken Cordell; Alan E. Watson; Ramesh Ghimire; Gary T. Green

    2016-01-01

    The Wilderness Manager Survey (WMS) was developed in 2014 to support interagency strategic planning for the National Wilderness Preservation System (NWPS) and asked managers about their perceived threats to the NWPS, the need for science information to support decision-making, the need for education and training, and the most important problems for managers in the…

  392. EnviroAtlas National Layers Master Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). This web service includes layers depicting EnviroAtlas national metrics mapped at the 12-digit HUC within the conterminous United States. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service.
    Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  393. A multi-indicator framework for mapping cultural ecosystem services: The case of freshwater recreational fishing

    USGS Publications Warehouse

    Villamagna, Amy M.; Mogollón, Beatriz; Angermeier, Paul

    2014-01-01

    Despite recent interest, ecosystem services are not yet fully incorporated into private and public decisions about natural resource management. Cultural ecosystem services (CES) are among the most challenging services to include because they comprise complex ecological and social properties and processes that make them difficult to measure, map or monetize. Like others, CES are vulnerable to landscape changes and unsustainable use. To date, the sustainability of services has not been adequately addressed, and few studies have considered measures of service capacity and demand simultaneously. To facilitate sustainability assessments and management of CES, our study objectives were to (1) develop a spatially explicit framework for mapping the capacity of ecosystems to provide freshwater recreational fishing, an important cultural service, (2) map societal demand for freshwater recreational fishing based on license data and identify areas of potential overuse, and (3) demonstrate how maps of relative capacity and relative demand could be interfaced to estimate the sustainability of a CES. We mapped freshwater recreational fishing capacity at the 12-digit hydrologic unit scale in North Carolina and Virginia using a multi-indicator service framework incorporating biophysical and social landscape metrics, and mapped demand based on fishing license data. Mapping of capacity revealed a gradual decrease in capacity eastward from the mountains to the coastal plain, and fishing demand was greatest in urban areas. When comparing standardized relative measures of capacity and demand for freshwater recreational fishing, we found that ranks of capacity exceeded ranks of demand in most hydrologic units, except in 17% of North Carolina and 5% of Virginia. Our GIS-based approach to viewing freshwater recreational fishing through an ecosystem service lens will enable scientists and managers to examine (1) biophysical and social factors that foster or diminish cultural ecosystem service delivery, (2) demand for cultural ecosystem services relative to their capacity, and (3) ecological pressures like potential overuse that affect service sustainability. Ultimately, we expect such analyses to inform decision-making for freshwater recreational fisheries and other cultural ecosystem services.
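    The capacity-versus-demand comparison described above can be reproduced in outline with a few lines of pandas: standardize both indicators to ranks per hydrologic unit and flag units where demand outranks capacity. The column names and sample values below are invented for illustration; they are not the study's data.

```python
import pandas as pd

# Hypothetical per-HUC12 indicator values (not the study's data).
huc = pd.DataFrame({
    "huc12":    ["030101010101", "030101010102", "030101010103", "030101010104"],
    "capacity": [0.82, 0.41, 0.65, 0.30],   # multi-indicator capacity score
    "demand":   [0.20, 0.55, 0.60, 0.75],   # license-based demand score
})

# Standardize to percentile ranks so the two indicators are comparable.
huc["capacity_rank"] = huc["capacity"].rank(pct=True)
huc["demand_rank"] = huc["demand"].rank(pct=True)

# Potential overuse: demand rank exceeds capacity rank.
huc["potential_overuse"] = huc["demand_rank"] > huc["capacity_rank"]
print(huc[["huc12", "capacity_rank", "demand_rank", "potential_overuse"]])
```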
  394. Concept of a spatial data infrastructure for web-mapping, processing and service provision for geo-hazards

    NASA Astrophysics Data System (ADS)

    Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara

    2017-04-01

    Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation. They are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards as well as web-based risk maps and decision support systems. In addition, the European Commission implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user in the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data, other geospatial data on geo-hazards, as well as descriptions and protocols for data processing and analysis. An interface to extend the data integration from external sources (e.g. Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services are part of the service tier. Sub-services are, for example, map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources and (4) an infrastructure service to identify affected landslides. The main services use and combine parts of the sub-services.
    Furthermore, a series of client applications based on new technology standards make use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, in order to develop a spatial data infrastructure that can assist targeted mapping and monitoring of geo-hazards in a global context.

  395. 37. Photo copy of map, (original in Forest Service Office, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. Photo copy of map, (original in Forest Service Office, Elkins, WV, 'Blister Rust Survey Map), 1930. PARSONS NURSERY SITE PLAN - Parsons Nursery, South side of U.S. Route 219, Parsons, Tucker County, WV

  396. How to Evaluate Training.

    DTIC Science & Technology

    1982-09-30

    …not changed because they are not subject to a careful evaluation. The solution: the four job aids contained in this manual provide specific techniques… lesson plans, training design, or testing. This manual has been developed using the standards of the Information Mapping writing service. Information Mapping, Inc.

  397. Using Ecological Asset Mapping to Investigate Pre-Service Teachers' Cultural Assets

    ERIC Educational Resources Information Center

    Borrero, Noah; Yeh, Christine

    2016-01-01

    We examined the impact of a pedagogical strategy, ecological asset mapping, on 19 pre-service teachers' self-exploration, development of respect for others, and critical examination of social injustice. Data were analyzed from participants' ecological asset maps and essays describing the experience of completing and sharing the maps.
    The analysis…

  398. Journey Mapping the User Experience

    ERIC Educational Resources Information Center

    Samson, Sue; Granath, Kim; Alger, Adrienne

    2017-01-01

    This journey-mapping pilot study was designed to determine whether journey mapping is an effective method to enhance the student experience of using the library by assessing our services from their point of view. Journey mapping plots a process or service to produce a visual representation of a library transaction--from the point at which the…

  399. Climate Prediction Center - Expert Assessments Index

    Science.gov Websites

    National Weather Service, Climate Prediction Center: Monitoring and Data > Global Climate Data & Maps > Regional Climate Maps. The monthly regional analyses products are…

  400. Place and provision: mapping mental health advocacy services in London.

    PubMed

    Foley, Ronan; Platzer, Hazel

    2007-02-01

    The National Health Service (NHS) Executive for London carried out an investigation in 2002 as part of their wider mental health strategy to establish whether existing mental health advocacy provision in the city was meeting need. The project took a two-part approach, with an emphasis on (a) mapping the provision of advocacy services and (b) cartographic mapping of service location and catchments. Data were collected through a detailed questionnaire with service providers in collaboration with the Greater London Mental Health Advocacy Network (GLMHAN) and additional health and government sources. The service mapping identified key statistics on funding, caseloads and models of service provision, with an additional emphasis on coverage, capacity, and funding stability. The questionnaire was augmented by interviews and focus groups with commissioners, service providers and service users, which identified differing perspectives and problems for each of these groups. The cartographic mapping exercise demonstrated a spatially even provision of mental health advocacy services across the city, with each borough being served by at least one local service as well as by London-wide specialist schemes.
    However, at the local level, no one borough had the full range of specialist provision to match local demographic need. Ultimately, the research assisted the Advisory Group in providing commissioning agencies with clear information on the current status of city-wide mental health advocacy services, and on gaps in existing advocacy provision alongside previously unconsidered geographical and service dimensions of that provision.

  401. Fiscal mapping autism spectrum disorder funds: a case study of Ohio.

    PubMed

    Joyce, Hilary D; Hoffman, Jill; Anderson-Butcher, Dawn; Moodie-Dyer, Amber

    2014-01-01

    Individuals with autism spectrum disorders (ASDs) have complex needs requiring regular service utilization. Policymakers, administrators, and community leaders are looking for ways to finance ASD services and systems. Understanding the fiscal resources that support ASD services is essential. This article uses fiscal mapping to explore ASD funding streams in Ohio. Fiscal mapping steps are overviewed to assist ASD stakeholders in identifying and examining ASD-related funding. Implications are drawn related to how fiscal mapping could be used to identify and leverage funding for ASD services.
    The resulting information is critical to utilizing existing resources, advocating for resources, and leveraging available funds.

  402. Mayo's Older American Normative Studies: Separate Norms for WMS-R Logical Memory Stories.

    ERIC Educational Resources Information Center

    Smith, Glenn E.; Wong, Jennifer S.; Ivnik, Robert J.; Malec, James F.

    1997-01-01

    Norms are presented for persons ages 56 to 93 years for each story from the Logical Memory subtests of the revised edition of the Wechsler Memory Scale, following the methods used in other Mayo's Older American Normative Studies. Means and standard deviations are presented for 3-year interval age groups from age 61 to 88. (SLD)

  403. Examining the Impact of Art-Based Anchor Charts on Academic Achievement in Language Arts

    ERIC Educational Resources Information Center

    Fontanez, Kimberly

    2017-01-01

    The students at 2 middle schools in County SD, NHMS and WMS, are not scoring on or above grade level on the information text portion of the English Language Arts (ELA) standardized SC Palmetto Assessment of State Standards (SCPASS) test given annually in South Carolina. The teachers developed and implemented art-based anchor charts to help close…

  404. Enabling Web-Based GIS Tools for Internet and Mobile Devices To Improve and Expand NASA Data Accessibility and Analysis Functionality for the Renewable Energy and Agricultural Applications

    NASA Astrophysics Data System (ADS)

    Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.

    2014-12-01

    The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry and private-sector individuals. The major objectives of this effort include: 1) processing and reformulation of current data to be consistent with ESRI and OpenGIS tools; 2) development of functions to improve capability and analysis that produce "on-the-fly" data products, extending these beyond the single location to regional and global scales;
    3) updating the current web sites to enable both web-based and mobile application displays optimized for mobile platforms; 4) interaction with user communities in government and industry to test formats and usage; and 5) development of a series of metrics that allow progressive performance to be monitored. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and to provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency and cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the products available to the wider public.
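    The "on-the-fly" climatological products mentioned above boil down to grouping a time series by climatological period and averaging; the sketch below shows that pattern for monthly means of a daily series with pandas. The variable name and synthetic values are illustrative only, not SSE data.

```python
import numpy as np
import pandas as pd

# Synthetic daily irradiance series for one grid cell (illustrative values).
days = pd.date_range("2010-01-01", "2013-12-31", freq="D")
irradiance = pd.Series(
    5.0 + 2.0 * np.sin(2 * np.pi * days.dayofyear / 365.25)
    + 0.3 * np.random.randn(days.size),
    index=days,
    name="kWh_m2_day",
)

# Climatological monthly means: average all Januaries together, all Februaries, etc.
climatology = irradiance.groupby(irradiance.index.month).mean()
print(climatology.round(2))
```

    In a spatial database the same grouping is applied per grid cell or per query region, and the result is exposed through the OGC services listed in the abstract.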
  405. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, with a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g. RESTful APIs). The Call for Participation will be issued in December, and responses are due in mid-January 2018.

  406. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2016-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds and linking to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, with a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g. RESTful APIs). The Call for Participation will be issued in December, and responses are due in mid-January 2018.

  407. Bacterial Exchange in Household Washing Machines

    PubMed Central

    Callewaert, Chris; Van Nevel, Sam; Kerckhof, Frederiek-Maarten; Granitsiotis, Michael S.; Boon, Nico

    2015-01-01

    Household washing machines (WMs) launder soiled clothes and textiles, but do not sterilize them. We investigated the microbial exchange occurring in five household WMs. Samples from a new cotton T-shirt were laundered together with a normal laundry load.
Distinguishing Depressive Pseudodementia from Alzheimer Disease: A Comparative Study of Hippocampal Volumetry and Cognitive Tests

PubMed Central

Sahin, Sevki; Okluoglu Önal, Tugba; Cinar, Nilgun; Bozdemir, Meral; Çubuk, Rahmi; Karsidag, Sibel

2017-01-01

Background and Aim: Depressive pseudodementia (DPD) is a condition which may develop secondary to depression. The aim of this study was to contribute to the differential diagnosis between Alzheimer disease (AD) and DPD by comparing neurocognitive tests and hippocampal volume. Materials and Methods: Patients who met criteria for AD/DPD were enrolled in the study. All patients were assessed using the Wechsler Memory Scale (WMS), clock-drawing test, Stroop test, Benton Facial Recognition Test (BFRT), Boston Naming Test, Mini-Mental State Examination (MMSE), and Geriatric Depression Scale (GDS). Hippocampal volume was measured by importing the coronal T1-weighted magnetic resonance images to the Vitrea 2 workstation. Results: A significant difference was found between the AD and DPD groups on the WMS test, clock-drawing test, Stroop test, Boston Naming Test, MMSE, GDS, and left hippocampal volume. A significant correlation between BFRT and bilateral hippocampal volumes was found in the AD group. No correlation was found among parameters in DPD patients. Conclusions: Our results suggest that evaluation of facial recognition and left hippocampal volume may provide more reliable evidence for distinguishing DPD from AD. Further investigations combined with functional imaging techniques, including more patients, are needed. PMID:28868066
Automated data acquisition technology development: Automated modeling and control development

NASA Technical Reports Server (NTRS)

Romine, Peter L.

1995-01-01

This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, the WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
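The calibration described above reduces to fitting a voltage-versus-arc-length relationship at a fixed current. The sketch below shows one generic way such a relationship could be fitted and inverted with a least-squares line; the sample values are invented for illustration and are not measurements from the WMS.

```python
# Illustrative least-squares fit of arc voltage against arc length at a
# fixed current, the kind of calibration relationship described above.
# The sample data are invented for demonstration only.
import numpy as np

arc_length_mm = np.array([2.0, 3.0, 4.0, 5.0, 6.0])      # hypothetical measurements
arc_voltage_v = np.array([21.5, 23.1, 24.8, 26.2, 27.9])  # hypothetical measurements

# Fit V = a * L + b
a, b = np.polyfit(arc_length_mm, arc_voltage_v, deg=1)
print(f"V ~= {a:.2f} * L + {b:.2f}")

# Invert the fit to estimate arc length from a measured voltage
measured_v = 25.0
estimated_length = (measured_v - b) / a
print(f"Estimated arc length: {estimated_length:.2f} mm")
```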
Effects of dietary starch types on growth performance, meat quality and myofibre type of finishing pigs

PubMed

Li, Y. J.; Li, J. L.; Zhang, L.; Gao, F.; Zhou, G. H.

2017-09-01

To investigate the effects of dietary starch types on growth performance and meat quality of finishing pigs, ninety barrows (68.0 ± 2.0 kg) were randomly allotted to three experimental diets with five replicates of six pigs, containing pure waxy maize starch (WMS), non-waxy maize starch (NMS) and pea starch (PS) (amylose/amylopectin ratios of 0.07, 0.19 and 0.28, respectively). Compared with the WMS diet, the PS diet increased the average daily gain, loin eye area, pH45 value, NMR transverse relaxation (T2) T22 peak area ratio and sarcoplasmic protein solubility, and decreased the feed-to-gain ratio, back fat, drip loss, cooking loss and T23 peak area ratio (P < 0.05). Moreover, the PS diet increased the myosin heavy-chain (MyHC)-I and IIa levels, decreased the MyHC-IIb level, decreased the miR23a level and increased its target gene level, and increased the miR499 level and decreased its target gene level (P < 0.05). A diet with high amylose content might be beneficial to the growth performance and meat quality of finishing pigs. Copyright © 2017 Elsevier Ltd. All rights reserved.

Substitution of California Verbal Learning Test, second edition for Verbal Paired Associates on the Wechsler Memory Scale, fourth edition

PubMed

Miller, Justin B.; Axelrod, Bradley N.; Rapport, Lisa J.; Hanks, Robin A.; Bashem, Jesse R.; Schutte, Christian

2012-01-01

Two common measures used to evaluate verbal learning and memory are the Verbal Paired Associates (VPA) subtest from the Wechsler Memory Scales (WMS) and the second edition of the California Verbal Learning Test (CVLT-II). For the fourth edition of the WMS, scores from the CVLT-II can be substituted for VPA; the present study sought to examine the validity of the substitution. For each substitution, paired-samples t tests were conducted between original VPA scaled scores and scaled scores obtained from the CVLT-II substitution to evaluate comparability. Similar comparisons were made at the index score level. At the index score level, substitution resulted in significantly lower scores for the AMI (p = .03; r = .13) but not for the IMI (p = .29) or DMI (p = .09). For the subtest scores, substituted scaled scores for VPA were not significantly different from original scores for the immediate recall condition (p = .20) but were significantly lower at delayed recall (p = .01). These findings offer partial support for the substitution. For both the immediate and delayed conditions, the substitution produced generally lower subtest scores compared to original VPA subtest scores.
Multi-species laser absorption sensors for in situ monitoring of syngas composition

NASA Astrophysics Data System (ADS)

Sur, Ritobrata; Sun, Kai; Jeffries, Jay B.; Hanson, Ronald K.

2014-04-01

Tunable diode laser absorption spectroscopy sensors for detection of CO, CO2, CH4 and H2O at elevated pressures in mixtures of synthesis gas (syngas: products of coal and/or biomass gasification) were developed and tested. Wavelength modulation spectroscopy (WMS) with 1f-normalized 2f detection was employed. Fiber-coupled DFB diode lasers operating at 2325, 2017, 2290 and 1352 nm were used for simultaneously measuring CO, CO2, CH4 and H2O, respectively. Criteria for the selection of transitions were developed, and transitions were selected to optimize the signal and minimize interference from other species. For quantitative WMS measurements, the collision-broadening coefficients of the selected transitions were determined for collisions with possible syngas components, namely CO, CO2, CH4, H2O, N2 and H2. Sample measurements were performed for each species in gas cells at a temperature of 25 °C up to pressures of 20 atm. To validate the sensor performance, the composition of synthetic syngas was determined by the absorption sensor and compared with the known values. A method of estimating the lower heating value and Wobbe index of the syngas mixture from these measurements was also demonstrated.

Measurements of high-pressure CO2 absorption near 2.0 μm and implications on tunable diode laser sensor design

NASA Astrophysics Data System (ADS)

Rieker, G. B.; Jeffries, J. B.; Hanson, R. K.

2009-01-01

A tunable diode laser (TDL) is used to measure the absorption spectra of the R46 through R54 transitions of the 20012 ← 00001 band of CO2 near 2.0 μm (5000 cm-1) at room temperature and pressures to 10 atm (densities to 9.2 amagat). Spectra are recorded using direct absorption spectroscopy and wavelength modulation spectroscopy with second-harmonic detection (WMS-2f) in a mixture containing 11% CO2 in air. The direct absorption spectra are influenced by non-Lorentzian effects, including finite-duration collisions which perturb far-wing absorption, and an empirical χ-function correction to the Voigt line shape is shown to greatly reduce error in the spectral model. WMS-2f spectra are shown to be at least a factor of four less influenced by non-Lorentzian effects in this region, making this approach more resistant to errors in the far-wing line shape model and allowing a comparison between the spectral parameters of HITRAN and a new database which includes pressure-induced shift coefficients. The implications of these measurements on practical, high-pressure CO2 sensor design are discussed.
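The Voigt line shape that the CO2 study corrects with an empirical χ-function can be evaluated from the complex error (Faddeeva) function. The sketch below is the textbook Voigt profile, not the authors' corrected spectral model; the line center and widths are hypothetical values chosen only to place the line near 5000 cm-1.

```python
# Textbook Voigt profile evaluated with the Faddeeva function w(z).
# This is a generic line-shape sketch, not the chi-corrected model used
# in the study above; line center and widths are hypothetical.
import numpy as np
from scipy.special import wofz

def voigt_profile(nu, nu0, sigma, gamma):
    """Area-normalized Voigt profile.

    nu    : wavenumber grid
    nu0   : line center
    sigma : Gaussian (Doppler) standard deviation
    gamma : Lorentzian (collisional) half-width at half-maximum
    """
    z = ((nu - nu0) + 1j * gamma) / (sigma * np.sqrt(2.0))
    return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

nu = np.linspace(4999.0, 5001.0, 2001)
phi = voigt_profile(nu, nu0=5000.0, sigma=0.005, gamma=0.05)
print(phi.max(), np.trapz(phi, nu))  # peak value and approximately unit area
```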
A GIS application for assessing, mapping, and quantifying the social values of ecosystem services

USGS Publications Warehouse

Sherrouse, Benson C.; Clement, Jessica M.; Semmens, Darius J.

2011-01-01

As human pressures on ecosystems continue to increase, research involving the effective incorporation of social values information into the context of comprehensive ecosystem services assessments is becoming more important. Including quantified, spatially explicit social value metrics in such assessments will improve the analysis of relative tradeoffs among ecosystem services. This paper describes a GIS application, Social Values for Ecosystem Services (SolVES), developed to assess, map, and quantify the perceived social values of ecosystem services by deriving a non-monetary Value Index from responses to a public attitude and preference survey. SolVES calculates and maps the Value Index for social values held by various survey subgroups, as distinguished by their attitudes regarding ecosystem use. Index values can be compared within and among survey subgroups to explore the effect of social contexts on the valuation of ecosystem services. Index values can also be correlated and regressed against landscape metrics SolVES calculates from various environmental data layers. Coefficients derived through these analyses were applied to their corresponding data layers to generate a predicted social value map. This map compared favorably with other SolVES output and led to the addition of a predictive mapping function to SolVES for value transfer to areas where survey data are unavailable. A more robust application is being developed as a public domain tool for decision makers and researchers to map social values of ecosystem services and to facilitate discussions among diverse stakeholders involving relative tradeoffs among different ecosystem services in a variety of physical and social contexts.
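The value-transfer step described above amounts to regressing a survey-derived Value Index against landscape metrics and then applying the coefficients where no surveys exist. The sketch below shows that general regress-then-predict pattern with ordinary least squares on synthetic data; it is not the SolVES implementation, and all variables are illustrative.

```python
# Generic sketch of the regress-then-predict idea behind value-transfer
# mapping: fit a survey-derived value index against landscape metrics,
# then apply the coefficients to cells without survey data. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Landscape metrics for surveyed cells (e.g., distance to water, elevation)
X_surveyed = rng.uniform(0.0, 1.0, size=(200, 2))
value_index = (3.0 + 2.0 * X_surveyed[:, 0] - 1.5 * X_surveyed[:, 1]
               + rng.normal(0.0, 0.2, size=200))

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X_surveyed)), X_surveyed])
coeffs, *_ = np.linalg.lstsq(A, value_index, rcond=None)

# Predict a value surface for unsurveyed cells
X_unsurveyed = rng.uniform(0.0, 1.0, size=(5, 2))
predicted = np.column_stack([np.ones(len(X_unsurveyed)), X_unsurveyed]) @ coeffs
print(coeffs, predicted)
```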
A National Approach to Quantify and Map Biodiversity Conservation Metrics within an Ecosystem Services Framework

EPA Science Inventory

Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...

Mapping ecosystem services in the St. Louis River estuary (presentation)

EPA Science Inventory

Management of ecosystems for sustainable provision of services beneficial to human communities requires reliable data about from where in the ecosystem services flow. Our objective is to map ecosystem services in the St. Louis River with the overarching EPA goal of community sust...
PAVICS: A Platform for the Analysis and Visualization of Climate Science

NASA Astrophysics Data System (ADS)

Gauvin St-Denis, B.; Landry, T.; Huard, D. B.; Byrns, D.; Chaumont, D.; Foucher, S.

2016-12-01

Climate service providers are boundary organizations working at the interface of climate science research and users of climate information. Users include academics in other disciplines looking for credible, customized future climate scenarios, government planners, resource managers, asset owners, as well as service utilities. These users are looking for relevant information regarding the impacts of climate change as well as information to guide decisions regarding adaptation options. As climate change concerns become mainstream, the pressure on climate service providers to deliver tailored, high-quality information in a timely manner increases rapidly. To meet this growing demand, Ouranos, a climate service center located in Montreal, is collaborating with the Centre de recherche informatique de Montreal (CRIM) to develop a web-based climate data analysis platform interacting with RESTful services covering data access and retrieval, geospatial analysis, bias correction, distributed climate indicator computing and results visualization. The project, financed by CANARIE, relies on the experience of the UV-CDAT and ESGF-CWT teams, as well as on the Birdhouse framework developed by the German Climate Research Center (DKRZ) and the French IPSL. Climate data is accessed through OPeNDAP, while computations are carried out through WPS. Regions such as watersheds or user-defined polygons, used as spatial selections for computations, are managed by GeoServer, which also provides WMS, WFS and WPS capabilities. The services are hosted on independent servers communicating over a high-throughput network. Deployment, maintenance and collaboration with other development teams are eased by the use of Docker and OpenStack VMs. Web-based tools are developed with modern web frameworks such as React-Redux, OpenLayers 3, Cesium and Plotly. Although the main objective of the project is to build a functioning, usable data analysis pipeline within two years, time is also devoted to exploring emerging technologies and assessing their potential. For instance, sandbox environments will store climate data in HDFS, process it with Apache Spark and allow interaction through Jupyter Notebooks. Data streaming of observational data with OpenGL and Cesium is also considered.
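PAVICS exposes its processing through OGC WPS. A minimal WPS 1.0.0 Execute request can be issued as a key-value-pair GET, as sketched below; the endpoint, process identifier and inputs are placeholders, not actual PAVICS processes.

```python
# Minimal sketch of an OGC WPS 1.0.0 Execute request issued as a KVP GET.
# The endpoint, process identifier and inputs are hypothetical placeholders.
import requests

WPS_ENDPOINT = "https://example.org/wps"  # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "subset_bbox",                      # placeholder process
    # DataInputs uses 'name=value' pairs separated by ';'
    "DataInputs": "lon0=-80;lon1=-60;lat0=44;lat1=50",
    "status": "true",                 # ask the server to run asynchronously
    "storeExecuteResponse": "true",
}

resp = requests.get(WPS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])  # ExecuteResponse XML with a status location to poll
```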
Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery

NASA Astrophysics Data System (ADS)

Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.

2018-04-01

The increasing availability of satellite data is a real value for the enhancement of environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services, like Google Maps or Open Street Maps, give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, there are many failings in the general utilization of, and access to, those maps by non-specialized users. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economic and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.
The evolution of internet-based map server applications in the United States Department of Agriculture, Veterinary Services

PubMed

Maroney, Susan A.; McCool, Mary Jane; Geter, Kenneth D.; James, Angela M.

2007-01-01

The internet is used increasingly as an effective means of disseminating information. For the past five years, the United States Department of Agriculture (USDA) Veterinary Services (VS) has published animal health information in internet-based map server applications, each oriented to a specific surveillance or outbreak response need. Using internet-based technology allows users to create dynamic, customised maps and perform basic spatial analysis without the need to buy or learn desktop geographic information systems (GIS) software. At the same time, access can be restricted to authorised users. The VS internet mapping applications to date are as follows: Equine Infectious Anemia Testing 1972-2005, National Tick Survey tick distribution maps, the Emergency Management Response System-Mapping Module for disease investigations and emergency outbreaks, and the Scrapie mapping module to assist with the control and eradication of this disease. These services were created using Environmental Systems Research Institute (ESRI)'s internet map server technology (ArcIMS). Other leading technologies for spatial data dissemination are ArcGIS Server, ArcEngine, and ArcWeb Services. VS is prototyping applications using these technologies, including the VS Atlas of Animal Health Information using ArcGIS Server technology and the Map Kiosk using ArcEngine for automating standard map production in the case of an emergency.

SWOT analysis on National Common Geospatial Information Service Platform of China

NASA Astrophysics Data System (ADS)

Zheng, Xinyan; He, Biao

2010-11-01

Currently, the trend of international surveying and mapping is shifting from map production to integrated service of geospatial information, such as the GOS of the U.S. Under these circumstances, surveying and mapping in China is inevitably shifting from 4D product service to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are more and more obvious, due to the rapidly emerging requirements of e-government construction, the remarkable development of IT technology, and emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about by using SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at its early stage, it represents the future service mode of geospatial information of China, and surely will have great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.
Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC)

NASA Technical Reports Server (NTRS)

Liu, Zhong; Ostrenga, D.; Vollmer, B.; Deshong, B.; Greene, M.; Teng, W.; Kempler, S. J.

2015-01-01

On February 27, 2014, the NASA Global Precipitation Measurement (GPM) mission was launched to provide the next-generation global observations of rain and snow (http://pmm.nasa.gov/GPM). The GPM mission consists of an international network of satellites in which a GPM Core Observatory satellite carries both active and passive microwave instruments to measure precipitation and serve as a reference standard, to unify precipitation measurements from a constellation of other research and operational satellites. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 16 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently and to-be available include the following: 1. Level-1 GPM Microwave Imager (GMI) and partner radiometer products. 2. Goddard Profiling Algorithm (GPROF) GMI and partner products. 3. Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently and to-be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, help desk; and monitoring services (e.g., Current Conditions) for applications. In this presentation, we will present GPM data products and services with examples.
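One of the access paths listed above is OPeNDAP. The sketch below shows the general pattern of opening a remote dataset over OPeNDAP and subsetting it lazily with xarray; the URL, variable and coordinate names are placeholders, not an actual GES DISC granule.

```python
# Generic sketch of reading and subsetting a remote dataset over OPeNDAP
# with xarray. The URL, variable and coordinate names are placeholders.
import xarray as xr

OPENDAP_URL = "https://example.org/opendap/precip_daily.nc"  # hypothetical

ds = xr.open_dataset(OPENDAP_URL)      # lazy: only metadata is fetched here
precip = ds["precipitation"]           # placeholder variable name

# Spatial/temporal subset; only the selected slices are transferred
subset = precip.sel(lat=slice(30, 45), lon=slice(-10, 5),
                    time=slice("2014-06-01", "2014-06-30"))

monthly_mean = subset.mean(dim="time")
print(monthly_mean)
```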
Mapping ecosystem services in the St. Louis River Estuary

EPA Science Inventory

Sustainable management of ecosystems for the perpetual flow of services beneficial to human communities requires reliable data about from where in the ecosystem services flow. Our objective is to map ecosystem services in the St. Louis River with the overarching U.S. EPA goal of ...
The IS-ENES climate4impact portal: bridging the CMIP5 and CORDEX data to impact users

NASA Astrophysics Data System (ADS)

Som de Cerff, Wim; Plieger, Maarten; Page, Christian; Tatarinova, Natalia; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Sjökvist, Elin; Vega Saldarriaga, Manuel; Santiago Cofiño Gonzalez, Antonio

2015-04-01

The aim of climate4impact (climate4impact.eu) is to enhance the use of Climate Research Data and to enhance the interaction with climate effect/impact communities. The portal is based on 17 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the IS-ENES European project and is currently operated and further developed in the IS-ENES2 project. As the climate impact community is very broad, the focus is mainly on the scientific impact community. Climate4impact is connected to the Earth System Grid Federation (ESGF) nodes containing global climate model data (GCM data) from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and regional climate model data (RCM data) from the Coordinated Regional Climate Downscaling Experiment (CORDEX). This global network of climate model data centers offers services for data description, discovery and download. The climate4impact portal connects to these services using OpenID, and offers a user interface for searching, visualizing and downloading global climate model data and more. A challenging task is to describe the available model data and how it can be used. The portal informs users about possible caveats when using climate model data. All impact use cases are described in the documentation section, using highlighted keywords pointing to detailed information in the glossary. Climate4impact currently has two main objectives. The first one is to work on a web interface which automatically generates a graphical user interface on WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/icclim on data stored in ESGF data nodes. Data is then transmitted from ESGF nodes over secured OPeNDAP and becomes available in a new, per-user, secured OPeNDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for the processing of climate indices will be developed in close collaboration with users. The second one is to expose climate4impact services, so as to offer standardized services which can be used by other portals (like the future Copernicus platform, developed in the EU FP7 CLIPC project). This has the advantage of adding interoperability between several portals, as well as enabling the design of specific portals aimed at different impact communities, either thematic or national. In the presentation the following subjects will be detailed:
- Lessons learned developing climate4impact.eu
- Download: directly from ESGF nodes and other THREDDS catalogs
- Connection with the downscaling portal of the University of Cantabria
- Experiences on the question and answer site via Askbot
- Visualization: visualize data from ESGF data nodes using ADAGUC Web Map Services
- Processing: transform data, subset, export into other formats, and perform climate index calculations using Web Processing Services implemented by PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 icclim
- Security: login using OpenID for access to the ESGF data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-On (SSO) is used to make these websites and systems work together.
- Discovery: facetted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows for browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA).
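The climate-index computations that climate4impact delegates to icclim/OpenClimateGIS behind a WPS can be illustrated in a few lines. The hand-rolled sketch below computes a simple "summer days"-style count (days per year with daily maximum temperature above 25 °C); it is only a stand-alone illustration with a placeholder file and variable name, not the portal's actual code.

```python
# Hand-rolled illustration of a simple climate index ("summer days": number
# of days per year with daily maximum temperature above 25 degC). The file
# and variable names are placeholders; the portal uses icclim via WPS.
import xarray as xr

ds = xr.open_dataset("tasmax_daily.nc")   # hypothetical CF-compliant file
tasmax = ds["tasmax"]                     # daily maximum temperature, kelvin

threshold_k = 25.0 + 273.15
summer_days = (tasmax > threshold_k).groupby("time.year").sum(dim="time")

summer_days.to_netcdf("summer_days_per_year.nc")
```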
78 FR 39628 - Endangered and Threatened Wildlife and Plants; Critical Habitat Map for the Fountain Darter

Federal Register 2010, 2011, 2012, 2013, 2014

2013-07-02

...-0064; 4500030114] RIN 1018-AZ68 Endangered and Threatened Wildlife and Plants; Critical Habitat Map for... U.S. Fish and Wildlife Service (Service), are correcting the critical habitat map for the fountain... and the general public have an accurate critical habitat map for the species. This action does not...

Topographical Hill Shading Map Production Based on Tianditu (Map World)

NASA Astrophysics Data System (ADS)

Wang, C.; Zha, Z.; Tang, D.; Yang, J.

2018-04-01

TIANDITU (Map World) is the public version of the National Platform for Common Geospatial Information Service, and the terrain service is an important channel for users on the platform. With the development of TIANDITU, topographical hill shading map production for providing and updating the global terrain map online becomes necessary, given its strong intuitiveness, three-dimensional appearance and aesthetic effect. As such, the terrain service of TIANDITU focuses on displaying different scales of topographical data globally. This paper mainly aims to research the method of topographical hill shading map production globally using DEM (Digital Elevation Model) data for display scales from about 1:140,000,000 to 1:4,000,000, corresponding to display levels 2 to 7 on the TIANDITU website.
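Hill shading of the kind used for terrain layers like those above is typically computed from DEM slope and aspect with a single illumination term. The sketch below is the standard textbook computation on a small synthetic grid; the cell size and sun position are illustrative, not values from the paper.

```python
# Standard hillshade computation from a DEM (slope/aspect plus a single
# light source), as commonly used to render shaded-relief terrain tiles.
# The DEM here is synthetic; cell size and sun position are illustrative.
import numpy as np

def hillshade(dem, cellsize=30.0, azimuth_deg=315.0, altitude_deg=45.0):
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Synthetic DEM: a single smooth hill
y, x = np.mgrid[0:200, 0:200]
dem = 500.0 * np.exp(-((x - 100) ** 2 + (y - 100) ** 2) / 4000.0)
print(hillshade(dem).shape)
```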
Interoperability in planetary research for geospatial data analysis

NASA Astrophysics Data System (ADS)

Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

2018-01-01

For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards and astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
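Among the OGC services listed above, WFS streams vector features. A minimal WFS 2.0.0 GetFeature request is sketched below; the endpoint and feature type name are placeholders (for example, a hypothetical crater catalog), and GeoJSON output depends on what the server advertises.

```python
# Minimal sketch of an OGC WFS 2.0.0 GetFeature request. The endpoint and
# feature type are hypothetical; GeoJSON output depends on server support.
import requests

WFS_ENDPOINT = "https://example.org/wfs"   # hypothetical endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "catalog:craters",        # placeholder feature type
    "count": "100",
    "outputFormat": "application/json",    # only if the server supports it
}

resp = requests.get(WFS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
features = resp.json()["features"]
print(len(features), "features returned")
```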
From theoretical to actual ecosystem services: mapping beneficiaries and spatial flows in ecosystem service assessments

USGS Publications Warehouse

Bagstad, Kenneth J.; Villa, Ferdinando; Batker, David; Harrison-Cox, Jennifer; Voigt, Brian; Johnson, Gary W.

2014-01-01

Ecosystem services mapping and modeling has focused more on supply than demand, until recently. Whereas the potential provision of economic benefits from ecosystems to people is often quantified through ecological production functions, the use of and demand for ecosystem services has received less attention, as have the spatial flows of services from ecosystems to people. However, new modeling approaches that map and quantify service-specific sources (ecosystem capacity to provide a service), sinks (biophysical or anthropogenic features that deplete or alter service flows), users (user locations and level of demand), and spatial flows can provide a more complete understanding of ecosystem services. Through a case study in Puget Sound, Washington State, USA, we quantify and differentiate between the theoretical or in situ provision of services, i.e., ecosystems' capacity to supply services, and their actual provision when accounting for the location of beneficiaries and the spatial connections that mediate service flows between people and ecosystems. Our analysis includes five ecosystem services: carbon sequestration and storage, riverine flood regulation, sediment regulation for reservoirs, open space proximity, and scenic viewsheds. Each ecosystem service is characterized by different beneficiary groups and means of service flow. Using the ARtificial Intelligence for Ecosystem Services (ARIES) methodology we map service supply, demand, and flow, extending on simpler approaches used by past studies to map service provision and use. With the exception of the carbon sequestration service, regions that actually provided services to people, i.e., connected to beneficiaries via flow paths, amounted to 16-66% of those theoretically capable of supplying services, i.e., all ecosystems across the landscape. These results offer a more complete understanding of the spatial dynamics of ecosystem services and their effects, and may provide a sounder basis for economic valuation and policy applications than studies that consider only theoretical service provision and/or use.

A Grid job monitoring system

NASA Astrophysics Data System (ADS)

Dumitrescu, Catalin; Nowack, Andreas; Padhi, Sanjay; Sarkar, Subir

2010-04-01

This paper presents a web-based Job Monitoring framework for individual Grid sites that allows users to follow in detail their jobs in quasi-real time. The framework consists of several independent components: (a) a set of sensors that run on the site CE and worker nodes and update a database, (b) a simple yet extensible web services framework and (c) an Ajax-powered web interface having a look-and-feel and control similar to a desktop application. The monitoring framework supports LSF, Condor and PBS-like batch systems. This is one of the first monitoring systems where an X.509-authenticated web interface can be seamlessly accessed by both end-users and site administrators. While a site administrator has access to all the possible information, a user can only view the jobs for the Virtual Organizations (VO) he/she is a part of. The monitoring framework design supports several possible deployment scenarios. For a site running a supported batch system, the system may be deployed as a whole, or existing site sensors can be adapted and reused with the web services components. A site may even prefer to build the web server independently and choose to use only the Ajax-powered web interface. Finally, the system is being used to monitor a glideinWMS instance. This broadens the scope significantly, allowing it to monitor jobs over multiple sites.
Mapping ecosystem services for land use planning, the case of Central Kalimantan

PubMed

Sumarga, Elham; Hein, Lars

2014-07-01

Indonesia is subject to rapid land use change. One of the main causes for the conversion of land is the rapid expansion of the oil palm sector. Land use change involves a progressive loss of forest cover, with major impacts on biodiversity and global CO2 emissions. Ecosystem services have been proposed as a concept that would facilitate the identification of sustainable land management options; however, the scale of land conversion and its spatial diversity pose particular challenges in Indonesia. The objective of this paper is to analyze how ecosystem services can be mapped at the provincial scale, focusing on Central Kalimantan, and to examine how ecosystem services maps can be used for land use planning. Central Kalimantan is subject to rapid deforestation, including the loss of peatland forests, and the province still lacks a comprehensive land use plan. We examine how seven key ecosystem services can be mapped and modeled at the provincial scale, using a variety of models, and how large-scale ecosystem services maps can support the identification of options for sustainable expansion of palm oil production.

A Different Web-Based Geocoding Service Using Fuzzy Techniques

NASA Astrophysics Data System (ADS)

Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

2015-12-01

Geocoding - the process of finding position based on descriptive data such as address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the proximity and nearness concept is not modelled appropriately, and these services search an address only by address matching based on descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, searching for places based on their location, non-point representation of results, as well as displaying search results based on their priority.
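The fuzzy approach just described can be expressed in a few lines: a decaying membership function turns each distance map into a "nearness" grid, and a fuzzy overlay combines them. The sketch below uses one common formulation (a half-distance membership function and a minimum operator) on synthetic grids; it is not the paper's exact functions or data.

```python
# One common way to express the fuzzy ideas described above: a decaying
# membership function converts distance grids into "nearness" grids, and a
# minimum operator overlays them. Grids and the half-distance parameter are
# synthetic/illustrative, not taken from the paper.
import numpy as np

def nearness(distance_m, half_distance_m=500.0):
    """Membership in 'near': 1 at distance 0, 0.5 at half_distance_m, -> 0."""
    return 1.0 / (1.0 + (distance_m / half_distance_m) ** 2)

rng = np.random.default_rng(1)
# Hypothetical distance-to-landmark grids (metres) for a small area
dist_to_school = rng.uniform(0, 2000, size=(100, 100))
dist_to_park = rng.uniform(0, 2000, size=(100, 100))

# Fuzzy AND overlay: a cell matches well only if it is near both places
suitability = np.minimum(nearness(dist_to_school), nearness(dist_to_park))

best = np.unravel_index(np.argmax(suitability), suitability.shape)
print("Best-matching cell:", best, "membership:", suitability[best])
```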
Managing Vocabulary Mapping Services

PubMed Central

Che, Chengjian; Monson, Kent; Poon, Kasey B.; Shakib, Shaun C.; Lau, Lee Min

2005-01-01

The efficient management and maintenance of large-scale and high-quality vocabulary mapping is an operational challenge. The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) group developed an information management system to provide controlled mapping services, resulting in improved efficiency and quality maintenance. PMID:16779203

47 CFR 73.4108 - FM transmitter site map submissions

Code of Federal Regulations, 2011 CFR

2011-10-01

... 47 Telecommunication 4 2011-10-01 2011-10-01 false FM transmitter site map submissions. 73.4108 Section 73.4108 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.4108 FM transmitter site map...
    Successful implementation of these improvements will free up resources within the organisation which can be redirected towards providing better direct care to patients.

  436. Global mapping of ecosystem services and conservation priorities

    PubMed Central

    Naidoo, R.; Balmford, A.; Costanza, R.; Fisher, B.; Green, R. E.; Lehner, B.; Malcolm, T. R.; Ricketts, T. H.

    2008-01-01

    Global efforts to conserve biodiversity have the potential to deliver economic benefits to people (i.e., “ecosystem services”). However, regions for which conservation benefits both biodiversity and ecosystem services cannot be identified unless ecosystem services can be quantified and valued and their areas of production mapped. Here we review the theory, data, and analyses needed to produce such maps and find that data availability allows us to quantify imperfect global proxies for only four ecosystem services. Using this incomplete set as an illustration, we compare ecosystem service maps with the global distributions of conventional targets for biodiversity conservation. Our preliminary results show that regions selected to maximize biodiversity provide no more ecosystem services than regions chosen randomly. Furthermore, spatial concordance among different services, and between ecosystem services and established conservation priorities, varies widely. Despite this lack of general concordance, “win–win” areas—regions important for both ecosystem services and biodiversity—can be usefully identified, both among ecoregions and at finer scales within them. An ambitious interdisciplinary research effort is needed to move beyond these preliminary and illustrative analyses to fully assess synergies and trade-offs in conserving biodiversity and ecosystem services. PMID:18621701

  437. "Frog, Where Are You?"
    Narratives in Children with Specific Language Impairment, Early Focal Brain Injury, and Williams Syndrome

    ERIC Educational Resources Information Center

    Reilly, Judy; Losh, Molly; Bellugi, Ursula; Wulfeck, Beverly

    2004-01-01

    In this cross-population study, we use narratives as a context to investigate language development in children from 4 to 12 years of age from three experimental groups: children with early unilateral focal brain damage (FL; N=52); children with specific language impairment (SLI; N=44); children with Williams syndrome (WMS; N=36), and typically…

  438. Replacing Voice Input with Technology that Provided Immediate Visual and Audio Feedback to Reduce Employee Errors

    ERIC Educational Resources Information Center

    Goomas, David T.

    2010-01-01

    In this report from the field at two auto parts distribution centers, order selectors picked auto accessories (e.g., fuses, oil caps, tool kits) into industrial plastic totes as part of store orders. Accurately identifying all store order totes via the license plate number was a prerequisite for the warehouse management system (WMS) to track each…

  439. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona.
    The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.

  440. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background: Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods: Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results: The results are focused on the use of the proposed web-based application to the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions: In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner.
    In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
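    The density-surface step described in the two tuberculosis entries above can be roughly illustrated with a kernel density estimate over geocoded case coordinates. The coordinates, grid extent, and bandwidth handling below are synthetic examples, not the Barcelona data or the exact geoprocessing service of Dominkovics et al.:

        import numpy as np
        from scipy.stats import gaussian_kde

        # Synthetic geocoded case locations (projected metres), standing in for address-level TB cases.
        rng = np.random.default_rng(0)
        cases = np.vstack([
            rng.normal((430500, 4581500), 400, size=(120, 2)),   # a denser cluster
            rng.normal((432000, 4583000), 800, size=(60, 2)),    # a more diffuse cluster
        ]).T                                                      # shape (2, n), as gaussian_kde expects

        kde = gaussian_kde(cases)   # bandwidth chosen by Scott's rule by default

        # Evaluate the density on a regular grid; this array is what a map service would render as a layer.
        x = np.linspace(429500, 433000, 200)
        y = np.linspace(4580500, 4584000, 200)
        xx, yy = np.meshgrid(x, y)
        density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

        print("peak density cell:", np.unravel_index(density.argmax(), density.shape))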
  441. Evoked Potentials and Memory/Cognition Tests Validate Brain Atrophy as Measured by 3T MRI (NeuroQuant) in Cognitively Impaired Patients.

    PubMed

    Braverman, Eric R; Blum, Kenneth; Hussman, Karl L; Han, David; Dushaj, Kristina; Li, Mona; Marin, Gabriela; Badgaiyan, Rajendra D; Smayda, Richard; Gold, Mark S

    2015-01-01

    To our knowledge, this is the largest study evaluating relationships between 3T Magnetic Resonance Imaging (MRI) and P300 and memory/cognitive tests in the literature. The 3T MRI using NeuroQuant has an increased resolution 15 times that of 1.5T MRI. Utilizing NeuroQuant 3T MRI as a diagnostic tool in primary care, subjects (N=169; 19-90 years) displayed increased areas of anatomical atrophy: 34.62% hippocampal atrophy (N=54), 57.14% central atrophy (N=88), and 44.52% temporal atrophy (N=69). A majority of these patients exhibited overlap in measured areas of atrophy and were cognitively impaired. These results positively correlated with decreased P300 values and Wechsler Memory Scale-III (WMS-III) scores differentially across various brain loci. Delayed latency (p=0.0740) was marginally associated with temporal atrophy; reduced fractional anisotropy (FA) in frontal lobes correlated with aging, delayed P300 latency, and decreased visual and working memory (p=0.0115). Aging and delayed P300 latency correlated with lower FA. The correlation between working memory and reduced FA in frontal lobes is marginally significant (p=0.0787). In the centrum semiovale (CS), reduced FA correlated with visual memory (p=0.0622). Lower demyelination correlated with higher P300 amplitude (p=0.0002). Compared to males, females have higher demyelination (p=0.0064). Along these lines, the higher the P300 amplitude, the lower the bilateral atrophy (p=0.0165). Hippocampal atrophy correlated with increased auditory memory and gender, especially in males (p=0.0087). In considering temporal lobe atrophy correlations: delayed P300 latency and high temporal atrophy (p=0.0740); high auditory memory and low temporal atrophy (p=0.0417); and high working memory and low temporal atrophy (p=0.0166). Central atrophy correlated with aging and immediate memory (p=0.0294): the higher the immediate memory, the lower the central atrophy. Generally, the validation of brain atrophy by P300 and WMS-III could lead to cost-effective methods utilizable in primary care medicine following further confirmation.

  442. Sensitive detection of temperature behind reflected shock waves using wavelength modulation spectroscopy of CO2 near 2.7 μm

    NASA Astrophysics Data System (ADS)

    Farooq, A.; Jeffries, J. B.; Hanson, R. K.

    2009-07-01

    Tunable diode-laser absorption of CO2 near 2.7 μm incorporating wavelength modulation spectroscopy with second-harmonic detection (WMS-2f) is used to provide a new sensor for sensitive and accurate measurement of the temperature behind reflected shock waves in a shock-tube. The temperature is inferred from the ratio of 2f signals for two selected absorption transitions, at 3633.08 and 3645.56 cm-1, belonging to the ν1+ν3 combination vibrational band of CO2 near 2.7 μm. The modulation depths of 0.078 and 0.063 cm-1 are optimized for the target conditions of the shock-heated gases (P ~ 1-2 atm, T ~ 800-1600 K). The sensor is designed to achieve a high sensitivity to the temperature and a low sensitivity to cold boundary-layer effects and any changes in gas pressure or composition. The fixed-wavelength WMS-2f sensor is tested for temperature and CO2 concentration measurements in a heated static cell (600-1200 K) and in non-reactive shock-tube experiments (900-1700 K) using CO2-Ar mixtures. The relatively large CO2 absorption strength near 2.7 μm and the use of a WMS-2f strategy minimizes noise and enables measurements with lower concentration, higher accuracy, better sensitivity and improved signal-to-noise ratio (SNR) relative to earlier work, using transitions in the 1.5 and 2.0 μm CO2 combination bands. The standard deviation of the measured temperature histories behind reflected shock waves is less than 0.5%. The temperature sensor is also demonstrated in reactive shock-tube experiments of n-heptane oxidation. Seeding of relatively inert CO2 in the initial fuel-oxidizer mixture is utilized to enable measurements of the pre-ignition temperature profiles.
    To our knowledge, this work represents the first application of wavelength modulation spectroscopy to this new class of diode lasers near 2.7 μm.
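    The two-line thermometry principle behind the shock-tube sensor above can be sketched numerically: the ratio of the two transitions' 2f signals tracks the ratio of their line strengths, which depends on temperature mainly through the difference of lower-state energies. The simplified strength scaling below keeps only the Boltzmann factor (ignoring partition-function and stimulated-emission corrections), and the lower-state energies are illustrative values, not the calibration used by Farooq et al.:

        import numpy as np
        from scipy.optimize import brentq

        HC_K = 1.4388  # hc/k in cm*K (second radiation constant)

        def strength_ratio(T, E1=1000.0, E2=2500.0, T0=296.0):
            # Ratio S1(T)/S2(T), normalised to 1 at T0, keeping only the Boltzmann factor.
            # E1, E2 are illustrative lower-state energies in cm^-1.
            return np.exp(-HC_K * (E1 - E2) * (1.0 / T - 1.0 / T0))

        def temperature_from_ratio(r_measured):
            # Invert the measured 2f peak ratio (normalised by its value at T0) for temperature.
            f = lambda T: strength_ratio(T) - r_measured
            return brentq(f, 300.0, 3000.0)

        # Example: a normalised 2f ratio of 0.25 maps to a single temperature in the 300-3000 K bracket.
        print(round(temperature_from_ratio(0.25), 1), "K")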
  443. Evoked Potentials and Memory/Cognition Tests Validate Brain Atrophy as Measured by 3T MRI (NeuroQuant) in Cognitively Impaired Patients

    PubMed Central

    Braverman, Eric R.; Blum, Kenneth; Hussman, Karl L.; Han, David; Dushaj, Kristina; Li, Mona; Marin, Gabriela; Badgaiyan, Rajendra D.; Smayda, Richard; Gold, Mark S.

    2015-01-01

    To our knowledge, this is the largest study evaluating relationships between 3T Magnetic Resonance Imaging (MRI) and P300 and memory/cognitive tests in the literature. The 3T MRI using NeuroQuant has an increased resolution 15 times that of 1.5T MRI. Utilizing NeuroQuant 3T MRI as a diagnostic tool in primary care, subjects (N=169; 19–90 years) displayed increased areas of anatomical atrophy: 34.62% hippocampal atrophy (N=54), 57.14% central atrophy (N=88), and 44.52% temporal atrophy (N=69). A majority of these patients exhibited overlap in measured areas of atrophy and were cognitively impaired. These results positively correlated with decreased P300 values and Wechsler Memory Scale-III (WMS-III) scores differentially across various brain loci. Delayed latency (p=0.0740) was marginally associated with temporal atrophy; reduced fractional anisotropy (FA) in frontal lobes correlated with aging, delayed P300 latency, and decreased visual and working memory (p=0.0115). Aging and delayed P300 latency correlated with lower FA. The correlation between working memory and reduced FA in frontal lobes is marginally significant (p=0.0787). In the centrum semiovale (CS), reduced FA correlated with visual memory (p=0.0622). Lower demyelination correlated with higher P300 amplitude (p=0.0002). Compared to males, females have higher demyelination (p=0.0064). Along these lines, the higher the P300 amplitude, the lower the bilateral atrophy (p=0.0165). Hippocampal atrophy correlated with increased auditory memory and gender, especially in males (p=0.0087). In considering temporal lobe atrophy correlations: delayed P300 latency and high temporal atrophy (p=0.0740); high auditory memory and low temporal atrophy (p=0.0417); and high working memory and low temporal atrophy (p=0.0166). Central atrophy correlated with aging and immediate memory (p=0.0294): the higher the immediate memory, the lower the central atrophy. Generally, the validation of brain atrophy by P300 and WMS-III could lead to cost-effective methods utilizable in primary care medicine following further confirmation. PMID:26244349

  444. Towards an EO-based Landslide Web Mapping and Monitoring Service

    NASA Astrophysics Data System (ADS)

    Hölbling, Daniel; Weinke, Elisabeth; Albrecht, Florian; Eisank, Clemens; Vecchiotti, Filippo; Friedl, Barbara; Kociu, Arben

    2017-04-01

    National and regional authorities and infrastructure maintainers in mountainous regions require accurate knowledge of the location and spatial extent of landslides for hazard and risk management. Information on landslides is often collected by a combination of ground surveying and manual image interpretation following landslide triggering events. However, the high workload and limited time for data acquisition result in a trade-off between completeness, accuracy and detail. Remote sensing data offers great potential for mapping and monitoring landslides in a fast and efficient manner. While facing an increased availability of high-quality Earth Observation (EO) data and new computational methods, there is still a lack in science-policy interaction and in providing innovative tools and methods that can easily be used by stakeholders and users to support their daily work. Taking up this issue, we introduce an innovative and user-oriented EO-based web service for landslide mapping and monitoring. Three central design components of the service are presented: (1) the user requirements definition, (2) the semi-automated image analysis methods implemented in the service, and (3) the web mapping application with its responsive user interface. User requirements were gathered during semi-structured interviews with regional authorities. The potential users were asked if and how they employ remote sensing data for landslide investigation and what their expectations of a landslide web mapping service are regarding reliability and usability. The interviews revealed the capability of our service for landslide documentation and mapping as well as monitoring of selected landslide sites, for example to complete and update landslide inventory maps. In addition, the users see a considerable potential for landslide rapid mapping. The user requirements analysis served as a basis for the service concept definition. Optical satellite imagery from different high resolution (HR) and very high resolution (VHR) sensors, e.g. Landsat, Sentinel-2, SPOT-5, WorldView-2/3, was acquired for different study areas in the Alps. Object-based image analysis (OBIA) methods were used for semi-automated mapping of landslides. Selected mapping routines and results, including step-by-step guidance, are integrated in the service by means of a web processing chain. This allows the user to gain insights into the service idea, the potential of semi-automated mapping methods, and the applicability of various satellite data for specific landslide mapping tasks. Moreover, an easy-to-use and guided classification workflow, which includes image segmentation, statistical classification and manual editing options, enables the user to perform his/her own analyses. For validation, the classification results can be downloaded or compared against uploaded reference data using the implemented tools.
    Furthermore, users can compare the classification results to freely available data such as OpenStreetMap to identify landslide-affected infrastructure (e.g. roads, buildings). They also can upload infrastructure data available at their organization for specific assessments or monitor the evolution of selected landslides over time. Further actions will include the validation of the service in collaboration with stakeholders, decision makers and experts, which is essential to produce landslide information products that can assist the targeted management of natural hazards, and the evaluation of the potential towards the development of an operational Copernicus downstream service.

  445. Distributed spatial information integration based on web service

    NASA Astrophysics Data System (ADS)

    Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng

    2008-10-01

    Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the fact that many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
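    The dispatcher described in the entry above essentially issues GetMap requests to several WMS endpoints in parallel and composites the returned images. A minimal sketch of that pattern follows; the endpoint URLs and layer names are placeholders, and the thread pool plus Pillow compositing stand in for whatever asynchronous invocation and overlay mechanism the authors actually used:

        import io
        from concurrent.futures import ThreadPoolExecutor

        import requests
        from PIL import Image

        # Placeholder services: each tuple is (hypothetical WMS base URL, layer name).
        SERVICES = [
            ("https://wms.example-a.org/wms", "roads"),
            ("https://wms.example-b.org/wms", "landuse"),
        ]
        BBOX = "35,-10,60,30"  # lat_min,lon_min,lat_max,lon_max (EPSG:4326 axis order required by WMS 1.3.0)

        def get_map(base_url, layer):
            # Standard WMS 1.3.0 GetMap request for a transparent PNG tile.
            params = {
                "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
                "LAYERS": layer, "CRS": "EPSG:4326", "BBOX": BBOX,
                "WIDTH": 512, "HEIGHT": 512,
                "FORMAT": "image/png", "TRANSPARENT": "TRUE",
            }
            r = requests.get(base_url, params=params, timeout=30)
            r.raise_for_status()
            return Image.open(io.BytesIO(r.content)).convert("RGBA")

        # Invoke all map services concurrently, then overlay the returned images in order.
        with ThreadPoolExecutor() as pool:
            layers = list(pool.map(lambda s: get_map(*s), SERVICES))

        composite = layers[0]
        for img in layers[1:]:
            composite = Image.alpha_composite(composite, img)
        composite.save("integrated_map.png")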
  446. Distributed spatial information integration based on web service

    NASA Astrophysics Data System (ADS)

    Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng

    2009-10-01

    Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the fact that many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.

  447. Research on the construction of three level customer service knowledge graph

    NASA Astrophysics Data System (ADS)

    Cheng, Shi; Shen, Jiajie; Shi, Quan; Cheng, Xianyi

    2017-09-01

    With the explosion of enterprise knowledge and information, and the growing demand for intelligent knowledge management that can improve business performance, the representation and processing of enterprise knowledge has become a hot topic. Aiming at the problems of building theory and methods for the electric marketing customer service knowledge map, a three-level customer service knowledge graph for electric marketing is discussed, and knowledge reasoning based on Neo4j is realized, achieving good results in practical application.

  448. A Road Map for Empowering Undergraduates to Practice Service Leadership through Service-Learning in Teams

    ERIC Educational Resources Information Center

    Snell, Robin Stanley; Chan, Maureen Yin Lee; Ma, Carol Hok Ka; Chan, Carman Ka Man

    2015-01-01

    We present a road map for providing course-embedded service-learning team projects as opportunities for undergraduates to practice as service leaders in Asia and beyond.
    Basic foundations are that projects address authentic problems or needs, partner organization representatives (PORs) indicate availability for ongoing consultation, students…

  449. The EnviroAtlas ‐ Developing a National Approach to Quantify and Map Metrics within an Ecosystem Services Framework. Subfocus: Multi‐scale Biodiversity Conservation Metrics

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become stra...

  450. Day Service Provision for People with Intellectual Disabilities: A Case Study Mapping 15-Year Trends in Ireland

    ERIC Educational Resources Information Center

    Fleming, Padraic; McGilloway, Sinead; Barry, Sarah

    2017-01-01

    Background: Day services for people with intellectual disabilities are experiencing a global paradigm shift towards innovative person-centred models of care. This study maps changing trends in day service utilization to highlight how policy, emergent patterns and demographic trends influence service delivery. Methods: National intellectual…

  451. How to change GEBCO outreach activities with Information technologies?

    NASA Astrophysics Data System (ADS)

    Chang, E.; Park, K.

    2014-12-01

    Since 1995, when the National Geographic Information Project began in Korea, there has been great progress both in mapping the earth's surface and in map information services, whether as paper maps or as online services.
    By reviewing current geological and mine-related information services and comparing user demands, a GEBCO outreach master plan has been prepared. Information services cannot be separated from data production and dissemination policies. We discuss the potential impact on GEBCO maps of changes in information technologies such as mobile services, data fusion, and big data. Lower cost and higher performance in data services will stimulate more information services; it is therefore necessary to handle the data in a more customer-oriented way. Through a questionnaire, we can identify potential needs for GEBCO products in various respects, such as education and accessibility. The gap between experts and non-experts will decrease through digital services from private and public organizations such as international academic societies, since research funds and policies tend to pursue "openness" and "interoperability" among the domains. Some background on why and how to prepare outreach activities in GEBCO will be given.

  452. Quantifying and Mapping Habitat-Based Biodiversity Metrics Within an Ecosystem Services Framework

    EPA Science Inventory

    Ecosystem services have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with econom...

  453. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Ostrenga, D.; Liu, Z.; Vollmer, B.; Teng, W.; Kempler, S.

    2014-01-01

    On February 27, 2014, the NASA Global Precipitation Measurement (GPM) mission was launched to provide the next-generation global observations of rain and snow (http://pmm.nasa.gov/GPM).
    The GPM mission consists of an international network of satellites in which a GPM Core Observatory satellite carries both active and passive microwave instruments to measure precipitation and serve as a reference standard, to unify precipitation measurements from a constellation of other research and operational satellites. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 16 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include the following: Level-1 GPM Microwave Imager (GMI) and partner radiometer products; Level-2 Goddard Profiling Algorithm (GPROF) GMI and partner products; Level-3 daily and monthly products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include Google-like Mirador (http://mirador.gsfc.nasa.gov) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, help desk; and monitoring services (e.g. Current Conditions) for applications.
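    Several of the access paths listed above (OPeNDAP, subsetting, simple value-added products) can be exercised from a few lines of Python. The sketch below assumes a hypothetical OPeNDAP URL, variable name, and coordinate names rather than a real GES DISC endpoint, and requires xarray with an OPeNDAP-capable backend (netCDF4 or pydap):

        import xarray as xr

        # Hypothetical OPeNDAP endpoint for a gridded precipitation product; replace with a real data URL.
        URL = "https://opendap.example.org/hyrax/precip/imerg_monthly.nc"

        ds = xr.open_dataset(URL)        # lazy open; only metadata is fetched here
        precip = ds["precipitation"]     # assumed variable name

        # Subset a region and time window, then fetch just that slice over the network.
        subset = precip.sel(lat=slice(35, 60), lon=slice(-10, 30),
                            time=slice("2014-06-01", "2014-08-31"))
        seasonal_mean = subset.mean("time")   # a small value-added product computed locally
        print(seasonal_mean.shape)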
  454. Minecraft® on Demand - A new IGN service which combines game and 3D cartography

    NASA Astrophysics Data System (ADS)

    Lecordix, François; Fremont, David; Jilani, Moez; Séguin, Emmanuel; Kriat, Sofiane

    2018-05-01

    The French national mapping agency, Institut national de l'information géographique et forestière (IGN), decided to develop a new web service, called Minecraft on Demand (www.ign.fr/Minecraft), designed to provide Minecraft maps from the geographic data that IGN produces. This free web service enables the user to select the center of the map and to get a Minecraft world 5 km long and 5 km wide, at the scale 1:1. The player can easily import this map into Minecraft, the world's most popular video game with 121 million copies sold. Launched in June 2016 in France, the Minecraft® on Demand service has been fairly successful (10,000 maps downloaded), particularly among young people, whom it may encourage to discover IGN data and geography.

  455. Analysis of QoS Requirements for e-Health Services and Mapping to Evolved Packet System QoS Classes

    PubMed Central

    Skorin-Kapov, Lea; Matijasevic, Maja

    2010-01-01

    E-Health services comprise a broad range of healthcare services delivered by using information and communication technology. In order to support existing as well as emerging e-Health services over converged next generation network (NGN) architectures, there is a need for network QoS control mechanisms that meet the often stringent requirements of such services. In this paper, we evaluate the QoS support for e-Health services in the context of the Evolved Packet System (EPS), specified by the Third Generation Partnership Project (3GPP) as a multi-access all-IP NGN. We classify heterogeneous e-Health services based on context and network QoS requirements and propose a mapping to existing 3GPP QoS Class Identifiers (QCIs) that serve as a basis for the class-based QoS concept of the EPS. The proposed mapping aims to provide network operators with guidelines for meeting heterogeneous e-Health service requirements. As an example, we present the QoS requirements for a prototype e-Health service supporting tele-consultation between a patient and a doctor and illustrate the use of the proposed mapping to QCIs in standardized QoS control procedures. PMID:20976301
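    The class-based mapping discussed in the e-Health QoS entry above can be expressed as a simple lookup from service type to a 3GPP QoS Class Identifier. The service categories and the particular QCI assignments below are illustrative only and are not the mapping proposed by Skorin-Kapov and Matijasevic; the QCI characteristics follow the standardized 3GPP values (GBR vs. non-GBR bearer, packet delay budget):

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Qci:
            qci: int
            bearer: str          # "GBR" or "non-GBR"
            delay_budget_ms: int

        # Illustrative assignment of e-Health traffic types to standardized QCIs (3GPP TS 23.203 values).
        EHEALTH_QCI_MAP = {
            "teleconsultation_audio": Qci(1, "GBR", 100),      # conversational voice
            "teleconsultation_video": Qci(2, "GBR", 150),      # conversational (live) video
            "realtime_biosignals":    Qci(3, "GBR", 50),       # low-latency real-time data
            "session_signalling":     Qci(5, "non-GBR", 100),  # IMS/SIP signalling
            "medical_image_transfer": Qci(8, "non-GBR", 300),  # TCP-based bulk transfer
            "ehr_web_access":         Qci(9, "non-GBR", 300),  # default best effort
        }

        def qci_for(service: str) -> Qci:
            # Fall back to the default bearer (QCI 9) for unclassified e-Health flows.
            return EHEALTH_QCI_MAP.get(service, Qci(9, "non-GBR", 300))

        print(qci_for("teleconsultation_video"))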
  456. Evaluation of flood hazard maps in print and web mapping services as information tools in flood risk communication

    NASA Astrophysics Data System (ADS)

    Hagemeier-Klose, M.; Wagner, K.

    2009-04-01

    Flood risk communication with the general public and the population at risk is getting increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Flood Directive. The flood related authorities therefore have to develop adjusted information tools which meet the demands of different user groups. This article presents the formative evaluation of flood hazard maps and web mapping services according to the specific requirements and needs of the general public using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of different methods; an analysis of existing tools, a creative workshop with experts and laymen and an online survey. The currently existing flood hazard maps or web mapping services or web GIS still lack a good balance between simplicity and complexity with adequate readability and usability for the public. Well designed and associative maps (e.g. using blue colours for water depths) which can be compared with past local flood events and which can create empathy in viewers, can help to raise awareness, to heighten the activity and knowledge level or can lead to further information seeking. Concerning web mapping services, a linkage between general flood information like flood extents of different scenarios and corresponding water depths and real time information like gauge levels is an important demand by users. Gauge levels of these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.

  457. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  458. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the Rasdaman Raster Data Manager which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL style query language. Rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope.
    These include the well established standards, Web Coverage Service (WCS) and Web Map Service (WMS), as well as the emerging standard, Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, if a user wanted to generate a plot of global average chlorophyll for a complete decade time series they would only have to download the result instead of Terabytes of data. Firstly we will present a brief overview of the capabilities of Rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users the ability to plot data products easily. However, the query language allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser with dynamic updating of the resultant image. To visualise very large datasets Rasdaman has the capability to dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web based GIS interface, increasing the number of potential users of the system. Lastly we will show the advances in dynamic web based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points and visualisation of the uncertainty of data alongside the actual data values.
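    A feel for the ad-hoc WCPS queries mentioned above is given by the sketch below, which sends a band-ratio request to a Petascope endpoint. The endpoint URL, coverage name, and band names are hypothetical placeholders, the query follows the general WCPS 1.0 "for ... return encode(...)" form rather than any specific EarthServer service, and the exact request encoding can differ between rasdaman versions:

        import requests

        ENDPOINT = "https://earth.example.org/rasdaman/ows"   # hypothetical Petascope endpoint

        # A WCPS expression: normalised band ratio returned as a PNG. Coverage and band names are invented;
        # spatial trimming such as c[Lat(40:60), Long(-20:0)] can be applied to the operands, and a cast to
        # float may be needed depending on the coverage's cell type.
        wcps_query = (
            'for c in (OCEAN_COLOUR_L3) '
            'return encode( (c.band_443 - c.band_555) / (c.band_443 + c.band_555), "png" )'
        )

        resp = requests.get(ENDPOINT, params={
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps_query,
        }, timeout=60)
        resp.raise_for_status()
        with open("band_ratio.png", "wb") as f:
            f.write(resp.content)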
  459. Developing the architecture for the Climate Information Portal for Copernicus

    NASA Astrophysics Data System (ADS)

    Som de Cerff, Wim; Thijsse, Peter; Plieger, Maarten; Pascoe, Stephen; Jukes, Martin; Leadbetter, Adam; Goosen, Hasse; de Vreede, Ernst

    2015-04-01

    Climate change is impacting the environment, society and policy decisions. Information about climate change is available from many sources, but not all of them are reliable. The CLIPC project is developing a portal to provide a single point of access for authoritative scientific information on climate change. This ambitious objective is made possible through the Copernicus Earth Observation Programme for Europe, which will deliver a new generation of environmental measurements of climate quality. The data about the physical environment which is used to inform climate change policy and adaptation measures comes from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses (syntheses of all available observations constrained with numerical weather prediction systems). These data categories are managed by different communities: CLIPC will provide a single point of access for the whole range of data. Information on data value and limitations will be provided as part of a knowledge base of authoritative climate information. The impacts of climate change on society will generally reflect a range of different environmental and climate system changes, and different sectors and actors within society will react differently to these changes. The CLIPC portal will provide a number of indicators showing impacts on specific sectors which have been generated using a range of factors selected through structured expert consultation. It will also, as part of the transformation services, allow users to explore the consequences of using different combinations of driving factors which they consider to be of particular relevance to their work or life. The portal will provide information on the scientific quality and pitfalls of such transformations to prevent misleading usage of the results. The CLIPC project will not be able to process a comprehensive range of climate change impacts on the physical environment and society, but will develop an end-to-end processing chain (indicator toolkit), from comprehensive information on the climate state through to highly aggregated decision-relevant products. This processing chain will be demonstrated within three thematic areas: water, rural and urban. Indicators of climate change and climate change impact will be provided, and a toolkit to update and post-process the collection of indicators will be integrated into the portal. For the indicators, three levels (Tiers) have been loosely defined: Tier 1: fields summarising properties of the climate system, e.g. temperature change; Tier 2: indicators expressed in terms of environmental properties outside the climate system, e.g. flooding change; Tier 3: indicators expressed in terms of social and economic impact. For the architecture, CLIPC has two interlocked themes: 1. Harmonised access to climate datasets derived from models, observations and re-analyses; 2. A climate impact toolkit to evaluate, rank and aggregate indicators. For development of the CLIPC architecture an Agile 'storyline' approach is taken. The storyline is a real-world use case and consists of producing a Tier 3 indicator (Urban Heat Vulnerability) and making it available through the CLIPC infrastructure for a user group. In this way architecture concepts can be directly tested and improved. Also, the produced indicator can be shown to users to refine requirements. The main components of the CLIPC architecture are 1) Data discovery and access, 2) Data processing, 3) Data visualization, 4) Knowledge base and 5) User management. The main challenge for the data discovery and access component is to provide harmonised access to various sources of climate data (ngEO, EMODNET/SeaDataNet, ESGF, MyOcean). The discovery service concept will be provided using a CLIPC data and data product catalogue and via a structured data search on selected infrastructures, using NERC vocabulary services and mappings.
    Data processing will be provided using OGC WPS services, linking to and re-using existing processing services from climate4impact.eu. The processing services will allow users to calculate climate impact indicators (Tier 1, 2 and 3). Processing wizards will guide users in processing indicators. The PyWPS framework will be used. The CLIPC portal will have its own central viewing service, using OGC standards for interoperability. For the WMS server side the ADAGUC framework will be used. For Tier 3 visualizations specific tailored visualisations will be developed. Tier 3 indicators can be complicated to build and require manual work from specialists to provide meaningful results before they can be published as, e.g., interactive maps. The CLIPC knowledge base is a set of services that supply explanatory information to the users when working with CLIPC services. It is structured around 1) a catalogue, containing ISO standardized metadata, citations, background information and links to data; 2) commentary information, e.g. FAQs, annotation URLs, version information, disclaimers; 3) technical documents, e.g. on using vocabularies and mappings; 4) glossaries, adding to and using existing glossaries from e.g. EUPORIAS/IS-ENES and IPCC; 5) literature references. CLIPC will have a very lightweight user management system, providing as few barriers to the user as possible. We will make use of OpenID, accepting identities from selected OpenID providers such as Google and ESGF. In the presentation we will show the storyline implementation: the first results of the Tier 3 indicator, the architecture in development and the lessons learned.
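    Since the processing chain above names PyWPS as its WPS framework, a minimal sketch of what a CLIPC-style indicator process could look like is shown below. The process identifier, inputs, and the trivial computation are invented for illustration and are not part of the actual CLIPC toolkit:

        from pywps import Process, LiteralInput, LiteralOutput

        class WarmDaysIndicator(Process):
            # Hypothetical Tier-1 style indicator: count of days above a temperature threshold.

            def __init__(self):
                inputs = [
                    LiteralInput("threshold", "Threshold temperature (degC)",
                                 data_type="float", default=25.0),
                    LiteralInput("temps", "Comma-separated daily mean temperatures (degC)",
                                 data_type="string"),
                ]
                outputs = [LiteralOutput("warm_days", "Number of days above threshold",
                                         data_type="integer")]
                super().__init__(
                    self._handler,
                    identifier="warm_days_indicator",
                    title="Warm days indicator (illustrative)",
                    inputs=inputs,
                    outputs=outputs,
                )

            def _handler(self, request, response):
                threshold = request.inputs["threshold"][0].data
                temps = [float(t) for t in request.inputs["temps"][0].data.split(",")]
                response.outputs["warm_days"].data = sum(t > threshold for t in temps)
                return response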
  460. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  461. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases.
462. Acknowledgments & Citation | USDA Plant Hardiness Zone Map

    Science.gov Websites

    Agricultural Research Service, United States Department of Agriculture. Mapping by PRISM, 2012. Agricultural Research Service, U.S. Department of Agriculture. Accessed from http

463. Inertial waste separation system for zero G WMS

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The design, operation, and flight test are presented for an inertial waste separation system. Training personnel to use this system under simulated conditions is also discussed. Conclusions indicate that before the system is usable in zero-gravity environments, a mirror for the user's guidance should be installed, the bounce cycle and bag-changing system should be redesigned, and flange clips should be added to improve the user's balance.

464. Cost Effectiveness Study of Wastewater Management Systems for Selected U.S. Coast Guard Vessels. Volume 2. Effectiveness Assessment of Candidate Systems

    DTIC Science & Technology

    1977-03-01

    [Only fragments of this report's front matter survive extraction: entries such as "Input Layout for Each Card Type", "Input Sequence" and "Sample Problem", sample forms used for documenting MSD and WMS effectiveness attribute data, and a note distinguishing black water (from commodes, urinals and the garbage grinder) from gray water (galley and turbid wastewater, i.e. output from sinks, showers, laundry and deck drains).]

465. Education-Stratified Base-Rate Information on Discrepancy Scores Within and Between the Wechsler Adult Intelligence Scale-Third Edition and the Wechsler Memory Scale-Third Edition

    ERIC Educational Resources Information Center

    Dori, Galit A.; Chelune, Gordon J.

    2004-01-01

    The Wechsler Adult Intelligence Scale--Third Edition (WAIS-III; D. Wechsler, 1997a) and the Wechsler Memory Scale--Third Edition (WMS-III; D. Wechsler, 1997b) are 2 of the most frequently used measures in psychology and neuropsychology.
To facilitate the diagnostic use of these measures in the clinical decision-making process, this article…

466. Towards a standard for the dynamic measurement of pressure based on laser absorption spectroscopy

    PubMed Central

    Douglass, K O; Olson, D A

    2016-01-01

    We describe an approach for creating a standard for the dynamic measurement of pressure based on the measurement of fundamental quantum properties of molecular systems. From the linewidth and intensities of ro-vibrational transitions we plan on making an accurate determination of pressure and temperature. The goal is to achieve an absolute uncertainty for time-varying pressure of 5 % with a measurement rate of 100 kHz, which will in the future serve as a method for the traceable calibration of pressure sensors used in transient processes. To illustrate this concept we have used wavelength modulation spectroscopy (WMS), due to inherent advantages over direct absorption spectroscopy, to perform rapid measurements of carbon dioxide in order to determine the pressure. The system records the full lineshape profile of a single ro-vibrational transition of CO2 at a repetition rate of 4 kHz and with a systematic measurement uncertainty of 12 % for the linewidth measurement. A series of pressures were measured at a rate of 400 Hz (10 averages) and from these measurements the linewidth was determined with a relative uncertainty of about 0.5 % on average. The pressures measured using WMS have an average difference of 0.6 % from the absolute pressure measured with a capacitance diaphragm sensor. PMID:27881884

467. Can a future choice affect a past measurement's outcome?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aharonov, Yakir; Schmid College of Science, Chapman University, Orange, CA 92866; Iyar, The Israeli Institute for Advanced Research, Rehovot

    2015-04-15

    An EPR experiment is studied where each particle within the entangled pair undergoes a few weak measurements (WMs) along some pre-set spin orientations, with the outcomes individually recorded. Then the particle undergoes one strong measurement along an orientation chosen at the last moment. Bell-inequality violation is expected between the two final measurements within each EPR pair. At the same time, statistical agreement is expected between these strong measurements and the earlier weak ones performed on that pair.
A contradiction seemingly ensues: (i) Bell's theorem forbids spin values to exist prior to the choice of the orientation measured; (ii) a weak measurement is not supposed to determine the outcome of a successive strong one; and indeed (iii) almost no disentanglement is inflicted by the WMs; and yet (iv) the outcomes of weak measurements statistically agree with those of the strong ones, suggesting the existence of pre-determined values, in contradiction with (i). Although the conflict can be solved by mere mitigation of the above restrictions, the most reasonable resolution seems to be that of the Two-State-Vector Formalism (TSVF), namely, that the choice of the experimenter has been encrypted within the weak measurements' outcomes, even before the experimenters themselves know what their choice will be.

468. Effectiveness of a Computer-Based Training Program of Attention and Memory in Patients with Acquired Brain Damage

    PubMed Central

    Fernandez, Elizabeth; Bergado Rosado, Jorge A.; Rodriguez Perez, Daymi; Salazar Santana, Sonia; Torres Aguilar, Maydane; Bringas, Maria Luisa

    2017-01-01

    Many training programs have been designed using modern software to restore the impaired cognitive functions in patients with acquired brain damage (ABD). The objective of this study was to evaluate the effectiveness of a computer-based training program of attention and memory in patients with ABD, using a two-armed parallel-group design in which the experimental group (n = 50) received cognitive stimulation using RehaCom software and the control group (n = 30) received standard (non-computerized) cognitive stimulation for eight weeks. In order to assess the possible cognitive changes after the treatment, a pre-post experimental design was employed using the following neuropsychological tests: the Wechsler Memory Scale (WMS) and Trail Making Test A and B. The effectiveness of the training procedure was statistically significant (p < 0.05) when comparing performance on these scales before and after the training period, both within each patient and between the two groups. The training group had statistically significant (p < 0.001) changes in focused attention (Trail A), two subtests (digit span and logical memory), and the overall score of the WMS. Finally, we discuss the advantages of computerized training rehabilitation and further directions of this line of work.
PMID:29301194

469. Development of alternative versions of the Logical Memory subtest of the WMS-R for use in Brazil

    PubMed Central

    Bolognani, Silvia Adriana Prado; Miranda, Monica Carolina; Martins, Marjorie; Rzezak, Patricia; Bueno, Orlando Francisco Amodeo; de Camargo, Candida Helena Pires; Pompeia, Sabine

    2015-01-01

    The Logical Memory test of the Wechsler Memory Scale is one of the most frequently used standardized tests for assessing verbal memory and consists of two separate short stories, each containing 25 idea units. Problems with practice effects arise when re-testing a patient, as these stories may be remembered from previous assessments. Therefore, alternative versions of the test stimuli should be developed to minimize learning effects when repeated testing is required for longitudinal evaluations of patients. Objective: To present three alternative stories for each of the original stories frequently used in Brazil (Ana Soares and Roberto Mota) and to show their similarity in terms of content, structure and linguistic characteristics. Methods: The alternative stories were developed according to the following criteria: overall structure or thematic content (presentation of the character, conflict, aggravation or complements, and resolution); specific structure (sex of the character, location and occupation, details of what happened); formal structure (number of words, characters, verbs and nouns); and readability. Results: The alternative stories and scoring criteria are presented in comparison to the original WMS stories (Brazilian version). Conclusion: The alternative stories presented here correspond well thematically and structurally to the Brazilian versions of the original stories. PMID:29213955

470. Neuropsychological dysfunction, mood disturbance, and emotional status of munitions workers.

    PubMed

    Bowler, R M; Lezak, M; Booty, A; Hartney, C; Mergler, D; Levin, J; Zisman, F

    2001-01-01

    The objective of this study was to compare the neuropsychological function, emotional status, visual function, and illness prevalence of 265 former munitions plant workers (M age = 56.7 years, M years of education = 12.07; 201 African American, 64 White) exposed to organic solvents for an average of 17.03 years with that of a group of 77 unexposed controls (M age = 51.3 years, M years of education = 13.07; 30 African American, 47 White). Neuropsychological tests were selected from the World Health Organization Neurobehavioral Core Test Battery, Wechsler Adult Intelligence Scale-III (WAIS-III), and Wechsler Memory Scale-III (WMS-III) and also included the Brief Symptom Inventory, Profile of Mood States, Beck Anxiety Inventory, and Beck Depression Inventory.
Vision tests included the Lanthony d-15 color vision test, the Vistech Contrast Sensitivity test, and the Snellen chart. The exposed group showed greater deficits than the controls in verbal learning (WMS-III Logical Memory I Learning Slope and Word Lists I Recall), visuomotor tracking speed (Cancellation H, WAIS-III Digit Symbol-Coding) and psychomotor function (Dynamometer and Grooved Pegboard), and dysfunction in emotional status, illness prevalence, and visual function. African American workers reported higher levels of exposure than Whites. Exposure relations demonstrated increased neuropsychological dysfunction with increased exposure.

471. Calibration-free wavelength-modulation spectroscopy based on a swiftly determined wavelength-modulation frequency response function of a DFB laser.

    PubMed

    Zhao, Gang; Tan, Wei; Hou, Jiajia; Qiu, Xiaodong; Ma, Weiguang; Li, Zhixin; Dong, Lei; Zhang, Lei; Yin, Wangbao; Xiao, Liantuan; Axner, Ove; Jia, Suotang

    2016-01-25

    A methodology for calibration-free wavelength modulation spectroscopy (CF-WMS) that is based upon an extensive empirical description of the wavelength-modulation frequency response (WMFR) of a DFB laser is presented. An assessment of the WMFR of a DFB laser by the use of an etalon confirms that it consists of two parts: a 1st harmonic component with an amplitude that is linear with the sweep and a nonlinear 2nd harmonic component with a constant amplitude. Simulations show that, among the various factors that affect the line shape of a background-subtracted peak-normalized 2f signal, such as concentration, phase shifts between intensity modulation and frequency modulation, and WMFR, only the last factor has a decisive impact. Based on this, and to avoid the impractical use of an etalon, a novel method to pre-determine the parameters of the WMFR by fitting to a background-subtracted peak-normalized 2f signal has been developed. The accuracy of the new scheme to determine the WMFR is demonstrated and compared with that of conventional methods in CF-WMS by detection of trace acetylene. The results show that the new method provides a four times smaller fitting error than the conventional methods and retrieves concentration more accurately.

472. Gas sensing using wavelength modulation spectroscopy

    NASA Astrophysics Data System (ADS)

    Viveiros, D.; Ribeiro, J.; Flores, D.; Ferreira, J.; Frazao, O.; Santos, J. L.; Baptista, J. M.

    2014-08-01

    An experimental setup has been developed for sensing different gas species based on the Wavelength Modulation Spectroscopy (WMS) principle. The target is the measurement of ammonia, carbon dioxide and methane concentrations.
The WMS is a rather sensitive technique for detecting atomic/molecular species, presenting the advantage that it can be used in the near-infrared region using optical telecommunications technology. In this technique, the laser wavelength and intensity are modulated by applying a sine wave signal through the injection current, which allows the detection bandwidth to be shifted to higher frequencies where laser intensity noise is reduced. The wavelength-modulated laser light is tuned to the absorption line of the target gas, and the absorption information can be retrieved by means of synchronous detection using a lock-in amplifier, where the amplitude of the second harmonic of the laser modulation frequency is proportional to the gas concentration. The amplitude of the second harmonic is normalised by the average laser intensity and detector gain through a LabVIEW® application; the main advantage of normalising is that the effects of laser output power fluctuations and any variations in laser transmission or optical-electrical detector gain are eliminated. Two types of sensing heads based on free-space light propagation with different optical path lengths were used, permitting redundancy operation and technology validation.
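Records 471-473 all rest on the mechanism described above: the laser wavelength is dithered with a sine wave across an absorption line, a lock-in extracts the amplitude of the second harmonic (2f) of the modulation frequency, and normalising that amplitude by the average detected intensity removes power and gain drifts. The following is a small numerical sketch of that 2f demodulation on a synthetic Lorentzian line; every parameter value is invented for illustration.

```python
# Illustrative simulation of 2f wavelength-modulation detection on a synthetic line.
# All numbers (line centre, width, modulation depth, absorbance) are made-up examples.
import numpy as np

fm = 10e3                        # modulation frequency [Hz]
fs = 2e6                         # sampling rate [Hz]
t = np.arange(0, 5e-3, 1 / fs)   # 5 ms record

nu0 = 0.0                        # line centre (relative frequency units)
gamma = 0.05                     # half-width at half-maximum
peak_abs = 0.02                  # peak absorbance (optically thin)

nu_laser = 0.01                  # laser parked slightly off the line centre
mod_depth = 0.11                 # wavelength-modulation amplitude (about 2.2 * gamma)
nu = nu_laser + mod_depth * np.sin(2 * np.pi * fm * t)

# Lorentzian absorbance and transmitted intensity (Beer-Lambert, thin sample).
absorbance = peak_abs * gamma**2 / ((nu - nu0) ** 2 + gamma**2)
intensity = np.exp(-absorbance)

# Software lock-in: project the detected intensity onto second-harmonic references.
ref_x = np.cos(2 * np.pi * 2 * fm * t)
ref_y = np.sin(2 * np.pi * 2 * fm * t)
x = 2 * np.mean(intensity * ref_x)
y = 2 * np.mean(intensity * ref_y)
s2f = np.hypot(x, y)

# Normalising by the mean (DC) intensity removes laser power / detector gain drifts.
print("2f amplitude:", s2f)
print("2f / DC     :", s2f / np.mean(intensity))
```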
473. Compact and portable open-path sensor for simultaneous measurements of atmospheric N2O and CO using a quantum cascade laser.

    PubMed

    Tao, Lei; Sun, Kang; Khan, M Amir; Miller, David J; Zondlo, Mark A

    2012-12-17

    A compact and portable open-path sensor for simultaneous detection of atmospheric N2O and CO has been developed with a 4.5 μm quantum cascade laser (QCL). An in-line acetylene (C2H2) gas reference cell allows for continuous monitoring of the sensor drift and calibration in rapidly changing field environments, and thereby allows for open-path detection at high precision and stability. Wavelength modulation spectroscopy (WMS) is used to detect simultaneously both the second and fourth harmonic absorption spectra with an optimized dual-modulation-amplitude scheme. Multi-harmonic spectra containing the atmospheric N2O, CO, and reference C2H2 signals are fit in real time (10 Hz) by combining a software-based lock-in amplifier with a computationally fast numerical model for WMS. The sensor consumes ~50 W of power and has a mass of ~15 kg. Precision of 0.15 ppbv N2O and 0.36 ppbv CO at 10 Hz under laboratory conditions was demonstrated. The sensor has been deployed for extended periods in the field. Simultaneous N2O and CO measurements distinguished between natural and fossil fuel combustion sources of N2O, an important greenhouse gas with poorly quantified emissions in space and time.

474. Contact USDA-ARS | USDA Plant Hardiness Zone Map

    Science.gov Websites

    Agricultural Research Service, United States Department of Agriculture. Help / Contact USDA-ARS: for questions about the mapping, please contact the USDA Agricultural Research Service by sending an e-mail to phzm@ars.usda.gov.

475. Single-edition quadrangle maps

    USGS Publications Warehouse

    1998-01-01

    In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands.
This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a common standard, as well as by significantly reducing duplication of effort.

476. [Land use and land cover change (LUCC) and landscape service: Evaluation, mapping and modeling].

    PubMed

    Song, Zhang-jian; Cao, Yu; Tan, Yong-zhong; Chen, Xiao-dong; Chen, Xian-peng

    2015-05-01

    Studies on ecosystem services from the landscape-scale perspective have received increasing attention from researchers all over the world. Compared with the ecosystem scale, the landscape scale should be more suitable for exploring the influence of human activities on land use and land cover change (LUCC) and for interpreting the mechanisms and processes of sustainable landscape dynamics. Based on a comprehensive and systematic analysis of research on landscape services, this paper first discusses basic concepts and the classification of landscape services. Then, methods for the evaluation, mapping and modeling of landscape services are analyzed and summarized. Finally, future trends for research on landscape services are proposed. It is put forward that exploring further the connotation and classification system of landscape services, improving methods and quantitative indicators for the evaluation, mapping and modelling of landscape services, carrying out long-term integrated research on landscape pattern-process-service-scale relationships, and enhancing the application of theories and methods from landscape economics and landscape ecology are very important fields for research on landscape services in the future.

477. Google Maps: You Are Here

    ERIC Educational Resources Information Center

    Jacobsen, Mikael

    2008-01-01

    Librarians use online mapping services such as Google Maps, MapQuest, Yahoo Maps, and others to check traffic conditions, find local businesses, and provide directions. However, few libraries are using one of Google Maps' most outstanding applications, My Maps, for the creation of enhanced and interactive multimedia maps. My Maps is a simple and…

478. Mapping the Early Intervention System in Ontario, Canada

    ERIC Educational Resources Information Center

    Underwood, Kathryn

    2012-01-01

    This study documents the wide range of early intervention services across the province of Ontario.
The services are mapped across the province showing geographic information as well as the scope of services (clinical, family-based, resource support, etc.), the range of early intervention professionals, sources of funding and the populations served…

479. Quantifying and Mapping the Supply of and Demand for Carbon Storage and Sequestration Service from Urban Trees.

    PubMed

    Zhao, Chang; Sander, Heather A

    2015-01-01

    Studies that assess the distribution of benefits provided by ecosystem services across urban areas are increasingly common. Nevertheless, current knowledge of both the supply and demand sides of ecosystem services remains limited, leaving a gap in our understanding of the balance between ecosystem service supply and demand that restricts our ability to assess and manage these services. The present study seeks to fill this gap by developing and applying an integrated approach to quantifying the supply and demand of a key ecosystem service, carbon storage and sequestration, at the local level. This approach follows three basic steps: (1) quantifying and mapping service supply based upon Light Detection and Ranging (LiDAR) processing and allometric models, (2) quantifying and mapping demand for carbon sequestration using an indicator based on local anthropogenic CO2 emissions, and (3) mapping a supply-to-demand ratio. We illustrate this approach using a portion of the Twin Cities Metropolitan Area of Minnesota, USA. Our results indicate that 1735.69 million kg carbon are stored by urban trees in our study area. Annually, 33.43 million kg carbon are sequestered by trees, whereas 3087.60 million kg carbon are emitted by human sources. Thus, the carbon sequestration service provided by urban trees in the study location plays a minor role in combating climate change, offsetting approximately 1% of local anthropogenic carbon emissions per year, although avoided emissions via storage in trees are substantial. Our supply-to-demand ratio map provides insight into the balance between carbon sequestration supply in urban trees and demand for such sequestration at the local level, pinpointing critical locations where higher levels of supply and demand exist. Such a ratio map could help planners and policy makers to assess and manage the supply of and demand for carbon sequestration.
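Record 479 describes a three-step workflow: estimate per-area sequestration supply from LiDAR-derived tree structure and allometric models, estimate demand from local anthropogenic CO2 emissions, and then map the supply-to-demand ratio. The toy sketch below shows only that final ratio step on a small grid; the arrays and numbers are invented for illustration and are not the study's data.

```python
# Toy illustration of a supply-to-demand ratio map on a small grid.
# Values are invented; in the study, supply comes from LiDAR + allometric models
# and demand from local anthropogenic CO2 emissions.
import numpy as np

# Annual carbon sequestration supply per grid cell (kg C / year), hypothetical.
supply = np.array([[120.0,  80.0, 15.0],
                   [ 60.0, 200.0,  5.0],
                   [ 30.0,  10.0, 90.0]])

# Annual carbon emissions (demand for sequestration) per grid cell (kg C / year).
demand = np.array([[9000.0,  4000.0,  500.0],
                   [3000.0, 12000.0,  800.0],
                   [ 700.0,   600.0, 2500.0]])

# Supply-to-demand ratio; cells with no emissions are left undefined (NaN).
ratio = np.divide(supply, demand,
                  out=np.full_like(supply, np.nan),
                  where=demand > 0)

print("fraction of local emissions offset in each cell:")
print(np.round(ratio, 3))
print("overall offset: {:.1%}".format(supply.sum() / demand.sum()))
```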
480. A double-blind, placebo-controlled, randomized trial of Ginkgo biloba extract EGb 761 in a sample of cognitively intact older adults: neuropsychological findings.

    PubMed

    Mix, Joseph A; Crews, W David

    2002-08-01

    There appears to be an absence of large-scaled clinical trials that have examined the efficacy of Ginkgo biloba extract on the neuropsychological functioning of cognitively intact older adults. The importance of such clinical research appears paramount in light of the plethora of products containing Ginkgo biloba that are currently being widely marketed to predominantly cognitively intact adults with claims of enhanced cognitive performance. The purpose of this research was to conduct the first known large-scaled clinical trial of the efficacy of Ginkgo biloba extract (EGb 761) on the neuropsychological functioning of cognitively intact older adults. Two hundred and sixty-two community-dwelling volunteers (both male and female), 60 years of age and older, who reported no history of dementia or significant neurocognitive impairments and obtained Mini-Mental State Examination total scores of at least 26, were examined via a 6-week, randomized, double-blind, fixed-dose, placebo-controlled, parallel-group clinical trial. Participants were randomly assigned to receive either Ginkgo biloba extract EGb 761 (n = 131; 180 mg/day) or placebo (n = 131) for 6 weeks. Efficacy measures consisted of participants' raw change in performance scores from pretreatment baseline to those obtained just prior to termination of treatment on the following standardized neuropsychological measures: Selective Reminding Test (SRT), Wechsler Adult Intelligence Scale-III Block Design (WAIS-III BD) and Digit Symbol-Coding (WAIS-III DS) subtests, and the Wechsler Memory Scale-III Faces I (WMS-III FI) and Faces II (WMS-III FII) subtests. A subjective Follow-up Self-report Questionnaire was also administered to participants just prior to termination of the treatment phase. Analyses of covariance indicated that cognitively intact participants who received 180 mg of EGb 761 daily for 6 weeks exhibited significantly more improvement on SRT tasks involving delayed (30 min) free recall (p < 0.04) and recognition (p < 0.01) of noncontextual, auditory-verbal material, compared with the placebo controls. The EGb 761 group also demonstrated significantly greater improvement on the WMS-III FII subtest assessing delayed (30 min) recognition (p < 0.025) of visual material (i.e. human faces), compared with the placebo group. However, based on the significant difference (p < 0.03) found between the two groups' pretreatment baseline scores on the WMS-III FII, this result should be interpreted with caution. An examination of the participants' subjective ratings of their overall abilities to remember by treatment end on the Follow-up Self-report Questionnaire also revealed that significantly more (p = 0.05) older adults in the EGb 761 group rated their overall abilities to remember by treatment end as 'improved' compared with the placebo controls. Overall, the results from both objective, standardized neuropsychological tests and a subjective follow-up self-report questionnaire provided complementary evidence of the potential efficacy of Ginkgo biloba EGb 761 in enhancing certain neuropsychological/memory processes of cognitively intact older adults 60 years of age and over.
Copyright 2002 John Wiley & Sons, Ltd.

481. The status of soil mapping for the Idaho National Engineering Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, G.L.; Lee, R.D.; Jeppesen, D.J.

    This report discusses the production of a revised version of the general soil map of the 2304-km² (890-mi²) Idaho National Engineering Laboratory (INEL) site in southeastern Idaho and the production of a geographic information system (GIS) soil map and supporting database. The revised general soil map replaces an INEL soil map produced in 1978 and incorporates the most current information on INEL soils. The general soil map delineates large soil associations based on Natural Resources Conservation Service [formerly the Soil Conservation Service (SCS)] principles of soil mapping. The GIS map incorporates detailed information that could not be presented on the general soil map and is linked to a database that contains the soil map unit descriptions, surficial geology codes, and other pertinent information.

482. Mapping Civic Engagement: A Case Study of Service-Learning in Appalachia

    ERIC Educational Resources Information Center

    Mann, Jessica; Casebeer, Daniel

    2016-01-01

    This study uses social cartography to map student perceptions of a co-curricular service-learning project in an impoverished rural community.
As a complement to narrative discourse, mapping provides an opportunity to visualize not only the spatial nature of the educational experience but also, in this case, the benefits of civic engagement. The…

483. 78 FR 56650 - Boundary Description and Final Map for Roaring Wild and Scenic River, Mount Hood National Forest...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    DEPARTMENT OF AGRICULTURE, Forest Service. Boundary Description and Final Map for Roaring Wild and... availability. SUMMARY: In accordance with section 3(b) of the Wild and Scenic Rivers Act, the USDA Forest Service, Washington Office, is transmitting the final boundary description and map of the Roaring Wild and...

484. Agricultural Census 2012: Publishing Mashable GIS Big Data Services

    NASA Astrophysics Data System (ADS)

    Mueller, R.

    2014-12-01

    The 2012 Agricultural Census was released by the US Department of Agriculture (USDA) on May 2nd, 2014; it is published on a quinquennial basis and covers all facets of American production agriculture. The Agricultural Census is a comprehensive source of uniform published agricultural data for every state and county in the US. This is the first Agricultural Census that is disseminated with web mapping services using REST APIs. USDA developed an open GIS mashable web portal that depicts over 250 maps on Crops and Plants, Economics, Farms, Livestock and Animals, and Operators. These mapping services, written in JavaScript, replace the traditional static maps published as the Ag Atlas. Web users can now visualize, interact with, query, and download the Agricultural Census data in ways not previously discoverable. Stakeholders will now be able to leverage these data for activities such as community planning, agribusiness location suitability analytics, availability of loans/funds, service center locations and staffing, and farm programs and policies. Additional sites serving compatible mashable USDA Big Data web services are as follows: the Food Environment Atlas, the Atlas of Rural and Small-Town America, the Farm Program Atlas, the SNAP Data System, CropScape, and VegScape.
All portals use a similar data organization scheme of "Categories" and "Maps", providing interactive, mashable web services for agricultural stakeholders to exploit.

485. EnviroAtlas - Austin, TX - Demographics by Block Group Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). This EnviroAtlas dataset is a summary of key demographic groups for the EnviroAtlas community. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

486. Research into Australian emergency services personnel mental health and wellbeing: An evidence map.

    PubMed

    Varker, Tracey; Metcalf, Olivia; Forbes, David; Chisolm, Katherine; Harvey, Sam; Van Hooff, Miranda; McFarlane, Alexander; Bryant, Richard; Phelps, Andrea J

    2018-02-01

    Evidence maps are a method of systematically characterising the range of research activity in broad topic areas and are a tool for guiding research priorities. 'Evidence-mapping' methodology was used to quantify the nature and distribution of recent peer-reviewed research into the mental health and wellbeing of Australian emergency services personnel. A search of the PsycINFO, EMBASE and Cochrane Library databases was performed for primary research articles that were published between January 2011 and July 2016. In all, 43 studies of primary research were identified and mapped. The majority of the research focused on organisational and individual/social factors and how they relate to mental health problems/wellbeing. There were several areas of research where very few studies were detected through the mapping process, including suicide, personality, stigma and pre-employment factors that may contribute to mental health outcomes, and the use of e-health. No studies were detected which examined the prevalence of self-harm and/or harm to others, bullying, alcohol/substance use, barriers to care, or the experience of families of emergency services personnel.
In addition, there was no comprehensive national study that had investigated all sectors of emergency services personnel. This evidence map highlights the need for future research to address the current gaps in mental health and wellbeing research among Australian emergency services personnel. Improved understanding of the mental health and wellbeing of emergency services personnel, and the factors that contribute, should guide organisations' wellbeing policies and procedures.

487. DECADE Web Portal: Integrating MaGa, EarthChem and GVP Will Further Our Knowledge on Earth Degassing

    NASA Astrophysics Data System (ADS)

    Cardellini, C.; Frigeri, A.; Lehnert, K. A.; Ash, J.; McCormick, B.; Chiodini, G.; Fischer, T. P.; Cottrell, E.

    2014-12-01

    The release of gases from the Earth's interior to the exosphere takes place in both volcanic and non-volcanic areas of the planet. Fully understanding this complex process requires the integration of geochemical, petrological and volcanological data. At present, major online data repositories relevant to studies of degassing are not linked and interoperable. We are developing interoperability between three of them, which will support more powerful synoptic studies of degassing. The three data systems that will make their data accessible via the DECADE portal are: (1) the Smithsonian Institution's Global Volcanism Program database (GVP) of volcanic activity data, (2) the EarthChem databases for geochemical and geochronological data of rocks and melt inclusions, and (3) the MaGa database (Mapping Gas emissions), which contains compositional and flux data of gases released at volcanic and non-volcanic degassing sites. These databases are developed and maintained by institutions or groups of experts in a specific field, and data are archived in formats specific to these databases. In the framework of the Deep Earth Carbon Degassing (DECADE) initiative of the Deep Carbon Observatory (DCO), we are developing a web portal that will create a powerful search engine over these databases from a single entry point. The portal will return comprehensive multi-component datasets based on the search criteria selected by the user. For example, a single geographic or temporal search will return data relating to compositions of emitted gases and erupted products, the age of the erupted products, and coincident activity at the volcano. The development of this level of capability for the DECADE portal requires complete synergy between these databases, including the availability of standards-based web services (WMS, WFS) at all data systems. Data and metadata can thus be extracted from each system without interfering with each database's local schema or being replicated to achieve integration at the DECADE web portal. The DECADE portal will enable new synoptic perspectives on the Earth degassing process. Other data systems can be easily plugged in using the existing framework. Our vision is to explore Earth degassing related datasets over previously unexplored spatial or temporal ranges.
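Record 487 hinges on every participating data system exposing standard OGC web services (WMS for rendered maps, WFS for the underlying features) so that the portal can query them without touching each database's local schema. The snippet below sketches what such a standards-based client call looks like using the OWSLib library; the service URLs, layer name and feature-type name are hypothetical placeholders, not the actual DECADE, MaGa or EarthChem endpoints.

```python
# Sketch of a standards-based OGC client using OWSLib.
# The URLs and the layer / feature-type names are placeholders, not real endpoints.
from owslib.wms import WebMapService
from owslib.wfs import WebFeatureService

WMS_URL = "https://example.org/degassing/wms"   # hypothetical WMS endpoint
WFS_URL = "https://example.org/degassing/wfs"   # hypothetical WFS endpoint

# Render a map of gas-emission sites for a bounding box around Italy.
wms = WebMapService(WMS_URL, version="1.3.0")
img = wms.getmap(layers=["gas_emission_sites"],          # hypothetical layer name
                 srs="EPSG:4326",
                 bbox=(6.0, 36.0, 19.0, 47.5),
                 size=(800, 600),
                 format="image/png")
with open("emission_sites.png", "wb") as f:
    f.write(img.read())

# Fetch the underlying features (e.g. CO2 flux measurements) for the same area.
wfs = WebFeatureService(WFS_URL, version="2.0.0")
response = wfs.getfeature(typename="degassing:co2_flux",  # hypothetical type name
                          bbox=(6.0, 36.0, 19.0, 47.5))
print(response.read()[:200])
```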
488. EnviroAtlas - Durham, NC - One Meter Resolution Urban Area Land Cover Map (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Durham, NC land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four-band (red, green, blue and near-infrared) aerial photography from July 2010 at 1 m spatial resolution. Five land cover classes were mapped: impervious surface, soil and barren, grass and herbaceous, trees and forest, and water. An accuracy assessment using a stratified random sampling of 500 samples yielded an overall accuracy of 83 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Durham, and includes the cities of Durham, Chapel Hill, Carrboro and Hillsborough, NC. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

489. Pre-Service Teacher Beliefs on the Antecedents to Bullying: A Concept Mapping Study

    ERIC Educational Resources Information Center

    Lopata, Joel A.; Nowicki, Elizabeth A.

    2014-01-01

    In this study, researchers gathered Canadian pre-service teachers' beliefs on the antecedents to bullying. Concept mapping (Kane & Trochim, 2007) was used to analyze the data. This study's findings identified pre-service teachers to have accurate beliefs, inaccurate beliefs, and a lack of knowledge about the antecedents to bullying.
Concept…

490. CPC - Monitoring & Data: Pacific Island Climate Data

    Science.gov Websites

    NOAA National Weather Service, Climate Prediction Center: Monitoring and Data > Pacific Islands Climate Data & Maps (island stations). NOAA Center for Weather and Climate Prediction.

491. Mapping mental health service access: achieving equity through quality improvement.

    PubMed

    Green, Stuart A; Poots, Alan J; Marcano-Belisario, Jose; Samarasundera, Edgar; Green, John; Honeybourne, Emmi; Barnes, Ruth

    2013-06-01

    Improving Access to Psychological Therapies (IAPT) services deliver evidence-based care to people with depression and anxiety. A quality improvement (QI) initiative was undertaken by an IAPT service to improve referrals, providing an opportunity to evaluate equitable access. QI methodologies were used by the clinical team to improve referrals to the service. The collection of geo-coded data allowed referrals to be mapped to small geographical areas according to deprivation. A total of 6078 patients were referred to the IAPT service during the period of analysis and mapped to 120 unique lower super output areas (LSOAs). The average weekly referral rate rose from 17 during the baseline phase to 43 during the QI implementation phase. Spatial analysis demonstrated that all 15 of the high deprivation/low referral LSOAs were converted to high deprivation/high or medium referral LSOAs following the QI initiative. This work highlights the importance of QI in developing clinical services aligned to the needs of the population through the analysis of routine data matched to health needs. Mapping can be utilized to communicate complex information to inform the planning and organization of clinical service delivery and to evaluate the progress and sustainability of QI initiatives.

492. Ecosystem services provided by a complex coastal region: challenges of classification and mapping.

    PubMed

    Sousa, Lisa P; Sousa, Ana I; Alves, Fátima L; Lillebø, Ana I

    2016-03-11

    A variety of ecosystem services classification systems and mapping approaches are available in the scientific and technical literature, which need to be selected and adapted when applied to complex territories (e.g. at the interface between water and land, estuary and sea).
This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that will serve as a basis to map the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness, but also the challenges, of applying CICES at regional scale; and the challenges regarding ecosystem services mapping.

493. Ecosystem services provided by a complex coastal region: challenges of classification and mapping

    PubMed Central

    Sousa, Lisa P.; Sousa, Ana I.; Alves, Fátima L.; Lillebø, Ana I.

    2016-01-01

    A variety of ecosystem services classification systems and mapping approaches are available in the scientific and technical literature, which need to be selected and adapted when applied to complex territories (e.g. at the interface between water and land, estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that will serve as a basis to map the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness, but also the challenges, of applying CICES at regional scale; and the challenges regarding ecosystem services mapping. PMID:26964892

494. Satellites vs. fiber optics based networks and services - Road map to strategic planning

    NASA Astrophysics Data System (ADS)

    Marandi, James H. R.
    An overview of a generic telecommunications network and its components is presented, and the current developments in satellite and fiber optics technologies are discussed with an eye on the trends in industry. A baseline model is proposed, and a cost comparison of fiber- vs satellite-based networks is made. A step-by-step 'road map' to the successful strategic planning of telecommunications services and facilities is presented. This road map provides for optimization of the current and future networks and services through effective utilization of both satellites and fiber optics. The road map is then applied to different segments of the telecommunications industry and marketplace, to show its effectiveness for the strategic planning of executives of three types: (1) those heading telecommunications manufacturing concerns, (2) those leading communication service companies, and (3) managers of telecommunication/MIS departments of major corporations. Future networking issues, such as developments in integrated-services digital network standards and technologies, are addressed.

495. A Walk through TRIDEC's intermediate Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Reißland, S.; Lendholt, M.

    2012-04-01

    The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC is based on the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS), providing a service platform for both sensor integration and warning dissemination. In TRIDEC new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. Successively, the demonstrators are addressing challenges such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors, unconventional sensors and sensor networks also play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS).
  A Walk through TRIDEC's intermediate Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Reißland, S.; Lendholt, M.

    2012-04-01

    The management of natural crises is an important application field of the technology developed in the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme. TRIDEC builds on the German Indonesian Tsunami Early Warning System (GITEWS) and the Distant Early Warning System (DEWS), which provide a service platform for both sensor integration and warning dissemination. In TRIDEC, new developments in Information and Communication Technology (ICT) are used to extend the existing platform, realising a component-based technology framework for building distributed tsunami warning systems for deployment, e.g. in the North-eastern Atlantic, the Mediterranean and Connected Seas (NEAM) region. The TRIDEC system will be implemented in three phases, each with a demonstrator. The successive demonstrators address challenges such as the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulation tools and data fusion tools. In addition to conventional sensors, unconventional sensors and sensor networks also play an important role in TRIDEC. The system version presented is based on service-oriented architecture (SOA) concepts and on relevant standards of the Open Geospatial Consortium (OGC), the World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS). In this way the system continuously gathers, processes and displays events and data coming from open sensor platforms to enable operators to quickly decide whether an early warning is necessary and to send personalised warning messages to the authorities and the population at large through a wide range of communication channels. The system integrates OGC Sensor Web Enablement (SWE) compliant sensor systems for the rapid detection of hazardous events, such as earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements. Using the OGC Web Map Service (WMS) and Web Feature Service (WFS), spatial data are used to depict the situation picture. The integration of a simulation system to identify affected areas is considered using the OGC Web Processing Service (WPS). Warning messages are compiled and transmitted in the OASIS Common Alerting Protocol (CAP), together with addressing information defined via the OASIS Emergency Data Exchange Language - Distribution Element (EDXL-DE). The first system demonstrator has been designed and implemented to support plausible scenarios demonstrating the treatment of simulated tsunami threats with an essential subset of a National Tsunami Warning Centre (NTWC). The feasibility and the potential of the implemented approach are demonstrated covering standard operations as well as tsunami detection and alerting functions. The demonstrator presented addresses information management and decision-support processes in a hypothetical natural crisis situation caused by a tsunami in the Eastern Mediterranean. Developments of the system are based to the largest extent on free and open source software (FOSS) components and industry standards. Emphasis has been, and will continue to be, placed on leveraging open source technologies that support mature system architecture models wherever appropriate. All open source software produced is foreseen to be published in a publicly available software repository, thus allowing others to reuse the results achieved and enabling further development and collaboration with a wide community including scientists, developers, users and stakeholders. This live demonstration is linked with the talk "TRIDEC Natural Crisis Management Demonstrator for Tsunamis" (EGU2012-7275) given in the session "Architecture of Future Tsunami Warning Systems" (NH5.7/ESSI1.7).
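    The record above mentions that warnings are compiled and transmitted as OASIS CAP messages. The sketch below shows one plausible way to assemble a minimal CAP 1.2 alert with the Python standard library; the identifier, sender and all other values are invented placeholders, and a real NTWC message would additionally carry area geometry, EDXL-DE addressing information and signatures.

    # Minimal sketch of a CAP 1.2 alert. All values are placeholders.
    import xml.etree.ElementTree as ET
    from datetime import datetime, timezone

    CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

    def q(tag):
        # Clark notation for namespaced tags, e.g. "{urn:...:cap:1.2}alert"
        return f"{{{CAP_NS}}}{tag}"

    def build_cap_alert(event, headline, severity="Extreme"):
        alert = ET.Element(q("alert"))
        for tag, text in [
            ("identifier", "example-tsunami-0001"),   # placeholder identifier
            ("sender", "ntwc@example.org"),           # placeholder sender
            ("sent", datetime.now(timezone.utc).isoformat(timespec="seconds")),
            ("status", "Exercise"),
            ("msgType", "Alert"),
            ("scope", "Public"),
        ]:
            ET.SubElement(alert, q(tag)).text = text
        info = ET.SubElement(alert, q("info"))
        for tag, text in [
            ("category", "Geo"),
            ("event", event),
            ("urgency", "Immediate"),
            ("severity", severity),
            ("certainty", "Observed"),
            ("headline", headline),
        ]:
            ET.SubElement(info, q(tag)).text = text
        return ET.tostring(alert, encoding="unicode")

    print(build_cap_alert("Tsunami", "Tsunami exercise message for the Eastern Mediterranean"))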
  US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access

    NASA Astrophysics Data System (ADS)

    Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.

    2012-04-01

    The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and the OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
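    One of the metadata workflows mentioned above is harvesting from OGC capabilities documents. The sketch below illustrates that idea by listing the named layers advertised by a WMS 1.3.0 endpoint using only the Python standard library; the endpoint URL is a placeholder, and this is not the project's actual harvesting code.

    # Sketch: list the named layers advertised by a WMS 1.3.0 GetCapabilities
    # response. Replace the placeholder URL with a real service before running.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    WMS_NS = {"wms": "http://www.opengis.net/wms"}

    def list_wms_layers(base_url):
        params = urllib.parse.urlencode({
            "service": "WMS", "request": "GetCapabilities", "version": "1.3.0"})
        with urllib.request.urlopen(f"{base_url}?{params}") as resp:
            root = ET.parse(resp).getroot()
        layers = []
        for layer in root.findall(".//wms:Layer", WMS_NS):
            name = layer.findtext("wms:Name", default=None, namespaces=WMS_NS)
            title = layer.findtext("wms:Title", default="", namespaces=WMS_NS)
            if name:  # only named (requestable) layers
                layers.append((name, title))
        return layers

    # Hypothetical endpoint used purely for illustration.
    for name, title in list_wms_layers("https://example.org/geoserver/wms"):
        print(f"{name}: {title}")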
  GI-conf: A configuration tool for the GI-cat distributed catalog

    NASA Astrophysics Data System (ADS)

    Papeschi, F.; Boldrini, E.; Bigagli, L.; Mazzetti, P.

    2009-04-01

    In this work we present a configuration tool for GI-cat. In a Service-Oriented Architecture (SOA) framework, GI-cat implements a distributed catalog service providing advanced capabilities such as caching, brokering and mediation. GI-cat applies a distributed approach: it distributes queries to the remote service providers of interest in an asynchronous style and notifies the caller of the status of the queries through an incremental feedback mechanism. Today, GI-cat functionalities are made available through two standard catalog interfaces: the OGC CSW ISO and CSW Core Application Profiles. Two other interfaces are under testing: the CIM and the EO Extension Packages of the CSW ebRIM Application Profile. GI-cat is able to interface a multiplicity of discovery and access services serving heterogeneous Earth and Space Sciences resources. These include international standards such as the OGC Web Services (i.e. OGC CSW, WCS, WFS and WMS), as well as interoperability arrangements (i.e. community standards) such as UNIDATA THREDDS/OPeNDAP, SeaDataNet CDI (Common Data Index), GBIF (Global Biodiversity Information Facility) services, and SibESS-C infrastructure services. GI-conf implements a user-friendly configuration tool for GI-cat. It is a GUI application that employs a visual and very simple approach to configure both the GI-cat publishing and distribution capabilities in a dynamic way. The tool allows the user to set up one or more GI-cat configurations. Each configuration consists of: a) the catalog standard interfaces published by GI-cat; and b) the resources (i.e. services/servers) to be accessed and mediated, i.e. federated. Simple icons are used for interfaces and resources, implementing a user-friendly visual approach. The main GI-conf functionalities are:

    • Interfaces and federated resources management: the user can set which interfaces are published, and can add a new resource or update or remove an already federated resource.

    • Multiple configuration management: multiple GI-cat configurations can be defined; every configuration identifies a set of published interfaces and a set of federated resources. Configurations can be edited, added, removed, exported, and even imported.

    • HTML report creation: an HTML report can be created, showing the currently active GI-cat configuration, including the resources that are being federated and the published interface endpoints.

    The configuration tool is shipped with GI-cat and can be used to configure the service after its installation is completed.
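    A GI-cat configuration, as described above, pairs the catalog interfaces to be published with the set of resources to be federated. The sketch below shows one way such a configuration could be represented and exported for re-import; the field names and JSON layout are assumptions made for illustration, not GI-conf's actual configuration format.

    # Illustrative configuration structure (not GI-conf's real schema).
    import json

    config = {
        "name": "default",
        "published_interfaces": ["CSW ISO AP", "CSW ebRIM/CIM", "CSW ebRIM/EO"],
        "federated_resources": [
            {"type": "OGC WMS", "url": "https://example.org/wms"},
            {"type": "THREDDS", "url": "https://example.org/thredds/catalog.xml"},
            {"type": "OGC CSW", "url": "https://example.org/csw"},
        ],
    }

    def export_config(cfg, path):
        """Write a configuration to disk so it can be re-imported or shared."""
        with open(path, "w", encoding="utf-8") as fh:
            json.dump(cfg, fh, indent=2)

    export_config(config, "gi_cat_config.example.json")
    print(json.dumps(config, indent=2))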
  Mapping tsunami impacts on land cover and related ecosystem service supply in Phang Nga, Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Burkhard, B.; Römer, H.; Sangkaew, S.; Graterol, R.; Haitook, T.; Sterr, H.; Sakuna-Schwartz, D.

    2013-12-01

    The 2004 Indian Ocean tsunami caused damage to coastal ecosystems and thus affected the livelihoods of the coastal communities who depend on services provided by these ecosystems. The paper presents a case study on evaluating and mapping the spatial and temporal impacts of the tsunami on land use and land cover (LULC) and related ecosystem service supply in the Phang Nga province, Thailand. The method includes local stakeholder interviews, field investigations, remote-sensing techniques, and GIS. Results provide an ecosystem services matrix with capacity scores for 18 LULC classes and 17 ecosystem functions and services, as well as pre-/post-tsunami and recovery maps indicating changes in the ecosystem service supply capacities in the study area. Local stakeholder interviews revealed that mangroves, casuarina forest, mixed beach forest, coral reefs, tidal inlets, and wetlands (peat swamp forest) have the highest capacity to supply ecosystem services, while e.g. plantations have a lower capacity. The remote-sensing based damage and recovery analysis showed a loss of ecosystem service supply capacities in almost all LULC classes for most of the services due to the tsunami. A fast recovery of LULC and related ecosystem service supply capacities within one year could be observed for e.g. beaches, while mangroves or casuarina forest needed several years to recover. Applying multi-temporal mapping, the spatial variations of recovery could be visualised. While some patches of coastal forest were fully recovered after 3 yr, other patches were still affected and thus had a reduced capacity to supply ecosystem services. The ecosystem services maps can be used to quantify ecological values and their spatial distribution in the framework of a tsunami risk assessment. Beyond that, they are considered to be a useful tool for spatial analysis in coastal risk management in Phang Nga.
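    The ecosystem services matrix described above assigns capacity scores to LULC classes. The miniature sketch below shows the bookkeeping needed to turn such scores, together with pre- and post-event areas, into area-weighted supply-change estimates; all classes, scores and areas are invented for illustration and are not the study's 18 x 17 matrix.

    # Miniature capacity-matrix sketch with invented scores and areas.
    capacity = {  # LULC class -> {service: capacity score 0-5}
        "mangrove":   {"coastal protection": 5, "fish nursery": 5, "recreation": 2},
        "casuarina":  {"coastal protection": 4, "fish nursery": 1, "recreation": 3},
        "plantation": {"coastal protection": 1, "fish nursery": 0, "recreation": 1},
    }

    area_ha = {  # hypothetical areas per class, before and after the event
        "pre":  {"mangrove": 120.0, "casuarina": 80.0, "plantation": 200.0},
        "post": {"mangrove": 90.0,  "casuarina": 35.0, "plantation": 180.0},
    }

    def supply(period, service):
        """Area-weighted capacity score for one service in one period."""
        return sum(capacity[lulc][service] * area_ha[period][lulc]
                   for lulc in capacity)

    for service in ("coastal protection", "fish nursery", "recreation"):
        pre, post = supply("pre", service), supply("post", service)
        print(f"{service}: pre={pre:.0f} post={post:.0f} change={post - pre:+.0f}")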
  The Contribution of Constructivist Instruction Accompanied by Concept Mapping in Enhancing Pre-Service Chemistry Teachers' Conceptual Understanding of Chemistry in the Laboratory Course

    ERIC Educational Resources Information Center

    Aydin, Sevgi; Aydemir, Nurdane; Boz, Yezdan; Cetin-Dindar, Ayla; Bektas, Oktay

    2009-01-01

    The present study aimed to evaluate whether a chemistry laboratory course called "Laboratory Experiments in Science Education" based on constructivist instruction accompanied with concept mapping enhanced pre-service chemistry teachers' conceptual understanding. Data were collected from five pre-service chemistry teachers at a university…

  About | DOE Data Explorer

    Science.gov Websites