Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under research within the planetary geospatial community.
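The OGC services named above are driven by plain HTTP requests. As a concrete illustration, a minimal sketch of assembling a WMS 1.3.0 GetMap request for a planetary mosaic follows; the endpoint URL, layer name, and default CRS code are placeholders, not a real service (actual planetary WMS endpoints and layer names differ).

```python
from urllib.parse import urlencode

# Hypothetical planetary WMS endpoint -- replace with a real service URL.
WMS_ENDPOINT = "https://example.org/planetary/wms"

def getmap_url(layer, bbox, width=1024, height=512, crs="IAU2000:49900"):
    """Build an OGC WMS 1.3.0 GetMap URL.

    bbox is (minlat, minlon, maxlat, maxlon): WMS 1.3.0 uses the CRS's
    native axis order, which for geographic CRSs is latitude first.
    The IAU-style CRS code for Mars here is an assumption for illustration.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return WMS_ENDPOINT + "?" + urlencode(params)

# Request a global equirectangular view of a (hypothetical) MOLA layer.
url = getmap_url("mola_dem", (-90, -180, 90, 180))
```

The same key-value pattern, with REQUEST=GetCapabilities, is how a client discovers which layers a server offers in the first place.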
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications through standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS) for the data themselves, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.
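A WMTS request for "pictures of data tiles" is ultimately just a URL addressing one tile by zoom level, row, and column. The sketch below computes the tile index for the common WebMercator (GoogleMapsCompatible) tiling scheme; the layer name and RESTful URL template are hypothetical, not actual GES DISC identifiers.

```python
import math

def wmts_tile(lon, lat, zoom):
    """Return (col, row) of the WebMercator tile containing (lon, lat).

    This is the standard GoogleMapsCompatible TileMatrixSet indexing
    used by many WMTS servers: 2**zoom tiles per axis, row 0 at the top.
    """
    n = 2 ** zoom
    col = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    row = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return col, row

# Hypothetical RESTful WMTS tile URL template and layer name.
template = "https://example.org/wmts/{layer}/{z}/{row}/{col}.png"

col, row = wmts_tile(-77.0, 38.9, 6)  # near Washington, DC, zoom 6
url = template.format(layer="AIRS_surface_temp", z=6, row=row, col=col)
```

Because every client computes the same indices, servers can pre-render and cache each tile once, which is what makes tiled services fast.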
1982-09-30
... not changed because they are not subject to a careful evaluation. The solution: the four job aids contained in this manual provide specific techniques for ... lesson plans, training design, and testing. NOTICE: This manual has been developed using the standards of the Information Mapping® writing service. Information Mapping, Inc.
Cartographic services contract...for everything geographic
2003-01-01
The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications as well as public use.
Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
Modern Data Center Services Supporting Science
NASA Astrophysics Data System (ADS)
Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.
2011-12-01
The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve the discovery and access to data. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications which allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards. 
NGDC continues to increase the amount of its data holdings that are accessible and is augmenting the capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.
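Incorporating a standards-based service "without having to coordinate with the provider", as described above, starts with parsing the server's self-describing GetCapabilities document to discover its fine-grained layers. A minimal sketch follows; the capabilities snippet and layer names are a stand-in for a real NGDC response, not actual service output.

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for a WMS GetCapabilities response; a real document
# would be fetched with REQUEST=GetCapabilities and is far larger.
CAPS = """<WMS_Capabilities xmlns="http://www.opengis.net/wms">
  <Capability>
    <Layer>
      <Name>etopo1_hillshade</Name>
      <Layer><Name>tsunami_events</Name></Layer>
      <Layer><Name>volcano_locations</Name></Layer>
    </Layer>
  </Capability>
</WMS_Capabilities>"""

NS = "{http://www.opengis.net/wms}"

# Walk the (possibly nested) Layer tree and collect every layer name.
layers = [n.text for n in ET.fromstring(CAPS).iter(NS + "Name")]
```

A client can then pick layers of interest and issue GetMap or GetFeature requests against them, with no provider-specific code.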
Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consumes these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal.
This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.
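An OpenSearch interaction of the kind described above has two halves: a templated query URL, and an Atom feed in the response. The sketch below shows both; the endpoint, query parameters, and sample feed entries are hypothetical, not NSIDC's actual interface.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

# Build a keyword + bounding-box OpenSearch query (hypothetical endpoint;
# real services declare their parameters in an OpenSearch description document).
query = urlencode({"q": "sea ice extent", "bbox": "-180,60,180,90", "count": 10})
url = "https://example.org/opensearch?" + query

# Minimal stand-in for the Atom feed a real service would return.
ATOM = "{http://www.w3.org/2005/Atom}"
sample_feed = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><title>Sea Ice Index, 2011-10</title></entry>
  <entry><title>Sea Ice Index, 2011-11</title></entry>
</feed>"""

# Extract the title of each result entry.
titles = [e.findtext(ATOM + "title")
          for e in ET.fromstring(sample_feed).iter(ATOM + "entry")]
```

Because both halves are standardized, the same client code can search any compliant provider.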
NaviCell Web Service for network-based data visualization.
Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei
2015-07-01
Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
US EPA Nonattainment Areas and Designations-Annual PM2.5 (1997 NAAQS)
This web service contains the following layers: PM2.5 Annual 1997 NAAQS State Level and PM2.5 Annual 1997 NAAQS National. It also contains the following tables: maps99.FRED_MAP_VIEWER.%fred_area_map_data and maps99.FRED_MAP_VIEWER.%fred_area_map_view. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1997PM25Annual/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there
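The endpoint above follows the ArcGIS Server REST API, so layer attributes can be queried programmatically with the standard query operation. A sketch, assuming layer id 0 (check the layer list at the endpoint for the actual ids and field names):

```python
from urllib.parse import urlencode

# Service endpoint taken from the record above; the layer id (0) is an
# assumption for illustration.
BASE = ("https://gispub.epa.gov/arcgis/rest/services/"
        "OAR_OAQPS/NAA1997PM25Annual/MapServer")

# Standard ArcGIS REST "query" operation parameters: return all features'
# attributes as JSON, without geometry.
params = {
    "where": "1=1",
    "outFields": "*",
    "returnGeometry": "false",
    "f": "json",
}
query_url = BASE + "/0/query?" + urlencode(params)
```

Fetching `query_url` would return a JSON feature set; the same URL pattern works against any ArcGIS MapServer layer.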
ERIC Educational Resources Information Center
Andrews, Judith; Eade, Eleanor
2013-01-01
Birmingham City University's Library and Learning Resources' strategic aim is to improve student satisfaction. A key element is the achievement of the Customer Excellence Standard. An important component of the standard is the mapping of services to improve quality. Library and Learning Resources has developed a methodology to map these…
Cartographic standards to improve maps produced by the Forest Inventory and Analysis program
Charles H. (Hobie) Perry; Mark D. Nelson
2009-01-01
The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...
Open Standards in Practice: An OGC China Forum Initiative
NASA Astrophysics Data System (ADS)
Yue, Peng; Zhang, Mingda; Taylor, Trevor; Xie, Jibo; Zhang, Hongping; Tong, Xiaochong; Yu, Jinsongdi; Huang, Juntao
2016-11-01
Open standards like OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standards-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China Forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could replace the traditional manual interactions in their business processes and provide on-demand decision support through on-line service integration. In the initiative, six organizations are involved in two "MashUp" scenarios on disaster management: one derives flood maps for Poyang Lake, Jiangxi, and the other generates turbidity maps on demand for the East Lake, Wuhan, China. The two scenarios engage different organizations from the Chinese community by integrating their sensor observations, data, and processing services, and improve the automation of the data analysis process using open standards.
Using Clouds for MapReduce Measurement Assignments
ERIC Educational Resources Information Center
Rabkin, Ariel; Reiss, Charles; Katz, Randy; Patterson, David
2013-01-01
We describe our experiences teaching MapReduce in a large undergraduate lecture course using public cloud services and the standard Hadoop API. Using the standard API, students directly experienced the quality of industrial big-data tools. Using the cloud, every student could carry out scalability benchmarking assignments on realistic hardware,…
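The course above uses the standard Hadoop API; the same map/shuffle/reduce contract can be sketched compactly in Python in the Hadoop Streaming style, where the mapper emits tab-separated key-value lines, the framework sorts them, and the reducer sums runs of identical keys. This is a generic word-count sketch, not code from the course; locally, the pieces can be chained without a cluster.

```python
from itertools import groupby

def mapper(lines):
    # Emit one "word\t1" record per word, as a streaming mapper would
    # write to stdout.
    for line in lines:
        for word in line.split():
            yield word.lower() + "\t1"

def reducer(sorted_lines):
    # Input arrives sorted by key (Hadoop's shuffle guarantees this);
    # sum the counts for each run of identical words.
    pairs = (line.split("\t") for line in sorted_lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        total = sum(int(count) for _, count in group)
        yield word + "\t" + str(total)

# Local simulation of map -> shuffle (sort) -> reduce.
shuffled = sorted(mapper(["the map the reduce"]))
result = dict(line.split("\t") for line in reducer(shuffled))
```

On a cluster the same two functions would run unchanged as `-mapper` and `-reducer` scripts under Hadoop Streaming, which is what makes the model easy to scale from a laptop exercise to real hardware.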
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry - the LMMP Portal - by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC) Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
Its data management services have been built on top of a set of open technologies, including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management, and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
QTIMaps: A Model to Enable Web Maps in Assessment
ERIC Educational Resources Information Center
Navarrete, Toni; Santos, Patricia; Hernandez-Leo, Davinia; Blat, Josep
2011-01-01
Test-based e-Assessment approaches are mostly focused on the assessment of knowledge and not on that of other skills, which could be supported by multimedia interactive services. This paper presents the QTIMaps model, which combines the IMS QTI standard with web maps services enabling the computational assessment of geographical skills. We…
OneGeology Web Services and Portal as a global geological SDI - latest standards and technology
NASA Astrophysics Data System (ADS)
Duffy, Tim; Tellez-Arenas, Agnes
2014-05-01
The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Map Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simple lithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation for queries against their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve ideally 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standards-based WMS (Web Map Service) from an available WWW server. This may be hosted either within the geological survey itself or at a neighbouring, regional or other institution that offers to serve the data for them, i.e. offers to help technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together, and it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf).
The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services, and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
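A harmonised query against one of the GeoSciML-serving WFS nodes described above is, in its simplest KVP form, just a GetFeature URL. A sketch follows; the endpoint is a placeholder, while `gsmlp:GeologicUnitView` is the standard GeoSciML-Portrayal feature type.

```python
from urllib.parse import urlencode

# Hypothetical OneGeology-style WFS node; real node URLs are listed in
# the OneGeology portal.
endpoint = "https://example.org/geoserver/wfs"

# WFS 2.0 GetFeature in key-value-pair form: fetch up to 50
# GeologicUnitView features within a bounding box over Britain
# (lat/lon order per the urn-form EPSG:4326 CRS identifier).
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gsmlp:GeologicUnitView",
    "srsName": "urn:ogc:def:crs:EPSG::4326",
    "bbox": "50.0,-6.0,56.0,2.0,urn:ogc:def:crs:EPSG::4326",
    "count": 50,
}
url = endpoint + "?" + urlencode(params)
```

Because every node serves the same feature type with the same CGI vocabularies, the identical request (bar the endpoint) can be replayed against any participating survey, which is the harmonisation the text describes.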
The deegree framework - Spatial Data Infrastructure solution for end-users and developers
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Poth, Andreas
2010-05-01
The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: provide a uniform framework for implementing Spatial Data Infrastructures (SDIs), and adhere to standards as strictly as possible. Although it is open source software (GNU Lesser General Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees while offering customization, consulting and tailoring through specialized companies. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany, the Ministry of Housing, Spatial Planning and the Environment in the Netherlands, etc.) as well as for research and development projects, serving as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service and Catalogue Service, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers.
The added value comes from a sophisticated set of client software, desktop and web environments. A focus lies on different client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions comprising multiple standards and enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlanGML standard as defined by the German spatial planning authorities is a good example of how complex real-world requirements can get. XPlanGML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure is to be used by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. The challenges faced within customer projects underline the importance of easy-to-use software components: SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.
Single-edition quadrangle maps
1998-01-01
In August 1993, the U.S. Geological Survey's (USGS) National Mapping Division and the U.S. Department of Agriculture's Forest Service signed an Interagency Agreement to begin a single-edition joint mapping program. This agreement established the coordination for producing and maintaining single-edition primary series topographic maps for quadrangles containing National Forest System lands. The joint mapping program saves money by eliminating duplication of effort by the agencies and results in a more frequent revision cycle for quadrangles containing national forests. Maps are revised on the basis of jointly developed standards and contain normal features mapped by the USGS, as well as additional features required for efficient management of National Forest System lands. Single-edition maps look slightly different but meet the content, accuracy, and quality criteria of other USGS products. The Forest Service is responsible for the land management of more than 191 million acres of land throughout the continental United States, Alaska, and Puerto Rico, including 155 national forests and 20 national grasslands. These areas make up the National Forest System lands and comprise more than 10,600 of the 56,000 primary series 7.5-minute quadrangle maps (15-minute in Alaska) covering the United States. The Forest Service has assumed responsibility for maintaining these maps, and the USGS remains responsible for printing and distributing them. Before the agreement, both agencies published similar maps of the same areas. The maps were used for different purposes, but had comparable types of features that were revised at different times. Now, the two products have been combined into one so that the revision cycle is stabilized and only one agency revises the maps, thus increasing the number of current maps available for National Forest System lands. 
This agreement has improved service to the public by requiring that the agencies share the same maps and that the maps meet a common standard, as well as by significantly reducing duplication of effort.
Paterson, Trevor; Law, Andy
2009-08-14
Genomic analysis, particularly for less well-characterized organisms, is greatly assisted by performing comparative analyses between different types of genome maps and across species boundaries. Various providers publish a plethora of on-line resources collating genome mapping data from a multitude of species. Datasources range in scale and scope from small bespoke resources for particular organisms, through larger web-resources containing data from multiple species, to large-scale bioinformatics resources providing access to data derived from genome projects for model and non-model organisms. The heterogeneity of information held in these resources reflects both the technologies used to generate the data and the target users of each resource. Currently there is no common information exchange standard or protocol to enable access and integration of these disparate resources. Consequently data integration and comparison must be performed in an ad hoc manner. We have developed a simple generic XML schema (GenomicMappingData.xsd - GMD) to allow export and exchange of mapping data in a common lightweight XML document format. This schema represents the various types of data objects commonly described across mapping datasources and provides a mechanism for recording relationships between data objects. The schema is sufficiently generic to allow representation of any map type (for example genetic linkage maps, radiation hybrid maps, sequence maps and physical maps). It also provides mechanisms for recording data provenance and for cross referencing external datasources (including, for example, Ensembl, PubMed and GenBank). The schema is extensible via the inclusion of additional datatypes, which can be achieved by importing further schemas, e.g. a schema defining relationship types. We have built demonstration web services that export data from our ArkDB database according to the GMD schema, facilitating the integration of data retrieval into Taverna workflows.
The data exchange standard we present here provides a useful generic format for transfer and integration of genomic and genetic mapping data. The extensibility of our schema allows for inclusion of additional data and provides a mechanism for typing mapping objects via third party standards. Web services retrieving GMD-compliant mapping data demonstrate that use of this exchange standard provides a practical mechanism for achieving data integration, by facilitating syntactically and semantically-controlled access to the data.
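As an illustration of the kind of lightweight document the schema targets, the sketch below builds and parses a tiny GMD-style map record with Python's standard library. The element and attribute names are invented for illustration; they are not taken from the actual GenomicMappingData.xsd vocabulary.

```python
import xml.etree.ElementTree as ET

# Hypothetical GMD-style document: element names are illustrative only.
doc = """
<genomicMappingData>
  <map type="genetic_linkage" species="Gallus gallus">
    <marker id="MCW0058" position="12.3"/>
    <marker id="ADL0268" position="45.7"/>
    <crossReference source="ENSEMBL" accession="ENSGALG00000012345"/>
  </map>
</genomicMappingData>
"""

root = ET.fromstring(doc)
# Collect marker positions keyed by marker id.
markers = {m.get("id"): float(m.get("position"))
           for m in root.iter("marker")}
```

A consuming workflow (e.g. a Taverna step) would apply the same kind of traversal to GMD-compliant service output.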
Paterson, Trevor; Law, Andy
2009-01-01
PMID:19682365
Operational Use of OGC Web Services at the Met Office
NASA Astrophysics Data System (ADS)
Wright, Bruce
2010-05-01
The Met Office has adopted the Service-Oriented Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: - WMS requests are made for 256x256 tiles for fixed areas and zoom levels; - a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; - the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data.
These are locally rendered as maps or graphs and combined with the WMS pre-rendered images and text in a FLEX application to provide a sophisticated, impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle database and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
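A tile cache like the one described issues fixed-grid WMS GetMap requests behind the scenes. The sketch below assembles such a request in Python; the endpoint and layer name are placeholders, but the query parameters follow the standard WMS 1.3.0 GetMap interface.

```python
from urllib.parse import urlencode

def wms_tile_url(base_url, layer, bbox, crs="EPSG:4326",
                 size=256, fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request for one fixed 256x256 tile.

    A tile cache issues requests like this for each (area, zoom level)
    pair; base_url and layer are hypothetical placeholders.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size,
        "HEIGHT": size,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Example: a UK-wide precipitation tile (illustrative values).
url = wms_tile_url("https://example.org/wms", "precipitation",
                   (49.0, -11.0, 61.0, 2.0))
```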
GIO-EMS and International Collaboration in Satellite based Emergency Mapping
NASA Astrophysics Data System (ADS)
Kucera, Jan; Lemoine, Guido; Broglia, Marco
2013-04-01
During the last decade, satellite-based emergency mapping has developed into a mature operational stage. The European Union's GMES Initial Operations - Emergency Management Service (GIO-EMS) has been operational since April 2012. Its setup differs from other mechanisms (for example, the International Charter "Space and Major Disasters") in that it extends fast satellite tasking and delivery with value-adding map production as a single service, available free of charge to the authorized users of the service. Maps and vector datasets with standard characteristics and formats, ranging from post-disaster damage assessment to recovery and disaster prevention, are covered by this initiative. The main users of the service are European civil protection authorities and international organizations active in humanitarian aid. All non-sensitive outputs of the service are accessible to the public. The European Commission's in-house science service, the Joint Research Centre (JRC), is the technical and administrative supervisor of the GIO-EMS. The EC's DG ECHO Monitoring and Information Centre acts as the service's focal point and DG ENTR is responsible for overall service governance. GIO-EMS also aims to contribute to synergy with similar existing mechanisms at national and international level. The usage of satellite data for emergency mapping has increased in recent years, and this trend is expected to continue as suitable satellite and other relevant data become more easily accessible. Furthermore, data and analyses from volunteer emergency mapping communities are expected to further enrich the content of such cartographic products. In the case of major disasters, the parallel activity of multiple providers is likely to generate non-optimal use of resources, e.g. unnecessary duplication, whereas coordination may reduce the time needed to cover the disaster area.
Furthermore, the abundance of geospatial products of differing characteristics and quality can become confusing for users. The urgent need for better coordination has led to the establishment of the International Working Group on Satellite Based Emergency Mapping (IWG-SEM). Members of the IWG-SEM, which include JRC, USGS, DLR-ZKI, SERVIR, Sentinel Asia, UNOSAT, UN-SPIDER, GEO, ITHACA and SERTIT, have recognized the need to establish best practices among operational satellite-based emergency mapping programs. The group intends to: • work with the appropriate organizations on the definition of professional standards for emergency mapping and guidelines for product generation, and review relevant technical standards and protocols • facilitate communication and collaboration during major emergencies • stimulate coordination of expertise and capacities. The existence of the group and the cooperation among its members have already brought benefits during the disasters in Africa and Europe in 2012, in terms of faster and more effective satellite data provision and better product generation.
Standards-Based Open-Source Planetary Map Server: Lunaserv
NASA Astrophysics Data System (ADS)
Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.
2018-04-01
Lunaserv is a planetary-capable Web Map Service developed by the LROC Science Operations Center (SOC). It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.
Habitat scale mapping of fisheries ecosystem services values in estuaries
Little is known about the variability of ecosystem service values at spatial scales most relevant to local decision makers. Competing definitions of ecosystem services, the paucity of ecological and economic information and the lack of standardization in methodology are major ob...
NASA Astrophysics Data System (ADS)
Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara
2017-04-01
Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation, and are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local/national databases and platforms provide and publish data on different types of geo-hazards, as well as web-based risk maps and decision support systems. The European Commission also implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of users into the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web processing and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and a spatial data catalogue for the management of EO data and other geospatial data on geo-hazards, as well as descriptions and protocols for data processing and analysis. An interface to extend data integration from external sources (e.g.
Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of Java-based web and GIS servers. Sub-services and main services make up the service tier. Sub-services include map services, feature editing services, geometry services, geoprocessing services and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and REST interfaces are used. Four central main services are designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources, and (4) an infrastructure service to identify affected infrastructure. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards makes use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, to develop a spatial data infrastructure that can assist targeted mapping and monitoring of geo-hazards in a global context.
Field Guide to the Plant Community Types of Voyageurs National Park
Faber-Langendoen, Don; Aaseng, Norman; Hop, Kevin; Lew-Smith, Michael
2007-01-01
INTRODUCTION The objective of the U.S. Geological Survey-National Park Service Vegetation Mapping Program is to classify, describe, and map vegetation for most of the park units within the National Park Service (NPS). The program was created in response to the NPS Natural Resources Inventory and Monitoring Guidelines issued in 1992. Products for each park include digital files of the vegetation map and field data, keys and descriptions to the plant communities, reports, metadata, map accuracy verification summaries, and aerial photographs. Interagency teams work in each park and, following standardized mapping and field sampling protocols, develop products and vegetation classification standards that document the various vegetation types found in a given park. The use of a standard national vegetation classification system and mapping protocol facilitates effective resource stewardship by ensuring compatibility and widespread use of the information throughout the NPS as well as by other Federal and state agencies. These vegetation classifications, maps, and associated information support a wide variety of resource assessment, park management, and planning needs, and provide a structure for framing and answering critical scientific questions about plant communities and their relation to environmental processes across the landscape. This field guide is intended to make the classification accessible to park visitors and researchers at Voyageurs National Park, allowing them to identify any stand of natural vegetation and showing how the classification can be used in conjunction with the vegetation map (Hop and others, 2001).
Maroney, Susan A; McCool, Mary Jane; Geter, Kenneth D; James, Angela M
2007-01-01
The internet is used increasingly as an effective means of disseminating information. For the past five years, the United States Department of Agriculture (USDA) Veterinary Services (VS) has published animal health information in internet-based map server applications, each oriented to a specific surveillance or outbreak response need. Using internet-based technology allows users to create dynamic, customised maps and perform basic spatial analysis without the need to buy or learn desktop geographic information systems (GIS) software. At the same time, access can be restricted to authorised users. The VS internet mapping applications to date are as follows: Equine Infectious Anemia Testing 1972-2005, National Tick Survey tick distribution maps, the Emergency Management Response System-Mapping Module for disease investigations and emergency outbreaks, and the Scrapie mapping module to assist with the control and eradication of this disease. These services were created using Environmental Systems Research Institute (ESRI)'s internet map server technology (ArcIMS). Other leading technologies for spatial data dissemination are ArcGIS Server, ArcEngine, and ArcWeb Services. VS is prototyping applications using these technologies, including the VS Atlas of Animal Health Information using ArcGIS Server technology and the Map Kiosk using ArcEngine for automating standard map production in the case of an emergency.
NASA Astrophysics Data System (ADS)
Jencks, J. H.; Cartwright, J.; Varner, J. D.
2016-12-01
Exploring, understanding, and managing the global oceans are a challenge when hydrographic maps are available for only 5% of the world's oceans. Seafloor mapping is expensive, and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify whether data already exist in the area of interest. There are many reasons why this seemingly simple suggestion is easier said than done. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not, and in some cases data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. The National Oceanic and Atmospheric Administration (NOAA) is an active participant in numerous campaign mapping projects whose goals are to carry out coordinated and comprehensive ocean mapping efforts. One of these international programs is an outcome of the Galway Statement on Atlantic Ocean Cooperation, signed by the European Union, Canada, and the United States in 2013. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially enabled databases, standards-based web services, and International Organization for Standardization (ISO) metadata. Through the use of industry standards, the services are constructed such that they can be combined and re-used in a variety of contexts. For example, users may leverage the services in desktop analysis tools, web applications created by the hosting organizations (e.g.
the North Atlantic Data Portal), or in custom applications they develop themselves. In order to maximize the return on campaign mapping investments, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and formats now and well into the future. Working together, we can ensure that valuable data are made available to the broadest community.
Proposed DoD (Department of Defense) Internet Protocol Standard.
1982-07-06
Service parameters fall into two categories: service quality parameters and service options. Service quality parameters influence the transmission service provided; for example, the Precedence parameter attempts preferential treatment for high-importance datagrams, and the type of service (TOS) settings select the transmission quality. IP passes the TOS command set for service quality to the SNP, where it is mapped into subnetwork parameters.
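The TOS mapping described above operates on the RFC 791 Type of Service octet. A minimal Python sketch of packing that octet, with precedence in the top three bits followed by the delay, throughput and reliability flags:

```python
def pack_tos(precedence, low_delay=False, high_throughput=False,
             high_reliability=False):
    """Pack the RFC 791 IPv4 Type of Service octet.

    precedence: 0 (Routine) .. 7 (Network Control), carried in the top
    three bits; the next three bits request low delay, high throughput
    and high reliability respectively. The low two bits are reserved.
    """
    if not 0 <= precedence <= 7:
        raise ValueError("precedence must be 0..7")
    return (precedence << 5) | (low_delay << 4) \
           | (high_throughput << 3) | (high_reliability << 2)

# CRITIC/ECP precedence (5) with low delay requested:
tos = pack_tos(5, low_delay=True)
```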
Satellites vs. fiber optics based networks and services - Road map to strategic planning
NASA Astrophysics Data System (ADS)
Marandi, James H. R.
An overview of a generic telecommunications network and its components is presented, and the current developments in satellite and fiber optics technologies are discussed with an eye on industry trends. A baseline model is proposed, and a cost comparison of fiber- vs. satellite-based networks is made. A step-by-step 'road map' to the successful strategic planning of telecommunications services and facilities is presented. This road map provides for optimization of current and future networks and services through effective utilization of both satellites and fiber optics. The road map is then applied to different segments of the telecommunications industry and marketplace to show its effectiveness for strategic planning by executives of three types: (1) those heading telecommunications manufacturing concerns, (2) those leading communication service companies, and (3) managers of telecommunication/MIS departments of major corporations. Future networking issues, such as developments in integrated-services digital network standards and technologies, are addressed.
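A cost comparison in such a baseline model often reduces to a break-even route length, since fiber cost grows roughly linearly with distance while a satellite hop costs about the same regardless of distance. A minimal sketch with entirely hypothetical cost figures, not taken from the paper:

```python
def breakeven_distance(fiber_cost_per_km, fiber_fixed, satellite_fixed):
    """Route length (km) beyond which a distance-insensitive satellite
    link becomes cheaper than terrestrial fiber.

    Inputs are hypothetical planning figures: fiber has a fixed cost
    plus a per-km cost; the satellite link has a fixed cost only.
    """
    return (satellite_fixed - fiber_fixed) / fiber_cost_per_km

# Illustrative numbers only:
d = breakeven_distance(fiber_cost_per_km=20_000,
                       fiber_fixed=500_000,
                       satellite_fixed=4_500_000)
```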
Analysis of QoS Requirements for e-Health Services and Mapping to Evolved Packet System QoS Classes
Skorin-Kapov, Lea; Matijasevic, Maja
2010-01-01
E-Health services comprise a broad range of healthcare services delivered by using information and communication technology. In order to support existing as well as emerging e-Health services over converged next generation network (NGN) architectures, there is a need for network QoS control mechanisms that meet the often stringent requirements of such services. In this paper, we evaluate the QoS support for e-Health services in the context of the Evolved Packet System (EPS), specified by the Third Generation Partnership Project (3GPP) as a multi-access all-IP NGN. We classify heterogeneous e-Health services based on context and network QoS requirements and propose a mapping to existing 3GPP QoS Class Identifiers (QCIs) that serve as a basis for the class-based QoS concept of the EPS. The proposed mapping aims to provide network operators with guidelines for meeting heterogeneous e-Health service requirements. As an example, we present the QoS requirements for a prototype e-Health service supporting tele-consultation between a patient and a doctor and illustrate the use of the proposed mapping to QCIs in standardized QoS control procedures. PMID:20976301
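The class-based idea above can be sketched as a lookup from e-Health traffic classes to QCIs. The specific assignments below are illustrative guesses consistent with standard 3GPP QCI semantics, not the mapping proposed in the paper:

```python
# Illustrative mapping of e-Health traffic classes to 3GPP QoS Class
# Identifiers (QCIs). Class names are invented; QCI semantics follow
# the standardized table (1: conversational voice, 2: conversational
# video, 5: IMS signalling, 8/9: best-effort TCP traffic).
EHEALTH_QCI = {
    "teleconsultation_audio": 1,   # GBR, conversational voice
    "teleconsultation_video": 2,   # GBR, conversational video
    "session_signalling":     5,   # non-GBR, SIP/IMS signalling
    "medical_image_transfer": 8,   # non-GBR, bulk TCP transfer
}

def qci_for(service_class):
    """QCI for a named e-Health traffic class (default 9, best effort)."""
    return EHEALTH_QCI.get(service_class, 9)
```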
NASA Astrophysics Data System (ADS)
Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.
2007-12-01
The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow.
Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS-related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be a compliant OGC specification) also provide access.
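The THREDDS subsetting step described above is typically driven by a NetCDF Subset Service (NCSS) request. A sketch of building one in Python; the dataset URL is a placeholder, but the query parameters follow the standard NCSS interface:

```python
from urllib.parse import urlencode

def ncss_subset_url(dataset_url, variable, north, south, east, west,
                    time_start, time_end):
    """Build a THREDDS NetCDF Subset Service request that trims a
    gridded dataset to a region and time window before download."""
    query = urlencode({
        "var": variable,
        "north": north, "south": south,
        "east": east, "west": west,
        "time_start": time_start, "time_end": time_end,
        "accept": "netcdf",  # return a subsetted NetCDF file
    })
    return f"{dataset_url}?{query}"

# Illustrative request: CONUS temperature for one day.
url = ncss_subset_url("https://example.gov/thredds/ncss/model.nc",
                      "temperature", 50, 25, -65, -125,
                      "2007-12-01T00:00:00Z", "2007-12-02T00:00:00Z")
```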
Online, interactive assessment of geothermal energy potential in the U.S
NASA Astrophysics Data System (ADS)
Allison, M. L.; Richard, S. M.; Clark, R.; Coleman, C.; Love, D.; Pape, E.; Musil, L.
2011-12-01
Geothermal-relevant geoscience data from all 50 states (www.stategeothermaldata.org), federal agencies, national labs, and academic centers are being digitized and linked in a distributed network via the U.S. Department of Energy-funded National Geothermal Data System (NGDS) to foster geothermal energy exploration and development through the use of interactive online 'mashups,' data integration, and applications. The emphasis is first to make as much information as possible accessible, with the long-range goal of making data interoperable through standardized services and interchange formats. Resources may be made available as documents (files) in their current formats, converted to tabular files using standard content models, or published as Open Geospatial Consortium or ESRI Web services using standard XML schemas. An initial set of thirty geoscience data content models is in use or under development to define standardized interchange formats: aqueous chemistry, borehole temperature data, direct use feature, drill stem test, earthquake hypocenter, fault feature, geologic contact feature, geologic unit feature, thermal/hot spring description, metadata, quaternary fault, volcanic vent description, well header feature, borehole lithology log, crustal stress, gravity, heat flow/temperature gradient, permeability, and feature description data such as developed geothermal systems, geologic unit geothermal properties, production data, rock alteration description, rock chemistry, and thermal conductivity. Map services are also being developed for isopach maps (depth to bedrock) and aquifer temperature maps, and several states are working on geothermal resource overview maps. Content models are developed preferentially from existing community use in order to encourage widespread adoption and promulgate minimum metadata quality standards.
Geoscience data and maps from NGDS participating institutions (USGS, Southern Methodist University, Boise State University Geothermal Data Coalition) are being supplemented with extensive land management and land use resources from the Western Regional Partnership (15 federal agencies and 5 Western states) to provide access to a comprehensive, holistic set of data critical to geothermal energy development. As of August 2011, over 33,000 data resources have been registered in the system catalog, along with scores of Web services to deliver integrated data to the desktop for free downloading or online use. The data exchange mechanism is built on the U.S. Geoscience Information Network (USGIN, http://lab.usgin.org) protocols and standards developed in partnership with the U.S. Geological Survey.
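Exchanging data through content models implies validating records against an agreed set of fields. A toy Python check against a simplified, hypothetical borehole-temperature model (the real NGDS content model defines its own field vocabulary):

```python
# Hypothetical, simplified content model: field name -> expected type.
BOREHOLE_TEMPERATURE_MODEL = {
    "well_id": str,
    "latitude": float,
    "longitude": float,
    "depth_m": float,
    "temperature_c": float,
}

def validate(record, model=BOREHOLE_TEMPERATURE_MODEL):
    """Return a list of problems: missing fields or wrong types."""
    problems = []
    for field, ftype in model.items():
        if field not in record:
            problems.append(f"missing: {field}")
        elif not isinstance(record[field], ftype):
            problems.append(f"wrong type: {field}")
    return problems

ok = validate({"well_id": "AZ-001", "latitude": 33.4,
               "longitude": -112.0, "depth_m": 1500.0,
               "temperature_c": 87.5})
```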
Weather forecasting with open source software
NASA Astrophysics Data System (ADS)
Rautenhaus, Marc; Dörnbrack, Andreas
2013-04-01
To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
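A small example of the CF Metadata conventions mentioned above: NetCDF forecast files store time as numeric offsets from an epoch encoded in a units string. A stdlib-only decoder for the common cases, offered as a sketch rather than a replacement for a full CF library:

```python
from datetime import datetime, timedelta

# Seconds per CF time unit; only the common calendar-free units here.
_UNITS = {"seconds": 1, "minutes": 60, "hours": 3600, "days": 86400}

def decode_cf_time(values, units):
    """Decode CF-style time values given a '<unit> since <epoch>' string."""
    unit, _, epoch = units.partition(" since ")
    base = datetime.strptime(epoch.strip(), "%Y-%m-%d %H:%M:%S")
    scale = _UNITS[unit]
    return [base + timedelta(seconds=v * scale) for v in values]

times = decode_cf_time([0, 6, 12], "hours since 2013-04-01 00:00:00")
```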
Research on Service Platform of Internet of Things for Smart City
NASA Astrophysics Data System (ADS)
Wang, W.; He, Z.; Huang, D.; Zhang, X.
2014-04-01
The application of Internet of Things in surveying and mapping industry basically is at the exploration stage, has not formed a unified standard. Chongqing Institute of Surveying and Mapping (CQISM) launched the research p roject "Research on the Technology of Internet of Things for Smart City". The project focuses on the key technologies of information transmission and exchange on the Internet of Things platform. The data standards of Internet of Things are designed. The real-time acquisition, mass storage and distributed data service of mass sensors are realized. On this basis, CQISM deploys the prototype platform of Internet of Things. The simulation application in Connected Car proves that the platform design is scientific and practical.
Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service
NASA Astrophysics Data System (ADS)
Nonogaki, S.; Nemoto, T.
2014-12-01
Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with web service standards such as the Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format that is available in many kinds of Geographical Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should contribute to better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file format.
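Cutting a map image for an intended area and zoom level starts from the WMTS tile grid. The sketch below computes the tile column/row for a WGS84 point under the standard Web Mercator ("slippy map" / GoogleMapsCompatible) tiling scheme; the assumption that this particular service uses that matrix set is illustrative.

```python
import math

def tile_index(lon, lat, zoom):
    """Tile column/row containing a WGS84 point at a given zoom level,
    using the standard Web Mercator tiling scheme (2**zoom x 2**zoom
    grid, tile 0,0 at the north-west corner).
    """
    n = 2 ** zoom
    col = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    row = int((1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n)
    return col, row

# Tile containing Tokyo (139.77 E, 35.68 N) at zoom level 8:
x, y = tile_index(139.77, 35.68, 8)
```

A cutting tool can then fetch every tile in the rectangle of indices covering the requested area and mosaic/clip them (e.g. with GDAL) into one GeoTIFF.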
Lean implementation in primary care health visiting services in National Health Service UK.
Grove, A L; Meredith, J O; Macintyre, M; Angelis, J; Neailey, K
2010-10-01
This paper presents the findings of a 13-month lean implementation in National Health Service (NHS) primary care health visiting services from May 2008 to June 2009. Lean was chosen for this study because of its reported success in other healthcare organisations. Value-stream mapping was used to map out essential tasks for the participating health visiting service. Stakeholder mapping was conducted to determine the links between all relevant stakeholders. Waste processes were then identified through discussions with these stakeholders, and a redesigned future-state process map was produced. Quantitative data were provided through a 10-day time-and-motion study of selected staff within the service. This was analysed to provide an indication of the waste activity that could be removed from the system following planned improvements. The value-stream map demonstrated that there were 67 processes in the original health visiting service studied. Analysis revealed that 65% of these processes were waste and could be removed in the redesigned process map. The baseline time-and-motion data show that, on average, waste activities accounted for 15% of clinical staff activity and 46% of administrative support staff activity. Opportunities for significant waste reduction were identified during the study using the lean tools of value-stream mapping and a time-and-motion study. These opportunities include simplification of standard tasks, reduction in paperwork, and standardisation of processes. Successful implementation of these improvements will free up resources within the organisation which can be redirected towards providing better direct care to patients.
NASA Astrophysics Data System (ADS)
Yang, Z.; Han, W.; di, L.
2010-12-01
The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production and transportation planning, environmental health research, and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery, by online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users), or in a printed paper map format. There is no online geospatial information access and dissemination, crop visualization and browsing, geospatial query capability, or online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open-standard geospatial information technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain technology (developed at George Mason University). The system provides capabilities for online geospatial crop information access, query, and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing, and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications.
This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization; facilitates crop geospatial information usage; and enables online exploration of U.S. cropland without any client-side software installation. It also greatly reduces the need for printed maps, printed analysis reports, and physical media, thus supporting low-carbon agro-geoinformation dissemination for decision support.
Web Map Services (WMS) Global Mosaic
NASA Technical Reports Server (NTRS)
Percivall, George; Plesea, Lucian
2003-01-01
The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate at 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
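A WMS layer such as the global mosaic is retrieved through a GetMap request. The sketch below builds such a request URL using WMS 1.1.1 conventions; the endpoint and layer name are illustrative placeholders, not the actual JPL service.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, width, height,
               srs="EPSG:4326", fmt="image/jpeg"):
    """Build an OGC WMS 1.1.1 GetMap URL.
    bbox is (min_lon, min_lat, max_lon, max_lat) for EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only
url = getmap_url("https://example.org/wms", "global_mosaic",
                 (-120.0, 30.0, -100.0, 45.0), 800, 600)
print(url)
```

Note that WMS 1.3.0 replaces SRS with CRS and swaps the EPSG:4326 axis order to latitude/longitude, so a production client must branch on the negotiated version.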
Villamagna, Amy M.; Mogollón, Beatriz; Angermeier, Paul
2014-01-01
Despite recent interest, ecosystem services are not yet fully incorporated into private and public decisions about natural resource management. Cultural ecosystem services (CES) are among the most challenging services to include because they comprise complex ecological and social properties and processes that make them difficult to measure, map, or monetize. Like others, CES are vulnerable to landscape changes and unsustainable use. To date, the sustainability of services has not been adequately addressed, and few studies have considered measures of service capacity and demand simultaneously. To facilitate sustainability assessments and management of CES, our study objectives were to (1) develop a spatially explicit framework for mapping the capacity of ecosystems to provide freshwater recreational fishing, an important cultural service, (2) map societal demand for freshwater recreational fishing based on license data and identify areas of potential overuse, and (3) demonstrate how maps of relative capacity and relative demand could be interfaced to estimate sustainability of a CES. We mapped freshwater recreational fishing capacity at the 12-digit hydrologic-unit scale in North Carolina and Virginia using a multi-indicator service framework incorporating biophysical and social landscape metrics, and mapped demand based on fishing license data. Mapping revealed that capacity decreases gradually eastward from the mountains to the coastal plain and that fishing demand is greatest in urban areas. When comparing standardized relative measures of capacity and demand for freshwater recreational fishing, we found that ranks of capacity exceeded ranks of demand in most hydrologic units, except in 17% of North Carolina and 5% of Virginia.
Our GIS-based approach to view freshwater recreational fishing through an ecosystem service lens will enable scientists and managers to examine (1) biophysical and social factors that foster or diminish cultural ecosystem services delivery, (2) demand for cultural ecosystem services relative to their capacity, and (3) ecological pressures like potential overuse that affect service sustainability. Ultimately, we expect such analyses to inform decision-making for freshwater recreational fisheries and other cultural ecosystem services.
ScotlandsPlaces XML: Bespoke XML or XML Mapping?
ERIC Educational Resources Information Center
Beamer, Ashley; Gillick, Mark
2010-01-01
Purpose: The purpose of this paper is to investigate web services (in the form of parameterised URLs), specifically in the context of the ScotlandsPlaces project. This involves cross-domain querying, data retrieval and display via the development of a bespoke XML standard rather than existing XML formats and mapping between them.…
OneGeology-Europe: architecture, portal and web services to provide a European geological map
NASA Astrophysics Data System (ADS)
Tellez-Arenas, Agnès.; Serrano, Jean-Jacques; Tertre, François; Laxton, John
2010-05-01
OneGeology-Europe is a large, ambitious project to make geological spatial data better known and more accessible. The project develops an integrated system of data to create, and make accessible through the internet for the first time, the geological map of the whole of Europe. The architecture implemented by the project is web-service oriented and based on OGC standards: the geological map is not a centralized database but is composed of several web services, each hosted by a European country involved in the project. Because geological data are compiled differently from country to country, they are difficult to share. OneGeology-Europe, while providing more detailed and complete information, will foster easier exchange of data within Europe and globally, even beyond the geological community. This implies substantial work on harmonization of both the data model and the content. OneGeology-Europe is characterised by the high technological capacity of the EU Member States, and its final goal is to achieve the harmonisation of European geological survey data according to common standards. As a direct consequence, Europe will take a further step in innovation and information dissemination, continuing to play a world-leading role in the development of geosciences information. The scope of the common harmonized data model was defined primarily by the requirements of the geological map of Europe, but users were also consulted, and the requirements of both INSPIRE and 'high-resolution' geological maps were considered. The data model is based on GeoSciML, developed since 2006 by a group of Geological Surveys. The data providers involved in the project implemented a new component that allows the web services to deliver the geological map expressed in GeoSciML.
In order to capture the information describing the geological units of the map of Europe, the scope of the data model needs to include lithology, age, genesis, and metamorphic character. For high-resolution maps, physical properties, bedding characteristics, and weathering also need to be added. Furthermore, geological data held by national geological surveys are generally described in the national language of the country. The project therefore has to deal with multilingualism, an important requirement of the INSPIRE directive. The project provides a list of harmonized vocabularies, a set of web services for working with them, and a web site to help geoscientists map the terms used in national datasets onto these vocabularies. The web services provided by each data provider, together with the component that allows them to deliver the harmonised data model and handle multilingualism, form the first part of the architecture. The project also implements a web portal that provides several functionalities. Thanks to the common data model implemented by each web service delivering a part of the geological map, and using the OGC SLD standard, the client offers the following option: a user can request a sub-selection of the map, for instance by filtering on a particular attribute such as "age is Quaternary", and display only the parts of the map matching the filter. Using the web services on the common vocabularies, the displayed data are translated. The project started in September 2008 for two years, with 29 partners from 20 countries (20 partners are Geological Surveys). The budget is 3.25 M€, with a European Commission contribution of 2.6 M€. The paper describes the technical solutions used to implement the OneGeology-Europe components: the profile of the common data model for exchanging geological data, the web services to view and access geological data, and a geoportal providing a user-friendly way to discover, view, and access geological data.
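The attribute filtering described above ("age is Quaternary") can be expressed as an SLD document carrying an OGC filter, sent alongside the GetMap request. A minimal sketch follows; the layer and attribute names are hypothetical, not the actual OneGeology-Europe schema.

```python
def age_filter_sld(layer, attribute, value):
    """Build a minimal SLD 1.0 document whose ogc:Filter restricts a WMS
    layer to features where `attribute` equals `value` (e.g. an
    age-equals-Quaternary selection). All names are illustrative."""
    return f"""<StyledLayerDescriptor version="1.0.0"
  xmlns="http://www.opengis.net/sld"
  xmlns:ogc="http://www.opengis.net/ogc">
  <NamedLayer>
    <Name>{layer}</Name>
    <UserStyle>
      <FeatureTypeStyle>
        <Rule>
          <ogc:Filter>
            <ogc:PropertyIsEqualTo>
              <ogc:PropertyName>{attribute}</ogc:PropertyName>
              <ogc:Literal>{value}</ogc:Literal>
            </ogc:PropertyIsEqualTo>
          </ogc:Filter>
          <PolygonSymbolizer/>
        </Rule>
      </FeatureTypeStyle>
    </UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>"""

print(age_filter_sld("GeologicUnit", "representativeAge", "Quaternary"))
```

Such a document is typically passed to the server inline via the SLD_BODY parameter of a GetMap request, or published at a URL and referenced with the SLD parameter.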
Carswell, William J.
2011-01-01
increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and public availability of the data acquired. The Emergency Operations Office provides requirements to The National Map and, during emergencies and natural disasters, provides rapid dissemination of information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.
77 FR 7489 - Small Business Size Standards: Professional, Technical, and Scientific Services
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... standards for the remaining industries in NAICS Sector 54. This rule also removes ``Map Drafting'' as the... for public comment in the Federal Register on March 16, 2011 (76 FR 14323), which proposed to increase the size standards for 35 industries and one sub-industry in NAICS Sector 54 and one industry in NAICS...
NASA Astrophysics Data System (ADS)
Ebner, M.; Schiegl, M.; Stöckl, W.; Heger, H.
2012-04-01
The Geological Survey of Austria is legally obligated by the INSPIRE directive to provide the data that fall under the directive (geology, mineral resources, and natural risk zones) to the European Commission in a semantically harmonized and technically interoperable way. Until recently the focus was entirely on the publication of high-quality printed cartographic products. These have a complex (carto-)graphic data model that visualizes several thematic aspects, such as lithology, stratigraphy, tectonics, geologic age, mineral resources, mass movements, and geomorphology, in a single planar map/product. Nonetheless, these graphic data models do not allow individual thematic aspects to be retrieved, since they were coded in a complex portrayal scheme. Automatic information retrieval is thus impossible, and domain knowledge is necessary to interpret these "encrypted" datasets. With INSPIRE becoming effective, and with a variety of conceptual models (e.g., GeoSciML) built around a semantic framework (i.e., controlled vocabularies) now available, it is necessary to develop a strategy and workflow for the semantic harmonization of such datasets. In this contribution we demonstrate the development of a multistage workflow that will allow us to transform our printed maps into semantically enabled datasets and services, and we discuss some prerequisites, foundations, and problems. In a first step we analyzed our maps and developed controlled vocabularies that describe the thematic content of our data. We then developed a physical data model which we use to attribute our spatial data with thematic information from the controlled vocabularies, forming core thematic datasets. This physical data model is geared towards use at an organizational level but builds upon existing standards (INSPIRE, GeoSciML) to allow transformation to international standards.
In a final step we will develop a standardized mapping scheme to publish INSPIRE-conformant services from our core datasets. This two-step transformation is necessary because a direct mapping to international standards is not possible for traditional map-based data. Controlled vocabularies provide the foundation of semantic harmonization. For encoding the vocabularies we build upon the W3C standard SKOS (Simple Knowledge Organization System), a thesaurus specification for the semantic web based on the Resource Description Framework (RDF) and RDF Schema, and add some Dublin Core and VoID terms for the metadata of our vocabularies and resources. For the development of these thesauri we use the commercial software PoolParty, a tool built specifically to develop, manage, and publish multilingual thesauri. The corporate thesauri of the Geological Survey of Austria are exposed via a web service conformant with the linked data principles. This web service gives access to (1) an RDF/HTML representation of the resources via simple, robust, and thus persistent HTTP URIs, (2) a download of the complete vocabularies in RDF format, and (3) a full-fledged SPARQL endpoint to query the thesaurus. When developing physical data models (based on pre-existing conceptual models), one must move beyond the classical schemes of map-based data portrayal. For example, geological units on traditional geological maps are usually given a single age range (e.g., formation age), but one might want to attribute several geologic ages (formation age, metamorphic age, cooling ages, etc.) to individual units. Such issues have to be taken into account when developing robust physical data models. Based on our experience, we are convinced that individual institutions need to develop their own controlled vocabularies and individual data models that fit their specific needs at an organizational level.
If externally developed vocabularies and data models are introduced into established workflows, newly generated and existing data may diverge, and it will be hard to achieve or maintain a common standard. We therefore suggest that institutions keep (or develop) their own organizational standards and vocabularies and map them to generally agreed international standards such as INSPIRE or GeoSciML, in the fashion suggested by the linked data principles.
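The label resolution and broader-concept traversal that such a SKOS thesaurus service exposes can be sketched without the RDF machinery as a small in-memory concept store. All URIs and labels below are invented for illustration; a real service would answer the same questions over SPARQL.

```python
# Minimal in-memory stand-in for a SKOS thesaurus: each concept has a
# URI, preferred labels per language, and optional broader concepts.
# All URIs and labels are invented placeholders.
CONCEPTS = {
    "ex:Limestone": {"prefLabel": {"en": "limestone", "de": "Kalkstein"},
                     "broader": ["ex:SedimentaryRock"]},
    "ex:SedimentaryRock": {"prefLabel": {"en": "sedimentary rock",
                                         "de": "Sedimentgestein"},
                           "broader": ["ex:Rock"]},
    "ex:Rock": {"prefLabel": {"en": "rock", "de": "Gestein"},
                "broader": []},
}

def label(uri, lang="en"):
    """skos:prefLabel lookup in the requested language."""
    return CONCEPTS[uri]["prefLabel"][lang]

def ancestors(uri):
    """Transitive closure of skos:broader, nearest concepts first."""
    out = []
    queue = list(CONCEPTS[uri]["broader"])
    while queue:
        b = queue.pop(0)
        if b not in out:
            out.append(b)
            queue.extend(CONCEPTS[b]["broader"])
    return out

print(label("ex:Limestone", "de"))   # → Kalkstein
print(ancestors("ex:Limestone"))     # → ['ex:SedimentaryRock', 'ex:Rock']
```

Multilingual prefLabels are what let a portal display a harmonized map legend in the user's language while the underlying data keep their national-language terms.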
Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A
2011-11-29
Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making in public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. The data used in this study are based on tuberculosis (TB) cases registered in Barcelona during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend. The web-based application architecture and geoprocessing web services are designed according to Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results focus on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process for health professionals. We also discuss how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities for handling health data spatially. In particular, spatial density maps have been effective in identifying the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
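At its core, a density-map service bins geocoded case points into a grid before rendering. A minimal stdlib sketch of that binning step follows; the coordinates are invented, and the actual services presumably compute smoother kernel density surfaces against the spatial database.

```python
def density_grid(points, bbox, nx, ny):
    """Bin (x, y) point coordinates into an nx-by-ny count grid over
    bbox = (xmin, ymin, xmax, ymax). A minimal stand-in for the kind of
    density surface a geoprocessing service would render as a map."""
    xmin, ymin, xmax, ymax = bbox
    grid = [[0] * nx for _ in range(ny)]
    for x, y in points:
        if not (xmin <= x < xmax and ymin <= y < ymax):
            continue  # ignore points outside the area of interest
        i = int((x - xmin) / (xmax - xmin) * nx)
        j = int((y - ymin) / (ymax - ymin) * ny)
        grid[j][i] += 1
    return grid

# Three illustrative geocoded cases, two falling in the same cell
cases = [(2.15, 41.40), (2.16, 41.41), (2.05, 41.35)]
grid = density_grid(cases, bbox=(2.0, 41.3, 2.3, 41.5), nx=3, ny=2)
print(grid)  # → [[1, 0, 0], [0, 2, 0]]
```

A REST endpoint would wrap this computation, taking the bounding box and resolution as query parameters and returning the grid as an image or JSON.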
The interoperability skill of the Geographic Portal of the ISPRA - Geological Survey of Italy
NASA Astrophysics Data System (ADS)
Pia Congi, Maria; Campo, Valentina; Cipolloni, Carlo; Delogu, Daniela; Ventura, Renato; Battaglini, Loredana
2010-05-01
The Geographic Portal of the Geological Survey of Italy (ISPRA), available at http://serviziogeologico.apat.it/Portal, was planned according to the standard criteria of the INSPIRE directive. ArcIMS services and, at the same time, WMS and WFS services were implemented to satisfy different clients. For each database and web service, metadata were written in agreement with ISO 19115. The management architecture of the portal allows it to encode client input and output requests both in ArcXML and in GML. Web applications and web services were implemented for each database owned by the Land Protection and Georesources Department, covering the geological maps at 1:50,000 scale (CARG Project) and 1:100,000 scale, the IFFI landslide inventory, the boreholes registered under Law 464/84, the large-scale geological map, and all the raster-format maps. The portal published thus far is at an experimental stage; through the development of a new graphical interface it will reach its final version. The WMS and WFS services, including metadata, will be re-designed. The validity of the methodology and of the applied standards makes further developments possible. In addition, the new geological standard language (GeoSciML), which is already incorporated in the deployed web services, will allow better display and querying of geological data in an interoperable way. The characteristics of geological data demand specific symbol libraries for cartographic mapping that are not yet available in a WMS service. This is another aspect concerning standards for geological information.
Therefore, the following have been carried out so far: a library of geological symbols for printing, with a sketch of system colors, and a library for displaying data on screen, which almost completely solves the problems of point and area coverage data (including oriented data) but still presents problems for linear data (possible solutions: ArcIMS services from ArcMap projects, or a specific SLD implementation for WMS services); an update of the "Guidelines for the supply of geological data", to be published shortly; and the official involvement of the Geological Survey of Italy in the IUGS-CGI working group for the development and experimentation of the new GeoSciML language with WMS/WFS services. The availability of geographic information is supported through metadata that can be distributed online so that search engines can find the data through specialized searches. The metadata collected in catalogs are structured according to a standard (ISO 19135). The catalogs are a common interface to locate, view, and query data, metadata services, web services, and other resources. While working in a growing sector of environmental knowledge, the focus is on gaining the participation of other subjects that can enrich the available informative content, so as to arrive at a true portal of national interest, especially for disaster management.
Expanding Access and Usage of NASA Near Real-Time Imagery and Data
NASA Astrophysics Data System (ADS)
Cechini, M.; Murphy, K. J.; Boller, R. A.; Schmaltz, J. E.; Thompson, C. K.; Huang, T.; McGann, J. M.; Ilavajhala, S.; Alarcon, C.; Roberts, J. T.
2013-12-01
In late 2009, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near real-time data products from a variety of Earth Observing System (EOS) instruments. Since that time, NASA's Earth Observing System Data and Information System (EOSDIS) developed the Global Imagery Browse Services (GIBS) to provide highly responsive, scalable, and expandable imagery services that distribute near real-time imagery in an intuitive and geo-referenced format. The GIBS imagery services provide access through standards-based protocols such as the Open Geospatial Consortium (OGC) Web Map Tile Service (WMTS) and standard mapping file formats such as the Keyhole Markup Language (KML). Leveraging these standard mechanisms opens NASA near real-time imagery to a broad landscape of mapping libraries supporting mobile applications. By integrating easily with mobile application development libraries, GIBS makes it possible for NASA imagery to become a reliable and valuable source for end-user applications. Recently, EOSDIS has taken steps to integrate near real-time metadata products into the EOS ClearingHOuse (ECHO) metadata repository. Registration of near real-time metadata allows for near real-time data discovery through ECHO clients. In keeping with near real-time data processing requirements, the ECHO ingest model allows for low-latency metadata insertion and updates. Combined with the ECHO repository, the fast visual access of GIBS imagery can now be linked directly back to the source data file(s). Through the use of discovery standards such as OpenSearch, desktop and mobile applications can connect users to more than just an image. As data services, such as the OGC Web Coverage Service, become more prevalent within the EOSDIS system, applications may even be able to connect users from imagery to data values. In addition, the full-resolution GIBS imagery provides visual context for other GIS data and tools.
The NASA near real-time imagery covers a broad set of Earth science disciplines. By leveraging the ECHO and GIBS services, these data can become a visual context within which other GIS activities are performed. The focus of this presentation is to discuss the GIBS imagery and ECHO metadata services facilitating near real-time discovery and usage. Existing synergies and future possibilities will also be discussed. The NASA Worldview demonstration client will be used to show an existing application combining the ECHO and GIBS services.
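Mobile clients typically address tiled imagery like this through the RESTful WMTS GetTile pattern, composing a URL from layer, time, tile matrix set, and tile indices. The sketch below follows that common pattern; the endpoint, layer name, and matrix-set identifier are assumptions for illustration, not the actual GIBS configuration.

```python
def wmts_tile_url(endpoint, layer, time, tms, zoom, row, col, ext="jpg"):
    """Compose a RESTful WMTS GetTile URL following the common
    {layer}/{style}/{time}/{TileMatrixSet}/{z}/{row}/{col} pattern.
    The endpoint and layer used below are illustrative placeholders."""
    return (f"{endpoint}/{layer}/default/{time}/{tms}/"
            f"{zoom}/{row}/{col}.{ext}")

url = wmts_tile_url("https://example.gov/wmts/epsg4326/best",
                    "TrueColorReflectance", "2013-06-01",
                    "250m", 2, 1, 3)
print(url)
```

Because the time dimension is part of the URL, a client can step through near real-time acquisitions simply by changing the date segment, which is what makes the scheme cache-friendly.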
Arctic Research Mapping Application (ARMAP): 2D Maps and 3D Globes Support Arctic Science
NASA Astrophysics Data System (ADS)
Johnson, G.; Gaylord, A. G.; Brady, J. J.; Cody, R. P.; Aguilar, J. A.; Dover, M.; Garcia-Lavigne, D.; Manley, W.; Score, R.; Tweedie, C. E.
2007-12-01
The Arctic Research Mapping Application (ARMAP) is a suite of online services supporting Arctic science. These services include a text-based online search utility, a 2D Internet Map Server (IMS), 3D globes, and Open Geospatial Consortium (OGC) Web Map Services (WMS). With ARMAP's 2D maps and 3D globes, users can navigate to areas of interest, view a variety of map layers, and explore U.S. Federally funded research projects. Projects can be queried by location, year, funding program, discipline, and keyword. Links lead to specific information and other web sites associated with a particular research project. The Arctic Research Logistics Support Service (ARLSS) database is the foundation of ARMAP, covering U.S. research funded by the National Science Foundation, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and the United States Geological Survey. Avoiding duplication of effort has been a primary objective of the ARMAP project, which incorporates best practices (e.g., Spatial Data Infrastructure, OGC standard web services, and metadata) and off-the-shelf technologies where appropriate. The ARMAP suite provides tools for users of various levels of technical ability to interact with the data: importing the web services directly into their own GIS applications and virtual globes; performing advanced GIS queries; simply printing maps from a set of predefined images in the map gallery; browsing the layers in an IMS; or choosing to "fly to" sites using a 3D globe. With special emphasis on the International Polar Year (IPY), ARMAP has targeted science planners, scientists, educators, and the general public. In sum, ARMAP goes beyond a simple map display to enable analysis, synthesis, and coordination of Arctic research. ARMAP may be accessed via the gateway web site at http://www.armap.org.
Coherent visualization of spatial data adapted to roles, tasks, and hardware
NASA Astrophysics Data System (ADS)
Wagner, Boris; Peinsipp-Byma, Elisabeth
2012-06-01
Modern crisis management requires that users with different roles and computer environments deal with a high volume of varied data from different sources. For this purpose, Fraunhofer IOSB has developed a geographic information system (GIS) which supports the user depending on the available data and the task to be solved. The system provides merging and visualization of spatial data from various civilian and military sources. It supports the most common spatial data standards (OGC, STANAG) as well as some proprietary interfaces, regardless of whether these are file-based or database-based. To set the visualization rules, generic Styled Layer Descriptors (SLDs), an Open Geospatial Consortium (OGC) standard, are used. SLDs specify which data are shown, when, and how. The defined SLDs consider the users' roles and task requirements. In addition, it is possible to use different displays, and the visualization adapts to the individual resolution of the display; excessively high or low information density is avoided. The system also enables users with different roles to work together simultaneously using the same database. Every user is provided with appropriate and coherent spatial data depending on his current task. These refined spatial data are served via the OGC services Web Map Service (WMS; server-side rendered raster maps) or Web Map Tile Service (WMTS; pre-rendered and cached raster maps).
Web servicing the biological office.
Szugat, Martin; Güttler, Daniel; Fundel, Katrin; Sohler, Florian; Zimmer, Ralf
2005-09-01
Biologists routinely use Microsoft Office applications for standard analysis tasks. Despite ubiquitous internet resources, information needed for everyday work is often not directly and seamlessly available. Here we describe a very simple and easily extendable mechanism using Web Services to enrich standard MS Office applications with internet resources. We demonstrate its capabilities by providing a Web-based thesaurus for biological objects, which maps names to database identifiers and vice versa via an appropriate synonym list. The client application ProTag makes these features available in MS Office applications using Smart Tags and Add-Ins, and is available at http://services.bio.ifi.lmu.de/prothesaurus/.
ISO 9001 in a neonatal intensive care unit (NICU).
Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel
2011-01-01
The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve services quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit to the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.
NASA Technical Reports Server (NTRS)
Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.
2014-01-01
During the last year, several significant disasters occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, among others. In support of response to these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data in open Geographic Information System (GIS) formats, which allowed end users to easily integrate the data into existing Decision Support Systems (DSS). Both the Tile Mapping Service (TMS) and the Web Map Service (WMS) were leveraged to quickly provide the data to the end user. Development of this delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
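TMS and the widely used XYZ tiling scheme address tiles on the same power-of-two grid, differing only in the direction of the row axis. The standard conversion from a WGS84 coordinate to tile indices can be sketched as follows (a generic illustration of the tiling math, not SPoRT's actual code):

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert a WGS84 coordinate to tile indices at a given zoom level
    on the standard Web-Mercator tile grid. XYZ services count rows from
    the top (north); TMS uses the same grid with the y axis flipped."""
    n = 2 ** zoom  # number of tiles along each axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    tms_y = n - 1 - y  # TMS counts rows from the bottom (south)
    return x, y, tms_y
```

Serving the same cached tiles under both row conventions is one reason both TMS and WMS/WMTS endpoints can be offered cheaply from one tile pyramid.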
Publishing Platform for Aerial Orthophoto Maps, the Complete Stack
NASA Astrophysics Data System (ADS)
Čepický, J.; Čapek, L.
2016-06-01
When creating sets of orthophoto maps from mosaic compositions acquired with airborne systems, such as popular drones, the results of the work must be published to users. Several steps need to be performed to get large-scale raster data published. First, the data have to be shared as services (OGC WMS as a view service, OGC WCS as a download service); for some applications, OGC WMTS is handy as well, for faster viewing of the data. Finally, the data have to become part of a web mapping application so that they can be used and evaluated by non-technical users. In this talk, we present an automated pipeline of these steps, in which the user supplies an orthophoto image and, as a result, OGC Open Web Services are published together with a web mapping application serving the data. The web mapping application can be used as a standard presentation platform for this type of big raster data for the general user. The publishing platform, the Geosense online map information system, can also be used to combine data from various resources, to create unique map compositions, and as input for better interpretation of the photographed phenomena. The whole process has been successfully tested with an eBee drone at raster data resolutions of 1.5-4 cm/px over many areas, and the results are also used to create derived datasets, usually suited for property management: records of roads, pavements, traffic signs, public lighting, sewage systems, grave locations, and others.
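One practical question in such a pipeline is how deep the cached tile pyramid must go to preserve the source resolution. A rough back-of-the-envelope calculation (my own illustration, not from the talk) uses the standard Web-Mercator zoom-0 resolution for 256-pixel tiles:

```python
import math

# Equatorial ground resolution of zoom level 0 on the standard
# Web-Mercator grid with 256-pixel tiles, in metres per pixel.
ZOOM0_M_PER_PX = 156543.03392804097

def min_zoom_for_resolution(metres_per_pixel):
    """Smallest tile-pyramid zoom level whose nominal resolution is at
    least as fine as the source imagery (equator-worst-case estimate)."""
    return math.ceil(math.log2(ZOOM0_M_PER_PX / metres_per_pixel))
```

For the 1.5-4 cm/px drone imagery mentioned above this suggests pyramids reaching roughly zoom 22-24, which is why pre-rendered WMTS tiles matter for responsive viewing of such data.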
WMS and WFS Standards Implementation of Weather Data
NASA Astrophysics Data System (ADS)
Armstrong, M.
2005-12-01
CustomWeather is a private weather company that delivers global weather data products. CustomWeather has built a mapping platform according to OGC standards; currently, both a Web Map Service (WMS) and a Web Feature Service (WFS) are supported. Supporting open geospatial standards has led to a number of positive changes internally to the processes of CustomWeather, along with those of the clients accessing the data. Quite a number of challenges surfaced during this process, particularly with respect to combining a wide variety of raw model and sensor data into a single delivery platform. Open standards have, however, made the delivery of very different data products rather seamless. The discussion will address the issues faced in building an OGC-based mapping platform along with the limitations encountered. While the availability of these data products through open standards is still very young, there have already been many adopters in the utility and navigation industries. The discussion will take a closer look at the different approaches taken by these two industries as they combine interoperability standards with existing data. Insight will be given regarding applications already taking advantage of this new technology and how it is affecting decision-making processes. CustomWeather has observed considerable interest in, and potential benefit from, this technology in developing countries. Weather data is a key element in disaster management. Interoperability is literally opening up a world of data and has the potential to quickly enable functionality that would otherwise take considerable time to implement. The discussion will briefly touch on our experience.
KML Super Overlay to WMS Translator
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This translator is a server-based application that automatically generates the KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it can also generate a super overlay subdivision of any given area whose tiles are loaded only when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
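The load-only-when-needed behaviour comes from recursively splitting the covered area into quadrants, each wrapped in a KML Region/NetworkLink pair. The core subdivision step can be sketched like this (an illustration of the idea, not the translator's actual code):

```python
def subdivide(bbox):
    """Split a (west, south, east, north) box into four child quadrants.

    This mirrors the regional subdivision behind a KML super overlay:
    each child would get its own NetworkLink with a Region, so Google
    Earth fetches a quadrant's WMS image only when the view requires it.
    """
    w, s, e, n = bbox
    cx, cy = (w + e) / 2.0, (s + n) / 2.0
    return [(w, s, cx, cy),   # south-west child
            (cx, s, e, cy),   # south-east child
            (w, cy, cx, n),   # north-west child
            (cx, cy, e, n)]   # north-east child
```

Applied recursively, each level of the quadtree maps naturally onto a WMS GetMap request for that child's BBOX at a fixed pixel size, doubling the effective resolution per level.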
Spatial Assessment of Forest Ecosystem Functions and Services using Human Relating Factors for SDG
NASA Astrophysics Data System (ADS)
Song, C.; Lee, W. K.; Jeon, S. W.; Kim, T.; Lim, C. H.
2015-12-01
Applying the ecosystem-service concept in environment-related decision making could provide a numerical, objective standard for policy makers weighing preservation against development. However, pursuing maximum benefit from natural capital through ecosystem services has led to failures in which ecosystem functions are lost through trade-offs. We therefore distinguish ecosystem functions from ecosystem services and incorporate human-related perspectives. The assessment comprises three parts. Annual tree growth is taken as the ecosystem-function factor and expressed in a pure function (PF) map; from it, related functions such as water conservation, air-pollutant purification, climate-change regulation, and timber production are derived, with the overall process and amounts numerically quantified. These functional results are converted to ecosystem services by multiplying by economic unit values, yielding function-reflecting service (FRS) maps. In addition, to better represent human demand, human-reflecting service (HRS) maps are developed. For validation, the quantified ecosystem functions are compared with earlier results through pixel-based analysis; the three maps are compared, and the differences between ecosystem functions and services, as well as inverse trends between function-based and human-based services, are analysed. In this study, we found differences among PF, FRS, and HRS in relation to underlying ecosystem conditions. This study suggests that the differences among PF, FRS, and HRS should be understood in the decision-making process for the sustainable management of ecosystem services. Although the analysis rests on a somewhat coarse separation of processes, it is important to consider the distinct uses of ecosystem-function and ecosystem-service assessment results in SDG policy making. Furthermore, a process-based functional approach can supply environmental information reflecting other kinds of perspectives.
Constellation labeling optimization for bit-interleaved coded APSK
NASA Astrophysics Data System (ADS)
Xiang, Xingyu; Mo, Zijian; Wang, Zhonghai; Pham, Khanh; Blasch, Erik; Chen, Genshe
2016-05-01
This paper investigates constellation and mapping optimization for amplitude phase shift keying (APSK) modulation, which is deployed in the Digital Video Broadcasting Satellite - Second Generation (DVB-S2) and Digital Video Broadcasting - Satellite services to Handhelds (DVB-SH) broadcasting standards due to its merits of power and spectral efficiency together with robustness against nonlinear distortion. The mapping optimization is performed for 32-APSK according to combined cost functions related to Euclidean distance and mutual information. A binary switching algorithm and a modified version of it are used to minimize the cost function and the estimated error between the original and received data. The optimized constellation mapping is tested by combining DVB-S2 standard Low-Density Parity-Check (LDPC) codes in both Bit-Interleaved Coded Modulation (BICM) and BICM with iterative decoding (BICM-ID) systems. The simulation results validate the proposed constellation labeling optimization scheme, which yields better performance than the conventional 32-APSK constellation defined in the DVB-S2 standard.
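The flavour of the cost a binary switching algorithm minimizes can be shown on a toy example. The sketch below uses a pure Euclidean-distance cost on QPSK rather than the paper's combined distance/mutual-information cost on 32-APSK, purely for illustration:

```python
import itertools
import math

def labeling_cost(mapping):
    """Toy labeling cost: total Euclidean distance between constellation
    points whose binary labels differ in exactly one bit. A binary
    switching algorithm would swap labels between points to reduce a
    cost of this general shape."""
    cost = 0.0
    for a, b in itertools.combinations(mapping, 2):
        if bin(a ^ b).count("1") == 1:  # labels at Hamming distance 1
            cost += math.dist(mapping[a], mapping[b])
    return cost

# Gray labeling keeps bit-neighbours geometrically adjacent...
gray = {0b00: (1, 1), 0b01: (-1, 1), 0b11: (-1, -1), 0b10: (1, -1)}
# ...while swapping two labels places bit-neighbours diagonally.
swapped = {0b00: (1, 1), 0b01: (-1, -1), 0b11: (-1, 1), 0b10: (1, -1)}
```

The Gray labeling scores lower because single-bit errors correspond to the smallest geometric steps, which is exactly the property label-switching searches try to approach for larger constellations such as 32-APSK.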
GIS Technologies For The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.
2015-12-01
Geographic information systems (GIS) are increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express and Venus Express data, have spatial metadata associated with them. To help users handle and visualise spatial data in GIS applications, the new PSA should support interoperability through interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards define open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open-source tools such as OpenLayers. The technology to store searchable geometrical data already exists within PostgreSQL databases in the form of the PostGIS extension. GeoServer is an existing open-source map server; an instance deployed for the new PSA uses the OGC standards to allow, among other things, the sharing, processing, and editing of spatial data through the Web Feature Service (WFS) standard, as well as serving georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of the datasets of ESA's planetary missions. This can be facilitated through the GIS framework, offering interfaces (both a web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
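A scriptable API of the kind described usually reduces to issuing OGC requests against the map server. A WFS 2.0 GetFeature request with a bounding-box filter might be built like this; the endpoint and feature type name below are invented for illustration, not the PSA's actual ones:

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base, typename, bbox):
    """Sketch of the WFS 2.0 GetFeature request a GIS client or script
    would send to a GeoServer instance backed by PostGIS. All concrete
    values used below are hypothetical."""
    params = {
        "SERVICE": "WFS",
        "VERSION": "2.0.0",
        "REQUEST": "GetFeature",
        "TYPENAMES": typename,
        # WFS bounding boxes are comma-separated min/max coordinates
        "BBOX": ",".join(str(v) for v in bbox),
    }
    return base + "?" + urlencode(params)

url = wfs_getfeature_url("https://example.org/geoserver/wfs",
                         "psa:footprints", (-30, 10, -20, 20))
```

Because the request is a plain URL, the same call works from a browser-based GUI, a Python script, or a desktop GIS, which is the interoperability the OGC standards buy.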
Land Use and Land Cover Maps of Europe: a WebGIS Platform
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Fahl, F. C.; Minghini, M.; Molinari, M. E.
2016-06-01
This paper presents the methods and implementation processes of a WebGIS platform designed to publish the available land use and land cover maps of Europe at continental scale. The system is built completely on open source infrastructure and open standards. The proposed architecture is based on a server-client model having GeoServer as the map server, Leaflet as the client-side mapping library and the Bootstrap framework at the core of the front-end user interface. The web user interface is designed to have typical features of a desktop GIS (e.g. activate/deactivate layers and order layers by drag and drop actions) and to show specific information on the activated layers (e.g. legend and simplified metadata). Users have the possibility to change the base map from a given list of map providers (e.g. OpenStreetMap and Microsoft Bing) and to control the opacity of each layer to facilitate the comparison with both other land cover layers and the underlying base map. In addition, users can add to the platform any custom layer available through a Web Map Service (WMS) and activate the visualization of photos from popular photo sharing services. This last functionality is provided in order to have a visual assessment of the available land coverages based on other user-generated contents available on the Internet. It is supposed to be a first step towards a calibration/validation service that will be made available in the future.
Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
2013-01-01
Background The harmonization of European health systems brings with it a need for tools to allow the standardized collection of information about medical care. A common coding system and standards for the description of services are needed to allow local data to be incorporated into evidence-informed policy, and to permit equity and mobility to be assessed. The aim of this project has been to design such a classification and a related tool for the coding of services for Long Term Care (DESDE-LTC), based on the European Service Mapping Schedule (ESMS). Methods The development of DESDE-LTC followed an iterative process using nominal groups in 6 European countries. 54 researchers and stakeholders in health and social services contributed to this process. In order to classify services, we use the minimal organization unit or “Basic Stable Input of Care” (BSIC), coded by its principal function or “Main Type of Care” (MTC). The evaluation of the tool included an analysis of feasibility, consistency, ontology, inter-rater reliability, Boolean Factor Analysis, and a preliminary impact analysis (screening, scoping and appraisal). Results DESDE-LTC includes an alpha-numerical coding system, a glossary and an assessment instrument for mapping and counting LTC. It shows high feasibility, consistency, inter-rater reliability and face, content and construct validity. DESDE-LTC is ontologically consistent. It is regarded by experts as useful and relevant for evidence-informed decision making. Conclusion DESDE-LTC contributes to establishing a common terminology, taxonomy and coding of LTC services in a European context, and a standard procedure for data collection and international comparison. PMID:23768163
Web Services as Building Blocks for an Open Coastal Observing System
NASA Astrophysics Data System (ADS)
Breitbach, G.; Krasemann, H.
2012-04-01
A coastal observing system needs to integrate different observing methods, such as remote sensing, in-situ measurements, and models, into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic Seas). Data from satellite and radar remote sensing and measurements from buoys, stations, and FerryBoxes form the observation part of COSYNA; these data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This WFS provides not only the metadata necessary for finding data but also the URLs of web services to view and download the data. To make data from different resources comparable, a common vocabulary is needed; for COSYNA, standard names from the CF conventions are stored within the metadata whenever possible. The metadata use an INSPIRE- and ISO 19115-compatible format, and the WFS is fed from the metadata system using database views. The data themselves are stored in two different formats: NetCDF files for gridded data and an RDBMS for time-series-like data. The web service URLs are mostly standards-based, mainly on OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms; in this case, data download is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download, and the OGC CSW is used for accessing extended metadata. We present the concept of data management in COSYNA, which is independent of the particular services used in COSYNA; this parameter- and data-centric concept might be useful for other observing systems.
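The common-vocabulary step amounts to a lookup table from platform-local parameter names to CF standard names. A minimal sketch (the local names on the left are invented; the CF standard names on the right are real entries from the CF standard-name table):

```python
# Hypothetical vocabulary table mapping local sensor parameter names to
# CF-convention standard names, so that the same quantity measured by a
# buoy, a station, or a FerryBox becomes comparable across platforms.
CF_STANDARD_NAMES = {
    "watertemp": "sea_water_temperature",
    "sal": "sea_water_salinity",
    "wspd": "wind_speed",
}

def to_cf(local_name):
    """Return the CF standard name for a local parameter, or None when
    the vocabulary has no entry yet."""
    return CF_STANDARD_NAMES.get(local_name)
```

Storing the CF name in the metadata record, as COSYNA does, lets the portal group and compare series from heterogeneous sources without inspecting the underlying files.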
A Web-based Visualization System for Three Dimensional Geological Model using Open GIS
NASA Astrophysics Data System (ADS)
Nemoto, T.; Masumoto, S.; Nonogaki, S.
2017-12-01
A three-dimensional geological model provides important information in various fields such as environmental assessment, urban planning, resource development, waste management, and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open-source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer handles mapping of horizontal cross sections of the 3D geological model and a topographic map; GRASS provides the core components for management, analysis, and image processing of the geological model. Online access to GRASS functions is enabled through PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model; these images are delivered via the OGC WMS (Web Map Service) and WPS standards. Horizontal cross sections are overlaid on the topographic map, and a vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram from various angles by mouse operation. WebGL, a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software, is utilized for 3D visualization. The geological boundary surfaces can be downloaded to incorporate the geologic structure into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
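Generating a vertical cross section from two clicked points reduces to sampling the model along the line between them. A minimal sketch of that sampling step (my illustration of the idea, not the system's code):

```python
def section_points(start, end, n):
    """Return n evenly spaced (x, y) sample points from the clicked
    start point to the clicked end point. Querying the 3D geological
    model at each point yields one column of the vertical cross
    section."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * i / (n - 1), y0 + (y1 - y0) * i / (n - 1))
            for i in range(n)]
```

In a WPS setting, the two click coordinates and the sample count would travel as process inputs, and the rendered section image would come back as the process output.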
Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung
2014-08-01
Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards, and its resolution demands a mediation system that accurately interprets data in different heterogeneous formats. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards to achieve accurate and seamless information exchange and thus data interoperability. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology-matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and a Personalized Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards; the transformation achieved over 90% accuracy using a pattern-oriented approach drawing on the MBO. The proposed mediation system improves the overall communication process between HISs, providing accurate and seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
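At its simplest, bridge-ontology-driven mediation applies a field-path mapping while transforming a record from one standard's shape to another's. The sketch below is a toy in that spirit: the field names are simplified stand-ins, not actual CDA or vMR paths, and the real MBO holds far richer semantic mappings than a flat table.

```python
# Hypothetical bridge table mapping flat field paths of a CDA-like
# record onto vMR-like paths (all names invented for illustration).
CDA_TO_VMR = {
    "patient.name.given": "demographics.firstName",
    "patient.name.family": "demographics.lastName",
    "observation.code": "obs.conceptCode",
}

def transform(record, bridge):
    """Rename flat field paths according to the bridge mapping; fields
    without a mapping pass through unchanged."""
    return {bridge.get(key, key): value for key, value in record.items()}
```

Evaluating such a system, as the paper does, then means checking how often the transformed record carries each source field to its semantically correct target slot.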
The NIST radioactivity measurement assurance program for the radiopharmaceutical industry.
Cessna, Jeffrey T; Golas, Daniel B
2012-09-01
The National Institute of Standards and Technology (NIST) maintains a program for the establishment and dissemination of activity measurement standards in nuclear medicine. These standards are disseminated through Standard Reference Materials (SRMs), Calibration Services, radionuclide calibrator settings, and the NIST Radioactivity Measurement Assurance Program (NRMAP, formerly the NEI/NIST MAP). The MAP for the radiopharmaceutical industry is described here. Consolidated results show that, over more than 3600 comparisons, 96% of the participants' results differed from that of NIST by less than 10%, and 98% by less than 20%. Individual radionuclide results are presented from 214 to 439 comparisons per radionuclide for (67)Ga, (90)Y, (99m)Tc, (99)Mo, (111)In, (125)I, (131)I, and (201)Tl. The percentage of participants' results within 10% of NIST ranges from 88% to 98%. Published by Elsevier Ltd.
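The statistic behind figures such as "96% of results within 10% of NIST" is a simple relative difference, which can be stated precisely:

```python
def percent_difference(participant, nist):
    """Relative difference of a participant's activity measurement from
    the NIST value, in percent (values here are illustrative, not
    actual program data)."""
    return 100.0 * (participant - nist) / nist

def within_tolerance(participant, nist, tolerance_pct=10.0):
    """True when the participant's result lies within the stated
    percentage band around the NIST value."""
    return abs(percent_difference(participant, nist)) < tolerance_pct
```

Applied over all comparisons for a radionuclide, the fraction of True results gives the per-nuclide percentages reported above.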
Using JavaScript and the FDSN web service to create an interactive earthquake information system
NASA Astrophysics Data System (ADS)
Fischer, Kasper D.
2015-04-01
The FDSN web service provides a web interface to access earthquake metadata (e.g., event or station information) and waveform data over the internet. Requests are sent to a server as URLs, and the output is either XML or miniSEED, which makes it hard to read for humans but easy to process by software. Different data centres already support the FDSN web service, e.g., USGS, IRIS, and ORFEUS; it is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public, which has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events and further information on events and stations on a single web page, both as a table and on a map. In addition, the user can download event information, waveform data, and station data in different formats such as miniSEED, QuakeML, or FDSN StationXML. The developed code and all used libraries are open source and freely available.
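An FDSN event query is just a URL with standardized parameters, which is what makes it easy to consume from browser JavaScript. Building such a URL can be sketched as follows (shown in Python for brevity; the base URL is a placeholder, not a specific data centre):

```python
from urllib.parse import urlencode

def fdsn_event_url(base, **params):
    """Build an FDSN event web-service query URL of the kind the page's
    JavaScript requests and then renders. The service path
    /fdsnws/event/1/query is fixed by the FDSN specification; the host
    below is hypothetical."""
    return base + "/fdsnws/event/1/query?" + urlencode(sorted(params.items()))

url = fdsn_event_url("https://datacentre.example.org",
                     starttime="2014-01-01", minmagnitude=1.0,
                     format="text")
```

A browser client would issue the same request with `fetch` or `XMLHttpRequest` and turn the returned XML or text rows into the table and map markers described above.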
Improvements in the Protein Identifier Cross-Reference service.
Wein, Samuel P; Côté, Richard G; Dumousseau, Marine; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan A
2012-07-01
The Protein Identifier Cross-Reference (PICR) service is a tool that allows users to map protein identifiers, protein sequences and gene identifiers across over 100 different source databases. PICR takes input through an interactive website as well as Representational State Transfer (REST) and Simple Object Access Protocol (SOAP) services. It returns the results as HTML pages, XLS and CSV files. It has been in production since 2007 and has recently been enhanced to add new functionality and increase the number of databases it covers. Protein subsequences can be searched with the Basic Local Alignment Search Tool (BLAST) against the UniProt Knowledgebase (UniProtKB) to provide an entry point to the standard PICR mapping algorithm. In addition, gene identifiers from UniProtKB and Ensembl can now be submitted as input or mapped to as output from PICR. We have also implemented a 'best-guess' mapping algorithm for UniProt. In this article, we describe the usefulness of PICR, how these changes have been implemented, and the corresponding additions to the web services. Finally, we explain that the number of source databases covered by PICR has increased from the initial 73 to the current 102. New resources include several new species-specific Ensembl databases as well as the Ensembl Genomes ones. PICR can be accessed at http://www.ebi.ac.uk/Tools/picr/.
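The general shape of a 'best-guess' mapping is: when several target identifiers match, apply a preference rule rather than returning them all. The toy below illustrates one plausible rule, preferring a reviewed target entry; the identifiers and statuses are fabricated, and PICR's actual algorithm over UniProtKB is certainly more involved.

```python
# Fabricated cross-reference data: each source identifier maps to
# candidate target accessions with a review status.
XREFS = {
    "example-id-1": [("TREMBL001", "unreviewed"), ("SPROT001", "reviewed")],
    "example-id-2": [("TREMBL002", "unreviewed")],
}

def best_guess(source_id):
    """Return a single preferred target accession: a reviewed entry if
    one exists, else the first candidate, else None."""
    candidates = XREFS.get(source_id, [])
    reviewed = [acc for acc, status in candidates if status == "reviewed"]
    if reviewed:
        return reviewed[0]
    return candidates[0][0] if candidates else None
```

Exposing such a rule as a service option, as PICR does, lets callers trade recall (all mappings) for precision (one canonical mapping) per request.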
E-DECIDER Decision Support Gateway For Earthquake Disaster Response
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.
2013-12-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). 
The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
GeoSciML and EarthResourceML Update, 2012
NASA Astrophysics Data System (ADS)
Richard, S. M.; Commission for the Management and Application of Geoscience Information (CGI) Interoperability Working Group
2012-12-01
CGI Interoperability Working Group activities during 2012 include deployment of services using the GeoSciML-Portrayal schema, addition of new vocabularies to support properties added in version 3.0, improvements to server software for deploying services, introduction of EarthResourceML v2 for mineral resources, and collaboration with the IUSS on a markup language for soils information. GeoSciML and EarthResourceML have been used as the basis for the INSPIRE Geology and Mineral Resources specifications, respectively. GeoSciML-Portrayal is an OGC GML simple-feature application schema for the presentation of geologic map unit, contact, and shear displacement structure (fault and ductile shear zone) descriptions in web map services. Use of standard vocabularies for geologic age and lithology enables map services using shared legends to achieve visual harmonization of maps provided by different services. New vocabularies have been added to the collection of CGI vocabularies provided to support interoperable GeoSciML services, and can be accessed through http://resource.geosciml.org. Concept URIs can be dereferenced to obtain SKOS RDF or HTML representations using the SISSVoc vocabulary service. New releases of the FOSS GeoServer application greatly improve support for complex XML feature schemas like GeoSciML, and the ArcGIS for INSPIRE extension implements similar complex-feature support for ArcGIS Server. These improved server implementations greatly facilitate deploying GeoSciML services. EarthResourceML v2 adds features for information related to mining activities. SoilML provides an interchange format for soil material, soil profile, and terrain information. Work is underway to add GeoSciML to the portfolio of Open Geospatial Consortium (OGC) specifications.
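Dereferencing a concept URI with content negotiation, as described above, amounts to sending an HTTP request whose Accept header selects an RDF or HTML representation. A minimal sketch (the basalt URI is illustrative; actual SISSVoc paths may differ):

```python
from urllib.request import Request

def skos_request(concept_uri, rdf=True):
    """Build a dereferencing request for a vocabulary concept URI,
    using content negotiation to choose RDF/XML or HTML."""
    accept = "application/rdf+xml" if rdf else "text/html"
    return Request(concept_uri, headers={"Accept": accept})

# Illustrative concept URI; the vocabulary server inspects the Accept
# header and redirects to the matching SKOS representation.
req = skos_request("http://resource.geosciml.org/classifier/cgi/lithology/basalt")
```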
Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration
NASA Astrophysics Data System (ADS)
Schumacher, J. A.; Yetman, G. G.
2007-12-01
The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10-year implementation plan are an indicator of the importance of integrating socioeconomic data with Earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source geobrowsers and providing open-standards-based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools within geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk from predicted precipitation deficits will be demonstrated.
BingEO: Enable Distributed Earth Observation Data for Environmental Research
NASA Astrophysics Data System (ADS)
Wu, H.; Yang, C.; Xu, Y.
2010-12-01
Our planet faces great environmental challenges, including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean, cheap energy. To address these problems, scientists are developing models to analyze, forecast, and simulate geospatial phenomena in support of critical decision making. These models challenge not only our computing technology but also our ability to supply the enormous volumes of Earth observation data they demand. Through various policies and programs, open and free sharing of Earth observation data is advocated in Earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS). Seamless sharing of and access to these resources call for a spatial cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences, including environmental research. Based on the Microsoft Bing search engine and Bing Maps, a seamlessly integrated visual tool is under development to bridge the gap between researchers/educators and Earth observation data providers. With this tool, Earth science researchers and educators can easily and visually find the best datasets for their research and teaching. The tool includes a registry and its supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), builds on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage the available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; and 4) provide a Bing Maps-based web application to visualize the data on a high-quality, easy-to-manipulate map platform and let users select the best data layers online.
Given the volume of observation data already accumulated and still growing, BingEO will allow these resources to be used more widely, intensively, efficiently, and economically in Earth science applications.
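A registry such as BingEO's must parse each discovered service's GetCapabilities response to learn which layers it offers. A sketch of that step over a trimmed, hypothetical WMS 1.1.1 capabilities fragment (real documents carry many more elements):

```python
import xml.etree.ElementTree as ET

# A trimmed-down capabilities document standing in for a live response;
# WMS 1.1.1 capabilities are not namespaced, which keeps parsing simple.
CAPS = """<WMT_MS_Capabilities version="1.1.1">
  <Capability>
    <Layer>
      <Title>Root</Title>
      <Layer><Name>modis_ndvi</Name><Title>MODIS NDVI</Title></Layer>
      <Layer><Name>landsat_tm</Name><Title>Landsat TM mosaic</Title></Layer>
    </Layer>
  </Capability>
</WMT_MS_Capabilities>"""

def list_layers(caps_xml):
    """Extract (name, title) pairs for named layers in a WMS capabilities doc.
    Container layers without a <Name> child are grouping nodes and are skipped."""
    root = ET.fromstring(caps_xml)
    layers = []
    for layer in root.iter("Layer"):
        name = layer.find("Name")
        title = layer.find("Title")
        if name is not None:
            layers.append((name.text, title.text if title is not None else ""))
    return layers
```

The registry would run this over every endpoint Bing Search discovers, then record the layer inventory alongside periodically measured service quality.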
NASA Astrophysics Data System (ADS)
Bandibas, J. C.; Takarada, S.
2013-12-01
Timely identification of areas affected by natural disasters is very important for successful rescue and effective emergency relief efforts. This research focuses on the development of a cost-effective and efficient system for identifying areas affected by natural disasters and efficiently distributing that information. The developed system is composed of three modules: the Web Processing Service (WPS), the Web Map Service (WMS), and the user interface provided by J-iView (fig. 1). WPS is an online system that provides computation, storage, and data access services. In this study, the WPS module provides online access to the software implementing the frequency-based change detection algorithm developed for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access to the satellite images used as inputs for land cover change detection. The user interface is provided by J-iView, an online mapping system developed at the Geological Survey of Japan (GSJ). The three modules are seamlessly integrated into a single package using J-iView, which can rapidly generate a map of disaster areas that is instantly viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan, and efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.
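The frequency-based change detection algorithm itself is not detailed in the abstract; as a stand-in, the sketch below flags per-pixel brightness changes between two small co-registered images, which conveys the shape of the WPS computation without claiming to reproduce the actual method.

```python
def changed_pixels(before, after, threshold=30):
    """Flag pixels whose brightness changed by more than `threshold`.
    A simple differencing stand-in for the (unspecified) frequency-based
    change detection algorithm run by the WPS module."""
    return [[abs(a - b) > threshold for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(after, before)]

# Tiny made-up 2x2 grayscale tiles, e.g. pre- and post-event ASTER chips.
before = [[10, 200], [90, 90]]
after  = [[12, 40],  [90, 180]]
mask = changed_pixels(before, after)
```

In the deployed system the input tiles would arrive via WMS requests and the resulting mask would be rendered as a disaster-area map in J-iView.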
A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration
NASA Astrophysics Data System (ADS)
Moosdorf, N.; Richard, S. M.
2012-12-01
A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps covering the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium, and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers, and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at 1:1 million scale for all nations. Although initial efforts are commonly simple scanned-map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies, allowing use of a common map legend for better visual integration of the maps (e.g., see OneGeology-Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp).
Current categorization of regional units with a single lithology from the CGI SimpleLithology vocabulary (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) poorly captures the lithologic character of such units. A lithogenetic unit category scheme, accessible as a GeoSciML-Portrayal-based OGC Styled Layer Descriptor resource, is key to enabling OneGeology (http://oneGeology.org) geologic map services to achieve a high degree of visual harmonization.
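A shared legend of the kind described can be expressed as an SLD document that maps lithology terms to fill colors. The sketch below emits abbreviated SLD Rule elements; the terms and colors are illustrative, not the CGI vocabulary's official symbolization, and a real SLD wraps these rules in StyledLayerDescriptor/FeatureTypeStyle elements.

```python
# Illustrative term-to-color assignments (not an official CGI legend).
LITHOLOGY_COLORS = {
    "basalt": "#228B22",
    "sandstone": "#DEB887",
    "limestone": "#87CEEB",
}

def sld_rules(colors):
    """Emit one abbreviated SLD <Rule> per lithology term, so that every
    service sharing this legend renders the same unit the same way."""
    rules = []
    for term, color in sorted(colors.items()):
        rules.append(
            "<Rule><Name>{0}</Name>"
            "<PolygonSymbolizer><Fill>"
            '<CssParameter name="fill">{1}</CssParameter>'
            "</Fill></PolygonSymbolizer></Rule>".format(term, color))
    return "\n".join(rules)
```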
A Comparison of Mental Health Care Systems in Northern and Southern Europe: A Service Mapping Study.
Sadeniemi, Minna; Almeda, Nerea; Salinas-Pérez, Jose A; Gutiérrez-Colosía, Mencía R; García-Alonso, Carlos; Ala-Nikkola, Taina; Joffe, Grigori; Pirkola, Sami; Wahlbeck, Kristian; Cid, Jordi; Salvador-Carulla, Luis
2018-05-31
Mental health services (MHS) have gone through vast changes during the last decades, shifting from hospital to community-based care. Developing the optimal balance and use of resources requires standard comparisons of mental health care systems across countries. This study aimed to compare the structure, personnel resource allocation, and the productivity of the MHS in two benchmark health districts in a Nordic welfare state and a southern European, family-centered country. The study is part of the REFINEMENT (Research on Financing Systems' Effect on the Quality of Mental Health Care) project. The study areas were the Helsinki and Uusimaa region in Finland and the Girona region in Spain. The MHS were mapped by using the DESDE-LTC (Description and Evaluation of Services and Directories for Long Term Care) tool. There were 6.7 times more personnel resources in the MHS in Helsinki and Uusimaa than in Girona. The resource allocation was more residential-service-oriented in Helsinki and Uusimaa. The difference in mental health personnel resources is not explained by the respective differences in the need for MHS among the population. It is important to make a standard comparison of the MHS for supporting policymaking and to ensure equal access to care across European countries.
A European classification of services for long-term care—the EU-project eDESDE-LTC
Weber, Germain; Brehmer, Barbara; Zeilinger, Elisabeth; Salvador-Carulla, Luis
2009-01-01
Purpose and theory: The eDESDE-LTC project aims to develop an operational system for coding, mapping, and comparing services for long-term care (LTC) across the EU. The project's strategy is to improve EU listing of and access to relevant sources of healthcare information via the development of semantic interoperability in eHealth (coding and listing of services for LTC); to increase access to relevant sources of information on LTC services and improve linkages between national and regional websites; and to foster cooperation with international organizations (e.g., the OECD). Methods: The operational system will include a standard classification of the main types of care for persons with LTC needs and an instrument for mapping and standard description of services. These instruments are based on previous classification systems for mental health services (ESMS), disability services (DESDE), and ageing services (DESDAE). A Delphi panel composed of seven partners developed a DESDE-LTC beta version, which was translated into six languages. The feasibility of DESDE-LTC is being tested in six countries using national focal groups. The Delphi panel will then develop the final version, and a webpage, training materials, and a training course will be produced. Results and conclusions: The eDESDE-LTC system will be piloted in two EU countries (Spain and Bulgaria). Evaluation will focus primarily on usability and impact analysis. Discussion: The added value of this project relates to EU citizens' right to "have access to high-quality healthcare when and where it is needed." Due to semantic variability and service complexity, existing national listings of services do not provide an adequate framework for patient mobility.
Habib, A.; Jarvis, A.; Al-Durgham, M. M.; Lay, J.; Quackenbush, P.; Stensaas, G.; Moe, D.
2007-01-01
The mapping community is witnessing significant advances in available sensors, such as medium format digital cameras (MFDC) and Light Detection and Ranging (LiDAR) systems. In this regard, the Digital Photogrammetry Research Group (DPRG) of the Department of Geomatics Engineering at the University of Calgary has been actively involved in the development of standards and specifications for regulating the use of these sensors in mapping activities. More specifically, the DPRG has been working on new techniques for the calibration and stability analysis of medium format digital cameras. This research is essential since these sensors were not developed with mapping applications in mind; therefore, prior to their use in geomatics activities, new standards should be developed to ensure the quality of the derived products. On another front, the persistent improvement in direct geo-referencing technology has led to an expansion in the use of LiDAR systems for the acquisition of dense and accurate surface information. However, the processing of the raw LiDAR data (e.g., ranges, mirror angles, and navigation data) remains a non-transparent process that is proprietary to the manufacturers of LiDAR systems. Therefore, the DPRG has been focusing on the development of quality control procedures to quantify the accuracy of LiDAR output in the absence of initial system measurements. This paper presents a summary of the research conducted by the DPRG, together with the British Columbia Base Mapping and Geomatic Services (BMGS) and the United States Geological Survey (USGS), on the development of quality assurance and quality control procedures for emerging mapping technologies. The outcome of this research will allow for the possibility of introducing North American standards and specifications to regulate the use of MFDC and LiDAR systems in the mapping industry.
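One simple quality-control statistic that can be computed without access to raw LiDAR system measurements is the vertical RMSE of LiDAR elevations against independently surveyed check points; a minimal sketch with made-up elevations (the paper's actual QC procedures are more involved):

```python
import math

def vertical_rmse(lidar_z, control_z):
    """Root-mean-square error between LiDAR elevations and surveyed
    check-point elevations at the same locations -- one basic accuracy
    measure for LiDAR output."""
    diffs = [l - c for l, c in zip(lidar_z, control_z)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical elevations (metres) at three check points.
rmse = vertical_rmse([100.1, 99.8, 100.3], [100.0, 100.0, 100.0])
```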
The OGC Sensor Web Enablement framework
NASA Astrophysics Data System (ADS)
Cox, S. J.; Botts, M.
2006-12-01
Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data; a key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange, and processing of sensor observations, and for the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS), for parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS), for tasking a sensor system to undertake future observations; and Sensor Alert Service (SAS), for subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the next obtains the detail required to formulate a data request, and the last requests a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional web-services (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS), and portrayal (WMS) services, as well as authentication and rights management.
The standard SWE data formats are Observations and Measurements (O&M) which encodes observation metadata and results, Sensor Model Language (SensorML) which describes sensor-systems, Transducer Model Language (TML) which covers low-level data streams, and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds. These were based around emergency management, security, contamination and environmental monitoring scenarios.
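The capabilities-then-data request sequence described above can be illustrated with KVP-encoded SOS requests; the endpoint, offering name, and property URN below are hypothetical stand-ins:

```python
from urllib.parse import urlencode

BASE = "https://example.org/sos"  # hypothetical SOS endpoint

def sos_request(request, **kv):
    """Build one step of the standard capabilities -> data request
    sequence as a KVP-encoded SOS 1.0.0 URL."""
    params = {"service": "SOS", "version": "1.0.0", "request": request}
    params.update(kv)
    return BASE + "?" + urlencode(params)

# Step 1: discover what the service offers.
caps = sos_request("GetCapabilities")

# Step 2: request observations for an offering found in the capabilities.
obs = sos_request("GetObservation",
                  offering="TEMPERATURE",
                  observedProperty="urn:ogc:def:property:OGC::Temperature",
                  eventTime="2006-07-01T00:00:00Z/2006-07-02T00:00:00Z")
```

The GetObservation response would be an Observations and Measurements (O&M) document, closing the loop between the interface and encoding standards named above.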
Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.
Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J
2015-06-01
An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
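The core mapping step, matching report terms against standardized lexicons by exact term or synonym, can be sketched as follows; the lexicon entries and identifiers are toy stand-ins, not actual RadLex or SNOMED-CT codes, and the real application delegates this lookup to the NCBO BioPortal Annotator.

```python
# A toy lexicon standing in for RadLex/SNOMED-CT/LOINC lookups.
# The RID-style identifiers are invented for illustration.
LEXICON = {
    "hepatic": {"id": "RID_A", "synonyms": {"liver"}},
    "embolization": {"id": "RID_B", "synonyms": {"embolisation"}},
}

def annotate(tokens, lexicon):
    """Map each report token to a lexicon identifier by exact match or
    synonym; unmapped tokens come back as None, flagging non-standard terms."""
    index = {}
    for term, entry in lexicon.items():
        index[term] = entry["id"]
        for syn in entry["synonyms"]:
            index[syn] = entry["id"]
    return {t: index.get(t.lower()) for t in tokens}

result = annotate(["Liver", "embolization", "catheter"], LEXICON)
```

Tokens that map to no identifier are exactly the ones a template author would revise toward standards compliance.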
First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Manaud, N.; Gonzalez, J.
2014-04-01
We present a first prototype of a web map interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based user interface provides advanced search, preview, download, notification, and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently available only for the Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly adopted Geographic Information System (GIS) tools and standards originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone datasets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of planetary GIS users and define key areas of improvement for the future web PSA User Interface. Its web map interface shall provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern, interactive web mapping technology.
A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps, or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, shall enhance this user experience. In addition, users should be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised, and analysed directly from GIS software, along with data from other sources. The following figure illustrates how the PSA web map interface and services shall fit into a typical planetary GIS user working environment.
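Searching observation footprints against a user's map extent reduces, in the simplest case, to axis-aligned bounding-box intersection tests. A sketch with a made-up footprint (real HRSC footprints are polygons, and the PSA's actual query implementation is not described here):

```python
def bbox_intersects(a, b):
    """Axis-aligned intersection test for (min_lon, min_lat, max_lon, max_lat)
    boxes; ignores antimeridian wrap-around for brevity."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

hrsc_footprint = (-5.0, 10.0, -2.0, 14.0)   # hypothetical observation bbox
search_area = (-3.0, 12.0, 1.0, 20.0)       # user's map extent
hit = bbox_intersects(hrsc_footprint, search_area)
```

A bbox pre-filter like this is the usual first stage before an exact polygon intersection test in a spatial database.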
NASA Astrophysics Data System (ADS)
Allison, M.; Gundersen, L. C.; Richard, S. M.; Dickinson, T. L.
2008-12-01
A coalition of the state geological surveys (AASG), the U.S. Geological Survey (USGS), and partners will receive NSF funding over three years under the INTEROP solicitation to start building the Geoscience Information Network (www.geoinformatics.info/gin), a distributed, interoperable data network. The GIN project will develop standardized services to link existing and in-progress components using a few standards and protocols, and will work with data providers to implement these services. The key components of this network are: 1) catalog system(s) for data discovery; 2) service definitions for interfaces for searching catalogs and accessing resources; 3) shared interchange formats to encode information for transmission (e.g., various XML markup languages); 4) data providers that publish information using standardized services defined by the network; and 5) client applications adapted to use information resources provided by the network. The GIN will integrate and use catalog resources that currently exist or are in development. We are working with the USGS National Geologic Map Database's existing map catalog; with the USGS National Geological and Geophysical Data Preservation Program, which is developing a metadata catalog (the National Digital Catalog) for geoscience information resource discovery; and with the GEON catalog. Existing interchange formats will be used, such as GeoSciML, ChemML, and the Open Geospatial Consortium sensor, observation, and measurement markup languages. Client application development will be fostered by collaboration with industry and academic partners. The GIN project will focus on the remaining aspects of the system, namely service definitions, assistance to data providers in implementing the services and bringing content online, and system integration of the modules. Initial formal collaborators include the OneGeology-Europe consortium of 27 nations, which is building a comparable network under the EU INSPIRE initiative, GEON, EarthChem, and the GIS software company ESRI.
OneGeology-Europe and GIN have agreed to integrate their networks, effectively adopting standards shared across the global geological survey community. ESRI is creating a Geology Data Model for ArcGIS software that is compatible with GIN, and other companies are expressing interest in adapting their services, applications, and clients to take advantage of the large data resources planned to become available through GIN.
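Catalog searching in a network like GIN typically goes through CSW. A sketch of a KVP-encoded CSW 2.0.2 GetRecords query (the endpoint is hypothetical, and production requests usually add paging and output-schema parameters):

```python
from urllib.parse import urlencode

def csw_getrecords(base_url, keyword, max_records=10):
    """KVP form of a CSW 2.0.2 GetRecords query filtering on AnyText
    with a CQL constraint."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint": "AnyText LIKE '%{0}%'".format(keyword),
        "maxRecords": max_records,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical catalog endpoint; the response is an XML set of csw:Record
# entries the client can harvest or display.
url = csw_getrecords("https://example.org/csw", "GeoSciML")
```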
NASA Astrophysics Data System (ADS)
Saprudin, S.; Liliasari, L.; Prihatmanto, A. S.
2017-09-01
This study is a survey that aims to describe pre-service physics teachers' concept mastery at a university in Ternate. Data were collected with the standard physics test instrument used in the teacher certification program and analyzed using descriptive quantitative techniques. Based on the results of the data analysis, it was concluded that pre-service physics teachers' concept mastery generally falls into the low category (25.4%). The resulting map of concept mastery will be used as a reference for developing game designs for physics learning contexts for pre-service physics teachers.
Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D
2007-01-01
Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is less widespread due to the complexity of its imaging, image workflow, and post-processing, and a lack of algorithmic standards that hinders result comparability. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to community physicians' uncertainty about how to integrate fMRI into practice. In addition, physician training with fMRI is in its infancy and requires both clinical and technical understanding. Therefore, many institutions that perform fMRI maintain a team of basic researchers and physicians to run fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool for the benefit of a larger patient population, image acquisition and image post-processing must be streamlined, standardized, and made available to institutions that lack these resources. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (the Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus make standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.
Rose, Kathryn V.; Nayegandhi, Amar; Moses, Christopher S.; Beavers, Rebecca; Lavoie, Dawn; Brock, John C.
2012-01-01
The National Park Service (NPS) Inventory and Monitoring (I&M) Program initiated a benthic habitat mapping program in ocean and coastal parks in 2008-2009 in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With more than 80 ocean and Great Lakes parks encompassing approximately 2.5 million acres of submerged territory and approximately 12,000 miles of coastline (Curdts, 2011), this Servicewide Benthic Mapping Program (SBMP) is essential. This report presents an initial gap analysis of three pilot parks under the SBMP: Assateague Island National Seashore (ASIS), Channel Islands National Park (CHIS), and Sleeping Bear Dunes National Lakeshore (SLBE) (fig. 1). The recommended SBMP protocols include servicewide standards (for example, gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). The SBMP requires the inventory and mapping of critical components of coastal and marine ecosystems: bathymetry, geoforms, surface geology, and biotic cover. In order for a park unit benthic inventory to be considered complete, maps of bathymetry and other key components must be combined into a final report (Moses and others, 2010). By this standard, none of the three pilot parks are mapped (inventoried) to completion with respect to submerged resources. After compiling the existing benthic datasets for these parks, this report has concluded that CHIS, with 49 percent of its submerged area mapped, has the most complete benthic inventory of the three. The ASIS submerged inventory is 41 percent complete, and SLBE is 17.5 percent complete.
A multi-service data management platform for scientific oceanographic products
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni
2017-02-01
An efficient, secure, and interoperable data platform solution has been developed in the TESSA project to provide fast navigation of and access to the data stored in the data archive, as well as standards-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services such as decision support systems (DSS). The datasets are accessible through three main components: the Data Access Service (DAS), the Metadata Service, and the Complex Data Analysis Module (CDAM). The DAS provides access to data stored in the archive through interfaces for different protocols and services for downloading, variable selection, data subsetting, and map generation. The Metadata Service is the heart of the information system for the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS, performing on-demand complex data analysis tasks.
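The DAS-style data subsetting mentioned above can be sketched as a bounding-box selection over a gridded field; the grid, coordinates, and values below are invented, and a production service would operate on NetCDF-like datasets rather than Python lists.

```python
def subset(grid, lats, lons, bbox):
    """Select the portion of a 2-D field inside
    (min_lon, min_lat, max_lon, max_lat), mimicking a data-subsetting
    service: rows index latitude, columns index longitude."""
    min_lon, min_lat, max_lon, max_lat = bbox
    rows = [i for i, la in enumerate(lats) if min_lat <= la <= max_lat]
    cols = [j for j, lo in enumerate(lons) if min_lon <= lo <= max_lon]
    return [[grid[i][j] for j in cols] for i in rows]

# Invented sea-surface-temperature field on a 2x3 lat/lon grid.
sst = [[14.0, 15.0, 16.0],
       [14.5, 15.5, 16.5]]
out = subset(sst, lats=[40.0, 41.0], lons=[12.0, 13.0, 14.0],
             bbox=(12.5, 40.5, 14.0, 41.5))
```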
Architecture of the local spatial data infrastructure for regional climate change research
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny
2013-04-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in modeling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which can reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support based on the SDI approach. A dedicated architecture for a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing users to search for geoinformation resources (datasets and services) using metadata catalogs, to produce geospatial data selections by their parameters (data access functionality), and to manage services and applications for cartographic visualization. Owing to large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, the development of services for environmental geodata access, processing, and visualization is a complex task. These circumstances were taken into account while designing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented includes: 1. A storage model for large sets of regional georeferenced data that is effective for search, access, retrieval, and subsequent statistical processing, and that allows frequently used values (such as monthly and annual climate change indices) to be stored, thus providing different temporal views of the datasets 2. A general architecture for the software components handling geospatial datasets within the storage model 3.
Metadata catalog describing in detail using ISO 19115 and CF-convention standards datasets used in climate researches as a basic element of the spatial data infrastructure as well as its publication according to OGC CSW (Catalog Service Web) specification 4. Computational and mapping web services to work with geospatial datasets based on OWS (OGC Web Services) standards: WMS, WFS, WPS 5. Geoportal as a key element of thematic regional spatial data infrastructure providing also software framework for dedicated web applications development To realize web mapping services Geoserver software is used since it provides natural WPS implementation as a separate software module. To provide geospatial metadata services GeoNetwork Opensource (http://geonetwork-opensource.org) product is planned to be used for it supports ISO 19115/ISO 19119/ISO 19139 metadata standards as well as ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of local SDI geoportal the following open source software have been selected: 1. OpenLayers JavaScript library, providing basic web mapping functionality for the thin client such as web browser 2. GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interface of such popular desktop GIS applications, as uDIG, QuantumGIS etc. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
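The OWS services mentioned above are plain HTTP key-value interfaces, so a client request can be assembled with nothing more than URL encoding. As a minimal sketch (the endpoint, layer name and bounding box below are invented for illustration), a WMS 1.3.0 GetMap request to such a geoportal might be built like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(512, 512),
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL (key-value encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy (axis order per CRS)
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical GeoServer endpoint serving a climate-index layer
url = wms_getmap_url("http://example.org/geoserver/wms",
                     "climate:annual_index",
                     (50.0, 60.0, 60.0, 90.0))
```

Fetching this URL from a WMS-compliant server would return a rendered map image; the same key-value pattern applies to WFS and WPS requests.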
Astronomical Data Integration Beyond the Virtual Observatory
NASA Astrophysics Data System (ADS)
Lemson, G.; Laurino, O.
2015-09-01
"Data integration" generally refers to the process of combining data from different source databases into a unified view. Much work has been devoted to this area by the International Virtual Observatory Alliance (IVOA), allowing users to discover and access databases through standard protocols. However, different archives present their data through their own schemas, and users must still select, filter, and combine data for each archive individually. An important reason for this is that the creation of common data models that satisfy all sub-disciplines is fraught with difficulties. Furthermore, it requires a substantial amount of work for data providers to present their data according to some standard representation. We will argue that existing standards allow us to build a data integration framework that works around these problems. The particular framework requires the implementation of the IVOA Table Access Protocol (TAP) only. It uses the newly developed VO data modelling language (VO-DML) specification, which allows one to define extensible object-oriented data models using a subset of UML concepts through a simple XML serialization language. A rich mapping language allows one to describe how instances of VO-DML data models are represented by the TAP service, bridging the possible mismatch between a local archive's schema and some agreed-upon representation of the astronomical domain. In this so-called local-as-view approach to data integration, "mediators" use the mapping prescriptions to translate queries phrased in terms of the common schema to the underlying TAP service. This mapping language has a graphical representation, which we expose through a web-based graphical "drag-and-drop-and-connect" interface. This service allows any user to map the holdings of any TAP service to the data model(s) of choice. 
The mappings are defined and stored outside of the data sources themselves, which allows the interface to be used in a kind of crowd-sourcing effort to annotate any remote database of interest. This reduces the burden of publishing one's data and allows great flexibility in the definition of the views through which particular communities might wish to access remote archives. At the same time, the framework eases the user's effort to select, filter, and combine data from many different archives, so as to build knowledge bases for their analysis. We will present the framework and demonstrate a prototype implementation. We will discuss ideas for producing the missing elements, in particular the query language and the implementation of mediator tools to translate object queries to ADQL.
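The mediator step described above can be sketched in a few lines. This is a toy illustration of the local-as-view idea, not the actual VO-DML mapping language: the global-model attribute names, local table and column names are all invented, and a real mediator would handle joins, multiple tables and full ADQL grammar.

```python
# Toy "mediator": rewrite a query phrased against a common data model
# into ADQL over a local archive's schema, using a declarative mapping.
MAPPING = {  # global model attribute -> (local table, local column)
    "Source.position.ra":  ("obs_catalog", "raj2000"),
    "Source.position.dec": ("obs_catalog", "dej2000"),
    "Source.magnitude":    ("obs_catalog", "mag_v"),
}

def to_adql(select_attrs, where_attr, op, value):
    """Translate a single-table global-model query into local ADQL."""
    tables = {MAPPING[a][0] for a in select_attrs + [where_attr]}
    assert len(tables) == 1, "toy mediator handles single-table mappings only"
    table = tables.pop()
    cols = ", ".join(MAPPING[a][1] for a in select_attrs)
    return (f"SELECT {cols} FROM {table} "
            f"WHERE {MAPPING[where_attr][1]} {op} {value}")

adql = to_adql(["Source.position.ra", "Source.position.dec"],
               "Source.magnitude", "<", 12)
# adql == "SELECT raj2000, dej2000 FROM obs_catalog WHERE mag_v < 12"
```

The point of the design is that only the MAPPING table is archive-specific; the query itself is phrased once, against the common schema.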
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said, "I have not failed. I've just found 10,000 ways that won't work." For innovation and improvement of standards to happen, service architectures have to be tested again and again. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. That architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different service types: Data Services (e.g. WMS), Application Services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). 
The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common-architecture thread. Instead, the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.
Research and Practice of the News Map Compilation Service
NASA Astrophysics Data System (ADS)
Zhao, T.; Liu, W.; Ma, W.
2018-04-01
Based on the needs of the news media for maps, this paper researches the news map compilation service. It conducts demand research on the compilation of news maps; designs and compiles a public, authoritative base map suitable for media publication; and constructs a news base-map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics; constructs a hot-news thematic gallery and news map customization services; conducts research on types of news maps; establishes closer liaison and cooperation methods with the news media; and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map compilation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.
Description of the U.S. Geological Survey Geo Data Portal data integration framework
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Lucido, Jessica M.
2012-01-01
The U.S. Geological Survey has developed an open-standard data integration framework for working efficiently and effectively with large collections of climate and other geoscience data. A web interface accesses catalog datasets to find data services. Data resources can then be rendered for mapping and dataset metadata are derived directly from these web services. Algorithm configuration and information needed to retrieve data for processing are passed to a server where all large-volume data access and manipulation takes place. The data integration strategy described here was implemented by leveraging existing free and open source software. Details of the software used are omitted; rather, emphasis is placed on how open-standard web services and data encodings can be used in an architecture that integrates common geographic and atmospheric data.
European Marine Observation Data Network - EMODnet Physics
NASA Astrophysics Data System (ADS)
Manzella, Giuseppe M. R.; Novellino, Antonio; D'Angelo, Paolo; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Loubrieu, Thomas; Rickards, Lesley
2015-04-01
The EMODnet-Physics portal (www.emodnet-physics.eu) makes layers of physical data and their metadata available for use and contributes towards the definition of an operational European Marine Observation and Data Network (EMODnet). It is based on a strong collaboration between EuroGOOS associates and its regional operational systems (ROOSs), and it brings together two very different marine communities: the "real-time" ocean observing institutes/centres and the National Oceanographic Data Centres (NODCs) that are in charge of ocean data validation, quality checking and updating for marine environmental monitoring. EMODnet-Physics is a marine observation and data information system that provides a single point of access to near-real-time and historical archived data (www.emodnet-physics.eu/map). It is built on existing infrastructure, adding value while avoiding any unnecessary complexity; it provides data access to users and aims to attract new data holders and better and more data. With a long-term vision of a sustainable pan-European ocean observation system, EMODnet-Physics supports the coordination of the EuroGOOS regional components and the empowerment and improvement of their data management infrastructure. In turn, EMODnet-Physics has already implemented high-level interoperability features (WMS, web catalogue, web services, etc.) to facilitate connection and data exchange with the ROOSs and the institutes within them (www.emodnet-physics.eu/services). The current EMODnet-Physics structure delivers environmental marine physical data from the whole of Europe (wave height and period, temperature of the water column, wind speed and direction, salinity of the water column, horizontal velocity of the water column, light attenuation, and sea level) as monitored by fixed stations, Argo floats, drifting buoys, gliders, and ferry-boxes. 
It provides discovery of data sets (both NRT - near real time - and historical), visualization, and free download of data from more than 1500 platforms. The portal is composed mainly of three sections: the Map, the Selection List and the Station Info Panel. The Map is the core of the EMODnet-Physics system: here the user can access all available data, customize the map visualization and set different display layers. It is also possible to interact with all the information on the map using the filters provided by the service, which can be used to select stations of interest by type, physical parameters measured, time period of the observations in the system's database, country of origin, and reference water basin. It is also possible to browse the data in time by means of the slider in the lower part of the page, which allows the user to view the stations that recorded data in a particular time period. Finally, it is possible to change the standard map view with different layers that provide additional visual information on the status of the waters. The Station Info panel, available from the main map by clicking on a single platform, provides information on the measurements carried out by that station. Moreover, the system provides full interoperability with third-party software through its WMS service, web services and web catalogue, in order to exchange data and products according to the most recent interoperability standards. Further developments will ensure compatibility with the OGC SWE (Sensor Web Enablement) standard for the description of sensors and related observations using OpenGIS specifications (SensorML, O&M, SOS). The full list of services is available at www.emodnet-physics.eu/services. The result is an excellent example of innovative technologies providing open and free access to geo-referenced data for the creation of new advanced (operational) oceanography services.
Hydrologic Unit Map -- 1978, state of South Dakota
1978-01-01
This map and accompanying table show Hydrologic Units that are basically hydrographic in nature. The Cataloging Units shown supplant the Cataloging Units previously depicted on the 1974 State Hydrologic Unit Map. The boundaries as shown have been adapted from the 1974 State Hydrologic Unit Map; "The Catalog of Information on Water Data" (1972); "Water Resources Regions and Subregions for the National Assessment of Water and Related Land Resources" by the U.S. Water Resources Council (1970); "River Basins of the United States" by the U.S. Soil Conservation Service (1963, 1970); "River Basin Maps Showing Hydrologic Stations" by the Inter-Agency Committee on Water Resources, Subcommittee on Hydrology (1961); and State planning maps. The political subdivisions have been adopted from "Counties and County Equivalents of the States of the United States," presented in Federal Information Processing Standards Publication 6-2, issued by the National Bureau of Standards (1973), in which each county or county equivalent is identified by a 2-character State code and a 3-character county code. The Regions, Subregions and Accounting Units are aggregates of the Cataloging Units. The Regions and Subregions are currently (1978) used by the U.S. Water Resources Council for comprehensive planning, including the National Assessment, and as a standard geographical framework for more detailed water and related land-resources planning. The Accounting Units are those currently (1978) in use by the U.S. Geological Survey for managing the National Water Data Network. This map was revised to include a boundary realignment between Cataloging Units 10140103 and 10160009.
Forest resources of the Clearwater National Forest
Ryan P. Hughes
2011-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Clearwater National Forest using a nationally standardized mapped-plot design (for more details see section "Inventory methods...
Forest resources of the Medicine Bow National Forest
Jim Steed
2008-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Medicine Bow National Forest using a nationally standardized mapped-plot design (for more details see "Inventory methods"...
NASA Technical Reports Server (NTRS)
1981-01-01
The use of the International Standards Organization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top-level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for ADS network planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.
Cruza, Norberto Sotelo; Fierros, Luis E
2006-01-01
The present study was done at the internal medicine service oft he Hospital lnfantil in the State of Sonora, Mexico. We tried to address the question of the use of conceptual schemes and mind maps and its impact on the teaching-learning-evaluation process among medical residents. Analyze the effects of conceptual schemes, and mind maps as a teaching and evaluation tool and compare them with multiple choice exams among Pediatric residents. Twenty two residents (RI, RII, RIII)on service rotation during six months were assessed initially, followed by a lecture on a medical subject. Conceptual schemes and mind maps were then introduced as a teaching-learning-evaluation instrument. Comprehension impact and comparison with a standard multiple choice evaluation was done. The statistical package (JMP version 5, SAS inst. 2004) was used. We noted that when we used conceptual schemes and mind mapping, learning improvement was noticeable among the three groups of residents (P < 0.001) and constitutes a better evaluation tool when compared with multiple choice exams (P < 0.0005). Based on our experience we recommend the use of this educational technique for medical residents in training.
NASA Astrophysics Data System (ADS)
Haderman, M.; Dye, T. S.; White, J. E.; Dickerson, P.; Pasch, A. N.; Miller, D. S.; Chan, A. C.
2012-12-01
Built upon the success of the U.S. Environmental Protection Agency's (EPA) AirNow program (www.AirNow.gov), the AirNow-International (AirNow-I) system contains an enhanced suite of software programs that process and quality control real-time air quality and environmental data and distribute customized maps, files, and data feeds. The goals of the AirNow-I program are similar to those of the successful U.S. program and include fostering the exchange of environmental data; making advances in air quality knowledge and applications; and building a community of people, organizations, and decision makers in environmental management. In 2010, Shanghai became the first city in China to run this state-of-the-art air quality data management and notification system. AirNow-I consists of a suite of modules (software programs and schedulers) centered on a database. One such module is the Information Management System (IMS), which can automatically produce maps and other data products through the use of GIS software to provide the most current air quality information to the public. Developed with Global Earth Observation System of Systems (GEOSS) interoperability in mind, IMS is based on non-proprietary standards, with preference to formal international standards. The system depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. In particular, the specifications include standards for service-oriented architecture and web-based interfaces, such as a web mapping service (WMS), web coverage service (WCS), web feature service (WFS), sensor web services, and Really Simple Syndication (RSS) feeds. IMS is flexible, open, redundant, and modular. It also allows the merging of data grids to create complex grids that show comprehensive air quality conditions. 
For example, the AirNow Satellite Data Processor (ASDP) was recently developed to merge PM2.5 estimates from National Aeronautics and Space Administration (NASA) satellite data and AirNow observational data, creating more precise maps and gridded data products for under-monitored areas. The ASDP can easily incorporate other data feeds, including fire and smoke locations, to build enhanced real-time air quality data products. In this presentation, we provide an overview of the features and functions of IMS, an explanation of how data moves through IMS, the rationale of the system architecture, and highlights of the ASDP as an example of the modularity and scalability of IMS.
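The grid-merging step behind products like the ASDP can be sketched very simply. This is a hedged illustration, not the AirNow-I algorithm: the rule shown (prefer a valid ground observation, fall back to the satellite estimate for under-monitored cells) and the numbers are invented for the example.

```python
MISSING = None  # sentinel for cells with no monitor coverage

def merge_grids(observed, satellite):
    """Merge two equally shaped data grids cell by cell:
    prefer ground observations, fall back to satellite estimates."""
    return [
        [obs if obs is not MISSING else sat
         for obs, sat in zip(obs_row, sat_row)]
        for obs_row, sat_row in zip(observed, satellite)
    ]

obs = [[12.0, MISSING], [MISSING, 8.5]]   # PM2.5 from monitors
sat = [[11.0, 14.2],    [9.8,     9.0]]   # satellite-derived estimates
merged = merge_grids(obs, sat)
# merged == [[12.0, 14.2], [9.8, 8.5]]
```

A production system would additionally weight, interpolate and quality-flag the inputs, but the modular structure is the same: each data feed contributes a grid, and a merge rule produces the combined product.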
Big Outcrops and Big Ideas in Earth Science K-8 Professional Development
NASA Astrophysics Data System (ADS)
Baldwin, K. A.; Cooper, C. M.; Cavagnetto, A.; Morrison, J.; Adesope, O.
2014-12-01
Washington State has recently adopted the Next Generation Science Standards (NGSS), and state leaders are now working toward supporting teachers' implementation of the new standards and the pedagogical practices that support them. This poster describes one such professional development (PD) effort. The Enhancing Understanding of Concepts and Processes of Science (EUCAPS) project serves 31 K-8 in-service teachers in two southeast Washington school districts. In year two of this three-year PD project, in-service teachers explored the Earth sciences and pedagogical approaches such as the Science Writing Heuristic, concept mapping, and activities that emphasize the epistemic nature of science. The goals of the EUCAPS PD project are to increase in-service teachers' understanding of big ideas in science and to provide support to in-service teachers as they transition to the NGSS. Teachers used concept maps to document their knowledge of Earth science processes before and after visiting a local field site in Lewiston, Idaho. In the context of immersive inquiries, teachers collected field-based evidence to support their claims about the geological history of the field site. Teachers presented their claims and evidence to their peers in the form of a story about the local geologic history. This poster will present an overview of the PD as well as examples of teachers' work and its alignment with the NGSS.
Semantics in NETMAR (open service NETwork for MARine environmental data)
NASA Astrophysics Data System (ADS)
Leadbetter, Adam; Lowry, Roy; Clements, Oliver
2010-05-01
Over recent years, there has been a proliferation of environmental data portals utilising a wide range of systems and services, many of which cannot interoperate. The European Union Framework 7 project NETMAR (which commenced in February 2010) aims to provide a toolkit for building such portals in a coherent manner through the use of chained Open Geospatial Consortium Web Services (WxS), OPeNDAP file access and W3C standards, controlled by a Business Process Execution Language workflow. As such, the end product will be configurable by user communities interested in developing a portal for marine environmental data, and will offer search, download and integration tools for a range of satellite, model and observed data from open ocean and coastal areas. Further processing of these data will also be available in order to provide statistics and derived products suitable for decision making in the chosen environmental domain. In order to make the resulting portals truly interoperable, the NETMAR programme requires a detailed definition of the semantics of the services being called and the data which are being requested. A key goal of the NETMAR programme is, therefore, to develop a multi-domain and multilingual ontology of marine data and services. This will allow searches across both human languages and scientific domains. The approach taken will be to analyse existing semantic resources and provide mappings between them, gluing together the definitions, semantics and workflows of the WxS services. The mappings between terms aim to be more general than the standard "narrower than"/"broader than" relations seen in the thesauri or simple ontologies implemented by previous programmes. Tools for the development and population of ontologies will also be provided by NETMAR, as there will be instances in which existing resources cannot sufficiently describe newly encountered data or services.
Uncertainties in ecosystem service maps: a comparison on the European scale.
Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H
2014-01-01
Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, little attention is paid to the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights into the uncertainty of ecosystem service maps and to discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar, while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. The absence of suitable observed data on ecosystem service provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures of ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.
US EPA Nonattainment Areas and Designations
This web service contains the following state-level layers: Ozone 8-hr (1997 standard), Ozone 8-hr (2008 standard), Lead (2008 standard), SO2 1-hr (2010 standard), PM2.5 24-hr (2006 standard), PM2.5 Annual (1997 standard), PM2.5 Annual (2012 standard), and PM10 (1987 standard). Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NonattainmentAreas/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database; however, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each
NASA Astrophysics Data System (ADS)
Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.
2016-12-01
The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. 
An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
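Conceptually, a circle-based population query like the one the PES answers reduces to summing gridded population counts whose cells fall inside the user-defined circle. The sketch below illustrates that idea only; it is not the PES algorithm (which works on GPWv4 cells with areal weighting), and the grid cells and counts are invented.

```python
import math

def population_in_circle(cells, center, radius_km):
    """Sum population for grid cells whose centers fall inside a circle.
    cells: list of (lat, lon, population); distances via haversine."""
    clat, clon = center
    total = 0
    for lat, lon, pop in cells:
        # Haversine great-circle distance in km (mean Earth radius ~6371 km)
        phi1, phi2 = math.radians(clat), math.radians(lat)
        dphi = math.radians(lat - clat)
        dlam = math.radians(lon - clon)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        d = 2 * 6371 * math.asin(math.sqrt(a))
        if d <= radius_km:
            total += pop
    return total

# Three invented cell centers with population counts
cells = [(40.0, -74.0, 1200), (40.1, -74.0, 900), (41.0, -74.0, 500)]
est = population_in_circle(cells, center=(40.0, -74.0), radius_km=20)
# est == 2100 (the third cell, ~111 km away, falls outside the circle)
```

A production service would also apportion partially covered cells and return accuracy measures, as the PES does.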
Forest resources of the Idaho Panhandle National Forest
Joshua C. Holte
2012-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Idaho Panhandle National Forest (IPNF) using a nationally standardized mapped-plot design (for more details see "The inventory...
Forest resources of the Black Hills National Forest
Larry T. DeBlander
2002-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Black Hills National Forest using a nationally standardized mapped-plot design (for more details see section "How was the inventory...
Forest resources of the Nez Perce National Forest
Michele Disney
2010-01-01
As part of a National Forest System cooperative inventory, the Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service conducted a forest resource inventory on the Nez Perce National Forest using a nationally standardized mapped-plot design (for more details see the section "Inventory methods"). This report presents highlights...
Forest resources of the Bighorn National Forest
Christopher Witt
2008-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Bighorn National Forest (Bighorn) using a nationally standardized mapped-plot design. This report presents the highlights of this 2000...
Forest resources of the Shoshone National Forest
James Menlove
2008-01-01
The Interior West Forest Inventory and Analysis (IWFIA) Program of the USDA Forest Service, Rocky Mountain Research Station, as part of our National Forest System cooperative inventories, conducted a forest resource inventory on the Shoshone National Forest using a nationally standardized mapped-plot design. This report presents the highlights of this 1999 inventory...
NASA Astrophysics Data System (ADS)
Kilb, D. L.; Fundis, A. T.; Risien, C. M.
2012-12-01
The focus of the Education and Public Engagement (EPE) component of the NSF's Ocean Observatories Initiative (OOI) is to provide a new layer of cyber-interactivity for undergraduate educators to bring near-real-time data from the global ocean into learning environments. To accomplish this, we are designing six online services: 1) visualization tools, 2) a lesson builder, 3) a concept map builder, 4) educational web services (middleware), 5) collaboration tools and 6) an educational resource database. Here, we report on our Fall 2012 release, which includes the first four of these services: 1) Interactive visualization tools allow users to interactively select data of interest, display the data in various views (e.g., maps, time series and scatter plots) and obtain statistical measures such as the mean, standard deviation and a regression line fit to selected data. Specific visualization tools include a tool to compare different months of data, a time-series explorer tool to investigate the temporal evolution of selected data parameters (e.g., sea water temperature or salinity), a glider profile tool that displays ocean glider tracks and associated transects, and a data comparison tool that allows users to view the data either in a scatter-plot view comparing one parameter with another, or in a time-series view. 2) Our interactive lesson builder tool allows users to develop a library of online lesson units, which are collaboratively editable and sharable, and provides starter templates designed from learning-theory knowledge. 3) Our interactive concept map tool allows the user to build and use concept maps, a graphical interface for mapping the connections between concepts and ideas. This tool also provides semantic-based recommendations and allows embedding of associated resources such as movies, images and blogs. 4) Educational web services (middleware) will provide an educational resource database API.
Allones, J L; Martinez, D; Taboada, M
2014-10-01
Clinical terminologies are considered a key technology for capturing clinical data in a precise and standardized manner, which is critical to accurately exchange information among different applications, medical records and decision support systems. An important step in promoting the real use of clinical terminologies, such as SNOMED-CT, is to facilitate the process of finding mappings between local terms of medical records and concepts of terminologies. In this paper, we propose a mapping tool to discover text-to-concept mappings in SNOMED-CT. Name-based techniques were combined with a query expansion system to generate alternative search terms, and with a strategy to analyze and take advantage of the semantic relationships of the SNOMED-CT concepts. The developed tool was evaluated and compared to the search services provided by two SNOMED-CT browsers. Our tool automatically mapped clinical terms from a Spanish glossary of procedures in pathology with 88.0% precision and 51.4% recall, providing a substantial improvement in recall (28% and 60%) over other publicly accessible mapping services. The improvements reached by the mapping tool are encouraging. Our results demonstrate the feasibility of accurately mapping clinical glossaries to SNOMED-CT concepts by means of a combination of structural, query expansion and name-based techniques. We have shown that SNOMED-CT is a rich source of knowledge from which to infer synonyms for the medical domain. Results show that an automated query expansion system partially overcomes the challenge of vocabulary mismatch.
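The name-based matching with query expansion described above can be illustrated with a minimal sketch. The synonym table and the concept dictionary here are hypothetical stand-ins for SNOMED-CT content, and the matching is deliberately exact-name-only; the authors' tool additionally exploits semantic relationships:

```python
def expand(term, synonyms):
    """Query expansion: generate alternative search terms for a local
    term by substituting known synonyms (a hypothetical synonym table)."""
    alts = {term.lower()}
    for word, syns in synonyms.items():
        if word in term.lower():
            for s in syns:
                alts.add(term.lower().replace(word, s))
    return alts

def map_term(term, concepts, synonyms):
    """Name-based mapping: return concept IDs whose name matches the
    term or any of its expansions."""
    alts = expand(term, synonyms)
    return [cid for cid, name in concepts.items() if name.lower() in alts]
```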
NASA Astrophysics Data System (ADS)
Signell, R. P.; Camossi, E.
2015-11-01
Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language (NcML). The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
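The NcML-based virtual aggregation at the heart of this brokering approach can be sketched as follows. This generates a simplified joinExisting aggregation document of the kind THREDDS serves, so many per-run output files appear as one dataset; it is an illustration, not the authors' tooling:

```python
import xml.etree.ElementTree as ET

NCML = "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"

def ncml_join(files, dim="time"):
    """Build an NcML document that virtually aggregates a list of model
    output files along a record dimension (a joinExisting aggregation),
    without touching the files themselves."""
    root = ET.Element("{%s}netcdf" % NCML)
    agg = ET.SubElement(root, "{%s}aggregation" % NCML,
                        dimName=dim, type="joinExisting")
    for f in files:
        ET.SubElement(agg, "{%s}netcdf" % NCML, location=f)
    return ET.tostring(root, encoding="unicode")
```

A THREDDS catalog would point at such a document instead of at the individual files.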
An effective XML based name mapping mechanism within StoRM
NASA Astrophysics Data System (ADS)
Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.
2008-07-01
In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high-level logical identifier. This logical identifier is typically organized in a file-system-like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data by evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the request. StoRM is an SRM service developed by INFN and ICTP-EGRID to manage files and space on standard POSIX and high-performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different qualities of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide and the Virtual Organizations that want to use them. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
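The database-free resolution StoRM performs can be illustrated with a deliberately simplified sketch. The XML layout, element names and paths below are hypothetical, standing in for StoRM's actual namespace schema; the point is that the logical identifier plus the request attributes are enough to locate the data:

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace document: each storage area declares the VO
# it serves, the quality of service it provides, and a physical root.
NAMESPACE_XML = """
<namespace>
  <storage-area name="tape" vo="atlas" quality="custodial" root="/gpfs/tape/atlas"/>
  <storage-area name="disk" vo="atlas" quality="replica" root="/gpfs/disk/atlas"/>
</namespace>
"""

def resolve(logical_name, vo, quality, doc=NAMESPACE_XML):
    """Map a logical identifier to a physical path by matching the
    request attributes against the XML document -- no database query."""
    for sa in ET.fromstring(doc).findall("storage-area"):
        if sa.get("vo") == vo and sa.get("quality") == quality:
            return sa.get("root") + logical_name
    raise LookupError("no storage area matches the requested attributes")
```

The same logical name resolves to different physical copies as the requested quality of service changes.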
Current Approaches to Improving Marine Geophysical Data Discovery and Access
NASA Astrophysics Data System (ADS)
Jencks, J. H.; Cartwright, J.; Varner, J. D.; Anderson, C.; Robertson, E.; McLean, S. J.
2016-02-01
Exploring, understanding, and managing the global oceans is a challenge when hydrographic maps are available for only 5% of the world's oceans, even less of which has been mapped geologically or surveyed for benthic habitats. Seafloor mapping is expensive, and most government and academic budgets continue to tighten. The first step for any mapping program, before setting out to map uncharted waters, should be to identify whether data already exist in the area of interest. There are many reasons why this seemingly simple suggestion is not commonplace. While certain datasets are accessible online (e.g., NOAA's NCEI, EMODnet, IHO-DCDB), many are not. In some cases, data that are publicly available are difficult to discover and access. No single agency can successfully resolve the complex and pressing demands of ocean and coastal mapping and the associated data stewardship. NOAA partners with other federal agencies to provide an integrated approach to carry out a coordinated and comprehensive ocean and coastal mapping program. In order to maximize the return on their mapping investment, legacy and newly acquired data must be easily discoverable and readily accessible by numerous applications and formats now and well into the future. At NOAA's National Centers for Environmental Information (NCEI), resources are focused on ensuring the security and widespread availability of the Nation's scientific marine geophysical data through long-term stewardship. The public value of these data and products is maximized by streamlining data acquisition and processing operations, minimizing redundancies, facilitating discovery, and developing common standards to promote re-use. For its part, NCEI draws on a variety of software technologies and adheres to international standards to meet this challenge. The result is a geospatial framework built on spatially-enabled databases, standards-based web services, and International Organization for Standardization (ISO) metadata.
In order to maximize effectiveness in ocean and coastal mapping, we must be sure that limited funding is not being used to collect data in areas where data already exist. By making data more accessible, NCEI extends the use of, and therefore the value of, these data. Working together, we can ensure that valuable data are made available to the broadest community.
A New Map of Standardized Terrestrial Ecosystems of the Conterminous United States
Sayre, Roger G.; Comer, Patrick; Warner, Harumi; Cress, Jill
2009-01-01
A new map of standardized, mesoscale (tens to thousands of hectares) terrestrial ecosystems for the conterminous United States was developed by using a biophysical stratification approach. The ecosystems delineated in this top-down, deductive modeling effort are described in NatureServe's classification of terrestrial ecological systems of the United States. The ecosystems were mapped as physically distinct areas and were associated with known distributions of vegetation assemblages by using a standardized methodology first developed for South America. This approach follows the geoecosystems concept of R.J. Huggett and the ecosystem geography approach of R.G. Bailey. Unique physical environments were delineated through a geospatial combination of national data layers for biogeography, bioclimate, surficial materials lithology, land surface forms, and topographic moisture potential. Combining these layers resulted in a comprehensive biophysical stratification of the conterminous United States, which produced 13,482 unique biophysical areas. These were considered as fundamental units of ecosystem structure and were aggregated into 419 potential terrestrial ecosystems. The ecosystems classification effort preceded the mapping effort and involved the independent development of diagnostic criteria, descriptions, and nomenclature for describing expert-derived ecological systems. The aggregation and labeling of the mapped ecosystem structure units into the ecological systems classification was accomplished in an iterative, expert-knowledge-based process using automated rulesets for identifying ecosystems on the basis of their biophysical and biogeographic attributes. 
The mapped ecosystems, at a 30-meter base resolution, represent an improvement in spatial and thematic (class) resolution over existing ecoregionalizations and are useful for a variety of applications, including ecosystem services assessments, climate change impact studies, biodiversity conservation, and resource management.
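The geospatial combination step behind the biophysical stratification (stacking co-registered categorical layers and enumerating every distinct combination of classes) can be sketched with NumPy. This illustrates the general technique on toy arrays, not the project's production workflow:

```python
import numpy as np

def stratify(layers):
    """Combine co-registered categorical rasters (bioclimate, lithology,
    landform, ...) into one map of unique biophysical areas: every
    distinct combination of input classes gets its own integer label."""
    shape = np.asarray(layers[0]).shape
    # One row per pixel, one column per input layer.
    stack = np.stack([np.asarray(l).ravel() for l in layers], axis=1)
    combos, labels = np.unique(stack, axis=0, return_inverse=True)
    return labels.reshape(shape), len(combos)
```

Run over national layers, this is the step that yielded the 13,482 unique biophysical areas described above.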
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness, stability and user-friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps whose layers come from multiple different remote servers/sources. In this article we present one easy-to-implement Web GIS server solution based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
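A WMS GetMap request, the core operation such a MapServer instance answers, is just a parameterized URL; any client that can fetch an image can overlay remote layers this way. The base URL and layer name below are placeholders:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layers, bbox, size=(800, 600), srs="EPSG:4326"):
    """Compose a WMS 1.1.1 GetMap request URL for the given layers and
    bounding box (min_lon, min_lat, max_lon, max_lat)."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": ",".join(layers), "SRS": srs,
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)
```

Fetching the resulting URL from a remote WMS server returns a map image ready to layer over local data.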
Auto-Generated Semantic Processing Services
NASA Technical Reports Server (NTRS)
Davis, Rodney; Hupf, Greg
2009-01-01
Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
Geologic map of Big Bend National Park, Texas
Turner, Kenzie J.; Berry, Margaret E.; Page, William R.; Lehman, Thomas M.; Bohannon, Robert G.; Scott, Robert B.; Miggins, Daniel P.; Budahn, James R.; Cooper, Roger W.; Drenth, Benjamin J.; Anderson, Eric D.; Williams, Van S.
2011-01-01
The purpose of this map is to provide the National Park Service and the public with an updated digital geologic map of Big Bend National Park (BBNP). The geologic map report of Maxwell and others (1967) provides a fully comprehensive account of the important volcanic, structural, geomorphological, and paleontological features that define BBNP. However, the map is on a geographically distorted planimetric base and lacks topography, which has caused difficulty in conducting GIS-based data analyses and georeferencing the many geologic features investigated and depicted on the map. In addition, the map is outdated, excluding significant data from numerous studies that have been carried out since its publication more than 40 years ago. This report includes a modern digital geologic map that can be utilized with standard GIS applications to aid BBNP researchers in geologic data analysis, natural resource and ecosystem management, monitoring, assessment, inventory activities, and educational and recreational uses. The digital map incorporates new data, many revisions, and greater detail than the original map. Although some geologic issues remain unresolved for BBNP, the updated map serves as a foundation for addressing those issues. Funding for the Big Bend National Park geologic map was provided by the United States Geological Survey (USGS) National Cooperative Geologic Mapping Program and the National Park Service. The Big Bend mapping project was administered by staff in the USGS Geology and Environmental Change Science Center, Denver, Colo. Members of the USGS Mineral and Environmental Resources Science Center completed investigations in parallel with the geologic mapping project. Results of these investigations addressed some significant current issues in BBNP and the U.S.-Mexico border region, including contaminants and human health, ecosystems, and water resources. 
Funding for the high-resolution aeromagnetic survey in BBNP, and associated data analyses and interpretation, was from the USGS Crustal Geophysics and Geochemistry Science Center. Mapping contributed from university professors and students was mostly funded by independent sources, including academic institutions, private industry, and other agencies.
Personalized-detailed clinical model for data interoperability among clinical standards.
Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung
2013-08-01
Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
Rolling Deck to Repository (R2R): Big Data and Standard Services for the Fleet Community
NASA Astrophysics Data System (ADS)
Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Smith, S. R.; Stocks, K. I.
2014-12-01
The Rolling Deck to Repository (R2R; http://rvdata.us/) program curates underway environmental sensor data from the U.S. academic oceanographic research fleet, ensuring data sets are routinely and consistently documented, preserved in long-term archives, and disseminated to the science community. Currently 25 in-service vessels contribute 7 terabytes of data to R2R each year, acquired from a full suite of geophysical, oceanographic, meteorological, and navigational sensors on over 400 cruises worldwide. To accommodate this large volume and variety of data, R2R has developed highly efficient stewardship procedures. These include scripted "break out" of cruise data packages from each vessel based on standard filename and directory patterns; automated harvest of cruise metadata from the UNOLS Office via Web Services and from OpenXML-based forms submitted by vessel operators; scripted quality assessment routines that calculate statistical summaries and standard ratings for selected data types; adoption of community-standard controlled vocabularies for vessel codes, instrument types, etc., provided by the NERC Vocabulary Server, in lieu of maintaining custom local term lists; and a standard package structure based on the IETF BagIt format for delivering data to long-term archives. Documentation and standard post-field products, including quality-controlled shiptrack navigation data for every cruise, are published in multiple services and formats to satisfy a diverse range of clients. These include Catalog Service for the Web (CSW), GeoRSS, and OAI-PMH discovery services via a GeoNetwork portal; OGC Web Map and Feature Services for GIS clients; a citable Digital Object Identifier (DOI) for each dataset; ISO 19115-2 standard geospatial metadata records suitable for submission to long-term archives as well as the POGO global catalog; and Linked Open Data resources with a SPARQL query endpoint for Semantic Web clients.
R2R participates in initiatives such as the Ocean Data Interoperability Platform (ODIP) and the NSF EarthCube OceanLink project to promote community-standard formats, vocabularies, and services among ocean data providers.
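The IETF BagIt structure R2R uses for archive delivery can be sketched minimally: a bag declaration file, a data/ payload directory, and a checksum manifest. The file names and contents below are illustrative, not R2R's actual packages:

```python
import hashlib
import os

def make_bag(bag_dir, payload):
    """Lay out a minimal BagIt package: bagit.txt (bag declaration),
    a data/ directory holding the payload, and an MD5 manifest listing
    a checksum for every payload file."""
    data_dir = os.path.join(bag_dir, "data")
    os.makedirs(data_dir, exist_ok=True)
    with open(os.path.join(bag_dir, "bagit.txt"), "w") as f:
        f.write("BagIt-Version: 0.97\nTag-File-Character-Encoding: UTF-8\n")
    with open(os.path.join(bag_dir, "manifest-md5.txt"), "w") as m:
        for name, content in payload.items():
            with open(os.path.join(data_dir, name), "wb") as f:
                f.write(content)
            m.write("%s  data/%s\n" % (hashlib.md5(content).hexdigest(), name))
```

The manifest lets a receiving archive verify every payload file on ingest, which is what makes the format attractive for long-term delivery.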
Mapping Urban Ecosystem Services Using High Resolution Aerial Photography
NASA Astrophysics Data System (ADS)
Pilant, A. N.; Neale, A.; Wilhelm, D.
2010-12-01
Ecosystem services (ES) are the many life-sustaining benefits we receive from nature: e.g., clean air and water, food and fiber, cultural-aesthetic-recreational benefits, pollination and flood control. The ES concept is emerging as a means of integrating complex environmental and economic information to support informed environmental decision making. The US EPA is developing a web-based National Atlas of Ecosystem Services, with a component for urban ecosystems. Currently, the only wall-to-wall, national-scale land cover data suitable for this analysis is the National Land Cover Data (NLCD) at 30 m spatial resolution with 5- and 10-year updates. However, aerial photography is acquired at higher spatial resolution (0.5-3 m) and more frequently (1-5 years, typically) for most urban areas. Land cover was mapped in Raleigh, NC using freely available USDA National Agricultural Imagery Program (NAIP) imagery with 1 m ground sample distance to test the suitability of aerial photography for urban ES analysis. Automated feature extraction techniques were used to extract five land cover classes, and an accuracy assessment was performed using standard techniques. Results will be presented that demonstrate applications to mapping ES in urban environments: greenways, corridors, fragmentation, habitat, impervious surfaces, dark and light pavement (urban heat island). At this scale (2-10 m), land cover and related ecosystem services can be examined directly; small features such as individual trees and sidewalks are visible and mappable. [Figures: automated feature extraction results mapped over a NAIP color aerial photograph; classified aerial photo of downtown Raleigh, NC (red: impervious surface; dark green: trees; light green: grass; tan: soil).]
Development of WMS Capabilities to Support NASA Disasters Applications and App Development
NASA Astrophysics Data System (ADS)
Bell, J. R.; Burks, J. E.; Molthan, A.; McGrath, K. M.
2013-12-01
During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data in open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed end users to easily integrate the data into existing Decision Support Systems (DSS). Both the Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of this delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
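Serving imagery through a TMS or slippy-map tile stack rests on the standard Web-Mercator tiling math. A sketch of the XYZ tile index for a point, in the OSM-style scheme (TMS itself numbers the y axis from the bottom instead of the top):

```python
import math

def tile_for(lat, lon, zoom):
    """Index (x, y) of the Web-Mercator map tile containing a point,
    in the top-origin XYZ scheme used by tiled web maps."""
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n)
    return x, y
```

A tile cache pre-renders imagery into this grid, which is why TMS responds so quickly compared with rendering a full WMS GetMap on demand.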
History and use of remote sensing for conservation and management of federal lands in Alaska, USA
Markon, Carl
1995-01-01
Remote sensing has been used to aid land use planning efforts for federal public lands in Alaska since the 1940s. Four federal land management agencies (the U.S. Fish and Wildlife Service, U.S. Bureau of Land Management, U.S. National Park Service, and U.S. Forest Service) have used aerial photography and satellite imagery to document the extent, type, and condition of Alaska's natural resources. Aerial photographs have been used to collect detailed information over small to medium-sized areas. This standard management tool is obtainable using equipment ranging from hand-held 35-mm cameras to precision metric mapping cameras. Satellite data, equally important, provide synoptic views of landscapes, are digitally manipulable, and are easily merged with other digital databases. To date, over 109.2 million ha (72%) of Alaska's land cover has been mapped via remote sensing. This information has provided a base for conservation, management, and planning on federal public lands in Alaska.
NASA Astrophysics Data System (ADS)
Zhang, Wen-Yan; Lin, Chao-Yuan
2017-04-01
The Soil Conservation Service Curve Number (SCS-CN) method, originally developed by the USDA Natural Resources Conservation Service, is widely used to estimate direct runoff volume from rainfall. The runoff Curve Number (CN) parameter is based on the hydrologic soil group and land use factors. In Taiwan, the national land use maps were interpreted from aerial photos in 1995 and 2008. Rapid updating of post-disaster land use maps is limited by the high cost of production, so classification of satellite images is an alternative way to obtain a land use map. In this study, the Normalized Difference Vegetation Index (NDVI) in the Chen-You-Lan Watershed was derived from dry- and wet-season Landsat imagery during 2003-2008. Land covers were interpreted from the mean value and standard deviation of NDVI and categorized into four groups: forest, grassland, agriculture and bare land. The runoff volumes of typhoon events during 2005-2009 were then estimated using the SCS-CN method and verified against measured runoff data. The result showed a model efficiency coefficient of 90.77%. Therefore, estimating runoff using a land cover map classified from satellite images is practicable.
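The SCS-CN runoff estimate itself is a short closed-form calculation. In the metric form, the potential maximum retention is S = 25400/CN - 254 (mm), the initial abstraction is conventionally Ia = 0.2S, and direct runoff is Q = (P - Ia)² / (P - Ia + S) whenever rainfall P exceeds Ia:

```python
def scs_runoff(p_mm, cn):
    """Direct runoff depth (mm) from the SCS Curve Number method.
    S = 25400/CN - 254 is the potential retention (mm), Ia = 0.2*S the
    initial abstraction; no runoff occurs until rainfall exceeds Ia."""
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

Per-pixel CN values come from crossing the classified land cover map with the hydrologic soil group, which is why the NDVI-derived classification feeds directly into the runoff model.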
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA) by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC) is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
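An EPN-TAP query is an ordinary IVOA TAP request against a service's epn_core table, with ADQL as the query language. A sketch of the synchronous request form, with a placeholder service URL and a simple target-name filter:

```python
from urllib.parse import urlencode

def epntap_query(service_url, target):
    """URL of a synchronous TAP query selecting rows from an EPN-TAP
    service's epn_core table for a given target body."""
    adql = "SELECT TOP 100 * FROM epn_core WHERE target_name = '%s'" % target
    params = {"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql}
    return service_url + "/sync?" + urlencode(params)
```

Because every EPN-TAP service exposes the same epn_core columns, one client query can federate search across archives like the PSA and VESPA partners.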
Standardized acquisition, storing and provision of 3D enabled spatial data
NASA Astrophysics Data System (ADS)
Wagner, B.; Maier, S.; Peinsipp-Byma, E.
2017-05-01
In the area of working with spatial data, in addition to classic two-dimensional geometrical data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems; they generate a large workload, which is very costly. However, these expenses and costs can generally be reduced significantly by using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that runs on existing standards whenever possible. In this research, military image analysts are the preferred user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces and standards under discussion were analyzed. Since the GIS of the Fraunhofer IOSB used here already supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, special attention was paid to their standards. The most promising standard is the OGC 3DPS (3D Portrayal Service), with its profiles W3DS (Web 3D Service) and WVS (Web View Service). A demonstration system was created using a standardized workflow covering data acquisition, storage and provision, showing the benefit of our approach.
Maps for the nation: The current federal mapping establishment
North, G.W.
1983-01-01
The U.S. Government annually produces an estimated 53,000 new maps and charts and distributes about 160 million copies. A large number of these maps are produced under the national mapping program, a decentralized Federal/State cooperative approach to mapping the country at standard scales. Circular A-16, issued by the Office of Management and Budget in 1953 and revised in 1967, delegates the mapping responsibilities to various federal agencies. The U.S. Department of the Interior's Geological Survey is the principal federal agency responsible for implementing the national mapping program. Other major federal map producing agencies include the Departments of Agriculture, Commerce, Defense, Housing and Urban Development, and Transportation, and the Tennessee Valley Authority. To make maps and mapping information more readily available, the National Cartographic Information Center was established in 1974 and an expanded National Map Library Depository Program in 1981. The most recent of many technological advances made under the mapping program are in the areas of digital cartography and video disc and optical disc information storage systems. Future trends and changes in the federal mapping program will involve expanded information and customer service operations, further developments in the production and use of digital cartographic data, and consideration of a Federal Mapping Agency. © 1983.
NASA Astrophysics Data System (ADS)
Arnold, Jeffery E.
The purpose of this study was to determine the effect of four different design layouts of the New York State elementary science learning standards on user processing time and preference. Three newly developed layouts contained the same information as the standards core curriculum. In this study, the layout of the core guide is referred to as Book; the layouts of the new documents are referred to as Chart, Map, and Tabloid, based on the format used to convey content-hierarchy information. Most notably, all the new layouts feature larger page sizes, color, page tabs, and an icon-based navigation system (IBNS). A convenience sample of 48 New York State educators representing three educator types (16 pre-service teachers, 16 in-service teachers, and 16 administrators) participated in the study. After completing timed tasks accurately, participants scored each layout based on preference. Educator type and layout were the independent variables; process time and user preference were the dependent variables. A two-factor experimental design with Educator Type as the between-subjects variable and repeated measures on Layout, the within-subjects variable, showed a significant difference in process time for Educator Type and Layout. The main effect for Educator Type (F(2, 45) = 8.03, p < .001) was significant, with an observed power of .94 and an effect size of .26. Pair-wise comparisons for process time showed that pre-service teachers (p = .02) and administrators (p = .009) completed the assigned tasks more quickly than in-service teachers. The main effect for Layout (F(3, 135) = 4.47, p = .01) was also significant, with an observed power of .80 and an effect size of .09. Pair-wise comparisons showed that the newly developed Chart (p = .019) and Map (p = .032) layouts reduced overall process time compared to the existing state learning standards (Book). The Layout × Educator Type interaction was not significant.
The same two-factor experimental design on preference showed that the main effect for Layout (F(3, 135) = 28.43, p = .001) was significant. The observed power was 1.0, with an effect size of .39. Pair-wise comparisons for preference scores showed that the Chart (p = .001), Map (p = .001), and Tabloid (p = .001) layouts were preferred over the Book layout. The Layout × Educator Type interaction and the main effect for Educator Type were not significant. This study provides evidence that the newly developed design layouts improve the usability (as measured by process time and preference scores) of the New York State elementary science learning standards documents. Features of the new layout design, such as the IBNS, may provide a foundation for a visual language and aid users in navigating standards documents across grade levels and subject areas. Implications for the next generation of standards documents are presented.
rasdaman Array Database: current status
NASA Astrophysics Data System (ADS)
Merticariu, George; Toader, Alexandru
2015-04-01
rasdaman (Raster Data Manager) is a free, open-source Array Database Management System that stores and processes massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL queries. The key features of rasdaman are flexibility (datasets of any dimensionality can be processed with SQL queries), scalability (rasdaman's distributed architecture enables it to run seamlessly on cloud infrastructures, with performance increasing as computation resources are added), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (the legacy communication protocol has been replaced with a new one based on Google Protocol Buffers and ZeroMQ). Supported data include 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored as raw arrays alone, since location information is also needed to geoposition the contents correctly on Earth; such georeferenced arrays are defined by ISO 19123 as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman to serve the geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service extension provides zoom and pan navigation over images served by a map server; starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data.
Support is also provided for several extensions of this service: the Subsetting Extension, the Scaling Extension and, starting with version 9.1, the Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that lets users work with multi-dimensional coverages by abstracting away the specifics of the standard request definitions. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering of multi-dimensional raster coverages; rasdaman exposes this service through the WCS processing extension. Demonstrations are available online via the Earthlook website (earthlook.org), which presents use cases from a wide variety of application domains with the rasdaman system as the processing engine.
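As a sketch of what a WCPS request might look like, the snippet below wraps a declarative WCPS expression in a WCS ProcessCoverages KVP request. The coverage name, axis labels and endpoint are illustrative assumptions; actual names depend on the coverages a given rasdaman/Petascope server offers.

```python
from urllib.parse import urlencode

# Hypothetical coverage "AvgLandTemp" with Lat/Long/ansi (time) axes;
# the query extracts a one-year time series at one location as CSV.
wcps_query = (
    'for c in (AvgLandTemp) '
    'return encode(c[Lat(53.08), Long(8.80), ansi("2014-01":"2014-12")], "csv")'
)

# The WCS processing extension carries the query as a KVP parameter.
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
}
url = "https://example.org/rasdaman/ows?" + urlencode(params)
```

The query itself is evaluated server-side, so only the (typically small) encoded result travels back to the client.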
The flight telerobotic servicer: From functional architecture to computer architecture
NASA Technical Reports Server (NTRS)
Lumia, Ronald; Fiala, John
1989-01-01
After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.
EnviroAtlas - Metrics for Austin, TX
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). The layers in this web service depict ecosystem services at the census block group level for the community of Austin, Texas. These layers illustrate the ecosystems and natural resources that are associated with clean air (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanAir/MapServer); clean and plentiful water (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_CleanPlentifulWater/MapServer); natural hazard mitigation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_NaturalHazardMitigation/MapServer); climate stabilization (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_ClimateStabilization/MapServer); food, fuel, and materials (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_FoodFuelMaterials/MapServer); recreation, culture, and aesthetics (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_RecreationCultureAesthetics/MapServer); and biodiversity conservation (https://enviroatlas.epa.gov/arcgis/rest/services/Communities/ESC_ATX_BiodiversityConservation/MapServer), and factors that place stress on those resources. EnviroAtlas allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the conterminous United States as well as de
NASA Astrophysics Data System (ADS)
Morton, J. J.; Ferrini, V. L.
2015-12-01
The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for the hundreds of thousands of data files curated in the Marine-Geo Digital Library has yielded precise geographic bounds, enabling geographic searches to an extent not previously possible. All MGDS map interfaces use the web services of the Global Multi-Resolution Topography (GMRT) synthesis to display global basemap imagery and to dynamically provide depth values at the cursor location.
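A minimal sketch of consuming such a geographically constrained JSON metadata service is shown below; the field names and records are mock data invented for illustration, not the actual MGDS response schema.

```python
import json

# Mock payload in the spirit of a GeoWS-style JSON response; the field
# names ("west", "east", "south", "north") are illustrative only.
response = json.loads("""
[
  {"title": "Survey A", "west": -70.0, "east": -60.0, "south": 10.0, "north": 20.0},
  {"title": "Survey B", "west": 140.0, "east": 150.0, "south": -40.0, "north": -30.0}
]
""")

def intersects(record, west, south, east, north):
    """True if the record's bounding box overlaps the query box."""
    return not (record["east"] < west or record["west"] > east or
                record["north"] < south or record["south"] > north)

# Search a box in the western Atlantic: only Survey A overlaps it.
hits = [r["title"] for r in response if intersects(r, -80.0, 0.0, -50.0, 30.0)]
```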
Investigating Methods for Serving Visualizations of Vertical Profiles
NASA Astrophysics Data System (ADS)
Roberts, J. T.; Cechini, M. F.; Lanjewar, K.; Rodriguez, J.; Boller, R. A.; Baynes, K.
2017-12-01
Several geospatial web servers, web service standards, and mapping clients exist for the visualization of two-dimensional raster and vector-based Earth science data products. However, data products with a vertical component (i.e., vertical profiles) do not have the same mature set of technologies and pose a greater technical challenge when it comes to visualizations. There are a variety of tools and proposed standards, but no obvious solution that can handle the variety of visualizations found with vertical profiles. An effort is being led by members of the NASA Global Imagery Browse Services (GIBS) team to gather a list of technologies relevant to existing vertical profile data products and user stories. The goal is to find a subset of technologies, standards, and tools that can be used to build publicly accessible web services that can handle the greatest number of use cases for the widest audience possible. This presentation will describe results of the investigation and offer directions for moving forward with building a system that is capable of effectively and efficiently serving visualizations of vertical profiles.
Mapping and Modeling Web Portal to Advance Global Monitoring and Climate Research
NASA Astrophysics Data System (ADS)
Chang, G.; Malhotra, S.; Bui, B.; Sadaqathulla, S.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Rodriguez, L.; Law, E.
2011-12-01
Today, the principal investigators of NASA Earth Science missions develop their own software to manipulate, visualize, and analyze the data collected from Earth, space, and airborne observation instruments. There is very little, if any, collaboration among these principal investigators, due to the lack of collaborative tools that would allow these scientists to share data and results. At NASA's Jet Propulsion Laboratory (JPL), under the Lunar Mapping and Modeling Project (LMMP), we have built a web portal that exposes a set of common services allowing users to search, visualize, subset, and download lunar science data. Users also have access to a set of tools to visualize, analyze and annotate the data. These services are developed according to industry standards for data access and manipulation, such as REST and Open Geospatial Consortium (OGC) web services. As a result, users can access the datasets through custom-written applications or off-the-shelf applications such as Google Earth. Although it is currently used to store and process lunar data, this web portal infrastructure has been designed to support other solar system bodies such as asteroids and planets, including Earth. The infrastructure uses a combination of custom, commercial, and open-source software as well as off-the-shelf hardware and pay-by-use cloud computing services. The use of standardized web service interfaces facilitates platform- and application-independent access to the services and data. For instance, we have software clients for the LMMP portal that provide a rich browsing and analysis experience from a variety of platforms, including iOS and Android mobile platforms and large-screen multi-touch displays with 3-D terrain viewing functions.
The service-oriented architecture and design principles used to implement the portal make it reusable and scalable, and it could naturally be extended into a collaborative environment that enables scientists and principal investigators to share their research and analysis seamlessly. In addition, this extension would allow users to easily share their tools and data, and to enrich their mapping and analysis experience. In this talk, we will describe the advanced data management and portal technologies used to power this collaborative environment. We will further illustrate how this environment can enable, enhance and advance global monitoring and climate research.
Expanding the use of Scientific Data through Maps and Apps
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Zimble, D. A.; Herring, D.; Halpert, M.
2014-12-01
The importance of making scientific data more available cannot be overstated. A wealth of useful scientific data exists, and demand for it is only increasing; however, applying scientific data to practical uses poses several technical challenges. These challenges arise largely from 1) the complexity, variety and volume of scientific data and 2) the difficulty of applying and operating the techniques and tools needed to visualize and analyze the data. As a result, taking advantage of these data requires highly specialized skill sets that, in total, limit the use of scientific data in practical day-to-day decision-making. While these challenges are daunting, information technologies exist that can help mitigate some of them. Many organizations have for years enjoyed the benefits of modern service-oriented architectures (SOAs) for everyday enterprise tasks. The same approach can modernize how we share and access scientific data: much of the specialized handling and presentation of scientific data can be automated and executed appropriately by servers. We will discuss and show an approach for preparing file-based scientific data (e.g. GRIB, netCDF) for use in standards-based scientific web services. These web services encapsulate the logic needed to handle and describe scientific data through a variety of service types, including image, map, feature and geoprocessing services and their respective service methods. By combining these types of services and leveraging well-documented, modern web development APIs, we can focus our attention on the design and development of user-friendly maps and apps. Our scenario will include developing online maps through these services by integrating various forecast data from the Climate Forecast System (CFSv2).
This presentation showcases a collaboration between the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov portal, Climate Prediction Center and Esri, Inc. on the implementation of the ArcGIS platform, which is aimed at helping modernize scientific data access through a service oriented architecture.
NASA Astrophysics Data System (ADS)
Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.
2012-12-01
Expedited processing of imagery from NASA satellites for near-real time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format, which would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data is geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts them into image tiles, which are stored in a highly-optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made with the non-standard, but widely known, TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. 
In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization/browse clients for a wide range of specific audiences. Currently, EOSDIS provides an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other clients can also be developed using such tools as Google Earth, the Google Earth browser plugin, Esri's Adobe Flash/Flex client library, NASA World Wind, the Perceptive Pixel client, Esri's iOS client library, and OpenLayers for Mobile. The imagery browse capabilities of GIBS can be combined with other EOSDIS services (e.g. ECHO OpenSearch) via a client that ties them together, providing an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science-quality data from the entire data record of these EOS instruments.
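A WMTS REST tile request against such a tiled service might be assembled as sketched below. The tile-grid assumption (a 2:1 geographic pyramid whose resolution doubles each level) and the layer, date and TileMatrixSet identifiers are illustrative and should be verified against the service's GetCapabilities document.

```python
def tile_indices(lat, lon, level):
    """Tile row/col in a 2:1 geographic (EPSG:4326) tile pyramid.

    Assumes level 0 is two 180-degree tiles side by side, doubling
    each level; real TileMatrixSets must be checked against the
    service's GetCapabilities response.
    """
    span = 180.0 / (2 ** level)          # tile width/height in degrees
    col = int((lon + 180.0) // span)
    row = int((90.0 - lat) // span)
    return row, col

# Illustrative layer/date; the URL pattern mirrors WMTS REST tile access.
row, col = tile_indices(38.0, -77.0, 2)
url = (f"https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
       f"MODIS_Terra_CorrectedReflectance_TrueColor/default/2012-07-09/"
       f"EPSG4326_250m/2/{row}/{col}.jpg")
```

Because every tile has a fixed bounding box and address, tiles cache well in browsers and CDNs, which is the central performance advantage of WMTS/TWMS over on-demand WMS rendering.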
Towards Web-based representation and processing of health information
Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J
2009-01-01
Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, and population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is emerging as an important tool enabling health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Several Web-based health applications already generate dynamic maps; however, for people to fully interpret these maps they need a description of the data sources and of the methods used in the data analysis or statistical modeling. For representing health information through Web-mapping applications, a standard format is still lacking that accommodates all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors designed a HEalth Representation XML (HERXML) schema that consists of semantic (e.g., health activity description, data source description, and the statistical methodology used for analysis), geometric, and cartographic representations of health data. A case study was carried out on the development of Web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information using HERXML and Open Geospatial Consortium (OGC) services.
It offered a new solution for representing health data and an initial exploration of Web-based processing of health information. Conclusion The HERXML design proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The Web-based processing services used in this study give users a flexible way to select and apply processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding disease trends, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
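As a purely illustrative sketch of a record combining semantic, geometric and cartographic parts, the snippet below assembles an XML fragment in the spirit of HERXML. All element and attribute names are invented, since the abstract does not reproduce the actual schema.

```python
import xml.etree.ElementTree as ET

# Invented element names illustrating the three-part structure the
# abstract describes; the real HERXML vocabulary will differ.
record = ET.Element("healthRecord")

semantic = ET.SubElement(record, "semantic")
ET.SubElement(semantic, "activity").text = "asthma education program"
ET.SubElement(semantic, "dataSource").text = "community lung-health survey"
ET.SubElement(semantic, "method").text = "age-standardized rate"

geometry = ET.SubElement(record, "geometry")
ET.SubElement(geometry, "point", {"lat": "45.96", "lon": "-66.64"})

carto = ET.SubElement(record, "cartography")
ET.SubElement(carto, "symbol", {"shape": "circle", "color": "#cc0000"})

xml_text = ET.tostring(record, encoding="unicode")
```

Bundling the provenance (semantic part) with the geometry and symbology is what lets a Web-mapping client both draw the map and explain it.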
The National Map: New Viewer, Services, and Data Download
Dollison, Robert M.
2010-01-01
Managed by the U.S. Geological Survey's (USGS) National Geospatial Program, The National Map has transitioned data assets and viewer applications to a new visualization and product and service delivery environment, which includes an improved viewing platform, base map data and overlay services, and an integrated data download service. This new viewing solution expands upon the National Geospatial Intelligence Agency (NGA) Palanterra X3 viewer, providing a solid technology foundation for navigation and basic Web mapping functionality. Building upon the NGA viewer allows The National Map to focus on improving data services, functions, and data download capabilities. Initially released to the public at the 125th anniversary of mapping in the USGS on December 3, 2009, the viewer and services are now the primary distribution point for The National Map data. The National Map Viewer: http://viewer.nationalmap.gov
NASA Astrophysics Data System (ADS)
Zlinszky, András; Pfeifer, Norbert
2016-04-01
"Ecosystem services", vaguely defined as "nature's benefits to people", are a trending concept in ecology and conservation. Quantifying and mapping these services has long been demanded by both ecosystem science and environmental policy. The current state of the art is to use existing maps of land cover and assign average ecosystem service values to their unit areas. This approach has some major weaknesses: the concept of "ecosystem services" itself, the input land cover maps, and the value indicators. Such assessments often aim at valuing services in terms of human currency as a basis for decision-making, although this approach remains contested. Land cover maps used for ecosystem service assessments (typically the CORINE land cover product) are generated from continental-scale satellite imagery, with resolution in the range of hundreds of meters. In some rare cases, airborne sensors are used, with higher resolution but smaller covered area. Typically, general land cover classes are used instead of categories defined specifically for ecosystem service assessment. The value indicators are developed for and tested on small study sites but widely applied and adapted to distant sites (a process called benefit transfer) where local information may not be available. Upscaling is always problematic, since such measurements investigate areas much smaller than the output map unit. Nevertheless, remote sensing is still expected to play a major role in the conceptualization and assessment of ecosystem services. We propose that an improvement of several orders of magnitude in resolution and accuracy is possible through the application of airborne LIDAR, a measurement technique now routinely used to collect countrywide three-dimensional datasets with typically sub-meter resolution.
However, this requires a clear definition of the concept of ecosystem services and of the variables in focus: remote sensing can measure variables closely related to "ecosystem service potential", the ability of the local ecosystem to deliver various functions (water retention, carbon storage, etc.), but it cannot quantify how much of these is actually used by humans or what the estimated monetary value is. Due to its ability to measure both terrain relief and vegetation structure at high resolution, airborne LIDAR supports direct quantification of the properties of an ecosystem that lead to its delivering a given service (such as biomass, water retention, micro-climate regulation or habitat diversity). In addition, its high resolution allows direct calibration with field measurements: routine harvesting-based ecological measurements, local biodiversity indicator surveys and microclimate recordings all take place at the human scale and can be directly linked to the local value of LIDAR-based indicators at meter resolution. Therefore, if some field measurements with standard ecological methods are performed on site, the accuracy of LIDAR-based ecosystem service indicators can be rigorously validated. With this conceptual and technical approach, high-resolution ecosystem service assessments can be made with well-established credibility. These would consolidate the concept of ecosystem services and support both scientific research and evidence-based environmental policy at local and, as data coverage continually increases, continental scale.
NASA Astrophysics Data System (ADS)
Cooksley, Geraint; Arnaud, Alain; Banwell, Marie-Josée
2013-04-01
Increasingly, geohazard risk managers are looking to satellite observations as a promising option for supporting their risk management and mitigation strategies. The Terrafirma project, aimed at supporting civil protection agencies and local authorities in charge of risk assessment and mitigation, is a pan-European ground motion information service funded by the European Space Agency's Global Monitoring for Environment and Security initiative. Over 100 services have been delivered to organizations over the last ten years. Terrafirma promotes the use of Synthetic Aperture Radar Interferometry (InSAR) and Persistent Scatterer InSAR (PSI) within three thematic areas of terrain motion analysis: Tectonics, Flooding and Hydrogeology (groundwater, landslides and inactive mines), as well as the innovative Wide Area mapping service, aimed at measuring land deformation over very large areas. Terrafirma's thematic services are based on advanced satellite interferometry products, but they also exploit additional data sources, including non-EO data, coupled with expert interpretation specific to each thematic line. Based on the combination of satellite-derived ground-motion information products with expert motion interpretation, a portfolio of services addressing geo-hazard land motion issues was made available to users. Although not a theme in itself, the Wide Area mapping product constitutes the fourth quarter of the Terrafirma activities. The wide-area processing chain is nearly fully automatic and requires only a little operator interaction. The service offers operational PSI processing for wide-area mapping with millimetre accuracy of ground-deformation measurement at a scale of 1:250,000 (i.e. one cm on the map corresponds to 2.5 km on the ground) at country or continental level. The WAP was demonstrated using stripmap ERS data; however, it is foreseen to become a standard for the upcoming Sentinel-1 mission, which will be operated in Terrain Observation by Progressive Scan (TOPS) mode.
Within each theme, a series of products is offered. The Hydrogeology service delivers geo-information on hydrogeological hazards affecting urban areas, mountainous zones and infrastructures. Areas where groundwater has been severely exploited often experience subsidence as a result. Likewise, many European towns and cities built above abandoned and inactive mines experience strong ground deformation. The hydrogeology theme products study these phenomena, as well as slope instability in mountainous areas. The Tectonics service presents information on seismic hazards. The crustal block boundaries service provides users with information on terrain motion related to major and local faults, earthquake cycles, and vertical deformation sources. The vulnerability map service combines radar satellite data with in situ measurements to identify regions that may be vulnerable in the case of an earthquake. Within the Coastal Lowland and Flood Risk service, the flood plain hazard product assesses flood risk in coastal lowland areas and flood-prone river basins. The advanced subsidence mapping service combines PSI with levelling data and GPS to enable users to interpret subsidence maps within their geodetic reference systems. The flood defence monitoring service focuses on flood protection systems such as dykes and dams. Between 2003 and 2013, Terrafirma delivered services to 51 user organizations in over 25 countries. The archive of datasets is available to organisations involved in geohazard risk management and mitigation. Keywords: Persistent Scatterer Interferometry, Synthetic Aperture Radar, ground motion monitoring, Terrafirma project, multi-hazard analysis
OpenSearch technology for geospatial resources discovery
NASA Astrophysics Data System (ADS)
Papeschi, Fabrizio; Boldrini, Enrico; Mazzetti, Paolo
2010-05-01
In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around this same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation: it is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both the mass media and the scientific community. This explosive growth in popularity of Web 2.0 technologies like OpenSearch, and practical applications of Service Oriented Architecture (SOA), resulted in an increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA can be considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, and compose and use them according to their current needs. The great degree of similarity between SOA and Web 2.0 may be leading to a convergence between the two paradigms, although they also expose divergent elements, such as Web 2.0's support for human interaction as opposed to the typically machine-to-machine interaction of SOA. Following these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular taking advantage of the OpenSearch technology.
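To give a flavour of what OpenSearch standardizes: a search engine publishes a description document containing a parameterized URL template that clients fill in. Below is a minimal sketch of the client side, assuming a hypothetical endpoint; `{searchTerms}` is an OpenSearch core parameter, while `{geo:box}` and `{time:start}` come from the OpenSearch Geo and Time extensions.

```python
import re
from urllib.parse import quote

# Hypothetical URL template, as it might appear in an OpenSearch
# description document. Parameters ending in "?" are optional.
TEMPLATE = ("http://example.org/opensearch?q={searchTerms}"
            "&bbox={geo:box?}&start={time:start?}")

def fill_template(template, params):
    """Replace {name} / {name?} placeholders; optional ones default to ''."""
    def repl(match):
        name = match.group(1)
        return quote(str(params.get(name, "")), safe="")
    return re.sub(r"\{([^}?]+)\??\}", repl, template)

url = fill_template(TEMPLATE, {"searchTerms": "landslide",
                               "geo:box": "9.0,44.0,11.0,46.0"})
```

A client discovers the template once, then issues plain HTTP GETs; the response comes back as RSS or Atom, ready for syndication.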
A specific GI niche is represented by the OGC Catalog Service for the Web (CSW), part of the OGC Web Services (OWS) specification suite, which provides a set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed, providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called the "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called the "OpenSearch accessor" is added in order to access OpenSearch-compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mapping are required: query mapping and response element mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax.
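The query-mapping step can be sketched as rewriting simple OpenSearch-style parameters into an OGC Filter. Element names below follow OGC Filter Encoding 1.1, but the skeleton (namespace declarations omitted for brevity) is illustrative, not GI-cat's actual implementation.

```python
# Sketch: map OpenSearch-style free text and bounding box to an OGC Filter.
def opensearch_to_ogc_filter(terms=None, bbox=None):
    parts = []
    if terms:
        # Free-text search becomes a PropertyIsLike over an AnyText queryable.
        parts.append(
            "<ogc:PropertyIsLike wildCard='*' singleChar='?' escapeChar='\\'>"
            "<ogc:PropertyName>AnyText</ogc:PropertyName>"
            f"<ogc:Literal>*{terms}*</ogc:Literal></ogc:PropertyIsLike>")
    if bbox:  # bbox as (west, south, east, north)
        w, s, e, n = bbox
        parts.append(
            "<ogc:BBOX><ogc:PropertyName>ows:BoundingBox</ogc:PropertyName>"
            f"<gml:Envelope><gml:lowerCorner>{w} {s}</gml:lowerCorner>"
            f"<gml:upperCorner>{e} {n}</gml:upperCorner></gml:Envelope>"
            "</ogc:BBOX>")
    if not parts:
        return "<ogc:Filter/>"
    body = parts[0] if len(parts) == 1 else (
        "<ogc:And>" + "".join(parts) + "</ogc:And>")
    return f"<ogc:Filter>{body}</ogc:Filter>"
```

Multiple constraints are conjoined with `ogc:And`, mirroring how a mediator must compose the richer CSW query from a flat OpenSearch parameter list.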
The GI-cat internal data model is based on the ISO 19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, in order to be presented they need to be translated from the GI-cat internal data model to the above-mentioned syndication formats; the mapping processing is bidirectional. When GI-cat is used to access OpenSearch-compliant services, the CSW query must be mapped to the OpenSearch query, and the response elements must be translated according to the GI-cat internal data model. As a result of these extensions, GI-cat provides a user-friendly facade to the complex CSW interface, thus enabling it to be queried, for example, using a browser toolbar.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. 
A framework of standard interfaces for accessing services on any archive system, which would benefit archive users and suppliers alike, is proposed.
A Map-Based Service Supporting Different Types of Geographic Knowledge for the Public
Zhou, Mengjie; Wang, Rui; Tian, Jing; Ye, Ning; Mai, Shumin
2016-01-01
The internet enables the rapid and easy creation, storage, and transfer of knowledge; however, services that transfer geographic knowledge and facilitate the public understanding of geographic knowledge are still underdeveloped to date. Existing online maps (or atlases) can support limited types of geographic knowledge. In this study, we propose a framework for map-based services to represent and transfer different types of geographic knowledge to the public. A map-based service provides tools to ensure the effective transfer of geographic knowledge. We discuss the types of geographic knowledge that should be represented and transferred to the public, and we propose guidelines and a method to represent various types of knowledge through a map-based service. To facilitate the effective transfer of geographic knowledge, tools such as auxiliary background knowledge and auxiliary map-reading tools are provided through interactions with maps. An experiment conducted to illustrate our idea and to evaluate the usefulness of the map-based service is described; the results demonstrate that the map-based service is useful for transferring different types of geographic knowledge. PMID:27045314
Hammoudeh, Mohammad; Newman, Robert; Dennett, Christopher; Mount, Sarah; Aldabbas, Omar
2015-01-01
This paper presents a distributed information extraction and visualisation service, called the mapping service, for maximising information return from large-scale wireless sensor networks. Such a service would greatly simplify the production of higher-level, information-rich representations suitable for informing other network services and the delivery of field information visualisations. The mapping service utilises a blend of inductive and deductive models to map sense data accurately using externally available knowledge. It utilises the special characteristics of the application domain to render visualisations in a map format that are a precise reflection of the concrete reality. This service is suitable for visualising an arbitrary number of sense modalities and is capable of visualising multiple independent types of sense data, overcoming the limitations of generating visualisations from a single sense modality. Furthermore, the mapping service responds dynamically to changes in environmental conditions that may affect visualisation performance, by continuously updating the application domain model in a distributed manner. Finally, a distributed self-adaptation function is proposed with the goal of saving power and generating more accurate data visualisations. We conduct comprehensive experimentation to evaluate the performance of our mapping service and show that it achieves low communication overhead, produces maps of high fidelity, and further minimises the mapping predictive error dynamically by integrating the application domain model in the mapping service. PMID:26378539
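As one concrete example of the kind of inductive model such a mapping service might blend in, here is an inverse-distance-weighting (IDW) interpolation of scattered sensor readings onto an arbitrary query point. This is purely illustrative; it is not the paper's actual mapping algorithm, and the sample readings are invented.

```python
# IDW: estimate a field value at (x, y) from scattered sensor samples.
def idw_estimate(samples, x, y, power=2.0):
    """samples: list of (sx, sy, value); returns the estimate at (x, y)."""
    num = den = 0.0
    for sx, sy, value in samples:
        d2 = (sx - x) ** 2 + (sy - y) ** 2
        if d2 == 0.0:
            return value  # exactly on a sensor: use its reading directly
        w = 1.0 / d2 ** (power / 2.0)  # weight falls off with distance^power
        num += w * value
        den += w
    return num / den

# Two invented temperature readings at known node positions.
readings = [(0, 0, 10.0), (10, 0, 20.0)]
```

In a distributed setting each cluster would run such a model over its local neighbourhood, exchanging only the fitted summaries rather than raw samples, which is where the communication savings come from.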
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mapping to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
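The crowdsourcing idea can be pictured as aggregating the mapping proposals that different organizations submit for the same local term. The majority-vote rule and agreement score below are an assumption for illustration; the actual CrowdMapping model is more elaborate (it also weighs diversity and user ratings).

```python
from collections import Counter

# Each organization proposes an ICD-10 code for one local term; agreement
# across sources yields a consensus code plus a simple confidence score.
def consensus_mapping(proposals):
    """proposals: list of ICD-10 codes proposed for one local term."""
    counts = Counter(proposals)
    code, votes = counts.most_common(1)[0]
    return code, votes / len(proposals)  # (consensus code, agreement ratio)
```

A low agreement ratio would flag the term for human review, which is where the interactive rating loop described above comes in.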
Cyberinfrastructure for the digital brain: spatial standards for integrating rodent brain atlases
Zaslavsky, Ilya; Baldock, Richard A.; Boline, Jyl
2014-01-01
Biomedical research entails capture and analysis of massive data volumes and new discoveries arise from data-integration and mining. This is only possible if data can be mapped onto a common framework such as the genome for genomic data. In neuroscience, the framework is intrinsically spatial and based on a number of paper atlases. This cannot meet today's data-intensive analysis and integration challenges. A scalable and extensible software infrastructure that is standards based but open for novel data and resources, is required for integrating information such as signal distributions, gene-expression, neuronal connectivity, electrophysiology, anatomy, and developmental processes. Therefore, the International Neuroinformatics Coordinating Facility (INCF) initiated the development of a spatial framework for neuroscience data integration with an associated Digital Atlasing Infrastructure (DAI). A prototype implementation of this infrastructure for the rodent brain is reported here. The infrastructure is based on a collection of reference spaces to which data is mapped at the required resolution, such as the Waxholm Space (WHS), a 3D reconstruction of the brain generated using high-resolution, multi-channel microMRI. The core standards of the digital atlasing service-oriented infrastructure include Waxholm Markup Language (WaxML): XML schema expressing a uniform information model for key elements such as coordinate systems, transformations, points of interest (POI)s, labels, and annotations; and Atlas Web Services: interfaces for querying and updating atlas data. The services return WaxML-encoded documents with information about capabilities, spatial reference systems (SRSs) and structures, and execute coordinate transformations and POI-based requests. Key elements of INCF-DAI cyberinfrastructure have been prototyped for both mouse and rat brain atlas sources, including the Allen Mouse Brain Atlas, UCSD Cell-Centered Database, and Edinburgh Mouse Atlas Project. 
PMID:25309417
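The coordinate transformations such an atlasing service executes can be pictured as applying a spatial transform to a POI when moving it between reference spaces. The sketch below assumes a simple affine transform with an invented matrix; real Waxholm Space transforms are published by the INCF DAI services, not hard-coded like this.

```python
# Apply a 3x4 affine matrix (rows of [a, b, c, t]) to an (x, y, z) POI.
def affine_transform(matrix, point):
    x, y, z = point
    return tuple(a * x + b * y + c * z + t for a, b, c, t in matrix)

# Hypothetical atlas -> WHS transform: isotropic scaling plus a translation.
atlas_to_whs = [(0.5, 0, 0, 1.0), (0, 0.5, 0, 2.0), (0, 0, 0.5, 3.0)]
```

Chaining such transforms through a shared canonical space (WHS) is what lets POIs registered against one atlas be queried against another.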
NASA Astrophysics Data System (ADS)
Lucido, J. M.
2013-12-01
Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. 
Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards Organization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal and Water Quality Portal are three examples of best practices when implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
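The OGC web services these portals expose are, at bottom, plain HTTP requests with standardized parameters. Here is a sketch of building a WFS GetFeature request; the parameter names follow WFS 1.1, but the endpoint and feature type name are hypothetical placeholders, not actual NGWMN service values.

```python
from urllib.parse import urlencode

# Build a WFS 1.1 GetFeature URL for a given feature type and bounding box.
def wfs_getfeature_url(endpoint, typename, bbox=None, max_features=100):
    params = {"service": "WFS", "version": "1.1.0",
              "request": "GetFeature", "typename": typename,
              "maxFeatures": max_features}
    if bbox:  # (west, south, east, north)
        params["bbox"] = ",".join(str(v) for v in bbox)
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature_url("http://example.org/wfs", "gw:well",
                         bbox=(-90, 40, -89, 41))
```

Because the parameters are standardized, the same client code works against any compliant provider, which is exactly the interoperability the portals rely on when aggregating distributed sources.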
Comprehensive Evaluation and Analysis of China's Mainstream Online Map Service Websites
NASA Astrophysics Data System (ADS)
Zhang, H.; Jiang, J.; Huang, W.; Wang, Q.; Gu, X.
2012-08-01
With the flourishing development of China's Internet market, demand for map services from all kinds of users is rising continually, and the market contains tremendous commercial interests. Many Internet giants have become involved in the field of online map services and have defined it as an important strategic product of the company. The main purpose of this research is to evaluate these online map service websites comprehensively with a model, and to analyse the problems according to the evaluation results. Some corresponding solving measures are then proposed, which provides theoretical and practical guidance for the future development of the fiercely competitive online map websites. The research consists of three stages: (a) the mainstream online map service websites in China are introduced and their present situation is analysed through visits, investigation, consultation, analysis and research; (b) a comprehensive evaluation quota system for online map service websites is established from the view of functions, layout, interaction design, colour, positioning and so on, combined with data indexes such as timeliness, accuracy, objectivity and authority; (c) a comprehensive evaluation of these online map service websites is carried out based on a fuzzy evaluation mathematical model, which solves the difficulty of measuring the map websites quantitatively.
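The fuzzy comprehensive evaluation model referred to in stage (c) combines a weight vector over evaluation criteria with a membership matrix over rating grades, here using the common weighted-average operator. The weights and membership values below are invented purely to show the mechanics.

```python
# Fuzzy comprehensive evaluation: weights over criteria x membership matrix
# (rows = criteria, columns = rating grades) -> overall grade memberships.
def fuzzy_evaluate(weights, membership):
    grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(grades)]

weights = [0.4, 0.35, 0.25]        # e.g. function, layout, interaction
membership = [                     # grades: excellent / good / poor
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.3, 0.5, 0.2],
]
scores = fuzzy_evaluate(weights, membership)  # membership per grade
```

The website's final rating is the grade with the highest overall membership, which turns subjective, multi-criteria judgments into a single comparable score.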
The SOOS Data Portal, providing access to Southern Oceans data
NASA Astrophysics Data System (ADS)
Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar
2013-04-01
The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Oceans that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools, and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web services protocols are used for serving of data via the internet. These include Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data:
- a Search link to the metadata catalogue enables search and discovery by simple text search, by geographic area, temporal extent, keyword, parameter, organisation, or by any combination of these, allowing users to gain access to further information and/or the data for download; searches can also be restricted to items which have data to download, attached map layers, or both;
- a Map interface for discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, and view animated timeseries datastreams;
- data can be easily accessed and downloaded, including directly from OPeNDAP/THREDDS servers.
The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process, and the initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems - with the ability to integrate layers between themes. The portal is in its infancy (pilot launched January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.
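The WMS layers such a portal displays are fetched with GetMap requests. The sketch below uses WMS 1.3.0 parameter names; the endpoint and layer name are hypothetical, not actual AODN values.

```python
from urllib.parse import urlencode

# Build a WMS 1.3.0 GetMap URL for one layer over a bounding box.
def wms_getmap_url(endpoint, layer, bbox, size=(512, 512), crs="EPSG:4326"):
    params = {"SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
              "LAYERS": layer, "STYLES": "", "CRS": crs,
              "BBOX": ",".join(str(v) for v in bbox),
              "WIDTH": size[0], "HEIGHT": size[1],
              "FORMAT": "image/png", "TRANSPARENT": "TRUE"}
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("http://example.org/wms", "sea_ice_extent",
                     (-180, -90, 180, -30))
```

Changing layer style or opacity, as the Map interface allows, amounts to varying these request parameters; the server renders the image, so clients stay thin.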
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and USGIN site), and user-created tools and scripts. The user-friendly map-centric web-based application has been created to support finding, visualizing, mapping, and acquisition of data based on topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third party software applications such as GoogleEarth, UDig, and ArcGIS. 
A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
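A node exposed through a CSW endpoint, as GINstack provides, is searched with GetRecords requests. The sketch below builds a CSW 2.0.2 GetRecords POST body for a keyword search; element and namespace names follow the OGC CSW 2.0.2 specification, but the skeleton is illustrative, not a verbatim NGDS request.

```python
# Build a minimal CSW 2.0.2 GetRecords body with an AnyText keyword filter.
def csw_getrecords_body(keyword, max_records=10):
    return (
        '<csw:GetRecords service="CSW" version="2.0.2"'
        ' resultType="results" maxRecords="%d"'
        ' xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"'
        ' xmlns:ogc="http://www.opengis.net/ogc">'
        '<csw:Query typeNames="csw:Record">'
        '<csw:ElementSetName>summary</csw:ElementSetName>'
        '<csw:Constraint version="1.1.0"><ogc:Filter>'
        '<ogc:PropertyIsLike wildCard="%%" singleChar="_" escapeChar="\\">'
        '<ogc:PropertyName>AnyText</ogc:PropertyName>'
        '<ogc:Literal>%%%s%%</ogc:Literal>'
        '</ogc:PropertyIsLike>'
        '</ogc:Filter></csw:Constraint></csw:Query></csw:GetRecords>'
    ) % (max_records, keyword)
```

POSTing such a body to any compliant CSW endpoint returns summary metadata records, which is how a federated system can harvest or search its member nodes uniformly.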
Internet-based information system of digital geological data providing
NASA Astrophysics Data System (ADS)
Yuon, Egor; Soukhanov, Mikhail; Markov, Kirill
2015-04-01
One of the problems faced by the Russian Federal Agency for Mineral Resources is providing access to the geological information delivered during field operations funded by the federal budget. This information should be presented in an up-to-date, standardized form. Previously, the leading means of presenting geological information were paper geological maps, cross-sections, borehole diagrams, reports, etc. Technologies for database construction, including distributed databases, for building distributed information-analytical systems, and Internet technologies are now developing intensively. Most geological organizations create their own information systems without any possibility of integration with other systems of the same orientation. In 2012, specialists of VNIIgeosystem, together with specialists of VSEGEI, started a large project: creating a system for providing digital geological materials using modern and prospective Internet technologies. The system is based on a web server and a set of special programs which allow users to efficiently obtain rasterized and vectorized geological materials. These materials include: geological maps at scale 1:1M, geological maps at scales 1:200 000 and 1:2 500 000, fragments of seamless 1:1M geological maps, structural zoning maps inside the seamless fragments, legends for the State geological maps at 1:200 000 and 1:1 000 000, full author's sets of maps, and current materials for the international projects «Atlas of geological maps for Circumpolar Arctic scale 1:5 000 000» and «Atlas of Geologic maps of central Asia and adjacent areas scale 1:2 500 000». The most interesting and functional block of the system is the block providing structured and well-formalized geological vector materials, based on the Gosgeolkart database (NGKIS), managed by Oracle; Internet access is supported by the NGKIS web subsystem, which is currently based on the MGS-Framework platform developed by VNIIgeosystem.
One of the leading elements is the web service, which realizes the interaction of all parts of the system and controls the whole path of a request from the user to the database and back, adapted to the GeoSciML and EarthResourceML view. The experience of creating this Internet-based information system for providing digital geological data, as well as previous work, including the development of the web service of the NGKIS system, shows that a technological realization of presenting Russian geological-cartographical data using international standards is possible. Some difficulties may arise during implementation, associated with the depth of the geological material: the Russian geological information model is deeper and wider than foreign ones. This is the main problem of using international standards and formats: presentation of Russian geological data is possible only by decreasing the level of detail. However, this problem becomes less important if the service also publishes Russian vocabularies not associated with the international vocabularies; in this case, the international format can serve as an interchange format for exchanging data between Russian users. Integration into international projects requires developing correlation schemes between Russian and foreign classifiers and vocabularies.
BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources.
Lim, Jeongheui; Kim, Sang-Yoon; Kim, Sungmin; Eo, Hae-Seok; Kim, Chang-Bae; Paek, Woon Kee; Kim, Won; Bhak, Jong
2009-12-03
DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly when DNA sequencing technology is cheaply available. There are many nations in Asia with biodiversity resources that need to be mapped and registered in databases. We have built a general DNA barcode data processing system, BioBarcode, with open-source software; it is a general-purpose database and server that uses MySQL RDBMS 5.0, BLAST2, and the Apache httpd server. An exemplary BioBarcode database has around 11,300 specimen entries (including GenBank data) and registers the biological species to map their genetic relationships. The BioBarcode database contains a chromatogram viewer which improves performance in DNA sequence analyses. Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as standardization, depository, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system: http://www.asianbarcode.org.
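The identification step can be pictured as a best-match search of a query sequence against reference barcodes. BioBarcode uses BLAST2 for this; the toy sketch below substitutes position-wise identity on equal-length sequences purely for illustration, with invented reference data.

```python
# Identify a query barcode by its best identity match among references.
def identify(query, references):
    """references: dict of species -> barcode sequence (same length)."""
    def identity(a, b):
        return sum(x == y for x, y in zip(a, b)) / len(a)
    best = max(references, key=lambda sp: identity(query, references[sp]))
    return best, identity(query, references[best])

# Invented 8-bp "barcodes"; real barcodes (e.g. COI) run to ~650 bp.
refs = {"species_a": "ACGTACGT", "species_b": "ACGTTTTT"}
```

A real pipeline would also apply an identity threshold before asserting a species-level match, since a best hit below the threshold more likely indicates an unregistered species.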
Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services
NASA Astrophysics Data System (ADS)
Neiers, M.; Dwyer, J.; Neiers, S.
2008-12-01
The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.
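The dynamic KML generation used by the UPE prototypes boils down to emitting placemarks for scene footprints that virtual-globe clients can render. A sketch, with invented coordinates and scene identifier, following OGC KML element names:

```python
# Emit a KML Placemark whose Polygon outlines one scene footprint.
def scene_placemark(scene_id, corners):
    """corners: list of (lon, lat) ring vertices; first should equal last."""
    coords = " ".join(f"{lon},{lat},0" for lon, lat in corners)
    return ('<Placemark>'
            f'<name>{scene_id}</name>'
            '<Polygon><outerBoundaryIs><LinearRing>'
            f'<coordinates>{coords}</coordinates>'
            '</LinearRing></outerBoundaryIs></Polygon>'
            '</Placemark>')

footprint = [(-104, 39), (-103, 39), (-103, 40), (-104, 40), (-104, 39)]
```

Wrapping many such placemarks in a KML Document, generated on the fly from an archive query, is what lets a client like Google Earth browse the archive interactively.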
Côté, Richard G; Jones, Philip; Martens, Lennart; Kerrien, Samuel; Reisinger, Florian; Lin, Quan; Leinonen, Rasko; Apweiler, Rolf; Hermjakob, Henning
2007-10-18
Each major protein database uses its own conventions when assigning protein identifiers. Resolving the various, potentially unstable, identifiers that refer to identical proteins is a major challenge. This is a common problem when attempting to unify datasets that have been annotated with proteins from multiple data sources or querying data providers with one flavour of protein identifiers when the source database uses another. Partial solutions for protein identifier mapping exist but they are limited to specific species or techniques and to a very small number of databases. As a result, we have not found a solution that is generic enough and broad enough in mapping scope to suit our needs. We have created the Protein Identifier Cross-Reference (PICR) service, a web application that provides interactive and programmatic (SOAP and REST) access to a mapping algorithm that uses the UniProt Archive (UniParc) as a data warehouse to offer protein cross-references based on 100% sequence identity to proteins from over 70 distinct source databases loaded into UniParc. Mappings can be limited by source database, taxonomic ID and activity status in the source database. Users can copy/paste or upload files containing protein identifiers or sequences in FASTA format to obtain mappings using the interactive interface. Search results can be viewed in simple or detailed HTML tables or downloaded as comma-separated values (CSV) or Microsoft Excel (XLS) files suitable for use in a local database or a spreadsheet. Alternatively, a SOAP interface is available to integrate PICR functionality in other applications, as is a lightweight REST interface. We offer a publicly available service that can interactively map protein identifiers and protein sequences to the majority of commonly used protein databases. Programmatic access is available through a standards-compliant SOAP interface or a lightweight REST interface. 
The PICR interface, documentation and code examples are available at http://www.ebi.ac.uk/Tools/picr.
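A lightweight REST query against PICR might be composed as below. The method path and parameter names are assumptions for illustration; the authoritative REST contract is in the PICR documentation at the URL above.

```python
from urllib.parse import urlencode

def picr_accession_query(accession, target_dbs):
    """Compose a REST query asking PICR for cross-references of one
    protein identifier in the given target databases.

    The path and parameter names below are assumed, not verified
    against the live service.
    """
    base = "http://www.ebi.ac.uk/Tools/picr/rest/getUPIForAccession"
    params = [("accession", accession)] + [("database", db) for db in target_dbs]
    return base + "?" + urlencode(params)

# Hypothetical accession and target databases, for illustration.
url = picr_accession_query("P29375", ["SWISSPROT", "ENSEMBL"])
```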
Caching strategies for improving performance of web-based Geographic applications
NASA Astrophysics Data System (ADS)
Liu, M.; Brodzik, M.; Collins, J. A.; Lewis, S.; Oldenburg, J.
2012-12-01
The NASA Operation IceBridge mission collects airborne remote sensing measurements to bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) mission and the upcoming ICESat-2 mission. The IceBridge Data Portal from the National Snow and Ice Data Center provides an intuitive web interface for accessing IceBridge mission observations and measurements. Scientists and users usually do not have knowledge about the individual campaigns but are interested in data collected in a specific place. We have developed a high-performance map interface that allows users to quickly zoom to an area of interest and see any Operation IceBridge overflights. The map interface consists of two layers: a base map layer on which the user can pan and zoom, and a flight line layer overlaying it that shows all campaign missions intersecting the current map view. The user can click on the flight campaigns and download the data as needed. The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. Web Feature Service (WFS) provides an interface for requesting geographical features across the web using platform-independent calls. OpenLayers provides vector support (points, polylines and polygons) to build a WMS/WFS client for displaying both layers on the screen. MapServer, an open source development environment for building spatially enabled internet applications, serves the WMS and WFS spatial data to OpenLayers. Early releases of the portal displayed unacceptably poor load-time performance for flight lines and the base map tiles, caused by long response times from the map server in generating all map tiles and flight line vectors. We resolved the issue by implementing various caching strategies on top of the WMS and WFS services, including the use of Squid (www.squid-cache.org) to cache frequently used content.
Our presentation includes the architectural design of the application, and how we use OpenLayers, WMS and WFS with Squid to build a responsive web application capable of efficiently displaying geospatial data to allow the user to quickly interact with the displayed information. We describe the design, implementation and performance improvement of our caching strategies, and the tools and techniques developed to assist our data caching strategies.
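The caching idea the abstract describes, serving a previously rendered response instead of re-invoking the map server, can be modeled in-process with a memoizing cache. This is an analogy only: the function name and tile scheme are invented, and a real deployment would cache HTTP responses in Squid rather than Python objects.

```python
import functools
import time

@functools.lru_cache(maxsize=1024)
def render_tile(layer: str, zoom: int, x: int, y: int) -> str:
    """Stand-in for an expensive MapServer WMS render (illustrative names).

    The cache key is the full request tuple; a repeat request returns the
    cached result without paying the render cost again.
    """
    time.sleep(0.01)  # simulate render latency
    return f"tile:{layer}/{zoom}/{x}/{y}"

first = render_tile("flights", 4, 3, 5)   # miss: rendered
again = render_tile("flights", 4, 3, 5)   # hit: served from cache
```

The same request-keyed principle applies to Squid: identical WMS/WFS GET URLs map to identical cached responses, which is why deterministic, cache-friendly request URLs matter for performance.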
Improving Data Catalogs with Free and Open Source Software
NASA Astrophysics Data System (ADS)
Schweitzer, R.; Hankin, S.; O'Brien, K.
2013-12-01
The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one which is of low quality. In this presentation, we will be discussing the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to crawl and analyze catalogs and to build the clean data catalog, as well as our efforts to help data providers improve their data catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM).
We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor that describes the quality of data catalogs. Using this rubric, in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
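Crawling a THREDDS catalog as described above amounts to parsing its XML for advertised services and datasets. The sketch below uses the real InvCatalog 1.0 namespace but a synthetic, simplified catalog; a production crawler would also follow nested `catalogRef` links and check service completeness per dataset.

```python
import xml.etree.ElementTree as ET

# Actual namespace of THREDDS InvCatalog 1.0 documents.
NS = {"t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"}

# Synthetic minimal catalog, for illustration only.
SAMPLE = """<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <service name="all" serviceType="Compound" base="">
    <service name="odap" serviceType="OPENDAP" base="/thredds/dodsC/"/>
    <service name="wms" serviceType="WMS" base="/thredds/wms/"/>
  </service>
  <dataset name="sst" urlPath="sst.nc"/>
  <dataset name="chl" urlPath="chl.nc"/>
</catalog>"""

def crawl(catalog_xml):
    """Return the service types and dataset names one catalog advertises."""
    root = ET.fromstring(catalog_xml)
    services = {s.get("serviceType") for s in root.findall(".//t:service", NS)}
    datasets = [d.get("name") for d in root.findall(".//t:dataset", NS)]
    return services, datasets

services, datasets = crawl(SAMPLE)
```

A cleaning tool would emit into the "clean" catalog only those datasets whose advertised services actually respond and conform.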
Method and system for a network mapping service
Bynum, Leo
2017-10-17
A method and system of publishing a map includes providing access to a plurality of map data files or mapping services between at least one publisher and at least one subscriber; defining a map in a map context comprising parameters and descriptors sufficient to substantially duplicate the map by reference to mutually accessible data or mapping services; publishing the map to a channel in a table file on a server; accessing the channel by at least one subscriber; transmitting the map context from the server to the at least one subscriber; executing the map context by the at least one subscriber; and generating the map on display software associated with the at least one subscriber by reconstituting the map from the references and other data in the map context.
2017-01-31
mapping critical business workflows and then optimizing them with appropriate evolutionary technology choices is often called "Product Line Architecture... technologies, products, services, and processes, and the USG evaluates them against its 360° requirements objectives, and refines them as appropriate, clarity... in rapidly evolving technological domains (e.g., by applying best commercial practices for open standard product line architecture). An MP might be
Pathfinder. Volume 8, Number 6, November/December 2010
2010-12-01
transferring information between multiple systems. Nevertheless, without an end-to-end TCPED process and the associated standards, policies and equipment in... products with partners whose information technology systems vary and are not compatible with those of the NSG, NGA and the U.S. Department of... Pacific. ARF DReaMS is based on Web service technology, where traditional maps, data and any relevant geospatial information are made available
Data Mining Web Services for Science Data Repositories
NASA Astrophysics Data System (ADS)
Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.
2006-12-01
The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services are being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).
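The cloud-mask scenario at the end of the abstract rests on unsupervised clustering of image brightness. A toy sketch of the idea, not the ADaM implementation, is a one-dimensional two-cluster k-means separating bright (cloud) from dark (surface) pixels; the pixel values below are invented.

```python
def two_means(values, iters=20):
    """Tiny 1-D k-means with k=2, seeded at the data extremes."""
    centers = [min(values), max(values)]
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # bool index: False -> cluster 0 (dark), True -> cluster 1 (bright)
            groups[abs(v - centers[0]) > abs(v - centers[1])].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

pixels = [12, 15, 11, 14, 240, 235, 250, 13, 238]  # toy brightness values
centers = two_means(pixels)
# Cloud mask: True where a pixel is closer to the bright cluster center.
mask = [abs(v - centers[0]) > abs(v - centers[1]) for v in pixels]
```

Running such an algorithm where the data live, as the abstract argues, avoids shipping full-resolution imagery over the network just to compute a small mask.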
Automating the selection of standard parallels for conic map projections
NASA Astrophysics Data System (ADS)
Šavrič, Bojan; Jenny, Bernhard
2016-05-01
Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes. Conic projections are appropriate for these cases because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. There also exist more sophisticated methods that determine standard parallels such that distortion in the mapped area is minimized. These methods are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area. The spatial extent is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3,825 maps, each with a different spatial extent and computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.
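One of the simple rules of thumb the abstract mentions places each standard parallel inside the mapped latitude band by a fixed fraction of its span, classically one sixth. A sketch of that rule (not the paper's polynomial model) follows; the conterminous-US latitude band is used as an example input.

```python
def standard_parallels(lat_south, lat_north, k=6.0):
    """One-sixth rule of thumb for conic projections: place the two
    standard parallels 1/k of the latitude span inside the band edges.
    k=6 is the classic choice; the polynomial model in the article
    replaces this heuristic with extent-dependent expressions."""
    span = lat_north - lat_south
    return lat_south + span / k, lat_north - span / k

# Example: latitude band roughly covering the conterminous United States.
sp1, sp2 = standard_parallels(24.0, 49.0)
```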
MapEdit: solution to continuous raster map creation
NASA Astrophysics Data System (ADS)
Rančić, Dejan; Djordjević-Kajan, Slobodanka
2003-03-01
The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps that can be used as backgrounds in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as are the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with nearest-neighbor resampling, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales created with our software and methodology has been undertaken, and the results are presented in the paper. For quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment show that our maps meet all three standards.
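The nearest-neighbor resampling step the abstract names can be sketched generically: each output pixel is mapped through a linear transform into source coordinates and takes the value of the nearest source pixel. The transform below is a made-up 2x upscale, not MapEdit's control-point-derived transform.

```python
def nearest_neighbor_resample(src, transform, out_w, out_h):
    """Resample grid `src` (list of rows) into an out_w x out_h grid.

    `transform` maps an output (col, row) to source (x, y); each output
    pixel takes the value of the nearest (rounded, clamped) source pixel.
    """
    h, w = len(src), len(src[0])
    out = []
    for r in range(out_h):
        row = []
        for c in range(out_w):
            x, y = transform(c, r)
            sx = min(max(int(round(x)), 0), w - 1)
            sy = min(max(int(round(y)), 0), h - 1)
            row.append(src[sy][sx])
        out.append(row)
    return out

src = [[1, 2], [3, 4]]
# Hypothetical linear transform: 2x upscale (output pixel -> half coords).
double = nearest_neighbor_resample(src, lambda c, r: (c / 2, r / 2), 4, 4)
```

Nearest-neighbor is fast and preserves original pixel values, which suits scanned-map mosaics, at the cost of blocky edges compared with bilinear or cubic resampling.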
FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)
,
2006-01-01
PLEASE NOTE: This now-approved 'FGDC Digital Cartographic Standard for Geologic Map Symbolization (PostScript Implementation)' officially supersedes its earlier (2000) Public Review Draft version (see 'Earlier Versions of the Standard' below). In August 2006, the Digital Cartographic Standard for Geologic Map Symbolization was officially endorsed by the Federal Geographic Data Committee (FGDC) as the national standard for the digital cartographic representation of geologic map features (FGDC Document Number FGDC-STD-013-2006). Presented herein is the PostScript Implementation of the standard, which will enable users to directly apply the symbols in the standard to geologic maps and illustrations prepared in desktop illustration and (or) publishing software. The FGDC Digital Cartographic Standard for Geologic Map Symbolization contains descriptions, examples, cartographic specifications, and notes on usage for a wide variety of symbols that may be used on typical, general-purpose geologic maps and related products such as cross sections. The standard also can be used for different kinds of special-purpose or derivative map products and databases that may be focused on a specific geoscience topic (for example, slope stability) or class of features (for example, a fault map). The standard is scale-independent, meaning that the symbols are appropriate for use with geologic mapping compiled or published at any scale. It will be useful to anyone who either produces or uses geologic map information, whether in analog or digital form. Please be aware that this standard is not intended to be used inflexibly or in a manner that will limit one's ability to communicate the observations and interpretations gained from geologic mapping. In certain situations, a symbol or its usage might need to be modified in order to better represent a particular feature on a geologic map or cross section.
This standard allows the use of any symbol that doesn't conflict with others in the standard, provided that it is clearly explained on the map and in the database. In addition, modifying the size, color, and (or) lineweight of an existing symbol to suit the needs of a particular map or output device also is permitted, provided that the modified symbol's appearance is not too similar to another symbol on the map. Be aware, however, that reducing lineweights below .125 mm (.005 inch) may cause symbols to plot incorrectly if output at higher resolutions (1800 dpi or higher). For guidelines on symbol usage, as well as on color design and map labeling, please refer to the standard's introductory text. Also found there are informational sections covering concepts of geologic mapping and some definitions of geologic map features, as well as sections on the newly defined concepts and terminology for the scientific confidence and locational accuracy of geologic map features. More information on both the past development and the future maintenance of the FGDC Digital Cartographic Standard for Geologic Map Symbolization can be found at the FGDC Geologic Data Subcommittee website (http://ngmdb.usgs.gov/fgdc_gds/). Earlier Versions of the Standard
UNAVCO Software and Services for Visualization and Exploration of Geoscience Data
NASA Astrophysics Data System (ADS)
Meertens, C.; Wier, S.
2007-12-01
UNAVCO has been involved in visualization of geoscience data to support education and research for several years. An early and ongoing service is the Jules Verne Voyager, a web browser applet built on the Generic Mapping Tools (GMT) that displays any area on Earth, with many data set choices, including maps, satellite images, topography, geoid heights, sea-floor ages, strain rates, political boundaries, rivers and lakes, earthquake and volcano locations, focal mechanisms, stress axes, and observed and modeled plate motion and deformation velocity vectors from geodetic measurements around the world. As part of the GEON project, UNAVCO has developed the GEON IDV, a research-level, 4D (earth location, depth and/or altitude, and time), Java application for interactive display and analysis of geoscience data. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data anywhere on earth. The GEON IDV supports simultaneous displays of data sets from differing sources, with complete control over colors, time animation, map projection, map area, point of view, and vertical scale. The GEON IDV displays gridded and point data, images, GIS shape files, and several other types of data. The GEON IDV has symbols and displays for GPS velocity vectors, seismic tomography, earthquake focal mechanisms, earthquake locations with magnitude or depth, seismic ray paths in 3D, seismic anisotropy, convection model visualization, earth strain axes and strain field imagery, and high-resolution 3D topographic relief maps. Multiple data sources and display types may appear in one view. As an example of GEON IDV utility, it can display hypocenters under a volcano, a surface geology map of the volcano draped over 3D topographic relief, town locations and political boundaries, and real-time 3D weather radar clouds of volcanic ash in the atmosphere, with time animation. The GEON IDV can drive a GeoWall or other 3D stereo system.
IDV output includes imagery, movies, and KML files that let Google Earth display static IDV images where Google Earth can handle the display. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support a data web site. We plan to extend the power of the IDV by accepting new data types and data services, such as GeoSciML. An active program of online and video training in GEON IDV use is planned. UNAVCO will support users who need assistance converting their data to the standard formats used by the GEON IDV. The UNAVCO Facility provides web-accessible support for Google Earth and Google Maps display of any of more than 9,500 GPS stations and survey points, including metadata for each installation. UNAVCO provides corresponding Open Geospatial Consortium (OGC) web services with the same data. UNAVCO's goal is to facilitate data access, interoperability, and efficient searches, exploration, and use of data by promoting web services, standards for GEON IDV data formats and metadata, and software able to simultaneously read and display multiple data sources, formats, and map locations or projections. Retention and propagation of semantics and metadata with observational and experimental values is essential for interoperability and understanding diverse data sources.
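Emitting a station placemark as KML, as both the IDV and the UNAVCO GPS-station services do, reduces to writing a small XML document. The sketch below uses the standard KML 2.2 namespace; the station name and coordinates are hypothetical.

```python
def station_kml(name, lon, lat):
    """Return a minimal KML document with one point placemark.

    KML coordinates are ordered lon,lat[,alt] per the OGC KML 2.2 spec.
    """
    return (
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f'<Placemark><name>{name}</name>'
        f'<Point><coordinates>{lon},{lat},0</coordinates></Point>'
        '</Placemark></Document></kml>'
    )

# Hypothetical GPS station near Boulder, Colorado.
doc = station_kml("P123", -105.27, 40.01)
```

A script generating one such placemark per station, served over HTTP, is enough for Google Earth or Google Maps to render a browsable station network.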
[Who Hits the Mark? A Comparative Study of the Free Geocoding Services of Google and OpenStreetMap].
Lemke, D; Mattauch, V; Heidinger, O; Hense, H W
2015-09-01
Geocoding, the process of converting textual information (addresses) into geographic coordinates, is increasingly used in public health/epidemiological research and practice. To date, little attention has been paid to geocoding quality and its impact on different types of spatially related health studies. The primary aim of this study was to compare two freely available geocoding services (Google and OpenStreetMap) with regard to matching rate (percentage of address records capable of being geocoded) and positional accuracy (distance between geocodes and the ground-truth locations). Residential addresses were geocoded by the NRW state office for information and technology and were considered as reference data (gold standard). The gold standard included the coordinates, the quality of the addresses (4 categories), and a binary urbanity indicator based on the CORINE land cover data. 2,500 addresses were randomly sampled after stratification for address quality and urbanity indicator (from approximately 20,000 addresses). These address samples were geocoded using the geocoding services from Google and OSM. In general, both geocoding services showed a decrease in the matching rate with decreasing address quality and urbanity. Google consistently showed higher completeness than OSM (>93 vs. >82%). Also, the cartographic confounding between urban and rural regions was less distinct with Google's geocoding API. Regarding the positional accuracy of the geo-coordinates, Google also showed the smallest deviations from the reference coordinates, with a median of <9 vs. <175.8 m. The cumulative density function derived from the positional accuracy showed that nearly 95% of the addresses for Google, and 50% for OSM, were geocoded within <50 m of their reference coordinates. The geocoding API from Google is superior to OSM regarding completeness and positional accuracy of the geocoded addresses.
On the other hand, Google has several restrictions, such as the limitation of requests to 2,500 addresses per 24 h and the presentation of the results exclusively on Google Maps, which may complicate use for scientific purposes. © Georg Thieme Verlag KG Stuttgart · New York.
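The positional-accuracy metric used in the study, distance between a returned geocode and its reference coordinate, is typically computed with the haversine great-circle formula. A minimal sketch with invented example coordinates:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points,
    using a spherical Earth of mean radius 6,371 km."""
    R = 6371000.0
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical geocode displaced ~50 m north of its reference point
# (0.00045 degrees of latitude).
err = haversine_m(51.9607, 7.6261, 51.9607 + 0.00045, 7.6261)
```

Applying this per address and taking the median (or the cumulative distribution, as the study does) summarizes a geocoder's positional accuracy.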
Kaizen method for esophagectomy patients: improved quality control, outcomes, and decreased costs.
Iannettoni, Mark D; Lynch, William R; Parekh, Kalpaj R; McLaughlin, Kelley A
2011-04-01
The majority of costs associated with esophagectomy are related to the initial 3 days of hospital stay requiring intensive care unit stays, ventilator support, and intraoperative time. Additional costs arise from hospital-based services. The major cost increases are related to complications associated with the procedure. We attempted to define these costs and identify expense management by streamlining care through strict adherence to patient care maps, operative standardization, and rapid discharge planning to reduce variability. Utilizing methods of Kaizen philosophy we evaluated all processes related to the entire experience of esophageal resection. This process has taken over 5 years to achieve, with quality and cost being tracked over this time period. Cost analysis included expenses related to intensive care unit, anesthesia, disposables, and hospital services. Quality improvement measures were related to intraoperative complications, in-hospital complications, and postoperative outcomes. The Institutional Review Board approved the use of anonymous data from standard clinical practice because no additional treatment was planned (observational study). Utilizing a continuous process improvement methodology, a 43% reduction in cost per case has been achieved with a significant increase in contribution margin for esophagectomy. The length of stay has been reduced from 14 days to 5. With intraoperative and postoperative standardization the leak rate has dropped from 12% to less than 3% to no leaks in our current Kaizen modification of care in our last 64 patients. Utilizing lean manufacturing techniques and continuous process evaluation we have attempted to eliminate variability, standardized the phases of care resulting in improved outcomes, decreased length of stay, and improved contribution margins. These Kaizen improvements require continuous interventions, strict adherence to care maps, and input from all levels for quality improvements. 
Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Oregon OCS seafloor mapping: Selected lease blocks relevant to renewable energy
Cochrane, Guy R.; Hemery, Lenaïg G.; Henkel, Sarah K.
2017-05-23
In 2014 the U.S. Geological Survey (USGS) and the Bureau of Ocean Energy Management (BOEM) entered into Intra-agency agreement M13PG00037 to map an area of the Oregon Outer Continental Shelf (OCS) off of Coos Bay, Oregon, under consideration for development of a floating wind energy farm. The BOEM requires seafloor mapping and site characterization studies in order to evaluate the impact of seafloor and sub-seafloor conditions on the installation, operation, and structural integrity of proposed renewable energy projects, as well as to assess the potential effects of construction and operations on archaeological resources. The mission of the USGS is to provide geologic, topographic, and hydrologic information that contributes to the wise management of the Nation's natural resources and that promotes the health, safety, and well-being of the people. This information consists of maps, databases, and descriptions and analyses of the water, energy, and mineral resources, land surface, underlying geologic structure, and dynamic processes of the earth. For the Oregon OCS study, the USGS acquired multibeam echo sounder and seafloor video data surrounding the proposed development site, which is 95 km² in area and 15 miles offshore from Coos Bay. The development site had been surveyed by Solmar Hydro Inc. in 2013 under a contract with WindFloat Pacific. The USGS subsequently produced a bathymetry digital elevation model and a backscatter intensity grid that were merged with existing data collected by the contractor. The merged grids were published along with visual observations of benthic geo-habitat from the video data in an associated USGS data release (Cochrane and others, 2015). This report includes the results of analysis of the video data conducted by Oregon State University and the geo-habitat interpretation of the multibeam echo sounder (MBES) data conducted by the USGS. MBES data were published in Cochrane and others (2015).
Interpretive data associated with this publication is published in Cochrane (2017). All the data is provided as geographic information system (GIS) files that contain either Esri ArcGIS GeoTIFFs or shapefiles. For those who do not own the full suite of Esri GIS and mapping software, the data can be read using Esri ArcReader, a free viewer that is available at http://www.esri.com/software/arcgis/arcreader/index.html (last accessed August 29, 2016). Web services, which consist of standard implementations of ArcGIS representational state transfer (REST) Service and Open Geospatial Consortium (OGC) GIS web map service (WMS), also are available for all published GIS data. Web services were created using an ArcGIS service definition file, resulting in data layers that are symbolized as shown on the associated report figures. Both the ArcGIS REST Service and OGC WMS Service include all the individual GIS layers. Data layers are bundled together in a map-area web service; however, each layer can be symbolized and accessed individually after the web service is ingested into a desktop application or web map. Web services enable users to download and view data, as well as to easily add data to their own workflows, using any browser-enabled, standalone or mobile device. Though the surficial substrate is dominated by combinations of mud and sand substrate, a diverse assortment of geomorphologic features is related to geologic processes: one anticlinal ridge where bedrock is exposed, a slump and associated scarps, and pockmarks. Pockmarks are seen in the form of fields of small pockmarks, a lineation of large pockmarks with methanogenic carbonates, and areas of large pockmarks that have merged into larger variously shaped depressions. The slump appears to have originated at the pockmark lineation.
Video-supervised numerical analysis of the MBES backscatter intensity data and of vector ruggedness derived from the MBES bathymetry data was used to produce a substrate model, called a seafloor character raster, for the study area. The seafloor character raster consists of three substrate classes: soft-flat areas, hard-flat areas, and hard-rugged areas. A Coastal and Marine Ecological Classification Standard (CMECS) geoform and substrate map was also produced using depth, slope, and benthic position index classes to delineate geoform boundaries. Seven geoforms were identified in this process, including ridges, slump scars, slump deposits, basins, and pockmarks. Statistical analysis of the video data for correlations between substrate, depth, and invertebrate assemblages resulted in the identification of seven biomes: three hard-bottom biomes and four soft-bottom biomes. A similar analysis of vertebrate observations produced a similar set of biomes. The between-group dissimilarity of the biomes was very high or high. Invertebrates alone account for most of the structuring of the whole benthic community into different assemblages. A biotope map was generated using the seafloor character raster and the substrate and depth values of the biomes. Hard-substrate biotopes were small in size and were located primarily on the ridge and in pockmarks along the pockmark lineation. The soft-bottom biotopes consisted of large contiguous areas delimited by isobaths.
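The seafloor character raster described above is, in essence, a per-cell classification from two inputs. The following is a minimal sketch of that idea, with placeholder thresholds that are illustrative only and not the values derived from the USGS video-supervised analysis:

```python
# Hypothetical sketch: classify seafloor cells into the three substrate
# classes described in the report (soft-flat, hard-flat, hard-rugged)
# from backscatter intensity and vector ruggedness. Threshold values
# are invented placeholders, not those used by the USGS.

BACKSCATTER_HARD = -20.0   # dB; above this, substrate is treated as "hard"
RUGGEDNESS_RUGGED = 0.01   # dimensionless VRM; above this, "rugged"

def classify_cell(backscatter_db, ruggedness):
    """Return a seafloor-character class for one raster cell."""
    hard = backscatter_db > BACKSCATTER_HARD
    rugged = ruggedness > RUGGEDNESS_RUGGED
    if hard and rugged:
        return "hard-rugged"
    if hard:
        return "hard-flat"
    return "soft-flat"

# Classify a small set of (backscatter dB, ruggedness) cells.
cells = [(-30.0, 0.002), (-15.0, 0.003), (-12.0, 0.05)]
classes = [classify_cell(b, r) for b, r in cells]
print(classes)  # ['soft-flat', 'hard-flat', 'hard-rugged']
```

In the actual study the class boundaries were supervised by the video observations rather than fixed by hand, but the per-cell structure of the output raster is the same.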
DIY-style GIS service in mobile navigation system integrated with web and wireless GIS
NASA Astrophysics Data System (ADS)
Yan, Yongbin; Wu, Jianping; Fan, Caiyou; Wang, Minqi; Dai, Sheng
2007-06-01
A mobile navigation system based on a handheld device can not only provide basic GIS services, but can also provide those services without location limits and with more immediate interaction between users and devices. However, most navigation systems still share common shortcomings in user experience, such as limited map formats, few map resources, and no way to share locations. To overcome these defects, we propose a DIY-style GIS service that provides users a freer software environment and allows them to customize their GIS services. These services include defining the geographic coordinate system of maps, which greatly enlarges the pool of usable map sources; editing vector features, related attribute information, and hotlink images; customizing the coverage area of maps downloaded via General Packet Radio Service (GPRS); and sharing users' location information via SMS (Short Message Service), which establishes communication between users who need GIS services. The paper introduces the integration of web and wireless GIS services in a mobile navigation system and presents an implementation sample of a DIY-style GIS service in such a system.
Customised City Maps in Mobile Applications for Senior Citizens.
Reins, Frank; Berker, Frank; Heck, Helmut
2017-01-01
Map services should be used in mobile applications for senior citizens. But do commonly used map services meet the needs of elderly people? As an example, the paper examines the contrast ratios of common maps in comparison to an optimized, custom-rendered map.
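The paper does not state how contrast is computed; a common choice for this kind of accessibility evaluation is the WCAG 2.x contrast ratio, sketched here:

```python
# Sketch of the WCAG 2.x contrast-ratio computation that a map
# evaluation like this could use (an assumption; the paper does not
# specify its method). Ratios range from 1:1 to 21:1.

def _linearize(channel):
    """Convert an 8-bit sRGB channel to linear light."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # 21.0
```

WCAG recommends at least 4.5:1 for normal text, which is a reasonable floor for map labels aimed at older readers.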
Yuksel, Mustafa; Dogac, Asuman
2011-07-01
Medical devices are essential to the practice of modern healthcare services. Their benefits will increase if clinical software applications can seamlessly acquire the medical device data. The need to represent medical device observations in a format that can be consumed by clinical applications has already been recognized by the industry. Yet, the solutions proposed involve bilateral mappings from the ISO/IEEE 11073 Domain Information Model (DIM) to specific message or document standards. Considering that there are many different types of clinical applications, such as the electronic health record and personal health record systems, clinical workflows, and clinical decision support systems, each conforming to different standard interfaces, detailing a mapping mechanism for every one of them introduces significant work and, thus, limits the potential health benefits of medical devices. In this paper, to facilitate the interoperability of clinical applications and the medical device data, we use the ISO/IEEE 11073 DIM to derive an HL7 v3 Refined Message Information Model (RMIM) of the medical device domain from the HL7 v3 Reference Information Model (RIM). This makes it possible to trace the medical device data back to a standard common denominator, that is, the HL7 v3 RIM, from which all the other medical domains under HL7 v3 are derived. Hence, once the medical device data are obtained in the RMIM format, they can easily be transformed into HL7-based standard interfaces through XML transformations, because these interfaces all have their building blocks from the same RIM. To demonstrate this, we provide the mappings from the developed RMIM to some of the widely used HL7 v3-based standard interfaces.
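The core move in the paper is serializing device observations into RIM-derived XML so generic transformations can take over. As a loose illustration only, the sketch below turns a simplified numeric device observation (modeled informally on ISO/IEEE 11073 numerics) into a generic HL7-v3-style observation element; the element and attribute names are simplified placeholders, not the actual RMIM schema:

```python
import xml.etree.ElementTree as ET

# Illustrative only: serialize a simplified device observation into a
# generic HL7-v3-style <observation>. Names are placeholders, not the
# RMIM developed in the paper.

def observation_to_xml(obs):
    root = ET.Element("observation", classCode="OBS", moodCode="EVN")
    ET.SubElement(root, "code", code=obs["code"], codeSystem=obs["codeSystem"])
    ET.SubElement(root, "value", value=str(obs["value"]), unit=obs["unit"])
    return ET.tostring(root, encoding="unicode")

# A hypothetical heart-rate observation coded in the MDC nomenclature.
obs = {"code": "150456", "codeSystem": "MDC", "value": 72, "unit": "bpm"}
print(observation_to_xml(obs))
```

Once device data are in such a RIM-aligned XML form, standard XSLT-style transformations can map them onto the various HL7 v3 interfaces, which is the interoperability argument the paper makes.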
The EarthServer Geology Service: web coverage services for geosciences
NASA Astrophysics Data System (ADS)
Laxton, John; Sen, Marcus; Passmore, James
2014-05-01
The EarthServer FP7 project is implementing web coverage services using the OGC WCS and WCPS standards for a range of earth science domains: cryospheric, atmospheric, oceanographic, planetary, and geological. BGS is providing the geological service (http://earthserver.bgs.ac.uk/). Geoscience has used remotely sensed data from satellites and aircraft for some considerable time, but other areas of the geosciences are less familiar with the use of coverage data. This is rapidly changing with the development of new sensor networks and the move from geological maps to geological spatial models. The BGS geology service is designed initially to address two coverage data use cases and three levels of data access restriction. Databases of remotely sensed data are typically very large and commonly held offline, making it time-consuming for users to assess and then download data. The service is designed to allow the spatial selection, editing, and display of Landsat and aerial photographic imagery, including band selection and contrast stretching. This enables users to rapidly view data, assess its usefulness for their purposes, and then enhance and download it if it is suitable. At present the service contains six-band Landsat 7 imagery (blue, green, red, NIR 1, NIR 2, MIR) and three-band false colour aerial photography (NIR, green, blue), totalling around 1 TB. Increasingly, 3D spatial models are being produced in place of traditional geological maps. Models make explicit the spatial information that is implicit on maps and thus are seen as a better way of delivering geosciences information to non-geoscientists. However, web delivery of models, including the provision of suitable visualisation clients, has proved more challenging than delivering maps. The EarthServer geology service is delivering 35 surfaces as coverages, comprising the modelled superficial deposits of the Glasgow area. These can be viewed using a 3D web client developed in the EarthServer project by Fraunhofer.
As well as remotely sensed imagery and 3D models, the geology service is also delivering DTM coverages, which can be viewed in the 3D client in conjunction with both imagery and models. The service is accessible through a web GUI which allows the imagery to be viewed against a range of background maps and DTMs, and in the 3D client; spatial selection to be carried out graphically; the results of image enhancement to be displayed; and selected data to be downloaded. The GUI also provides access to the Glasgow model in the 3D client, as well as tutorial material. In the final year of the project it is intended to increase the volume of data to 20 TB and enhance the WCPS processing, including depth and thickness querying of 3D models. We have also investigated the use of GeoSciML, developed to describe and interchange the information on geological maps, to describe model surface coverages. EarthServer is developing a combined WCPS and XQuery query language, and we will investigate applying this to the GeoSciML-described surfaces to answer questions such as 'find all units with a predominant sand lithology within 25 m of the surface'.
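A WCPS request of the kind such a service handles is a small functional expression sent to the server. The sketch below builds (but does not send) a ProcessCoverages request; the exact endpoint path and the coverage name `landsat7` are assumptions for illustration, while the query itself follows general WCPS syntax for band selection:

```python
from urllib.parse import urlencode

# Sketch of a WCPS ProcessCoverages request such as the BGS service
# supports. Endpoint path and coverage name are hypothetical; the query
# follows general WCPS syntax (extract the red band, encode as PNG).

endpoint = "http://earthserver.bgs.ac.uk/rasdaman/ows"  # assumed path
query = 'for c in (landsat7) return encode(c.red, "image/png")'

params = urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": query,
})
request_url = endpoint + "?" + params
print(request_url)
```

Server-side evaluation of such expressions is what makes it practical to enhance and subset terabyte-scale imagery before downloading, as described above.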
SCHeMA web-based observation data information system
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise
2016-04-01
It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives that are built upon international standards will contribute to simplify data processing and dissemination, improve user accessibility (also through web browsers), facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access, and download as well as interoperability with other projects through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach will ensure data accessibility in compliance with major European directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as of new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as an SOS (Sensor Observation Service), provides standard interoperability and access to sensor observations through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, to describe data in an already harmonized and common standard, is a prerequisite for consistency and interoperability.
Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration; development of a Core Profile interface for data access via OGC standards; external services such as web services, WMS, and WFS; and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open sea.
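A client retrieves observations from an SOS endpoint like the one described here with a GetObservation request. The sketch below builds such a request in SOS 2.0 key-value-pair form; the endpoint URL and the offering and observed-property identifiers are placeholders, not SCHeMA's actual identifiers:

```python
from urllib.parse import urlencode

# Sketch of an OGC SOS 2.0 GetObservation request in KVP form, as a
# client of an SOS like SCHeMA's might issue. Endpoint and identifiers
# are hypothetical placeholders.

endpoint = "http://example.org/schema-sos/service"  # placeholder URL
params = urlencode({
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "probe_station_1",
    "observedProperty": "sea_water_temperature",
})
url = endpoint + "?" + params
print(url)
```

The response is an O&M-encoded observation document, which is what makes the stored measurements consumable by any standards-aware client rather than only by the project's own portal.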
NASA Astrophysics Data System (ADS)
Julià Selvas, Núria; Ninyerola Casals, Miquel
2015-04-01
An automatic system has been implemented to predict fire risk in the Principality of Andorra, a small country located in the eastern Pyrenees mountain range, bordered by Catalonia and France; owing to its location, its landscape is a set of rugged mountains with an average elevation of around 2000 meters. The system is based on the Fire Weather Index (FWI), which consists of several components, each measuring a different aspect of fire danger, calculated from the values of weather variables at midday. CENMA (Centre d'Estudis de la Neu i de la Muntanya d'Andorra) has a network of around 10 automatic meteorological stations, located in different places, on peaks and in valleys, that measure weather data such as relative humidity, wind direction and speed, surface temperature, rainfall, and snow cover every ten minutes. These data are sent daily and automatically to the system, which processes them to filter out incorrect measurements and to homogenize measurement units. The data are then used to calculate all components of the FWI at midday at the elevation of each station, creating a database with the values of the homogenized measurements and the FWI components for each weather station. In order to extend and model these data over the whole Andorran territory and to obtain a continuous map, an interpolation method based on multiple regression with spline interpolation of the residuals has been implemented. This interpolation considers the FWI data as well as other relevant predictors such as latitude, altitude, global solar radiation, and distance to the sea. The obtained values (maps) are validated using leave-one-out cross-validation. The discrete and continuous maps are rendered as tiled raster maps and published in a web portal conforming to the Web Map Service (WMS) standard of the Open Geospatial Consortium (OGC). Metadata and other reference maps (fuel maps, topographic maps, etc.) are also available from this geoportal.
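The "regression plus residual interpolation" scheme can be sketched in miniature. The example below fits a trend of FWI on one predictor (elevation) and spreads the station residuals with inverse-distance weighting; the paper uses multiple regression and spline residual interpolation, so IDW stands in for the spline here, and all station values are invented for illustration:

```python
import math

# Simplified sketch of residual-corrected regression interpolation.
# One predictor (elevation) and IDW residuals stand in for the paper's
# multiple regression with spline residual interpolation. All numbers
# are invented.

stations = [  # (x_km, y_km, elevation_m, observed_fwi)
    (0.0, 0.0, 1200.0, 18.0),
    (5.0, 2.0, 2000.0, 11.0),
    (2.0, 6.0, 2600.0, 6.0),
]

# 1) Ordinary least squares trend: FWI = a + b * elevation.
n = len(stations)
mean_e = sum(s[2] for s in stations) / n
mean_f = sum(s[3] for s in stations) / n
b = (sum((s[2] - mean_e) * (s[3] - mean_f) for s in stations)
     / sum((s[2] - mean_e) ** 2 for s in stations))
a = mean_f - b * mean_e

# 2) Residuals at the stations.
residuals = [(s[0], s[1], s[3] - (a + b * s[2])) for s in stations]

def predict(x, y, elevation, power=2.0):
    """Regression trend plus inverse-distance-weighted residual."""
    trend = a + b * elevation
    num = den = 0.0
    for sx, sy, r in residuals:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return trend + r  # exact at a station location
        w = 1.0 / d ** power
        num += w * r
        den += w
    return trend + num / den

print(round(predict(0.0, 0.0, 1200.0), 2))  # 18.0, reproduces station 1
```

The residual correction guarantees the interpolated surface honors the station observations exactly, which is also why leave-one-out cross-validation (withholding each station in turn) is the natural validation strategy.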
Bagstad, Kenneth J.; Reed, James; Semmens, Darius J.; Sherrouse, Ben C.; Troy, Austin
2016-01-01
Through extensive research, ecosystem services have been mapped using both survey-based and biophysical approaches, but comparative mapping of public values and those quantified using models has been lacking. In this paper, we mapped hot and cold spots for perceived and modeled ecosystem services by synthesizing results from a social-values mapping study of residents living near the Pike–San Isabel National Forest (PSI), located in the Southern Rocky Mountains, with corresponding biophysically modeled ecosystem services. Social-value maps for the PSI were developed using the Social Values for Ecosystem Services tool, providing statistically modeled continuous value surfaces for 12 value types, including aesthetic, biodiversity, and life-sustaining values. Biophysically modeled maps of carbon sequestration and storage, scenic viewsheds, sediment regulation, and water yield were generated using the Artificial Intelligence for Ecosystem Services tool. Hotspots for both perceived and modeled services were disproportionately located within the PSI’s wilderness areas. Additionally, we used regression analysis to evaluate spatial relationships between perceived biodiversity and cultural ecosystem services and corresponding biophysical model outputs. Our goal was to determine whether publicly valued locations for aesthetic, biodiversity, and life-sustaining values relate meaningfully to results from corresponding biophysical ecosystem service models. We found weak relationships between perceived and biophysically modeled services, indicating that public perception of ecosystem service provisioning regions is limited. We believe that biophysical and social approaches to ecosystem service mapping can serve as methodological complements that can advance ecosystem services-based resource management, benefitting resource managers by showing potential locations of synergy or conflict between areas supplying ecosystem services and those valued by the public.
Peterson, Kevin J.; Pathak, Jyotishman
2014-01-01
Automated execution of electronic Clinical Quality Measures (eCQMs) from electronic health records (EHRs) on large patient populations remains a significant challenge, and the testability, interoperability, and scalability of measure execution are critical. The High Throughput Phenotyping (HTP; http://phenotypeportal.org) project aligns with these goals by using the standards-based HL7 Health Quality Measures Format (HQMF) and Quality Data Model (QDM) for measure specification, as well as Common Terminology Services 2 (CTS2) for semantic interpretation. The HQMF/QDM representation is automatically transformed into a JBoss® Drools workflow, enabling horizontal scalability via clustering and MapReduce algorithms. Using Project Cypress, automated verification metrics can then be produced. Our results show linear scalability for nine executed 2014 Center for Medicare and Medicaid Services (CMS) eCQMs for eligible professionals and hospitals for >1,000,000 patients, and verified execution correctness of 96.4% based on Project Cypress test data of 58 eCQMs. PMID:25954459
US EPA Nonattainment Areas and Designations-PM10 (1987 NAAQS)
This web service contains the following layer: PM10 Nonattainment Areas (1987 NAAQS). Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1987PM10/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels. The metho
US EPA Nonattainment Areas and Designations-Lead (2008 NAAQS)
This web service contains the following layers: Lead NAA 2008 NAAQS and Lead NAA Centroids 2008 NAAQS. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2008Lead/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure l
US EPA Nonattainment Areas and Designations-8 Hour Ozone (2008 NAAQS)
This web service contains the following layers: Ozone 2008 NAAQS NAA State Level and Ozone 2008 NAAQS NAA National Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2008Ozone8hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-
US EPA Nonattainment Areas and Designations-8 Hour Ozone (1997 NAAQS)
This web service contains the following layers: Ozone 1997 NAAQS NAA State Level and Ozone 1997 NAAQS NAA National Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA1997Ozone8hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database. However, that does not necessarily mean the data have changed. The EPA Office of Air Quality Planning and Standards (OAQPS) has set National Ambient Air Quality Standards for six principal pollutants, which are called criteria pollutants. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, EPA is required to set National Ambient Air Quality Standards for six common air pollutants. These commonly found air pollutants (also known as criteria pollutants) are found all over the United States. They are particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures used for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short
A Framework for Building and Reasoning with Adaptive and Interoperable PMESII Models
2007-11-01
Description Logic; SOA Service Oriented Architecture; SPARQL Simple Protocol And RDF Query Language; SQL Standard Query Language; SROM Stability and... another by providing a more expressive ontological structure for one of the models, e.g., semantic networks can be mapped to first-order logical... Pellet is an open-source reasoner that works with OWL-DL. It accepts the SPARQL Protocol and RDF Query Language (SPARQL) and provides a Java API to
Reija Haapanen; Kimmo Lehtinen; Jukka Miettinen; Marvin E. Bauer; Alan R. Ek
2002-01-01
The k-nearest neighbor (k-NN) method has been undergoing development and testing for applications with USDA Forest Service Forest Inventory and Analysis (FIA) data in Minnesota since 1997. Research began using the 1987-1990 FIA inventory of the state, the then standard 10-point cluster plots, and Landsat TM imagery. In the past year, research has moved to examine...
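The k-NN estimation idea used with FIA plots and Landsat imagery is simple enough to show directly: a target pixel's forest attribute is estimated from the k reference plots nearest in spectral space. The band values and stand volumes below are invented for illustration:

```python
import math

# Minimal sketch of k-NN attribute estimation as used with forest
# inventory plots and satellite imagery: estimate a pixel's stand
# volume as the mean of the k spectrally nearest reference plots.
# All numbers are invented.

reference = [  # (spectral band values, stand volume in m3/ha)
    ([52, 38, 24], 110.0),
    ([60, 45, 30], 85.0),
    ([48, 35, 20], 140.0),
    ([70, 55, 40], 40.0),
]

def knn_estimate(pixel, refs, k=2):
    """Average the attribute of the k nearest plots in spectral space."""
    dists = sorted((math.dist(pixel, bands), value) for bands, value in refs)
    nearest = dists[:k]
    return sum(v for _, v in nearest) / k

print(knn_estimate([50, 36, 22], reference))  # 125.0
```

In operational use the references are thousands of FIA plots and the distance metric and value of k are tuned, but the estimator has exactly this structure.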
Modelling noise propagation using Grid Resources. Progress within GDI-Grid
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut
2010-05-01
GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacities and computational power of Grid infrastructures for geospatial applications, while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces: the Web Map Service (WMS), the Web Feature Service (WFS), the Web Coverage Service (WCS), and the Web Processing Service (WPS). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by the standardization bodies for Grid computing and for spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing, and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize Grid resources. Noise mapping of, e.g., traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data include the road network and traffic, terrain, buildings, and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of the segments, propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation.
This computationally intensive calculation needs to be performed for a major part of the European landscape. A LINUX version of the commercial LimA software for noise mapping analysis has been implemented on a test cluster within the German D-GRID computer network. Results and performance indicators will be presented. The presentation is an extension of last year's presentation "Spatial Data Infrastructures and Grid Computing: the GDI-Grid project", which described the gridification concept developed in the GDI-Grid project and provided an overview of the conceptual gaps between Grid computing and spatial data infrastructures. Results from the GDI-Grid project are incorporated in the OGC-OGF (Open Grid Forum) collaboration efforts as well as in the OGC WPS 2.0 standards working group, which is developing the next major version of the WPS specification.
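The per-receiver cost comes from summing many per-segment propagation results. A heavily simplified illustration of one such step is the free-field level from a point source under spherical spreading, with an energetic sum over segments; real methods such as those implemented in LimA additionally model diffraction, ground effects, and screening, so this is a structural sketch only, with invented numbers:

```python
import math

# Highly simplified per-segment propagation step: receiver level from a
# point source under spherical spreading only, then an energetic sum
# over segments. Real noise-mapping methods add diffraction, ground
# effects, and screening. Numbers are invented.

def level_at_receiver(sound_power_db, distance_m):
    """Free-field sound pressure level from a point source."""
    return sound_power_db - 20.0 * math.log10(distance_m) - 11.0

# Contributions from three road segments at one receiver position.
segment_levels = [level_at_receiver(95.0, d) for d in (50.0, 120.0, 400.0)]

# Decibels add energetically, not arithmetically.
total_db = 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in segment_levels))
print(round(total_db, 1))
```

Repeating this for every receiver on a 10 m grid, with thousands of segments within a 2000 m radius each, is what makes the workload a natural fit for Grid resources.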
Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication
NASA Astrophysics Data System (ADS)
Kadlec, J.
2013-12-01
The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is their complicated setup and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a web hosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture for a standards-compliant, lightweight, and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet web hosting services. The architecture and design are validated by developing HydroServer Lite: a PHP- and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) hydrologic information system. It is already being used for management of field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the HydroServer Lite software has been installed on multiple free and low-cost web hosting sites, including GoDaddy, Bluehost, and 000webhost. The number of steps required to set up the server is compared with the number required to set up other standards-compliant hydrologic data hosting systems, including THREDDS, istSOS, and MapServer SOS.
BioBarcode: a general DNA barcoding database and server platform for Asian biodiversity resources
2009-01-01
Background: DNA barcoding provides a rapid, accurate, and standardized method for species-level identification using short DNA sequences. Such a standardized identification method is useful for mapping all the species on Earth, particularly now that DNA sequencing technology is cheaply available. There are many nations in Asia with biodiversity resources that need to be mapped and registered in databases. Results: We have built a general DNA barcode data processing system, BioBarcode, with open-source software; it is a general-purpose database and server. It uses the MySQL RDBMS 5.0, BLAST2, and the Apache httpd server. An exemplary BioBarcode database has around 11,300 specimen entries (including GenBank data) and registers biological species to map their genetic relationships. The BioBarcode database includes a chromatogram viewer, which improves performance in DNA sequence analyses. Conclusion: Asia has a very high degree of biodiversity, and the BioBarcode database server system aims to provide an efficient bioinformatics protocol that can be freely used by Asian researchers and research organizations interested in DNA barcoding. BioBarcode promotes the rapid acquisition of biological species DNA sequence data that meet global standards by providing specialized services, and provides useful tools that will make barcoding cheaper and faster in the biodiversity community, such as standardization, deposition, management, and analysis of DNA barcode data. The system can be downloaded upon request, and an exemplary server has been constructed with which to build an Asian biodiversity system http://www.asianbarcode.org. PMID:19958506
What Do Pre-Service Physics Teachers Know and Think about Concept Mapping?
ERIC Educational Resources Information Center
Didis, Nilüfer; Özcan, Özgür; Azar, Ali
2014-01-01
In order to use concept maps in physics classes effectively, teachers' knowledge and ideas about concept mapping are as important as the physics knowledge used in mapping. For this reason, we aimed to examine pre-service physics teachers' knowledge on concept mapping, their ideas about the implementation of concept mapping in physics…
Semantic Integration for Marine Science Interoperability Using Web Technologies
NASA Astrophysics Data System (ADS)
Rueda, C.; Bermudez, L.; Graybeal, J.; Isenor, A. W.
2008-12-01
The Marine Metadata Interoperability project, MMI (http://marinemetadata.org), promotes the exchange, integration, and use of marine data through enhanced data publishing, discovery, documentation, and accessibility. A key effort is the definition of an Architectural Framework and Operational Concept for Semantic Interoperability (http://marinemetadata.org/sfc), which is complemented by the development of tools that realize critical use cases in semantic interoperability. In this presentation, we describe a set of such Semantic Web tools that support important interoperability tasks, ranging from the creation of controlled vocabularies and the mapping of terms across multiple ontologies, to the online registration, storage, and search services needed to work with the ontologies (http://mmisw.org). This set of services uses Web standards and technologies, including the Resource Description Framework (RDF), the Web Ontology Language (OWL), Web services, and toolkits for Rich Internet Application development. We will describe the following components: MMI Ontology Registry: The MMI Ontology Registry and Repository provides registry and storage services for ontologies. Entries in the registry are associated with projects defined by the registered users. Sophisticated search functions, for example over metadata items and vocabulary terms, are also provided. Client applications can submit search requests using the W3C SPARQL Query Language for RDF. Voc2RDF: This component converts an ASCII comma-delimited set of terms and definitions into an RDF file. Voc2RDF facilitates the creation of controlled vocabularies by using a simple form-based user interface. Created vocabularies and their descriptive metadata can be submitted to the MMI Ontology Registry for versioning and community access. VINE: The Vocabulary Integration Environment component allows the user to map vocabulary terms across multiple ontologies.
Various relationships can be established, for example exactMatch, narrowerThan, and subClassOf. VINE can compute inferred mappings based on the given associations. Attributes about each mapping, like comments and a confidence level, can also be included. VINE also supports registering and storing resulting mapping files in the Ontology Registry. The presentation will describe the application of semantic technologies in general, and our planned applications in particular, to solve data management problems in the marine and environmental sciences.
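The Voc2RDF step described above can be sketched in a few lines: convert a comma-delimited list of terms and definitions into SKOS-style RDF triples. This is a minimal illustration only; the namespace URI and Turtle-style serialization are assumptions, not MMI's actual output format.

```python
import csv
import io

# Hedged sketch of the Voc2RDF idea: comma-delimited terms -> SKOS triples.
# The base namespace is a placeholder, not an MMI namespace.
def voc2rdf(csv_text, base="http://example.org/vocab#"):
    triples = []
    for term, definition in csv.reader(io.StringIO(csv_text)):
        uri = base + term.strip().replace(" ", "_")
        triples.append(f"<{uri}> a skos:Concept ;")
        triples.append(f'    skos:prefLabel "{term.strip()}" ;')
        triples.append(f'    skos:definition "{definition.strip()}" .')
    return "\n".join(triples)

sample = ("sea surface temperature,Temperature of the ocean surface layer\n"
          "salinity,Dissolved salt content of seawater")
print(voc2rdf(sample))
```

A real implementation would also emit the vocabulary's descriptive metadata so the result can be registered and versioned in the Ontology Registry.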
Jiang, Guoqian; Kiefer, Richard; Prud'hommeaux, Eric; Solbrig, Harold R
2017-01-01
The OHDSI Common Data Model (CDM) is a deep information model in which the vocabulary component plays a critical role in enabling consistent coding and querying of clinical data. The objective of this study is to create methods and tools that expose the OHDSI vocabularies and mappings as vocabulary mapping services using two HL7 FHIR core terminology resources, ConceptMap and ValueSet. We discuss the benefits and challenges of building the FHIR-based terminology services.
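A FHIR ConceptMap resource of the kind mentioned above can be sketched as plain JSON. The source/target systems and codes below are illustrative examples, not actual OHDSI/OMOP mappings, and the structure follows the FHIR R4 ConceptMap layout.

```python
import json

# Hedged sketch: a minimal FHIR R4 ConceptMap expressing one vocabulary mapping.
# Codes are illustrative; real OMOP concept mappings differ.
concept_map = {
    "resourceType": "ConceptMap",
    "status": "draft",
    "group": [{
        "source": "http://hl7.org/fhir/sid/icd-10-cm",
        "target": "http://snomed.info/sct",
        "element": [{
            "code": "I10",          # ICD-10-CM: essential hypertension
            "target": [{
                "code": "59621000",  # SNOMED CT: essential hypertension
                "equivalence": "equivalent"
            }]
        }]
    }]
}
print(json.dumps(concept_map, indent=2))
```

A terminology service would serve such resources through FHIR's `$translate` operation, letting clients resolve a source code to its mapped targets.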
Non-linear assessment and deficiency of linear relationship for healthcare industry
NASA Astrophysics Data System (ADS)
Nordin, N.; Abdullah, M. M. A. B.; Razak, R. C.
2017-09-01
This paper presents the development of a non-linear service satisfaction model that assumes patients are not necessarily satisfied or dissatisfied by good or poor service delivery; compliment and complaint assessments are therefore considered simultaneously. Non-linear service satisfaction instruments, called Kano-Q and Kano-SS, are developed based on the Kano model and the Theory of Quality Attributes (TQA) to translate unexpected, hidden and unspoken patient satisfaction and dissatisfaction into service quality attributes. A new Kano-Q and Kano-SS algorithm for quality attribute assessment is developed based on satisfaction impact theories, and the instruments pass reliability and validity tests. The results were also validated against the standard Kano model procedure before the Kano model and Quality Function Deployment (QFD) were integrated for patient attribute and service attribute prioritization. A Kano-QFD matrix algorithm is developed to compose the prioritized complaint and compliment indexes. Finally, the prioritized service attributes are mapped to service delivery categories to determine which service delivery the healthcare service provider should improve first.
A Story of a Crashed Plane in US-Mexican border
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry
2013-04-01
A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain, nearby mountains for the establishment of a communications tower, as well as ranches for setting up a local incident center. Events like this occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for collaboration among developers and users of spatial data products and services, and for advancing the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as OGC Web Services Phase 9 (OWS-9), that encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently, users must make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario included GNIS and GNS. GNIS, the Geographic Names Information System, is managed by the USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GeoNET Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA).
GNS has been in service since 1994, and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were advanced: cascaded WFS-G servers (allowing a "parent" WFS to query multiple WFSs), query name filters (e.g. fuzzy search, text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g. radial search and nearest neighbor) and semantically mediated feature types (e.g. mountain vs. hill). To enable semantic mediation, a series of semantic mappings was defined between the NGA GNS, USGS GNIS and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) so that they could be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used them to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single point of entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
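A single-point-of-entry gazetteer query of the kind described above can be sketched as one WFS GetFeature request that a cascading "parent" WFS would fan out to GNIS and GNS. The endpoint URL, type name, and CQL filter syntax below are placeholders for illustration, not the actual SPEGG service interface.

```python
from urllib.parse import urlencode

# Hedged sketch of a WFS-G GetFeature request with a name filter and a
# radial spatial constraint. Endpoint and typeNames are invented.
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gaz:NamedPlace",
    "count": 10,
    # CQL-style filter: case-insensitive name match within 25 km of a point
    "cql_filter": "name ILIKE 'Cerro%' AND "
                  "DWITHIN(geom, POINT(-115.5 32.6), 25, kilometers)",
}
url = "https://gazetteer.example.org/wfs?" + urlencode(params)
print(url)
```

The cascading server would issue equivalent requests to each child gazetteer, apply the OWL-encoded semantic mappings to reconcile feature types, and merge the responses.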
NASA Astrophysics Data System (ADS)
Gray, A. J. G.; Gray, N.; Ounis, I.
2009-09-01
There are multiple vocabularies and thesauri within astronomy, of which the best known are the 1993 IAU Thesaurus and the keyword list maintained by A&A, ApJ and MNRAS. The IVOA has agreed on a standard for publishing vocabularies, based on the W3C SKOS standard, to allow greater automated interaction with them, in particular on the Web. This allows links with the Semantic Web and looks forward to richer applications using the technologies of that domain. Vocabulary-aware applications can benefit from improvements in both precision and recall when searching for bibliographic or science data, and lightweight intelligent filtering for services such as VOEvent streams. In this paper we present two applications, the Vocabulary Explorer and its companion the Mapping Editor, which have been developed to support the use of vocabularies in the Virtual Observatory. These combine Semantic Web and Information Retrieval technologies to illustrate the way in which formal vocabularies might be used in a practical application, provide an online service which will allow astronomers to explore and relate existing vocabularies, and provide a service which translates free text user queries into vocabulary terms.
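The last service mentioned, translating free-text queries into vocabulary terms, can be illustrated with a toy label lookup. The mini vocabulary below is invented for the sketch; real SKOS vocabularies carry preferred and alternate labels per concept, which is what this dictionary mimics.

```python
# Toy illustration of mapping free text onto controlled vocabulary terms.
# Keys play the role of skos:prefLabel, values of skos:altLabel entries.
VOCAB = {
    "supernova": ["supernovae", "stellar explosion"],
    "galaxy": ["galaxies", "extragalactic"],
    "spectroscopy": ["spectra", "spectral analysis"],
}

def match_terms(free_text):
    """Return vocabulary concepts whose labels occur in the query."""
    words = [w.rstrip(".,") for w in free_text.lower().split()]
    hits = []
    for pref, alts in VOCAB.items():
        labels = [pref] + alts
        if any(w in labels for w in words):
            hits.append(pref)
    return hits

print(match_terms("recent supernovae in nearby galaxies"))  # → ['supernova', 'galaxy']
```

Production systems would add stemming, multi-word label matching, and ranking, but the core step is the same label-to-concept resolution.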
77 FR 15369 - Mobility Fund Phase I Auction GIS Data of Potentially Eligible Census Blocks
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
....fcc.gov/auctions/901/ , are the following: Downloadable shapefile Web mapping service MapBox map tiles... GIS software allows you to add this service as a layer to your session or project. 6. MapBox map tiles are cached map tiles of the data. With this open source software approach, these image tiles can be...
NASA Technical Reports Server (NTRS)
Panangadan, Anand; Monacos, Steve; Burleigh, Scott; Joswig, Joseph; James, Mark; Chow, Edward
2012-01-01
In this paper, we describe the architecture of both the PATS and SAP systems and how these two systems interoperate with each other, forming a unified capability for deploying intelligence in hostile environments with the objective of providing actionable situational awareness of individuals. The SAP system works in concert with the UICDS information sharing middleware to provide data fusion from multiple sources. UICDS can then publish the sensor data using the OGC's Web Mapping Service, Web Feature Service, and Sensor Observation Service standards. The system described in the paper is able to integrate a spatially distributed sensor system, operating without the benefit of the Web infrastructure, with a remote monitoring and control system that is equipped to take advantage of OGC Sensor Web Enablement (SWE) standards.
Smart POI: Open and linked spatial data
NASA Astrophysics Data System (ADS)
Cerba, Otakar; Berzins, Raitis; Charvat, Karel; Mildorf, Tomas
2016-04-01
The Smart Point of Interest (SPOI) is a unique seamless spatial data set based on standards recommended for Linked and open data, which are supported by scientists and researchers as well as by several government authorities and the European Union. This data set, developed in cooperation by partners of the SDI4Apps project, contains almost 24 million points of interest focused mainly on tourism, natural features, transport or citizen services. The SPOI data covers almost all countries and territories in the world. It is created as a harmonized combination of global data resources (selected points from OpenStreetMap, Natural Earth and GeoNames.org) and several local data sets (for example, data published by the Citadel on the Move project, data from the Posumavi region in the Czech Republic, and experimental ontologies developed at the University of West Bohemia covering ski regions in Europe and historical sights in Rome). The added value of the SDI4Apps approach in comparison to other similar solutions consists in the implementation of the linked data approach (several objects are connected to DBpedia or GeoNames.org), the use of the universal RDF format, the use of standardized and respected properties and vocabularies (for example FOAF or GeoSPARQL) and the development of a completely harmonized data set with a uniform data model and common classification (not only a copy of the original resources). The SPOI data is published as a SPARQL endpoint as well as in a map client. The SPOI dataset is a specific set of POIs which could be "a data fuel" for applications and services related to tourism, local business, statistics or landscape monitoring. It can also be used as a background data layer for thematic maps.
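Consuming a Linked-Data POI set such as SPOI typically means posting a SPARQL query to its endpoint. The sketch below only constructs the request; the endpoint URL and the exact property names (`foaf:name`, `geo:asWKT`) are assumptions for illustration — consult the SDI4Apps documentation for the actual SPOI schema.

```python
# Hedged sketch: build a SPARQL request for a Linked-Data POI endpoint.
# Endpoint and schema details are illustrative.
query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>

SELECT ?poi ?name ?wkt WHERE {
  ?poi foaf:name ?name ;
       geo:asWKT ?wkt .
  FILTER(CONTAINS(LCASE(?name), "museum"))
}
LIMIT 20
"""

def sparql_request(endpoint, q):
    """Return the (url, params) pair an HTTP client would POST or GET."""
    return endpoint, {"query": q, "format": "application/sparql-results+json"}

endpoint, params = sparql_request("https://sparql.example.org/spoi", query)
print(endpoint)
```

The JSON results bindings can then feed a map client directly, each row supplying a label and a WKT geometry.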
Adapting the CUAHSI Hydrologic Information System to OGC standards
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Whitenack, T.; Zaslavsky, I.
2010-12-01
The CUAHSI Hydrologic Information System (HIS) provides web and desktop client access to hydrologic observations via water data web services using an XML schema called “WaterML”. The WaterML 1.x specification and the corresponding Water Data Services have been the backbone of the HIS service-oriented architecture (SOA) and have been adopted for serving hydrologic data by several federal agencies and many academic groups. The central discovery service, HIS Central, is based on a metadata catalog that references 4.7 billion observations, organized as 23 million data series from 1.5 million sites from 51 organizations. Observations data are published using HydroServer nodes that have been deployed at 18 organizations. Usage of HIS increased eightfold from 2008 to 2010, doubling from 1600 data series a day in 2009 to 3600 data series a day in the first half of 2010. The HIS Central metadata catalog currently harvests information from 56 Water Data Services. We collaborate on catalog updates with two federal partners, USGS and US EPA: their data series are periodically reloaded into the HIS metadata catalog. We are pursuing two main development directions in the HIS project: cloud-based computing, and further compliance with Open Geospatial Consortium (OGC) standards. The goal of moving to cloud computing is to provide a scalable collaborative system with simpler deployment and less dependence on hardware maintenance and staff. This move requires re-architecting the information models underlying the metadata catalog and Water Data Services to be independent of the underlying relational database model, allowing for implementation on both relational databases and cloud-based processing systems. Cloud-based HIS Central resources can be managed collaboratively; partners share responsibility for their metadata by publishing data series information into the centralized catalog.
Publishing data series will use REST-based service interfaces, like OData, as the basis for ingesting data series information into a cloud-hosted catalog. The future HIS services involve providing information via OGC standards that will allow for observational data access from commercial GIS applications. Use of standards will allow tools to access observational data from other projects, such as the Ocean Observatories Initiative, and tools from such projects to be integrated into the HIS toolset. With international collaborators, we have been developing a water information exchange language called “WaterML 2.0” which will be used to deliver observations data over OGC Sensor Observation Services (SOS). A software stack of OGC standard services will provide access to HIS information. In addition to SOS, Web Mapping and Feature Services (WMS and WFS) will provide access to location information. Catalog Services for the Web (CSW) will provide a catalog for water information that is both centralized and distributed. We intend the OGC standards to supplement the existing HIS service interfaces, rather than replace them. The ultimate goal of this development is to expand access to hydrologic observations data and to create an environment where these data can be seamlessly integrated with standards-compliant data resources.
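On the client side, consuming a WaterML-style time series response boils down to parsing timestamped values out of XML. The fragment below is a simplified, made-up document for illustration, not a conformant WaterML 1.x or 2.0 response.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: extract (timestamp, value) pairs from a WaterML-like
# time series document. The XML structure is simplified for illustration.
doc = """<timeSeries>
  <variable>discharge</variable>
  <values>
    <value dateTime="2010-06-01T00:00:00">12.4</value>
    <value dateTime="2010-06-01T01:00:00">13.1</value>
  </values>
</timeSeries>"""

root = ET.fromstring(doc)
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
print(series)
```

Real WaterML documents add namespaces, site and method metadata, and quality codes, but the access pattern (iterate value elements, read attributes and text) is the same.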
Construct Maps as a Foundation for Standard Setting
ERIC Educational Resources Information Center
Wyse, Adam E.
2013-01-01
Construct maps are tools that display how the underlying achievement construct upon which one is trying to set cut-scores is related to other information used in the process of standard setting. This article reviews what construct maps are, uses construct maps to provide a conceptual framework to view commonly used standard-setting procedures (the…
Jobs within a 30-minute transit ride - Service
This mapping service summarizes the total number of jobs that can be reached within 30 minutes by transit. EPA modeled accessibility via transit by calculating total travel time between block group centroids, inclusive of walking to/from transit stops, wait times, and transfers. Block groups that can be accessed in 30 minutes or less from the origin block group are considered accessible. Values reflect public transit service in December 2012 and employment counts in 2010. Coverage is limited to census block groups within metropolitan regions served by transit agencies that share their service data in a standardized format called GTFS. All variable names refer to variables in EPA's Smart Location Database. For instance, EmpTot10_sum summarizes total employment (EmpTot10) in block groups that are reachable within a 30-minute transit and walking commute. See the Smart Location Database User Guide for full variable descriptions.
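The aggregation described above — sum employment over block groups whose door-to-door travel time is under the threshold — can be sketched directly. Travel times and job counts below are invented sample values, and the block group IDs are placeholders.

```python
# Hedged sketch of the EmpTot10_sum aggregation: sum jobs (EmpTot10) in
# block groups reachable within a 30-minute transit + walking trip.
# All values are invented sample data.
travel_minutes = {  # origin -> {destination block group: minutes}
    "bg_001": {"bg_001": 0, "bg_002": 18, "bg_003": 29, "bg_004": 41},
}
emp_tot10 = {"bg_001": 500, "bg_002": 1200, "bg_003": 800, "bg_004": 3000}

def jobs_within(origin, threshold=30):
    return sum(emp_tot10[dest]
               for dest, t in travel_minutes[origin].items()
               if t <= threshold)

print(jobs_within("bg_001"))  # EmpTot10_sum for bg_001 → 2500
```

bg_004 is excluded because its 41-minute trip exceeds the threshold; the origin's own jobs count toward the total, matching the "30 minutes or less" rule.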
Skempes, Dimitrios; Bickenbach, Jerome
2015-09-24
Rehabilitation care is fundamental to health and human dignity and a human right enshrined in the United Nations Convention on the Rights of Persons with Disabilities. The provision of rehabilitation is important for reducing the need for formal support and enabling persons with disabilities to lead an independent life. Increasingly, scholars and advocacy groups voice concerns over the significant barriers facing people with disabilities in accessing appropriate and quality rehabilitation. A growing body of research highlights a "respond-need" gap in the provision of rehabilitation and assistive technologies and underscores the lack of indicators for assessing the performance of rehabilitation systems and monitoring States' compliance with human rights standards in rehabilitation service planning and programming. While research on human rights and health monitoring has increased exponentially over the last decade, far too little attention has been paid to rehabilitation services. The proposed research aims to reduce this knowledge gap by developing a human rights-based monitoring framework with indicators to support human rights accountability and performance assessment in rehabilitation. Concept mapping, a stakeholder-driven approach, will be used as the core method to identify rights-based indicators and develop the rehabilitation services monitoring framework. Concept mapping requires participants from various stakeholder groups to generate a list of potential indicators through online brainstorming, sort the indicators for conceptual similarity into clusters, and rate them against predefined criteria. Multidimensional scaling and hierarchical cluster analysis will be performed to develop the monitoring framework, while bridging analysis will provide useful insights about patterns of agreement or disagreement among participants' views on indicators.
This study has the potential to influence future practices on data collection and measurement of compliance with human rights standards in rehabilitation service delivery and organization. The development of a valid and universally applicable set of indicators will have a profound impact on the design and implementation of evidence informed disability policies and programs as it can support countries in strengthening performance measurement through documentation of comparative information on rehabilitation care systems. Most importantly, the resulting indicators can be used by disabled people's organizations as well as national and international institutions to define a minimal standard for monitoring and reporting progress on the implementation of the Convention on the Rights of Persons with Disabilities in the area of rehabilitation.
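The analysis step that precedes the multidimensional scaling and clustering described above is a co-occurrence (similarity) matrix built from participants' sorts. The indicators and sort data below are invented for illustration; real concept-mapping studies feed this matrix into MDS software.

```python
# Toy illustration of the concept-mapping similarity matrix: count how
# often each pair of indicators was sorted into the same cluster.
# Indicators and sorts are invented sample data.
indicators = ["waiting time", "staff training", "physical access", "funding"]
sorts = [  # each participant groups indicators they judge similar
    [{"waiting time", "physical access"}, {"staff training", "funding"}],
    [{"waiting time"}, {"staff training", "funding"}, {"physical access"}],
]

n = len(indicators)
similarity = [[0] * n for _ in range(n)]
for participant in sorts:
    for group in participant:
        for i, a in enumerate(indicators):
            for j, b in enumerate(indicators):
                if i != j and a in group and b in group:
                    similarity[i][j] += 1

print(similarity[1][3])  # "staff training" & "funding" co-sorted by 2 participants
```

Pairs with high co-occurrence end up close together under multidimensional scaling, and hierarchical clustering over the same matrix yields the indicator clusters of the monitoring framework.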
EnviroAtlas National Layers Master Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas), which allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The web service, produced by the US EPA, includes layers depicting EnviroAtlas national metrics mapped at the 12-digit HUC within the conterminous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Exploring NASA OMI Level 2 Data With Visualization
NASA Technical Reports Server (NTRS)
Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto
2014-01-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for characterizing extreme events (volcanic eruptions, dust storms, etc.). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science team recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our back-end server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.
Exploring NASA OMI Level 2 Data With Visualization
NASA Technical Reports Server (NTRS)
Wei, Jennifer C.; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vincente, Gilbert
2014-01-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for characterizing extreme events (volcanic eruptions, dust storms, etc.). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as images, with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science team recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our back-end server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.
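The science-team quality screening mentioned above is, at its core, bitmask filtering over per-pixel flags. The flag bit layout below is invented for illustration; each real OMI Level-2 product defines its own quality flag semantics.

```python
# Hedged sketch of pixel-level (Level-2) quality screening: keep retrievals
# whose quality-flag bits are clear. Bit meanings are invented for the sketch.
CLOUDY, BAD_GEOLOCATION = 0x1, 0x2

pixels = [
    {"no2": 2.1e15, "flags": 0x0},
    {"no2": 5.8e15, "flags": CLOUDY},
    {"no2": 1.4e15, "flags": 0x0},
    {"no2": 9.9e15, "flags": BAD_GEOLOCATION},
]

good = [p["no2"] for p in pixels
        if not (p["flags"] & (CLOUDY | BAD_GEOLOCATION))]
print(len(good))  # → 2 usable retrievals
```

Applying such screening before visualization is what lets the service render only science-quality pixels rather than every retrieval in the granule.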
Foster, Helen E; Minden, Kirsten; Clemente, Daniel; Leon, Leticia; McDonagh, Janet E; Kamphuis, Sylvia; Berggren, Karin; van Pelt, Philomine; Wouters, Carine; Waite-Jones, Jennifer; Tattersall, Rachel; Wyllie, Ruth; Stones, Simon R; Martini, Alberto; Constantin, Tamas; Schalm, Susanne; Fidanci, Berna; Erer, Burak; Demirkaya, Erkan; Ozen, Seza; Carmona, Loreto
2017-04-01
To develop standards and recommendations for transitional care for young people (YP) with juvenile-onset rheumatic and musculoskeletal diseases (jRMD). The consensus process involved the following: (1) establishing an international expert panel to include patients and representatives from multidisciplinary teams in adult and paediatric rheumatology; (2) a systematic review of published models of transitional care in jRMDs, potential standards and recommendations, strategies for implementation and tools to evaluate services and outcomes; (3) setting the framework, developing the process map and generating a first draft of standards and recommendations; (4) further iteration of recommendations; (5) establishing consensus recommendations with Delphi methodology and (6) establishing standards and quality indicators. The final consensus derived 12 specific recommendations for YP with jRMD focused on transitional care. These included: high-quality, multidisciplinary care starting in early adolescence; the integral role of a transition co-ordinator; transition policies and protocols; efficient communications; transfer documentation; an open electronic-based platform to access resources; appropriate training for paediatric and adult healthcare teams; secure funding to continue treatments and services into adult rheumatology and the need for increased evidence to inform best practice. These consensus-based recommendations inform strategies to reach optimal outcomes in transitional care for YP with jRMD based on available evidence and expert opinion. They need to be implemented in the context of individual countries, healthcare systems and regulatory frameworks.
NASA Astrophysics Data System (ADS)
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
2003-12-01
Modern laptop and personal computers can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this does not mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC community, we have implemented several computational and utility programs using a Web Service architecture.
We have hosted these Web Services as part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
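The observation that simple parameter types are easiest to expose as Web Services can be made concrete: a flat parameter set serializes directly into a SOAP-style XML body. The operation name and namespace below are illustrative, not the actual SCEC/CME service interface.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: flat parameters map one-to-one onto XML elements, which is
# why simple-signature programs are easy to wrap as Web Services.
# Operation name and namespace are invented for illustration.
def soap_body(operation, params, ns="http://example.org/scec"):
    env = ET.Element("Envelope")
    body = ET.SubElement(env, "Body")
    op = ET.SubElement(body, operation, {"xmlns": ns})
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

xml = soap_body("ConvertCoordinates", {"lat": 34.05, "lon": -118.25, "target": "UTM"})
print(xml)
```

Complex object graphs, by contrast, need custom (de)serialization rules, which is exactly the extra development effort the abstract reports.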
An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya
2016-04-01
Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-mapping application for hazard event mapping using Open Source WebGIS technologies such as the PostgreSQL database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of hazard event information; 3. centralized data storage accessible from any client (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlays of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. The data can be recorded in the offline (Android device) or online version (all browsers) and subsequently uploaded to the server whenever an internet connection is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to control access to the data for communicating the information.
This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazard, such as floods or avalanches. Keywords: Offline, Online, Open Source WebGIS, Android, Hazard Event Mapping
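The offline-first storage pattern described above can be sketched with a local database and a sync flag: events are recorded locally and uploaded when connectivity returns. The table and column names are illustrative, not the application's actual schema.

```python
import sqlite3

# Hedged sketch of offline-first event recording: rows carry a `synced`
# flag and are uploaded once an internet connection is available.
# Schema is invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE event (
    id INTEGER PRIMARY KEY,
    kind TEXT, lat REAL, lon REAL,
    synced INTEGER DEFAULT 0)""")

db.execute("INSERT INTO event (kind, lat, lon) VALUES (?, ?, ?)",
           ("landslide", 27.7, 85.3))

def pending_uploads(conn):
    """Events recorded offline that still await upload to the server."""
    return conn.execute("SELECT id, kind FROM event WHERE synced = 0").fetchall()

print(pending_uploads(db))
```

On the device the same idea is typically implemented with the platform's local storage (e.g. SQLite on Android), with the server marking rows synced after a successful POST.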
NASA Astrophysics Data System (ADS)
Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.
2010-12-01
Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. 
The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and supports data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download, and Google Earth visualizations of the data sets are provided through SDAT as well. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a roughly ten-fold increase in downloads through OGC Web services compared with the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
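A WMS GetMap call of the kind described above is just a parameterized HTTP request. The sketch below builds one with Python's standard library; the endpoint and layer name are hypothetical placeholders, not actual ORNL DAAC identifiers.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size=(512, 512),
                   fmt="image/png", crs="EPSG:4326"):
    """Build an OGC WMS 1.3.0 GetMap request URL (no request is sent)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        # WMS 1.3.0 with EPSG:4326 uses lat,lon axis order:
        # minLat,minLon,maxLat,maxLon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only
url = wms_getmap_url("https://example.org/wms", "biomass_2005",
                     bbox=(25.0, -125.0, 50.0, -65.0))
```

Fetching `url` would return a rendered map image, which is what lets a single archived data set be served in many formats and projections.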
Standard map in magnetized relativistic systems: fixed points and regular acceleration.
de Sousa, M C; Steffens, F M; Pakter, R; Rizzato, F B
2010-08-01
We investigate the concept of a standard map for the interaction of relativistic particles and electrostatic waves of arbitrary amplitudes, under the action of external magnetic fields. The map is adequate for physical settings where waves and particles interact impulsively, and allows a series of analytical results to be obtained exactly. Unlike the traditional form of the standard map, the present map is nonlinear in the wave amplitude and displays a series of peculiar properties. Among these properties we discuss the relation between fixed points of the map and accelerator regimes.
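For contrast, the traditional (Chirikov) standard map that the abstract alludes to is linear in the stochasticity parameter K:

```latex
% Traditional (Chirikov) standard map, linear in the kick amplitude K;
% shown for contrast with the wave-particle map of the abstract, which
% is nonlinear in the wave amplitude.
\begin{aligned}
  p_{n+1}      &= p_n + K \sin\theta_n,\\
  \theta_{n+1} &= \theta_n + p_{n+1} \pmod{2\pi}.
\end{aligned}
```

Its fixed points sit where \(\sin\theta^* = 0\); in the magnetized relativistic map the analogous fixed-point structure is what organizes the accelerator regimes discussed.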
US EPA Nonattainment Areas and Designations-SO2 (2010 NAAQS)
This web service contains the following layer: SO2 2010 NAAQS State Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2010SO21hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database, though an update does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, known as criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels. The methods and a
US EPA Nonattainment Areas and Designations-24 Hour PM2.5 (2006 NAAQS)
This web service contains the following layers: PM2.5 24hr 2006 NAAQS State Level and PM2.5 24hr 2006 NAAQS National. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2006PM2524hour/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database, though an update does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, known as criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels.
US EPA Nonattainment Areas and Designations-Annual PM2.5 (2012 NAAQS)
This web service contains the following layer: PM2.5 Annual 2012 NAAQS State Level. Full FGDC metadata records for each layer may be found by clicking the layer name at the web service endpoint (https://gispub.epa.gov/arcgis/rest/services/OAR_OAQPS/NAA2012PM25Annual/MapServer) and viewing the layer description. These layers identify areas in the U.S. where air pollution levels have not met the National Ambient Air Quality Standards (NAAQS) for criteria air pollutants and have been designated "nonattainment" areas (NAA). The data are updated weekly from an OAQPS internal database, though an update does not necessarily mean the data have changed. Under provisions of the Clean Air Act, which is intended to improve the quality of the air we breathe, the EPA Office of Air Quality Planning and Standards (OAQPS) is required to set National Ambient Air Quality Standards for six common air pollutants, known as criteria pollutants, which are found all over the United States: particle pollution (often referred to as particulate matter), ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. For each criteria pollutant, there are specific procedures for measuring ambient concentrations and for calculating long-term (quarterly or annual) and/or short-term (24-hour) exposure levels. The me
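ArcGIS REST MapServer endpoints like the ones cited above also expose a machine-readable query operation per layer. The sketch below builds such a request with the standard library only (no request is sent); the layer index 0 is an assumption, not a verified value, and attribute names would have to be taken from the layer description.

```python
from urllib.parse import urlencode

# Real endpoint pattern taken from the record above
ENDPOINT = ("https://gispub.epa.gov/arcgis/rest/services/"
            "OAR_OAQPS/NAA2012PM25Annual/MapServer")

def layer_query_url(layer_id, where="1=1", out_fields="*"):
    """Build a MapServer layer query URL returning GeoJSON."""
    params = {
        "where": where,           # SQL-style attribute filter
        "outFields": out_fields,  # attribute fields to return
        "f": "geojson",           # response format
    }
    return f"{ENDPOINT}/{layer_id}/query?{urlencode(params)}"

# Layer 0 is assumed to be the state-level nonattainment layer
url = layer_query_url(0)
```

Fetching `url` would return the nonattainment polygons and attributes as GeoJSON, which is how the weekly-updated layers can feed downstream analyses rather than only map images.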
Application of SCOPE-C to Measure Social Inclusion Among Mental Health Services Users in Hong Kong.
Chan, Kara; Chiu, Marcus Yu-Lung; Evans, Sherrill; Huxley, Peter J; Ng, Yu-Leung
2016-11-01
This study describes the construction of the Chinese version of the Social and Communities Opportunities Profile (SCOPE), henceforth the SCOPE-C, to measure social inclusion among mental health services users in Hong Kong. The SCOPE-C was developed based on concept-mapping and benchmarking of census questions. The questionnaire consisted of 56 items, went through a standardized linguistic validation process, and was pilot tested with qualitative feedback from five users of mental health services. Altogether 168 Chinese service users were recruited through various NGO mental health services for three rounds of face-to-face interviews between October 2013 and July 2014. Results indicated that items related to satisfaction with opportunities and perceived opportunities in various social domains had high consistency. Nearly all the Kappa statistics and Pearson correlation coefficients between the baseline and two rounds of re-test were significant. The SCOPE-C was considered a valid instrument for the Hong Kong mental health user population.
Activities report of PTT Research
NASA Astrophysics Data System (ADS)
In the field of postal infrastructure research, activities were performed on postcode readers, radiolabels, and techniques of operations research and artificial intelligence. In the field of telecommunication, transportation, and information, research was carried out on multipurpose coding schemes, speech recognition, hypertext, a multimedia information server, security of electronic data interchange, document retrieval, improvement of the quality of user interfaces, domotics (living-support techniques), and standardization of telecommunication protocols. In the field of telecommunication infrastructure and provisions research, activities were performed on universal personal telecommunications, advanced broadband network technologies, coherent techniques, measurement of audio quality, near-field facilities, local beam communication, local area networks, network security, coupling of broadband and narrowband integrated services digital networks, digital mapping, and standardization of protocols.
HTML5 microdata as a semantic container for medical information exchange.
Kimura, Eizen; Kobayashi, Shinji; Ishihara, Ken
2014-01-01
Achieving interoperability between clinical electronic medical records (EMR) systems and cloud computing systems is challenging because of the lack of a universal reference method as a standard for information exchange with a secure connection. Here we describe an information exchange scheme using HTML5 microdata, where the standard semantic container is an HTML document. We embed HL7 messages describing laboratory test results in the microdata. We also annotate items in the clinical research report with the microdata. We mapped the laboratory test result data into the clinical research report using an HL7 selector specified in the microdata. This scheme can provide secure cooperation between the cloud-based service and the EMR system.
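The scheme above hinges on machine-readable itemprop/itemscope attributes embedded in ordinary HTML. A minimal sketch of extracting such microdata with Python's standard-library parser follows; the fragment and property names are illustrative placeholders, not the authors' actual HL7 vocabulary.

```python
from html.parser import HTMLParser

class MicrodataParser(HTMLParser):
    """Collect (itemprop, text) pairs from an HTML5 microdata fragment."""
    def __init__(self):
        super().__init__()
        self._prop = None
        self.items = {}

    def handle_starttag(self, tag, attrs):
        # Remember the itemprop name, if any, until its text arrives
        self._prop = dict(attrs).get("itemprop")

    def handle_data(self, data):
        if self._prop and data.strip():
            self.items[self._prop] = data.strip()
            self._prop = None

# Illustrative fragment; itemtype and property names are hypothetical
html = ('<div itemscope itemtype="http://example.org/LabResult">'
        '<span itemprop="testName">Hemoglobin</span>'
        '<span itemprop="value">13.5</span></div>')

parser = MicrodataParser()
parser.feed(html)
```

A consumer on the cloud side could use the same traversal to pull HL7-derived values out of a report page without any format-specific parser.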
Dorel, Mathurin; Viara, Eric; Barillot, Emmanuel; Zinovyev, Andrei; Kuperstein, Inna
2017-01-01
Human diseases such as cancer are routinely characterized by high-throughput molecular technologies, and multi-level omics data accumulate in public databases at an increasing rate. Retrieval and visualization of these data in the context of molecular network maps can provide insights into the pattern of regulation of molecular functions reflected by an omics profile. To make this task easy, we developed NaviCom, a Python package and web platform for visualization of multi-level omics data on top of biological network maps. NaviCom bridges the gap between cBioPortal, the most widely used resource of large-scale cancer omics data, and NaviCell, a data visualization web service that contains several molecular network map collections. NaviCom offers several standardized modes of data display on top of molecular network maps, allowing specific biological questions to be addressed. We illustrate how users can easily create interactive network-based cancer molecular portraits via the NaviCom web interface using the maps of the Atlas of Cancer Signalling Network (ACSN) and other maps. Analysis of these molecular portraits can help in formulating a scientific hypothesis on the molecular mechanisms deregulated in the studied disease. NaviCom is available at https://navicom.curie.fr. © The Author(s) 2017. Published by Oxford University Press.
37. Photo copy of map, (original in Forest Service Office, ...
37. Photocopy of map (original in Forest Service Office, Elkins, WV, 'Blister Rust Survey Map'), 1930. PARSONS NURSERY SITE PLAN - Parsons Nursery, South side of U.S. Route 219, Parsons, Tucker County, WV
NASA Astrophysics Data System (ADS)
Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro
2017-04-01
Access to environmental information via web viewers using map services (OGC or proprietary) has become more common, since new information sources (orthophotos, LIDAR, GPS) are highly detailed and generate volumes of data that can hardly be disseminated in either analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are increasingly concerned with facilitating access to research results and improving communication about natural hazards to citizens and stakeholders. If adequately disseminated, this information is crucial in decision-making processes and risk management approaches, and can help increase social awareness of environmental issues (particularly climate change impacts). To this end, two strategies for wide dissemination and communication of the results achieved in the calculation of beach erosion for the 640 km of the Andalusian coast (southern Spain), both based on web viewer technology, are presented. Each is oriented to different end users and thus based on different methodologies. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a National Research Project on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed for the general public (citizens, policy-makers, etc.), combining a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS) within a user-friendly interface.
Furthermore, the WMS services (implemented on GeoServer) provide a detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent the data) which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended for technicians and specialists in the field, includes a geoviewer with an innovative profile (visualization of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is used to guarantee all the functionality of the basic geoviewer. In addition, the viewer has been improved by (i) the generation of services on request through a filter in ECQL (Extended Common Query Language), using the GeoServer vendor parameter CQL_FILTER; these dynamic filters allow the final user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the possibilities for real-time cartography; and (ii) the use of the layer's WFS service, through which the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g., mean rates, length of eroded coast) and interactive graphs (via the Highcharts.js library) that help in interpreting beach erosion rates (trend lines and bar diagrams, among others). As a result, two approaches for communicating scientific results to different audiences, based on web viewers with complete sets of geo-information, services and functionalities, are implemented. The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to and interpretation of the information.
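The dynamic filtering described above relies on GeoServer's CQL_FILTER vendor parameter appended to an otherwise standard WMS request. A minimal sketch of composing such a filtered request in Python; the endpoint, layer, and attribute names are hypothetical, not the project's actual ones.

```python
from urllib.parse import urlencode

def filtered_wms_url(endpoint, layer, cql_filter, bbox, size=(768, 512)):
    """WMS 1.1.1 GetMap with GeoServer's CQL_FILTER vendor parameter."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(map(str, bbox)),  # minx,miny,maxx,maxy in 1.1.1
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
        "CQL_FILTER": cql_filter,  # ECQL predicate evaluated server-side
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical layer and attributes: show only strongly eroding
# transects for one analysis period
url = filtered_wms_url(
    "https://example.org/geoserver/wms", "coast:erosion_rates",
    "period = '2001-2011' AND rate < -0.5",
    bbox=(-7.6, 35.9, -1.6, 38.8))
```

Because the filter is just a request parameter, the client can rebuild it from UI controls (period selector, rate threshold), which is what makes the "real-time cartography" of the second geoviewer possible.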
Exploiting Aura OMI Level 2 Data with High Resolution Visualization
NASA Astrophysics Data System (ADS)
Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.
2014-12-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits are best achieved when satellite data are well utilized and interpreted, for example as model inputs or in the interpretation of extreme events (such as volcanic eruptions or dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with images, including accurate pixel-level (Level 2) information, pixel coverage area delineation, and science-team-recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) always strives to best support the user community for NASA Earth science data (i.e., Software-as-a-Service, SaaS). Here we present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls in the backend infrastructure. The service allows users to select data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter aggregated by different methods, like NO2 in the OMNO2G and OMNO2D products), define areas of interest and temporal extents, and zoom, pan, overlay, slide, subset, and reformat the data. The interface will also be able to connect to other OGC WMS and WCS servers, greatly enhancing its expandability to integrate additional outside data/map sources (such as the Global Imagery Browse Services (GIBS)).
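Subsetting of the kind described is what WCS adds over WMS: instead of a rendered image, the client trims the underlying coverage along each axis. A sketch of a WCS 2.0.1 GetCoverage request with spatial trims; the endpoint and coverage identifier are placeholders, not actual GES DISC names.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(endpoint, coverage_id, lat, lon,
                        fmt="application/netcdf"):
    """Build a WCS 2.0.1 GetCoverage request with Lat/Long trims."""
    # A sequence of pairs is used so the SUBSET key can repeat,
    # one occurrence per trimmed axis
    params = [
        ("SERVICE", "WCS"),
        ("VERSION", "2.0.1"),
        ("REQUEST", "GetCoverage"),
        ("COVERAGEID", coverage_id),
        ("SUBSET", f"Lat({lat[0]},{lat[1]})"),
        ("SUBSET", f"Long({lon[0]},{lon[1]})"),
        ("FORMAT", fmt),
    ]
    return endpoint + "?" + urlencode(params)

# Hypothetical coverage name, for illustration only
url = wcs_getcoverage_url("https://example.org/wcs", "OMI_NO2_L2",
                          lat=(30, 45), lon=(-110, -90))
```

The response would be actual data values in the requested format, which is why WCS (rather than WMS) backs the pixel-level exploration the tool offers.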
Using Ecological Asset Mapping to Investigate Pre-Service Teachers' Cultural Assets
ERIC Educational Resources Information Center
Borrero, Noah; Yeh, Christine
2016-01-01
We examined the impact of a pedagogical strategy, ecological asset mapping, on 19 pre-service teachers' self-exploration, development of respect for others, and critical examination of social injustice. Data were analyzed from participants' ecological asset maps and essays describing the experience of completing and sharing the maps. The analysis…
Journey Mapping the User Experience
ERIC Educational Resources Information Center
Samson, Sue; Granath, Kim; Alger, Adrienne
2017-01-01
This journey-mapping pilot study was designed to determine whether journey mapping is an effective method to enhance the student experience of using the library by assessing our services from their point of view. Journey mapping plots a process or service to produce a visual representation of a library transaction--from the point at which the…
Climate Prediction Center - Expert Assessments Index
Climate Prediction Center web resources and services. HOME > Monitoring and Data > Global Climate Data & Maps > Regional Climate Maps. The monthly regional analyses products are
IOOS Data Portals and Uniform On-line Browse Capabilities
NASA Astrophysics Data System (ADS)
Howard, M.; Currier, R. D.; Kobara, S.; Gayanilo, F.
2015-12-01
The Gulf of Mexico Coastal Ocean Observing System Regional Association (GCOOS-RA) is one of eleven Regional Associations organized under the NOAA-led U.S. Integrated Ocean Observing System (IOOS) Program Office. Each of the RAs operates standards-based regional data portals designed to aggregate near-real-time and historical observed data and modeled outputs from distributed providers and to offer these and derived products in standardized ways to a diverse set of users. The RAs' portals are based on the IOOS Data and Communications Plan, which describes the functional elements needed for an interoperable system. One of these elements, called "Uniform On-line Browse", is an informational service designed primarily to visualize the inventory of a portal. An on-line browse service supports end users' need to discover what parameters are available, to learn the spatial and temporal extent of the holdings, and to examine the character of the data (e.g., variability, gappiness). These pieces of information help end users decide whether the data are fit for their purpose and construct valid data requests. Note that on-line browse is a distinctly different activity from data analysis, because it seeks to yield knowledge about the inventory rather than about what the data mean. "Uniform" on-line browse is a service that takes advantage of the standardization of the data portal's data access points. Most portals represent station locations on a map. This is a view of the data inventory, but these plots are rarely generated by pulling data through the standards-based services offered to end users; instead they use methods available only to the portal programmers. This work presents Uniform On-line Browse tools developed within the GCOOS-RA and their applicability to other RA portals.
Toward digital geologic map standards: a progress report
Ulrech, George E.; Reynolds, Mitchell W.; Taylor, Richard B.
1992-01-01
Establishing modern scientific and technical standards for geologic maps and their derivative map products is vital to both producers and users of such maps as we move into an age of digital cartography. Application of earth-science data in complex geographic information systems, acceleration of geologic map production, and reduction of publication costs require that national standards be developed for digital geologic cartography and computer analysis. Since December 1988, under commission of the Chief Geologist of the U.S. Geological Survey and the mandate of the National Geologic Mapping Program (with added representation from the Association of American State Geologists), a committee has been designing a comprehensive set of scientific map standards. Three primary issues were: (1) selecting scientific symbology and its digital representation; (2) creating an appropriate digital coding system that characterizes geologic features with respect to their physical properties, stratigraphic and structural relations, spatial orientation, and interpreted mode of origin; and (3) developing mechanisms for reporting levels of certainty for descriptive as well as measured properties. Approximately 650 symbols for geoscience maps, reflecting present usage of the U.S. Geological Survey, state geological surveys, industry, and academia, have been identified and tentatively adopted. A proposed coding system comprises four-character groupings of major and minor codes that can identify all attributes of a geologic feature. Such a coding system allows unique identification of as many as 10^5 geologic names and values on a given map. The new standard will track closely the latest developments of the Proposed Standard for Digital Cartographic Data soon to be submitted to the National Institute of Standards and Technology by the Federal Interagency Coordinating Committee on Digital Cartography.
This standard will adhere generally to the accepted definitions and specifications for spatial data transfer. It will require separate specifications of digital cartographic quality relating to positional accuracy and ranges of measured and interpreted values such as geologic age and rock composition. Provisional digital geologic map standards will be published for trial implementation. After approximately two years, when comments on the proposed standards have been solicited and modifications made, formal adoption of the standards will be recommended. Widespread acceptance of the new standards will depend on their applicability to the broadest range of earth-science map products and their adaptability to changing cartographic technology.
Place and provision: mapping mental health advocacy services in London.
Foley, Ronan; Platzer, Hazel
2007-02-01
The National Health Service (NHS) Executive for London carried out an investigation in 2002, as part of its wider mental health strategy, to establish whether existing mental health advocacy provision in the city was meeting need. The project took a two-part approach, with an emphasis on (a) mapping the provision of advocacy services and (b) cartographic mapping of service location and catchments. Data were collected through a detailed questionnaire with service providers, in collaboration with the Greater London Mental Health Advocacy Network (GLMHAN), and from additional health and government sources. The service mapping identified key statistics on funding, caseloads and models of service provision, with an additional emphasis on coverage, capacity, and funding stability. The questionnaire was augmented by interviews and focus groups with commissioners, service providers and service users, which identified the differing perspectives and problems of each of these groups. The cartographic mapping exercise demonstrated a spatially even provision of mental health advocacy services across the city, with each borough being served by at least one local service as well as by London-wide specialist schemes. However, at the local level, no one borough had the full range of specialist provision to match local demographic need. Ultimately the research assisted the Advisory Group in providing commissioning agencies with clear information on the current status of city-wide mental health advocacy services, and on gaps in existing advocacy provision, alongside previously unconsidered geographical and service dimensions of that provision.
Fiscal mapping autism spectrum disorder funds: a case study of Ohio.
Joyce, Hilary D; Hoffman, Jill; Anderson-Butcher, Dawn; Moodie-Dyer, Amber
2014-01-01
Individuals with autism spectrum disorders (ASDs) have complex needs requiring regular service utilization. Policymakers, administrators, and community leaders are looking for ways to finance ASD services and systems. Understanding the fiscal resources that support ASD services is essential. This article uses fiscal mapping to explore ASD funding streams in Ohio. Fiscal mapping steps are overviewed to assist ASD stakeholders in identifying and examining ASD-related funding. Implications are drawn related to how fiscal mapping could be used to identify and leverage funding for ASD services. The resulting information is critical to utilizing existing resources, advocating for resources, and leveraging available funds.
A GIS application for assessing, mapping, and quantifying the social values of ecosystem services
Sherrouse, Benson C.; Clement, Jessica M.; Semmens, Darius J.
2011-01-01
As human pressures on ecosystems continue to increase, research involving the effective incorporation of social values information into the context of comprehensive ecosystem services assessments is becoming more important. Including quantified, spatially explicit social value metrics in such assessments will improve the analysis of relative tradeoffs among ecosystem services. This paper describes a GIS application, Social Values for Ecosystem Services (SolVES), developed to assess, map, and quantify the perceived social values of ecosystem services by deriving a non-monetary Value Index from responses to a public attitude and preference survey. SolVES calculates and maps the Value Index for social values held by various survey subgroups, as distinguished by their attitudes regarding ecosystem use. Index values can be compared within and among survey subgroups to explore the effect of social contexts on the valuation of ecosystem services. Index values can also be correlated and regressed against landscape metrics SolVES calculates from various environmental data layers. Coefficients derived through these analyses were applied to their corresponding data layers to generate a predicted social value map. This map compared favorably with other SolVES output and led to the addition of a predictive mapping function to SolVES for value transfer to areas where survey data are unavailable. A more robust application is being developed as a public domain tool for decision makers and researchers to map social values of ecosystem services and to facilitate discussions among diverse stakeholders involving relative tradeoffs among different ecosystem services in a variety of physical and social contexts.
Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...
Mapping ecosystem services in the St. Louis River estuary (presentation)
Management of ecosystems for sustainable provision of services beneficial to human communities requires reliable data about where in the ecosystem services originate. Our objective is to map ecosystem services in the St. Louis River with the overarching EPA goal of community sust...
A Servicewide Benthic Mapping Program for National Parks
Moses, Christopher S.; Nayegandhi, Amar; Beavers, Rebecca; Brock, John
2010-01-01
In 2007, the National Park Service (NPS) Inventory and Monitoring Program directed the initiation of a benthic habitat mapping program in ocean and coastal parks in alignment with the NPS Ocean Park Stewardship 2007-2008 Action Plan. With 74 ocean and Great Lakes parks stretching over more than 5,000 miles of coastline across 26 States and territories, this Servicewide Benthic Mapping Program (SBMP) is essential. This program will deliver benthic habitat maps and their associated inventory reports to NPS managers in a consistent, servicewide format to support informed management and protection of 3 million acres of submerged National Park System natural and cultural resources. The NPS and the U.S. Geological Survey (USGS) convened a workshop June 3-5, 2008, in Lakewood, Colo., to discuss the goals and develop the design of the NPS SBMP with an assembly of experts (Moses and others, 2010) who identified park needs and suggested best practices for inventory and mapping of bathymetry, benthic cover, geology, geomorphology, and some water-column properties. The recommended SBMP protocols include servicewide standards (such as gap analysis, minimum accuracy, final products) as well as standards that can be adapted to fit network and park unit needs (for example, minimum mapping unit, mapping priorities). SBMP Mapping Process. The SBMP calls for a multi-step mapping process for each park, beginning with a gap assessment and data mining to determine data resources and needs. An interagency announcement of intent to acquire new data will provide opportunities to leverage partnerships. Prior to new data acquisition, all involved parties should be included in a scoping meeting held at network scale. Data collection will be followed by processing and interpretation, and finally expert review and publication. After publication, all digital materials will be archived in a common format. SBMP Classification Scheme. 
The SBMP will map using the Coastal and Marine Ecological Classification Standard (CMECS) that is being modified to include all NPS needs, such as lacustrine ecosystems and submerged cultural resources. CMECS Version III (Madden and others, 2010) includes components for water column, biotic cover, surface geology, sub-benthic, and geoform. SBMP Data Archiving. The SBMP calls for the storage of all raw data and final products in common-use data formats. The concept of 'collect once, use often' is essential to efficient use of mapping resources. Data should also be shared with other agencies and the public through various digital clearing houses, such as Geospatial One-Stop (http://gos2.geodata.gov/wps/portal/gos). To be most useful for managing submerged resources, the SBMP advocates the inventory and mapping of the five components of marine ecosystems: surface geology, biotic cover, geoform, sub-benthic, and water column. A complete benthic inventory of a park would include maps of bathymetry and the five components of CMECS. The completion of mapping for any set of components, such as bathymetry and surface geology, or a particular theme (for example, submerged aquatic vegetation) should also include a printed report.
Climate Data Service in the FP7 EarthServer Project
NASA Astrophysics Data System (ADS)
Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Grazia Veratelli, Maria
2013-04-01
EarthServer is a European Framework Programme project that aims at developing and demonstrating the usability of open standards (OGC and W3C) in the management of multi-source, any-size, multi-dimensional spatio-temporal data - in short: "Big Earth Data Analytics". To demonstrate the feasibility of the approach, six thematic Lighthouse Applications (Cryospheric Science, Airborne Science, Atmospheric/Climate Science, Geology, Oceanography, and Planetary Science), each with 100+ TB of data, are implemented. The scope of the Atmospheric/Climate lighthouse application (Climate Data Service) is to implement a system containing global to regional 2D/3D/4D datasets retrieved from satellite observations, numerical modelling, and in-situ observations. Data contained in the Climate Data Service comprise atmospheric profiles of temperature/humidity, aerosol content, AOT, and cloud properties provided by entities such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the Austrian meteorological service (Zentralanstalt für Meteorologie und Geodynamik - ZAMG), the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), and the Swedish Meteorological and Hydrological Institute (Sveriges Meteorologiska och Hydrologiska Institut - SMHI). Through an easy-to-use web application, the system permits users to browse the loaded data, visualize their temporal evolution at a specific point with 2D graphs of a single field or compare different fields at the same point (e.g., temperatures from different models and satellite observations), and visualize maps of specific fields superimposed on high-resolution background maps. All data access and display operations are performed by means of OGC standard operations, namely WMS, WCS and WCPS.
The EarthServer project has just entered the second year of its three-year development plan: at present the system contains subsets of the final database, with the aim of demonstrating the I/O modules and visualization tools. By the end of the project, all datasets will be available to users.
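The OGC key-value request pattern that the abstract names (WMS/WCS/WCPS) can be sketched as a WCS 2.0 GetCoverage URL with axis subsets. The endpoint and coverage name below are invented for illustration; the abstract does not give the project's actual service URLs.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(base_url, coverage_id, subsets, fmt="application/netcdf"):
    """Build an OGC WCS 2.0 GetCoverage request URL.

    `subsets` maps axis names to (low, high) bounds, e.g. trimming a
    temperature coverage to a lat/lon window before download.
    """
    params = [
        ("service", "WCS"),
        ("version", "2.0.1"),
        ("request", "GetCoverage"),
        ("coverageId", coverage_id),
        ("format", fmt),
    ]
    for axis, (lo, hi) in subsets.items():
        # One subset parameter per trimmed axis, e.g. subset=Lat(35,60)
        params.append(("subset", f"{axis}({lo},{hi})"))
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and coverage name, for illustration only.
url = wcs_getcoverage_url(
    "https://example.org/rasdaman/ows",
    "temperature_profile",
    {"Lat": (35, 60), "Long": (-10, 30)},
)
print(url)
```

The same key-value style carries over to the WMS and WCPS requests the service uses; only the `request` parameter and its arguments change.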
Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery
NASA Astrophysics Data System (ADS)
Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.
2018-04-01
The increasing availability of satellite data is of real value for enhancing environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services, like Google Maps or OpenStreetMap, give access to up-to-date optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, there are many failings in the general utilization of, and access to, those maps by non-specialized users. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.
Mapping public policy on genetics.
Weisfeld, N E
2002-06-01
The mapping of the human genome and related advances in genetics are stimulating the development of public policies on genetics. Certain notions that currently prevail in public policy development overall--including the importance of protecting privacy of information, an interest in cost-effectiveness, and the power of the anecdote--will help determine the future of public policy on genetics. Information areas affected include discrimination by insurers and employers, confidentiality, genetic databanks, genetic testing in law enforcement, and court-ordered genetic testing in civil cases. Service issues address clinical standards, insurance benefits, allocation of resources, and screening of populations at risk. Supply issues encompass funding of research and clinical positions. Likely government actions include, among others: (1) Requiring individual consent for the disclosure of personal information, except when such consent would impose inordinate costs; (2) licensing genetic databases; (3) allowing courts to use personal information in cases where a refusal to use such information would offend the public; (4) mandating health insurers to pay for cost-effective genetic services; (5) funding pharmaceutical research to develop tailored products to prevent or treat diseases; and (6) funding training programs.
Re-inventing Data Libraries: Ensuring Continuing Access To Curated (Value-added) Data
NASA Astrophysics Data System (ADS)
Burnhill, P.; Medyckyj-Scott, D.
2008-12-01
How many years of inexperience do we need in using, and in particular sharing, digital data generated by others? That history pre-dates, but must also gain leverage from, the emergence of the digital library. Much of this sharing was done within research groups, but recent attention to spatial data infrastructure highlights the importance of achieving several 'right mixes': * between Internet standards, geo-specific referencing, and domain-specific vocabulary (cf. ontology); * between attention to user-focused services and machine-to-machine interoperability; * between the demands of current high-quality services, the practice of data curation, and the need for long-term preservation. This presentation will draw upon ideas and experience from data library services in research universities, a national (UK) academic data centre, and developments in digital curation. It will be argued that the 1980s term 'data library' has some polemic value in that we have yet to learn what it means to 'do library' for data: more than "a bit like inter-galactic library loan", perhaps. Illustration will be drawn from a multi-faceted database of digitized boundaries (UKBORDERS), through the first Internet map delivery of national mapping agency data (Digimap), to strategic positioning to help geo-enable academic and scientific data and so enhance research (in the UK, in Europe, and beyond).
ERDDAP - An Easier Way for Diverse Clients to Access Scientific Data From Diverse Sources
NASA Astrophysics Data System (ADS)
Mendelssohn, R.; Simons, R. A.
2008-12-01
ERDDAP is a new open-source, web-based service that aggregates data from other web services: OPeNDAP grid servers (THREDDS), OPeNDAP sequence servers (Dapper), NOS SOAP service, SOS (IOOS, OOStethys), microWFS, DiGIR (OBIS, BMDE). Regardless of the data source, ERDDAP makes all datasets available to clients via standard (and enhanced) DAP requests and makes some datasets accessible via WMS. A client's request also specifies the desired format for the results, e.g., .asc, .csv, .das, .dds, .dods, htmlTable, XHTML, .mat, netCDF, .kml, .png, or .pdf (formats more directly useful to clients). ERDDAP interprets a client request, requests the data from the data source (in the appropriate way), reformats the data source's response, and sends the result to the client. Thus ERDDAP makes data from diverse sources available to diverse clients via standardized interfaces. Clients don't have to install libraries to get data from ERDDAP because ERDDAP is RESTful and resource-oriented: a URL completely defines a data request and the URL can be used in any application that can send a URL and receive a file. This also makes it easy to use ERDDAP in mashups with other web services. ERDDAP could be extended to support other protocols. ERDDAP's hub and spoke architecture simplifies adding support for new types of data sources and new types of clients. ERDDAP includes metadata management support, catalog services, and services to make graphs and maps.
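The RESTful idea described above - a URL that completely defines a data request, with the file-type extension selecting the response format - can be sketched as follows. The server address and dataset ID are hypothetical; only the griddap URL shape (dataset, extension, variable, per-dimension constraints) follows ERDDAP's documented pattern.

```python
def erddap_griddap_url(server, dataset_id, variable, constraints, file_type="csv"):
    """Assemble an ERDDAP griddap request URL.

    `constraints` is an ordered list of (low, high) bounds, one per grid
    dimension; pass high=None to pick a single coordinate value. The
    file_type extension (.csv, .nc, .png, ...) selects the output format.
    """
    dims = "".join(
        f"[({lo}):({hi})]" if hi is not None else f"[({lo})]"
        for lo, hi in constraints
    )
    return f"{server}/griddap/{dataset_id}.{file_type}?{variable}{dims}"

# Hypothetical server and dataset ID, for illustration only.
url = erddap_griddap_url(
    "https://example.org/erddap",
    "sst_8day",
    "sea_surface_temperature",
    [("2008-01-01", None), ("30.0", "40.0"), ("220.0", "230.0")],
)
print(url)
```

Because the entire request is one URL, it can be pasted into a browser, a curl call, or any mashup that can fetch a file, which is the point the abstract makes about clients needing no special libraries.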
SWOT analysis on National Common Geospatial Information Service Platform of China
NASA Astrophysics Data System (ADS)
Zheng, Xinyan; He, Biao
2010-11-01
Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product services to a service centered on NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, given the emerging requirements of e-government construction, the remarkable development of IT technology, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services and APIs for further development to government, business, and the public, is now the strategic core of SBSM (the State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis to compare it with the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.
Mapping ecosystem services in the St. Louis River Estuary
Sustainable management of ecosystems for the perpetual flow of services beneficial to human communities requires reliable data about where in the ecosystem services flow from. Our objective is to map ecosystem services in the St. Louis River with the overarching U.S. EPA goal of ...
Geologic map of the Valjean Hills 7.5' quadrangle, San Bernardino County, California
Calzia, J.P.; Troxel, Bennie W.; digital database by Raumann, Christian G.
2003-01-01
FGDC-compliant metadata for the ARC/INFO coverages. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3 above) or plotting the postscript file (2 above).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
...-0064; 4500030114] RIN 1018-AZ68 Endangered and Threatened Wildlife and Plants; Critical Habitat Map for... U.S. Fish and Wildlife Service (Service), are correcting the critical habitat map for the fountain... and the general public have an accurate critical habitat map for the species. This action does not...
Topographical Hill Shading Map Production Based Tianditu (map World)
NASA Astrophysics Data System (ADS)
Wang, C.; Zha, Z.; Tang, D.; Yang, J.
2018-04-01
TIANDITU (Map World) is the public version of the National Platform for Common Geospatial Information Service, and the terrain service is an important channel for users on the platform. With the development of TIANDITU, topographical hill-shading map production for providing and updating the global terrain map online has become necessary, given hill shading's strong intuitiveness, three-dimensional appearance, and aesthetic effect. As such, the terrain service of TIANDITU focuses on displaying topographical data globally at different scales. This paper mainly researches a method of global topographical hill-shading map production using DEM (Digital Elevation Model) data at display scales from about 1:140,000,000 to 1:4,000,000, corresponding to display levels 2 to 7 on the TIANDITU website.
Web mapping system for complex processing and visualization of environmental geospatial datasets
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor
2016-04-01
Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, big dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered complexes of interconnected software tools for working with geospatial data. In the report, a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. The GIS web client comprises three basic tiers: 1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format. 2. A tier of JavaScript objects implementing methods handling NetCDF metadata, the task XML object for configuring user calculations and input/output formats, and OGC WMS/WFS cartographical services. 3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.).
The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object, and WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include procedures such as JSON metadata downloading and updating; launching and tracking calculation tasks running on remote servers; and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of popular desktop GIS applications such as uDIG and QuantumGIS. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
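A minimal sketch of what one record in the JSON metadata tier described above might look like. The field names and values are invented; the report only lists the kinds of technical information stored (spatio-temporal resolution, meteorological parameters, valid processing methods).

```python
import json

# One hypothetical dataset-description record, as the central repository
# might serve it to the JavaScript middleware tier.
record = {
    "dataset": "reanalysis_subset",          # invented identifier
    "format": "NetCDF",
    "spatial_resolution_deg": 0.75,
    "temporal_resolution": "6h",
    "parameters": ["air_temperature", "precipitation"],
    "processing_methods": ["time_mean", "anomaly", "trend"],
}

payload = json.dumps(record)     # what the repository would transmit
restored = json.loads(payload)   # what the client-side tier would parse
print(restored["parameters"])
```

The GUI tier can then populate menus (available parameters, valid processing methods) directly from such records without hard-coding dataset knowledge into the client.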
Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet
NASA Astrophysics Data System (ADS)
Geilhausen, M.; Otto, J.-C.
2012-04-01
With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute both to the dissemination of geomorphological maps and to access to geomorphological data, and help to make geomorphological knowledge available to a greater public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread and have increased interest in, and access to, mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by different capabilities, such as zooming, panning or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources, and their integration into a desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example for map overlays or insets.
Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF enables fundamental GIS functionality turning the formerly static PDF map into an interactive, portable georeferenced PDF map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today indicating that the Internet has become a medium for displaying geographical information in rich forms and user-friendly interfaces. So, why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards a global dissemination of geomorphological information. This will be exemplified by live demonstrations of i.) existing geomorphological WebGIS applications, ii.) data merging from various sources using web map services, and iii.) free to download GeoPDF maps during the presentations.
Bagstad, Kenneth J.; Villa, Ferdinando; Batker, David; Harrison-Cox, Jennifer; Voigt, Brian; Johnson, Gary W.
2014-01-01
Until recently, ecosystem services mapping and modeling focused more on supply than demand. Whereas the potential provision of economic benefits from ecosystems to people is often quantified through ecological production functions, the use of and demand for ecosystem services has received less attention, as have the spatial flows of services from ecosystems to people. However, new modeling approaches that map and quantify service-specific sources (ecosystem capacity to provide a service), sinks (biophysical or anthropogenic features that deplete or alter service flows), users (user locations and level of demand), and spatial flows can provide a more complete understanding of ecosystem services. Through a case study in Puget Sound, Washington State, USA, we quantify and differentiate between the theoretical or in situ provision of services, i.e., ecosystems’ capacity to supply services, and their actual provision when accounting for the location of beneficiaries and the spatial connections that mediate service flows between people and ecosystems. Our analysis includes five ecosystem services: carbon sequestration and storage, riverine flood regulation, sediment regulation for reservoirs, open space proximity, and scenic viewsheds. Each ecosystem service is characterized by different beneficiary groups and means of service flow. Using the ARtificial Intelligence for Ecosystem Services (ARIES) methodology, we map service supply, demand, and flow, extending the simpler approaches used by past studies to map service provision and use. With the exception of the carbon sequestration service, regions that actually provided services to people, i.e., connected to beneficiaries via flow paths, amounted to 16-66% of those theoretically capable of supplying services, i.e., all ecosystems across the landscape.
These results offer a more complete understanding of the spatial dynamics of ecosystem services and their effects, and may provide a sounder basis for economic valuation and policy applications than studies that consider only theoretical service provision and/or use.
Mapping ecosystem services for land use planning, the case of Central Kalimantan.
Sumarga, Elham; Hein, Lars
2014-07-01
Indonesia is subject to rapid land use change. One of the main causes for the conversion of land is the rapid expansion of the oil palm sector. Land use change involves a progressive loss of forest cover, with major impacts on biodiversity and global CO2 emissions. Ecosystem services have been proposed as a concept that would facilitate the identification of sustainable land management options; however, the scale of land conversion and its spatial diversity pose particular challenges in Indonesia. The objective of this paper is to analyze how ecosystem services can be mapped at the provincial scale, focusing on Central Kalimantan, and to examine how ecosystem services maps can be used for land use planning. Central Kalimantan is subject to rapid deforestation, including the loss of peatland forests, and the province still lacks a comprehensive land use plan. We examine how seven key ecosystem services can be mapped and modeled at the provincial scale, using a variety of models, and how large-scale ecosystem services maps can support the identification of options for sustainable expansion of palm oil production.
Namibia Dashboard Enhancements
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Handy, Matthew
2014-01-01
The purpose of this presentation is for a Technical Interchange Meeting with the Namibia Hydrological Services (NHS) in Namibia. The meeting serves as a capacity building exercise. This presentation goes over existing software functionality developed in collaboration with NHS over the past five years, called the Namibia Flood Dashboard. Furthermore, it outlines new functionality developed over the past year and future functionality that will be developed. The main purpose of the Dashboard is to assist in decision support for flood warning. The Namibia Flood Dashboard already exists online in a cloud environment and has been used in prototype mode for the past few years. Functionality in the Dashboard includes river gauge hydrographs, TRMM rainfall estimates, EO-1 flood maps, infrastructure maps and other related functions. Future functionality includes attempting to integrate interoperability standards and crowd-sourcing capability. To this end, we are adding OpenStreetMap compatibility and an Applications Program Interface (API) called a GeoSocial API to enable discovery and sharing of data products useful for decision support via social media.
U.S. Geological Survey spatial data access
Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.
2002-01-01
The U.S. Geological Survey (USGS) has done a progress review on improving access to its spatial data holdings over the Web. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public; they are Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth’s features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land-cover Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user’s areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. 
In addition, the map portal posts news about new map services available from the USGS, many simultaneously published on the Environmental Systems Research Institute Geography Network. These three information systems use new software tools and expanded hardware to meet the requirements of the users. The systems are designed to handle the required workload and are relatively easy to enhance and maintain. The software tools give users a high level of functionality and help the system conform to industry standards. The hardware and software architecture is designed to handle the large amounts of spatial data and Internet traffic required by the information systems. Last, customer support was needed to answer questions, monitor e-mail, and report customer problems.
NASA Astrophysics Data System (ADS)
Rhodes, C. R.; Sinha, P.; Amanda, N.
2013-12-01
In recent years the gap between what scientists know and what policymakers should appreciate in environmental decision making has received more attention, as the costs of the disconnect have become more apparent to both groups. Particularly for water-related policies, the EPA's Office of Water has struggled with benefit estimates held low by the inability to quantify ecological and economic effects that theory, modeling, and anecdotal or isolated case evidence suggest may prove to be larger. Better coordination with ecologists and hydrologists is being explored as a solution. The ecosystem services (ES) concept, now nearly two decades old, links ecosystem functions and processes to the human value system. But there remains no clear mapping of which ecosystem goods and services affect which individual or economic values. The National Ecosystem Services Classification System (NESCS, 'nexus') project brings together ecologists, hydrologists, and social scientists to do this mapping for aquatic and other ecosystem service-generating systems. The objective is to greatly reduce the uncertainty in water-related policy making by mapping and ultimately quantifying the various functions and products of aquatic systems, as well as how changes to aquatic systems impact the human economy and individual levels of non-monetary appreciation for those functions and products. Primary challenges to fostering interaction between scientists, social scientists, and policymakers are the lack of a common vocabulary, and the need for a cohesive, comprehensive framework that organizes concepts across disciplines and accommodates scientific data from a range of sources. NESCS builds the vocabulary and the framework so both may inform a scalable transdisciplinary policy-making application.
This talk presents for discussion the process and progress in developing both this vocabulary and a classifying framework capable of bridging the gap between a newer but existing ecosystem services classification system, and a standardized industrial classification system. Our goal is to model then predict the effects of a policy choice on the environment, from impacts on ecological components and processes all the way through to endpoints in the human value chain.
A Different Web-Based Geocoding Service Using Fuzzy Techniques
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.
2015-12-01
Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers, such as Google Maps, Bing Maps and Yahoo Maps, present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. To implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions, and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides several capabilities for users, such as the ability to search multi-part addresses, search for places based on their location, represent results as non-point features, and display search results according to their priority.
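The fuzzy-nearness idea can be sketched as follows. The paper does not specify its membership functions or overlay operator, so the linear membership and the minimum (fuzzy AND) overlay below are illustrative assumptions, as are the distance thresholds.

```python
def nearness(distance_m, full_m=200.0, zero_m=2000.0):
    """Linear fuzzy membership for 'near': 1 within full_m metres,
    fading linearly to 0 at zero_m metres (thresholds are assumed)."""
    if distance_m <= full_m:
        return 1.0
    if distance_m >= zero_m:
        return 0.0
    return (zero_m - distance_m) / (zero_m - full_m)

def fuzzy_overlay(memberships):
    """Combine per-place fuzzy distance maps with the minimum (AND)
    operator: a location is 'near A and near B' only to the degree it
    is near both."""
    return min(memberships)

# One cell 500 m from one reference place and 800 m from another
# (illustrative numbers): its combined nearness score.
score = fuzzy_overlay([nearness(500.0), nearness(800.0)])
print(round(score, 3))
```

Evaluating this overlay for every cell of a grid yields the integrated fuzzy distance map from which ranked, non-point search results can be drawn.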
Managing Vocabulary Mapping Services
Che, Chengjian; Monson, Kent; Poon, Kasey B.; Shakib, Shaun C.; Lau, Lee Min
2005-01-01
The efficient management and maintenance of large-scale and high-quality vocabulary mapping is an operational challenge. The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) group developed an information management system to provide controlled mapping services, resulting in improved efficiency and quality maintenance. PMID:16779203
47 CFR 73.4108 - FM transmitter site map submissions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 4 2011-10-01 2011-10-01 false FM transmitter site map submissions. 73.4108 Section 73.4108 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.4108 FM transmitter site map...
Exploring NASA Satellite Data with High Resolution Visualization
NASA Astrophysics Data System (ADS)
Wei, J. C.; Yang, W.; Johnson, J. E.; Shen, S.; Zhao, P.; Gerasimov, I. V.; Vollmer, B.; Vicente, G. A.; Pham, L.
2013-12-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, for example as model inputs or for interpreting extreme events (such as volcanic eruptions or dust storms). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by providing satellite data as 'images' with accurate pixel-level (Level 2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We will present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting various visualization and data access capabilities for satellite Level 2 data (non-aggregated and un-gridded) at high spatial resolution. Functionality will include selecting data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from the Ozone Monitoring Instrument (OMI), or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), defining area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting and reformatting. The portal interface will connect to the backend services with OGC standard-compliant Web Map Service (WMS) and Web Coverage Service (WCS) calls. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data/map sources.
Global mapping of ecosystem services and conservation priorities
Naidoo, R.; Balmford, A.; Costanza, R.; Fisher, B.; Green, R. E.; Lehner, B.; Malcolm, T. R.; Ricketts, T. H.
2008-01-01
Global efforts to conserve biodiversity have the potential to deliver economic benefits to people (i.e., “ecosystem services”). However, regions for which conservation benefits both biodiversity and ecosystem services cannot be identified unless ecosystem services can be quantified and valued and their areas of production mapped. Here we review the theory, data, and analyses needed to produce such maps and find that data availability allows us to quantify imperfect global proxies for only four ecosystem services. Using this incomplete set as an illustration, we compare ecosystem service maps with the global distributions of conventional targets for biodiversity conservation. Our preliminary results show that regions selected to maximize biodiversity provide no more ecosystem services than regions chosen randomly. Furthermore, spatial concordance among different services, and between ecosystem services and established conservation priorities, varies widely. Despite this lack of general concordance, “win–win” areas—regions important for both ecosystem services and biodiversity—can be usefully identified, both among ecoregions and at finer scales within them. An ambitious interdisciplinary research effort is needed to move beyond these preliminary and illustrative analyses to fully assess synergies and trade-offs in conserving biodiversity and ecosystem services. PMID:18621701
Towards an EO-based Landslide Web Mapping and Monitoring Service
NASA Astrophysics Data System (ADS)
Hölbling, Daniel; Weinke, Elisabeth; Albrecht, Florian; Eisank, Clemens; Vecchiotti, Filippo; Friedl, Barbara; Kociu, Arben
2017-04-01
National and regional authorities and infrastructure maintainers in mountainous regions require accurate knowledge of the location and spatial extent of landslides for hazard and risk management. Information on landslides is often collected by a combination of ground surveying and manual image interpretation following landslide-triggering events. However, the high workload and limited time for data acquisition result in a trade-off between completeness, accuracy and detail. Remote sensing data offer great potential for mapping and monitoring landslides in a fast and efficient manner. Despite the increasing availability of high-quality Earth Observation (EO) data and new computational methods, there is still a lack of science-policy interaction and of innovative tools and methods that stakeholders and users can easily apply to support their daily work. Taking up this issue, we introduce an innovative and user-oriented EO-based web service for landslide mapping and monitoring. Three central design components of the service are presented: (1) the user requirements definition, (2) the semi-automated image analysis methods implemented in the service, and (3) the web mapping application with its responsive user interface. User requirements were gathered during semi-structured interviews with regional authorities. The potential users were asked if and how they employ remote sensing data for landslide investigation and what their expectations of a landslide web mapping service are regarding reliability and usability. The interviews revealed the capability of our service for landslide documentation and mapping as well as monitoring of selected landslide sites, for example to complete and update landslide inventory maps. In addition, the users see considerable potential for landslide rapid mapping. The user requirements analysis served as the basis for the service concept definition.
Optical satellite imagery from different high resolution (HR) and very high resolution (VHR) sensors, e.g. Landsat, Sentinel-2, SPOT-5, WorldView-2/3, was acquired for different study areas in the Alps. Object-based image analysis (OBIA) methods were used for semi-automated mapping of landslides. Selected mapping routines and results, including a step-by-step guidance, are integrated in the service by means of a web processing chain. This allows the user to gain insights into the service idea, the potential of semi-automated mapping methods, and the applicability of various satellite data for specific landslide mapping tasks. Moreover, an easy-to-use and guided classification workflow, which includes image segmentation, statistical classification and manual editing options, enables users to perform their own analyses. For validation, the classification results can be downloaded or compared against uploaded reference data using the implemented tools. Furthermore, users can compare the classification results to freely available data such as OpenStreetMap to identify landslide-affected infrastructure (e.g. roads, buildings). They can also upload infrastructure data available at their organization for specific assessments, or monitor the evolution of selected landslides over time. Further actions will include the validation of the service in collaboration with stakeholders, decision makers and experts, which is essential to produce landslide information products that can assist the targeted management of natural hazards, and the evaluation of the potential towards the development of an operational Copernicus downstream service.
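The guided classification workflow includes an image segmentation step; a toy sketch of that idea, assuming a simple brightness threshold and 4-connected region labeling (real OBIA systems use multiresolution segmentation on multispectral imagery, so this is illustrative only):

```python
from collections import deque

def segment(raster, threshold):
    """Label 4-connected regions of cells whose value exceeds a threshold.

    A toy stand-in for the segmentation step of an OBIA workflow: each
    labeled region is a candidate image object for later classification.
    """
    rows, cols = len(raster), len(raster[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if raster[r][c] > threshold and labels[r][c] == 0:
                current += 1                      # start a new region
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:                      # flood-fill the region
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and raster[ny][nx] > threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return current, labels

# Two bright patches separated by dark cells -> two candidate objects.
toy = [[9, 9, 0, 0],
       [9, 0, 0, 8],
       [0, 0, 8, 8]]
n_objects, label_map = segment(toy, threshold=5)
```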
NASA Astrophysics Data System (ADS)
Kattoula, Ehsan Habib
Recent reform efforts in science education have culminated in National Science Education Standards (NSES), which include the nature of science and science inquiry themes across all grade levels. Consideration must be given to pre-service science teachers' nature of science conceptions and their perceived roles in implementing the nature of science in the science classroom. This qualitative study investigates how pre-service science teachers' views about the nature of science develop and change when learning a college physics unit on waves in an urban university. The study uses case study methodology with four pre-service science teachers as individual units of analysis. Data regarding the participants' views about the nature of science were collected before and after the instruction on the physics of waves unit. The research design used the Views of Nature of Science/Views of Scientific Inquiry-Physics (VNOS/VOSI-PHYS) questionnaire, followed by structured interviews throughout the wave unit. In addition, the participants responded to daily questions that incorporated nature of science themes and constructed concept maps regarding the physics content and their nature of science understanding. After completing the VNOS/VOSI-PHYS questionnaire, the pre-service science teachers' views of the nature of science were found to be mainly naive and transitional before the instruction. At the end of the wave unit instruction, the data indicated that conceptual change occurred in participants' nature of science views, shifting toward informed views. The findings of this study provide evidence that using explicit instruction with specific activities, such as experiments and concept mapping, shifted the pre-service science teachers' views away from naive and toward informed.
Distributed spatial information integration based on web service
NASA Astrophysics Data System (ADS)
Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng
2008-10-01
Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the fact that many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
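The dispatcher pattern described above can be sketched with Python's concurrent.futures; the two map services here are simulated local functions standing in for asynchronous WMS calls, and the returned "images" are plain dictionaries of pixel values:

```python
from concurrent.futures import ThreadPoolExecutor

# Simulated web map services: each returns a "layer" mapping pixel -> value.
# In the paper's setting these would be asynchronous WMS GetMap calls.
def roads_service(bbox):
    return {(0, 0): "road", (1, 1): "road"}

def rivers_service(bbox):
    return {(0, 1): "river", (2, 2): "river"}

def dispatch(services, bbox):
    """Invoke all registered map services in parallel and overlay results.

    Later layers are drawn on top of earlier ones, mirroring the
    transparent image overlay performed by the dispatcher.
    """
    with ThreadPoolExecutor() as pool:
        layers = list(pool.map(lambda s: s(bbox), services))
    composite = {}
    for layer in layers:          # overlay in registration order
        composite.update(layer)
    return composite

overlaid = dispatch([roads_service, rivers_service], bbox=(0, 0, 10, 10))
```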
Research on the construction of three level customer service knowledge graph
NASA Astrophysics Data System (ADS)
Cheng, Shi; Shen, Jiajie; Shi, Quan; Cheng, Xianyi
2017-09-01
With the explosion of enterprise knowledge and information and the growing demand for intelligent knowledge management and applications to improve business performance, knowledge representation and processing in the enterprise has become a hot topic. Aiming at problems in the theory and methods of building an electric-marketing customer service knowledge map, this paper discusses a three-level customer service knowledge map and realizes knowledge reasoning based on Neo4j, achieving good results in practical application.
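A minimal sketch of a customer service knowledge graph with reasoning by traversal; the level names (category, question, answer) and all node names are assumptions for illustration, since the abstract does not publish the schema, and the in-memory triple list stands in for the Neo4j database used by the actual system:

```python
# A toy three-level customer-service knowledge graph stored as triples.
triples = [
    ("Billing", "HAS_QUESTION", "Why is my bill high?"),
    ("Billing", "HAS_QUESTION", "How do I read my meter?"),
    ("Why is my bill high?", "HAS_ANSWER", "Check peak-hour usage rates."),
    ("How do I read my meter?", "HAS_ANSWER", "Read the black digits left to right."),
]

def answers_for_category(category):
    """Two-hop traversal: category -> questions -> answers.

    Equivalent in spirit to a Cypher query such as
    MATCH (c)-[:HAS_QUESTION]->(q)-[:HAS_ANSWER]->(a) RETURN a.
    """
    questions = [o for s, p, o in triples
                 if s == category and p == "HAS_QUESTION"]
    return [o for s, p, o in triples
            if s in questions and p == "HAS_ANSWER"]

found = answers_for_category("Billing")
```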
Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury.
Liu, Wei; Soderlund, Karl; Senseney, Justin S; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B; Liu, Tian; Wang, Yi; Oakes, Terrence R; Riedy, Gerard
2016-02-01
To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multiecho gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded from further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping-derived quantitative measures of microhemorrhages also decreased over time: -0.85 mm(3) per day ± 1.59 for total volume (P = .039) and -0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016).
The number of microhemorrhages and quantitative susceptibility mapping-derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. © RSNA, 2015.
ERIC Educational Resources Information Center
Snell, Robin Stanley; Chan, Maureen Yin Lee; Ma, Carol Hok Ka; Chan, Carman Ka Man
2015-01-01
We present a road map for providing course-embedded service-learning team projects as opportunities for undergraduates to practice as service leaders in Asia and beyond. Basic foundations are that projects address authentic problems or needs, partner organization representatives (PORs) indicate availability for ongoing consultation, students…
Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become stra...
ERIC Educational Resources Information Center
Fleming, Padraic; McGilloway, Sinead; Barry, Sarah
2017-01-01
Background: Day services for people with intellectual disabilities are experiencing a global paradigm shift towards innovative person-centred models of care. This study maps changing trends in day service utilization to highlight how policy, emergent patterns and demographic trends influence service delivery. Methods: National intellectual…
How to change GEBCO outreach activities with Information technologies?
NASA Astrophysics Data System (ADS)
Chang, E.; Park, K.
2014-12-01
Since 1995, when the National Geographic Information Project began, Korea has made great advances both in mapping itself and in information services for the Earth's surface, whether as paper maps or online map services. By reviewing current geological and mine-related information services and comparing demands, a GEBCO outreach master plan has been prepared. Information services cannot be separated from data production and dissemination policies. We discuss the potential impact on GEBCO maps of changes in information technologies such as mobile services, data fusion, and big data. Lower costs and higher performance in data services will stimulate more information services; it is therefore necessary to handle the data in a more customer-oriented way. Through a questionnaire, we can identify potential needs for GEBCO products in various respects, such as education and accessibility. The gap between experts and non-experts will decrease thanks to digital services from private and public organizations, such as international academic societies, since research funds and policies tend to pursue "openness" and "interoperability" among the domains. Some background on why and how to prepare outreach activities in GEBCO will be shown.
Quantifying and Mapping Habitat-Based Biodiversity Metrics Within an Ecosystem Services Framework
Ecosystem services have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with econom...
Tatem, Andrew J; Campbell, James; Guerra-Arias, Maria; de Bernis, Luc; Moran, Allisyn; Matthews, Zoë
2014-01-04
The health and survival of women and their new-born babies in low income countries has been a key priority in public health since the 1990s. However, basic planning data, such as numbers of pregnancies and births, remain difficult to obtain and information is also lacking on geographic access to key services, such as facilities with skilled health workers. For maternal and newborn health and survival, planning for safer births and healthier newborns could be improved by more accurate estimations of the distributions of women of childbearing age. Moreover, subnational estimates of projected future numbers of pregnancies are needed for more effective strategies on human resources and infrastructure, while there is a need to link information on pregnancies to better information on health facilities in districts and regions so that coverage of services can be assessed. This paper outlines demographic mapping methods based on freely available data for the production of high resolution datasets depicting estimates of numbers of people, women of childbearing age, live births and pregnancies, and distribution of comprehensive EmONC facilities in four large high burden countries: Afghanistan, Bangladesh, Ethiopia and Tanzania. Satellite derived maps of settlements and land cover were constructed and used to redistribute areal census counts to produce detailed maps of the distributions of women of childbearing age. Household survey data, UN statistics and other sources on growth rates, age specific fertility rates, live births, stillbirths and abortions were then integrated to convert the population distribution datasets to gridded estimates of births and pregnancies. These estimates, which can be produced for current, past or future years based on standard demographic projections, can provide the basis for strategic intelligence, planning services, and provide denominators for subnational indicators to track progress. 
The datasets produced are part of national midwifery workforce assessments conducted in collaboration with the respective Ministries of Health and the United Nations Population Fund (UNFPA) to identify disparities between population needs, health infrastructure and workforce supply. The datasets are available to the respective Ministries as part of the UNFPA programme to inform midwifery workforce planning and also publicly available through the WorldPop population mapping project.
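The core redistribution step described above, spreading an areal census count over grid cells in proportion to satellite-derived settlement weights and converting the result to pregnancies, can be sketched as follows; the weights and the pregnancy rate are illustrative assumptions, not values from the study:

```python
def redistribute(census_count, weights):
    """Spread one administrative unit's census count over its grid cells
    in proportion to settlement weights derived from satellite mapping."""
    total = sum(weights.values())
    return {cell: census_count * w / total for cell, w in weights.items()}

# Hypothetical settlement weights for four grid cells in one district;
# cell c1 is unsettled and so receives no population.
weights = {"c1": 0.0, "c2": 1.0, "c3": 3.0, "c4": 1.0}
women_15_49 = redistribute(10000, weights)   # women of childbearing age

# Convert to expected annual pregnancies with an illustrative rate.
pregnancy_rate = 0.15   # pregnancies per woman per year (assumed)
pregnancies = {cell: n * pregnancy_rate for cell, n in women_15_49.items()}
```

In the actual workflow the weights come from satellite-derived settlement and land cover maps, and the rates from household surveys and UN statistics.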
Minecraft® on Demand - A new IGN service which combines game and 3D cartography
NASA Astrophysics Data System (ADS)
Lecordix, François; Fremont, David; Jilani, Moez; Séguin, Emmanuel; Kriat, Sofiane
2018-05-01
The French national mapping agency, Institut national de l'information géographique et forestière (IGN), decided to develop a new web service, called Minecraft on Demand (www.ign.fr/Minecraft), designed to provide Minecraft maps from the geographic data that IGN produces. This free web service enables the user to select the center of the map and to get a Minecraft world 5 km long and 5 km wide, at the scale 1:1. The player can easily load this map into Minecraft, the world's most popular video game with 121 million copies sold. Launched in June 2016 in France, the Minecraft on Demand service has met with fair success (10,000 maps downloaded), particularly among young people, as it may enable them to discover IGN data and geography.
The Heliophysics Integrated Observatory
NASA Astrophysics Data System (ADS)
Csillaghy, A.; Bentley, R. D.
2009-12-01
HELIO is a new Europe-wide, FP7-funded distributed network of services that will address the needs of a broad community of researchers in heliophysics. This new research field explores the “Sun-Solar System Connection” and requires the joint exploitation of solar, heliospheric, magnetospheric and ionospheric observations. HELIO will provide the most comprehensive integrated information system in this domain; it will coordinate access to the distributed resources needed by the community, and will provide access to services to mine and analyse the data. HELIO will be designed as a Service-oriented Architecture. The initial infrastructure will include services based on metadata and data servers deployed by the European Grid of Solar Observations (EGSO). We will extend these to address observations from all the disciplines of heliophysics; differences in the way the domains describe and handle the data will be resolved using semantic mapping techniques. Processing and storage services will allow the user to explore the data and create the products that meet stringent standards of interoperability. These capabilities will be orchestrated with the data and metadata services using the Taverna workflow tool. HELIO will address the challenges along the FP7 I3 activities model: (1) Networking: we will cooperate closely with the community to define new standards for heliophysics and the required capabilities of the HELIO system. (2) Services: we will integrate the services developed by the project and other groups to produce an infrastructure that can easily be extended to satisfy the growing and changing needs of the community. (3) Joint Research: we will develop search tools that span disciplinary boundaries and explore new types of user-friendly interfaces. HELIO will be a key component of a worldwide effort to integrate heliophysics data and will coordinate closely with international organizations to exploit synergies with complementary domains.
Tile prediction schemes for wide area motion imagery maps in GIS
NASA Astrophysics Data System (ADS)
Michael, Chris J.; Lin, Bruce Y.
2017-11-01
Wide-area surveillance, traffic monitoring, and emergency management are just several of many applications benefiting from the incorporation of Wide-Area Motion Imagery (WAMI) maps into geographic information systems. Though the use of motion imagery as a GIS base map via the Web Map Service (WMS) standard is not a new concept, effectively streaming imagery is particularly challenging due to its large scale and the multidimensionally interactive nature of clients that use WMS. Ineffective streaming from a server to one or more clients can unnecessarily overwhelm network bandwidth and cause frustratingly large amounts of latency in visualization to the user. Seamlessly streaming WAMI through GIS requires good prediction to accurately guess the tiles of the video that will be traversed in the near future. In this study, we present an experimental framework for such prediction schemes by presenting a stochastic interaction model that represents a human user's interaction with a GIS video map. We then propose several algorithms by which the tiles of the stream may be predicted. Results collected both within the experimental framework and using human analyst trajectories show that, though each algorithm thrives under certain constraints, the novel Markovian algorithm yields the best results overall. Furthermore, we make the argument that the proposed experimental framework is sufficient for the study of these prediction schemes.
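A first-order Markovian tile predictor in the spirit of the one evaluated above can be sketched as follows; this is an illustrative reconstruction based on tile-transition counts, not the authors' exact algorithm:

```python
from collections import defaultdict

class MarkovTilePredictor:
    """First-order Markov predictor: count observed tile-to-tile transitions
    and prefetch the most frequent successors of the current tile."""

    def __init__(self):
        # counts[a][b] = number of times the client moved from tile a to b
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, trajectory):
        """Learn from one client trajectory (a sequence of tile keys)."""
        for a, b in zip(trajectory, trajectory[1:]):
            self.counts[a][b] += 1

    def predict(self, tile, k=2):
        """Return up to k tiles most likely to be requested next."""
        successors = self.counts[tile]
        return sorted(successors, key=successors.get, reverse=True)[:k]

# Tiles keyed as (col, row, zoom). Trajectories of a user panning east:
p = MarkovTilePredictor()
p.observe([(0, 0, 3), (1, 0, 3), (2, 0, 3), (3, 0, 3)])
p.observe([(1, 0, 3), (2, 0, 3), (3, 0, 3)])
p.observe([(2, 0, 3), (2, 1, 3)])
likely_next = p.predict((2, 0, 3), k=1)
```

The server would prefetch (or prioritize streaming of) the predicted tiles to hide latency from the interactive client.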
NASA Astrophysics Data System (ADS)
Hagemeier-Klose, M.; Wagner, K.
2009-04-01
Flood risk communication with the general public and the population at risk is becoming increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Floods Directive. The flood-related authorities therefore have to develop adjusted information tools which meet the demands of different user groups. This article presents the formative evaluation of flood hazard maps and web mapping services against the specific requirements and needs of the general public, using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of different methods: an analysis of existing tools, a creative workshop with experts and laymen, and an online survey. Currently existing flood hazard maps, web mapping services and web GIS applications still lack a good balance between simplicity and complexity, with adequate readability and usability for the public. Well-designed and associative maps (e.g. using blue colours for water depths), which can be compared with past local flood events and which can create empathy in viewers, can help to raise awareness, to heighten the activity and knowledge level, or can lead to further information seeking. Concerning web mapping services, a linkage between general flood information, like flood extents of different scenarios and corresponding water depths, and real-time information, like gauge levels, is an important user demand. Gauge levels of these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.
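The linkage users asked for, matching a real-time gauge level to the precomputed flood scenario whose extent a web map should display, can be sketched as a simple threshold lookup; the trigger levels and scenario names here are hypothetical:

```python
# Hypothetical scenario table: gauge level (cm) at which each precomputed
# flood extent becomes relevant, ordered by increasing severity.
scenarios = [
    (350, "HQ10 extent"),       # roughly a 10-year flood
    (450, "HQ100 extent"),      # roughly a 100-year flood
    (520, "HQextreme extent"),
]

def scenario_for_gauge(level_cm):
    """Return the most severe scenario whose trigger level is reached,
    so the web map can display the matching flood extent layer."""
    matched = None
    for trigger, name in scenarios:
        if level_cm >= trigger:
            matched = name
    return matched

current = scenario_for_gauge(470)
```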
DOT National Transportation Integrated Search
1997-07-14
These standards represent a guideline for preparing digital data for inclusion in the National Pipeline Mapping System Repository. The standards were created with input from the pipeline industry and government agencies. They address the submission o...
The quality of free antenatal and delivery services in Northern Sierra Leone.
Koroma, Manso M; Kamara, Samuel S; Bangura, Evelyn A; Kamara, Mohamed A; Lokossou, Virgil; Keita, Namoudou
2017-07-12
The number of maternal deaths in sub-Saharan Africa continues to be overwhelmingly high. In West Africa, Sierra Leone leads the list, with the highest maternal mortality ratio. In 2010, financial barriers were removed as an incentive for more women to use available antenatal, delivery and postnatal services. Few published studies have examined the quality of free antenatal services and access to emergency obstetric care in Sierra Leone. A cross-sectional survey was conducted in 2014 in all 97 peripheral health facilities and three hospitals in Bombali District, Northern Region. One hundred antenatal care providers were interviewed, 276 observations were made and 486 pregnant women were interviewed. We assessed the adequacy of antenatal and delivery services provided using national standards. The distance was calculated between each facility providing delivery services and the nearest comprehensive emergency obstetric care (CEOC) facility, and the proportion of facilities in a chiefdom within 15 km of each CEOC facility was also calculated. A thematic map was developed to show inequities. The quality of services was poor. Based on national standards, only 27% of women were examined, 2% were screened on their first antenatal visit and 47% received interventions as recommended. Although 94% of facilities provided delivery services, a minority had delivery rooms (40%), delivery kits (42%) or potable water (46%). Skilled attendants supervised 35% of deliveries, and in only 35% of these were processes adequately documented. None of the five basic emergency obstetric care facilities were fully compliant with national standards, and the central and northernmost parts of the district had the least access to comprehensive emergency obstetric care. The health sector needs to monitor the quality of antenatal interventions in addition to measuring coverage.
The quality of delivery services is compromised by poor infrastructure, inadequate skilled staff, stock-outs of consumables, non-functional basic emergency obstetric care facilities, and geographic inequities in access to CEOC facilities. These findings suggest that the health sector needs to urgently investigate continuing inequities adversely influencing the uptake of these services, and explore more sustainable funding mechanisms. Without this, the country is unlikely to achieve its goal of reducing maternal deaths.
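The geographic access analysis above rests on facility-to-facility distances; a sketch using the haversine great-circle formula, with hypothetical coordinates rather than the study's actual facility locations:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def share_within(facilities, ceoc_sites, max_km=15.0):
    """Proportion of delivery facilities within max_km of any CEOC facility."""
    near = sum(
        1 for f in facilities
        if any(haversine_km(*f, *c) <= max_km for c in ceoc_sites)
    )
    return near / len(facilities)

# Hypothetical (lat, lon) points loosely placed in northern Sierra Leone.
ceoc = [(8.88, -12.04)]
clinics = [(8.90, -12.05), (8.95, -12.00), (9.50, -12.30)]
coverage = share_within(clinics, ceoc)   # two of three within 15 km
```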
Construct Maps: A Tool to Organize Validity Evidence
ERIC Educational Resources Information Center
McClarty, Katie Larsen
2013-01-01
The construct map is a promising tool for organizing the data standard-setting panelists interpret. The challenge in applying construct maps to standard-setting procedures will be the judicious selection of data to include within this organizing framework. Therefore, this commentary focuses on decisions about what to include in the construct map.…
Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...
Bonnet, F; Solignac, S; Marty, J
2008-03-01
The purpose of benchmarking is to establish improvement processes by comparing activities against quality standards. The proposed methodology is illustrated by benchmark business cases performed in healthcare facilities on items such as nosocomial infections or the organization of surgical facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced scorecard figures and mappings, so that comparison between different anaesthesia and intensive care services willing to start an improvement programme is easy and relevant. This ready-made application is all the more accurate where detailed tariffs of activities are implemented.
Genome Maps, a new generation genome browser.
Medina, Ignacio; Salavert, Francisco; Sanchez, Rubén; de Maria, Alejandro; Alonso, Roberto; Escobar, Pablo; Bleda, Marta; Dopazo, Joaquín
2013-07-01
Genome browsers have gained importance as more genomes and related genomic information become available. However, the increase of information brought about by new generation sequencing technologies is, at the same time, causing a subtle but continuous decrease in the efficiency of conventional genome browsers. Here, we present Genome Maps, a genome browser that implements an innovative model of data transfer and management. The program uses highly efficient technologies from the new HTML5 standard, such as scalable vector graphics, that optimize workloads at both server and client sides and ensure future scalability. Thus, data management and representation are entirely carried out by the browser, without the need of any Java Applet, Flash or other plug-in technology installation. Relevant biological data on genes, transcripts, exons, regulatory features, single-nucleotide polymorphisms, karyotype and so forth, are imported from web services and are available as tracks. In addition, several DAS servers are already included in Genome Maps. As a novelty, this web-based genome browser allows the local upload of huge genomic data files (e.g. VCF or BAM) that can be dynamically visualized in real time at the client side, thus facilitating the management of medical data affected by privacy restrictions. Finally, Genome Maps can easily be integrated in any web application by including only a few lines of code. Genome Maps is an open source collaborative initiative available in the GitHub repository (https://github.com/compbio-bigdata-viz/genome-maps). Genome Maps is available at: http://www.genomemaps.org.
Coming Full Circle in Standard Setting: A Commentary on Wyse
ERIC Educational Resources Information Center
Skaggs, Gary
2013-01-01
The construct map is a particularly good way to approach instrument development, and this author states that he was delighted to read Adam Wyse's thoughts about how to use construct maps for standard setting. For a number of popular standard-setting methods, Wyse shows how typical feedback to panelists fits within a construct map framework.…
VegScape: U.S. Crop Condition Monitoring Service
NASA Astrophysics Data System (ADS)
mueller, R.; Yang, Z.; Di, L.
2013-12-01
Since 1995, the US Department of Agriculture (USDA)/National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late season crops. With growing emphasis on extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new vegetation condition portal, named VegScape, was launched at the start of the 2013 growing season. VegScape delivers interactive vegetation indices through web mapping services. Users can explore, query and disseminate current crop conditions via an interactive map. Vegetation indices such as the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), and mean, median, and ratio comparisons to prior years can be constructed for analytical purposes and on-demand crop statistics. NASA MODIS imagery, with 250-meter (15-acre) resolution and thirteen years of data history, provides improved spatial and temporal resolution and delivers detailed, timely (i.e., daily) crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape delivers an agricultural cultivated crop mask and the most recent Cropland Data Layer (CDL) product to exploit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be directly exported to Google Earth for web mashups or delivered via web mapping services for use in other applications.
VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. [Figure: VegScape Ratio to Median NDVI]
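The web mapping service delivery described above can be illustrated with a short sketch of composing an OGC WMS 1.3.0 GetMap request; the endpoint and layer name below are hypothetical placeholders, not VegScape's actual service addresses:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Compose an OGC WMS 1.3.0 GetMap request URL.

    bbox is (minx, miny, maxx, maxy); note that in WMS 1.3.0 the
    axis order for EPSG:4326 is latitude first.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only
url = wms_getmap_url("https://example.org/wms", "ndvi_current",
                     (36.0, -104.0, 43.0, -95.0))
```

A real client would discover valid layer names and coordinate systems from the service's GetCapabilities document rather than hard-coding them.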
Emergency Response Damage Assessment using Satellite Remote Sensing Data
NASA Astrophysics Data System (ADS)
Clandillon, Stephen; Yésou, Hervé; Schneiderhan, Tobias; de Boissezon, Hélène; de Fraipont, Paul
2013-04-01
During disasters, rescue and relief organisations need quick access to reliable and accurate information to be better equipped to do their job. It is increasingly recognised that satellites offer a unique near real time (NRT) tool to aid disaster management. A short introduction to the International Charter 'Space and Major Disasters', in operation since 2000 and promoting worldwide cooperation among member space agencies, will be given, as it is the foundation on which satellite-based emergency response damage assessment has been built. Other complementary mechanisms will also be discussed. The user-access triggering mechanism, an essential component of this user-driven service, will be highlighted, with its 24/7 single access point. Then, a clear distinction will be made between data provision and geo-information delivery mechanisms to underline the user need for geo-information that is easily integrated into their working environments. Briefly, the path to assured emergency response product quality will be presented, beginning with user requirements, expressed early on, for emergency response value-adding services. Initiatives were then established, supported by national and European institutions, to develop the sector, with SERTIT and DLR being key players, providing support to decision makers in headquarters and relief teams in the field. To consistently meet the high quality levels demanded by users, rapid mapping has been transformed via workflow and quality control standardisation to improve both speed and quality. As such, SERTIT, located in Alsace, France, and DLR/ZKI, from Bavaria, Germany, join their knowledge in this presentation to report on recent standards, as both have ISO certified their rapid mapping services based on experienced, well-trained, 24/7 on-call teams and established systems providing the first crisis analysis product within 6 hours of satellite data reception.
The three main product types provided are then outlined: up-to-date pre-event reference maps, disaster extent maps, and damage assessment or intensity/grading maps. With Google and open-sourced information the need for reference maps has diminished, but not disappeared, as damage extent and assessment products also require coherent reference geo-information, which often has to be produced internally. Increasingly, users need up-to-date, highly detailed, customised products; it is in damage assessment that an operator's working environment, geomatic skills and experience can often provide the highest levels of value-adding while adapting to user requests. Accordingly, DLR and SERTIT are involved in R&D work integrating data from, e.g., TerraSAR-X and Pléiades sources plus simulated Sentinel data, which have interesting emergency mapping capacities. Their close interaction with the research sector is essential to staying at the cutting edge of the field and implementing effective and efficient analysis methods. Future R&D challenges to further improve the quality of the damage mapping service will be highlighted. Finally, this presentation will show some practical examples and thus how, at present, space-based rapid mapping, with more than 10 years of experience behind it, is able to provide, if satellites are rapidly programmed and data acquired, geo-information linked to disaster extent and damage assessment from overview scales down to street level, with an ever-increasing array of satellite data sources.
Acknowledgments & Citation | USDA Plant Hardiness Zone Map
Mapping by PRISM, 2012. Agricultural Research Service, U.S. Department of Agriculture. Accessed from http
Contact USDA-ARS | USDA Plant Hardiness Zone Map
For questions about the USDA Plant Hardiness Zone Map, please contact the USDA Agricultural Research Service by sending an e-mail to phzm@ars.usda.gov
[Land use and land cover change (LUCC) and landscape service: Evaluation, mapping and modeling].
Song, Zhang-jian; Cao, Yu; Tan, Yong-zhong; Chen, Xiao-dong; Chen, Xian-peng
2015-05-01
Studies of ecosystem services from a landscape-scale perspective have received increasing attention from researchers all over the world. Compared with the ecosystem scale, the landscape scale is more suitable for exploring the influence of human activities on land use and land cover change (LUCC) and for interpreting the mechanisms and processes of sustainable landscape dynamics. Based on a comprehensive and systematic analysis of research on landscape services, this paper first discusses the basic concepts and classification of landscape services. Methods for the evaluation, mapping and modeling of landscape services are then analyzed and summarized. Finally, future directions for landscape service research are proposed: further exploring the connotation and classification system of landscape services; improving methods and quantitative indicators for their evaluation, mapping and modelling; carrying out long-term integrated research on landscape pattern-process-service-scale relationships; and enhancing the application of theories and methods from landscape economics and landscape ecology.
ERIC Educational Resources Information Center
Jacobsen, Mikael
2008-01-01
Librarians use online mapping services such as Google Maps, MapQuest, Yahoo Maps, and others to check traffic conditions, find local businesses, and provide directions. However, few libraries are using one of Google Maps most outstanding applications, My Maps, for the creation of enhanced and interactive multimedia maps. My Maps is a simple and…
Mapping the Early Intervention System in Ontario, Canada
ERIC Educational Resources Information Center
Underwood, Kathryn
2012-01-01
This study documents the wide range of early intervention services across the province of Ontario. The services are mapped across the province showing geographic information as well as the scope of services (clinical, family-based, resource support, etc.), the range of early intervention professionals, sources of funding and the populations served…
Mapping the Natchez Trace Parkway
Rangoonwala, Amina; Bannister, Terri; Ramsey, Elijah W.
2011-01-01
Based on a National Park Service (NPS) landcover classification, a landcover map of the 715-km (444-mile) NPS Natchez Trace Parkway (hereafter referred to as the "Parkway") was created. The NPS landcover classification followed National Vegetation Classification (NVC) protocols. The landcover map, which extended the initial landcover classification to the entire Parkway, was based on color-infrared photography converted to 1-m raster-based digital orthophoto quarter quadrangles, according to U.S. Geological Survey mapping standards. Our goal was to include as many alliance classes as possible in the Parkway landcover map. To reach this goal while maintaining a consistent and quantifiable map product throughout the Parkway extent, a mapping strategy was implemented based on the migration of class-based spectral textural signatures and the congruent progressive refinement of those class signatures along the Parkway. Progressive refinement provided consistent mapping by evaluating the spectral textural distinctiveness of the alliance-association classes, and where necessary, introducing new map classes along the Parkway. By following this mapping strategy, the use of raster-based image processing and geographic information system analyses for the map production provided a quantitative and reproducible product. Although field-site classification data were severely limited, the combination of spectral migration of class membership along the Parkway and the progressive classification strategy produced an organization of alliances that was internally highly consistent. The organization resulted from the natural patterns or alignments of spectral variance and the determination of those spectral patterns that were compositionally similar in the dominant species as NVC alliances. Overall, the mapped landcovers represented the existent spectral textural patterns that defined and encompassed the complex variety of compositional alliances and associations of the Parkway. 
Based on that mapped representation, forests dominate the Parkway landscape. Grass is the second largest Parkway land cover, followed by scrub-shrub and shrubland classes and pine plantations. The map provides a good representation of the landcover patterns and their changes over the extent of the Parkway, south to north.
Zhao, Chang; Sander, Heather A
2015-01-01
Studies that assess the distribution of benefits provided by ecosystem services across urban areas are increasingly common. Nevertheless, current knowledge of both the supply and demand sides of ecosystem services remains limited, leaving a gap in our understanding of the balance between ecosystem service supply and demand that restricts our ability to assess and manage these services. The present study seeks to fill this gap by developing and applying an integrated approach to quantifying the supply and demand of a key ecosystem service, carbon storage and sequestration, at the local level. This approach follows three basic steps: (1) quantifying and mapping service supply based upon Light Detection and Ranging (LiDAR) processing and allometric models, (2) quantifying and mapping demand for carbon sequestration using an indicator based on local anthropogenic CO2 emissions, and (3) mapping a supply-to-demand ratio. We illustrate this approach using a portion of the Twin Cities Metropolitan Area of Minnesota, USA. Our results indicate that 1735.69 million kg carbon are stored by urban trees in our study area. Annually, 33.43 million kg carbon are sequestered by trees, whereas 3087.60 million kg carbon are emitted by human sources. Thus, the carbon sequestration service provided by urban trees in the study location plays a minor role in combating climate change, offsetting approximately 1% of local anthropogenic carbon emissions per year, although avoided emissions via storage in trees are substantial. Our supply-to-demand ratio map provides insight into the balance between carbon sequestration supply in urban trees and demand for such sequestration at the local level, pinpointing critical locations where higher levels of supply and demand exist. Such a ratio map could help planners and policy makers to assess and manage the supply of and demand for carbon sequestration.
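The roughly 1% offset reported above follows directly from the abstract's own figures; a minimal sketch of the supply-to-demand ratio computation:

```python
def supply_demand_ratio(sequestration_kg_per_yr, emissions_kg_per_yr):
    """Ecosystem-service supply-to-demand ratio: annual carbon
    sequestration by urban trees divided by annual anthropogenic
    carbon emissions, for one mapping unit."""
    if emissions_kg_per_yr <= 0:
        raise ValueError("emissions must be positive")
    return sequestration_kg_per_yr / emissions_kg_per_yr

# Figures reported for the study area (kg carbon per year):
# 33.43 million kg sequestered vs. 3087.60 million kg emitted
ratio = supply_demand_ratio(33.43e6, 3087.60e6)  # about 0.011, i.e. ~1%
```

Computed per grid cell rather than for the whole study area, this same quotient yields the supply-to-demand ratio map the authors describe.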
cPath: open source software for collecting, storing, and querying biological pathways.
Cerami, Ethan G; Bader, Gary D; Gross, Benjamin E; Sander, Chris
2006-11-13
Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling.
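The built-in identifier mapping service described above can be sketched as a simple cross-reference lookup; the table and internal ids below are invented for illustration and are not cPath's actual schema:

```python
# Invented cross-reference table: each known (database, accession)
# pair points to one internal interactor record.
XREFS = {
    ("UNIPROT", "P04637"): "interactor-1",
    ("ENTREZ_GENE", "7157"): "interactor-1",
    ("UNIPROT", "P38398"): "interactor-2",
}

def map_identifier(database, accession):
    """Resolve an external identifier to an internal interactor id,
    so the same molecule imported from different source databases
    (e.g. via PSI-MI and BioPAX files) is stored only once."""
    return XREFS.get((database.upper(), accession))
```

Two imported records that name the same protein by different accessions then resolve to the same internal id and can be merged instead of duplicated.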
NASA Astrophysics Data System (ADS)
LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi
2016-11-01
This paper addresses the problems of traditional single-purpose systems for interpretation and draughting, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architecture and a low level of integration. On the basis of a comprehensive analysis of target-element composition, map representation and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale and multi-resolution geospatial objects is established based on HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also supports the collection, transfer, storage, refreshing and maintenance of geospatial-object data, showing clear prospects and potential for growth.
GeneWiz browser: An Interactive Tool for Visualizing Sequenced Chromosomes.
Hallin, Peter F; Stærfeldt, Hans-Henrik; Rotenberg, Eva; Binnewies, Tim T; Benham, Craig J; Ussery, David W
2009-09-25
We present an interactive web application for visualizing genomic data of prokaryotic chromosomes. The tool (GeneWiz browser) allows users to carry out various analyses such as mapping alignments of homologous genes to other genomes, mapping of short sequencing reads to a reference chromosome, and calculating DNA properties such as curvature or stacking energy along the chromosome. The GeneWiz browser produces an interactive graphic that enables zooming from a global scale down to single nucleotides, without changing the size of the plot. Its ability to zoom disproportionally provides optimal readability and increased functionality compared to other browsers. The tool allows the user to select the display of various genomic features, color settings and data ranges. Custom numerical data can be added to the plot, allowing, for example, visualization of gene expression and regulation data. Further, standard atlases are pre-generated for all prokaryotic genomes available in GenBank, providing a fast overview of all available genomes, including recently deposited genome sequences. The tool is available online from http://www.cbs.dtu.dk/services/gwBrowser. Supplemental material including interactive atlases is available online at http://www.cbs.dtu.dk/services/gwBrowser/suppl/.
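The disproportional zoom described above amounts to mapping a shrinking coordinate window onto a fixed-width plot; a minimal sketch of that coordinate transform (not the tool's actual code):

```python
def bp_to_pixel(position, window_start, window_end, plot_width_px):
    """Map a chromosome coordinate to a pixel column for the current
    zoom window. The plot width in pixels stays fixed; zooming only
    narrows the (window_start, window_end) interval."""
    if not window_start <= position <= window_end:
        raise ValueError("position outside visible window")
    span = window_end - window_start
    return round((position - window_start) / span * (plot_width_px - 1))

# Same 800-px plot, whole-chromosome view vs. a 200-kb zoom window:
whole = bp_to_pixel(2_500_000, 0, 5_000_000, 800)
zoomed = bp_to_pixel(2_500_000, 2_400_000, 2_600_000, 800)
```

Because only the window changes, nucleotide-level detail becomes resolvable at high zoom while the rendered plot keeps its size.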
NASA Astrophysics Data System (ADS)
Sagarminaga, Y.; Galparsoro, I.; Reig, R.; Sánchez, J. A.
2012-04-01
Since 2000, an intense effort has been under way in AZTI's Marine Research Division to set up a data management system that could gather all the marine datasets being produced by different in-house research projects. To that end, a corporate GIS was designed that included a data and metadata repository, a database, a layer catalog & search application and an internet map viewer. Several layers, mostly dealing with physical, chemical and biological in-situ sampling, and basic and thematic cartography including bathymetry, geomorphology, habitat maps for different species, and maps of human pressures and activities, were successfully gathered in this system. Very soon, it was realised that new marine technologies yielding continuous multidimensional data, sometimes called FES (Fluid Earth System) data, were difficult to handle in this structure. The data affected mainly included numerical oceanographic and meteorological models, remote sensing data, coastal RADAR data, and some in-situ observational systems such as CTD casts and moored or lagrangian buoys. A management system for gridded multidimensional data was developed using standardized formats (netCDF using CF conventions) and tools such as the THREDDS catalog (UNIDATA/UCAR), providing web services such as OPeNDAP, NCSS and WCS, as well as the ncWMS service developed by the Reading e-Science Centre. At present, a system (ITSASGIS-5D) is being developed, based on OGC standards and open-source tools, to allow interoperability between all the data types mentioned above. On the server side, this system includes postgresql/postgis databases and geoserver for GIS layers, and THREDDS/OPeNDAP and ncWMS services for FES gridded data. Moreover, an on-line client is being developed to allow joint access, user configuration, data visualisation & query, and data distribution. This client uses the mapfish, ExtJS - GeoEXT and openlayers libraries.
Through this presentation, the elements of the first released version of this system will be described and shown, together with new topics to be developed in future versions, including, among others, the integration of GeoNetwork libraries and tools for both FES and GIS metadata management, and the use of the new OGC Sensor Observation Service (SOS) to integrate non-gridded multidimensional data such as time series, depth profiles or trajectories provided by different observational systems. The final aim of this approach is to contribute to the multidisciplinary access and use of marine data for management and research activities, and to facilitate the implementation of integrated ecosystem-based approaches in the fields of fisheries advice and management, marine spatial planning, and the implementation of European policies such as the Water Framework Directive, the Marine Strategy Framework Directive and the Habitats Directive.
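Subsetting a gridded FES dataset through a THREDDS NetCDF Subset Service (NCSS) reduces to composing a request URL; a hedged sketch, with a hypothetical server, dataset path and variable name:

```python
from urllib.parse import urlencode

def ncss_subset_url(base, dataset_path, var, north, south, east, west,
                    time_start, time_end, accept="netcdf"):
    """Compose a THREDDS NetCDF Subset Service (NCSS) request for a
    spatial and temporal subset of a gridded dataset. The dataset
    path and variable name supplied by callers are placeholders."""
    params = {
        "var": var,
        "north": north, "south": south, "east": east, "west": west,
        "time_start": time_start, "time_end": time_end,
        "accept": accept,
    }
    return f"{base}/ncss/{dataset_path}?{urlencode(params)}"

# Hypothetical server and dataset, for illustration only
url = ncss_subset_url("https://example.org/thredds", "sst/analysis.nc",
                      "sea_surface_temperature",
                      44.0, 43.0, -1.0, -2.5,
                      "2012-01-01T00:00:00Z", "2012-01-31T00:00:00Z")
```

The same gridded data could equally be reached via OPeNDAP for programmatic slicing or ncWMS for map imagery; NCSS is convenient when the user simply wants a small netCDF file back.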
The status of soil mapping for the Idaho National Engineering Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, G.L.; Lee, R.D.; Jeppesen, D.J.
This report discusses the production of a revised version of the general soil map of the 2304-km² (890-mi²) Idaho National Engineering Laboratory (INEL) site in southeastern Idaho and the production of a geographic information system (GIS) soil map and supporting database. The revised general soil map replaces an INEL soil map produced in 1978 and incorporates the most current information on INEL soils. The general soil map delineates large soil associations based on Natural Resources Conservation Service [formerly the Soil Conservation Service (SCS)] principles of soil mapping. The GIS map incorporates detailed information that could not be presented on the general soil map and is linked to a database that contains the soil map unit descriptions, surficial geology codes, and other pertinent information.
Mapping Civic Engagement: A Case Study of Service-Learning in Appalachia
ERIC Educational Resources Information Center
Mann, Jessica; Casebeer, Daniel
2016-01-01
This study uses social cartography to map student perceptions of a co-curricular service-learning project in an impoverished rural community. As a complement to narrative discourse, mapping provides an opportunity to visualize not only the spatial nature of the educational experience but also, in this case, the benefits of civic engagement. The…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-13
... DEPARTMENT OF AGRICULTURE Forest Service Boundary Description and Final Map for Roaring Wild and... availability. SUMMARY: In accordance with section 3(b) of the Wild and Scenic Rivers Act, the USDA Forest Service, Washington Office, is transmitting the final boundary description and map of the Roaring Wild and...
NASA Astrophysics Data System (ADS)
Sabeur, Zoheir; Middleton, Stuart; Veres, Galina; Zlatev, Zlatko; Salvo, Nicola
2010-05-01
The advancement of smart sensor technology in the last few years has led to an increase in the deployment of affordable sensors for monitoring the environment around Europe. This is generating large amounts of sensor observation information and inevitably leading to problems of how to manage large volumes of data, as well as how to make sense of the data for decision-making. In addition, the various European Directives (Water Framework Directive, Bathing Water Directive, Habitats Directive, etc.) which regulate human activities in the environment, together with the INSPIRE Directive on spatial information management, have implicitly led the designated environment agencies and authorities of the European Member States to put in place new sensor monitoring infrastructure and to share information about environmental regions under their statutory responsibilities. They will need to work across borders and collectively reach environmental quality standards. They will also need to regularly report to the EC on the quality of the environments for which they are responsible and make such information accessible to members of the public. In recent years, early pioneering work on the design of service-oriented architectures using sensor networks has been achieved. Information web-service infrastructures using existing data catalogues and web-GIS map services can now be enriched with the deployment of new sensor observation, data fusion and modelling services using OGC standards. The deployment of the new services, which describe sensor observations and intelligent data processing using data fusion techniques, can now provide added-value information, with spatio-temporal uncertainties, to the next generation of decision support service systems. The new decision support service systems have become key to implement across Europe in order to comply with EU environmental regulations and INSPIRE.
In this paper, data fusion services using OGC standards with sensor observation data streams are described in the context of a geo-distributed service infrastructure specialising in multiple environmental risk management and decision support. The sensor data fusion services are deployed and validated in two use cases, concerned respectively with: 1) microbial risk forecasts in bathing waters; and 2) geohazards in urban zones during underground tunnelling activities. This research was initiated in the SANY Integrated Project (www.sany-ip.org) and funded by the European Commission under the 6th Framework Programme.
EUMIS - an open portal framework for interoperable marine environmental services
NASA Astrophysics Data System (ADS)
Hamre, T.; Sandven, S.; Leadbetter, A.; Gouriou, V.; Dunne, D.; Grant, M.; Treguer, M.; Torget, Ø.
2012-04-01
NETMAR (Open service network for marine environmental data) is an FP7 project that aims to develop a pilot European Marine Information System (EUMIS) for searching, downloading and integrating satellite, in situ and model data from ocean and coastal areas. EUMIS will use a semantic framework coupled with ontologies for identifying and accessing distributed data, such as near-real time, forecast and historical data. Four pilots have been defined to clarify the needs for satellite, in situ and model based products and services in selected user communities. The pilots are: · Pilot 1: Arctic Sea Ice Monitoring and Forecasting · Pilot 2: Oil spill drift forecast and shoreline cleanup assessment services in France · Pilot 3: Ocean colour - Marine Ecosystem, Research and Monitoring · Pilot 4: International Coastal Atlas Network (ICAN) for coastal zone management NETMAR is developing a set of data delivery services for the targeted user communities by means of standard web-GIS and OPeNDAP protocols. Processing services and adaptive service chaining services will also be developed, to enable users to generate new products suited to their needs. Both data retrieved from online repositories and products generated dynamically can be accessed and visualised in the EUMIS portal. For this purpose, a GIS Viewer, a Service Chaining Editor and an Ontology Browser/Discovery Client have been developed and integrated in EUMIS. The EUMIS portal is developed using a portal framework that is compliant with the JSR-168 (Java Portlet Specification 1.0) and JSR-286 (Java Portlet Specification 2.0) standards. These standards define the interface (contract) and lifecycle management for a portal system component, a portlet, which can be implemented in a number of programming languages, not only Java. The GIS Viewer is developed using a combination of Java, JavaScript and JSF (e.g. MapFaces).
The Service Chaining Editor is implemented in JavaScript (using libraries such as jQuery and WireIt), and the Ontology Browser/Discovery Client by means of Adobe Flex. In addition to the portlets developed in the project, we have also used several of the pre-built portlets that come with the Liferay Community Edition portal framework, notably the wiki, forum and RSS feed portlets. The presentation will focus on the developed system components and show some examples of products and services from the defined pilots.
Agricultural Census 2012: Publishing Mashable GIS Big Data Services
NASA Astrophysics Data System (ADS)
Mueller, R.
2014-12-01
The 2012 Agricultural Census was released by the US Department of Agriculture (USDA) on May 2, 2014; the census is published on a quinquennial basis and covers all facets of American production agriculture. The Agricultural Census is a comprehensive source of uniform published agricultural data for every state and county in the US. This is the first Agricultural Census to be disseminated with web mapping services using REST APIs. USDA developed an open, mashable GIS web portal that presents over 250 maps on Crops and Plants, Economics, Farms, Livestock and Animals, and Operators. These mapping services, written in JavaScript, replace the traditional static maps published as the Ag Atlas. Web users can now visualize, interact with, query, and download the Agricultural Census data in ways not previously possible. Stakeholders will now be able to leverage this data for activities such as community planning, agribusiness location suitability analytics, availability of loans/funds, service center locations and staffing, and farm programs and policies. Additional sites serving compatible mashable USDA Big Data web services are as follows: The Food Environment Atlas, The Atlas of Rural and Small-Town America, The Farm Program Atlas, SNAP Data System, CropScape, and VegScape. All portals use a similar data organization scheme of "Categories" and "Maps", providing interactive mashable web services for agricultural stakeholders to exploit.
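The "Categories" and "Maps" organization scheme mentioned above can be sketched as a simple grouping; all category and map names below are invented for illustration:

```python
from collections import defaultdict

# Invented examples following the portals' "Categories" / "Maps" scheme
MAPS = [
    {"category": "Crops and Plants", "map": "Corn, Harvested Acres"},
    {"category": "Crops and Plants", "map": "Soybeans, Harvested Acres"},
    {"category": "Livestock and Animals", "map": "Cattle Inventory"},
]

def maps_by_category(maps):
    """Group map titles under their category, mirroring how the
    portals organize their 250+ maps for browsing."""
    grouped = defaultdict(list)
    for m in maps:
        grouped[m["category"]].append(m["map"])
    return dict(grouped)
```

A shared scheme like this is what makes the portals' REST services mutually "mashable": a client that can browse one portal's category tree can browse them all.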
A Collaborative Decision Environment for UAV Operations
NASA Technical Reports Server (NTRS)
D'Ortenzio, Matthew V.; Enomoto, Francis Y.; Johan, Sandra L.
2005-01-01
NASA is developing Intelligent Mission Management (IMM) technology for science missions employing long-endurance unmanned aerial vehicles (UAVs). The IMM ground-based component is the Collaborative Decision Environment (CDE), a ground system that provides the Mission/Science team with situational awareness, collaboration, and decision-making tools. The CDE is used for pre-flight planning, mission monitoring, and visualization of acquired data. It integrates external data products used for planning and executing a mission, such as weather, satellite data products, and topographic maps, by leveraging established and emerging Open Geospatial Consortium (OGC) standards to acquire external data products via the Internet, and an industry-standard geographic information system (GIS) toolkit for visualization. As a Science/Mission team may be geographically dispersed, the CDE is capable of providing access to remote users across wide area networks using Web Services technology. A prototype CDE is being developed for an instrument checkout flight on a manned aircraft in the fall of 2005, in preparation for a full deployment in support of the US Forest Service and NASA Ames Western States Fire Mission in 2006.
EnviroAtlas - Austin, TX - Demographics by Block Group Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). The dataset is a summary of key demographic groups for the EnviroAtlas community and was produced by the US EPA. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Research into Australian emergency services personnel mental health and wellbeing: An evidence map.
Varker, Tracey; Metcalf, Olivia; Forbes, David; Chisolm, Katherine; Harvey, Sam; Van Hooff, Miranda; McFarlane, Alexander; Bryant, Richard; Phelps, Andrea J
2018-02-01
Evidence maps are a method of systematically characterising the range of research activity in broad topic areas and are a tool for guiding research priorities. 'Evidence-mapping' methodology was used to quantify the nature and distribution of recent peer-reviewed research into the mental health and wellbeing of Australian emergency services personnel. A search of the PsycINFO, EMBASE and Cochrane Library databases was performed for primary research articles that were published between January 2011 and July 2016. In all, 43 studies of primary research were identified and mapped. The majority of the research focused on organisational and individual/social factors and how they relate to mental health problems/wellbeing. There were several areas of research where very few studies were detected through the mapping process, including suicide, personality, stigma and pre-employment factors that may contribute to mental health outcomes and the use of e-health. No studies were detected which examined the prevalence of self-harm and/or harm to others, bullying, alcohol/substance use, barriers to care or experience of families of emergency services personnel. In addition, there was no comprehensive national study that had investigated all sectors of emergency services personnel. This evidence map highlights the need for future research to address the current gaps in mental health and wellbeing research among Australian emergency services personnel. Improved understanding of the mental health and wellbeing of emergency services personnel, and the factors that contribute, should guide organisations' wellbeing policies and procedures.
EnviroAtlas -Durham, NC- One Meter Resolution Urban Area Land Cover Map (2010) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Durham, NC land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four-band (red, green, blue and near-infrared) aerial photography from July 2010 at 1 m spatial resolution. Five land cover classes were mapped: impervious surface, soil and barren, grass and herbaceous, trees and forest, and water. An accuracy assessment using a stratified random sample of 500 points yielded an overall accuracy of 83 percent, using a minimum mapping unit of 9 pixels (a 3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Durham and includes the cities of Durham, Chapel Hill, Carrboro and Hillsborough, NC. This dataset was produced by the US EPA. EnviroAtlas allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
NASA Astrophysics Data System (ADS)
Parker, Jay; Donnellan, Andrea; Glasscoe, Margaret; Fox, Geoffrey; Wang, Jun; Pierce, Marlon; Ma, Yu
2015-08-01
High-resolution maps of Earth surface deformation are available in public archives for scientific interpretation, but primarily as bulky internet downloads. The NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) archive of airborne radar interferograms delivers very high resolution images (approximately seven-meter pixels), making remote handling of the files all the more pressing, and data exploration requiring data selection and exploratory analysis has been tedious. QuakeSim has implemented an archive of UAVSAR data in a web service and browser system based on GeoServer (http://geoserver.org). This supports a variety of services that supply consistent maps, raster image data and geographic information system (GIS) objects, including standard earthquake faults. Browsing the database is supported by initially displaying GIS-referenced thumbnail images of the radar displacement maps; access is also provided to image metadata and links for full file downloads. One of the most widely used features is the QuakeSim line-of-sight profile tool, which calculates the radar-observed displacement (from an unwrapped interferogram product) along a line specified through a web browser. Displacement values along a profile are updated in a plot on the screen as the user interactively redefines the endpoints of the line and the sampling density. The profile, as well as a plot of the ground height, is available as a CSV (text) file for further examination, without any need to download the full radar file. Additional tools allow the user to select a polygon overlapping the radar displacement image, specify a downsampling rate, and extract a modest-sized grid of observations for display or for inversion, for example with the QuakeSim simplex inversion tool, which estimates a consistent fault geometry and slip model.
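The line-of-sight profile tool described above amounts to sampling a displacement raster along a user-drawn line. A minimal illustrative sketch (not QuakeSim's actual code), using bilinear interpolation on a toy grid:

```python
def los_profile(grid, p0, p1, n=50):
    """Sample a gridded displacement map along a line from p0 to p1.

    grid: 2D list indexed [row][col]; p0, p1: (row, col) endpoints in
    fractional pixel coordinates. Returns n bilinearly interpolated
    values, loosely mimicking a line-of-sight profile tool.
    """
    def bilinear(r, c):
        r0, c0 = int(r), int(c)
        r1 = min(r0 + 1, len(grid) - 1)
        c1 = min(c0 + 1, len(grid[0]) - 1)
        fr, fc = r - r0, c - c0
        top = grid[r0][c0] * (1 - fc) + grid[r0][c1] * fc
        bot = grid[r1][c0] * (1 - fc) + grid[r1][c1] * fc
        return top * (1 - fr) + bot * fr

    out = []
    for i in range(n):
        t = i / (n - 1)
        out.append(bilinear(p0[0] + t * (p1[0] - p0[0]),
                            p0[1] + t * (p1[1] - p0[1])))
    return out

# A tiny synthetic "interferogram": displacement increases with column.
demo = [[float(c) for c in range(4)] for _ in range(4)]
profile = los_profile(demo, (0.0, 0.0), (0.0, 3.0), n=4)
```

In the real tool the endpoints come from the browser and the raster stays on the server, so only the small profile (here, four numbers) crosses the network.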
A Map Mash-Up Application: Investigating the Temporal Effects of Climate Change on the Salt Lake Basin
NASA Astrophysics Data System (ADS)
Kirtiloglu, O. S.; Orhan, O.; Ekercin, S.
2016-06-01
The main purpose of this paper is to investigate climate change effects that have occurred since the beginning of the twenty-first century in the Konya Closed Basin (KCB), located in the semi-arid central Anatolian region of Turkey, and particularly in the Salt Lake region, where many major wetlands are situated, and to share the analysis results online in a Web Geographical Information System (GIS) environment. In this work, 71 Landsat 5-TM, 7-ETM+ and 8-OLI images and meteorological data from 10 stations were used. Of the Landsat images, 56 were used to extract the Salt Lake surface area from multi-temporal imagery collected from 2000 to 2014, and 15 were used to produce thematic maps of the Normalised Difference Vegetation Index (NDVI) in the KCB; the data from the 10 meteorological stations were used to generate the Standardized Precipitation Index (SPI), which is widely used in drought studies. To visualize and share the results, a Web GIS environment was established using Google Maps and Fusion Tables, Google's free data storage and manipulation service. The web application is built on HTML5, CSS3, JavaScript, the Google Maps API v3 and the Google Fusion Tables API. These technologies make it possible to create effective "map mash-ups" that embed a Google Map in a web page, store spatial or tabular data in Fusion Tables, and add those data as a map layer on the embedded map. The analysis process and the map mash-up application are discussed in detail in the main sections of this paper.
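The NDVI used for the thematic maps is a simple band ratio of near-infrared and red reflectance. A small sketch (illustrative only, with made-up reflectance values rather than actual Landsat data):

```python
def ndvi(red, nir):
    """Normalised Difference Vegetation Index, (NIR - red) / (NIR + red),
    for paired red/NIR reflectance values (e.g. Landsat bands), guarding
    against division by zero."""
    out = []
    for r, n in zip(red, nir):
        s = n + r
        out.append((n - r) / s if s != 0 else 0.0)
    return out

# Dense vegetation reflects strongly in NIR, so NDVI approaches +1;
# bare soil sits near 0 and water is typically negative.
values = ndvi([0.1, 0.4, 0.3], [0.5, 0.4, 0.1])
```

In practice the ratio is computed per pixel across the whole scene in a GIS or raster library, then classified into the thematic map.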
A comprehensive map of the influenza A virus replication cycle
2013-01-01
Background: Influenza is a common infectious disease caused by influenza viruses. Annual epidemics cause severe illness, deaths and economic loss around the world. To better defend against influenza viral infection, it is essential to understand its mechanisms and the associated host responses. Many studies have been conducted to elucidate these mechanisms; however, the overall picture remains incompletely understood. A systematic understanding of influenza viral infection in host cells is needed to facilitate the identification of influential host response mechanisms and potential drug targets. Description: We constructed a comprehensive map of the influenza A virus ('IAV') life cycle ('FluMap') by undertaking a literature-based, manual curation approach. Based on information obtained from publicly available pathway databases, updated with literature-based information and input from expert virologists and immunologists, FluMap is currently composed of 960 factors (proteins, mRNAs, etc.) and 456 reactions, and is annotated with ~500 papers and curation comments. In addition to detailing the types of molecular interactions, isolate/strain-specific data are also available. The FluMap was built with the pathway editor CellDesigner in standard SBML (Systems Biology Markup Language) format and visualized as an SBGN (Systems Biology Graphical Notation) diagram. It is also available as a web service (online map) based on the iPathways+ system to enable community discussion by influenza researchers. We also demonstrate computational network analyses to identify targets using the FluMap. Conclusion: The FluMap is a comprehensive pathway map that can serve as a graphically presented knowledge base and as a platform to analyze functional interactions between IAV and host factors. Publicly available webtools will allow continuous updating to ensure the most reliable representation of the host-virus interaction network.
The FluMap is available at http://www.influenza-x.org/flumap/. PMID:24088197
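The network analyses mentioned for target identification can be as simple as ranking nodes by connectivity. A toy sketch on a hypothetical interaction list (the edges below are placeholders, not actual FluMap content):

```python
from collections import Counter

def rank_by_degree(edges):
    """Rank network nodes by degree (number of interactions), a simple
    proxy for the kind of network analysis used to shortlist targets."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return [node for node, _ in deg.most_common()]

# Toy virus-host interactions (hypothetical, for illustration only).
toy = [("NP", "importin"), ("NP", "PB2"), ("PB2", "importin"),
       ("M2", "PB2")]
ranking = rank_by_degree(toy)
```

Real analyses on a curated map would use richer measures (betweenness, pathway enrichment) over the full SBML model rather than raw degree.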
GIM-TEC adaptive ionospheric weather assessment and forecast system
NASA Astrophysics Data System (ADS)
Gulyaeva, T. L.; Arikan, F.; Hernandez-Pajares, M.; Stanislawska, I.
2013-09-01
The Ionospheric Weather Assessment and Forecast (IWAF) system is a computer software package designed to assess and predict the worldwide representation of 3-D electron density profiles from the Global Ionospheric Maps of Total Electron Content (GIM-TEC). The system's unique products include daily-hourly numerical global maps of the F2 layer critical frequency (foF2) and peak height (hmF2), generated with the International Reference Ionosphere extended to the plasmasphere (IRI-Plas), upgraded by importing the daily-hourly GIM-TEC as a new model driving parameter. Since GIM-TEC maps are provided with 1- or 2-day latency, global forecast maps for 1 and 2 days ahead are derived using a harmonic analysis applied to the temporal changes of TEC, foF2 and hmF2 at the 5112 grid points of a map encapsulated in IONEX format (-87.5°:2.5°:87.5°N in latitude, -180°:5°:180°E in longitude). The system provides online ionospheric disturbance warnings in a global W-index map, establishing categories of ionospheric weather from the quiet state (W=±1) to intense storm (W=±4) according to thresholds set for instantaneous TEC perturbations relative to the quiet reference median for the preceding 7 days. The accuracy of IWAF system predictions of TEC, foF2 and hmF2 maps is superior to the standard persistence model, whose prediction equals the most recent 'true' map. The paper presents outcomes of the new service, expressed in global ionospheric foF2, hmF2 and W-index maps demonstrating the origin and propagation of positive and negative ionospheric disturbances in space and time, and their forecast under different scenarios.
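The per-grid-point harmonic forecast can be illustrated with a toy one-harmonic Fourier fit to hourly values and extrapolation ahead. This is a simplified stand-in under stated assumptions (evenly spaced hourly samples, a single diurnal harmonic); the operational IWAF scheme is more elaborate:

```python
import math

def harmonic_forecast(samples, horizon):
    """Fit mean + one 24-hour harmonic to hourly data via discrete
    Fourier coefficients, then extrapolate 'horizon' hours ahead.
    Assumes len(samples) is a multiple of 24."""
    n = len(samples)
    w = 2 * math.pi / 24.0   # diurnal angular frequency
    a0 = sum(samples) / n
    a1 = 2 / n * sum(s * math.cos(w * t) for t, s in enumerate(samples))
    b1 = 2 / n * sum(s * math.sin(w * t) for t, s in enumerate(samples))
    return [a0 + a1 * math.cos(w * (n + h)) + b1 * math.sin(w * (n + h))
            for h in range(horizon)]

# A purely diurnal signal is reproduced exactly one day ahead.
day = [10 + 3 * math.cos(2 * math.pi * t / 24) for t in range(48)]
pred = harmonic_forecast(day, 24)
```

Applied independently at each of the 5112 IONEX grid points, such a fit yields a full forecast map; a persistence baseline would instead just repeat the last observed map.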
McErlane, Flora; Foster, Helen E; Armitt, Gillian; Bailey, Kathryn; Cobb, Joanna; Davidson, Joyce E; Douglas, Sharon; Fell, Andrew; Friswell, Mark; Pilkington, Clarissa; Strike, Helen; Smith, Nicola; Thomson, Wendy; Cleary, Gavin
2018-01-01
Objective: Timely access to holistic multidisciplinary care is the core principle underpinning management of juvenile idiopathic arthritis (JIA). Data collected in national clinical audit programmes fundamentally aim to improve health outcomes of disease, ensuring clinical care is equitable, safe and patient-centred. The aim of this study was to develop a tool for national audit of JIA in the UK. Methods: A staged and consultative methodology was used across a broad group of relevant stakeholders to develop a national audit tool, with reference to pre-existing standards of care for JIA. The tool comprises key service delivery quality measures assessed against two aspects of impact, namely disease-related outcome measures and patient/carer reported outcome and experience measures. Results: Eleven service-related quality measures were identified, including those that map to current standards for commissioning of JIA clinical services in the UK. The three-variable Juvenile Arthritis Disease Activity Score and presence/absence of sacro-iliitis in patients with enthesitis-related arthritis were identified as the primary disease-related outcome measures, with presence/absence of uveitis a secondary outcome. Novel patient/carer reported outcomes and patient/carer reported experience measures were developed and face validity confirmed by relevant patient/carer groups. Conclusion: A tool for national audit of JIA has been developed with the aim of benchmarking current clinical practice and setting future standards and targets for improvement. Staged implementation of this national audit tool should facilitate investigation of variability in levels of care and drive quality improvement. This will require engagement from patients and carers, clinical teams and commissioners of JIA services. PMID:29069424
Pre-Service Teacher Beliefs on the Antecedents to Bullying: A Concept Mapping Study
ERIC Educational Resources Information Center
Lopata, Joel A.; Nowicki, Elizabeth A.
2014-01-01
In this study, researchers gathered Canadian pre-service teachers' beliefs on the antecedents to bullying. Concept mapping (Kane & Trochim, 2007) was used to analyze the data. This study's findings identified pre-service teachers to have accurate beliefs, inaccurate beliefs, and a lack of knowledge about the antecedents to bullying. Concept…
CPC - Monitoring & Data: Pacific Island Climate Data
Geospatial Web Services in Real Estate Information System
NASA Astrophysics Data System (ADS)
Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana
2017-12-01
Since cadastral records are of great importance for the economic development of a country, the data must be well structured and organized. Real estate records for the territory of Serbia faced many problems in previous years. To prevent such problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on current legislation and on the Land Administration Domain Model (LADM) specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of the services that must be provided by the institutions interested in exchanging cadastral data. This is achieved by introducing a service-oriented architecture into the real estate cadastre information system, which ensures the efficiency of the system. User services for downloading, reviewing and using real estate data through the web must be developed and provided, through e-government, to all users who need access to cadastral data (natural and legal persons as well as state institutions). It is also necessary to provide search, view and download of cadastral spatial data through geospatial services. Considering that real estate includes geometric data for parcels and buildings, a set of geospatial services should be established to provide information and maps for the analysis of spatial data and for producing raster data. Besides the theme Cadastral Parcels, the INSPIRE directive specifies several themes involving data on buildings and land use for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined.
A case study is also described in which these services, together with Web Processing Service (WPS) spatial analysis, are used to estimate which households are at risk of flooding.
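At its core, the flood-risk case study is a spatial containment test per household: is this parcel's location inside the flood-zone polygon? A minimal sketch (hypothetical coordinates, not actual cadastral or flood data) using the ray-casting test:

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test: cast a ray from the point and
    count edge crossings; an odd count means the point is inside. This is
    the kind of primitive a WPS flood-risk analysis builds on."""
    x, y = pt
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

# Hypothetical flood-zone polygon and two household locations.
zone = [(0, 0), (4, 0), (4, 3), (0, 3)]
at_risk = point_in_polygon((1, 1), zone)   # inside the zone
safe = point_in_polygon((5, 1), zone)      # outside the zone
```

A real WPS would expose this as an Execute operation taking the household layer and flood polygon as inputs and returning the intersecting features.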
Statistical density modification using local pattern matching
Terwilliger, Thomas C.
2007-01-23
A computer-implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided, and standard templates of electron density are created from these maps by clustering and averaging values of electron density in a spherical region about each point in the grid that defines each map. Histograms are also created from the selected maps; these relate the value of electron density at the center of each spherical region to a correlation coefficient between the density surrounding each corresponding grid point and each of the standard templates. The standard templates and the histograms are applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point.
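The template-matching step relies on a correlation coefficient between a local density region and each standard template. A sketch of that score on toy 1-D density values (illustrative only, not the patented implementation, which works on 3-D spherical regions):

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length density samples,
    e.g. a local map region and a standard template."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

# A region that is a scaled and shifted copy of the template
# correlates perfectly, which is why correlation (not raw difference)
# is a robust match score for density on an arbitrary scale.
template = [0.1, 0.4, 0.9, 0.4, 0.1]
region = [2 * t + 1 for t in template]
score = correlation(region, template)
```

In the method above, these scores feed the histograms that map central density values to expected template agreement.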
Mapping mental health service access: achieving equity through quality improvement.
Green, Stuart A; Poots, Alan J; Marcano-Belisario, Jose; Samarasundera, Edgar; Green, John; Honeybourne, Emmi; Barnes, Ruth
2013-06-01
Improving Access to Psychological Therapies (IAPT) services deliver evidence-based care to people with depression and anxiety. A quality improvement (QI) initiative was undertaken by an IAPT service to improve referrals, providing an opportunity to evaluate equitable access. QI methodologies were used by the clinical team to improve referrals to the service. The collection of geo-coded data allowed referrals to be mapped to small geographical areas according to deprivation. A total of 6078 patients were referred to the IAPT service during the period of analysis and mapped to 120 unique lower super output areas (LSOAs). The average weekly referral rate rose from 17 during the baseline phase to 43 during the QI implementation phase. Spatial analysis demonstrated that all 15 of the high-deprivation/low-referral LSOAs were converted to high-deprivation/high- or medium-referral LSOAs following the QI initiative. This work highlights the importance of QI in developing clinical services aligned to the needs of the population through the analysis of routine data matched to health needs. Mapping can be utilized to communicate complex information to inform the planning and organization of clinical service delivery and to evaluate the progress and sustainability of QI initiatives.
Ecosystem services provided by a complex coastal region: challenges of classification and mapping.
Sousa, Lisa P; Sousa, Ana I; Alves, Fátima L; Lillebø, Ana I
2016-03-11
A variety of ecosystem services classification systems and mapping approaches are available in the scientific and technical literature, and these need to be selected and adapted when applied to complex territories (e.g. at the interface between water and land, or estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that serve as the basis for mapping the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness of CICES, but also the challenges of applying it at regional scale; and the challenges of ecosystem services mapping.
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) specification to enable the sharing of transportation data at the federal level (data from BTS/DOT), the state level (through VDOT), and in industry (through Intergraph). CEOSR develops WFS solutions using Intergraph software; relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. The GeoMedia WebMap WFS toolkit is used, with software and technical support from Intergraph, and the ESRI ArcIMS WFS connector is used under GMU's campus license of ESRI products. Tested solutions are integrated with operational framework data service systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data.
The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR, registered with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that returns data conforming to the drafted ANSI/INCITS L1 standard (when available) for each identified theme, in the format given by OGC Geography Markup Language (GML) version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient for sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
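The WFS "POST" requests in item 1 carry an XML GetFeature payload. A minimal sketch of building one (the type name below is a placeholder, not one of the project's actual framework themes):

```python
import xml.etree.ElementTree as ET

WFS_NS = "http://www.opengis.net/wfs"

def getfeature_request(type_name, max_features=10):
    """Build a minimal WFS 1.1.0 GetFeature request body, the kind of
    XML payload POSTed to a framework data service, which answers
    with GML features."""
    root = ET.Element("{%s}GetFeature" % WFS_NS,
                      {"service": "WFS", "version": "1.1.0",
                       "maxFeatures": str(max_features)})
    query = ET.SubElement(root, "{%s}Query" % WFS_NS)
    query.set("typeName", type_name)
    return ET.tostring(root, encoding="unicode")

# Hypothetical feature type, for illustration only.
body = getfeature_request("transport:Roads", max_features=50)
```

A production request would usually add an ogc:Filter element (e.g. a bounding-box constraint) inside the Query so only features of interest are streamed back.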
Honer, William G; Cervantes-Larios, Alejandro; Jones, Andrea A; Vila-Rodriguez, Fidel; Montaner, Julio S; Tran, Howard; Nham, Jimmy; Panenka, William J; Lang, Donna J; Thornton, Allen E; Vertinsky, Talia; Barr, Alasdair M; Procyshyn, Ric M; Smith, Geoffrey N; Buchanan, Tari; Krajden, Mel; Krausz, Michael; MacEwan, G William; Gicas, Kristina M; Leonova, Olga; Langheimer, Verena; Rauscher, Alexander; Schultz, Krista
2017-07-01
The Hotel Study was initiated in Vancouver's Downtown East Side (DTES) neighborhood to investigate multimorbidity in homeless or marginally housed people. We evaluated the clinical effectiveness of existing, illness-specific treatment strategies and assessed the effectiveness of health care delivery for multimorbid illnesses. For context, we mapped the housing locations of patients presenting for 552,062 visits to the catchment hospital emergency department (2005-2013). Aggregate data on 22,519 apprehensions of mentally ill people were provided by the Vancouver Police Department (2009-2015). The primary strategy was a longitudinal cohort study of 375 people living in the DTES (2008-2015). We analysed mortality and evaluated the clinical and health service delivery effectiveness for infection with human immunodeficiency virus or hepatitis C virus, opioid dependence, and psychosis. Mapping confirmed the association between poverty and greater number of emergency visits related to substance use and mental illness. The annual change in police apprehensions did not differ between the DTES and other policing districts. During 1581 person-years of cohort observation, the standardized mortality ratio was 8.43 (95% confidence interval, 6.19 to 11.50). Physician visits were common (84.3% of participants over 6 months). Clinical treatment effectiveness was highest for HIV/AIDS, intermediate for opioid dependence, and lowest for psychosis. Health service delivery mechanisms provided examples of poor access, poor treatment adherence, and little effect on multimorbid illnesses. Clinical effectiveness was variable, and illness-specific service delivery appeared to have little effect on multimorbidity. New models of care may need to be implemented.
A Reference Implementation of the OGC CSW EO Standard for the ESA HMA-T project
NASA Astrophysics Data System (ADS)
Bigagli, Lorenzo; Boldrini, Enrico; Papeschi, Fabrizio; Vitale, Fabrizio
2010-05-01
This work was developed in the context of the ESA Heterogeneous Missions Accessibility (HMA) project, whose main objective is to involve the stakeholders, namely national space agencies and satellite or mission owners and operators, in a harmonization and standardization process for their ground segment services and related interfaces. Among the HMA objectives were the specification, conformance testing and experimentation of two Extension Packages (EPs) of the ebRIM Application Profile (AP) of the OGC Catalogue Service for the Web (CSW) specification: the Earth Observation Products (EO) EP (OGC 06-131) and the Cataloguing of ISO Metadata (CIM) EP (OGC 07-038). Our contributions have included the development and deployment of Reference Implementations (RIs) for both of the above specifications, and their integration with the ESA Service Support Environment (SSE). The RIs are based on the GI-cat framework, an implementation of a distributed catalog service able to query disparate Earth and space science data sources (e.g. OGC Web Services, Unidata THREDDS) and to expose several standard interfaces for data discovery (e.g. OGC CSW ISO AP). Following our initial planning, the GI-cat framework was extended to expose the CSW.ebRIM-CIM and CSW.ebRIM-EO interfaces and to distribute queries to CSW.ebRIM-CIM and CSW.ebRIM-EO data sources. We expected that a mapping strategy would suffice for accommodating CIM, but this proved impractical during implementation. Hence, a model extension strategy was eventually implemented for both the CIM and EO EPs, and the GI-cat federal model was enhanced to support the underlying ebRIM AP. This work has provided us with new insights into the different data models for geospatial data and the technologies for their implementation. The extension is used by suitable CIM and EO profilers (front-end mediator components) and accessors (back-end mediator components) that relate ISO 19115 concepts to EO and CIM ones.
Moreover, a mapping to the GI-cat federated model was developed for each EP (quite limited for EO; complete for CIM) in order to enable the discovery of resources through any of the GI-cat profilers. The query manager was also improved. GI-cat-EO and -CIM installation packages were made available for distribution, and two RI instances were deployed on the Amazon EC2 facility (plus an ad-hoc instance returning incorrect data, used as a control). Integration activities of the EO RI with the ESA SSE Portal for Earth Observation Products were also successfully carried out. During our work, we contributed feedback and comments to the CIM and EO EP specification working groups. Our contributions resulted in version 0.2.5 of the EO EP, recently approved as an OGC standard, and were useful in consolidating version 0.1.11 of the CIM EP (still being developed).
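As a minimal sketch of the kind of request a CSW 2.0.2 catalog such as GI-cat answers, the following builds a GetRecords query in KVP (GET) encoding. The endpoint URL, constraint text, and helper name are illustrative assumptions, not taken from the HMA deployments.

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, type_names, constraint=None, max_records=10):
    """Build an OGC CSW 2.0.2 GetRecords request in KVP (GET) encoding."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": type_names,
        "resultType": "results",
        "maxRecords": str(max_records),
    }
    if constraint:
        # CQL text is one of the constraint languages CSW servers accept.
        params["constraintLanguage"] = "CQL_TEXT"
        params["constraint"] = constraint
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint; any CSW 2.0.2 service accepts this shape of request.
url = csw_getrecords_url(
    "http://example.org/gi-cat/services/cswiso",
    "csw:Record",
    constraint="AnyText LIKE '%Earth Observation%'",
)
```

POST/XML encoding carries the same parameters in a `csw:GetRecords` document; the KVP form is shown here only because it is compact.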
Semantics-informed cartography: the case of Piemonte Geological Map
NASA Astrophysics Data System (ADS)
Piana, Fabrizio; Lombardo, Vincenzo; Mimmo, Dario; Giardino, Marco; Fubelli, Giandomenico
2016-04-01
In modern digital geological maps, namely those supported by a large geo-database and devoted to dynamic, interactive representation on WMS/WebGIS services, the geological assumptions used for the design and compilation of the map database need to be made explicit, and semantic representations and taxonomies need to be defined or adopted, in order to achieve a formal and interoperable representation of geologic knowledge. These approaches are fundamental for the integration and harmonisation of geological information and services across cultural barriers (e.g. different scientific disciplines) and/or physical ones (e.g. administrative boundaries). Initiatives such as the GeoScience Markup Language (latest version GeoSciML 4.0, 2015, http://www.geosciml.org) and the INSPIRE "Data Specification on Geology" (an operative simplification of GeoSciML; latest version 3.0 rc3, 2013, http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_GE_v3.0rc3.pdf), as well as the recent terminological shepherding of the Geoscience Terminology Working Group (GTWG), have been promoting the exchange of geologic knowledge. Grounded on these standard vocabularies, schemas and data models, we provide a shared semantic classification of geological data for the study case of the synthetic digital geological map of the Piemonte region (NW Italy), named "GEOPiemonteMap", developed by the CNR Institute of Geosciences and Earth Resources, Torino (CNR IGG TO) and hosted as a dynamic interactive map on the geoportal of the ARPA Piemonte Environmental Agency. The Piemonte Geological Map is grounded on a regional-scale geo-database consisting of some hundreds of GeologicUnits, whose thousands of instances (Mapped Features, polygon geometry) occur widely across the Piemonte region, each bounded by GeologicStructures (Mapped Features, line geometry).
GeologicUnits and GeologicStructures have been spatially correlated across the whole region and described using the GeoSciML vocabularies. A hierarchical schema is provided for the Piemonte Geological Map that gives the parent-child relations among several orders of GeologicUnits, referring to the most recurrent geological objects and main GeologicEvents, in a logical framework compliant with the GeoSciML and INSPIRE data models. The classification criteria and the hierarchy schema used to define the GEOPiemonteMap legend, as well as the intended meanings of the geological concepts used to achieve the overall classification schema, are explicitly described in several WikiGeo pages (implemented with the "MediaWiki" open-source software, https://www.mediawiki.org/wiki/MediaWiki). Moreover, a further step toward a formal classification of the contents (both data and interpretation) of the GEOPiemonteMap was taken by setting up an ontological framework, named "OntoGeonous", in order to achieve a thorough semantic characterization of the map.
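The parent-child hierarchy of GeologicUnits described above can be sketched as a simple parent map with an ancestor query. The unit names below are invented placeholders loosely inspired by Western Alps nomenclature, not the actual GEOPiemonteMap legend.

```python
# Hypothetical parent relations between orders of GeologicUnits
# (child -> parent), standing in for the GEOPiemonteMap hierarchy schema.
PARENT = {
    "Micaschist unit": "Dora-Maira polymetamorphic complex",
    "Dora-Maira polymetamorphic complex": "Penninic domain",
    "Penninic domain": "Alpine orogen",
}

def ancestors(unit):
    """Return the chain of higher-order units a GeologicUnit belongs to."""
    chain = []
    while unit in PARENT:
        unit = PARENT[unit]
        chain.append(unit)
    return chain

chain = ancestors("Micaschist unit")
# ['Dora-Maira polymetamorphic complex', 'Penninic domain', 'Alpine orogen']
```

A real implementation would carry these relations as typed links in the geo-database rather than a flat dictionary, but the traversal logic is the same.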
NASA Astrophysics Data System (ADS)
Benedict, K. K.; Scott, S.
2013-12-01
While there has been a convergence towards a limited number of standards for representing knowledge (metadata) about geospatial (and other) data objects and collections, there exist a variety of community conventions around the specific use of those standards and within specific data discovery and access systems. This combination of limited (but multiple) standards and conventions creates a challenge for system developers that aspire to participate in multiple data infrastructures, each of which may use a different combination of standards and conventions. While Extensible Markup Language (XML) is a shared standard for encoding most metadata, traditional direct XML transformations (XSLT) from one standard to another often result in an imperfect transfer of information due to incomplete mapping from one standard's content model to another. This paper presents the work at the University of New Mexico's Earth Data Analysis Center (EDAC) in which a unified data and metadata management system has been developed in support of the storage, discovery and access of heterogeneous data products. This system, the Geographic Storage, Transformation and Retrieval Engine (GSTORE) platform, has adopted a polyglot database model in which a combination of relational and document-based databases is used to store both data and metadata, with some metadata stored in a custom XML schema designed as a superset of the requirements for multiple target metadata standards: ISO 19115-2/19139/19110/19119, FGDC CSDGM (both with and without remote sensing extensions) and Dublin Core.
Metadata stored within this schema is complemented by additional service, format and publisher information that is dynamically "injected" into produced metadata documents when they are requested from the system.
While mapping from the underlying common metadata schema is relatively straightforward, the generation of valid metadata within each target standard is necessary but not sufficient for integration into multiple data infrastructures, as has been demonstrated through EDAC's testing and deployment of metadata into multiple external systems: Data.Gov, the GEOSS Registry, the DataONE network, the DSpace based institutional repository at UNM and semantic mediation systems developed as part of the NASA ACCESS ELSeWEB project. Each of these systems requires valid metadata as a first step, but to make most effective use of the delivered metadata each also has a set of conventions that are specific to the system. This presentation will provide an overview of the underlying metadata management model, the processes and web services that have been developed to automatically generate metadata in a variety of standard formats and highlight some of the specific modifications made to the output metadata content to support the different conventions used by the multiple metadata integration endpoints.
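The "superset schema plus per-standard projection" idea above can be sketched with a simple crosswalk: one internal record, mapped on request onto each target standard's element names. All field and element names here are illustrative assumptions, not GSTORE's actual schema.

```python
# Hypothetical internal superset record (not GSTORE's real schema).
INTERNAL = {
    "title": "Landsat mosaic of New Mexico",
    "abstract": "Cloud-free composite of Landsat scenes.",
    "contact_org": "EDAC",
    "bbox": (-109.05, 31.33, -103.00, 37.00),
}

# Crosswalks from internal field names to target-standard element names.
CROSSWALKS = {
    "dublin_core": {"title": "dc:title", "abstract": "dc:description",
                    "contact_org": "dc:publisher"},
    "iso19115": {"title": "gmd:title", "abstract": "gmd:abstract",
                 "contact_org": "gmd:organisationName"},
}

def to_standard(record, standard):
    """Project the internal superset record onto one target standard,
    silently dropping fields the standard's crosswalk does not cover."""
    mapping = CROSSWALKS[standard]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

dc = to_standard(INTERNAL, "dublin_core")
```

The one-way projection is the easy half; as the abstract notes, producing *valid* documents per standard also requires the dynamically injected service, format and publisher information.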
Australia's TERN: Advancing Ecosystem Data Management in Australia
NASA Astrophysics Data System (ADS)
Phinn, S. R.; Christensen, R.; Guru, S.
2013-12-01
Globally, there is a consistent movement towards more open, collaborative and transparent science, where the publication and citation of data is considered standard practice. Australia's Terrestrial Ecosystem Research Network (TERN) is a national research infrastructure investment designed to support the ecosystem science community through all stages of the data lifecycle. TERN has developed and implemented a comprehensive network of 'hard' and 'soft' infrastructure that enables Australia's ecosystem scientists to collect, publish, store, share, discover and re-use data in ways not previously possible. The aim of this poster is to demonstrate how TERN has successfully delivered infrastructure that is enabling a significant cultural and practical shift in Australia's ecosystem science community towards consistent approaches for data collection, metadata, data licensing, and data publishing. TERN enables multiple disciplines within the ecosystem sciences to collect, store and publish their data more effectively and efficiently. A critical part of TERN's approach has been to build on existing data collection activities, networks and skilled people, enabling further coordination and collaboration to build each data collection facility and coordinate data publishing. Data collection in TERN is through discipline-based facilities, covering long-term collection of: (1) systematic plot-based measurements of vegetation structure, composition and faunal biodiversity; (2) instrumented towers making systematic measurements of solar, water and gas fluxes; and (3) satellite and airborne maps of biophysical properties of vegetation, soils and the atmosphere. Several other facilities collect and integrate environmental data to produce national products for fauna and vegetation surveys, soils and coastal data, as well as integrated or synthesised products for modelling applications.
Data management, publishing and sharing in TERN are implemented through a tailored data licensing framework suitable for ecosystem data, national standards for metadata, a DOI-minting service, and context-appropriate data repositories and portals. The TERN data infrastructure is based on a loosely coupled 'network of networks'. The data formats used across the TERN facilities range from NetCDF to comma-separated values and descriptive documents. Metadata standards include ISO 19115, Ecological Metadata Language and rich, semantically enabled contextual information. Data services include Web Map Service, Web Feature Service, OPeNDAP, file servers and KNB Metacat. These approaches enable each data collection facility to maintain its discipline-based data collection and storage protocols. TERN facility metadata are harvested regularly by the central TERN Data Discovery Portal and converted to a national standard format. This approach enables centralised discovery, access and re-use of data simply and effectively, while maintaining disciplinary diversity. Effort is still required to support the cultural shift towards acceptance of effective data management, publication, sharing and re-use as standard practice. To this end, TERN's future activities will be directed to supporting this transformation, undertaking 'education' to enable ecosystem scientists to take full advantage of TERN's infrastructure, and providing training and guidance for best-practice data management.
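The harvest-and-normalise step described above (facility records in different metadata dialects converted to one portal format) can be sketched as below. The dialect-detection rule and field names are invented for illustration; TERN's actual mappings are richer.

```python
# Sketch of normalising harvested facility records, each in its own
# metadata dialect, into a single portal record format (names assumed).
def normalise(record):
    """Map an EML-style or ISO 19115-style record to the portal format."""
    if "eml_title" in record:            # Ecological Metadata Language style
        return {"title": record["eml_title"],
                "licence": record.get("licence")}
    if "gmd_title" in record:            # ISO 19115 style
        return {"title": record["gmd_title"],
                "licence": record.get("useLimitation")}
    raise ValueError("unknown metadata dialect")

harvested = [
    {"eml_title": "Plot-based vegetation survey", "licence": "CC-BY 4.0"},
    {"gmd_title": "OzFlux tower fluxes", "useLimitation": "CC-BY 4.0"},
]
portal = [normalise(r) for r in harvested]
```

Keeping the dialect logic in the harvester, as here, is what lets each facility retain its discipline-based protocols while the portal stays uniform.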
River Basin Standards Interoperability Pilot
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Masó, Joan; Stasch, Christoph
2016-04-01
There is a lot of water information and many tools in Europe applicable to river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them use the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water, and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, consists of the following steps: - Extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice). - Flood modelling using a WPS 2.0, with WaterML 2.0 data and weather forecast models as input. - Evaluation of the applicability of Sensor Notification Services in water emergencies. - Open distribution of the input and output data as OGC web services (WaterML / WCS / WFS) and with visualization utilities (WMS).
The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data are translated into the OGC WaterML 2.0 time-series format and ingested into an SOS 2.0 service. SOS data are visualized in an SOS client able to handle time series. The meteorological forecast data, together with the WaterML 2.0 time series and terrain data (under the supervision of an operator manipulating the WPS user interface), are input to a flood-modelling algorithm. The WPS produces flood datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and can be downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that is monitored by an emergency control response service. Acronyms: AS, Alert Service; ES, Event Service; ICT, Information and Communication Technology; NS, Notification Service; OGC, Open Geospatial Consortium; RIBASE, River Basin Standards Interoperability Pilot; SOS, Sensor Observation Service; WaterML, Water Markup Language; WCS, Web Coverage Service; WMS, Web Map Service; WPS, Web Processing Service.
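A first concrete step in the pipeline above is parsing WaterML 2.0 time series returned by an SOS. The fragment below is a stripped-down, illustrative WaterML-style document (a real SOS response wraps the series in an observation envelope), parsed with the Python standard library.

```python
import xml.etree.ElementTree as ET

# Stripped-down WaterML 2.0-style time-series fragment (illustrative only;
# values and timestamps are invented).
XML = """<wml2:MeasurementTimeseries xmlns:wml2="http://www.opengis.net/waterml/2.0">
  <wml2:point><wml2:MeasurementTVP>
    <wml2:time>2015-11-01T00:00:00Z</wml2:time><wml2:value>1.32</wml2:value>
  </wml2:MeasurementTVP></wml2:point>
  <wml2:point><wml2:MeasurementTVP>
    <wml2:time>2015-11-01T01:00:00Z</wml2:time><wml2:value>1.47</wml2:value>
  </wml2:MeasurementTVP></wml2:point>
</wml2:MeasurementTimeseries>"""

NS = {"wml2": "http://www.opengis.net/waterml/2.0"}
root = ET.fromstring(XML)
# Collect (time, value) pairs from each time-value pair (TVP) element.
series = [
    (tvp.find("wml2:time", NS).text, float(tvp.find("wml2:value", NS).text))
    for tvp in root.iter("{http://www.opengis.net/waterml/2.0}MeasurementTVP")
]
```

A series in this form is what would feed the flood-modelling WPS alongside the terrain and forecast inputs.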
Preliminary Validation of the AFWA-NASA Blended Snowcover Product Over the Lower Great Lakes region
NASA Technical Reports Server (NTRS)
Hall, D. K.; Montesano, P. M.; Foster, J. L.; Riggs, G. A.; Kelly, R. E. J.; Czajkowski, K.
2007-01-01
A new snow product created using the standard Moderate-Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer for EOS (AMSR-E) snow cover and snow-water equivalent products has been evaluated for the Lower Great Lakes region during the winter of 2002-03. National Weather Service Co-Operative Observing Network stations and student-acquired snow data were used as ground truth. An interpolation scheme was used to map snow cover on the ground from the station measurements for each day of the study period. It is concluded that this technique does not represent the actual ground conditions adequately to permit evaluation of the new snow product in an absolute sense. However, use of the new product was found to improve the mapping of snow cover as compared to using either the MODIS or AMSR-E product alone. Plans for further analysis are discussed.
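The abstract does not say which interpolation scheme was used to map snow from station measurements; inverse-distance weighting is one common choice, sketched below with invented station values purely to show the mechanics.

```python
import math

def idw(stations, x, y, power=2):
    """Inverse-distance-weighted estimate at (x, y) from (sx, sy, value)
    station triples. One common interpolation choice; the study's actual
    scheme is not specified in the abstract."""
    num = den = 0.0
    for sx, sy, value in stations:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return value            # exactly at a station: use its value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical station snow depths (cm) on an arbitrary local grid.
obs = [(0.0, 0.0, 10.0), (10.0, 0.0, 30.0)]
est = idw(obs, 5.0, 0.0)   # midpoint: equal weights, so the mean, 20.0
```

The abstract's conclusion, that station interpolation under-represents true ground conditions, is exactly the kind of limitation such distance-based schemes have in lake-effect snow regions with sharp spatial gradients.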
Modular avionics packaging standardization
NASA Astrophysics Data System (ADS)
Austin, M.; McNichols, J. K.
The Modular Avionics Packaging (MAP) Program, which aims to package future military avionics systems so as to improve reliability, maintainability, and supportability and to reduce equipment life-cycle costs, is addressed. The basic MAP packaging concepts, the Standard Avionics Module, the Standard Enclosure, and the Integrated Rack, are summarized, and the benefits of modular avionics packaging, including low-risk design, technology independence with common functions, improved maintainability, and reduced life-cycle costs, are discussed. Progress made in MAP is briefly reviewed.
NCI's Distributed Geospatial Data Server
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes infeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door to new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed.
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
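A client consumes such a server through standard requests; as a minimal sketch, the following builds a WMS 1.3.0 GetMap URL. The endpoint and layer name are hypothetical; note that WMS 1.3.0 with EPSG:4326 uses lat,lon axis order in BBOX.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layers, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL. For EPSG:4326 in WMS 1.3.0
    the BBOX axis order is lat,lon: (miny, minx, maxy, maxx)."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layers, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer; bbox roughly covers Australia (lat,lon order).
url = wms_getmap_url("http://example.org/wms", "ndvi",
                     (-44.0, 112.0, -10.0, 154.0), 800, 600)
```

The same on-demand product could equally be fetched as raw arrays via WCS; WMS is shown because the rendered-image case is the simplest to demonstrate.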
Miles, Alistair; Zhao, Jun; Klyne, Graham; White-Cooper, Helen; Shotton, David
2010-10-01
Integrating heterogeneous data across distributed sources is a major requirement for in silico bioinformatics supporting translational research. For example, genome-scale data on patterns of gene expression in the fruit fly Drosophila melanogaster are widely used in functional genomic studies in many organisms to inform candidate gene selection and validate experimental results. However, current data integration solutions tend to be heavyweight, and require significant initial and ongoing investment of effort. Development of a common Web-based data integration infrastructure (a.k.a. data web), using Semantic Web standards, promises to alleviate these difficulties, but little is known about the feasibility, costs, risks or practical means of migrating to such an infrastructure. We describe the development of OpenFlyData, a proof-of-concept system integrating gene expression data on D. melanogaster, combining Semantic Web standards with lightweight approaches to Web programming based on Web 2.0 design patterns. To support researchers designing and validating functional genomic studies, OpenFlyData includes user-facing search applications providing intuitive access to and comparison of gene expression data from FlyAtlas, the BDGP in situ database, and FlyTED, using data from FlyBase to expand and disambiguate gene names. OpenFlyData's services are also openly accessible, and are available for reuse by other bioinformaticians and application developers. Semi-automated methods and tools were developed to support labour- and knowledge-intensive tasks involved in deploying SPARQL services. These include methods for generating ontologies and relational-to-RDF mappings for relational databases, which we illustrate using the FlyBase Chado database schema; and methods for mapping gene identifiers between databases. The advantages of using Semantic Web standards for biomedical data integration are discussed, as are open issues.
In particular, although the performance of open source SPARQL implementations is sufficient to query gene expression data directly from user-facing applications such as Web-based data fusions (a.k.a. mashups), we found open SPARQL endpoints to be vulnerable to denial-of-service-type problems, which must be mitigated to ensure reliability of services based on this standard. These results are relevant to data integration activities in translational bioinformatics. The gene expression search applications and SPARQL endpoints developed for OpenFlyData are deployed at http://openflydata.org. FlyUI, a library of JavaScript widgets providing re-usable user-interface components for Drosophila gene expression data, is available at http://flyui.googlecode.com. Software and ontologies to support transformation of data from FlyBase, FlyAtlas, BDGP and FlyTED to RDF are available at http://openflydata.googlecode.com. SPARQLite, an implementation of the SPARQL protocol, is available at http://sparqlite.googlecode.com. All software is provided under the GPL version 3 open source license.
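At its core, a SPARQL basic graph pattern is triple matching with variables. The toy in-memory matcher below illustrates the kind of question OpenFlyData's endpoints answer; the URIs, prefixes, and expression facts are invented for the sketch, not taken from FlyBase or FlyAtlas.

```python
# Toy triple store; terms are invented stand-ins for RDF URIs.
TRIPLES = [
    ("flybase:FBgn0000042", "rdfs:label", "Act5C"),
    ("flybase:FBgn0000042", "ex:expressedIn", "ex:testis"),
    ("flybase:FBgn0003996", "rdfs:label", "w"),
]

def query(subject=None, predicate=None, obj=None):
    """Match triples against a pattern; None plays the role of a SPARQL
    variable (matches anything)."""
    return [t for t in TRIPLES
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "Which genes are expressed in which tissues?" -> ?gene ex:expressedIn ?tissue
hits = query(predicate="ex:expressedIn")
```

A production endpoint adds joins across patterns, an optimizer, and result limits; the last point matters because, as noted above, unbounded queries are one route to the denial-of-service problems observed with open endpoints.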
National Park Service Vegetation Inventory Program, Cuyahoga Valley National Park, Ohio
Hop, Kevin D.; Drake, J.; Strassman, Andrew C.; Hoy, Erin E.; Menard, Shannon; Jakusz, J.W.; Dieck, J.J.
2013-01-01
The National Park Service (NPS) Vegetation Inventory Program (VIP) is an effort to classify, describe, and map existing vegetation of national park units for the NPS Natural Resource Inventory and Monitoring (I&M) Program. The NPS VIP is managed by the NPS Biological Resources Management Division and provides baseline vegetation information to the NPS Natural Resource I&M Program. The U.S. Geological Survey (USGS) Vegetation Characterization Program plays a cooperative role in the NPS VIP. The USGS Upper Midwest Environmental Sciences Center, NatureServe, and NPS Cuyahoga Valley National Park (CUVA) have completed vegetation classification and mapping of CUVA. Mappers, ecologists, and botanists collaborated to identify and describe vegetation types within the National Vegetation Classification Standard (NVCS) and to determine how best to map them by using aerial imagery. The team collected data from 221 vegetation plots within CUVA to develop detailed descriptions of vegetation types. Data from 50 verification sites were also collected to test both the key to vegetation types and the application of vegetation types to a sample set of map polygons. Furthermore, data from 647 accuracy assessment (AA) sites were collected (of which 643 were used to test accuracy of the vegetation map layer). These data sets led to the identification of 45 vegetation types at the association level in the NVCS at CUVA. A total of 44 map classes were developed to map the vegetation and general land cover of CUVA, including the following: 29 map classes represent natural/semi-natural vegetation types in the NVCS, 12 map classes represent cultural vegetation (agricultural and developed) in the NVCS, and 3 map classes represent non-vegetation features (open-water bodies).
Features were interpreted from viewing color-infrared digital aerial imagery dated October 2010 (during peak leaf-phenology change of trees) via digital onscreen three-dimensional stereoscopic workflow systems in geographic information systems (GIS). The interpreted data were digitally and spatially referenced, thus making the spatial database layers usable in GIS. Polygon units were mapped to either a 0.5 ha or 0.25 ha minimum mapping unit, depending on vegetation type. A geodatabase containing various feature-class layers and tables shows the locations of vegetation types and general land cover (vegetation map), vegetation plot samples, verification sites, AA sites, project boundary extent, and aerial photographic centers. The feature-class layer and relate tables for the CUVA vegetation map provide 4,640 polygons of detailed attribute data covering 13,288.4 ha, with an average polygon size of 2.9 ha. Summary reports generated from the vegetation map layer show map classes representing natural/semi-natural types in the NVCS apply to 4,151 polygons (89.4% of polygons) and cover 11,225.0 ha (84.5%) of the map extent. Of these polygons, the map layer shows CUVA to be 74.4% forest (9,888.8 ha), 2.5% shrubland (329.7 ha), and 7.6% herbaceous vegetation cover (1,006.5 ha). Map classes representing cultural types in the NVCS apply to 435 polygons (9.4% of polygons) and cover 1,825.7 ha (13.7%) of the map extent. Map classes representing non-NVCS units (open water) apply to 54 polygons (1.2% of polygons) and cover 237.7 ha (1.8%) of the map extent. A thematic AA study was conducted of map classes representing natural/semi-natural types in the NVCS. Results present an overall accuracy of 80.7% (kappa index of 79.5%) based on data from 643 of the 647 AA sites.
Most individual map-class themes exceed the NPS VIP standard of 80% with a 90% confidence interval. The CUVA vegetation mapping project delivers many geospatial and vegetation data products in hardcopy and/or digital formats. These products consist of an in-depth project report discussing methods and results, which includes descriptions of and a dichotomous key to vegetation types, map classification and map-class descriptions, and a contingency table showing AA results. The suite of products also includes a database of vegetation plots, verification sites, and AA sites; digital pictures of field sites; field data sheets; aerial photographic imagery; hardcopy and digital maps; and a geodatabase of vegetation types and land cover (map layer), fieldwork locations (vegetation plots, verification sites, and AA sites), aerial photographic index, project boundary, and metadata. All geospatial products are projected in Universal Transverse Mercator, Zone 17, using the North American Datum of 1983. Information on the NPS VIP and completed park mapping projects is available on the Internet at
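The overall accuracy and kappa statistics reported above come from a confusion matrix of mapped versus reference classes at the AA sites. The sketch below computes both from a toy 2-class matrix (not the CUVA data) to show the arithmetic.

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = mapped class, columns = reference class)."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(k)) / n
    # Chance agreement: product of row and column marginals per class.
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion) for i in range(k)
    ) / (n * n)
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Toy 2-class matrix, invented for illustration.
acc, kappa = accuracy_and_kappa([[40, 10], [5, 45]])
# acc = 0.85, kappa = 0.7
```

Kappa discounts chance agreement, which is why the study reports it (79.5%) alongside the raw 80.7% overall accuracy.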
US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access
NASA Astrophysics Data System (ADS)
Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.
2012-04-01
The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Data services are currently being deployed for the US Department of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
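Harvesting metadata from OGC capabilities documents, as mentioned above, amounts to lifting layer names and titles out of the service's XML. The fragment below is a minimal, invented WMS 1.3.0 capabilities excerpt parsed with the Python standard library; real documents carry many more elements per layer.

```python
import xml.etree.ElementTree as ET

# Minimal WMS 1.3.0 capabilities fragment (layer names invented).
CAPS = """<WMS_Capabilities xmlns="http://www.opengis.net/wms" version="1.3.0">
  <Capability><Layer>
    <Layer><Name>geothermal_wells</Name><Title>Geothermal wells</Title></Layer>
    <Layer><Name>heat_flow</Name><Title>Heat flow grid</Title></Layer>
  </Layer></Capability>
</WMS_Capabilities>"""

NS = {"wms": "http://www.opengis.net/wms"}
root = ET.fromstring(CAPS)
# Harvest each nested layer into a minimal catalog record.
records = [
    {"name": layer.findtext("wms:Name", namespaces=NS),
     "title": layer.findtext("wms:Title", namespaces=NS)}
    for layer in root.findall(".//wms:Layer/wms:Layer", NS)
]
```

A production harvester would map these fields onward into ISO 19139 elements before registration in the catalog.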
Wolff, A P; Groen, G J; Crul, B J
2001-01-01
Selective spinal nerve infiltration blocks are used diagnostically in patients with chronic low back pain radiating into the leg. Generally, a segmental nerve block is considered successful if the pain is reduced substantially. Hypesthesia and elicited paresthesias coinciding with the presumed segmental level are used as controls. The interpretation depends on a standard dermatomal map. However, it is not clear if this interpretation is reliable enough, because standard dermatomal maps do not show the overlap of neighboring dermatomes. The goal of the present study is to establish if dissimilarities exist between areas of hypesthesia, spontaneous pain reported by the patient, pain reduction by local anesthetics, and paresthesias elicited by sensory electrostimulation. A secondary goal is to determine to what extent the interpretation is improved when the overlaps of neighboring dermatomes are taken into account. Patients suffering from chronic low back pain with pain radiating into the leg underwent lumbosacral segmental nerve root blocks at subsequent levels on separate days. Lidocaine (2%, 0.5 mL) mixed with radiopaque fluid (0.25 mL) was injected after verifying the target location using sensory and motor electrostimulation. Sensory changes (pinprick method), paresthesias (reported by the patient), and pain reduction (Numeric Rating Scale) were reported. Hypesthesia and paresthesias were registered in a standard dermatomal map and in an adapted map which included overlap of neighboring dermatomes. The relationships between spinal level of injection, extent of hypesthesia, location of paresthesias, and corresponding dermatome were assessed quantitatively. Comparison of the results between both dermatomal maps was done by paired t-tests. After inclusion, data were processed for 40 segmental nerve blocks (L2-S1) performed in 29 patients. Pain reduction was achieved in 43%. 
Hypesthetic areas showed a large variability in size and location, and also in comparison to paresthesias. Mean hypesthetic area amounted to 2.7 +/- 1.4 (+/- SD: range, 0 to 6; standard map) and 3.6 +/- 1.8 (0 to 6; adapted map; P <.001) dermatomes. In these cases, hypesthesia in the corresponding dermatome was found in 80% (standard map) and 88% of the cases (adapted map, not significant). Paresthesias occurring in the corresponding dermatome were found in 80% (standard map) compared with 98% (adapted map, P <.001). In 85% (standard map) and 88% (adapted map), spontaneous pain was present in the dermatome corresponding to the level of local anesthetic injection. In 55% (standard map) versus 75% (adapted map, P <.005), a combination of spontaneous pain, hypesthesia, and paresthesias was found in the corresponding dermatome. Hypesthetic areas determined after lumbosacral segmental nerve blocks show a large variability in size and location compared with elicited paresthesias. Confirmation of an adequately performed segmental nerve block, determined by coexistence of hypesthesia, elicited paresthesias and pain in the presumed dermatome, is more reliable when the overlap of neighboring dermatomes is taken into account.
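The standard-map versus adapted-map scoring difference can be sketched as a simple rule: the standard map counts only the exact level, while the adapted map also accepts immediate neighbours to model dermatomal overlap. This is a deliberately simplified model of the study's adapted map, for illustration only.

```python
# Lumbosacral levels in anatomical order.
LEVELS = ["L2", "L3", "L4", "L5", "S1"]

def corresponds(injected, observed, allow_overlap=False):
    """Does an observed finding 'correspond' to the injected level?
    Standard map: exact level only. Adapted map (simplified here):
    immediately neighbouring dermatomes also count, modelling overlap."""
    if injected == observed:
        return True
    if allow_overlap:
        i, j = LEVELS.index(injected), LEVELS.index(observed)
        return abs(i - j) == 1
    return False

# An L4 block with paresthesia reported in the L5 territory:
strict = corresponds("L4", "L5")                    # standard map: no match
adapted = corresponds("L4", "L5", allow_overlap=True)  # adapted map: match
```

Widening the acceptance region in this way is exactly why the adapted map raised the paresthesia-correspondence rate from 80% to 98% in the study.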
Mapping tsunami impacts on land cover and related ecosystem service supply in Phang Nga, Thailand
NASA Astrophysics Data System (ADS)
Kaiser, G.; Burkhard, B.; Römer, H.; Sangkaew, S.; Graterol, R.; Haitook, T.; Sterr, H.; Sakuna-Schwartz, D.
2013-12-01
The 2004 Indian Ocean tsunami caused damage to coastal ecosystems and thus affected the livelihoods of the coastal communities who depend on services provided by these ecosystems. The paper presents a case study on evaluating and mapping the spatial and temporal impacts of the tsunami on land use and land cover (LULC) and related ecosystem service supply in the Phang Nga province, Thailand. The method includes local stakeholder interviews, field investigations, remote-sensing techniques, and GIS. Results provide an ecosystem services matrix with capacity scores for 18 LULC classes and 17 ecosystem functions and services, as well as pre-/post-tsunami and recovery maps indicating changes in the ecosystem service supply capacities in the study area. Local stakeholder interviews revealed that mangroves, casuarina forest, mixed beach forest, coral reefs, tidal inlets, and wetlands (peat swamp forest) have the highest capacity to supply ecosystem services, while e.g. plantations have a lower capacity. The remote-sensing-based damage and recovery analysis showed a loss of ecosystem service supply capacities in almost all LULC classes for most of the services due to the tsunami. A fast recovery of LULC and related ecosystem service supply capacities within one year could be observed for e.g. beaches, while mangroves or casuarina forest needed several years to recover. By applying multi-temporal mapping, the spatial variations of recovery could be visualised. While some patches of coastal forest were fully recovered after 3 yr, other patches were still affected and thus had a reduced capacity to supply ecosystem services. The ecosystem services maps can be used to quantify ecological values and their spatial distribution in the framework of a tsunami risk assessment. Beyond that, they are considered a useful tool for spatial analysis in coastal risk management in Phang Nga.
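The ecosystem services matrix described above is a LULC-class-by-service table of capacity scores; pre-/post-tsunami change is then a per-cell difference. The toy matrix below (scores and classes invented; the study's matrix has 18 classes and 17 services) shows the computation.

```python
# Toy capacity matrices (scores are invented, not the study's values):
# LULC class -> ecosystem service -> capacity score.
PRE = {"mangrove": {"coastal protection": 5, "fisheries": 4},
       "beach":    {"recreation": 4, "coastal protection": 2}}
POST = {"mangrove": {"coastal protection": 3, "fisheries": 2},
        "beach":    {"recreation": 1, "coastal protection": 1}}

def capacity_change(pre, post):
    """Per-class, per-service change in supply capacity (post minus pre);
    negative values indicate tsunami-induced loss."""
    return {lulc: {svc: post[lulc][svc] - score
                   for svc, score in services.items()}
            for lulc, services in pre.items()}

delta = capacity_change(PRE, POST)
```

Joining such a delta table to the LULC map polygons in a GIS is what turns the matrix into the pre-/post-tsunami change maps the study presents.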
NASA Technical Reports Server (NTRS)
Wilson, C.; Dye, R.; Reed, L.
1982-01-01
The errors associated with planimetric mapping of the United States using satellite remote sensing techniques are analyzed. Assumptions concerning the state of the art achievable for satellite mapping systems and platforms in the 1995 time frame are made. An analysis of these performance parameters is made using an interactive cartographic satellite computer model, after first validating the model using LANDSAT 1 through 3 performance parameters. An investigation of current large scale (1:24,000) US National mapping techniques is made. Using the results of this investigation, and current national mapping accuracy standards, the 1995 satellite mapping system is evaluated for its ability to meet US mapping standards for planimetric and topographic mapping at scales of 1:24,000 and smaller.
ERIC Educational Resources Information Center
Aydin, Sevgi; Aydemir, Nurdane; Boz, Yezdan; Cetin-Dindar, Ayla; Bektas, Oktay
2009-01-01
The present study aimed to evaluate whether a chemistry laboratory course called "Laboratory Experiments in Science Education" based on constructivist instruction accompanied with concept mapping enhanced pre-service chemistry teachers' conceptual understanding. Data were collected from five pre-service chemistry teachers at a university…
Enhancing The National Map Through Tactical Planning and Performance Monitoring
2008-01-01
Tactical planning and performance monitoring are initial steps toward improving 'the way The National Map works' and supporting the U.S. Geological Survey (USGS) Science Strategy. This Tactical Performance Planning Summary for The National Map combines information from The National Map 2.0 Tactical Plan and The National Map Performance Milestone Matrix. The National Map 2.0 Tactical Plan is primarily a working document to guide The National Map program's execution, production, and metrics monitoring for fiscal years (FY) 2008 and 2009. The Tactical Plan addresses data, products, and services, as well as supporting and enabling activities. The National Map's 2-year goal for FY 2008 and FY 2009 is to provide a range of geospatial products and services that further the National Spatial Data Infrastructure and underpin USGS science. To do this, the National Geospatial Program will develop a renewed understanding during FY 2008 of key customer needs and requirements, develop the infrastructure to support The National Map business model, modernize its business processes, and reengineer its workforce. Priorities for The National Map will be adjusted if necessary to respond to changes to the project that may impact resources, constrain timeframes, or change customer needs. The supporting and enabling activities that make it possible to produce the products and services of The National Map will include partnership activities, improved compatibility of systems, outreach, and integration of data themes.
Development of a 14-digit Hydrologic Unit Code Numbering System for South Carolina
Bower, David E.; Lowry, Claude; Lowery, Mark A.; Hurley, Noel M.
1999-01-01
A Hydrologic Unit Map showing the cataloging units, watersheds, and subwatersheds of South Carolina has been developed by the U.S. Geological Survey in cooperation with the South Carolina Department of Health and Environmental Control, funded through a U.S. Environmental Protection Agency 319 Grant, and the U.S. Department of Agriculture, Natural Resources Conservation Service. These delineations represent 8-, 11-, and 14-digit Hydrologic Unit Codes, respectively. This map presents information on drainage, hydrography, and hydrologic boundaries of the water-resources regions, subregions, accounting units, cataloging units, watersheds, and subwatersheds. The source maps for the basin delineations are 1:24,000-scale 7.5-minute series topographic maps and the base maps are from 1:100,000-scale Digital Line Graphs; however, the data are published at a scale of 1:500,000. In addition, an electronic version of the data is provided on a compact disc. Of the 1,022 subwatersheds delineated for this project, 1,004 range in size from 3,000 to 40,000 acres (4.69 to 62.5 square miles). Seventeen subwatersheds are smaller than 3,000 acres and one subwatershed, located on St. Helena Island, is larger than 40,000 acres. This map and its associated codes provide a standardized base for use by water-resource managers and planners in locating, storing, retrieving, and exchanging hydrologic data. In addition, the map can be used for cataloging water-data acquisition activities, geographically organizing hydrologic data, and planning and describing water-use and related land-use activities.
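The nested structure of hydrologic unit codes described above can be illustrated with a small parser. A minimal sketch, assuming the standard digit hierarchy (2-digit region through 14-digit subwatershed); the function and field names are illustrative, not part of the USGS product:

```python
def parse_huc14(huc: str) -> dict:
    """Split a 14-digit Hydrologic Unit Code into its hierarchical levels.

    Each level is a prefix of the full code: region (2 digits),
    subregion (4), accounting unit (6), cataloging unit (8),
    watershed (11), and subwatershed (14).
    """
    if len(huc) != 14 or not huc.isdigit():
        raise ValueError("expected a 14-digit numeric HUC")
    return {
        "region": huc[:2],            # water-resources region
        "subregion": huc[:4],
        "accounting_unit": huc[:6],
        "cataloging_unit": huc[:8],   # the 8-digit level on the map
        "watershed": huc[:11],        # the 11-digit level
        "subwatershed": huc,          # the full 14-digit level
    }

# Hypothetical example code, for illustration only:
parts = parse_huc14("03050103020104")
```

Because each level is a prefix of the next, the same function also locates any coarser unit that contains a given subwatershed.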
Baldovin, T; Zangrando, D; Casale, P; Ferrarese, F; Bertoncello, C; Buja, A; Marcolongo, A; Baldo, V
2015-08-05
Geographic Information Systems (GIS) have become an innovative and somewhat crucial tool for analyzing relationships between public health data and environment. This study, though focusing on a Local Health Unit of northeastern Italy, could be taken as a benchmark for developing a standardized national data-acquiring format, providing step-by-step instructions on the manipulation of address elements specific to the Italian language and traditions. Geocoding analysis was carried out on a health database comprising 268,517 records of the Local Health Unit of Rovigo in the Veneto region, covering a period of 10 years, from 2001 up to 2010. The Map Service provided by the Environmental Systems Research Institute (ESRI, Redlands, CA) and ArcMap 10.0 by ESRI(®) were, respectively, the reference data and the GIS software employed in the geocoding process. The first attempt at geocoding produced a poor quality result, with about 40% of the addresses matched. A procedure of manual standardization was performed in order to enhance the quality of the results; consequently, a set of guiding principles was expounded that should be followed when geocoding health data. High-level geocoding detail will provide a more precise geographic representation of health related events. The main achievement of this study was to outline some of the difficulties encountered during the geocoding of health data and to put forward a set of guidelines, which could be useful to facilitate the process and enhance the quality of the results. Public health informatics represents an emerging specialty that focuses on the application of information science and technology to public health practice and research. Therefore, this study could draw the attention of the National Health Service to the underestimated problem of geocoding accuracy in health related data for environmental risk assessment. © Copyright by Pacini Editore SpA, Pisa, Italy.
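The manual address standardization that raised the match rate can be sketched as a simple normalization pass over raw address strings. The abbreviation table and function below are illustrative assumptions, not the authors' actual procedure:

```python
import re

# Hypothetical expansions of common Italian street-type abbreviations.
ABBREVIATIONS = {
    r"\bV\.\s*": "Via ",
    r"\bP\.ZZA\s*": "Piazza ",
    r"\bC\.SO\s*": "Corso ",
}

def standardize_address(raw: str) -> str:
    """Normalize case, whitespace, and abbreviations before geocoding."""
    addr = raw.upper().strip()
    addr = re.sub(r"\s+", " ", addr)          # collapse repeated spaces
    for pattern, full in ABBREVIATIONS.items():
        addr = re.sub(pattern, full.upper(), addr)
    return addr.title()
```

A pass like this would be applied to every record before submitting it to the geocoder, so that "v. roma 12" and "Via Roma 12" resolve to the same reference street.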
Relief Presentation on US National Park Service Maps
NASA Astrophysics Data System (ADS)
Patterson, Tom
2018-05-01
This paper examines the evolution of relief presentations on maps developed by Harpers Ferry Center, the media service center of the US National Park Service (NPS). Harpers Ferry Center produces the maps used by park visitors. I will discuss five park maps, each with a distinctive relief style and mode of production. They appear in rough chronological order of their development. Recent relief presentations are generally more detailed, colorful, and realistic than those from earlier years. Changing technology is largely responsible for the different relief styles found on park maps. Some relief treatments today were not possible, or imaginable, in 1977 when the NPS established the brochure program in its modern phase. Landscape heterogeneity is another factor behind the development of different relief styles. With over 400 park sites ranging from the glacial mountains of Alaska to the rolling piedmont of Virginia, a one-style-fits-all approach cannot adequately depict all landscapes. NPS maps serve some 300 million park visitors each year. Our ongoing effort to make understandable maps for this diverse audience has further spurred experiments in relief presentation.
Using bedrock geology for making ecological base maps
NASA Astrophysics Data System (ADS)
Heldal, Tom; Solli, Arne; Torgersen, Espen
2017-04-01
To prepare for sustainable future land-use planning, a more holistic approach to nature management is important. This will imply more multidisciplinary research and cooperation across professional borders. In particular, the integration of knowledge about the geosphere and biosphere is needed. Just as the biosphere provides ecosystem services to us, the geosphere provides "geo-system" or "underground" services. In Norway, we have tried to investigate the connection between ecosystems and bedrock geology. The aim was to create various ecological base maps that can be used for improving mapping and investigations of biodiversity. By using geochemical analyses and linking the results to bedrock maps, we managed to get a rather realistic picture of the mineral content of soils formed by the chemical weathering of rocks. This made it possible to make the first national map of Ca-content in the bedrock. In addition, we can construct maps of anomalous soil composition (such as high P, Mg and K). The presentation will outline the methodology for such ecological base maps, and discuss problems, challenges and further research.
NASA Astrophysics Data System (ADS)
Valentina, Gallina; Torresan, Silvia; Giannini, Valentina; Rizzi, Jonathan; Zabeo, Alex; Gualdi, Silvio; Bellucci, Alessio; Giorgi, Filippo; Critto, Andrea; Marcomini, Antonio
2013-04-01
At the international level, the interest in climate services is rising due to the social and economic benefits that different stakeholders can achieve in managing climate risks and taking advantage of the opportunities associated with climate change impacts. However, there is a significant gap in tools aimed at providing information about risks and impacts induced by climate change and allowing non-expert stakeholders to use both climate-model and climate-impact data. Within the CLIM-RUN project (FP7), the case study of the North Adriatic Sea is aimed at analysing the need for climate information and the effectiveness of climate services for the integrated assessment of climate change impacts in coastal zones of the North Adriatic Sea at the regional to local scale. A participative approach was developed and applied to identify relevant stakeholders who have a mandate for coastal zone management and to interact with them in order to elicit their climate information needs. Specifically, the participative approach was carried out by means of two local workshops and through the administration of a questionnaire related to climate information and services. The results of the process allowed the identification of three major themes of interest for local stakeholders (i.e. hydro-climatic regime, coastal and marine environment, agriculture) and their preferences concerning key climate variables (e.g. extreme events, sea-level, wave height), mid-term temporal projections (i.e. for the next 30-40 years) and medium-high spatial resolution (i.e. from 1 to 50 km). Furthermore, the workshops highlighted stakeholder concern about several climate-related impacts (e.g. sea-level rise, storm surge, droughts) and vulnerable receptors (e.g. beaches, wetlands, agricultural areas) to be considered in vulnerability and risk assessment studies for the North Adriatic coastal zones.
This information was used by climate and environmental risk experts in order to develop targeted climate information and services (e.g. climate projections and maps) for coastal stakeholders. The final results include climate products developed by climate experts through the analysis of climate observations and scenarios (e.g. standard indices of extreme precipitations and droughts, consecutive days of heavy rain, mean sea level pressure) and risk-based maps supplied by environmental risk experts to facilitate the definition of adaptation strategies (e.g. sea-level rise/storm surge risk maps with the surface of receptor lost; drought risk maps with the percentage of suffering agricultural areas). The preliminary climate products and the results of North Adriatic case study will be here presented and discussed.
NASA Technical Reports Server (NTRS)
Teng, William; Maidment, David; Rodell, Matthew; Strub, Richard; Arctur, David; Ames, Daniel; Rui, Hualan; Vollmer, Bruce; Seiler, Edward
2014-01-01
An ongoing NASA-funded Data Rods (time series) project has demonstrated the removal of a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for selected variables of the North American and Global Land Data Assimilation Systems (NLDAS and GLDAS, respectively) and other NASA data sets. Data rods are pre-generated or generated on-the-fly (OTF), leveraging the NASA Simple Subset Wizard (SSW), a gateway to NASA data centers. Data rods Web services are accessible through the CUAHSI Hydrologic Information System (HIS) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) but are not easily discoverable by users of other non-NASA data systems. An ongoing GEOSS Water Services project aims to develop a distributed, global registry of water data, map, and modeling services cataloged using the standards and procedures of the Open Geospatial Consortium and the World Meteorological Organization. Preliminary work has shown GEOSS can be leveraged to help provide access to data rods. A new NASA-funded project is extending this early work.
An Interoperable, Agricultural Information System Based on Satellite Remote Sensing Data
NASA Technical Reports Server (NTRS)
Teng, William; Chiu, Long; Doraiswamy, Paul; Kempler, Steven; Liu, Zhong; Pham, Long; Rui, Hualan
2005-01-01
Monitoring global agricultural crop conditions during the growing season and estimating potential seasonal production are critically important for market development of U.S. agricultural products and for global food security. The Goddard Space Flight Center Earth Sciences Data and Information Services Center Distributed Active Archive Center (GES DISC DAAC) is developing an Agricultural Information System (AIS), evolved from an existing TRMM Online Visualization and Analysis System (TOVAS), which will operationally provide satellite remote sensing data products (e.g., rainfall) and services. The data products will include crop condition and yield prediction maps, generated from a crop growth model with satellite data inputs, in collaboration with the USDA Agricultural Research Service. The AIS will enable remote, interoperable access to distributed data, by using the GrADS-DODS Server (GDS) and by being compliant with Open GIS Consortium standards. Users will be able to download individual files, perform interactive online analysis, and receive operational data flows. AIS outputs will be integrated into existing operational decision support systems for global crop monitoring, such as those of the USDA Foreign Agricultural Service and the U.N. World Food Program.
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies, a framework with a standardized web interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized web API is feasible. This framework can be easily enhanced due to its modular design.
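A client of such a JSON-returning mapping service typically has to pick one code from a ranked candidate list. A minimal sketch of that selection step; the candidate format, the `cui`/`score` field names, and the threshold are assumptions for illustration, not the framework's actual response schema:

```python
def select_best_mapping(candidates, threshold=0.7):
    """Pick the highest-scoring candidate concept from a mapper's
    JSON response, or return None if nothing clears the threshold
    (mirroring the limited accuracy of automatic assignment).

    Each candidate is assumed to look like {"cui": "...", "score": 0.x}.
    """
    if not candidates:
        return None
    best = max(candidates, key=lambda c: c["score"])
    return best["cui"] if best["score"] >= threshold else None

# Hypothetical response for a single data item:
response = [
    {"cui": "C0011849", "score": 0.92},
    {"cui": "C0012634", "score": 0.55},
]
```

Low-scoring items would then be routed to the manual specialist review the abstract describes, rather than coded automatically.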
Mapping Ophthalmic Terms to a Standardized Vocabulary.
ERIC Educational Resources Information Center
Patrick, Timothy B.; Reid, John C.; Sievert, MaryEllen; Popescu, Mihail; Gigantelli, James W.; Shelton, Mark E.; Schiffman, Jade S.
2000-01-01
Describes work by the American Academy of Ophthalmology (AAO) to expand the standardized vocabulary, Systematized Nomenclature of Medicine (SNOMED), to accommodate a definitive ophthalmic standardized vocabulary. Mapped a practice-based clinical ophthalmic vocabulary to SNOMED and other vocabularies in the Metathesaurus of the Unified Medical…
EnviroAtlas -- Milwaukee, WI -- One Meter Resolution Urban Land Cover Data (2010) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Milwaukee, WI land cover data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 85.39 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Milwaukee. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-
EnviroAtlas -- Woodbine, IA -- One Meter Resolution Urban Land Cover Data (2011) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Woodbine, IA land cover (LC) data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2011 at 1 m spatial resolution. Six land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, and agriculture. An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 87.03 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Woodbine. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
EnviroAtlas -- Portland, ME -- One Meter Resolution Urban Land Cover (2010) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The Portland, ME land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a stratified random sampling of 600 samples yielded an overall accuracy of 87.5 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Portland. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
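The overall-accuracy figures reported for these land cover maps are conventionally computed from an error (confusion) matrix built from the reference samples. A minimal sketch, using an invented two-class matrix rather than the actual EnviroAtlas assessment data:

```python
def overall_accuracy(confusion):
    """Overall accuracy (percent) from a square confusion matrix,
    where confusion[i][j] counts reference-class-i samples that the
    map assigned to class j. Correct classifications lie on the
    diagonal."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return 100.0 * correct / total

# Invented example: 100 reference samples over two classes.
matrix = [
    [50, 5],   # class A: 50 correct, 5 confused with B
    [10, 35],  # class B: 10 confused with A, 35 correct
]
```

Per-class producer's and user's accuracies come from the same matrix by dividing the diagonal entry by its row or column sum, respectively.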
Soller, David R.
1996-01-01
This report summarizes a technical review of USGS Open-File Report 95-525, 'Cartographic and Digital Standard for Geologic Map Information' and OFR 95-526 (diskettes containing digital representations of the standard symbols). If you are considering the purchase or use of those documents, you should read this report first. For some purposes, OFR 95-525 (the printed document) will prove to be an excellent resource. However, technical review identified significant problems with the two documents that will be addressed by various Federal and State committees composed of geologists and cartographers, as noted below. Therefore, the 2-year review period noted in OFR 95-525 is no longer applicable. Until those problems are resolved and formal standards are issued, you may consult the following World-Wide Web (WWW) site which contains information about development of geologic map standards: URL: http://ncgmp.usgs.gov/ngmdbproject/home.html
Yang, Dan; Xu, Bin; Rao, Kaiyou; Sheng, Weihua
2018-01-24
Indoor occupants' positions are significant for smart home service systems, which usually consist of robot services, appliance control and other intelligent applications. In this paper, an innovative localization method is proposed for tracking humans' positions in indoor environments based on passive infrared (PIR) sensors, using an accessibility map and an A-star algorithm, aiming at providing intelligent services. First, the accessibility map reflecting the visiting habits of the occupants is established through training that integrates the indoor environment layout and other prior knowledge. Then the PIR sensors, whose placement depends on the training results in the accessibility map, provide rough location information. For more precise positioning, the A-star algorithm is used to refine the localization, fused with the accessibility map and the PIR sensor data. Experiments were conducted in a mock apartment testbed. The ground truth data was obtained from an Opti-track system. The results demonstrate that the proposed method is able to track persons in a smart home environment and provides a solution for home robot localization.
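The path-refinement step can be sketched with a standard A* search over a grid whose cell weights stand in for learned accessibility scores (frequently visited cells are cheaper to traverse). The grid encoding, with negative values as obstacles, and the cost model are illustrative assumptions, not the authors' implementation:

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """A* over a 2-D accessibility grid.

    grid[r][c] is an added traversal cost (lower = more frequently
    visited); a negative value marks an obstacle such as furniture.
    Returns the list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])

    def h(a, b):  # Manhattan distance, admissible since each step costs >= 1
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    tie = itertools.count()  # tiebreaker so the heap never compares nodes
    open_set = [(h(start, goal), next(tie), 0.0, start, None)]
    came_from = {}
    best_g = {start: 0.0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue  # already expanded via a cheaper route
        came_from[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:  # walk parents back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] >= 0:
                ng = g + 1 + grid[nxt[0]][nxt[1]]  # step cost + accessibility cost
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt, goal), next(tie), ng, nxt, cur))
    return None
```

In the paper's setting, `start` would be the last confirmed position and `goal` the cell suggested by the latest PIR firing; the returned path gives the refined trajectory estimate.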
Code of Federal Regulations, 2014 CFR
2014-07-01
..., maps, and other evidence and accounting procedures and practices, sufficient to reflect properly— (1... contractors for professional services, shall maintain books, documents, papers, maps, and records which are... contractors, including professional services contracts, shall be subject at all reasonable times to inspection...
Uncertainty visualisation in the Model Web
NASA Astrophysics Data System (ADS)
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful.
Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation for non-experts and a first overview by separating the value and uncertainty maps. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, with sliders to interactively move through time, space and uncertainty (thresholds).
Mapping specific soil functions based on digital soil property maps
NASA Astrophysics Data System (ADS)
Pásztor, László; Fodor, Nándor; Farkas-Iványi, Kinga; Szabó, József; Bakacsi, Zsófia; Koós, Sándor
2016-04-01
Quantification of soil functions and services is a great challenge in itself, even before their spatial relevance is identified and regionalized. Proxies and indicators are widely used in ecosystem service mapping. Soil services could also be approximated by elementary soil features. One solution is the association of soil types with services as a basic principle. Soil property maps, however, provide quantified spatial information, which could be utilized more versatilely for the spatial inference of soil functions and services. In the frame of the activities referred to as "Digital, Optimized, Soil Related Maps and Information in Hungary" (DOSoReMI.hu), numerous soil property maps have been compiled so far with proper DSM techniques, partly according to GSM.net specifications and partly by changing, slightly or more substantially, some of its predefined parameters (depth intervals, pixel size, property etc.). The elaborated maps have been further utilized, since DOSoReMI.hu was intended to take steps toward the regionalization of higher level soil information (secondary properties, functions, services). In the meantime, the recently started AGRAGIS project requested spatial soil-related information in order to estimate agri-environmental impacts of climate change and support the associated vulnerability assessment. One of the most vulnerable services of soils in the context of climate change is their provisioning service. In our work it was approximated by productivity, which was estimated by sequential scenario-based crop modelling. It took into consideration long term (50 years) time series of both measured and predicted climatic parameters as well as accounting for potential differences in agricultural practice and crop production.
The flexible parametrization and multiple results of the modelling were then applied for the spatial assessment of sensitivity, vulnerability, exposure and adaptive capacity of soils in the context of the forecasted changes in climatic conditions in the Carpathian Basin. In addition to soil fertility, degradation risk due to N-leaching was also assessed by the model runs by taking into account the movement of nitrate in the profile during the simulated periods. Our paper will present the resulting national maps and some conclusions drawn from the experiences. Acknowledgement: Our work was supported by Iceland, Liechtenstein and Norway through the EEA Grants and the REC (Project No: EEA C12-12) and the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
GeoNetwork powered GI-cat: a geoportal hybrid solution
NASA Astrophysics Data System (ADS)
Baldini, Alessio; Boldrini, Enrico; Santoro, Mattia; Mazzetti, Paolo
2010-05-01
In setting up a Spatial Data Infrastructure (SDI), the creation of a system for metadata management and discovery plays a fundamental role. An effective solution is the use of a geoportal (e.g. the FAO/ESA geoportal), which has the important benefit of being accessible from a web browser. With this work we present a solution based on integrating two of the available frameworks: GeoNetwork and GI-cat. GeoNetwork is open-source software designed to improve accessibility of a wide variety of data together with the associated ancillary information (metadata), at different scales and from multidisciplinary sources; data are organized and documented in a standard and consistent way. GeoNetwork implements both the Portal and Catalog components of a Spatial Data Infrastructure (SDI) defined in the OGC Reference Architecture. It provides tools for managing and publishing metadata on spatial data and related services. GeoNetwork allows harvesting of various types of web data sources, e.g. OGC Web Services (CSW, WCS, WMS). GI-cat is a distributed catalog based on a service-oriented framework of modular components and can be customized and tailored to support different deployment scenarios. It can federate a multiplicity of catalog services, as well as inventory and access services, in order to discover and access heterogeneous ESS resources. The federated resources are exposed by GI-cat through several standard catalog interfaces (e.g. OGC CSW AP ISO, OpenSearch, etc.) and by the GI-cat extended interface. Specific components implement mediation services for interfacing heterogeneous service providers, each of which exposes a specific standard specification; such components are called Accessors. These mediating components resolve the providers' data model multiplicity by mapping them onto the GI-cat internal data model, which implements the ISO 19115 Core profile.
Accessors also implement query protocol mapping: they translate query requests expressed in the interface protocols exposed by GI-cat into the multiple query dialects spoken by the resource service providers. Currently, a number of well-accepted catalog and inventory services are supported, including several OGC Web Services, the THREDDS Data Server, SeaDataNet Common Data Index, GBIF and OpenSearch engines. A GeoNetwork-powered GI-cat has been developed in order to exploit the best of the two frameworks. The new system uses a modified version of the GeoNetwork web interface that can query a specified GI-cat catalog in addition to the GeoNetwork internal database. The resulting system is a geoportal in which GI-cat plays the role of the search engine. This makes it possible to distribute a query across the different types of data sources linked to a GI-cat instance; the resulting metadata are then visualized through the GeoNetwork web interface. This configuration was tested in the framework of GIIDA, a project of the Italian National Research Council (CNR) focused on data accessibility and interoperability. A second advantage of this solution is obtained by placing a GeoNetwork catalog among the accessors of the GI-cat instance. Such a configuration allows GI-cat, in turn, to run queries against the internal GeoNetwork database, making both the harvesting and metadata-editor functionalities of GeoNetwork and the distributed search functionality of GI-cat available in a consistent way through the same web interface.
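The accessor idea described above can be sketched in a few lines. This is an illustrative simplification, not GI-cat's actual implementation: the class names, field names and the reduced internal record model (a few ISO 19115 core-like fields) are all our own assumptions.

```python
from dataclasses import dataclass

# Minimal internal record model: a small subset of ISO 19115 core-style
# fields (title, abstract, bounding box), chosen for illustration only.
@dataclass
class CatalogRecord:
    title: str
    abstract: str
    bbox: tuple  # (west, south, east, north)

class Accessor:
    """Mediates between one provider's data model and the internal model."""
    def to_internal(self, raw: dict) -> CatalogRecord:
        raise NotImplementedError

class CswAccessor(Accessor):
    # Hypothetical CSW/Dublin Core-style response fields.
    def to_internal(self, raw):
        return CatalogRecord(
            title=raw["dc:title"],
            abstract=raw["dct:abstract"],
            bbox=tuple(raw["ows:BoundingBox"]),
        )

class OpenSearchAccessor(Accessor):
    # Hypothetical OpenSearch/Atom-style response fields.
    def to_internal(self, raw):
        return CatalogRecord(
            title=raw["title"],
            abstract=raw["summary"],
            bbox=tuple(raw["geo:box"]),
        )

def federated_search(accessors_and_results):
    """Merge heterogeneous provider results into one homogeneous list,
    so a single portal interface can present them uniformly."""
    merged = []
    for accessor, results in accessors_and_results:
        merged.extend(accessor.to_internal(r) for r in results)
    return merged
```

The point of the pattern is that each new provider type costs one accessor class, while the portal and search engine see only `CatalogRecord`.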
Cloud GIS Based Watershed Management
NASA Astrophysics Data System (ADS)
Bediroğlu, G.; Colak, H. E.
2017-11-01
In this study, we built a Cloud GIS based watershed management system using a cloud computing architecture. Cloud GIS is used both as SaaS (Software as a Service) and DaaS (Data as a Service): we ran GIS analyses on the cloud to test the SaaS side, and deployed GIS datasets on the cloud to test the DaaS side. We adopted a hybrid cloud computing model, making use of ready web-based mapping services hosted on the cloud (world topology, satellite imagery). After creating geodatabases covering hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology and land use, we uploaded them to the system. The watershed of the study area was delineated on the cloud using the ready-hosted topology maps. After uploading all the datasets, we applied various GIS analyses and queries. The results show that Cloud GIS technology brings speed and efficiency to watershed management studies; moreover, the system can easily be adapted for similar land analysis and management studies.
2014-01-01
Background The health and survival of women and their new-born babies in low income countries has been a key priority in public health since the 1990s. However, basic planning data, such as numbers of pregnancies and births, remain difficult to obtain and information is also lacking on geographic access to key services, such as facilities with skilled health workers. For maternal and newborn health and survival, planning for safer births and healthier newborns could be improved by more accurate estimations of the distributions of women of childbearing age. Moreover, subnational estimates of projected future numbers of pregnancies are needed for more effective strategies on human resources and infrastructure, while there is a need to link information on pregnancies to better information on health facilities in districts and regions so that coverage of services can be assessed. Methods This paper outlines demographic mapping methods based on freely available data for the production of high resolution datasets depicting estimates of numbers of people, women of childbearing age, live births and pregnancies, and distribution of comprehensive EmONC facilities in four large high burden countries: Afghanistan, Bangladesh, Ethiopia and Tanzania. Satellite derived maps of settlements and land cover were constructed and used to redistribute areal census counts to produce detailed maps of the distributions of women of childbearing age. Household survey data, UN statistics and other sources on growth rates, age specific fertility rates, live births, stillbirths and abortions were then integrated to convert the population distribution datasets to gridded estimates of births and pregnancies. Results and conclusions These estimates, which can be produced for current, past or future years based on standard demographic projections, can provide the basis for strategic intelligence, planning services, and provide denominators for subnational indicators to track progress. 
The datasets produced are part of national midwifery workforce assessments conducted in collaboration with the respective Ministries of Health and the United Nations Population Fund (UNFPA) to identify disparities between population needs, health infrastructure and workforce supply. The datasets are available to the respective Ministries as part of the UNFPA programme to inform midwifery workforce planning and also publicly available through the WorldPop population mapping project. PMID:24387010
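The core step in the mapping method above, redistributing areal census counts onto a grid using satellite-derived settlement/land-cover weights (dasymetric mapping), can be sketched as follows. This is a minimal illustration under our own assumptions; the function names, the single aggregate fertility rate and the example numbers are not from the study.

```python
def redistribute_count(areal_count, cell_weights):
    """Distribute one census unit's total across its grid cells in
    proportion to settlement/land-cover weights (dasymetric mapping);
    zero-weight cells (e.g. unsettled land) receive nothing."""
    total_weight = sum(cell_weights)
    if total_weight == 0:
        raise ValueError("census unit has no weighted cells")
    return [areal_count * w / total_weight for w in cell_weights]

def births_from_population(women_counts, fertility_rate):
    """Convert gridded counts of women of childbearing age into estimated
    annual live births, using a single aggregate fertility rate for
    simplicity (real workflows use age-specific rates)."""
    return [w * fertility_rate for w in women_counts]
```

Note that the redistribution is mass-preserving: the gridded values always sum back to the original census count, which is what makes the outputs usable as denominators for subnational indicators.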
EnviroAtlas -Pittsburgh, PA- One Meter Resolution Urban Land Cover Data (2010) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas).The EnviroAtlas Pittsburgh, PA land cover map was generated from United States Department of Agriculture (USDA) National Agricultural Imagery Program (NAIP) four band (red, green, blue, and near infrared) aerial photography at 1 m spatial resolution. Imagery was collected on multiple dates in June 2010. Five land cover classes were mapped: water, impervious surfaces, soil and barren land, trees and forest, and grass and herbaceous non-woody vegetation. An accuracy assessment of 500 completely random and 81 stratified random points yielded an overall accuracy of 86.57 percent. The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Pittsburgh, PA. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
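The overall accuracy figure reported above is conventionally computed from an error (confusion) matrix of reference points versus mapped classes. A minimal sketch, with an invented two-class matrix for illustration (the EnviroAtlas assessment used five classes and 581 points):

```python
def overall_accuracy(confusion):
    """Overall accuracy = correctly classified points (matrix diagonal)
    divided by the total number of reference points."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total
```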
Zhao, Chang; Sander, Heather A.
2015-01-01
Studies that assess the distribution of benefits provided by ecosystem services across urban areas are increasingly common. Nevertheless, current knowledge of both the supply and demand sides of ecosystem services remains limited, leaving a gap in our understanding of the balance between ecosystem service supply and demand that restricts our ability to assess and manage these services. The present study seeks to fill this gap by developing and applying an integrated approach to quantifying the supply and demand of a key ecosystem service, carbon storage and sequestration, at the local level. This approach follows three basic steps: (1) quantifying and mapping service supply based upon Light Detection and Ranging (LiDAR) processing and allometric models, (2) quantifying and mapping demand for carbon sequestration using an indicator based on local anthropogenic CO2 emissions, and (3) mapping a supply-to-demand ratio. We illustrate this approach using a portion of the Twin Cities Metropolitan Area of Minnesota, USA. Our results indicate that 1735.69 million kg carbon are stored by urban trees in our study area. Annually, 33.43 million kg carbon are sequestered by trees, whereas 3087.60 million kg carbon are emitted by human sources. Thus, the carbon sequestration service provided by urban trees in the study location plays a minor role in combating climate change, offsetting approximately 1% of local anthropogenic carbon emissions per year, although avoided emissions via storage in trees are substantial. Our supply-to-demand ratio map provides insight into the balance between carbon sequestration supply in urban trees and demand for such sequestration at the local level, pinpointing critical locations where higher levels of supply and demand exist. Such a ratio map could help planners and policy makers to assess and manage the supply of and demand for carbon sequestration. PMID:26317530
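The "approximately 1%" offset and the step-(3) ratio map follow from simple arithmetic on the reported totals. A sketch (our own helper names; the tiny example grid is invented, only the 33.43 and 3087.60 million kg figures come from the abstract):

```python
def sequestration_offset(sequestered_kg, emitted_kg):
    """Fraction of annual anthropogenic emissions offset by annual
    tree sequestration (supply / demand, aggregated over the area)."""
    return sequestered_kg / emitted_kg

def ratio_map(supply_grid, demand_grid):
    """Cell-by-cell supply-to-demand ratio; None where a cell has no
    demand, so zero-emission cells are not mapped as infinite supply."""
    return [[s / d if d else None for s, d in zip(srow, drow)]
            for srow, drow in zip(supply_grid, demand_grid)]
```

Using the study's totals, 33.43 / 3087.60 ≈ 0.011, i.e. roughly 1% of emissions offset per year.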
Sibthorpe, Beverly; Gardner, Karen; McAullay, Daniel
2016-01-01
A rapidly expanding interest in quality in the Aboriginal-community-controlled health sector has led to widespread uptake of accreditation using more than one set of standards, a proliferation of continuous quality improvement programs and the introduction of key performance indicators. As yet, there has been no overarching logic that shows how they relate to each other, with consequent confusion within and outside the sector. We map the three approaches to the Framework for Performance Assessment in Primary Health Care, demonstrating their key differences and complementarity. There needs to be greater attention in both policy and practice to the purposes and alignment of the three approaches if they are to embed a system-wide focus that supports quality improvement at the service level.
Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.
2017-12-01
Empowering end-users like pastoralists, land management specialists and land policy makers in the use of earth observation data, for both day-to-day and seasonal planning, requires interactive delivery of multiple geospatial datasets and the capability to support on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The wide adoption of large data archives, like those produced by earth observation missions, is often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data centres and outputs visualized on web front-ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of open geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front-end. In this way, the complexities of data locality and compute execution are masked from the end user.
On-the-fly computation of products such as NDVI, Leaf Area Index and vegetation cover from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage, along with flexible front-ends, allows the democratization of data dissemination and, we hope, better outcomes for the planet.
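Of the on-the-fly products mentioned, NDVI is the simplest to illustrate: it is the normalized difference of the near-infrared and red reflectance bands, computed per pixel. A minimal sketch (our own function names; GSKY's actual implementation is server-side and array-based):

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red) for one pixel; ranges from -1
    to 1, with dense green vegetation typically above ~0.5. A zero
    denominator (no signal in either band) is mapped to 0.0."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

def ndvi_grid(nir_band, red_band):
    """Apply the per-pixel index across two co-registered bands."""
    return [[ndvi(n, r) for n, r in zip(nrow, rrow)]
            for nrow, rrow in zip(nir_band, red_band)]
```

Serving such a product through WPS rather than shipping both source bands to the user is exactly the data-locality point the abstract makes.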
Cooley, Michael J.; Davis, Larry R.; Fishburn, Kristin A.; Lestinsky, Helmut; Moore, Laurence R.
2011-01-01
A full-size style sheet template in PDF that defines the placement of map elements, marginalia, and font sizes and styles accompanies this standard. The GeoPDF US Topo maps are fashioned to conform to this style sheet so that a user can print out a map at the 1:24,000-scale using the dimensions of the traditional standard 7.5-minute quadrangle. Symbology and type specifications for feature content are published separately. In addition, the GeoPDF design allows for custom printing, so that a user may zoom in and out, turn layers on and off, and view or print any combination of layers or any map portion at any desired scale.
ERIC Educational Resources Information Center
Govender, Nadaraj
2015-01-01
This case study explored the development of two pre-service teachers' subject matter knowledge (SMK) of electromagnetism while integrating the use of concept maps (CM) and collaborative learning (CL) strategies. The study aimed at capturing how these pre-service teachers' SMK in electromagnetism was enhanced after having been taught SMK in a…
ERIC Educational Resources Information Center
Petzold, Donald; Heppen, John
2005-01-01
Many student geography organizations or clubs associated with colleges and universities undertake community service projects each year to meet local needs and to gain recognition within the community. A uniquely geographical project of playground map painting provides a great community service and goes one step further by incorporating elements of…
Research notes : rainfall maps for the 21st century.
DOT National Transportation Integrated Search
2007-12-01
The report and included maps represent an update of the information contained in the precipitation-frequency atlas, published by the National Weather Service in 1973 (NOAA Atlas 2). Data collection for the National Weather Service (NWS) study ended i...
Baby Brain Map
2016-03-17
The Brain Map was adapted in 2006 by ZERO TO ...
NASA Astrophysics Data System (ADS)
O'Neal, M. L.
2005-12-01
Science teaching reforms of the past 10 to 20 years have focused on a pedagogical shift from verification-style laboratory exercises, toward hands-on and inquiry-based constructivist teaching methods. Such methods, however, require teachers to be proficient in more than just basic content and teaching strategies. To be effective teachers, these professionals must also be skilled in the design and implementation of research-style investigations. At Loyola College in Maryland, topics in the earth and environmental sciences are used as the basis for field research projects that teach our students science content, along with how to design age-appropriate investigative activities and how to implement them in a stimulating, inquiry-based learning environment. Presented here are examples of three projects, demonstrating how these themes are woven throughout our pre- and in-service teacher preparation programs, at both undergraduate and graduate levels. 1. Watershed Studies - In our undergraduate, pre-service, elementary education teacher preparation program, students design and implement a water quality study in a local watershed. In the classroom, students use topographic maps and aerial photographs to delineate the watersheds' boundaries, to identify current land use patterns, and to select appropriate locations on the trunk stream for testing. Water testing at these sites is conducted during field trips, with data analysis and interpretation performed on-site. On-site work allows students to make connections between stream water quality and adjacent land use practices. Students then relate the content and research results to science teaching standards, in order to develop a unit-plan for use in their future classrooms. 2. Land Use Assessment - In our graduate, in-service, elementary and middle school science program, a local stream valley is used as the basis for an analysis of potential land use changes. 
Students first construct a topographic base map of the area, and then generate current land use/cover type maps. Soil texture, moisture, and depth data, as well as slope angle and infiltration/runoff potential information are collected throughout the map area, in order to assess the impact of proposed residential or agricultural land use changes. Students create maps delineating suitability and erosion potential, based upon their topographic maps and site data. A proposal for an analogous study, near the students' schools, is developed for use with their own students, as culmination of the project. 3. Climate Change - In our graduate, in-service, middle and high school earth science program, students are exposed to field research methods during a summer research project investigating relict shorelines of the Chesapeake Bay. In this project, students collect subsurface geophysical, sedimentological, and biological data through the use of ground penetrating radar, vibracoring, and hand-augering equipment. By combining the stratigraphy revealed in the radar records, with paleoenvironmental interpretations from sediment analyses and age estimates from fossil material encountered, students are able to construct cross sections of the region, delineating littoral deposits stemming from climate-induced, higher-than-present sea-level incursions. Students then prepare field and laboratory exercises for their own classrooms, relating the design and discoveries of the study to their own students. The students also participate in the preparation and presentation of their study in national and international scientific venues.
cPath: open source software for collecting, storing, and querying biological pathways
Cerami, Ethan G; Bader, Gary D; Gross, Benjamin E; Sander, Chris
2006-01-01
Background Biological pathways, including metabolic pathways, protein interaction networks, signal transduction pathways, and gene regulatory networks, are currently represented in over 220 diverse databases. These data are crucial for the study of specific biological processes, including human diseases. Standard exchange formats for pathway information, such as BioPAX, CellML, SBML and PSI-MI, enable convenient collection of this data for biological research, but mechanisms for common storage and communication are required. Results We have developed cPath, an open source database and web application for collecting, storing, and querying biological pathway data. cPath makes it easy to aggregate custom pathway data sets available in standard exchange formats from multiple databases, present pathway data to biologists via a customizable web interface, and export pathway data via a web service to third-party software, such as Cytoscape, for visualization and analysis. cPath is software only, and does not include new pathway information. Key features include: a built-in identifier mapping service for linking identical interactors and linking to external resources; built-in support for PSI-MI and BioPAX standard pathway exchange formats; a web service interface for searching and retrieving pathway data sets; and thorough documentation. The cPath software is freely available under the LGPL open source license for academic and commercial use. Conclusion cPath is a robust, scalable, modular, professional-grade software platform for collecting, storing, and querying biological pathways. It can serve as the core data handling component in information systems for pathway visualization, analysis and modeling. PMID:17101041
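The built-in identifier mapping service mentioned among cPath's key features can be sketched as a table of equivalences resolved to a canonical identifier. This is an illustrative simplification under our own assumptions, not cPath's actual Java implementation; the example identifiers are just well-known aliases used for demonstration.

```python
class IdentifierMapper:
    """Links records that refer to the same interactor under different
    database identifiers by resolving every alias to one canonical id."""

    def __init__(self):
        self._canonical = {}  # alias -> canonical identifier

    def add_equivalence(self, canonical, aliases):
        """Register a canonical id and the aliases that map onto it."""
        self._canonical[canonical] = canonical
        for alias in aliases:
            self._canonical[alias] = canonical

    def resolve(self, identifier):
        """Return the canonical id, or the input unchanged if unknown."""
        return self._canonical.get(identifier, identifier)

    def same_interactor(self, id_a, id_b):
        """True if two identifiers resolve to the same canonical id."""
        return self.resolve(id_a) == self.resolve(id_b)
```

Resolving aliases before aggregation is what lets records imported from different PSI-MI or BioPAX sources be linked as one interactor.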
Dunford, Robert W; Smith, Alison C; Harrison, Paula A; Hanganu, Diana
Future patterns of European ecosystem services provision are likely to vary significantly as a result of climatic and socio-economic change and the implementation of adaptation strategies. However, there is little research on mapping future ecosystem services and no integrated assessment approach for mapping the combined impacts of these drivers. This study maps changing patterns in ecosystem services for different European futures in order to (a) identify the role of driving forces and (b) explore the potential influence of different adaptation options. The CLIMSAVE integrated assessment platform is used to map spatial patterns in services (food, water and timber provision, atmospheric regulation, biodiversity existence/bequest, landscape experience and land use diversity) for a number of combined climatic and socio-economic scenarios. Eight adaptation strategies are explored within each scenario. Future service provision (particularly water provision) will be significantly impacted by climate change. Socio-economic changes shift patterns of service provision: more dystopian societies focus on food provision at the expense of other services. Adaptation options offer significant opportunities, but may necessitate trade-offs between services, particularly between agriculture- and forestry-related services. Unavoidable trade-offs between regions (particularly South-North) are also identified in some scenarios. Coordinating adaptation across regions and sectors will be essential to ensure that all needs are met: a factor that will become increasingly pressing under dystopian futures where inter-regional cooperation breaks down. Integrated assessment enables exploration of interactions and trade-offs between ecosystem services, highlighting the importance of taking account of complex cross-sectoral interactions under different future scenarios when planning adaptation responses.
Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost
NASA Astrophysics Data System (ADS)
Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.
2016-12-01
The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions, as well as specific infrastructure allowing overview of and access to the datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System (PerSys), conceptualized as an open-access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on the OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb).
The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.
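A WMS layer like the ones PerSys plans to publish is retrieved through a standard GetMap request. The sketch below assembles such a request per the OGC WMS 1.3.0 specification; the endpoint and layer name are placeholders of ours, not real PerSys identifiers.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Assemble an OGC WMS 1.3.0 GetMap request URL.

    Caveat: in WMS 1.3.0 the BBOX axis order follows the CRS, so for
    EPSG:4326 it is (min_lat, min_lon, max_lat, max_lon)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)
```

Because the request is fully standardized, the same URL pattern works against any compliant server, whether backed by ArcGIS for Server, GeoServer or ADAGUC, which is the interoperability argument running through this whole collection.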
Javaid, M K; Kyer, C; Mitchell, P J; Chana, J; Moss, C; Edwards, M H; McLellan, A R; Stenmark, J; Pierroz, D D; Schneider, M C; Kanis, J A; Akesson, K; Cooper, C
2015-11-01
Fracture Liaison Services are the best model to prevent secondary fractures. The International Osteoporosis Foundation developed a Best Practice Framework to provide a quality benchmark. After a year of implementation, we confirmed that a single framework with set criteria is able to benchmark services across healthcare systems worldwide. Despite evidence for the clinical effectiveness of secondary fracture prevention, translation in the real-world setting remains disappointing. Where implemented, a wide variety of service models are used to deliver effective secondary fracture prevention. To support use of effective models of care across the globe, the International Osteoporosis Foundation's Capture the Fracture® programme developed a Best Practice Framework (BPF) tool of criteria and standards to provide a quality benchmark. We now report findings after the first 12 months of implementation. A questionnaire for the BPF was created and made available to institutions on the Capture the Fracture website. Responses from institutions were used to assign gold, silver, bronze or black (insufficient) levels of achievement mapped across five domains. Through an interactive process with the institution, a final score was determined and published on the Capture the Fracture website's Fracture Liaison Service (FLS) map. Sixty hospitals across six continents submitted their questionnaires. The hospitals served populations from 20,000 to 15 million and were a mix of private and publicly funded. Each FLS managed 146 to 6200 fragility fracture patients per year, with a total of 55,160 patients across all sites. Overall, 27 hospitals scored gold, 23 silver and 10 bronze. The pathway for hip fracture patients had the highest proportion of gold grading, while vertebral fracture had the lowest. In the first 12 months, we have successfully tested the BPF tool in a range of health settings across the globe.
Initial findings confirm a significant heterogeneity in service provision and highlight the importance of a global approach to ensure high quality secondary fracture prevention services.
The service blueprint as a tool for designing innovative pharmaceutical services.
Holdford, D A; Kennedy, D T
1999-01-01
To describe service blueprints, discuss their need and design, and provide examples of their use in advancing pharmaceutical care. Service blueprints are pictures or maps of service processes that permit the people involved in designing, providing, managing, and using the service to better understand them and deal with them objectively. A service blueprint simultaneously depicts the service process and the roles of consumers, service providers, and supporting services. Service blueprints can be useful in pharmacy because many of the obstacles to pharmaceutical care are a result of insufficient planning by service designers and/or poor communication between those designing services and those implementing them. One consequence of this poor design and communication is that many consumers and third party payers are uninformed about pharmacist roles. Service blueprints can be used by pharmacists to promote the value of pharmaceutical care to consumers and other decision makers. They can also assist in designing better pharmaceutical services. Blueprints are designed by identifying and mapping a process from the consumer's point of view, mapping employee actions and support activities, and adding visible evidence of service at each consumer action step. Key components of service blueprints are consumer actions, "onstage" and "backstage" employee actions, and support processes. Blueprints can help pharmacy managers identify and correct problems with the service process, provide pharmacy employees an opportunity to offer feedback in the planning stages of services, and demonstrate the value of pharmaceutical services to consumers. Service blueprints can be a valuable tool for designing, implementing, and evaluating pharmacy services.
The climate4impact platform: Providing, tailoring and facilitating climate model data access
NASA Astrophysics Data System (ADS)
Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael
2017-04-01
One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfil this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, use cases and best-practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface based on the ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the results of the global climate models of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data.
The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiments (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could be later integrated into the European Copernicus platform.
Characterizing and Mapping of Ecosystem Services (CMESs) Literature Database Version 1.0
Ecosystem services (ESs) represent an ecosystem’s capacity for satisfying essential human needs, directly or indirectly, above that required to maintain ecosystem integrity (structure, function and processes). The spatial characterization and mapping of ESs is an essential first ...
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial-standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off-the-Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
Effectiveness of Mind Mapping in English Teaching among VIII Standard Students
ERIC Educational Resources Information Center
Hallen, D.; Sangeetha, N.
2015-01-01
The aim of the study is to find out the effectiveness of the mind mapping technique over the conventional method in teaching English at high school level (VIII), in terms of control and experimental groups. The sample of the study comprised 60 VIII Standard students in Tiruchendur Taluk. Mind Maps and an Achievement Test (Pretest & Posttest) were…
Map-IT! A Web-Based GIS Tool for Watershed Science Education.
ERIC Educational Resources Information Center
Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.
This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…
Reshadat, S; Saedi, S; Zangeneh, A; Ghasemi, S R; Gilan, N R; Karbasi, A; Bavandpoor, E
2015-09-08
Geographic information systems (GIS) analysis has not been widely used in underdeveloped countries to ensure that vulnerable populations have access to primary health-care services. This study applied GIS methods to analyse the spatial accessibility of the population of Kermanshah city, Islamic Republic of Iran, to urban primary-care centres, by age and sex group. In a descriptive-analytical study over 3 time periods, network analysis, mean centre and standard distance methods were applied using ArcGIS 9.3. The analysis was based on a standard radius of 750 m distance from health centres, a walking speed of 1 m/s and a desired access time to health centres of 12.5 minutes. The proportion of the population with inadequate geographical access to health centres rose from 47.3% in 1997 to 58.4% in 2012. The mean centre and standard distance mapping showed that the spatial distribution of health centres in Kermanshah needed to be adjusted to changes in population distribution.
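The accessibility criterion described above can be sketched in a few lines: a 750 m service radius derived from the stated 1 m/s walking speed and 12.5-minute access time. This is an illustrative sketch, not the study's ArcGIS workflow; it uses straight-line distance in place of network analysis, and the coordinates in the usage example are hypothetical.

```python
import math

# Service radius implied by the study's assumptions:
# walking speed 1 m/s for a desired access time of 12.5 minutes.
WALK_SPEED_M_S = 1.0
ACCESS_TIME_S = 12.5 * 60
SERVICE_RADIUS_M = WALK_SPEED_M_S * ACCESS_TIME_S  # 750 m

def nearest_distance(point, centres):
    """Straight-line distance (m) from a point to the nearest health centre."""
    return min(math.dist(point, c) for c in centres)

def share_without_access(population_points, centres):
    """Fraction of population points beyond the service radius of every centre."""
    outside = sum(1 for p in population_points
                  if nearest_distance(p, centres) > SERVICE_RADIUS_M)
    return outside / len(population_points)
```

With one centre at the origin, a resident 100 m away is covered while one 800 m away is not, so `share_without_access([(0, 100), (0, 800)], [(0, 0)])` yields 0.5.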
Experimental Evaluation of High Performance Integrated Heat Pump
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, William A; Berry, Robert; Durfee, Neal
2016-01-01
Integrated heat pump (IHP) technology provides significant potential for energy savings and comfort improvement in residential buildings. In this study, we evaluate the performance of a high-performance IHP that provides space heating, space cooling, and water heating services. Experiments were conducted according to ASHRAE Standard 206-2013, in which 24 test conditions were identified to evaluate the IHP performance indices based on the airside performance. Empirical curve fits of the unit's compressor maps are used in conjunction with saturated condensing and evaporating refrigerant conditions to deduce the refrigerant mass flow rate, which, in turn, was used to evaluate the refrigerant-side performance as a check on the airside performance. Heat pump (compressor, fans, and controls) and water pump power were measured separately per the requirements of Standard 206. The system was charged per the system manufacturer's specifications. System test results are presented for each operating mode. The overall IHP performance metrics are determined from the test results per the Standard 206 calculation procedures.
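The compressor-map step mentioned above can be sketched as follows. A common form for such empirical maps is the 10-coefficient cubic polynomial in saturated suction and discharge temperatures used by AHRI Standard 540; assuming that form here is an illustration, and the coefficients in the usage note are placeholders, not data from the study.

```python
# Hedged sketch: evaluate a 10-coefficient compressor map (AHRI 540-style
# cubic polynomial) at the measured saturated evaporating temperature s and
# saturated condensing temperature d (degC). The mapped quantity may be
# mass flow rate, power, or capacity depending on the coefficient set.

def compressor_map(coeffs, s, d):
    """Evaluate a 10-coefficient compressor map at saturation temps s and d."""
    c = coeffs
    return (c[0] + c[1] * s + c[2] * d
            + c[3] * s * s + c[4] * s * d + c[5] * d * d
            + c[6] * s ** 3 + c[7] * d * s * s + c[8] * s * d * d
            + c[9] * d ** 3)
```

For example, with a placeholder coefficient vector whose only nonzero entry is `c[1] = 1.0`, the map reduces to the suction saturation temperature itself.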
Noguchi, Kyo; Itoh, Toshihide; Naruto, Norihito; Takashima, Shutaro; Tanaka, Kortaro; Kuroda, Satoshi
2017-01-01
We evaluated whether X-map, a novel imaging technique, can visualize ischemic lesions within 20 hours after onset in patients with acute ischemic stroke, using noncontrast dual-energy computed tomography (DECT). Six patients with acute ischemic stroke were included in this study. Noncontrast head DECT scans were acquired with 2 X-ray tubes operated at 80 kV and Sn150 kV between 32 minutes and 20 hours after onset. Using these DECT scans, the X-map was reconstructed based on 3-material decomposition and compared with a simulated standard (120 kV) computed tomography (CT) and diffusion-weighted imaging (DWI). The X-map was more sensitive than simulated standard CT in identifying the lesions as areas of lower attenuation in all 6 patients. The lesions on the X-map correlated well with those on DWI. In 3 of 6 patients, the X-map detected a transient decrease in the attenuation value in the peri-infarct area within 1 day after onset. The X-map is a powerful tool to supplement a simulated standard CT and characterize acute ischemic lesions. However, the X-map cannot replace a simulated standard CT in diagnosing acute cerebral infarction.
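The three-material decomposition idea behind the X-map can be sketched as a small linear system: two attenuation measurements (low- and high-kV) plus a volume-conservation constraint (the three material fractions sum to 1) give three equations in three unknowns. The basis attenuation values in the usage note are hypothetical, not the calibration used in the study.

```python
# Illustrative per-voxel 3-material decomposition: solve
#   sum_i f_i * mu_low_i  = hu_low
#   sum_i f_i * mu_high_i = hu_high
#   sum_i f_i             = 1
# for the fractions f_i of three basis materials, by Cramer's rule.

def decompose(hu_low, hu_high, basis_low, basis_high):
    """Return fractions (f1, f2, f3) of three basis materials for one voxel."""
    a = [list(basis_low), list(basis_high), [1.0, 1.0, 1.0]]
    b = [hu_low, hu_high, 1.0]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(a)
    fracs = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = b[r]
        fracs.append(det3(m) / d)
    return tuple(fracs)
```

A sanity check: a voxel whose measurements exactly match the second basis material's attenuation at both energies decomposes to fractions (0, 1, 0).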
Geovisualization in the HydroProg web map service
NASA Astrophysics Data System (ADS)
Spallek, Waldemar; Wieczorek, Malgorzata; Szymanowski, Mariusz; Niedzielski, Tomasz; Swierczynska, Malgorzata
2016-04-01
The HydroProg system, built at the University of Wroclaw (Poland) in the framework of research project no. 2011/01/D/ST10/04171 financed by the National Science Centre of Poland, has been designed to compute predictions of river stages in real time on the basis of multi-modelling. This experimental system works on the upper Nysa Klodzka basin (SW Poland) above the gauge in the town of Bardo, with a catchment area of 1744 square kilometres. The system operates in association with the Local System for Flood Monitoring of Klodzko County (LSOP) and produces hydrograph prognoses as well as inundation predictions. A dedicated real-time web map service has been designed to present the up-to-date predictions and their statistics online. Geovisualisation in the HydroProg map service comprises: interactive maps of the study area, interactive spaghetti hydrographs of water-level forecasts along with observed river stages, and animated images of inundation. The LSOP network offers a high spatial and temporal resolution of observations, as the sampling interval is 15 minutes. The main environmental elements related to hydrological modelling are shown on the main map. These include elevation data (hillshading and hypsometric tints), rivers and reservoirs, and catchment boundaries. Furthermore, we added main towns, roads, and political and administrative boundaries for better map understanding. The web map was designed as a multi-scale representation, with levels of detail and zooming according to the scales 1:100 000, 1:250 000 and 1:500 000. Observations of water level in LSOP are shown on interactive hydrographs for each gauge. Additionally, predictions and some of their statistical characteristics (such as prediction errors and Nash-Sutcliffe efficiency) are shown for selected gauges. Finally, predictions of inundation are presented on animated maps, which have been added for four experimental sites.
The HydroProg system is a strictly scientific project, but the web map service has been designed for all web users. The main objective of the paper is to present the design process of the web map service, following the cartographic and graphic principles.
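The Nash-Sutcliffe efficiency shown alongside the HydroProg forecasts is a standard hydrological skill score and can be sketched in a few lines: 1 indicates a perfect forecast, 0 means the forecast is no better than the observed mean.

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean)^2)

def nash_sutcliffe(observed, simulated):
    """Skill of simulated river stages against observations (1 = perfect)."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den
```

A forecast equal to the observations scores 1.0; a constant forecast at the observed mean scores 0.0.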
Investigation of Mapping Skills of Pre-Service Teachers as Regards to Various Parameters
ERIC Educational Resources Information Center
Aksoy, Bulent
2013-01-01
The goal of this study is to investigate the mapping skills of pre-service teachers as regards to various parameters. The study was carried out using the survey method. The data collection tool employed in the study was the achievement test developed by Koc (2008). The study was carried out on 199 pre-service teachers studying in social studies,…
A Tool that Can be Effective in the Self-Regulated Learning of Pre-Service Teachers: The Mind Map
ERIC Educational Resources Information Center
Tanriseven, Isil
2014-01-01
The aim of this study is to analyse the effect of task planning with mind maps on the self-regulation strategies and motivational beliefs of pre-service teachers. A quasi-experimental design, with a pre-test and post-test control group, was applied in the research. The research group comprised 60 pre-service teachers taking "Teaching…
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. 
During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
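The measurement the abstract names, logarithmic amplitude ratios in noise correlations, can be sketched as follows: the ratio of energy on the causal (positive-lag) and acausal (negative-lag) branches of a station-pair cross-correlation is sensitive to the asymmetry of the noise-source distribution. This is a bare illustration; the pre-processing, windowing, and sensitivity kernels of the actual system are omitted.

```python
import math

def cross_correlate(x, y, max_lag):
    """Cross-correlation of equal-length records for lags -max_lag..max_lag."""
    n = len(x)
    return {lag: sum(x[i] * y[i + lag]
                     for i in range(n) if 0 <= i + lag < n)
            for lag in range(-max_lag, max_lag + 1)}

def log_amplitude_ratio(corr):
    """ln(E+/E-) from positive- vs negative-lag correlation energy."""
    e_pos = sum(v * v for lag, v in corr.items() if lag > 0)
    e_neg = sum(v * v for lag, v in corr.items() if lag < 0)
    return math.log(e_pos / e_neg)
```

For an autocorrelation (identical records at both "stations") the branches are symmetric and the log ratio is zero; an asymmetric source distribution skews it away from zero.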
Communicating data quality through Web Map Services
NASA Astrophysics Data System (ADS)
Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin
2013-04-01
The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.
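As context for the WMS-Q conventions discussed above, the base request they build on is a standard WMS 1.3.0 GetMap call; a minimal sketch of constructing one follows. The endpoint and layer names are hypothetical, and WMS-Q's own additions (e.g. conventions linking a mean layer to its uncertainty layer) are not modelled here.

```python
from urllib.parse import urlencode

# Sketch of a plain WMS 1.3.0 GetMap request URL, the kind of call a
# WMS-Q-aware client would issue for both a data layer and its
# associated quality layer.

def getmap_url(base, layers, bbox, width, height,
               crs="CRS:84", fmt="image/png"):
    """Build a GetMap URL for the given layers and bounding box."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": ",".join(layers), "STYLES": "",
        "CRS": crs, "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)
```

Requesting a hypothetical mean field together with its variance layer is then a matter of listing both in `layers`.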
Building Geospatial Web Services for Ecological Monitoring and Forecasting
NASA Astrophysics Data System (ADS)
Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.
2008-12-01
The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, and crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. To address these difficulties we have built an OGC-compliant WMS and WCS server based on an open-source software stack that provides standardized access to our archive of data. This server is built using the open-source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of the data exchanged with various devices for distributed control are important in the case of real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicast communication voluntarily and periodically in the priority order of the data to be exchanged.
Biewick, Laura
2008-01-01
This report contains maps and associated spatial data showing historical oil and gas exploration and production in the United States. Because of the proprietary nature of many oil and gas well databases, the United States was divided into quarter-square-mile cells, and the production status of all wells in a given cell was aggregated. Base-map reference data are included, using the U.S. Geological Survey (USGS) National Map, the USGS and American Geological Institute (AGI) Global GIS, and a World Shaded Relief map service from the ESRI Geography Network. A hardcopy map was created to synthesize recorded exploration data from 1859, when the first oil well was drilled in the U.S., to 2005. In addition to the hardcopy map product, the data have been refined and made more accessible through the use of Geographic Information System (GIS) tools. The cell data are included in a GIS database constructed for spatial analysis via the USGS Internet Map Service or by importing the data into GIS software such as ArcGIS. The USGS Internet Map Service provides a number of useful and sophisticated geoprocessing and cartographic functions via an internet browser. Also included is a video clip of U.S. oil and gas exploration and production through time.
Approaches to Mapping Nitrogen Removal: Examples at a Landscape Scale
Wetlands can provide the ecosystem service of improved water quality via nitrogen removal, providing clean drinking water and reducing the eutrophication of aquatic resources. Within the ESRP, mapping nitrogen removal by wetlands is a service that incorporates the goals of the ni...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-04
... higher education or a consortium of institutions of higher education; (4) public or private non-profit... DEPARTMENT OF COMMERCE Economic Development Administration Proposed Information Collection; Comment Request; Award Amendment Requests and Project Service Maps AGENCY: Economic Development...
Reddy, Vinod; Swanson, Stanley M; Segelke, Brent; Kantardjieff, Katherine A; Sacchettini, James C; Rupp, Bernhard
2003-12-01
Anticipating a continuing increase in the number of structures solved by molecular replacement in high-throughput crystallography and drug-discovery programs, a user-friendly web service for automated molecular replacement, map improvement, bias removal and real-space correlation structure validation has been implemented. The service is based on an efficient bias-removal protocol, Shake&wARP, and implemented using EPMR and the CCP4 suite of programs, combined with various shell scripts and Fortran90 routines. The service returns improved maps, converted data files and real-space correlation and B-factor plots. User data are uploaded through a web interface and the CPU-intensive iteration cycles are executed on a low-cost Linux multi-CPU cluster using the Condor job-queuing package. Examples of map improvement at various resolutions are provided and include model completion and reconstruction of absent parts, sequence correction, and ligand validation in drug-target structures.
Mapping regional livelihood benefits from local ecosystem services assessments in rural Sahel.
Malmborg, Katja; Sinare, Hanna; Enfors Kautsky, Elin; Ouedraogo, Issa; Gordon, Line J
2018-01-01
Most current approaches to landscape scale ecosystem service assessments rely on detailed secondary data. This type of data is seldom available in regions with high levels of poverty and strong local dependence on provisioning ecosystem services for livelihoods. We develop a method to extrapolate results from a previously published village scale ecosystem services assessment to a higher administrative level, relevant for land use decision making. The method combines remote sensing (using a hybrid classification method) and interviews with community members. The resulting landscape scale maps show the spatial distribution of five different livelihood benefits (nutritional diversity, income, insurance/saving, material assets and energy, and crops for consumption) that illustrate the strong multifunctionality of the Sahelian landscapes. The maps highlight the importance of a diverse set of sub-units of the landscape in supporting Sahelian livelihoods. We see a large potential in using the resulting type of livelihood benefit maps for guiding future land use decisions in the Sahel.
Landsat Image Map Production Methods at the U. S. Geological Survey
Kidwell, R.D.; Binnie, D.R.; Martin, S.
1987-01-01
To maintain consistently high quality in satellite image map production, the U. S. Geological Survey (USGS) has developed standard procedures for the photographic and digital production of Landsat image mosaics, and for lithographic printing of multispectral imagery. This paper gives a brief review of the photographic, digital, and lithographic procedures currently in use for producing image maps from Landsat data. It is shown that consistency in the printing of image maps is achieved by standardizing the materials and procedures that affect the image detail and color balance of the final product. Densitometric standards are established by printing control targets using the pressplates, inks, pre-press proofs, and paper to be used for printing.
Developing a GIS for CO2 analysis using lightweight, open source components
NASA Astrophysics Data System (ADS)
Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.
2012-12-01
There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
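The glue between the components named above (MongoDB documents in, Leaflet-renderable GeoJSON out) can be sketched as a single conversion function. This is an illustrative sketch: the field names are hypothetical, plain dicts stand in for MongoDB documents, and in the real system a Bottle route would serve the resulting payload.

```python
import json

# Convert stored CO2 soundings (dicts standing in for schema-less MongoDB
# documents) into a GeoJSON FeatureCollection that a Leaflet layer can render.

def to_geojson(records):
    """Build a GeoJSON FeatureCollection from records with lon/lat/co2 fields."""
    features = [{
        "type": "Feature",
        "geometry": {"type": "Point",
                     "coordinates": [r["lon"], r["lat"]]},
        "properties": {"co2_ppm": r["co2"]},
    } for r in records]
    return {"type": "FeatureCollection", "features": features}
```

Because the payload is plain JSON, swapping the storage or web-service component (one of the flexibility benefits the abstract cites) leaves this interface unchanged.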
NASA Technical Reports Server (NTRS)
Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean
2014-01-01
Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.
Lean methodology for performance improvement in the trauma discharge process.
O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey
2014-07-01
High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention-defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect and plans made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of consult questions unanswered from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays (p < 0.0001). The lean process lasted 8 months, and three areas for new improvement were identified: (1) the off-unit patients; (2) patients with length of stay more than 15 days contribute disproportionately to length of stay; and (3) miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.
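The before/after comparisons reported above (e.g. unanswered consult questions falling from 67% to 34%, p = 0.0021) are the kind of result a two-proportion z-test produces; a hedged sketch of that test follows. The paper does not state which test or what denominators it used, so the counts in the usage note are invented for illustration.

```python
import math

# Two-proportion z-test: compare an event rate before (x1/n1) and after
# (x2/n2) an intervention, with a two-sided p-value from the normal CDF.

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, p_value) for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

With identical rates the test returns z = 0 and p = 1; a large before/after gap over reasonable denominators drives p well below 0.05.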
U.S. Level III and IV Ecoregions (U.S. EPA)
This map service displays Level III and Level IV Ecoregions of the United States and was created from ecoregion data obtained from the U.S. Environmental Protection Agency Office of Research and Development's Western Ecology Division. The original ecoregion data was projected from Albers to Web Mercator for this map service. To download shapefiles of ecoregion data (in Albers), please go to: ftp://newftp.epa.gov/EPADataCommons/ORD/Ecoregions/. IMPORTANT NOTE ABOUT LEVEL IV POLYGON LEGEND DISPLAY IN ARCMAP: Due to the limitations of Graphical Device Interface (GDI) resources per application on Windows, ArcMap does not display the legend in the Table of Contents for the ArcGIS Server service layer if the legend has more than 100 items. As of December 2011, there are 968 unique legend items in the Level IV Ecoregion Polygon legend. Follow this link (http://support.esri.com/en/knowledgebase/techarticles/detail/33741) for instructions about how to increase the maximum number of ArcGIS Server service layer legend items allowed for display in ArcMap. Note the instructions at this link provide a slightly incorrect path to Maximum Legend Count. The correct path is HKEY_CURRENT_USER > Software > ESRI > ArcMap > Server > MapServerLayer > Maximum Legend Count. When editing the Maximum Legend Count, update the field, Value data to 1000. To download a PDF version of the Level IV ecoregion map and legend, go to ftp://newftp.epa.gov/EPADataCommons/ORD/Ecoregions/us/Eco_Level_IV
Standardizing intensive care device data to enable secondary usages.
Ingenerf, Josef; Kock, Ann-Kristin; Poelker, Marcel; Seidl, Konrad; Zeplin, Georg; Mersmann, Stefan; Handels, Heinz
2012-01-01
To represent medical device observations in a format that is consumable by clinical software, standards like HL7v3 and ISO/IEEE 11073 should be used jointly. This is demonstrated in a project with Dräger Medical GmbH focusing on their Patient Data Management System (PDMS) in intensive care, called Integrated Care Manager (ICM). Patient and device data of interest should be mapped to suitable formats to enable data exchange and decision support. Instead of mapping device data to target formats bilaterally, we use a generic HL7v3 Refined Message Information Model (RMIM) with device-specific parts adapted to the ISO/IEEE 11073 DIM. The generality of the underlying model (based on Yuksel et al. [1]) allows the flexible inclusion of IEEE 11073-conformant device models of interest on the one hand, and the generation of needed artifacts for secondary usages on the other, e.g. HL7 V2 messages, HL7 CDA documents like the Personal Health Monitoring Report (PHMR), or web services. Hence, once the medical device data are obtained in the RMIM format, they can quite easily be transformed into HL7-based standard interfaces through XSL transformations, because these interfaces all have their building blocks from the same RIM. From there, data can be accessed uniformly, e.g. as needed by Dräger's decision support system SmartCare [2] for automated control and optimization of weaning from mechanical ventilation.
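One of the target artifacts named above, an HL7 V2 message, carries device observations in OBX segments; serializing one from an already-mapped observation can be sketched as below. This is an illustrative sketch, not Dräger's implementation (which uses XSL transformations over the RMIM), and the observation code in the usage note is an example, not a verified ISO/IEEE 11073 nomenclature entry.

```python
# Emit a minimal HL7 v2.x OBX segment for a numeric device observation:
# OBX-1 set id, OBX-2 value type (NM = numeric), OBX-3 observation
# identifier (code^text), OBX-5 value, OBX-6 units, OBX-11 result status.

def obx_segment(set_id, code, text, value, units):
    """Build a pipe-delimited OBX segment for one numeric observation."""
    fields = ["OBX", str(set_id), "NM", f"{code}^{text}", "",
              str(value), units, "", "", "", "", "F"]
    return "|".join(fields)
```

Once observations sit in a generic model, emitting such segments (or CDA entries) is a serialization concern, which is the point the abstract makes about avoiding bilateral mappings.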
EnviroAtlas - Austin, TX - Tree Cover Configuration and Connectivity, Water Background Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://enviroatlas.epa.gov/EnviroAtlas). The EnviroAtlas Austin, TX tree cover configuration and connectivity map categorizes forest land cover into structural elements (e.g. core, edge, connector, etc.). In this community, Forest is defined as Trees & Forest (Trees & Forest - 40 = 1; All Else = 0). Water was considered background (value 129) during the analysis to create this dataset, however it has been converted into value 10 to distinguish it from land area background. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
HELI-DEM portal for geo-processing services
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Antonovic, Milan; Molinari, Monia
2014-05-01
HELI-DEM (Helvetia-Italy Digital Elevation Model) is a project developed in the framework of the Italy/Switzerland Operational Programme for Trans-frontier Cooperation 2007-2013, whose major aim is to create a unified digital terrain model that includes the alpine and sub-alpine areas between Italy and Switzerland. The partners of the project are: Lombardy Region, Piedmont Region, Polytechnic of Milan, Polytechnic of Turin and Fondazione Politecnico from Italy; Institute of Earth Sciences (SUPSI) from Switzerland. The digital terrain model has been produced by integrating and validating the different elevation data available for the areas of interest, characterized by different reference frames, resolutions and accuracies: the DHM at 25 m resolution from Swisstopo, the DTM at 20 m resolution from Lombardy Region, the DTM at 5 m resolution from Piedmont Region and the DTM LiDAR PST-A at about 1 m resolution, which covers the main river bed areas and is produced by the Italian Ministry of the Environment. Further results of the project are: the generation of a unified Italian-Swiss geoid with an accuracy of a few centimeters (Gilardoni et al. 2012); the establishment of a GNSS permanent network, a prototype of a transnational positioning service; and the development of a geo-portal, entirely based on open source technologies and open standards, which provides the cross-border DTM and offers some capabilities of analysis and processing over the Internet. With this talk, the authors present the main steps of the project with a focus on the HELI-DEM geo-portal development carried out by the Institute of Earth Sciences, which is the access point to the DTM produced by the project. The portal, accessible at http://geoservice.ist.supsi.ch/helidem, is a demonstration of open source technologies combined to provide access to geospatial functionalities for a wide, non-GIS-expert public.
In fact, the system is entirely developed using only Open Standards and Free and Open Source Software (FOSS), both on the server side (services) and on the client side (interface). In addition to self-developed code, the system relies mainly on the software GRASS 7 [1], ZOO-Project [2], GeoServer [3] and OpenLayers [4], and the standards WMS [5], WCS [6] and WPS [7]. At the time of writing, the portal offers features like profiling, contour extraction, watershed delineation and analysis, derivative calculation, data extraction and coordinate conversion, but it is evolving, and it is planned to extend it with a series of environmental models that the IST has developed in the past, such as dam break simulation, landslide run-out estimation and floods due to landslide impact in artificial basins. [1] Neteler M., Mitasova H., Open Source GIS: A GRASS GIS Approach. 3rd Ed. 406 pp, Springer, New York, 2008. [2] Fenoy G., Bozon N., Raghavan V., ZOO Project: The Open WPS Platform. Proceedings of the 1st International Workshop on Pervasive Web Mapping, Geoprocessing and Services (WebMGS). Como, http://www.isprs.org/proceedings/XXXVIII/4-W13/ID_32.pdf, 26-27 August 2010. [3] Giannecchini S., Aime A., GeoServer, il server open source per la gestione interoperabile dei dati geospaziali. Atti 15a Conferenza Nazionale ASITA. Reggia di Colorno, 15-18 November 2011. [4] Perez A.S., OpenLayers Cookbook. Packt Publishing, 2012. ISBN 1849517843. [5] OGC, OpenGIS Web Map Server Implementation Specification, http://www.opengeospatial.org/standards/wms, 2006. [6] OGC, OGC WCS 2.0 Interface Standard - Core, http://portal.opengeospatial.org/files/?artifact_id=41437, 2010b. [7] OGC, OpenGIS Web Processing Service, http://portal.opengeospatial.org/files/?artifact_id=24151, 2007.
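As a hedged illustration of how a client talks to the kind of WMS service behind such a portal, the snippet below builds an OGC WMS 1.1.1 GetMap request URL. The endpoint and layer name are hypothetical, not the actual HELI-DEM service configuration.

```python
# Sketch of an OGC WMS 1.1.1 GetMap request URL; the endpoint and layer
# name below are hypothetical placeholders.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size, srs="EPSG:4326"):
    """Assemble a GetMap URL for one layer over a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": srs,
        "BBOX": ",".join(str(c) for c in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

url = wms_getmap_url("https://example.org/wms", "helidem:dtm",
                     (8.5, 45.5, 9.5, 46.5), (512, 512))
print(url)
```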
CPC - Monitoring & Data: Regional Climate Maps
Climate Prediction Center (National Weather Service). The monthly regional analyses products are usually…
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
MN GIS/LIS Consortium Annual Conference and Workshops, Rochester, MN, October 1-3, 2014
We mapped the distribution of multiple ecosystem services in the Saint Louis River Area of Concern (SLR AOC) under current and reported extreme lake levels. Services were mapped using measured or modeled natural features (i.e., bathymetry, vegetation, fetch, habitat, contaminated...
This dataset was produced by a joint effort of New Mexico State University (NMSU), the U.S. Environmental Protection Agency (EPA), and the U.S. Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for bird species. Metrics include all bird species richness, lists identif
ERIC Educational Resources Information Center
Karakuyu, Yunus
2011-01-01
The purpose of this study is to determine the thoughts of primary science and technology teachers, primary class teachers, pre-service primary class teachers and pre-service primary science and technology teachers about concept maps. The scale was administered, using a basic random sampling method, to 125 selected 4th- and 5th-grade primary class teachers…
Baxter, F.S.
1990-01-01
The US Geological Survey (USGS) programs can play an important role in support of President Bush's policy of no net loss of wetlands. A principal goal of USGS is to provide cartographic information that contributes to the wise management of the Nation's natural resources. This information consists of maps, cartographic data bases (graphic and digital), remotely sensed imagery, and information services. These products are used by Federal, State, and local governments, the private sector, and individual citizens in making decisions on the existence and use of land and water resources. I discuss the programs, products, and information services of the National Mapping Division, the tools available to determine where wetlands exist, and the capability of periodic measurement of wetlands to help in assessing compliance with the concept of no net loss of wetlands. -from Author
Gulf of Mexico Data Atlas: Digital Data Discovery and Access
NASA Astrophysics Data System (ADS)
Rose, K.
2014-12-01
The Gulf of Mexico Data Atlas is an online data discovery and access tool that allows users to browse a growing collection of ecosystem-related datasets visualized as map plates. Thematically, the Atlas includes updated long-term assessments of the physical, biological, environmental, economic and living marine resource characteristics that indicate baseline conditions of the Gulf of Mexico ecosystems. These data are crucial components of integrated ecosystem assessments and modeling and support restoration and monitoring efforts in the Gulf. A multi-agency executive steering committee including members from international, federal, state, and non-governmental organizations was established to guide Atlas development and to contribute data and expertise. The Atlas currently contains over 235 maps in 70 subject areas. Each map plate is accompanied by a descriptive summary authored by a subject matter expert and each data set is fully documented by metadata in Federal Geographic Data Committee (FGDC)-compliant standards. Source data are available in native formats and as web mapping services (WMS). Datasets are also searchable through an accompanying Map Catalog and RSS feed. The Gulf of Mexico Data Atlas is an operational example of the philosophy of leveraging resources among agencies and activities involved in geospatial data as outlined in the US Department of Interior and FGDC "Geospatial Platform Modernization Roadmap v4 - March 2011". We continue to update and add datasets through existing and new partnerships to ensure that the Atlas becomes a truly ecosystem-wide resource.
Cross-Dataset Analysis and Visualization Driven by Expressive Web Services
NASA Astrophysics Data System (ADS)
Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad
2015-04-01
The deluge of data hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: the data no longer need to be downloaded and stored beforehand; rather, services interact in real time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data, and moving the processing closer to the data. In this context we have created a visualization application for analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources with different spatial and time resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the potential of the OGC Web Coverage Service, the most exciting aspect being its processing extension, the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technology made the application versatile and portable. As the processing is done on the server side, we ensured that a minimal amount of data is transferred and that the processing is done on a fully capable server, leaving the client hardware resources free for rendering the visualization.
The application offers a set of features to visualize and cross-compare the datasets. Users can select a region of interest in space and time on which an aerosol map layer is plotted. Hovmoeller time-latitude and time-longitude profiles can be displayed by selecting orthogonal cross-sections on the globe. Statistics about the selected dataset are also displayed in different text and plot formats. The datasets can also be cross-compared using either the delta map tool or the merged map tool. For more advanced users, a WCPS query console is also offered, allowing users to process their data with ad hoc queries and then choose how to display the results. Overall, the user has a rich set of tools for visualizing and cross-comparing the aerosol datasets. With our application we have shown how the NASA WorldWind framework can be used to display results processed efficiently - and entirely - on the server side using the expressiveness of the OGC WCPS web service. The application serves not only as a proof of concept of a new paradigm in working with large geospatial data but also as a useful tool for environmental data analysts.
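To give a flavor of the WCPS grammar the application relies on, the sketch below assembles a query asking the server for the average of a coverage over a space/time subset. The coverage and axis names are illustrative assumptions, not the application's actual datasets.

```python
# Sketch of a WCPS query of the kind executed server-side: the average
# value of an aerosol coverage over a spatial and temporal subset.
# "AOT550" and the axis names (Lat, Long, ansi) are hypothetical.

def wcps_avg(coverage, lat, lon, t):
    """Build a WCPS query averaging a coverage over a space/time subset."""
    return (
        f"for c in ({coverage}) return avg("
        f"c[Lat({lat[0]}:{lat[1]}), Long({lon[0]}:{lon[1]}), "
        f'ansi("{t[0]}":"{t[1]}")])'
    )

q = wcps_avg("AOT550", (30, 45), (-10, 20), ("2008-01-01", "2008-01-31"))
print(q)
```

Because the condenser (`avg`) is evaluated by the server, only a single scalar crosses the network instead of the whole subset, which is the bandwidth argument made above.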
EnviroAtlas -- Austin, TX -- One Meter Resolution Urban Land Cover Data (2010) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The Austin, TX EnviroAtlas One Meter-scale Urban Land Cover (MULC) Data were generated from United States Department of Agriculture (USDA) National Agricultural Imagery Program (NAIP) four-band (red, green, blue, and near infrared) aerial photography at 1 m spatial resolution from multiple dates in May 2010. Six land cover classes were mapped: water, impervious surfaces, soil and barren land, trees, grass-herbaceous non-woody vegetation, and agriculture. An accuracy assessment of 600 completely random and 55 stratified random photo-interpreted reference points yielded an overall user's fuzzy accuracy of 87 percent. The area mapped is the US Census Bureau's 2010 Urban Statistical Area for Austin, TX, plus a 1 km buffer. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas
Coastal wetlands: The present and future role of remote sensing
NASA Technical Reports Server (NTRS)
Carter, V.
1977-01-01
During the past decade, there has been a rapid expansion of remote sensing research and technology development related to coastal wetlands. As a result of this research, all of the 23 coastal states have ongoing or completed wetland inventories, most utilizing aerial photographs as the data source for producing a variety of map products with varying scales, formats, classification systems and intended uses. The U.S. Geological Survey is increasing emphasis on map production and revision for the coastal zone. The new U.S. Fish and Wildlife Service National Wetland Inventory is intended to provide a standardized method for comparison of wetlands on a national basis - it too will use available aerial photographs as a basic data source. At present, satellite data is not used for operational mapping of coastal wetlands because of resolution and geometric constraints. In the future, however, satellite data may provide an accurate reliable and economical source to update wetland inventories and to monitor or evaluate coastal wetlands. The technological improvements accompanying the development and launch of Landsat C and D and the space shuttle promise to make satellite digital data a more powerful tool to supply information for future management decisions for coastal wetlands.
Ecosystem services (ESS) represent an ecosystem's capacity for satisfying essential human needs, directly or indirectly, above that required to maintain ecosystem integrity (structure, function and processes). The spatial characterization and mapping of ESS is an essential first s...
Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...
Maldonado, José Alberto; Marcos, Mar; Fernández-Breis, Jesualdo Tomás; Parcero, Estíbaliz; Boscá, Diego; Legaz-García, María Del Carmen; Martínez-Salvador, Begoña; Robles, Montserrat
2016-01-01
The heterogeneity of clinical data is a key problem in the sharing and reuse of Electronic Health Record (EHR) data. We approach this problem through the combined use of EHR standards and semantic web technologies, concretely by means of clinical data transformation applications that convert EHR data in proprietary format, first into clinical information models based on archetypes, and then into RDF/OWL extracts which can be used for automated reasoning. In this paper we describe a proof-of-concept platform to facilitate the (re)configuration of such clinical data transformation applications. The platform is built upon a number of web services dealing with transformations at different levels (such as normalization or abstraction), and relies on a collection of reusable mappings designed to solve specific transformation steps in a particular clinical domain. The platform has been used in the development of two different data transformation applications in the area of colorectal cancer.
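As a rough sketch of the final transformation step described (clinical information models rendered as RDF so they can feed automated reasoning), the snippet below serializes a toy record as N-Triples. The URIs and property names are hypothetical, not the platform's actual archetype mappings.

```python
# Sketch: serialize a normalized clinical record as RDF N-Triples.
# The subject URI and the http://example.org/ehr# property namespace
# are illustrative placeholders, not the platform's real vocabulary.

def to_ntriples(subject, record):
    """Render each (property, value) pair of a record as one N-Triple."""
    lines = []
    for prop, value in record.items():
        lines.append(f'<{subject}> <http://example.org/ehr#{prop}> "{value}" .')
    return "\n".join(lines)

rec = {"diagnosis": "colorectal cancer", "stage": "II"}
print(to_ntriples("http://example.org/patient/42", rec))
```

In the platform itself this step is driven by reusable mappings rather than hand-written code, but the output shape is the same: plain triples that an OWL reasoner can consume.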
Toward standardized mapping for left atrial analysis and cardiac ablation guidance
NASA Astrophysics Data System (ADS)
Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.
2014-03-01
In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects which are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the location of the inferior ostia have higher variability than the superior ostia and the variability of the left atrial appendage is similar to the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.
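The paper's exact camera parameterization is not given here; as a hedged sketch of the underlying idea, the snippet below maps each viewing direction from a fixed interior point to a pixel of an equirectangular flat map, which is one simple way to realize "systematically rotating the camera viewpoint to obtain the entire field of view."

```python
# Sketch (equirectangular assumption, not the paper's exact scheme):
# each ray direction from a fixed interior camera point is assigned a
# unique (u, v) pixel on a flat map via its azimuth and elevation.
import math

def direction_to_flat(x, y, z, width=360, height=180):
    """Map a 3D direction from the camera point to flat-map pixel coords."""
    az = math.atan2(y, x)                            # azimuth, -pi..pi
    el = math.asin(z / math.sqrt(x*x + y*y + z*z))   # elevation, -pi/2..pi/2
    u = (az + math.pi) / (2 * math.pi) * width
    v = (math.pi / 2 - el) / math.pi * height
    return u, v

u, v = direction_to_flat(1.0, 0.0, 0.0)
print(round(u), round(v))  # 180 90  (the +x direction lands at map center)
```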
Application of mapped plots for single-owner forest surveys
Paul C. Van Deusen; Francis Roesch
2009-01-01
Mapped plots are used for the national forest inventory conducted by the U.S. Forest Service. Mapped plots are also useful for single-ownership inventories. Mapped plots can handle boundary overlap and can provide less variable estimates for specified forest conditions. Mapping is a good fit for fixed-plot inventories where the fixed-area plot is used for both mapping...
van Vliet, Liesbeth M; Gao, Wei; DiFrancesco, Daniel; Crosby, Vincent; Wilcock, Andrew; Byrne, Anthony; Al-Chalabi, Ammar; Chaudhuri, K Ray; Evans, Catherine; Silber, Eli; Young, Carolyn; Malik, Farida; Quibell, Rachel; Higginson, Irene J
2016-05-10
Patients affected by progressive long-term neurological conditions might benefit from specialist palliative care involvement. However, little is known about how neurology and specialist palliative care services interact. This study aimed to map the current level of connections and integration between these services. The mapping exercise was conducted in eight centres with neurology and palliative care services in the United Kingdom. The data were provided by the respective neurology and specialist palliative care teams. Questions focused on: i) catchment and population served; ii) service provision and staffing; iii) integration and relationships. Centres varied in size of catchment areas (39-5,840 square miles) and population served (142,000-3,500,000). Neurology and specialist palliative care were often not coterminous. Service provisions for neurology and specialist palliative care were also varied. For example, neurology services varied in the number and type of clinics provided, and palliative care services in the settings in which they work. Integration was most developed in Motor Neuron Disease (MND), e.g., joint meetings were often held, followed by Parkinsonism (made up of Parkinson's Disease (PD), Multiple-System Atrophy (MSA) and Progressive Supranuclear Palsy (PSP), with integration being more developed for MSA and PSP) and least in Multiple Sclerosis (MS), e.g., most sites had no formal links. The number of neurology patients per annum receiving specialist palliative care reflected these differences in integration (range: 9-88 MND, 3-25 Parkinsonism, and 0-5 MS). This mapping exercise showed heterogeneity in service provision and integration between neurology and specialist palliative care services, which varied not only between sites but also between diseases. This highlights the need and opportunities for improved models of integration, which should be rigorously tested for effectiveness.
Elementary maps on nest algebras
NASA Astrophysics Data System (ADS)
Li, Pengtong
2006-08-01
Let A, B be algebras and let M : A → B, M* : B → A be maps. An elementary map of A × B is an ordered pair (M, M*) such that M(a M*(b) a) = M(a) b M(a) and M*(b M(a) b) = M*(b) a M*(b) for all a ∈ A, b ∈ B. In this paper, the general form of surjective elementary maps on standard subalgebras of nest algebras is described. In particular, such maps are automatically additive.
Standard for the U.S. Geological Survey Historical Topographic Map Collection
Allord, Gregory J.; Fishburn, Kristin A.; Walter, Jennifer L.
2014-01-01
This document defines the digital map product of the U.S. Geological Survey (USGS) Historical Topographic Map Collection (HTMC). The HTMC is a digital archive of about 190,000 printed topographic quadrangle maps published by the USGS from the inception of the topographic mapping program in 1884 until the last paper topographic map using lithographic printing technology was published in 2006. The HTMC provides a comprehensive digital repository of all scales and all editions of USGS printed topographic maps that is easily discovered, browsed, and downloaded by the public at no cost. Each printed topographic map is scanned “as is” and captures the content and condition of each map. The HTMC provides ready access to maps that are no longer available for distribution in print. A new generation of topographic maps called “US Topo” was defined in 2009. US Topo maps, though modeled on the legacy 7.5-minute topographic maps, conform to different standards. For more information on the HTMC, see the project Web site at: http://nationalmap.gov/historical/.
Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.
Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A
2012-08-01
To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from the Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and to evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way to conduct dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
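The term-level evaluation described above boils down to precision and recall of the automated mapping against the gold standard. A minimal sketch on toy data (not actual MDD/RxNorm content):

```python
# Sketch of the evaluation: precision and recall of an automated
# term->concept mapping against a gold-standard mapping.
# The term/concept pairs below are toy data, not MDD or RxNorm content.

def precision_recall(automated, gold):
    """automated/gold: dicts of local term -> mapped concept (None = no map)."""
    proposed = {t: c for t, c in automated.items() if c is not None}
    correct = sum(1 for t, c in proposed.items() if gold.get(t) == c)
    precision = correct / len(proposed)
    recall = correct / sum(1 for c in gold.values() if c is not None)
    return precision, recall

gold = {"aspirin 81mg": "C1", "warfarin 5mg": "C2", "ibuprofen": "C3"}
auto = {"aspirin 81mg": "C1", "warfarin 5mg": "C9", "ibuprofen": None}
p, r = precision_recall(auto, gold)
print(p, r)  # 0.5 and 1/3: one of two proposals correct, one of three found
```

The reported pattern (precision near 100%, recall in the 70s) is what this metric yields when the tool proposes few wrong mappings but leaves a quarter of terms unmapped for human review.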
USGS standard quadrangle maps for emergency response
Moore, Laurence R.
2009-01-01
The 1:24,000-scale topographic quadrangle was the primary product of the U.S. Geological Survey's (USGS) National Mapping Program from 1947-1992. This map series includes about 54,000 map sheets for the conterminous United States, and is the only uniform map series ever produced that covers this area at such a large scale. This map series was partially revised under several programs, starting as early as 1968, but these programs were not adequate to keep the series current. Through the 1990s the emphasis of the USGS mapping program shifted away from topographic maps and toward more specialized digital data products. Topographic map revision dropped off rapidly after 1999, and stopped completely by 2004. Since 2001, emergency-response and homeland security requirements have revived the question of whether a standard national topographic series is needed. Emergencies such as Hurricane Katrina in 2005 and the California wildfires in 2007-08 demonstrated that familiar maps are important to first responders. Maps that have a standard scale, extent, and grids help reduce confusion and save time in emergencies. Traditional maps are designed to allow the human brain to quickly process large amounts of information, and depend on artistic layout and design that cannot be fully automated. In spite of technical advances, creating a traditional, general-purpose topographic map is still expensive. Although the content and layout of traditional topographic maps probably is still desirable, the preferred packaging and delivery of maps has changed. Digital image files are now desired by most users, but to be useful to the emergency-response community, these files must be easy to view and easy to print without specialized geographic information system expertise or software.
Riparian wetlands are critical systems that perform functions and provide services disproportionate to their extent in the landscape. Mapping wetlands allows for better planning, management, and modeling, but riparian wetlands present several challenges to effective mapping due t...
Geologic map of the Great Smoky Mountains National Park region, Tennessee and North Carolina
Southworth, Scott; Schultz, Art; Denenny, Danielle
2005-01-01
The geology of the Great Smoky Mountain National Park (GSMNP) region of Tennessee and North Carolina was studied from 1993 to 2003 as part of a cooperative investigation with the National Park Service (NPS). This work has been compiled as a 1:100,000-scale map derived from mapping done at 1:24,000 and 1:62,500 scale. The geologic data are intended to support cooperative investigations with NPS, the development of a new soil map by the Natural Resources Conservation Service, and the All Taxa Biodiversity Inventory (http://www.discoverlifeinamerica.org/). At the request of NPS, we mapped areas previously not visited, revised the geology where stratigraphic and structural problems existed, and developed a map database for use in interdisciplinary research, land management, and interpretive programs for park visitors.
Geologic map of the Great Smoky Mountains National Park region, Tennessee and North Carolina
Southworth, Scott; Schultz, Art; Aleinikoff, John N.; Merschat, Arthur J.
2012-01-01
The geology of the Great Smoky Mountains National Park region of Tennessee and North Carolina was studied from 1993 to 2003 as part of a cooperative investigation by the U.S. Geological Survey with the National Park Service (NPS). This work resulted in a 1:100,000-scale geologic map derived from mapping that was conducted at scales of 1:24,000 and 1:62,500. The geologic data are intended to support cooperative investigations with the NPS, the development of a new soil map by the Natural Resources Conservation Service, and the All Taxa Biodiversity Inventory. In response to a request by the NPS, we mapped previously unstudied areas, revised the geology where problems existed, and developed a map database for use in interdisciplinary research, land management, and interpretive programs for park visitors.
Using mind mapping techniques for rapid qualitative data analysis in public participation processes.
Burgess-Allen, Jilla; Owen-Smith, Vicci
2010-12-01
In a health service environment where timescales for patient participation in service design are short and resources scarce, a balance needs to be achieved between research rigour and the timeliness and utility of the findings of patient participation processes. To develop a pragmatic mind mapping approach to managing the qualitative data from patient participation processes. While this article draws on experience of using mind maps in a variety of participation processes, a single example is used to illustrate the approach. In this example mind maps were created during the course of patient participation focus groups. Two group discussions were also transcribed verbatim to allow comparison of the rapid mind mapping approach with traditional thematic analysis of qualitative data. The illustrative example formed part of a local alcohol service review which included consultation with local alcohol service users, their families and staff groups. The mind mapping approach provided a pleasing graphical format for representing the key themes raised during the focus groups. It helped stimulate and galvanize discussion and keep it on track, enhanced transparency and group ownership of the data analysis process, allowed a rapid dynamic between data collection and feedback, and was considerably faster than traditional methods for the analysis of focus groups, while resulting in similar broad themes. This study suggests that the use of a mind mapping approach to managing qualitative data can provide a pragmatic resolution of the tension between limited resources and quality in patient participation processes. © 2010 The Authors. Health Expectations © 2010 Blackwell Publishing Ltd.
Data services provided by the Ukrainian NODC (MHI NASU)
NASA Astrophysics Data System (ADS)
Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.
2009-04-01
At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes particularly important. This abstract presents several data services developed at the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU. The Data Quality Control Service. Drawing on experience from international collaboration in the field of data collection and quality checking, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black Seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea, and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control covers checking for duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth, and observation depth. The data QC procedure includes a climatic check (or a range check for parameters with a small number of observations), a density-inversion check for hydrological data, and spike detection. The use of climatic fields and profiles prepared by regional oceanography experts leads to more reliable results from the data quality check procedure. The Data Access Services. The Ukrainian NODC provides two products for data access: on-line software and a data access module for the MHI NASU local network. This software allows selecting data by rectangular area, date, month, and cruise. The result of a query is metadata, presented both in a table and as a visual display of stations on a map. It is possible to view both metadata and data.
For this purpose it is necessary to select a station in the metadata table or on the map. There is also an option to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php The local network version provides access to the oceanological database of the MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, parameter values, and quality flags, and covers the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of the data selection, and data export into several formats. The Operational Data Management Services. Collaborators at the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure. Sea level observations are also conducted. The obtained data are transferred online. An interface for operational data access was developed; it allows users to select parameters (sea level, water temperature, atmospheric pressure, wind, and water pressure) and a time interval to view parameter graphs. The product is available at http://www.ocean.nodc.org.ua/Katsively.php . The Climatic Products. The current version of the Climatic Atlas includes maps of parameters such as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide, and the lower boundary of oxygen for the Black Sea basin. Maps of temperature, salinity, and density were calculated at 19 standard depths and averaged monthly for depths of 0-300 m and annually for greater depths. The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decade, from the 1920s to the 1990s, and by season. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month, and depth, and for saving maps in various formats. 
The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php .
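The automatic checks described above (range/climatic QC and spike detection) can be sketched as follows. This is a minimal illustration, not the NODC implementation; the flag values (1 = good, 4 = bad) and thresholds are assumptions, and the spike test follows the common oceanographic form that compares a point's deviation from the mean of its neighbors against the neighbor spread:

```python
def range_check(values, vmin, vmax):
    """Flag values outside the expected/climatic range (1 = good, 4 = bad)."""
    return [1 if vmin <= v <= vmax else 4 for v in values]

def spike_check(values, threshold):
    """Flag interior points whose deviation from the mean of their
    neighbors exceeds the neighbor spread by more than the threshold."""
    flags = [1] * len(values)
    for i in range(1, len(values) - 1):
        deviation = abs(values[i] - (values[i - 1] + values[i + 1]) / 2)
        spread = abs(values[i - 1] - values[i + 1]) / 2
        if deviation - spread > threshold:
            flags[i] = 4
    return flags

# Hypothetical temperature profile (degrees C at successive depths).
temps = [14.2, 14.1, 22.0, 13.9, 13.8]
print(range_check(temps, 5.0, 30.0))
print(spike_check(temps, 2.0))  # the 22.0 value is flagged as a spike
```

In a fuller QC chain, the range limits would come from the regional climatic arrays mentioned above rather than fixed constants.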
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.
2005-12-01
We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.
Using a Web-based GIS to Teach Problem-based Science in High School and College
NASA Astrophysics Data System (ADS)
Metzger, E.; Lenkeit Meezan, K. A.; Schmidt, C.; Taketa, R.; Carter, J.; Iverson, R.
2008-12-01
Foothill College has partnered with San Jose State University to bring GIS web mapping technology to the high school and college classroom. The project consists of two parts. In the first part, Foothill and San Jose State University have teamed up to offer classes on building and maintaining Web-based Geographic Information Systems (GIS). Web-based GIS such as Google Maps, MapQuest, and Yahoo Maps have become ubiquitous, and the skills to build and maintain these systems are in high demand among employers. In the second part of the project, high school students will be able to learn about Web GIS as a real-world tool used by scientists. The students in the Foothill College/San Jose State class will build their Web GIS using scientific data related to the San Francisco/San Joaquin Delta region, with a focus on watersheds, biodiversity, and earthquake hazards. This project includes high-school-level curriculum development that will tie in to No Child Left Behind and National Curriculum Standards in both Science and Geography, and will provide workshops for both pre- and in-service teachers in the use of Web GIS-driven course material in the high school classroom. The project will bring the work of professional scientists into any high school classroom with an internet connection, while simultaneously providing workforce training for high-demand technology-based jobs.
NASA Astrophysics Data System (ADS)
Sayyida, Ghany; Fahma, Fakhrina; Iftadi, Irwan
2018-03-01
RSUD dr. Soediran Mangun Sumarso is a public hospital in Wonogiri district with an outpatient installation service. However, the waiting times of some services in the outpatient installation exceed the standard set by the health minister of the Republic of Indonesia, as shown by waiting-time data collected in the outpatient installation. The purpose of this study is to propose improvements using a lean hospital approach, achieved by eliminating the waste that occurs in the outpatient installation service. The methodology used in this study consists of four stages. The first stage is describing the service system using a cross-functional flowchart. The second stage is identifying waste using value stream mapping, observation, and interviews. The third stage is determining critical waste using the Borda method and a Pareto diagram. The last stage is providing recommended improvements using a fishbone diagram and FMEA. The proposed improvements are adding special register counters, implementing an online reservation system, synchronizing doctors' schedules, adding doctors in polyclinics, fixing queue numbers, applying visual management concepts, installing connecting glass in pharmacies, and adding multifunction shelves in polyclinics.
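The Borda method used in the third stage aggregates ordinal rankings from several respondents into a single priority order. A small sketch of the idea (the waste categories and rankings here are hypothetical, not the study's data):

```python
def borda_scores(rankings, candidates):
    """Aggregate ordinal rankings with the Borda count: a candidate
    ranked r-th among n candidates receives n - r points per respondent."""
    n = len(candidates)
    scores = {c: 0 for c in candidates}
    for ranking in rankings:
        for position, c in enumerate(ranking):
            scores[c] += n - 1 - position
    return scores

wastes = ["waiting", "motion", "overprocessing"]
# Each inner list is one respondent's ranking, most critical waste first.
rankings = [
    ["waiting", "motion", "overprocessing"],
    ["waiting", "overprocessing", "motion"],
    ["motion", "waiting", "overprocessing"],
]
scores = borda_scores(rankings, wastes)
print(max(scores, key=scores.get))  # the most critical waste: waiting
```

The wastes with the highest Borda scores would then be carried into the Pareto analysis.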
Geologic map of the Sunnymead 7.5' quadrangle, Riverside County, California
Morton, Douglas M.; Matti, Jonathan C.
2001-01-01
a. This Readme, which includes in Appendix I the data contained in sun_met.txt. b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols, as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g., Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol, e.g., Qyf2sc. Marine deposits are in part overlain by local, mostly alluvial fan, deposits and are labeled Qomf; grain-size letters follow the f. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
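A small helper could decode the grain-size suffix letters defined in the legend above. This is an illustrative sketch only; `decode_suffix` is not part of the USGS data package, and it handles the two-letter code "lg" before the single-letter codes so it is not misread:

```python
# Grain-size suffix letters as defined in the map legend.
GRAIN_SIZE = {
    "lg": "large boulders", "b": "boulder", "g": "gravel",
    "a": "arenaceous", "s": "silt", "c": "clay",
}

def decode_suffix(suffix):
    """Decode a grain-size suffix string, matching the two-letter
    code 'lg' first so it is not read as two single-letter codes."""
    out, i = [], 0
    while i < len(suffix):
        if suffix[i:i + 2] == "lg":
            out.append(GRAIN_SIZE["lg"])
            i += 2
        else:
            out.append(GRAIN_SIZE[suffix[i]])
            i += 1
    return out

print(decode_suffix("a"))   # e.g. the suffix of Qyfa
print(decode_suffix("sc"))  # e.g. the suffix of Qyf2sc
```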
EnviroAtlas - Austin, TX - BenMAP Results by Block Group
This EnviroAtlas dataset demonstrates the effect of changes in pollution concentration on local populations in 750 block groups in Austin, Texas. The US EPA's Environmental Benefits Mapping and Analysis Program (BenMAP) was used to estimate the incidence of adverse health effects (i.e., mortality and morbidity) and associated monetary value that result from changes in pollution concentrations for Travis and Williamson Counties, TX. Incidence and value estimates for the block groups are calculated using i-Tree models (www.itreetools.org), local weather data, pollution data, and U.S. Census derived population data. This dataset was produced by the US Forest Service to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).
Classifying the Diversity of Bus Mapping Systems
NASA Astrophysics Data System (ADS)
Said, Mohd Shahmy Mohd; Forrest, David
2018-05-01
This study represents the first stage of an investigation into understanding the nature of different approaches to mapping bus routes and bus networks, and how they may best be applied in different public transport situations. In many cities, bus services represent an important facet of easing traffic congestion and reducing pollution. However, with the entrenched car culture of many countries, persuading people to change their mode of transport is a major challenge. To promote this modal shift, people need to know what services are available and where (and when) they go. Bus service maps are an invaluable element of providing suitable public transport information, but they are often overlooked by transport planners and under-researched by cartographers. The method here consists of creating a map evaluation form and assessing published bus network maps. The analyses combined quantitative and qualitative analysis of various aspects of cartographic design and classification. This paper focuses on the resulting classification, which is illustrated by a series of examples. The classification will facilitate more in-depth investigation into the details of cartographic design for such maps and help direct areas for user evaluation.
Colombini, D; Di Leone, G; Occhipinti, E; Montomoli, L; Ruschioni, A; Giambartolomei, M; Ardissone, S; Fanti, M; Pressiani, S; Placci, M; Cerbai, M; Preite, S
2009-01-01
During the last Congress of the International Ergonomics Association (IEA), Beijing, China, August 2009, an international group for developing a "toolkit for MSD prevention" was founded in collaboration with the World Health Organization. Possible users of the toolkits are: members of a health and safety committee; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers implementing basic occupational health services; and occupational health and safety specialists. In accordance with the ISO 11228 series of standards and their ISO application document for the key enters and quick assessment (green/red conditions), our group developed a first methodology for mapping occupational hazards in handicraft, working with the support of information technology (Excel). This methodology, utilizing specific key enters and quick evaluation, allows a simple risk estimation. It thus becomes possible to decide which occupational hazards require an exhaustive assessment and to which professional consultant (occupational physician, engineer, chemist, etc.) it is best to direct them.
FACETS: using open data to measure community social determinants of health.
Cantor, Michael N; Chandras, Rajan; Pulgarin, Claudia
2018-04-01
To develop a dataset based on open data sources reflective of community-level social determinants of health (SDH). We created FACETS (Factors Affecting Communities and Enabling Targeted Services), an architecture that incorporates open data related to SDH into a single dataset mapped at the census-tract level for New York City. FACETS (https://github.com/mcantor2/FACETS) can be easily used to map individual addresses to their census-tract-level SDH. This dataset facilitates analysis across different determinants that are often not easily accessible. Wider access to open data from government agencies at the local, state, and national level would facilitate the aggregation and analysis of community-level determinants. Timeliness of updates to federal non-census data sources may limit their usefulness. FACETS is an important first step in standardizing and compiling SDH-related data in an open architecture that can give context to a patient's condition and enable better decision-making when developing a plan of care.
Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...
77 FR 42696 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-20
... construction awards, 30 requests for amendments to non-construction awards, 2 project service maps). Average Hours Per Response: 2 hours for an amendment to a construction award, 1 hour for an amendment to a non-construction award, 6 hours for a project service map. Burden Hours: 1,242. Needs and Uses: A recipient must...
Climate Prediction Center - Outlooks: Current UV Index Forecast Map
Bagstad, Kenneth J.; Semmens, Darius J.; Ancona, Zachary H.; Sherrouse, Ben C.
2017-01-01
Statistical hotspot methods of intermediate conservatism (i.e., Getis-Ord Gi*, α = 0.10 significance) may be most useful for ecosystem service hot/coldspot mapping to inform landscape scale planning. We also found spatially explicit evidence in support of past findings about public attitudes toward wilderness areas.
Sherrouse, Benson C.; Riegle, Jodi L.; Semmens, Darius J.
2010-01-01
In response to the need for incorporating quantified and spatially explicit measures of social values into ecosystem services assessments, the Rocky Mountain Geographic Science Center, in collaboration with Colorado State University, has developed a geographic information system application, Social Values for Ecosystem Services (SolVES). SolVES can be used to assess, map, and quantify the perceived social values of ecosystem services. SolVES derives a quantitative social values metric, the Value Index, from a combination of spatial and nonspatial responses to public attitude and preference surveys. SolVES also generates landscape metrics, such as average elevation and distance to water, calculated from spatial data layers describing the underlying physical environment. Using kernel density calculations and zonal statistics, SolVES derives and maps the 10-point Value Index and reports landscape metrics associated with each index value for social value types such as aesthetics, biodiversity, and recreation. This can be repeated for various survey subgroups as distinguished by their attitudes and preferences regarding public uses of the forests such as motorized recreation and logging for fuels reduction. The Value Index provides a basis of comparison within and among survey subgroups to consider the effect of social contexts on the valuation of ecosystem services. SolVES includes regression coefficients linking the predicted value (the Value Index) to landscape metrics. These coefficients are used to generate predicted social value maps using value transfer techniques for areas where primary survey data are not available. SolVES was developed, and will continue to be enhanced through future versions, as a public domain tool to enable decision makers and researchers to map the social values of ecosystem services and to facilitate discussions among diverse stakeholders regarding tradeoffs between different ecosystem services in a variety of physical and social contexts.
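The reduction of a kernel-density surface to the 10-point Value Index can be sketched roughly as follows. This is a simplified illustration of the idea, not SolVES code; the rescaling rule (proportional to the maximum nonzero density, rounded up) is an assumption:

```python
from math import ceil

def value_index(density, n_classes=10):
    """Rescale a kernel-density surface (a list of rows of floats) to an
    integer Value Index from 1 (lowest nonzero density) to n_classes
    (highest); cells with zero density stay 0."""
    top = max((v for row in density for v in row if v > 0), default=0)
    if top == 0:
        return [[0] * len(row) for row in density]
    return [[ceil(v / top * n_classes) if v > 0 else 0 for v in row]
            for row in density]

# Hypothetical density surface from mapped survey points.
surface = [[0.0, 0.2, 0.5],
           [0.1, 1.0, 0.4]]
print(value_index(surface))
```

In SolVES proper, the density surface would come from kernel density calculations over survey point data, and zonal statistics would then summarize landscape metrics for each index value.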
Geologic map of the Cucamonga Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Matti, J.C.; Digital preparation by Koukladas, Catherine; Cossette, P.M.
2001-01-01
a. This Readme, which includes in Appendix I the data contained in fif_met.txt. b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to adhere closely to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend using the USGS Cucamonga Peak 7.5’ topographic quadrangle in conjunction with the geologic map.
Geologic map of the Telegraph Peak 7.5' quadrangle, San Bernardino County, California
Morton, D.M.; Woodburne, M.O.; Foster, J.H.; Morton, Gregory; Cossette, P.M.
2001-01-01
a. This Readme, which includes in Appendix I the data contained in fif_met.txt. b. The same graphic as plotted in 2 above. (Test plots have not produced 1:24,000-scale map sheets; the Adobe Acrobat page-size setting influences map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to adhere closely to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend using the USGS Telegraph Peak 7.5’ topographic quadrangle in conjunction with the geologic map.
Sherrouse, B.C.; Semmens, D.J.
2010-01-01
Ecosystem services can be defined in various ways; simply put, they are the benefits provided by nature, which contribute to human well-being. These benefits can range from tangible products such as food and fresh water to cultural services such as recreation and esthetics. As the use of these benefits continues to increase, additional pressures are placed on the natural ecosystems providing them. This makes it all the more important when assessing possible tradeoffs among ecosystem services to consider the human attitudes and preferences that express underlying social values associated with their benefits. While some of these values can be accounted for through economic markets, other values can be more difficult to quantify, and attaching dollar amounts to them may not be very useful in all cases. Regardless of the processes or units used for quantifying such values, the ability to map them across the landscape and relate them to the ecosystem services to which they are attributed is necessary for effective assessments. To address some of the needs associated with quantifying and mapping social values for inclusion in ecosystem services assessments, scientists at the Rocky Mountain Geographic Science Center (RMGSC), in collaboration with Colorado State University, have developed a public domain tool, Social Values for Ecosystem Services (SolVES). SolVES is a geographic information system (GIS) application designed to use data from public attitude and preference surveys to assess, map, and quantify social values for ecosystem services. SolVES calculates and maps a 10-point Value Index representing the relative perceived social values of ecosystem services such as recreation and biodiversity for various groups of ecosystem stakeholders. SolVES output can also be used to identify and model relationships between social values and physical characteristics of the underlying landscape. 
These relationships can then be used to generate predicted Value Index maps for areas where survey data are not available. RMGSC will continue to develop more robust versions of SolVES by pursuing opportunities to work with land and resource managers as well as other researchers to apply SolVES to specific ecosystem management problems.
NASA Astrophysics Data System (ADS)
Hugo, Wim
2013-04-01
Over the past 3 years, SAEON has worked with a number of stakeholders and funders to establish a shared platform for the management and dissemination of E&EO research outputs, data sets, and services. This platform is strongly aligned with GEO principles and architecture, allowing direct integration with the GEOSS Broker. The platform has two important characteristics: 1. It reduces the cost and lead time of providing similar infrastructure for future initiatives. 2. The platform is domain-agnostic to some degree and can be used for non-E&EO applications; projects to achieve this are currently under way. The paper describes the application of the platform for a variety of user communities and initiatives (SAEON Data Portal, South African Earth Observation System, Risk and Vulnerability Atlas, BioEnergy Atlas, National Spatial Information Framework, ICSU World Data System components, and many more), and demonstrates use cases utilizing a distributed, service-oriented architecture. Significant improvements have been made to the interoperability functions available to end users and content providers, and these are demonstrated and discussed in detail. Functions include: • creation and persistence of composite maps, as well as time series or scatter charts, supporting a variety of standardized data sources; • extended search facilities that allow analysis and filtering of primary search results and that deal with large metadata collections; • access to data sources, data listings, news items, images, search results, and other platform content, with increasing flexibility, as standardized services processed in standardized clients, allowing creation of a rich user interface and permitting the inclusion of platform functionality in external websites and resources. 
This shift to explicit service-oriented, peer-to-peer architecture is a preparation for increased distributed processing and content composition, and will support the concept of virtualization of 'science gateways' based on the platform, in support of a growing number of domains and initiatives.
Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury
Liu, Wei; Soderlund, Karl; Senseney, Justin S.; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B.; Liu, Tian; Wang, Yi; Oakes, Terrence R.; Riedy, Gerard
2017-01-01
Purpose To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. Materials and Methods The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multi-echo gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Results Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded for further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping–derived quantitative measures of microhemorrhages also decreased over time: −0.85 mm3 per day ± 1.59 for total volume (P = .039) and −0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016). 
Conclusion The number of microhemorrhages and quantitative susceptibility mapping–derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. PMID:26371749
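The paired t test used for the baseline versus follow-up comparisons divides the mean within-patient difference by its standard error. A self-contained sketch with hypothetical lesion counts (not the study's data):

```python
from math import sqrt

def paired_t(baseline, followup):
    """Paired t statistic: mean of the within-subject differences
    divided by the standard error of those differences."""
    d = [b - f for b, f in zip(baseline, followup)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / sqrt(var / n)

# Hypothetical microhemorrhage counts for five patients at two time points.
baseline = [20, 15, 8, 30, 12]
followup = [14, 13, 6, 22, 11]
print(round(paired_t(baseline, followup), 2))
```

The resulting t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain the P value reported in the abstract.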
Assessment of disease named entity recognition on a corpus of annotated sentences.
Jimeno, Antonio; Jimenez-Ruiz, Ernesto; Lee, Vivian; Gaudan, Sylvain; Berlanga, Rafael; Rebholz-Schuhmann, Dietrich
2008-04-11
In recent years, the recognition of semantic types from the biomedical scientific literature has focused on named entities such as protein and gene names (PGNs) and Gene Ontology terms (GO terms). Other semantic types, such as diseases, have not received the same level of attention. Different solutions have been proposed to identify disease named entities in the scientific literature. While matching the terminology with language patterns suffers from low recall (e.g., Whatizit), other solutions make use of morpho-syntactic features to better cover the full scope of terminological variability (e.g., MetaMap). Currently, MetaMap, provided by the National Library of Medicine (NLM), is the state-of-the-art solution for the annotation of concepts from the UMLS (Unified Medical Language System) in the literature. Nonetheless, its performance has not yet been assessed on an annotated corpus. In addition, little effort has been invested so far in generating an annotated dataset that links disease entities in text to disease entries in a database, thesaurus, or ontology and that could serve as a gold standard to benchmark text mining solutions. As part of our research work, we have taken a corpus that was previously delivered for the identification of associations of genes to diseases based on the UMLS Metathesaurus and have reprocessed and re-annotated it. We gathered annotations for disease entities from two curators, analyzed their disagreement (0.51 in the kappa statistic), and composed a single annotated corpus for public use. Thereafter, three solutions for disease named entity recognition, including MetaMap, were applied to the corpus to automatically annotate it with UMLS Metathesaurus concepts. The resulting annotations were benchmarked to compare their performance. The annotated corpus is publicly available at ftp://ftp.ebi.ac.uk/pub/software/textmining/corpora/diseases and can serve as a benchmark for other systems. 
In addition, we found that dictionary look-up already provides competitive results, indicating that the use of disease terminology is highly standardized throughout the terminologies and the literature. MetaMap generates precise results at the expense of insufficient recall, while our statistical method obtains better recall at a lower precision rate. Even better results in terms of precision are achieved by combining at least two of the three methods, but this approach again lowers recall. Altogether, our analysis gives a better understanding of the complexity of disease annotations in the literature. MetaMap and the dictionary-based approach are available through the Whatizit web service infrastructure (Rebholz-Schuhmann D, Arregui M, Gaudan S, Kirsch H, Jimeno A: Text processing through Web services: Calling Whatizit. Bioinformatics 2008, 24:296-298).
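The inter-curator disagreement reported above is measured with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A computation sketch (the annotation labels here are hypothetical, not the corpus data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' labels over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(a) == len(b)
    n = len(a)
    labels = set(a) | set(b)
    observed = sum(x == y for x, y in zip(a, b)) / n
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical per-mention labels from two curators (D = disease, O = other).
curator1 = ["D", "O", "D", "D", "O", "O", "D", "O"]
curator2 = ["D", "O", "O", "D", "O", "D", "D", "O"]
print(round(cohens_kappa(curator1, curator2), 2))
```

A kappa near 0.5, as in the corpus described above, indicates only moderate agreement, which is why the two curators' annotations were reconciled into a single corpus.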
NASA Astrophysics Data System (ADS)
Barrie, A. S.; Moore, J.
2012-12-01
Plate tectonics is one of the core scientific concepts in both the NRC K-12 standards documents (#ESS2.B) and the College Board Standards for Science (#ES.1.3). These documents also mention the scientific practices expected to improve as students learn plate tectonics: interpreting data based on their observations of maps, and argumentation around evidence based on data. Research on students' understanding of maps emphasizes the difficulty of reading maps in science classrooms. We are conducting an ethnographic case study of the process of learning and teaching by novice teachers in the middle school science major at a mid-Atlantic university. The participants of the study are third-year majors in the middle school science program and middle school students at a suburban middle school. The study uses data from four different fields (geography, geochronology, volcanology, and seismology) to involve preservice teachers in the practices of the geosciences. The data for the study include video and audio records of novice teachers' learning and teaching processes, as well as teachers' reflections on their learning and on teaching plate tectonics using real data. The video and audio data will be compiled and synthesized into event maps and transcripts, which are necessary for sociolinguistic analysis. Event maps provide an overall view of the events and are used to map the learning and teaching events into temporal sequences and phases based on the subtopics and types of educational activities. Transcripts cover in detail the discussion and activity observed at each phase of the learning and teaching events. After compilation, event maps and transcripts will be analyzed using discourse analysis with an ethnographic perspective in order to identify novice teachers' challenges and the improvements they want to make to their teaching and assessment artifacts. 
The preliminary findings of the project identified challenges faced by novice teachers learning and teaching plate tectonics using key scientific practices. As a result of the educational activities developed in this project, we will try to help teachers overcome their challenges and develop the pedagogical skills that novice teachers need to teach plate tectonics, by focusing on key scientific practices with the help of previously developed educational resources. Learning about the processes that occur at plate boundaries will help future teachers (and their students) understand natural disasters such as earthquakes and volcanoes. Furthermore, the study will have a significant, and broader, impact by 'teaching the teachers' and empowering novice teachers to overcome the challenges of reading maps and using argumentation in science classrooms.
Landscape scale mapping of forest inventory data by nearest neighbor classification
Andrew Lister
2009-01-01
One of the goals of the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is large-area mapping. FIA scientists have tried many methods in the past, including geostatistical methods, linear modeling, nonlinear modeling, and simple choropleth and dot maps. Mapping methods that require individual model-based maps to be...
The Westfield River Watershed Interactive Atlas: mapping recreation data on the web
Robert S. Bristow; Steven Riberdy
2002-01-01
Imagine searching the web to create a map to your house. You could use one of the many Internet mapping sites like MapBlast or MapQuest to create such a map. But maybe you wish to get a map of trails for the Grand Canyon. The National Park Service web site could serve that need. Or you may wish to get a map to show you the way from the Orlando...
Millonig, Marsha K
2009-01-01
To convene a diverse group of stakeholders to discuss medication therapy management (MTM) documentation and billing standardization and its interoperability within the health care system. More than 70 stakeholders from pharmacy, health information systems, insurers/payers, quality, and standard-setting organizations met on October 7-8, 2008, in Bethesda, MD. The American Pharmacists Association (APhA) organized the invitational conference to facilitate discussion on strategic directions for meeting current market need for MTM documentation and billing interoperability and future market needs for MTM integration into electronic health records (EHRs). APhA recently adopted policy that specifically addresses technology barriers and encourages the use and development of standardized systems for the documentation and billing of MTM services. Day 1 of the conference featured six foundational presentations on health information technology (HIT) trends, perspectives on MTM from the profession and the Centers for Medicare & Medicaid Services, health care quality and medication-related outcome measures, integrating MTM workflow in EHRs, and the current state of MTM operationalization in practice. After hearing presentations on day 1 and having the opportunity to pose questions to each speaker, conference participants were divided into three breakout groups on day 2. Each group met three times for 60 minutes each and discussed five questions from the perspective of a patient, provider, or payer. Three facilitators met with each of the groups and led discussion from one perspective (i.e., patient, provider, payer). Participants then reconvened as a complete group to participate in a discussion on next steps. HIT is expected to assist in delivering safe, effective, efficient, coordinated care as health professionals strive to improve the quality of care and outcomes for individual patients.
The pharmacy profession is actively contributing to quality patient care through MTM services focused on identifying and preventing medication-related problems, improving medication use, and optimizing individual therapeutic outcomes. As MTM programs continue to expand within the health care system, one important limiting factor is the lack of standardization for documentation and billing of MTM services. This lack of interoperability between technology systems, software, and system platforms presents a barrier to MTM service delivery for patients. APhA convened this invitational conference to identify strategic directions to address MTM documentation and billing standardization and interoperability. Participants viewed the meeting as highly successful in bringing together a unique, wide-ranging set of stakeholders, including the government, regulators, standards organizations, other health professions, technology firms, professional organizations, and practitioners, to share perspectives. They strongly encouraged the Association to continue this unique stakeholder dialogue. Participants provided a number of next-step suggestions for APhA to consider as a result of the event. Participants noted the pharmacy profession's success in building information technology systems for product transactions with systematic, organized, methodical thinking and the need to apply this success to patient services. A unique opportunity exists for the profession to influence and lead the HIT community in creating a workable health technology solution for MTM services. Reaching consensus on minimum data sets for each functional area (clinical, billing, quality improvement) would be a very important short-term gain. Further, participants said it was imperative for pharmacists and the pharmacy community at large to become actively engaged in HIT standards development efforts.
Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.
Ivory, Catherine H
2016-07-01
The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.
The Proteins API: accessing key integrated protein and genome information
Antunes, Ricardo; Alpi, Emanuele; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd
2017-01-01
Abstract The Proteins API provides searching and programmatic access to protein and associated genomics data, such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers can retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience ‘talk’ to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python, and Ruby. Search results are returned as standard JSON, XML, or GFF data objects. The Proteins API is a scalable, reliable, fast, and easy-to-use set of RESTful services that provides a broad protein information resource, allowing users to ask questions grounded in their field of expertise and to gain an integrated overview of the protein annotations available, aiding their understanding of proteins in biological processes. The Proteins API is available at http://www.ebi.ac.uk/proteins/api/doc. PMID:28383659
The Proteins API: accessing key integrated protein and genome information.
Nightingale, Andrew; Antunes, Ricardo; Alpi, Emanuele; Bursteinas, Borisas; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd; Martin, Maria
2017-07-03
The Proteins API provides searching and programmatic access to protein and associated genomics data, such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers can retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience 'talk' to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python, and Ruby. Search results are returned as standard JSON, XML, or GFF data objects. The Proteins API is a scalable, reliable, fast, and easy-to-use set of RESTful services that provides a broad protein information resource, allowing users to ask questions grounded in their field of expertise and to gain an integrated overview of the protein annotations available, aiding their understanding of proteins in biological processes. The Proteins API is available at http://www.ebi.ac.uk/proteins/api/doc. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
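As a sketch of how a client might call a REST service like this, the snippet below builds (but does not send) a request for a protein's genomic coordinates. The `/coordinates/{accession}` path, the media-type strings, and the example accession are assumptions drawn from the abstract, not from the service's actual Swagger documentation, so verify them against http://www.ebi.ac.uk/proteins/api/doc before use.

```python
# Sketch: composing a Proteins API request for UniProtKB genomic coordinates.
# Endpoint path and Accept values are assumptions; check the Swagger UI.
from urllib.parse import urljoin
from urllib.request import Request

BASE_URL = "https://www.ebi.ac.uk/proteins/api/"

def coordinates_request(accession: str, fmt: str = "json") -> Request:
    """Build (but do not send) a request for a protein's genomic coordinates."""
    accept = {
        "json": "application/json",
        "xml": "application/xml",
        "gff": "text/x-gff",  # assumed media type for GFF responses
    }[fmt]
    url = urljoin(BASE_URL, f"coordinates/{accession}")
    return Request(url, headers={"Accept": accept})

req = coordinates_request("P04637")  # P04637 (human p53) used as an example
print(req.full_url)              # https://www.ebi.ac.uk/proteins/api/coordinates/P04637
print(req.get_header("Accept"))  # application/json
```

Sending the request is then a matter of `urllib.request.urlopen(req)` and parsing the JSON, XML, or GFF body the service returns.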
GapMap: Enabling Comprehensive Autism Resource Epidemiology
Albert, Nikhila; Schwartz, Jessey; Du, Michael
2017-01-01
Background For individuals with autism spectrum disorder (ASD), finding resources can be a lengthy and difficult process. The difficulty in obtaining global, fine-grained autism epidemiological data hinders researchers from quickly and efficiently studying large-scale correlations among ASD, environmental factors, and geographical and cultural factors. Objective The objective of this study was to define resource load and resource availability for families affected by autism and subsequently create a platform to enable a more accurate representation of prevalence rates and resource epidemiology. Methods We created a mobile application, GapMap, to collect locational, diagnostic, and resource use information from individuals with autism to compute accurate prevalence rates and better understand autism resource epidemiology. GapMap is hosted on AWS S3, running on a React and Redux front-end framework. The backend framework comprises an AWS API Gateway and Lambda Function setup, with secure and scalable endpoints for retrieving prevalence and resource data, and for submitting participant data. Measures of autism resource scarcity, including resource load, resource availability, and resource gaps were defined and preliminarily computed using simulated or scraped data. Results The average distance from an individual in the United States to the nearest diagnostic center is approximately 182 km (50 miles), with a standard deviation of 235 km (146 miles). The average distance from an individual with ASD to the nearest diagnostic center, however, is only 32 km (20 miles), suggesting that individuals who live closer to diagnostic services are more likely to be diagnosed.
Conclusions This study confirmed that individuals closer to diagnostic services are more likely to be diagnosed and proposes GapMap, a means to measure and enable the alleviation of increasingly overburdened diagnostic centers and resource-poor areas where parents are unable to diagnose their children as quickly and easily as needed. GapMap will collect information that will provide more accurate data for computing resource loads and availability, uncovering the impact of resource epidemiology on age and likelihood of diagnosis, and gathering localized autism prevalence rates. PMID:28473303
GapMap: Enabling Comprehensive Autism Resource Epidemiology.
Albert, Nikhila; Daniels, Jena; Schwartz, Jessey; Du, Michael; Wall, Dennis P
2017-05-04
For individuals with autism spectrum disorder (ASD), finding resources can be a lengthy and difficult process. The difficulty in obtaining global, fine-grained autism epidemiological data hinders researchers from quickly and efficiently studying large-scale correlations among ASD, environmental factors, and geographical and cultural factors. The objective of this study was to define resource load and resource availability for families affected by autism and subsequently create a platform to enable a more accurate representation of prevalence rates and resource epidemiology. We created a mobile application, GapMap, to collect locational, diagnostic, and resource use information from individuals with autism to compute accurate prevalence rates and better understand autism resource epidemiology. GapMap is hosted on AWS S3, running on a React and Redux front-end framework. The backend framework comprises an AWS API Gateway and Lambda Function setup, with secure and scalable endpoints for retrieving prevalence and resource data, and for submitting participant data. Measures of autism resource scarcity, including resource load, resource availability, and resource gaps were defined and preliminarily computed using simulated or scraped data. The average distance from an individual in the United States to the nearest diagnostic center is approximately 182 km (50 miles), with a standard deviation of 235 km (146 miles). The average distance from an individual with ASD to the nearest diagnostic center, however, is only 32 km (20 miles), suggesting that individuals who live closer to diagnostic services are more likely to be diagnosed. This study confirmed that individuals closer to diagnostic services are more likely to be diagnosed and proposes GapMap, a means to measure and enable the alleviation of increasingly overburdened diagnostic centers and resource-poor areas where parents are unable to diagnose their children as quickly and easily as needed.
GapMap will collect information that will provide more accurate data for computing resource loads and availability, uncovering the impact of resource epidemiology on age and likelihood of diagnosis, and gathering localized autism prevalence rates. ©Nikhila Albert, Jena Daniels, Jessey Schwartz, Michael Du, Dennis P Wall. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 04.05.2017.
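The nearest-center distance statistics described above can be reproduced in miniature with a standard haversine great-circle calculation; the coordinates below are invented toy data, not GapMap's, and the helper names are illustrative only.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_center_km(person, centers):
    """Distance from one individual to the closest diagnostic center."""
    return min(haversine_km(*person, *center) for center in centers)

# Toy data: one individual and two hypothetical diagnostic-center locations.
centers = [(40.7, -74.0), (34.1, -118.2)]
print(round(nearest_center_km((41.0, -73.5), centers), 1))  # ~53.7 km
```

Averaging `nearest_center_km` over a population sample yields the kind of mean-distance figure quoted in the abstract.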
EnviroAtlas - Biodiversity Metrics by 12-digit HUC for the Southwestern United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, US EPA, and the US Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness for all vertebrates, specific taxon gr
EnviroAtlas - Biodiversity Metrics by 12-digit HUC for the Southeastern United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, US EPA, and the US Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, or cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 14 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness for all vertebrates, specific taxon gr
EnviroAtlas - Total reptile species by 12-digit HUC for the conterminous United States
This EnviroAtlas dataset was produced by a joint effort of New Mexico State University, the US Environmental Protection Agency (US EPA), and the U.S. Geological Survey (USGS) to support research and online mapping activities related to EnviroAtlas. Ecosystem services, i.e., services provided to humans from ecological systems, have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human well-being. Some aspects of biodiversity are valued by humans in varied ways, and thus are important to include in any assessment that seeks to identify and quantify the benefits of ecosystems to humans. Some biodiversity metrics clearly reflect ecosystem services (e.g., abundance and diversity of harvestable species), whereas others may reflect indirect and difficult to quantify relationships to services (e.g., relevance of species diversity to ecosystem resilience, cultural and aesthetic values). Wildlife habitat has been modeled at broad spatial scales and can be used to map a number of biodiversity metrics. We map 15 biodiversity metrics reflecting ecosystem services or other aspects of biodiversity for all vertebrate species except fish. Metrics include species richness fo
Leveraging the NLM map from SNOMED CT to ICD-10-CM to facilitate adoption of ICD-10-CM.
Cartagena, F Phil; Schaeffer, Molly; Rifai, Dorothy; Doroshenko, Victoria; Goldberg, Howard S
2015-05-01
Develop and test web services to retrieve and identify the most precise ICD-10-CM code(s) for a given clinical encounter. Facilitate creation of user interfaces that 1) provide an initial shortlist of candidate codes, ideally visible on a single screen; and 2) enable code refinement. To satisfy our high-level use cases, the analysis and design process involved reviewing available maps and crosswalks, designing the rule adjudication framework, determining necessary metadata, retrieving related codes, and iteratively improving the code refinement algorithm. The Partners ICD-10-CM Search and Mapping Services (PI-10 Services) are SOAP web services written using Microsoft's .NET 4.0 Framework, Windows Communication Foundation, and SQL Server 2012. The services cover 96% of the Partners problem list subset of SNOMED CT codes that map to ICD-10-CM codes and can return up to 76% of the 69,823 billable ICD-10-CM codes prior to creation of custom mapping rules. We consider ways to increase 1) the coverage ratio of the Partners problem list subset of SNOMED CT codes and 2) the upper bound of returnable ICD-10-CM codes by creating custom mapping rules. Future work will investigate the utility of the transitive closure of SNOMED CT codes and other methods to assist in custom rule creation and, ultimately, to provide more complete coverage of ICD-10-CM codes. ICD-10-CM will be easier for clinicians to manage if applications display short lists of candidate codes from which clinicians can subsequently select a code for further refinement. The PI-10 Services support ICD-10 migration by implementing this paradigm and enabling users to consistently and accurately find the best ICD-10-CM code(s) without translation from ICD-9-CM. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
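A toy sketch of the shortlist-then-refine paradigm described above: given a SNOMED CT concept, return candidate ICD-10-CM codes, optionally narrowed by a code prefix. The map entries and the prefix-based refinement rule here are invented for illustration; a real system would load the NLM SNOMED CT to ICD-10-CM map and adjudicate its rules and metadata far more richly.

```python
# Hypothetical miniature of a SNOMED CT -> ICD-10-CM shortlist service.
# The mapping rows below are illustrative, not taken from the NLM map.
SNOMED_TO_ICD10CM = {
    "22298006": ["I21.9", "I21.3"],  # myocardial infarction (example rows)
    "44054006": ["E11.9"],           # type 2 diabetes mellitus (example row)
}

def candidate_codes(snomed_code, prefix_filter=""):
    """Initial shortlist, optionally refined by an ICD-10-CM code prefix."""
    shortlist = SNOMED_TO_ICD10CM.get(snomed_code, [])
    return [c for c in shortlist if c.startswith(prefix_filter)]

print(candidate_codes("22298006"))           # ['I21.9', 'I21.3']
print(candidate_codes("22298006", "I21.3"))  # ['I21.3']
```

The user interface pattern in the abstract corresponds to calling `candidate_codes` once for the on-screen shortlist, then again with a refinement filter as the clinician narrows the choice.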
GNSS monitoring of the ionosphere for Space Weather services
NASA Astrophysics Data System (ADS)
Krankowski, A.; Sieradzki, R.; Zakharenkova, I. E.; Cherniak, I. V.
2012-04-01
The International GNSS Service (IGS) Ionosphere Working Group routinely provides users with global ionosphere maps (GIMs) of vertical total electron content (vTEC). The IGS GIMs are provided with a spatial resolution of 5.0 degrees x 2.5 degrees in longitude and latitude, respectively. The current temporal resolution is 2 hours; however, 1-hour maps are delivered as a pilot project. There are three types of IGS GIMs: final, rapid, and predicted. The latencies of the IGS ionospheric final and rapid products are 10 days and 1 day, respectively. The predicted GIMs are generated for 1 and 2 days in advance. There are four IGS Associate Analysis Centres (IAACs) that provide ionosphere maps computed with independent methodologies using GNSS data. These maps are uploaded to the IGS Ionosphere Combination and Validation Center at GRL/UWM (Geodynamics Research Laboratory of the University of Warmia and Mazury in Olsztyn, Poland), which produces the official IGS ionospheric products, published online via ftp and www. On the other hand, the increasing number of permanently tracking GNSS stations near the North Geomagnetic Pole allows satellite observations to be used to detect ionospheric disturbances at high latitudes with even higher spatial resolution. In the space weather service developed at GRL/UWM, data from the Arctic stations belonging to the IGS/EPN/POLENET networks were used to study TEC fluctuations and scintillations. Since the beginning of 2011, a near real-time service presenting conditions in the ionosphere has been operational on the GRL/UWM www site. The rate of TEC index (ROTI), expressed in TECU/min, is used as a measure of TEC fluctuations. The service provides 2-hour maps of TEC variability. In addition, for each day, a daily map of ionospheric fluctuations as a function of geomagnetic local time is also created.
This presentation shows the architecture, algorithms, performance and future developments of the IGS GIMs and this new space weather service at GRL/UWM.
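As an illustration of the ROTI measure used by the service above, here is a minimal sketch. It assumes 1-minute TEC samples in TECU; the definition of ROTI as the standard deviation of the rate of TEC (ROT) is standard, but window lengths and detrending details vary between implementations.

```python
import math

def roti(tec, dt_min=1.0):
    """ROTI in TECU/min: standard deviation of the rate of TEC (ROT)
    computed over the supplied window of TEC samples."""
    rot = [(b - a) / dt_min for a, b in zip(tec, tec[1:])]
    mean = sum(rot) / len(rot)
    var = sum((r - mean) ** 2 for r in rot) / len(rot)
    return math.sqrt(var)

# Toy 6-sample, 1-min-cadence TEC window (TECU); values are invented.
window = [10.0, 10.2, 10.1, 10.5, 10.4, 10.9]
print(round(roti(window), 3))  # ~0.248 TECU/min
```

A quiet ionosphere gives ROTI near zero; scintillation-producing irregularities drive it up, which is why the service maps it in 2-hour bins.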
Jung, Bo Kyeung; Kim, Jeeyong; Cho, Chi Hyun; Kim, Ju Yeon; Nam, Myung Hyun; Shin, Bong Kyung; Rho, Eun Youn; Kim, Sollip; Sung, Heungsup; Kim, Shinyoung; Ki, Chang Seok; Park, Min Jung; Lee, Kap No; Yoon, Soo Young
2017-04-01
The National Health Information Standards Committee was established in 2004 in Korea. The practical subcommittee for laboratory test terminology was placed in charge of standardizing laboratory medicine terminology in Korean. We aimed to establish a standardized Korean laboratory terminology database, Korea-Logical Observation Identifier Names and Codes (K-LOINC), based on former products sponsored by this committee. The primary product was revised based on the opinions of specialists. Next, we mapped the electronic data interchange (EDI) codes that were revised in 2014 to the corresponding K-LOINC. We established a database of synonyms, including the laboratory codes of three reference laboratories and four tertiary hospitals in Korea. Furthermore, we supplemented the clinical microbiology section of K-LOINC using an alternative mapping strategy. We examined other systems that utilize laboratory codes in order to investigate the compatibility of K-LOINC with statistical standards for a number of tests. A total of 48,990 laboratory codes were adopted (21,539 new and 16,330 revised). All of the LOINC synonyms were translated into Korean, and 39,347 Korean synonyms were added. Moreover, 21,773 synonyms were added from reference laboratories and tertiary hospitals. Alternative strategies were established for mapping within the microbiology domain. When we applied these to a smaller hospital, the mapping rate was successfully increased. Finally, we confirmed K-LOINC compatibility with other statistical standards, including a newly proposed EDI code system. This project successfully established an up-to-date standardized Korean laboratory terminology database, as well as an updated EDI mapping to facilitate the introduction of standard terminology into institutions. © 2017 The Korean Academy of Medical Sciences.
NASA Technical Reports Server (NTRS)
Estefan, J. A.; Sovers, O. J.
1994-01-01
The standard tropospheric calibration model implemented in the operational Orbit Determination Program is the seasonal model developed by C. C. Chao in the early 1970s. The seasonal model has seen only slight modification since its release, particularly in the format and content of the zenith delay calibrations. Chao's most recent standard mapping tables, which are used to project the zenith delay calibrations along the station-to-spacecraft line of sight, have not been modified since they were first published in late 1972. This report focuses principally on proposed upgrades to the zenith delay mapping process, although modeling improvements to the zenith delay calibration process are also discussed. A number of candidate approximation models for the tropospheric mapping are evaluated, including the semi-analytic mapping function of Lanyi and the semi-empirical mapping functions of Davis et al. ('CfA-2.2'), of Ifadis (global solution model), of Herring ('MTT'), and of Niell ('NMF'). All of the candidate mapping functions are superior to the Chao standard mapping tables and approximation formulas when evaluated against the current Deep Space Network Mark 3 intercontinental very long baseline interferometry database.
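For orientation, the Chao-style mapping function referred to above has the continued-fraction form m(E) = 1 / (sin E + a / (tan E + b)), projecting a zenith delay to elevation E. The sketch below uses the commonly quoted Chao dry and wet coefficients; treat them as illustrative values, not the operational ODP tables evaluated in the report.

```python
import math

# Chao-form mapping function: m(E) = 1 / (sin E + a / (tan E + b)).
# Coefficients are the commonly quoted Chao values (assumed, for illustration):
DRY = (0.00143, 0.0445)
WET = (0.00035, 0.017)

def chao_map(elev_deg, a, b):
    """Mapping factor by which a zenith delay grows at elevation elev_deg."""
    e = math.radians(elev_deg)
    return 1.0 / (math.sin(e) + a / (math.tan(e) + b))

for elev in (90.0, 30.0, 10.0):
    print(elev, round(chao_map(elev, *DRY), 3))
```

At zenith the factor is 1 by construction, and it grows toward roughly 1/sin E at low elevations, which is where the candidate functions (Lanyi, CfA-2.2, Ifadis, MTT, NMF) differ most from the Chao tables.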
NASA Astrophysics Data System (ADS)
Yarnykh, V.; Korostyshevskaya, A.
2017-08-01
Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved in magnetization exchange with water protons in tissues. MPF is of significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of MPF mapping implementation on a 1.5 Tesla clinical scanner using standard manufacturer’s sequences and to compare the performance of this method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units of one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions were in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.
78 FR 45941 - Changes in Flood Hazard Determinations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-30
... (hereinafter referred to as flood hazard determinations) as shown on the indicated Letter of Map Revision (LOMR... Insurance Rate Maps (FIRMs), and in some cases the Flood Insurance Study (FIS) reports, currently in effect... respective Community Map Repository address listed in the table below and online through the FEMA Map Service...
Ballard, Clive; Powell, Ian; James, Ian; Reichelt, Katharina; Myint, Pat; Potkins, Dawn; Bannister, Carol; Lana, Marisa; Howard, Robert; O'Brien, John; Swann, Alan; Robinson, Damian; Shrimanker, Jay; Barber, Robert
2002-02-01
The quality of care and overuse of neuroleptic medication in care environments are major issues in the care of elderly people with dementia. The quality of care (Dementia Care Mapping), the severity of behavioural and psychological symptoms (BPSD; Neuropsychiatric Inventory), expressive language skills (Sheffield Acquired Language Disorder scale), service utilization, and use of neuroleptic drugs were compared over 9 months between six care facilities receiving a psychiatric liaison service and three facilities receiving the usual clinical support, using a single-blind design. There was a significant reduction in neuroleptic usage in the facilities receiving the liaison service (McNemar test p<0.0001), but not amongst those receiving standard clinical support (McNemar test p=0.07). There were also significantly fewer GP contacts (t=3.9, p=0.0001) for residents in the facilities receiving the liaison service, and a threefold reduction in psychiatric in-patient bed usage (bed days per person: 0.6 vs. 1.5). Residents in care facilities receiving the liaison service experienced significantly less deterioration in expressive language skills (t=2.2, p=0.03), but there were no significant differences in BPSD or wellbeing. A resource-efficient psychiatric liaison service can reduce neuroleptic drug use and some aspects of health service utilization, but a more extensive intervention is probably required to improve the overall quality of care. Copyright 2002 John Wiley & Sons, Ltd.
FermiGrid—experience and future plans
NASA Astrophysics Data System (ADS)
Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Sharma, N.; Timm, S.; Yocum, D. R.
2008-07-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid (OSG) and the Worldwide LHC Computing Grid Collaboration (WLCG). FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the OSG, EGEE, and the WLCG. Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure - the successes and the problems.
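One of the core services named above, grid user mapping, is classically driven by a Globus grid-mapfile associating certificate distinguished names (DNs) with local accounts. The sketch below parses that well-known format; the DNs and account names are invented, and real FermiGrid mapping services layer VO-aware logic on top of anything this simple.

```python
# Sketch: resolving a grid certificate DN to a local account, grid-mapfile style.
# Each line is: "quoted DN" local_account. Entries below are invented examples.
import shlex

GRIDMAP = '''
"/DC=org/DC=example/OU=People/CN=Alice Analyst" uscms01
"/DC=org/DC=example/OU=People/CN=Bob Builder" fermilab
'''

def parse_gridmap(text):
    """Return a DN -> local account dict from grid-mapfile text."""
    mapping = {}
    for line in text.strip().splitlines():
        # shlex honours the quoting around DNs that contain spaces
        parts = shlex.split(line)
        if len(parts) >= 2:
            mapping[parts[0]] = parts[1]
    return mapping

users = parse_gridmap(GRIDMAP)
print(users["/DC=org/DC=example/OU=People/CN=Alice Analyst"])  # uscms01
```

A gatekeeper performs essentially this lookup when an authenticated user presents a certificate, before handing the job to the batch system under the mapped account.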
FermiGrid - experience and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chadwick, K.; Berman, E.; Canal, P.
2007-09-01
Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure - the successes and the problems.
Study on the standard architecture for geoinformation common services
NASA Astrophysics Data System (ADS)
Zha, Z.; Zhang, L.; Wang, C.; Jiang, J.; Huang, W.
2014-04-01
The construction of platforms for geoinformation common services has been completed or is ongoing in most provinces and cities of China in recent years, and these platforms play an important role in economic and social activities. Geoinformation and geoinformation-based services are the key issues in the platform. The standards on geoinformation common services act as bridges among the users, systems and designers of the platform. The standard architecture for geoinformation common services is the guideline for designing and using the standard system, in which the standards are integrated with each other to promote the development, sharing and services of geoinformation resources. Establishing the standard architecture for geoinformation common services is one of the tasks of "Study on important standards for geoinformation common services and management of public facilities in city". The scope of the standard architecture is defined, covering data or information models, interoperability interfaces or services, and information management. Research has been done on the status of international standards for geoinformation common services in organizations such as ISO/TC 211 and OGC, and in countries or unions such as the USA, the EU and Japan. Principles such as availability, suitability and extensibility are set up to evaluate the standards. The development requirements and practical situation are then analyzed, and a framework of the standard architecture for geoinformation common services is proposed. Finally, a summary and prospects of geoinformation standards are given.
Application of open source standards and technologies in the http://climate4impact.eu/ portal
NASA Astrophysics Data System (ADS)
Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia
2015-04-01
This presentation will demonstrate how to calculate and visualize the climate index SU (number of summer days) on the climate4impact portal. The following topics will be covered during the demonstration: - Security: Login using OpenID for access to the Earth System Grid Federation (ESGF) data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-on (SSO) is used to make these websites and systems work together. - Discovery: Faceted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows for browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA). - Processing using Web Processing Services (WPS): Transform data, subset, export into other formats, and perform climate index calculations using Web Processing Services implemented by PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 ICCLIM. - Visualization using Web Map Services (WMS): Visualize data from ESGF data nodes using ADAGUC Web Map Services. The aim of climate4impact is to enhance the use of climate research data and to enhance the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities and can be visited at http://climate4impact.eu/. The current work on climate4impact has two main objectives.
The first is a web interface which automatically generates a graphical user interface for WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/ICCLIM on data stored in ESGF data nodes. Data is then transmitted from ESGF nodes over secured OpenDAP and becomes available on a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for processing of climate indices will be developed in close collaboration with users. The second is to expose climate4impact services as standardized services which can be used by other portals. This has the advantage of adding interoperability between several portals, as well as enabling the design of specific portals aimed at different impact communities, either thematic or national.
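The WMS and WPS interfaces described above are plain HTTP key-value requests. As a minimal sketch (the endpoint URL and layer name below are hypothetical placeholders, not actual climate4impact services), a WMS 1.3.0 GetMap request can be assembled like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=800, height=400,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    Parameter names follow the WMS 1.3.0 specification; note that in
    WMS 1.3.0 the BBOX axis order depends on the CRS definition
    (EPSG:4326 is latitude,longitude).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min,min,max,max in CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only:
url = wms_getmap_url("https://example.org/adaguc/wms", "su_index",
                     (-10.0, 35.0, 30.0, 70.0))
print(url)
```

The same key-value pattern applies to WPS Execute requests, with `SERVICE=WPS` and process-specific inputs.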
Southworth, Scott; Schultz, Art; Denenny, Danielle
2005-01-01
The geology of the Great Smoky Mountains National Park (GSMNP) region of Tennessee and North Carolina was studied from 1993 to 2003 as part of a cooperative investigation with the National Park Service (NPS). This work has been compiled as a 1:100,000-scale map derived from mapping done at 1:24,000 and 1:62,500 scale. The geologic data are intended to support cooperative investigations with NPS, the development of a new soil map by the Natural Resources Conservation Service, and the All Taxa Biodiversity Inventory (http://www.discoverlifeinamerica.org/). At the request of NPS, we mapped areas previously not visited, revised the geology where stratigraphic and structural problems existed, and developed a map database for use in interdisciplinary research, land management, and interpretive programs for park visitors.
Preliminary geologic map of the Elsinore 7.5' Quadrangle, Riverside County, California
Morton, Douglas M.; Weber, F. Harold; Digital preparation: Alvarez, Rachel M.; Burns, Diane
2003-01-01
Open-File Report 03-281 contains a digital geologic map database of the Elsinore 7.5' quadrangle, Riverside County, California that includes: 1. ARC/INFO (Environmental Systems Research Institute, http://www.esri.com) version 7.2.1 coverages of the various elements of the geologic map. 2. A Postscript file to plot the geologic map on a topographic base, and containing a Correlation of Map Units diagram (CMU), a Description of Map Units (DMU), and an index map. 3. Portable Document Format (.pdf) files of: a. This Readme, which includes, in Appendix I, the data contained in els_met.txt b. The same graphic as plotted in 2 above. Test plots have not produced precise 1:24,000-scale map sheets. Adobe Acrobat page size setting influences map scale. The Correlation of Map Units and Description of Map Units are in the editorial format of USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Where known, grain size is indicated on the map by a subscripted letter or letters following the unit symbols as follows: lg, large boulders; b, boulder; g, gravel; a, arenaceous; s, silt; c, clay; e.g. Qyfa is a predominantly young alluvial fan deposit that is arenaceous. Multiple letters are used for more specific identification or for mixed units, e.g., Qfysa is a silty sand. In some cases, mixed units are indicated by a compound symbol; e.g., Qyf2sc. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above).
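The grain-size suffix convention described above is mechanical enough to decode programmatically. A small illustrative sketch (the helper name and example symbols are ours; compound symbols with numeric subunits such as 'Qyf2sc' would need extra handling):

```python
# Grain-size codes as listed in the report's unit-symbol convention.
GRAIN_CODES = {"lg": "large boulders", "b": "boulder", "g": "gravel",
               "a": "arenaceous", "s": "silt", "c": "clay"}

def decode_suffixes(symbol, base):
    """Split a map-unit symbol such as 'Qyfa' into its base unit ('Qyf')
    plus the grain-size qualifiers encoded by the trailing letters."""
    if not symbol.startswith(base):
        raise ValueError(f"{symbol!r} does not start with base unit {base!r}")
    rest = symbol[len(base):]
    qualifiers = []
    while rest:
        if rest.startswith("lg"):            # the only two-letter code, checked first
            qualifiers.append(GRAIN_CODES["lg"])
            rest = rest[2:]
        elif rest[0] in GRAIN_CODES:
            qualifiers.append(GRAIN_CODES[rest[0]])
            rest = rest[1:]
        else:
            raise ValueError(f"unknown grain-size code {rest[0]!r}")
    return qualifiers

print(decode_suffixes("Qyfa", "Qyf"))   # ['arenaceous']
print(decode_suffixes("Qyfsc", "Qyf"))  # ['silt', 'clay']
```

Note that the base unit must be supplied: without the list of valid base symbols, a trailing 'g' or 's' in a formation abbreviation would be indistinguishable from a grain-size code.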
Preliminary geologic map of the northeast Dillingham quadrangle (D-1, D-2, C-1, and C-2), Alaska
Wilson, Frederic H.; Hudson, Travis L.; Grybeck, Donald; Stoeser, Douglas B.; Preller, Cindi C.; Bickerstaff, Damon; Labay, Keith A.; Miller, Martha L.
2003-01-01
The Correlation of Map Units and Description of Map Units are in a format similar to that of the USGS Geologic Investigations Series (I-series) maps but have not been edited to comply with I-map standards. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the Stratigraphic Nomenclature of the U.S. Geological Survey. ARC/INFO symbolsets (shade and line) as used for these maps have been made available elsewhere as part of Geologic map of Central (Interior) Alaska, published as a USGS Open-File Report (Wilson and others, 1998, http://geopubs.wr.usgs.gov/open-file/of98-133-a/). This product does not include the digital topographic base or land-grid files used to produce the map, nor does it include the AML and related ancillary key and other files used to assemble the components of the map.
Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze
2009-01-01
This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources. PMID:22574019
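The abstract does not spell out the vegetation productivity model; a common choice for MODIS-driven daily products is a Monteith-style light-use-efficiency formulation, sketched here purely for illustration (the function, parameter names, and values are our assumptions, not the paper's):

```python
def gpp_lue(par, fapar, epsilon_max, t_scalar, w_scalar):
    """Daily gross primary productivity (gC m-2 d-1) from a Monteith-style
    light-use-efficiency model:

        GPP = epsilon_max * f(T) * f(W) * fAPAR * PAR

    par         incident photosynthetically active radiation (MJ m-2 d-1),
                e.g. from in situ meteorological sensors via SOS
    fapar       fraction of absorbed PAR, e.g. derived from MODIS reflectance
    epsilon_max maximum light-use efficiency (gC per MJ PAR)
    t_scalar    temperature down-regulation scalar in [0, 1]
    w_scalar    moisture down-regulation scalar in [0, 1]
    """
    return epsilon_max * t_scalar * w_scalar * fapar * par

# Illustrative values: PAR = 8 MJ m-2 d-1, fAPAR = 0.6,
# epsilon_max = 1.8 gC/MJ, mild temperature and moisture limitation.
print(gpp_lue(8.0, 0.6, 1.8, 0.9, 0.8))  # roughly 6.22 gC m-2 d-1
```

In a sensor-web chain like the one described, `par`, `t_scalar` and `w_scalar` would be refreshed daily from SOS observations, while `fapar` comes from the satellite reflectance product.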
The National Map - Utah Transportation Pilot Project
,
2001-01-01
Governments depend on a common set of geographic base information as a tool for economic and community development, land and natural resource management, and health and safety services. Emergency management and defense operations rely on this information. Private industry, nongovernmental organizations, and individual citizens use the same geographic data. Geographic information underpins an increasingly large part of the Nation's economy. Available geographic data often have the following problems:
* They do not align with each other because layers are frequently created or revised separately,
* They do not match across administrative boundaries because each producing organization uses different methods and standards, and
* They are not up to date because of the complexity and cost of revision.
The U.S. Geological Survey (USGS) is developing The National Map to be a seamless, continuously maintained, and nationally consistent set of online, public domain, geographic base information to address these issues. The National Map will serve as a foundation for integrating, sharing, and using other data easily and consistently. In collaboration with other government agencies, the private sector, academia, and volunteer groups, the USGS will coordinate, integrate, and, where needed, produce and maintain base geographic data. The National Map will include digital orthorectified imagery; elevation data; vector data for hydrography, transportation, boundary, and structure features; geographic names; and land cover information. The data will be the source of revised paper topographic maps. Many technical and institutional issues must be resolved as The National Map is implemented. To begin the refinement of this new paradigm, pilot projects are being designed to identify and investigate these issues. The pilots are the foundation upon which future partnerships for data sharing and maintenance will be built.
The National Map - Texas Pilot Project
,
2001-01-01
The National Map - Florida Pilot Project
,
2001-01-01
The National Map - Pennsylvania Pilot Project
,
2001-01-01
The National Map - Delaware Pilot Project
,
2001-01-01
The National Map - Lake Tahoe Area Pilot Project
,
2001-01-01
The National Map - Missouri Pilot Project
,
2001-01-01
The National Map - Washington-Idaho Pilot Project
,
2001-01-01
Datum Transformation of Spatial Data and Application in Cadastre
NASA Astrophysics Data System (ADS)
Kısa, A.; Erkek, B.; Ekin, L.
2012-07-01
In Turkey, cadastral works started with locally based surveys in 1924 and were accelerated after the 1950s by the use of photogrammetry. Different measurement methods, coordinate systems and scales have been used in these works. The primary cadastral activities generated two main products: cadastral maps and title deeds. Since then, cadastral data have lived on the maps, through cadastral activities carried out by cadastre offices, and title deed data have lived in the registrations, through land registration activities carried out by land registration offices. Up to 2005, different reference systems, such as local (graphic) systems and ED50, were used for cadastral map production. In the 2000s, the Land Registry and Cadastre Information System (TAKBİS) project was started as a pilot application by the General Directorate of Land Registry and Cadastre (TKGM). After completion of the pilot project, roll-out activities started in 2005 and are still ongoing. Meanwhile, the government took the decision to finish the primary cadastral activities within three years, and they were completed at the end of 2008. TKGM also completed its metadata portal in 2008. Finally, cadastral map updating (renovation) started in 2009, using digital orthophotos with 30 cm GSD. Today people have great expectations of digital cadastral services; they need correct, reliable, easily and quickly accessible land register and cadastral survey information. Such requirements are even expressed in the INSPIRE directive through the use of ISO 191XX data standards. This means a great deal of hard work on spatial data conversion and on datum and data transformation for the harmonization of map and cadastral data. This paper presents the results of an investigation of the cadastral maps and datums used by TKGM, possible datum transformation methods, and some recommendations for future applications.
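Datum transformation between a legacy frame such as ED50 and a modern geocentric frame is typically performed with a seven-parameter (Helmert) similarity transformation on geocentric Cartesian coordinates. A small-angle sketch in pure Python (the example parameter values are illustrative only, not an official ED50 transformation set):

```python
import math

def helmert7(x, y, z, tx, ty, tz, rx, ry, rz, s_ppm):
    """Seven-parameter Helmert similarity transformation, small-angle form:

        X' = T + (1 + s) * R * X

    x, y, z     geocentric Cartesian coordinates in the source datum (m)
    tx, ty, tz  translations (m)
    rx, ry, rz  rotations about the X, Y, Z axes (arc-seconds)
    s_ppm       scale change (parts per million)
    """
    arcsec = math.pi / (180.0 * 3600.0)        # arc-seconds -> radians
    rx, ry, rz = rx * arcsec, ry * arcsec, rz * arcsec
    m = 1.0 + s_ppm * 1e-6
    # Small-angle rotation matrix applied row by row:
    xp = tx + m * ( x       - rz * y + ry * z)
    yp = ty + m * ( rz * x  + y      - rx * z)
    zp = tz + m * (-ry * x  + rx * y + z)
    return xp, yp, zp

# Zero parameters leave a point unchanged:
print(helmert7(4000000.0, 2500000.0, 4100000.0,
               0, 0, 0, 0, 0, 0, 0))  # (4000000.0, 2500000.0, 4100000.0)
```

In practice the rotation sign convention (position vector vs. coordinate frame) and the official parameter set for the region must be taken from the geodetic authority, and geodetic latitude/longitude/height must first be converted to geocentric Cartesian coordinates on the source ellipsoid.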
40 CFR 63.1028 - Agitators in gas and vapor service and in light liquid service standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Standards § 63.1028 Agitators in gas and vapor service and in light liquid service standards. (a) Compliance... 40 Protection of Environment 11 2013-07-01 2013-07-01 false Agitators in gas and vapor service and in light liquid service standards. 63.1028 Section 63.1028 Protection of Environment ENVIRONMENTAL...
40 CFR 63.1028 - Agitators in gas and vapor service and in light liquid service standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Standards § 63.1028 Agitators in gas and vapor service and in light liquid service standards. (a) Compliance... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Agitators in gas and vapor service and in light liquid service standards. 63.1028 Section 63.1028 Protection of Environment ENVIRONMENTAL...
40 CFR 63.1025 - Valves in gas and vapor service and in light liquid service standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Standards § 63.1025 Valves in gas and vapor service and in light liquid service standards. (a) Compliance... 40 Protection of Environment 11 2014-07-01 2014-07-01 false Valves in gas and vapor service and in light liquid service standards. 63.1025 Section 63.1025 Protection of Environment ENVIRONMENTAL...
Bosomprah, Samuel; Tatem, Andrew J; Dotse-Gborgbortsi, Winfred; Aboagye, Patrick; Matthews, Zoe
2016-01-01
To provide clear policy directions for gaps in the provision of signal function services and for sub-regions requiring priority attention, using data from the 2010 Ghana Emergency Obstetric and Newborn Care (EmONC) survey. Using the 2010 survey data, the fraction of facilities with only one or two signal functions missing was calculated for each facility type and EmONC designation. Thematic maps were used to provide insight into inequities in service provision. Of 1159 maternity facilities, 89 provided all the necessary basic or comprehensive EmONC signal functions in the 3 months prior to the 2010 survey. Only 21% of facility-based births were in fully functioning EmONC facilities, but an additional 30% occurred in facilities missing one or two basic signal functions, most often assisted vaginal delivery and removal of retained products. Tackling these missing signal functions would extend births taking place in fully functioning facilities to over 50%. Subnational analyses based on estimated total pregnancies in each district revealed a pattern of inequity in service provision across the country. Upgrading facilities missing only one or two signal functions will allow Ghana to meet international standards for availability of EmONC services. Reducing maternal deaths will require high national priority given to addressing inequities in the distribution of EmONC services. Copyright © 2015 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.
A service-based framework for pharmacogenomics data integration
NASA Astrophysics Data System (ADS)
Wang, Kun; Bai, Xiaoying; Li, Jing; Ding, Cong
2010-08-01
Data are central to scientific research and practice. Advances in experimental methods and information retrieval technologies have led to explosive growth in scientific data and databases. However, because of heterogeneity in data formats, structures and semantics, it is hard to integrate these rapidly growing, diversified data and analyse them comprehensively. As more and more public databases become accessible through standard protocols such as programmable interfaces and Web portals, Web-based data integration has become a major trend for managing and synthesising data stored in distributed locations. Mashup, a Web 2.0 technique, presents a new way to compose content and software from multiple resources. The paper proposes a layered framework for integrating pharmacogenomics data in a service-oriented approach using mashup technology. The framework separates the integration concerns into three perspectives: data, process and Web-based user interface. Each layer encapsulates the heterogeneity issues of one aspect. To facilitate the mapping and convergence of data, an ontology mechanism is introduced to provide consistent conceptual models across different databases and experiment platforms. To support user-interactive and iterative service orchestration, a context model is defined to capture information about users, tasks and services, which can be used for service selection and recommendation during dynamic service composition. A prototype system is implemented, and case studies are presented to illustrate the capabilities of the proposed approach.
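As a sketch of the ontology-mapping idea in the abstract above, the snippet below (all source names and field names are hypothetical illustrations, not taken from the paper) renames source-specific record fields to shared conceptual terms before merging records from heterogeneous databases:

```python
# Hypothetical sketch: two sources describe the same concepts ("gene",
# "drug") under different local field names; a shared conceptual model
# lets their records be merged uniformly.

# Per-source mapping from local field names to shared ontology terms
ONTOLOGY_MAP = {
    "pharmgkb_like": {"geneSymbol": "gene", "drugName": "drug"},
    "dgidb_like": {"gene_name": "gene", "compound": "drug"},
}

def normalize(source: str, record: dict) -> dict:
    """Rename a record's fields to the shared ontology terms."""
    mapping = ONTOLOGY_MAP[source]
    return {mapping.get(k, k): v for k, v in record.items()}

def integrate(records_by_source: dict) -> list:
    """Flatten heterogeneous records into one ontology-aligned list."""
    merged = []
    for source, records in records_by_source.items():
        merged.extend(normalize(source, r) for r in records)
    return merged

result = integrate({
    "pharmgkb_like": [{"geneSymbol": "CYP2D6", "drugName": "codeine"}],
    "dgidb_like": [{"gene_name": "VKORC1", "compound": "warfarin"}],
})
print(result)
```

In a real deployment the mapping tables would be derived from the ontology layer rather than hard-coded per source.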
NASA Astrophysics Data System (ADS)
Signell, Richard P.; Camossi, Elena
2016-05-01
Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use the NetCDF Markup Language (NcML) for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.
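The data-search step in the stack above can be illustrated with a plain HTTP request. The sketch below (the endpoint URL and query string are placeholders, not a real service) assembles the kind of CSW 2.0.2 GetRecords key-value request that a pycsw catalogue accepts for full-text data discovery:

```python
from urllib.parse import urlencode

# Sketch of a CSW 2.0.2 GetRecords search over a catalogue such as pycsw.
# The endpoint and search term are hypothetical placeholders.
def csw_getrecords_url(endpoint: str, query: str, max_records: int = 10) -> str:
    """Build a GetRecords URL doing a CQL full-text search of the catalogue."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "elementSetName": "summary",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": f"csw:AnyText like '%{query}%'",
        "maxRecords": max_records,
    }
    return endpoint + "?" + urlencode(params)

url = csw_getrecords_url("https://example.org/csw", "sea_water_temperature")
print(url)
```

The same request could be issued through a client library such as OWSLib; the point is that any standards-aware client can search the catalogue without custom code.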
Cartographic sign as a core of multimedia map prepared by non-cartographers in free map services
NASA Astrophysics Data System (ADS)
Medyńska-Gulij, Beata
2014-06-01
The fundamental importance of cartographic signs in traditional maps is unquestionable, although in the case of multimedia maps their key function is not so obvious. Our aim was to examine the problem of cartographic signs as the core of multimedia maps prepared by non-cartographers in free on-line map services. First, pre-established rules for multimedia map designers were prepared, emphasizing the key role of cartographic signs and the habits of Web users. A comparison of the projects completed by a group of designers led us to the general conclusion that a cartographic sign should determine the design of a multimedia map in on-line map services. Despite the selection of five different map topics, one may list general characteristics shared by maps with a cartographic sign at their core.
ERIC Educational Resources Information Center
McMillin, Bill; Gibson, Sally; MacDonald, Jean
2016-01-01
Animated maps of the library stacks were integrated into the catalog interface at Pratt Institute and into the EBSCO Discovery Service interface at Illinois State University. The mapping feature was developed for optimal automation of the update process to enable a range of library personnel to update maps and call-number ranges. The development…
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as the Web Map Service (WMS) to request maps over the Internet, the Web Feature Service (WFS) to exchange vector data and the Catalog Service for the Web (CSW) to search for geospatial data have been widely adopted in the geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disaster management.
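As an illustration of the WMS interface mentioned above, the sketch below (the endpoint URL and layer name are hypothetical placeholders) assembles a standard WMS 1.3.0 GetMap request URL using only the Python standard library:

```python
from urllib.parse import urlencode

# Sketch of an OGC WMS 1.3.0 GetMap request as key-value pairs.
# The endpoint and layer are placeholders, not a real service.
def wms_getmap_url(endpoint: str, layer: str, bbox, width=512, height=512):
    """Build a GetMap URL requesting a PNG map for a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",  # lon/lat axis order
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
        "STYLES": "",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "satellite_mosaic",
                     (-10.0, 35.0, 5.0, 45.0))
print(url)
```

Because the request is a plain URL, any WMS-capable client (a browser, a GIS desktop application, or a web client such as mapshup) can retrieve the same map image.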
Methamphetamine-associated psychosis: a new health challenge in Iran
2013-01-01
The rapidly growing popularity of methamphetamine use in Iran has posed a new challenge to the Iranian health sector. Methamphetamine-associated psychosis (MAP) has been frequently reported in Iran in recent years. Although methamphetamine use and MAP are considerable health problems in Iran, epidemiological studies on the prevalence of MAP and its health-related problems are still needed. The present paper emphasizes that health policy makers should consider the immediate needs of drug users, their families and the community to be informed about the detrimental health effects associated with MAP. Although MAP can be managed by prescribing benzodiazepines and psychiatric medications, the most effective regimen for stabilizing patients with MAP still needs to be studied in Iran. Constant collaboration between psychiatric services and outpatient psychotherapeutic services should be established to successfully manage MAP in Iran. Iranian clinicians, especially emergency medicine specialists, should be informed about the differences between the transient and recurrent forms of MAP in order to implement appropriate pharmacological therapies. It is hoped that special training courses will be designed and implemented by health policy makers to inform clinicians, health providers and especially emergency medicine specialists how to deal effectively with MAP. PMID:23577655
Vegetation types on acid soils of Micronesia
Marjorie C. Falanruw; Thomas G. Cole; Craig D. Whitesell
1987-01-01
The soils and vegetation of the Caroline high islands, Federated States of Micronesia, are being mapped by the U.S. Department of Agriculture's Forest Service and Soil Conservation Service. By the end of 1987, vegetation maps and reports on Kosrae, Pohnpei, Yap, four Truk Islands, and Palau are expected to be available. To compare soil types with vegetation types...
ERIC Educational Resources Information Center
Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.
2011-01-01
Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…