Sample records for geospatial information web

  1. Automatic geospatial information Web service composition based on ontology interface matching

    NASA Astrophysics Data System (ADS)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be exposed as a kind of geospatial information service, which helps to overcome the information isolation that has limited geospatial information sharing. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, therefore plays a key role in fully exploiting geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of the algorithm and its promise for the emerging demand for geospatial information web service composition.
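
    The interface-matching idea can be illustrated with a minimal sketch: WordNet path similarity (here via the NLTK package) scores how closely a candidate service's outputs cover another service's required inputs. The service names and parameter terms below are illustrative assumptions, not the paper's actual algorithm or data.

    ```python
    # Minimal sketch of WordNet-based interface matching, assuming the NLTK
    # package with the WordNet corpus installed (nltk.download('wordnet')).
    from nltk.corpus import wordnet as wn

    def semantic_similarity(term_a: str, term_b: str) -> float:
        """Best path similarity over all synset pairs of the two terms."""
        scores = [s1.path_similarity(s2) or 0.0
                  for s1 in wn.synsets(term_a)
                  for s2 in wn.synsets(term_b)]
        return max(scores, default=0.0)

    def can_chain(provider_outputs, consumer_inputs, threshold=0.5):
        """A provider can feed a consumer if every required input is
        semantically covered by some provider output."""
        return all(
            max((semantic_similarity(out, inp) for out in provider_outputs),
                default=0.0) >= threshold
            for inp in consumer_inputs)

    # Hypothetical candidate interfaces: can a geocoder's output feed a buffer tool?
    print(can_chain(provider_outputs=["coordinate"], consumer_inputs=["point"]))
    ```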

  2. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    USGS Publications Warehouse

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search finds coarse candidate services on the web, and the ontology reasoning refines the coarse results into the required services. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  3. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bai, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and especially query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science rather than on issues of computing capability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  4. An Automated End-To-End Multi-Agent QoS Based Architecture for Selection of Geospatial Web Services

    NASA Astrophysics Data System (ADS)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address it. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive, involving diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution that provides the best-fit web service to a service requester based on QoS.
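
    As a companion illustration of QoS-driven selection, the sketch below ranks hypothetical candidate services by a weighted score over non-functional properties; the attributes, weights and service names are assumptions and do not reproduce the paper's multi-agent architecture.

    ```python
    # Minimal sketch of QoS-based ranking of functionally equivalent services.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        response_time_ms: float   # lower is better
        availability: float       # 0..1, higher is better
        cost: float               # lower is better

    def score(c: Candidate, weights=(0.4, 0.4, 0.2)) -> float:
        # Map "lower is better" attributes onto a 0..1 "higher is better" scale.
        rt = 1.0 / (1.0 + c.response_time_ms / 1000.0)
        cost = 1.0 / (1.0 + c.cost)
        w_rt, w_av, w_cost = weights
        return w_rt * rt + w_av * c.availability + w_cost * cost

    candidates = [
        Candidate("WMS-A", response_time_ms=350, availability=0.99, cost=0.0),
        Candidate("WMS-B", response_time_ms=120, availability=0.95, cost=2.5),
    ]
    best = max(candidates, key=score)
    print(f"best-fit service: {best.name}")
    ```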

  5. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Peng; Gong, Jianya; Di, Liping

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.

  6. Infrastructure for the Geospatial Web

    NASA Astrophysics Data System (ADS)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  7. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for the Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and interoperate geospatial resources by using Grid technology and extends Grid technology into the geoscience communities.
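
    A catalogue of this kind is normally queried through the standard CSW interface. The sketch below shows a common client-side pattern, assuming the OWSLib Python package and a hypothetical endpoint URL; it is not the GMU/CSISS deployment's actual interface.

    ```python
    # Sketch of a full-text CSW query, assuming OWSLib is installed and the
    # (hypothetical) endpoint below exposes CSW 2.0.2.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://catalogue.example.org/csw")  # hypothetical URL
    query = PropertyIsLike("csw:AnyText", "%Landsat%")
    csw.getrecords2(constraints=[query], maxrecords=10, esn="brief")

    for record_id, record in csw.records.items():
        print(record_id, record.title)
    ```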

  8. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can only achieve geospatial interoperation at the syntactic level. However, in most cases differences in spatial cognition must be resolved first, so ontologies have been introduced to describe geospatial information and services. Yet it is clearly difficult and inappropriate to expect users themselves to find, match and compose services, especially when complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technologies (e.g., OWL, SWRL), the focus of geospatial information interoperation has shifted from the syntactic level to the semantic, and even to the automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for modeling and formally representing geospatial knowledge, which are also two of the most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in geospatial context. Therefore, we first put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  9. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, it is necessary that its architecture satisfies all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformations are candidate functionalities for a uniform interface.
    However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really take on the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This allows citation and re-use of resources simply by providing the URI. OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured according to several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML), while the underlying protocol stays generic. Concerning point c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put into geospatial metadata models and specifications in recent years. Point d), hypermedia as the engine of application state, is actually where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact, the existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would allow building an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier to access to geospatial applications for non-specialists (as shown by the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated/replicated in the Geospatial Web. The main drawbacks would be the following: • The Uniform Interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it ends up being very simple, and complex interactions would therefore require several transfers. • In the geospatial domain some of the most valuable resources are processes (e.g. environmental models); how they can be modeled as resources accessed through the common interface is an open issue. Taking into account these advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use cases and would not cover all possible applications.
    The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] R. T. Fielding, "Architectural styles and the design of network-based software architectures", PhD dissertation, Dept. of Information and Computer Science, University of California, Irvine, 2000.
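
    The uniform-interface argument above can be made concrete with a small sketch: CRUD operations keyed by URI plus one candidate generic geospatial operation (spatial subsetting) over an in-memory store. The URIs, types and data are illustrative assumptions, not a proposed implementation.

    ```python
    # Minimal sketch of a resource-oriented (REST-like) view of geospatial coverages.
    from dataclasses import dataclass, field

    @dataclass
    class Coverage:
        uri: str                                     # identification of the resource
        bbox: tuple                                  # (min_lon, min_lat, max_lon, max_lat)
        values: dict = field(default_factory=dict)   # (lon, lat) -> value

    class GeospatialStore:
        def __init__(self):
            self._resources = {}

        # Uniform CRUD operations, keyed by URI.
        def create(self, cov): self._resources[cov.uri] = cov
        def retrieve(self, uri): return self._resources[uri]
        def update(self, cov): self._resources[cov.uri] = cov
        def delete(self, uri): self._resources.pop(uri, None)

        # A candidate "generic" geospatial operation: spatial subsetting.
        def subset(self, uri, bbox):
            src = self.retrieve(uri)
            min_x, min_y, max_x, max_y = bbox
            vals = {p: v for p, v in src.values.items()
                    if min_x <= p[0] <= max_x and min_y <= p[1] <= max_y}
            return Coverage(uri=f"{uri}?bbox={bbox}", bbox=bbox, values=vals)

    store = GeospatialStore()
    store.create(Coverage("https://example.org/coverages/sst", (-180, -90, 180, 90),
                          {(10.0, 45.0): 17.2, (120.0, -10.0): 28.9}))
    print(store.subset("https://example.org/coverages/sst", (0, 0, 60, 60)).values)
    ```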

  10. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    NASA Astrophysics Data System (ADS)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to enabling and enhancing GIScience education, the web is also used as the infrastructure for communication and collaboration around geospatial data and among its users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment; rather, it is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and the content of the GIScience program at Curtin University. The teaching of the geospatial web, web and cloud based mapping, and geospatial web services are key components of the program, and the web and online learning are important in delivering it. Some examples of authentic and real world learning environments are provided, including joint learning activities with partner universities.

  11. Borderless Geospatial Web (BOLEGWEB)

    NASA Astrophysics Data System (ADS)

    Cetl, V.; Kliment, T.; Kliment, M.

    2016-06-01

    Effective access to and use of geospatial information (GI) resources acquires critical importance in a modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which differ structurally from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, in order to enrich its information extent. A public, global and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context, bridges the geospatial web from the end-user perspective, and thus opens its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.

  12. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  13. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and offer visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people interested in this industry, covering not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data has been constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, i.e., photo-realistic textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we present the composition and infrastructure of the web-based 3D map service system and compare the V-World service with Google Earth. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. Thus, progress in the spatial information industry of Korea is expected in the near future.

  14. Examining the Enactment of Web GIS on Students' Geospatial Thinking and Reasoning and Tectonics Understandings

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Fu, Qiong; Bressler, Denise; Vallera, Farah L.

    2015-01-01

    Geospatially enabled learning technologies may enhance Earth science learning by placing emphasis on geographic space, visualization, scale, representation, and geospatial thinking and reasoning (GTR) skills. This study examined if and how a series of Web geographic information system investigations that the researchers developed improved urban…

  15. Modern Technologies aspects for Oceanographic Data Management and Dissemination : The HNODC Implementation

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.

    2009-04-01

    The development of new technologies aimed at enhancing Web applications with dynamic data access was also the starting point for the development of geospatial Web applications. By means of these technologies, Web applications embed the capability of presenting geographical representations of geo-information. The introduction of state-of-the-art technologies known as Web services now requires Web applications to be interoperable, i.e., to be able to process requests from each other via a network. In the oceanographic community in particular, modern geographical information systems based on geospatial Web services are now being developed, or will be developed in the near future, with the capability of managing the information fully through Web-based geographical interfaces. The exploitation of the HNODC database through a Web-based application enhanced with Web services built from open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centers (IOC/IODE), holds a very large volume of data and relevant information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing over 300,000 station records of physical, chemical and biological oceanographic information. A modern Web application that allows end users worldwide to explore and navigate HNODC data via an interface capable of presenting geographical representations of the geo-information is today a fact. The application is built with state-of-the-art software components and tools such as: • geospatial and non-spatial Web service mechanisms; • geospatial open source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web service systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema because of the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the Web service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial data mechanism for Web services.
    Some modules of the platform are still under development, but their implementation will be completed in the near future, including a new Web user interface for the end user based on an enhanced and customized version of the Mapbender GUI, a powerful Web services client. For HNODC, the interoperability of Web services is the big advantage of the developed platform, since it will be able to act both as a provider and as a consumer of Web services: • either as a data product provider for external SOA platforms, • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A good example of data management integration and dissemination via such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance currently existing infrastructures with Web services. Furthermore, when the Web Processing Service (WPS) technology becomes mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
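
    As an illustration of the WMS layer of such an architecture, the sketch below builds a standard GetMap request URL with Python's standard library; the endpoint URL and layer name are hypothetical and do not refer to the actual HNODC services.

    ```python
    # Minimal sketch of constructing a WMS 1.3.0 GetMap request URL.
    from urllib.parse import urlencode

    def wms_getmap_url(base_url, layer, bbox, size=(800, 600),
                       crs="EPSG:4326", fmt="image/png"):
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": layer,
            "STYLES": "",
            "CRS": crs,
            # For EPSG:4326 in WMS 1.3.0 the axis order is lat,lon.
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": size[0],
            "HEIGHT": size[1],
            "FORMAT": fmt,
        }
        return f"{base_url}?{urlencode(params)}"

    # Hypothetical station-data layer over the Aegean Sea (min_lat, min_lon, max_lat, max_lon).
    print(wms_getmap_url("https://geo.example.org/wms", "stations",
                         bbox=(34.0, 22.0, 41.0, 29.0)))
    ```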

  16. Introduction to geospatial semantics and technology workshop handbook

    USGS Publications Warehouse

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.

  17. Exploring U.S. Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; Di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web service based service oriented architecture, adopts open standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain Technology (developed by George Mason University). This system provides capabilities of on-line geospatial crop information access, query and on-line analytics via interactive maps. It disseminates all data to decision makers and users via real time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web service based system greatly improves equal-accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables US cropland online exploring capability without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.
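
    The Google Earth export mentioned above typically amounts to producing a small KML document; the sketch below writes a GroundOverlay referencing a rendered image of a region of interest. The image URL and coordinates are illustrative placeholders, not an actual NASS endpoint.

    ```python
    # Minimal sketch of exporting a region of interest as a KML GroundOverlay.
    def region_to_kml(name, image_url, north, south, east, west):
        lines = [
            '<?xml version="1.0" encoding="UTF-8"?>',
            '<kml xmlns="http://www.opengis.net/kml/2.2">',
            '  <GroundOverlay>',
            f'    <name>{name}</name>',
            f'    <Icon><href>{image_url}</href></Icon>',
            '    <LatLonBox>',
            f'      <north>{north}</north><south>{south}</south>',
            f'      <east>{east}</east><west>{west}</west>',
            '    </LatLonBox>',
            '  </GroundOverlay>',
            '</kml>',
        ]
        return "\n".join(lines)

    with open("cdl_region.kml", "w") as f:
        f.write(region_to_kml("CDL region of interest",
                              "https://maps.example.org/cdl_region.png",  # placeholder image
                              north=43.5, south=42.0, east=-93.0, west=-95.0))
    ```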

  18. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (SENSDI)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geospatial data on a popular usage platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms, utilizing sensor web and spatial information achieved by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, nowadays, the Internet of Things, which is superseding the Internet of Sensors. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  19. Importance of the spatial data and the sensor web in the ubiquitous computing area

    NASA Astrophysics Data System (ADS)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, more than three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to bring together spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time services that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and often have more than one mobile device. The considerable number of sensors and different types of data positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily, for numerous purposes, with some level of credibility. The principal goal is to connect mobile and non-mobile devices together in the sensor web platform so as to serve and collect information from people.

  20. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.

  1. Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS

    NASA Astrophysics Data System (ADS)

    Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.

    In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.

  2. Recent innovation of geospatial information technology to support disaster risk management and responses

    NASA Astrophysics Data System (ADS)

    Une, Hiroshi; Nakano, Takayuki

    2018-05-01

    Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can have a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses against disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing an increasingly vital role at all stages of disaster risk management and response. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly emphasizes the importance of utilizing geospatial information technology for disaster risk reduction. This presentation aims to report recent practical applications of geospatial information technology for disaster risk management and responses.

  3. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data / solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon to be approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for the full use of photogrammetry and remote sensing.

  4. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  5. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this method often lacked real-time data retrieval and sharing/interoperation capabilities. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform to realize environmental data management under the Geospatial Service Web framework are proposed in this study. The real-time GIS data model manages real-time data. The Sensor Web service platform is applied to support the realization of the real-time GIS data model based on Sensor Web technologies. To support the realization of the proposed real-time GIS data model, a Sensor Web service platform is implemented. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total execution times for the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
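
    For illustration, the sketch below shows what a time-stamped, geolocated observation record in a real-time data model might look like, together with a store that keeps only the latest value per sensor and property; the field names are assumptions, not the paper's schema.

    ```python
    # Minimal sketch of a real-time observation record and a latest-value store.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class Observation:
        sensor_id: str
        observed_property: str      # e.g. "air_quality_pm25", "soil_moisture"
        value: float
        unit: str
        lon: float
        lat: float
        timestamp: datetime

    class RealTimeSeries:
        """Keeps only the most recent observation per (sensor, property) pair."""
        def __init__(self):
            self.latest = {}

        def ingest(self, obs: Observation):
            key = (obs.sensor_id, obs.observed_property)
            current = self.latest.get(key)
            if current is None or obs.timestamp > current.timestamp:
                self.latest[key] = obs

    series = RealTimeSeries()
    series.ingest(Observation("station-01", "soil_moisture", 0.23, "m3/m3",
                              114.3, 30.5, datetime.now(timezone.utc)))
    print(series.latest)
    ```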

  6. Use of ebRIM-based CSW with sensor observation services for registry and discovery of remote-sensing observations

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya; Wei, Yaxing

    2009-02-01

    Recent advances in Sensor Web geospatial data capture, such as high-resolution satellite imagery and Web-ready data processing and modeling technologies, have led to the generation of large numbers of datasets from real-time or near real-time observations and measurements. Finding which sensor or data complies with criteria such as specific times, locations, and scales has become a bottleneck for Sensor Web-based applications, especially remote-sensing observations. In this paper, an architecture for integrating the Sensor Observation Service (SOS) with the Open Geospatial Consortium (OGC) Catalogue Service-Web profile (CSW) is put forward. The architecture consists of a distributed geospatial sensor observation service, a geospatial catalogue service based on the ebXML Registry Information Model (ebRIM), SOS search and registry middleware, and a geospatial sensor portal. The SOS search and registry middleware finds potential SOSs, generates data granule information, and inserts the records into the CSW. The contents and sequence of the services, the available observations, and the metadata of the observations registry are described. A prototype system is designed and implemented using the service middleware technology and a standard interface and protocol. The feasibility and the response time of registry and retrieval of observations are evaluated using a realistic Earth Observing-1 (EO-1) SOS scenario. Extracting information from the SOS requires the same execution time as record generation for the CSW. The average data retrieval response time in SOS+CSW mode is 17.6% of that of the SOS-alone mode. The proposed architecture offers more advantages for SOS search and observation data retrieval than existing Sensor Web enabled systems.

  7. A "Neogeographical Education"? The Geospatial Web, GIS and Digital Art in Adult Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos

    2010-01-01

    Neogeography provides a link between the science of geography and digital art. The carriers of this link are geospatial technologies (global navigational satellite systems such as the global positioning system, Geographical Information System [GIS] and satellite imagery) along with ubiquitous information and communication technologies (such as…

  8. The Geospatial Web and Local Geographical Education

    ERIC Educational Resources Information Center

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  9. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) initiative, is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using the Web Services Business Process Execution Language (WS-BPEL) to develop them.
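
    The asynchronous pattern described above (submit a request, resume other work, poll for status) can be sketched as follows; the long-running job is simulated locally and stands in for a geoprocessing service, it is not an actual OGC service or WS-BPEL engine.

    ```python
    # Minimal sketch of the "submit, then poll a status resource" asynchrony pattern.
    import threading, time, uuid

    _jobs = {}  # job_id -> {"status": ..., "result": ...}

    def submit(job_fn, *args) -> str:
        """Start the job in the background and return immediately with an id."""
        job_id = str(uuid.uuid4())
        _jobs[job_id] = {"status": "running", "result": None}

        def run():
            result = job_fn(*args)
            _jobs[job_id] = {"status": "succeeded", "result": result}

        threading.Thread(target=run, daemon=True).start()
        return job_id

    def status(job_id: str) -> dict:
        return _jobs[job_id]

    def slow_buffer_analysis(distance_km: float) -> str:
        time.sleep(2)                      # stand-in for heavy geoprocessing
        return f"buffer({distance_km} km) complete"

    job = submit(slow_buffer_analysis, 5.0)
    while status(job)["status"] == "running":   # the client is free to do other work here
        time.sleep(0.5)
    print(status(job)["result"])
    ```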

  10. Public health, GIS, and the internet.

    PubMed

    Croner, Charles M

    2003-01-01

    Internet access and use of georeferenced public health information for GIS application will be an important and exciting development for the nation's Department of Health and Human Services and other health agencies in this new millennium. Technological progress toward public health geospatial data integration, analysis, and visualization of space-time events using the Web portends eventual robust use of GIS by public health and other sectors of the economy. Increasing Web resources from distributed spatial data portals and global geospatial libraries, and a growing suite of Web integration tools, will provide new opportunities to advance disease surveillance, control, and prevention, and to ensure public access and community empowerment in public health decision making. Emerging supercomputing, data mining, compression, and transmission technologies will play increasingly critical roles in national emergency and catastrophic planning and response, and risk management. Web-enabled public health GIS will be guided by Federal Geographic Data Committee spatial metadata, OpenGIS Web interoperability, and GML/XML geospatial Web content standards. Public health will become a responsive and integral part of the National Spatial Data Infrastructure.

  11. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In this report a complex web mapping system is presented, including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client has three basic tiers: (1) a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; (2) a tier of JavaScript objects implementing methods for handling NetCDF metadata, the task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services; and (3) a graphical user interface (GUI) tier of JavaScript objects implementing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects, which implements methods for handling geospatial metadata, the task XML object, and the WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking of calculation tasks running on remote servers, and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of popular desktop GIS applications such as uDig and QuantumGIS. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
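
    The JSON metadata tier can be illustrated with a small sketch of the kind of dataset descriptor such a repository might serve, plus a validity check the middleware tier might perform; the field names and dataset are illustrative, not the system's actual schema.

    ```python
    # Minimal sketch of a JSON dataset descriptor and a client-side validity check.
    import json

    dataset_metadata_json = """
    {
      "id": "reanalysis_t2m",
      "title": "Reanalysis 2 m air temperature",
      "spatial_resolution_deg": 0.75,
      "temporal_resolution": "6h",
      "parameters": ["t2m"],
      "valid_processing_methods": ["mean", "anomaly", "trend"],
      "wms_layer": "t2m_mean"
    }
    """

    meta = json.loads(dataset_metadata_json)

    # Check a user-selected processing method before building the task object.
    requested_method = "trend"
    if requested_method in meta["valid_processing_methods"]:
        print(f"OK: '{requested_method}' is valid for dataset {meta['id']}")
    else:
        print(f"'{requested_method}' is not supported for {meta['id']}")
    ```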

  12. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis within a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, still presents some open issues, and the automated construction of these compositions is an important research topic. Some approaches to this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach that uses AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, semantic descriptions of geospatial data quality requirements are expressed in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with a conditional planning method, represent more precisely the situations of nonconformity with geodata quality requirements that may occur during execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a chain of geospatial Web Services.
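
    An illustrative sketch (not the authors' implementation) of the core idea: geodata quality requirements expressed as rules that annotate dataset metadata, with a conditional branch taken when a nonconformity is detected. The rule names, thresholds and metadata fields are invented for illustration.

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class QualityRule:
            name: str
            check: Callable[[Dict], bool]     # returns True when the metadata conforms

        # Hypothetical quality requirements over dataset metadata.
        RULES: List[QualityRule] = [
            QualityRule("positional_accuracy", lambda m: m.get("horizontal_error_m", 1e9) <= 30.0),
            QualityRule("completeness",        lambda m: m.get("cloud_cover_pct", 100.0) <= 20.0),
        ]

        def violations_of(metadata: Dict) -> List[str]:
            """Names of the rules this dataset fails to satisfy."""
            return [r.name for r in RULES if not r.check(metadata)]

        def next_step(metadata: Dict) -> str:
            """Conditional-plan style branching on geodata quality."""
            failed = violations_of(metadata)
            if not failed:
                return "execute primary service chain"
            return "execute pre-planned fallback chain (violated: " + ", ".join(failed) + ")"

        print(next_step({"horizontal_error_m": 12.0, "cloud_cover_pct": 45.0}))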

  13. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    NASA Astrophysics Data System (ADS)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web Services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware component named REST Converter is added on top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
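
    A hedged sketch of what a RESTful provenance request might look like from a client's point of view; the base URL, resource paths and response fields below are hypothetical and serve only to illustrate the REST style (resources addressed by URLs, plain HTTP verbs, machine-readable representations), not the actual interface of the REST Converter.

        import requests

        BASE = "http://catalog.example.org/csw-rest"          # placeholder endpoint

        # GET the provenance records attached to a data product (paths are hypothetical)
        resp = requests.get(BASE + "/products/ndvi-tile-42/provenance",
                            headers={"Accept": "application/json"}, timeout=30)
        resp.raise_for_status()
        for record in resp.json().get("records", []):
            print(record.get("id"), record.get("generatedBy"))

        # POST a new provenance record (body schema is likewise hypothetical)
        new_record = {"generatedBy": "WPS:NDVI", "usedData": ["red-band-tile-42", "nir-band-tile-42"]}
        created = requests.post(BASE + "/provenance", json=new_record, timeout=30)
        print(created.status_code, created.headers.get("Location"))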

  14. Leveraging the geospatial advantage

    Treesearch

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  15. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a four-year FP7 project aiming to address the problems of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing both the basic and the extended features of the two technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Because of the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all these problems becomes essential. The Grid promotes and facilitates secure interoperation of heterogeneous, distributed geospatial data, supports the creation and management of large distributed computational jobs, and assures a security level for communication and message transfer based on certificates. This presentation analyses and discusses the most significant use cases for enabling interoperability of OGC Web services with the Grid environment, and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology (such as secure interoperability between distributed geospatial resources), and the issues introduced by integrating distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture that allows a flexible and scalable approach for integrating the geospatial domain, represented by OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data, management and computation levels. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for the Web (CSW). Issues regarding the mapping and interoperability between OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments, Grid and geospatial. The presentation mainly highlights how the Grid environment and the capabilities of Grid applications can be extended and utilized for geospatial interoperability.
    Interoperability between geospatial and Grid infrastructures provides features such as the specialized functionality of complex geospatial processing, the high-performance computation and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of geographical models. In accordance with Service Oriented Architecture concepts and the requirements for interoperability between geospatial and Grid infrastructures, each main function is visible from the enviroGRIDS Portal and, consequently, from end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS portal is the user's single point of entry into the system and presents a uniform style of graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
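
    A hedged sketch of invoking an OGC WPS process from Python with OWSLib, the kind of call a Grid job wrapper might issue on behalf of a portal user; the endpoint URL, process identifier and inputs are placeholders, not actual enviroGRIDS services.

        from owslib.wps import WebProcessingService, monitorExecution

        wps = WebProcessingService("http://example.org/wps")     # placeholder endpoint
        print([p.identifier for p in wps.processes])              # processes advertised in GetCapabilities

        # Execute a (hypothetical) hydrological process and poll until it finishes
        execution = wps.execute(
            "demo:RunoffModel",                                   # placeholder process identifier
            inputs=[("catchmentId", "blacksea-subbasin-42"),      # placeholder literal inputs
                    ("startDate", "2009-01-01")])
        monitorExecution(execution, sleepSecs=5)
        print(execution.status)
        for output in execution.processOutputs:
            print(output.identifier, output.reference or output.data)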

  16. To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne

    2012-08-01

    A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.

  17. The geo-spatial information infrastructure at the Centre for Control and Prevention of Zoonoses, University of Ibadan, Nigeria: an emerging sustainable One-Health pavilion.

    PubMed

    Olugasa, B O

    2014-12-01

    The World-Wide-Web as a contemporary means of information sharing offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. To assess the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; management of the spatial data profile; and web-based map sharing within an organization. These criteria were used to compute a weighted exposure score for training at each institution, the Cumulative Exposure Point Average (CEPA). The institution with the highest CEPA was described using an illustration with retrospective records of rabies cases, drawing on data from humans, animals and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques were systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper presents a graded capability for geospatial data capture and analysis, and an emerging sustainable map pavilion dedicated to zoonoses disease surveillance training among collaborating institutions in West Africa.
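
    A minimal arithmetic sketch of a weighted exposure score in the spirit of the CEPA described above; the seven criteria are taken from the abstract, but the weights, scoring scale and example values are assumptions made purely for illustration.

        # Assumed: equal weights and a 0-4 score per criterion; the actual CEPA scale may differ.
        criteria = [
            "spatial data capture", "thematic map design and interpretation",
            "spatio-temporal analysis", "remote sensing of data",
            "statistical modelling", "management of spatial data profile",
            "web-based map sharing",
        ]
        weights = {c: 1.0 for c in criteria}                 # hypothetical weights
        scores = {c: 3 for c in criteria}                    # hypothetical per-criterion scores (0-4)
        scores["web-based map sharing"] = 4

        cepa = sum(weights[c] * scores[c] for c in criteria) / sum(weights.values())
        print(f"CEPA = {cepa:.2f}")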

  18. Data Collection and Management with ENSITE HUB: ENSITE HUB Version 1.0

    DTIC Science & Technology

    2017-08-01

    Model (GGDM) standards. The Army Geospatial Enterprise (AGE) is where the standardized geospatial information is collected, managed, analyzed... acquisition information management. (http://asc.army.mil/web/organization) ... Static feature classes with a yearly vintage must... Engineer Site Identification for the Tactical Environment (ENSITE) Data Collection and Management with ENSITE

  19. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Visualization of spatial and point-scale water resources data is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux, Python based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of hydrological data from several sources in one place. The study indicates that Internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering, and suggests that such a system can support advanced hydrological data access and analysis tools to improve the utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS Server.
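
    A hedged sketch of pulling features from an ArcGIS Server REST endpoint of the kind NebHydro is built on; the service URL, layer index and field names are placeholders rather than the actual NebHydro services.

        import requests

        layer_url = "https://gis.example.edu/arcgis/rest/services/NebHydro/MapServer/0"  # placeholder

        params = {
            "where": "1=1",                        # no attribute filter
            "outFields": "SITE_ID,STAGE_FT",       # hypothetical field names
            "geometry": "-104.1,40.0,-95.3,43.0",  # rough Nebraska envelope (xmin,ymin,xmax,ymax)
            "geometryType": "esriGeometryEnvelope",
            "inSR": "4326",
            "spatialRel": "esriSpatialRelIntersects",
            "returnGeometry": "true",
            "f": "json",
        }
        resp = requests.get(layer_url + "/query", params=params, timeout=30)
        resp.raise_for_status()
        for feature in resp.json().get("features", []):
            print(feature["attributes"])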

  20. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.

  1. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) specification to enable the sharing of transportation data across the federal level (data from BTS/DOT), the state level (VDOT), and industry (Intergraph). CEOSR develops WFS solutions using Intergraph software; relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph, and the ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for the transportation framework data listed in Table 1; 2) builds a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme, in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
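
    A hedged sketch of requesting transportation framework features over WFS with OWSLib, returning GML as described above; the endpoint URL and feature type name are placeholders rather than the actual CEOSR/VDOT services.

        from owslib.wfs import WebFeatureService

        wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")  # placeholder
        print(list(wfs.contents))                              # advertised feature types

        # GetFeature returns GML that any OGC-compliant client can consume
        gml = wfs.getfeature(typename="dot:Roads",             # placeholder type name
                             bbox=(-77.6, 38.6, -77.0, 39.1),  # rough northern-Virginia extent
                             maxfeatures=50)
        with open("roads.gml", "wb") as f:
            f.write(gml.read())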

  2. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, from limited processing and storage capacity to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. It is intended as an ultimate geospatial collaboration platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available at all times, accessible from anywhere, scalable, runs in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming languages and components, and is flexible in incorporating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. It is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of the services.

  3. Introduction to the Complex Geospatial Web in Geographical Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos

    2010-01-01

    The Geospatial Web is emerging in the geographical education landscape in all its complexity. How will geographers and educators react? What are the most important facets of this development? After reviewing the possible impacts on geographical education, it can be conjectured that the Geospatial Web will eventually replace the usual geographical…

  4. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods (remote sensing observations, field surveys, model simulations, etc.) and stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a huge barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that having standard mechanisms for data discovery and access is not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the experiences learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived at the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how the spatial and temporal information are properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how those standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how the metadata should be collected and organized so that they can be discovered through standard catalog services.
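
    A minimal sketch of writing a small CF-convention NetCDF file of the kind described above, using the netCDF4 package; the variable name, units and values are illustrative only.

        import numpy as np
        from netCDF4 import Dataset

        ds = Dataset("example_cf.nc", "w", format="NETCDF4")
        ds.Conventions = "CF-1.6"
        ds.title = "Illustrative gridded product"

        ds.createDimension("time", None)          # unlimited time dimension
        ds.createDimension("lat", 3)
        ds.createDimension("lon", 4)

        time = ds.createVariable("time", "f8", ("time",))
        time.units = "days since 2000-01-01 00:00:00"
        time.calendar = "standard"

        lat = ds.createVariable("lat", "f4", ("lat",))
        lat.units, lat.standard_name = "degrees_north", "latitude"
        lon = ds.createVariable("lon", "f4", ("lon",))
        lon.units, lon.standard_name = "degrees_east", "longitude"

        gpp = ds.createVariable("gpp", "f4", ("time", "lat", "lon"), fill_value=-9999.0)
        gpp.units = "kg m-2 s-1"
        gpp.long_name = "gross primary productivity (illustrative values)"

        time[:] = [0.0]
        lat[:] = [35.0, 35.5, 36.0]
        lon[:] = [-84.5, -84.0, -83.5, -83.0]
        gpp[0, :, :] = np.random.rand(3, 4).astype("f4") * 1e-7
        ds.close()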

  5. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  6. U.S. EPAs Geospatial Data Access Project

    EPA Pesticide Factsheets

    To improve public health and the environment, the United States Environmental Protection Agency (EPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata Shapefile, Feature Class or extensible markup language (XML) file containing facility and site information from EPA's national program systems. The files are Internet accessible from the Envirofacts Web site (https://www3.epa.gov/enviro/). The data may be used with geospatial mapping applications. (Note: The files omit facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the files contain a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.

  7. Interoperability And Value Added To Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Gasperi, J.

    2012-04-01

    Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.

  8. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

    For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications and public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.

  9. A geospatial search engine for discovering multi-format geospatial data across the web

    Treesearch

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...

  10. Browsing and Visualization of Linked Environmental Data

    NASA Astrophysics Data System (ADS)

    Nikolaou, Charalampos; Kyzirakos, Kostis; Bereta, Konstantina; Dogani, Kallirroi; Koubarakis, Manolis

    2014-05-01

    Linked environmental data has started to appear on the Web as environmental researchers make use of technologies such as ontologies, RDF, and SPARQL. Many of these datasets have an important geospatial and temporal dimension. The same is true for the Web of data, which is being rapidly populated not only with geospatial information but also with temporal information. As the real-world entities represented in linked geospatial datasets evolve over time, the datasets themselves get updated, and both the spatial and the temporal dimensions of data become significant for users. For example, in the Earth Observation and Environment domains, data is constantly produced by satellite sensors and is associated with metadata containing, among others, temporal attributes, such as the time that an image was acquired. In addition, the acquisitions are considered to be valid for specific periods of time, for example until they get updated by new acquisitions. Satellite acquisitions might be utilized in applications such as the CORINE Land Cover programme operated by the European Environment Agency, which makes the land cover of European areas available as a cartographic product. Periodically, CORINE publishes the changes in the land cover of these areas in the form of changesets. Tools for exploiting the abundance of geospatial information have also started to emerge. However, these tools are designed for browsing a single data source and, in addition, cannot represent the temporal dimension. This is for two reasons: a) the lack of an implementation of a data model and a query language with temporal features covering the various semantics associated with the representation of time (e.g., valid and user-defined), and b) the lack of a standard temporal extension of RDF that practitioners could use when publishing RDF data. Recently, we presented the temporal features of the data model stRDF, the query language stSPARQL, and their implementation in the geospatial RDF store Strabon (http://www.strabon.di.uoa.gr/) which, apart from querying geospatial information, can also be used to query both the valid time of a triple and user-defined time. With the aim of filling the aforementioned gaps and going beyond data exploration to map creation and sharing, we have designed and developed SexTant (http://sextant.di.uoa.gr/). SexTant can be used to produce thematic maps by layering spatiotemporal information that exists in a number of data sources, ranging from standard SPARQL endpoints, to SPARQL endpoints following the GeoSPARQL standard defined by the Open Geospatial Consortium (OGC) for the modelling and querying of geospatial information, and other well-adopted geospatial file formats, such as KML and GeoJSON. In this work, we pick some real use cases from the environment domain to showcase the usefulness of SexTant for the environmental studies of a domain expert, presenting its browsing and visualization capabilities on a number of environmental datasets that we have published as linked data as well as other geospatial data sources publicly available on the Web, such as KML files.
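
    A hedged sketch of the kind of GeoSPARQL query such a browser might issue when layering a map; the endpoint URL, vocabulary and land-cover class below are placeholders, not the actual Strabon endpoint or CORINE datasets.

        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("http://example.org/strabon/Query")   # placeholder endpoint
        endpoint.setReturnFormat(JSON)
        endpoint.setQuery("""
            PREFIX geo: <http://www.opengis.net/ont/geosparql#>
            PREFIX ex:  <http://example.org/landcover#>
            SELECT ?area ?wkt WHERE {
              ?area a ex:LandCoverArea ;
                    ex:coverClass "forest" ;
                    geo:hasGeometry ?g .
              ?g geo:asWKT ?wkt .
            } LIMIT 10
        """)
        results = endpoint.query().convert()
        for row in results["results"]["bindings"]:
            print(row["area"]["value"], row["wkt"]["value"][:60], "...")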

  11. Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories

    NASA Astrophysics Data System (ADS)

    Scharl, Arno

    International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.

  12. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    NASA Astrophysics Data System (ADS)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and network socialisation have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena and disciplines in the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been integrally involved in hyperlinked-world issues since the 1990s, and the automatic assembly of different geospatial web and distributed geospatial information systems utilizing service chaining has recently been explored and built, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected characteristics of political events. Based on the Heritrix framework and analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology with text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and to track the dissemination of information regarding geo-events, text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
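
    A small sketch in the spirit of the frequency analysis described above: fetch a page, strip markup and count word occurrences. The URL is a placeholder, and plain requests is used here rather than the Heritrix crawler employed in the study.

        import re
        from collections import Counter
        import requests

        url = "https://news.example.org/article-about-huangyan-island"   # placeholder page
        html = requests.get(url, timeout=30).text

        text = re.sub(r"<[^>]+>", " ", html)                 # crude tag stripping
        words = re.findall(r"[A-Za-z]{3,}", text.lower())    # keep words of 3+ letters
        stop = {"the", "and", "for", "that", "with", "from"}
        freq = Counter(w for w in words if w not in stop)

        for word, count in freq.most_common(15):             # the data behind a tag cloud
            print(f"{word:>15s}  {count}")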

  13. The geospatial data quality REST API for primary biodiversity data

    PubMed Central

    Otegui, Javier; Guralnick, Robert P.

    2016-01-01

    Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340
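
    A hedged sketch of calling the Geospatial Data Quality API described above for a single record; the query-parameter names used here (Darwin Core style) and the response field are assumptions made for illustration, so the actual API documentation should be consulted for the real request format.

        import requests

        API = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"   # from the abstract

        record = {                      # assumed Darwin Core style parameters
            "decimalLatitude": "42.03",
            "decimalLongitude": "-93.63",
            "countryCode": "US",
            "scientificName": "Peromyscus maniculatus",
        }
        resp = requests.get(API, params=record, timeout=30)
        resp.raise_for_status()
        report = resp.json()                     # the service exchanges JSON
        print(report.get("flags", report))       # completeness/consistency flags (field name assumed)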

  14. An Architecture for Automated Fire Detection Early Warning System Based on Geoprocessing Service Composition

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.

    2013-09-01

    Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Owing to the distributed nature of data and processing resources in disaster management, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides efficient, flexible and reliable implementations for handling different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on SOA platforms to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture design based on OGC web services for an automated workflow that acquires and processes remotely sensed data, detects fire and sends notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented using web-based processing of remote sensing imagery, utilizing MODIS data. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as a composition engine. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigation of the feasibility and benefits of the proposed framework shows that it can be used for a wide range of geospatial applications, especially disaster management and environmental monitoring.

  15. Exploring NASA GES DISC Data with Interoperable Services

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey

    2015-01-01

    Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services:
    Open-standard and interoperable services:
    - Improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards
    - Achieve data, information and knowledge sharing across applications with standardized interfaces and protocols
    Open Geospatial Consortium (OGC) data services and specifications:
    - Web Coverage Service (WCS) -- data
    - Web Map Service (WMS) -- pictures of data
    - Web Map Tile Service (WMTS) -- pictures of data tiles
    - Styled Layer Descriptors (SLD) -- rendered styles
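
    A hedged sketch of retrieving data (rather than pictures of data) through an OGC Web Coverage Service using OWSLib; the endpoint URL, coverage identifier and bounding box are placeholders, not a specific GES DISC service.

        from owslib.wcs import WebCoverageService

        wcs = WebCoverageService("https://example.nasa.gov/wcs", version="1.0.0")  # placeholder
        print(list(wcs.contents))                          # coverage identifiers on offer

        cov = wcs.getCoverage(identifier="surface_air_temperature",   # placeholder identifier
                              bbox=(-125.0, 24.0, -66.0, 50.0),       # rough CONUS extent
                              crs="EPSG:4326",
                              format="GeoTIFF",
                              width=600, height=300)
        with open("coverage.tif", "wb") as f:
            f.write(cov.read())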

  16. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploring earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform-independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  17. PolarHub: A Global Hub for Polar Data Discovery

    NASA Astrophysics Data System (ADS)

    Li, W.

    2014-12-01

    This paper reports the outcome of an NSF project developing PolarHub, a large-scale web crawler that automatically discovers distributed polar datasets exposed as OGC web services (OWS) in cyberspace. PolarHub is a machine robot; its goal is to visit as many webpages as possible to find those containing information about polar OWS, extract this information and store it in a backend data repository. This is a very challenging task given the huge volume of webpages on the Web. Four unique features were introduced in PolarHub to distinguish it from earlier crawler solutions: (1) multi-task, multi-user, multi-thread support for the crawling tasks; (2) extensive use of the thread pool and Data Access Object (DAO) design patterns to separate persistent data storage from business logic and achieve high extensibility of the crawler tool; (3) a pattern-matching based, customizable crawling algorithm to support discovery of multiple types of geospatial web services; and (4) a universal and portable client-server communication mechanism combining server-push and client-pull strategies for enhanced asynchronous processing. A series of experiments was conducted to identify the impact of crawling parameters on overall system performance. The geographical distribution pattern of all PolarHub-identified services is also demonstrated. We expect this work to make a major contribution to the field of geospatial information retrieval and geospatial interoperability, to bridge the gap between data providers and data consumers, and to accelerate polar science by enhancing the accessibility and reusability of adequate polar data.
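
    An illustrative sketch of the pattern-matching idea behind such a crawler: scan fetched pages for links that look like OGC GetCapabilities endpoints. This is a simplification for illustration, not the PolarHub implementation, and the seed URL is a placeholder.

        import re
        import requests

        URL_RE = re.compile(r"https?://[^\s\"'<>]+", re.IGNORECASE)

        def looks_like_ows(url: str) -> bool:
            """Heuristic test for an OGC GetCapabilities link."""
            u = url.lower()
            return "request=getcapabilities" in u and any(
                "service=" + s in u for s in ("wms", "wfs", "wcs", "csw", "sos"))

        def find_ows_endpoints(page_url: str) -> set:
            """Fetch one page and return candidate OGC service endpoints found in it."""
            html = requests.get(page_url, timeout=30).text
            return {u for u in URL_RE.findall(html) if looks_like_ows(u)}

        for endpoint in find_ows_endpoints("https://data.example.org/polar-catalog"):  # placeholder seed
            print("candidate OWS:", endpoint)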

  18. The National Map product and services directory

    USGS Publications Warehouse

    Newell, Mark R.

    2008-01-01

    As one of the cornerstones of the U.S. Geological Survey's (USGS) National Geospatial Program (NGP), The National Map is a collaborative effort among the USGS and other Federal, state, and local partners to improve and deliver topographic information for the Nation. It has many uses ranging from recreation to scientific analysis to emergency response. The National Map is easily accessible for display on the Web, as products, and as downloadable data. The geographic information available from The National Map includes orthoimagery (aerial photographs), elevation, geographic names, hydrography, boundaries, transportation, structures, and land cover. Other types of geographic information can be added to create specific types of maps. Of major importance, The National Map currently is being transformed to better serve the geospatial community. The USGS National Geospatial Program Office (NGPO) was established to provide leadership for placing geographic knowledge at the fingertips of the Nation. The office supports The National Map, Geospatial One-Stop (GOS), National Atlas of the United States®, and the Federal Geographic Data Committee (FGDC). This integrated portfolio of geospatial information and data supports the essential components of delivering the National Spatial Data Infrastructure (NSDI) and capitalizing on the power of place.

  19. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need descriptions of the data sources and of the methods used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format that accommodates all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, the data sources description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion The designed HERXML proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445

  20. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts on providing unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observation Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near real time. OGC SWE proposes a revolutionary concept towards web-connected and web-controllable sensor networks. However, these efforts have not provided the capability to allow dynamic data integration and fusion among heterogeneous sources, data filtering, or support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth). This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.

  1. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from these massive and heterogeneous resources. The past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics cause issues in data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues; however, integrating semantic technologies with existing services is challenging due to the expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of value-added additional information (such as service quality and user feedback), which conveys important decision-support information, is missing. To address these issues, we prototyped a distributed search engine, GeoSearch, based on a brokering middleware framework to search, integrate and visualize heterogeneous geospatial resources. Specifically, 1) a lightweight discovery broker is developed to conduct distributed search; the broker retrieves metadata records for geospatial resources and additional information from dispersed services (portals and catalogues) and other systems on the fly. 2) A quality monitoring and evaluation broker (i.e., QoS Checker) is developed and integrated to provide quality information for geospatial web services. 3) Semantics-assisted search and relevance evaluation functions are implemented by loosely interoperating with an ESIP Testbed component. 4) Sophisticated information and data visualization functionalities and tools are assembled to improve the user experience and assist resource selection.
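
    A hedged sketch of the kind of catalogue query a discovery broker might issue on the fly, using OWSLib's CSW client; the CSW endpoint URL is a placeholder and the search keyword is only an example.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://example.org/geonetwork/srv/eng/csw")  # placeholder endpoint
        query = PropertyIsLike("csw:AnyText", "%land cover%")                    # example keyword filter
        csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

        for rec_id, rec in csw.records.items():
            print(rec.title, "--", rec.type)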

  2. R4FRS_RCRAINFO

    EPA Pesticide Factsheets

    To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (http://www.epa.gov/enviro). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.

  3. US EPA Region 4 RMP Facilities

    EPA Pesticide Factsheets

    To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (http://www.epa.gov/enviro). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.

  4. GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.

    PubMed

    Liang, Steve H L; Huang, Chih-Yuan

    2013-10-02

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, through the demonstration of three real-world powered-by-GeoCENS sensor web applications, we argue that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.

  5. GeoCENS: A Geospatial Cyberinfrastructure for the World-Wide Sensor Web

    PubMed Central

    Liang, Steve H.L.; Huang, Chih-Yuan

    2013-01-01

    The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver the large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, through the demonstration of three real-world powered-by-GeoCENS sensor web applications, we argue that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision. PMID:24152921

  6. A flexible geospatial sensor observation service for diverse sensor data based on Web service

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min

    Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes development of a service-oriented multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services — Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T) and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T, and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the SWE is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology. It can be easily deployed in any Java servlet container and automatically exposed for discovery using Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
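
    As an illustration of the consumer-to-SOS interaction described above, the sketch below builds a KVP GetObservation request against a hypothetical SOS endpoint; the offering and observed-property identifiers are placeholders, and real deployments also accept the XML/POST binding.

        # Hypothetical sketch: issue an OGC SOS KVP GetObservation request.
        import urllib.parse
        import urllib.request

        SOS_ENDPOINT = "https://sensors.example.org/sos"  # hypothetical service URL

        params = {
            "service": "SOS",
            "version": "1.0.0",
            "request": "GetObservation",
            "offering": "WEATHER_STATIONS",                                 # hypothetical offering id
            "observedProperty": "urn:ogc:def:phenomenon:air_temperature",  # placeholder
            "responseFormat": "text/xml;subtype=\"om/1.0.0\"",
        }

        url = SOS_ENDPOINT + "?" + urllib.parse.urlencode(params)
        # with urllib.request.urlopen(url, timeout=30) as resp:
        #     observation_xml = resp.read()  # O&M document to be parsed by the client
        print(url)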

  7. Finding geospatial pattern of unstructured data by clustering routes

    NASA Astrophysics Data System (ADS)

    Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.

    2016-12-01

    Today the majority of data generated has a geospatial context, whether as explicit attributes such as latitude and longitude, as a place name, or as something cross-referenceable through other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and elsewhere. We are working together on the DARPA MEMEX project to exploit open source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP to automatically extract, and make meaning out of, geospatial information. In particular, we are interested in unstructured descriptors, e.g., a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to patterns that range from easily identifiable to more difficult to track. We will present a set of automatic techniques to extract descriptors and then to geospatially infer their paths across unstructured data.
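
    A toy sketch of the descriptor-to-path idea under stated assumptions: phone numbers are pulled out with a regular expression, place names are resolved against a tiny in-memory gazetteer (the real work uses Apache Tika, OpenNLP and the Lucene Geo Gazetteer), and sightings of the same descriptor are ordered in time to approximate a path. All documents and gazetteer entries are invented for illustration.

        # Toy sketch: extract a phone-number descriptor and the places mentioned with it,
        # then order the sightings by date to approximate a geospatial path.
        import re
        from collections import defaultdict

        GAZETTEER = {  # hypothetical in-memory gazetteer: place name -> (lat, lon)
            "Los Angeles": (34.05, -118.24),
            "Las Vegas": (36.17, -115.14),
            "Phoenix": (33.45, -112.07),
        }
        PHONE_RE = re.compile(r"\b\d{3}-\d{3}-\d{4}\b")

        def extract_sightings(documents):
            """documents: iterable of (date, text) pairs from crawled pages."""
            paths = defaultdict(list)
            for date, text in documents:
                places = [GAZETTEER[name] for name in GAZETTEER if name in text]
                for phone in PHONE_RE.findall(text):
                    for coord in places:
                        paths[phone].append((date, coord))
            return {phone: sorted(hits) for phone, hits in paths.items()}

        docs = [
            ("2016-01-10", "Call 555-123-4567, available in Los Angeles this week."),
            ("2016-02-12", "555-123-4567 now posting from Las Vegas."),
        ]
        print(extract_sightings(docs))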

  8. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographic visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a general virtual research environment (VRE) architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographic web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that implements the statistical processing functionality and thus provides analysis of large datasets, with results available for visualization and export to files in standard formats (XML, binary, etc.). Several cartographic web services have been developed in a system prototype to provide capabilities for working with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.

  9. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under investigation within the planetary geospatial community.

  10. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    NASA Astrophysics Data System (ADS)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spreading of landslide information and its reach to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios from available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free and Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end and PostgreSQL with the PostGIS extension serves as the backend for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings the understanding of landslides and the resulting damage closer to the affected people and user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
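
    To illustrate the kind of spatial query such a PostGIS backend can answer, here is a hedged sketch that counts recorded landslides within 500 m of the highway corridor per year; the connection string and the table and column names (landslide_events, highway_corridor, geom, event_date) are hypothetical.

        # Hypothetical sketch: query a PostGIS backend for landslides near the highway.
        import psycopg2  # PostgreSQL driver

        conn = psycopg2.connect("dbname=landslides user=webgis password=secret host=localhost")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT date_part('year', l.event_date) AS year, count(*)
                FROM landslide_events AS l,            -- hypothetical table of mapped slides
                     highway_corridor AS h             -- hypothetical road corridor geometry
                WHERE ST_DWithin(l.geom::geography, h.geom::geography, 500)  -- within 500 m
                GROUP BY year
                ORDER BY year;
                """
            )
            for year, n in cur.fetchall():
                print(int(year), n)
        conn.close()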

  11. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Combining information from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojection, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process into an independent service. An indexing service crawls data collections either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY gives the user the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready access for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or raw data arrays using Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
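
    The on-demand reprojection and resampling that GSKY performs server-side can be pictured with a small GDAL sketch (an illustration of the operation only, not GSKY's actual implementation); the file names, target projection and resolution are placeholders.

        # Illustration of on-the-fly reprojection/resampling of a raster with GDAL.
        from osgeo import gdal

        gdal.UseExceptions()

        src = "input_scene.tif"          # placeholder: a satellite observation granule
        dst = "/vsimem/reprojected.tif"  # in-memory output, nothing written to disk

        warped = gdal.Warp(
            dst,
            src,
            dstSRS="EPSG:3857",          # target projection requested by the client
            xRes=250, yRes=250,          # resample to a common resolution
            resampleAlg="bilinear",
        )
        band = warped.GetRasterBand(1)
        print(warped.RasterXSize, warped.RasterYSize, band.GetNoDataValue())
        warped = None                    # close the dataset and free the /vsimem file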

  12. US EPA Region 4 Brownfields

    EPA Pesticide Factsheets

    To improve public health and the environment, the United States Environmental Protection Agency (USEPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata shapefile containing facility and site information from EPA's national program systems. The file is Internet accessible from the Envirofacts Web site (https://www3.epa.gov/enviro/). The data may be used with geospatial mapping applications. (Note: The shapefile omits facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the file contains a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site. This dataset shows Brownfields listed in the 2012 Facility Registry System.

  13. 303(d) Listed Impaired Waters

    EPA Pesticide Factsheets

    Geospatial data for 303(d) Impaired Waters are available as prepackaged national downloads or as GIS web and data services. EPA provides the geospatial data as GIS-compatible shapefiles and geodatabases, and through ESRI and OGC web mapping services.

  14. Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom

    NASA Astrophysics Data System (ADS)

    Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier

    2013-04-01

    Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom to redistribute references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds, together with direct data, and share this status with other users who need the same information and use different client vendor products in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, the document can be viewed in any setting where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context is to develop an XSLT that transforms the Atom feed into an HTML5 document showing the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, to calculate the scale of the map. Then we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application, the MiraMon Map Browser, is able to write a context document and read it again to recover the context of the previous view, or load a context generated by another application. The possibility of storing direct links to files in OWS Context is particularly interesting for GIS desktop solutions. This communication also presents the development made in the MiraMon desktop GIS solution to include OWS Context. The MiraMon software is able to deal with local files, web services and database connections. As in any other GIS solution, the MiraMon team designed its own file format (MiraMon Map, MMM) for storing and sharing the status of a GIS session. The new OWS Context format is now adopted as an interoperable substitute for the MMM. The extensibility of the format makes it possible to map concepts in the MMM to current OWS Context elements (such as titles, data links, extent, etc.) and to generate new elements that include all extra metadata not currently covered by OWS Context. These developments were done in the ninth edition of the OpenGIS Web Services Interoperability Experiment (OWS-9) and are demonstrated in this communication.
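
    A minimal sketch of the Atom-plus-GeoRSS encoding on which OWS Context builds: one entry carrying a title, a bounding box and a WMS link. The Atom and GeoRSS namespaces are standard; the OWS Context offering elements defined by the OGC specification are omitted here for brevity, and the WMS URL is a placeholder.

        # Minimal Atom feed with a GeoRSS extent, the backbone of an OWS Context document.
        import xml.etree.ElementTree as ET

        ATOM = "http://www.w3.org/2005/Atom"
        GEORSS = "http://www.georss.org/georss"
        ET.register_namespace("", ATOM)
        ET.register_namespace("georss", GEORSS)

        feed = ET.Element(f"{{{ATOM}}}feed")
        ET.SubElement(feed, f"{{{ATOM}}}title").text = "Shared map context"

        entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}title").text = "Land cover layer"
        # Geographic extent of the client view: south west north east (GeoRSS box).
        ET.SubElement(entry, f"{{{GEORSS}}}box").text = "40.0 0.0 43.0 3.5"
        link = ET.SubElement(entry, f"{{{ATOM}}}link")
        link.set("rel", "enclosure")
        link.set("href", "https://maps.example.org/wms?SERVICE=WMS&REQUEST=GetCapabilities")  # placeholder

        print(ET.tostring(feed, encoding="unicode"))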

  15. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    NASA Astrophysics Data System (ADS)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics. The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years two main trends have been disrupting geospatial applications: mobile devices and context sharing. People have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. It builds upon a database container, SQLite, that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout are described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'. When users are looking at an article or a product on the web, they can easily share this information with colleagues or friends via an email that includes URLs (links to web resources) and attachments (inline data). In the case of geospatial information, a user would like to share a map created from different OGC sources, which may include, for example, WMS and WFS links, and GML and KML annotations. The emerging OGC file format is called the OGC Web Services Context Document (OWS Context), which allows clients to reproduce a map previously created by someone else. Context sharing is important in a variety of domains, from emergency response, where fire, police and emergency medical personnel need to work off a common map, to multi-national military operations, where coalition forces need to share common data sources but have cartographic displays in different languages and symbology sets. OWS Contexts can be written in XML (building upon the Atom Syndication Format) or JSON. This presentation will provide an introduction to GeoPackage and OWS Context and how they can be used to advance the sharing of Earth and Space Science information.
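
    Because a GeoPackage is just a set of SQLite tables, its contents can be listed with nothing more than the Python standard library; the sketch below reads the required gpkg_contents table of a hypothetical file (reading the geometry blobs themselves needs a GeoPackage-aware library).

        # List the datasets held in a GeoPackage by reading its gpkg_contents table.
        import sqlite3

        db = sqlite3.connect("example.gpkg")      # placeholder file name
        rows = db.execute(
            "SELECT table_name, data_type, min_x, min_y, max_x, max_y, srs_id "
            "FROM gpkg_contents"
        )
        for table, kind, minx, miny, maxx, maxy, srs in rows:
            print(table, kind, "srs_id:", srs, "extent:", (minx, miny, maxx, maxy))
        db.close()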

  16. Spatial information semantic query based on SPARQL

    NASA Astrophysics Data System (ADS)

    Xiao, Zhifeng; Huang, Lei; Zhai, Xiaofang

    2009-10-01

    How can the efficiency of spatial information queries be enhanced in today's fast-growing information age? We are rich in geospatial data but poor in up-to-date geospatial information and knowledge that are ready to be accessed by public users. This paper adopts an approach for querying spatial semantics by building a Web Ontology Language (OWL) ontology and introducing the SPARQL Protocol and RDF Query Language (SPARQL) to search spatial semantic relations. It is important to establish spatial semantics that support effective spatial reasoning for performing semantic queries. Compared to earlier keyword-based and information retrieval techniques that rely on syntax, we use semantic approaches in our spatial query system. Semantic approaches need to be developed with an ontology, so we use OWL to describe spatial information extracted from the large-scale map of Wuhan. Spatial information expressed by an ontology with formal semantics is available to machines for processing and to people for understanding. The approach is illustrated by a case study that uses SPARQL to query geospatial ontology instances of Wuhan. The paper shows that using SPARQL to search OWL ontology instances can ensure the accuracy and applicability of the results. The results also indicate that constructing a geospatial semantic query system has a positive effect on spatial query and retrieval.
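
    A hedged sketch of the query step using the rdflib library: load an OWL file and run a SPARQL query over it. The ontology file name, the namespace IRI and the property names (touches, hasName) are hypothetical stand-ins for the Wuhan ontology described in the paper.

        # Sketch: query spatial relations stored in an OWL ontology with SPARQL (rdflib).
        from rdflib import Graph

        g = Graph()
        g.parse("wuhan_spatial.owl", format="xml")  # hypothetical OWL file from the map data

        QUERY = """
        PREFIX geo: <http://example.org/wuhan/spatial#>   # hypothetical namespace
        SELECT ?district ?name
        WHERE {
            ?district geo:touches geo:YangtzeRiver .      # hypothetical spatial relation
            ?district geo:hasName ?name .
        }
        """

        for district, name in g.query(QUERY):
            print(district, name)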

  17. Geospatial Web Services in Real Estate Information System

    NASA Astrophysics Data System (ADS)

    Radulovic, Aleksandra; Sladic, Dubravka; Govedarica, Miro; Popovic, Dragana; Radovic, Jovana

    2017-12-01

    Since cadastral records are of great importance for the economic development of the country, they must be well structured and organized. Real estate records on the territory of Serbia faced many problems in previous years. To prevent such problems and to achieve efficient access, sharing and exchange of cadastral data on the principles of interoperability, a domain model for real estate was created according to current standards in the field of spatial data. The resulting profile of the domain model for the Serbian real estate cadastre is based on the current legislation and on the Land Administration Domain Model (LADM) specified in the ISO 19152 standard. On top of such organized data, and for their effective exchange, it is necessary to develop a model of services to be provided by the institutions interested in the exchange of cadastral data. This is achieved by introducing a service-oriented architecture into the information system of the real estate cadastre, thereby ensuring the efficiency of the system. It is necessary to develop user services for the download, review and use of real estate data over the web. These services should be provided to all users who need access to cadastral data (natural and legal persons as well as state institutions) through e-government. It is also necessary to provide search, view and download of cadastral spatial data through geospatial services. Considering that real estate includes geometric data for parcels and buildings, it is necessary to establish a set of geospatial services that provide information and maps for the analysis of spatial data and for generating raster data. Besides the theme Cadastral parcels, the INSPIRE directive specifies several themes involving data on buildings and land use, for which data can be provided from the real estate cadastre. In this paper, a model of geospatial services in Serbia is defined. A case study of using these services to estimate which households are at risk of flooding, using Web Processing Service (WPS) spatial analysis, is described.

  18. EPA Geospatial Applications

    EPA Pesticide Factsheets

    EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.

  19. Sensor Webs with a Service-Oriented Architecture for On-demand Science Products

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Ungar, Stephen; Ames, Troy; Justice, Chris; Frye, Stuart; Chien, Steve; Tran, Daniel; Cappelaere, Patrice; Derezinsfi, Linda; Paules, Granville

    2007-01-01

    This paper describes the work being managed by the NASA Goddard Space Flight Center (GSFC) Information System Division (ISD) under a NASA Earth Science Technology Office (ESTO) Advanced Information System Technology (AIST) grant to develop a modular sensor web architecture that enables discovery of sensors and workflows which can create customized science products via a high-level service-oriented architecture based on Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) web service standards. These capabilities serve as a prototype for a user-centric architecture for the Global Earth Observing System of Systems (GEOSS). This work builds on and extends previous sensor web efforts conducted at NASA/GSFC using the Earth Observing 1 (EO-1) satellite and other low-earth orbiting satellites.

  20. Oyster Fisheries App

    NASA Technical Reports Server (NTRS)

    Perez Guerrero, Geraldo A.; Armstrong, Duane; Underwood, Lauren

    2015-01-01

    This project is creating a cloud-enabled, HTML 5 web application to help oyster fishermen and state agencies apply Earth science to improve the management of this important natural and economic resource. The Oyster Fisheries app gathers and analyzes environmental and water quality information, and alerts fishermen and resources managers about problems in oyster fishing waters. An intuitive interface based on Google Maps displays the geospatial information and provides familiar interactive controls to the users. Alerts can be tailored to notify users when conditions in specific leases or public fishing areas require attention. The app is hosted on the Amazon Web Services cloud. It is being developed and tested using some of the latest web development tools such as web components and Polymer.

  1. Geospatial Technology Applications and Infrastructure in the Biological Resources Division.

    DTIC Science & Technology

    1998-09-01

    Forestry/forest ecology; Geography; Geology; GIS/mapping technologies; GPS technology; HTML/World Wide Web; Information management/transfer; JAVA; Land... These technologies are being used to understand diet selection, habitat use, hibernation behavior, and social interactions of desert tortoises.

  2. Enhancing Access to Drought Information Using the CUAHSI Hydrologic Information System

    NASA Astrophysics Data System (ADS)

    Schreuders, K. A.; Tarboton, D. G.; Horsburgh, J. S.; Sen Gupta, A.; Reeder, S.

    2011-12-01

    The National Integrated Drought Information System (NIDIS) Upper Colorado River Basin pilot study is investigating and establishing capabilities for better dissemination of drought information for early warning and management. As part of this study we are using and extending functionality from the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS) to provide better access to drought-related data in the Upper Colorado River Basin. The CUAHSI HIS is a federated system for sharing hydrologic data. It comprises multiple data servers, referred to as HydroServers, that publish data in a standard XML format called Water Markup Language (WaterML), using web services referred to as WaterOneFlow web services. HydroServers can also publish geospatial data using Open Geospatial Consortium (OGC) web map, feature and coverage services and are capable of hosting web and map applications that combine geospatial datasets with observational data served via web services. HIS also includes a centralized metadata catalog that indexes data from registered HydroServers and a data access client referred to as HydroDesktop. For NIDIS, we have established a HydroServer to publish drought index values as well as the input data used in drought index calculations. Primary input data required for drought index calculation include streamflow, precipitation, reservoir storage, snow water equivalent, and soil moisture. We have developed procedures to redistribute the input data to the time and space scales chosen for drought index calculation, namely half-monthly time intervals for HUC 10 subwatersheds. The spatial redistribution approach used for each input parameter depends on the spatial linkages for that parameter; for example, the redistribution procedure for streamflow depends on the upstream/downstream connectivity of the stream network, and the precipitation redistribution procedure depends on elevation to account for orographic effects. A set of drought indices is then calculated from the redistributed data. We have created automated data and metadata harvesters that periodically scan and harvest new data from each of the input databases and calculate extensions to the resulting derived data sets, ensuring that the data available on the drought server are kept up to date. This paper describes this system, showing how it facilitates the integration of data from multiple sources to inform the planning and management of water resources during drought. The system may be accessed at http://drought.usu.edu.
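
    An illustrative (not the operational NIDIS) calculation of a simple standardized index from precipitation already redistributed to half-monthly, HUC 10 values: each new value is compared against the historical mean and standard deviation for the same half-month of the year. The numbers and the period labelling are invented for the example.

        # Illustrative standardized anomaly for half-monthly precipitation per subwatershed.
        from statistics import mean, stdev

        def standardized_index(history, current, period):
            """history: dict period -> list of past precipitation values for that half-month.
            current: the newly harvested value; period: e.g. ('Jul', 2) for late July."""
            past = history[period]
            mu, sigma = mean(past), stdev(past)
            return (current - mu) / sigma if sigma > 0 else 0.0

        history = {("Jul", 2): [12.0, 8.5, 15.2, 9.8, 11.1, 7.4]}  # hypothetical record, mm
        print(round(standardized_index(history, current=4.1, period=("Jul", 2)), 2))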

  3. Citing geospatial feature inventories with XML manifests

    NASA Astrophysics Data System (ADS)

    Bose, R.; McGarva, G.

    2006-12-01

    Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
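
    A hedged sketch of what such a manifest might look like when written from code; the element names, attributes and feature identifiers are purely illustrative, since the abstract does not fix a schema.

        # Illustrative feature manifest: a portable listing of versioned feature references.
        import xml.etree.ElementTree as ET

        manifest = ET.Element("featureManifest", source="OS MasterMap")  # illustrative names
        for toid, version in [("osgb1000000000000001", "3"), ("osgb1000000000000002", "5")]:
            ET.SubElement(manifest, "feature", id=toid, version=version)  # placeholder IDs

        ET.indent(manifest)  # Python 3.9+: pretty-print the tree
        print(ET.tostring(manifest, encoding="unicode"))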

  4. Plug and Play web-based visualization of mobile air monitoring data (Abstract)

    EPA Science Inventory

    EPA’s Real-Time Geospatial (RETIGO) Data Viewer web-based tool is a new program reducing the technical barrier to visualize and understand geospatial air data time series collected using wearable, bicycle-mounted, or vehicle-mounted air sensors. The RETIGO tool, with anticipated...

  5. The EnviroAtlas: Connecting ecosystems, people, and well-being

    EPA Science Inventory

    The EnviroAtlas is a web-based application containing a collection of geospatial data, analysis tools, and interpretive information focused on ecosystem goods and services. Ecosystem goods and services are essentially defined as the benefits that humans receive from nature and en...

  6. Efficiently Communicating Rich Heterogeneous Geospatial Data from the FeMO2008 Dive Cruise with FlashMap on EarthRef.org

    NASA Astrophysics Data System (ADS)

    Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.

    2008-12-01

    EarthRef.org is a comprehensive and convenient resource for Earth Science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available, and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have, therefore, been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written for reliability and speed in Adobe Flash. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map which can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser and focus on a separate application, while Google Maps, written in JavaScript, does not scale up reliably to large datasets. FlashMap remedies these problems as a web-based application that allows seamless integration of the real-time display power of Google Earth and the flexibility of the web without losing scalability and control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2 compatible web service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (see SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth to load and view geospatial data content within a browser from any computer with an internet connection.
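
    Since FlashMap consumes standard KML 2.2, any web service can feed it by emitting documents like the minimal one sketched below; the placemark name and coordinates are placeholders standing in for, e.g., a FeMO dive site.

        # Minimal KML 2.2 document with one placemark, as a KML-driven browser would load it.
        import xml.etree.ElementTree as ET

        KML = "http://www.opengis.net/kml/2.2"
        ET.register_namespace("", KML)

        kml = ET.Element(f"{{{KML}}}kml")
        doc = ET.SubElement(kml, f"{{{KML}}}Document")
        pm = ET.SubElement(doc, f"{{{KML}}}Placemark")
        ET.SubElement(pm, f"{{{KML}}}name").text = "Loihi Seamount dive site"  # placeholder
        point = ET.SubElement(pm, f"{{{KML}}}Point")
        # KML coordinates are lon,lat[,alt]; values here are illustrative only.
        ET.SubElement(point, f"{{{KML}}}coordinates").text = "-155.27,18.92,-1200"

        print(ET.tostring(kml, encoding="unicode"))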

  7. Facilitating Data-Intensive Education and Research in Earth Science through Geospatial Web Services

    ERIC Educational Resources Information Center

    Deng, Meixia

    2009-01-01

    The realm of Earth science (ES) is increasingly data-intensive. Geoinformatics research attempts to robustly smooth and accelerate the flow of data to information, information to knowledge, and knowledge to decisions and to supply necessary infrastructure and tools for advancing ES. Enabling easy access to and use of large volumes of ES data and…

  8. Geospatial data sharing, online spatial analysis and processing of Indian Biodiversity data in Internet GIS domain - A case study for raster based online geo-processing

    NASA Astrophysics Data System (ADS)

    Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.

    2014-11-01

    National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels: satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study provides information on highly disturbed and high biological richness areas, suggesting future management strategies and helping formulate action plans. The study has generated, for the first time, a baseline database for India which will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a Geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC WMS and WFS standards for the development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query and map output generation based on user request and response. This is a typical mashup-architecture geo-information system which allows access to remote web services such as ISRO Bhuvan, OpenStreetMap and Google Maps, overlaid on the biodiversity data for effective study of bio-resources. Spatial queries and analysis with vector data are achieved through SQL queries on PostGIS and WFS-T operations. The most important challenge, however, is to develop a system for online raster-based geospatial analysis and processing of large raster datasets based on a user-defined Area of Interest (AOI). Each of the five map data layers in this study is approximately 20 GB in size. An attempt has been made to develop a system using Python, PostGIS and PHP for raster data analysis over the web for biodiversity conservation and prioritization. The developed system takes user input as WKT, OpenLayers-based polygon geometry, or an uploaded shapefile as the AOI to perform raster operations using Python and GDAL/OGR. Intermediate products are stored in temporary files and tables, from which XML outputs are generated for web presentation. Raster operations such as clip-zip-ship, class-wise area statistics, single- to multi-layer operations, diagrammatic representation and other geostatistical analyses are performed. This is an indigenous geospatial data processing engine developed using an open system architecture for spatial analysis of biodiversity datasets in an Internet GIS environment. The performance of the application in a multi-user environment such as the Internet domain is another challenging task, addressed by fine-tuning the source code, server hardening, spatial indexing and running the process in load-balanced mode. The developed system is hosted on the Internet (http://bis.iirs.gov.in) for user access.
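
    A hedged sketch of the AOI-based raster operation described above: GDAL clips a classified raster to a user polygon and NumPy counts pixels per class. The file names are placeholders, the raster is assumed to be single-band with metre-based pixels, and the production system wraps this kind of step behind its web interface.

        # Sketch: clip a classified raster to an AOI polygon and compute class-wise areas.
        import numpy as np
        from osgeo import gdal

        gdal.UseExceptions()

        clipped = gdal.Warp(
            "/vsimem/aoi_clip.tif",
            "vegetation_type.tif",          # placeholder classified raster
            cutlineDSName="aoi.geojson",    # placeholder AOI uploaded by the user
            cropToCutline=True,
            dstNodata=0,
        )
        band = clipped.ReadAsArray()
        gt = clipped.GetGeoTransform()
        pixel_area = gt[1] * abs(gt[5])     # pixel width x height, assumed in metres
        classes, counts = np.unique(band[band != 0], return_counts=True)
        for cls, n in zip(classes, counts):
            print(f"class {cls}: {n * pixel_area / 1e6:.2f} km2")
        clipped = None                      # close the in-memory dataset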

  9. Advancing the Implementation of Hydrologic Models as Web-based Applications

    NASA Astrophysics Data System (ADS)

    Dahal, P.; Tarboton, D. G.; Castronova, A. M.

    2017-12-01

    Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology, and snow hydrology. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and involves a workflow that is often difficult to reproduce. This work introduces a prototype web-based infrastructure, in the form of a web application, that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides data preparation and processing capabilities to support the backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform

  10. Nick Grue | NREL

    Science.gov Websites

    Geospatial data analysis using parallel processing; high-performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products. Research interests: rapid, web-based renewable resource analysis.

  11. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open standards based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow what-if scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access to and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues, and these results will now be deployed in pilot applications. The testbed also identified areas of additional development needed to identify the scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.

  12. GeoSearcher: Location-Based Ranking of Search Engine Results.

    ERIC Educational Resources Information Center

    Watters, Carolyn; Amoudi, Ghada

    2003-01-01

    Discussion of Web queries with geospatial dimensions focuses on an algorithm that assigns location coordinates dynamically to Web sites based on the URL. Describes a prototype search system that uses the algorithm to re-rank search engine results for queries with a geospatial dimension, thus providing an alternative ranking order for search engine…
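
    A small sketch of the re-ranking idea (not the authors' exact algorithm): once each result URL has been assigned coordinates, results are re-ordered by great-circle distance from the location implied by the query. The example URLs and coordinates are invented for illustration.

        # Re-rank search results by great-circle distance from the query location.
        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            p1, p2 = radians(lat1), radians(lat2)
            dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        def rerank(results, query_lat, query_lon):
            """results: list of (url, lat, lon) with coordinates already assigned."""
            return sorted(results, key=lambda r: haversine_km(query_lat, query_lon, r[1], r[2]))

        results = [("http://a.example", 49.26, -123.11), ("http://b.example", 44.65, -63.57)]
        print(rerank(results, query_lat=44.67, query_lon=-63.61))  # query centred near the second site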

  13. Making geospatial data in ASF archive readily accessible

    NASA Astrophysics Data System (ADS)

    Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.

    2015-12-01

    The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces developed specifically for its particular archive and data sets. ASF then moved to an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to users to develop the tools needed for more tailored access to the data. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) Web Feature Service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). References: Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.

  14. Geospatial resources for the geologic community: The USGS National Map

    USGS Publications Warehouse

    Witt, Emitt C.

    2015-01-01

    Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are being offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30- m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option—any prestaged data that intersect the requesting bounding box will be, in their entirety, part of the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist’s first stop for geospatial information and data.

  15. Development of a Web-Enabled Learning Platform for Geospatial Laboratories: Improving the Undergraduate Learning Experience

    ERIC Educational Resources Information Center

    Mui, Amy B.; Nelson, Sarah; Huang, Bruce; He, Yuhong; Wilson, Kathi

    2015-01-01

    This paper describes a web-enabled learning platform providing remote access to geospatial software that extends the learning experience outside of the laboratory setting. The platform was piloted in two undergraduate courses, and includes a software server, a data server, and remote student users. The platform was designed to improve the quality…

  16. RE Data Explorer: Informing Variable Renewable Energy Grid Integration for Low Emission Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    The RE Data Explorer, developed by the National Renewable Energy Laboratory, is an innovative web-based analysis tool that utilizes geospatial and spatiotemporal renewable energy data to visualize, execute, and support analysis of renewable energy potential under various user-defined scenarios. This analysis can inform high-level prospecting, integrated planning, and policy making to enable low emission development.

  17. Matching Alternative Addresses: a Semantic Web Approach

    NASA Astrophysics Data System (ADS)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    The rapid development of crowd-sourced or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated over short periods of time. Changes made to spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that require a feature's literal information, such as addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for a long period of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data demonstrating the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and on matching and disambiguating lexical annotations.

  18. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    NASA Astrophysics Data System (ADS)

    Suliman, M. D. H.; Mahmud, M.; Reba, M. N. M.; S, L. W.

    2014-02-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality and soil structure. However, these implications can be minimized through an effective disaster management system. Effective disaster management mechanisms can be developed through an appropriate early warning system as well as an efficient delivery system. This study focuses on two aspects: mapping forest and land fire potential, and delivering the information to users through a WebGIS application. Geospatial technology and mathematical modeling are used in this study to identify, classify and map areas with burning potential. The mathematical model used is the Analytical Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, Geographic Information Systems (GIS) and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported over the last two decades. The AHP model was designed to assess comparisons among the three main criteria of fuel, topography and human factors. Contributions of experts directly involved in forest and land fire fighting operations, comprising officials from the Fire and Rescue Department Malaysia, were also evaluated in the model. The study found that about 32.83 square kilometers of the total area of Selangor state have extreme fire potential. The extreme-potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. The resulting forest and land fire potential information is displayed continuously in a WebGIS application on the Internet. Displaying information through WebGIS applications is a better approach to support decision-making with a high level of confidence under conditions that approximate reality. Agencies involved in disaster management, such as the District, State and National Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) under the National Security Division, and the Fire and Rescue Department Malaysia, can use the results of this study to prepare for land and forest fires in the future.
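
    For readers unfamiliar with AHP, the sketch below shows how criterion weights are derived from a pairwise comparison matrix via its principal eigenvector. The comparison values for fuel, topography and human factors are invented for illustration and are not the experts' judgements from this study.

        # AHP: derive criterion weights from a pairwise comparison matrix (illustrative values).
        import numpy as np

        # Rows/columns: fuel, topography, human factors. A[i, j] = importance of i over j.
        A = np.array([
            [1.0, 3.0, 2.0],
            [1/3, 1.0, 1/2],
            [1/2, 2.0, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()                      # normalise so the weights sum to 1

        # Consistency index (lambda_max - n) / (n - 1); compare against random index 0.58 for n = 3.
        n = A.shape[0]
        ci = (eigvals.real[principal] - n) / (n - 1)
        print("weights:", np.round(weights, 3), "CI:", round(ci, 4))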

  19. Pathfinder. Volume 8, Number 6, November/December 2010

    DTIC Science & Technology

    2010-12-01

    transferring information between multiple systems. Nevertheless, without an end-to-end TCPED process and the associated standards, policies and equipment in...products with partners whose information technology systems vary and are not compatible with those of the NSG, NGA and the U.S. Department of...Pacific. ARF DReaMS is based on Web service technology, where traditional maps, data and any relevant geospatial information are made available

  20. Geospatial Service Platform for Education and Research

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.

    2014-04-01

    We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held in the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as examples to explain the method of building service chains for different demands. After this course, students can learn how to use online service platforms for geospatial resource sharing and problem-solving.
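
    For readers who want a feel for the kind of hands-on exercise described above, the hedged sketch below uses the OWSLib package to list the processes advertised by an OGC Web Processing Service (WPS), which is the usual first step before chaining geoprocessing services. The endpoint URL is a placeholder, not one of the course's OpenRS or GeoPW services.

    ```python
    # A minimal sketch using OWSLib (pip install OWSLib); the endpoint is hypothetical.
    from owslib.wps import WebProcessingService

    def list_wps_processes(endpoint: str):
        """Fetch WPS capabilities and return (identifier, title) pairs of available processes."""
        wps = WebProcessingService(endpoint)  # performs GetCapabilities on construction
        return [(p.identifier, p.title) for p in wps.processes]

    # for identifier, title in list_wps_processes("https://example.org/wps"):
    #     print(identifier, "-", title)
    ```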

  1. A spatial information crawler for OpenGIS WFS

    NASA Astrophysics Data System (ADS)

    Jiang, Jun; Yang, Chong-jun; Ren, Ying-chao

    2008-10-01

    The growth of the internet makes it non-trivial to search for accurate information efficiently. Topical crawlers, which target a certain area, are attracting more and more attention because they can help people find what they need. Furthermore, with the OpenGIS WFS (Web Feature Service) Specification developed by the OGC (Open GIS Consortium), more and more geospatial data providers adopt this protocol to publish their data on the internet. In this case, a crawler that targets WFS servers can help people find geospatial data from those servers. In this paper, we propose a prototype system of a WFS crawler based on the OpenGIS WFS Specification. The crawler architecture, working principles, and the detailed function of each component are introduced. This crawler is capable of discovering WFS servers dynamically and of saving and updating the service contents of the servers. The data collected by the crawler can be supplied to a geospatial data search engine as its data source.
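
    A hedged sketch of the kind of probe such a crawler might run against a candidate endpoint is shown below: it issues a WFS GetCapabilities request with the Python standard library and extracts the advertised feature type names. The endpoint URL is hypothetical, and the error handling a production crawler would need (per-host timeouts, politeness rules, retries) is omitted.

    ```python
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    def list_wfs_feature_types(base_url: str):
        """Fetch a WFS GetCapabilities document and return the advertised feature type names."""
        params = {"service": "WFS", "request": "GetCapabilities", "version": "1.0.0"}
        url = base_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url, timeout=30) as resp:
            tree = ET.parse(resp)
        names = []
        for elem in tree.iter():
            # Match <FeatureType><Name>...</Name></FeatureType> without caring about namespaces.
            if elem.tag.endswith("FeatureType"):
                for child in elem:
                    if child.tag.endswith("Name") and child.text:
                        names.append(child.text.strip())
        return names

    # Hypothetical endpoint; any OGC WFS server could be substituted here.
    # print(list_wfs_feature_types("https://example.org/geoserver/wfs"))
    ```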

  2. Public participation in GIS via mobile applications

    NASA Astrophysics Data System (ADS)

    Brovelli, Maria Antonia; Minghini, Marco; Zamboni, Giorgio

    2016-04-01

    Driven by recent trends in the GIS domain including Volunteered Geographic Information, geo-crowdsourcing and citizen science, and fostered by constant technological advances, the collection and dissemination of geospatial information by ordinary people have become commonplace. However, applications involving user-generated geospatial content show dramatically diversified patterns in terms of incentive, type and level of participation, purpose of the activity, data/metadata provided and data quality. This study contributes to this heterogeneous context by investigating public participation in GIS within the field of mobile-based applications. Results not only show examples of how to technically build GIS applications enabling user collection of and interaction with geospatial data, but they also draw conclusions about the methods and needs of public participation. We describe three projects with different scales and purposes in the context of urban monitoring and planning, and tourism valorisation. In each case, an open source architecture is used, allowing users to exploit their mobile devices to collect georeferenced information. This data is then made publicly available on specific Web viewers. Analysis of user involvement in these projects provides insights into participation patterns, which suggest some generalized conclusions.

  3. Design for Connecting Spatial Data Infrastructures with Sensor Web (SENSDI)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; M., M.

    2016-06-01

    Integrating the Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and an SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The main challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata services and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). In conclusion, integrating the SDI with the Sensor Web is important for geospatial studies. The integration can be achieved by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
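
    To illustrate what connecting SDI interfaces with Sensor Web interfaces can look like at the request level, the sketch below builds a key-value-pair GetObservation request for an OGC SOS 2.0 endpoint using only the standard library. The endpoint, offering and observed-property values are placeholders, not resources from the SENSDI work.

    ```python
    from urllib.parse import urlencode

    def sos_get_observation_url(endpoint: str, offering: str, observed_property: str) -> str:
        """Build a KVP GetObservation request for an OGC SOS 2.0 service (values are placeholders)."""
        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": offering,
            "observedProperty": observed_property,
            "responseFormat": "http://www.opengis.net/om/2.0",
        }
        return endpoint + "?" + urlencode(params)

    # Example (hypothetical endpoint and identifiers):
    print(sos_get_observation_url("https://example.org/sos",
                                  "urn:example:offering:weather",
                                  "urn:example:property:air_temperature"))
    ```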

  4. A Cloud-enabled Service-oriented Spatial Web Portal for Facilitating Arctic Data Discovery, Integration, and Utilization

    NASA Astrophysics Data System (ADS)

    dias, S. B.; Yang, C.; Li, Z.; XIA, J.; Liu, K.; Gui, Z.; Li, W.

    2013-12-01

    Global climate change has become one of the biggest concerns for humankind in the 21st century due to its broad impacts on society and ecosystems across the world. The Arctic has been observed to be one of the regions most vulnerable to climate change. In order to understand the impacts of climate change on the natural environment, ecosystems, biodiversity and other aspects of the Arctic region, and thus to better support the planning and decision-making process, cross-disciplinary research is required to monitor and analyze changes in Arctic regions, such as water, sea level, biodiversity and so on. Conducting such research demands the efficient utilization of various geospatially referenced data, web services and information related to the Arctic region. In this paper, we propose a cloud-enabled and service-oriented Spatial Web Portal (SWP) to support the discovery, integration and utilization of Arctic-related geospatial resources, serving as a building block of the polar CI. This SWP leverages the following techniques: 1) a hybrid searching mechanism combining centralized local search, distributed catalogue search and specialized Internet search for effectively discovering Arctic data and web services from multiple sources; 2) a service-oriented quality-enabled framework for seamless integration and utilization of various geospatial resources; and 3) a cloud-enabled parallel spatial index building approach to facilitate near-real-time resource indexing and searching. A proof-of-concept prototype is developed to demonstrate the feasibility of the proposed SWP, using an example of analyzing the Arctic snow cover change over the past 50 years.

  5. Fast segmentation of satellite images using SLIC, WebGL and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Gorelick, Noel; Eisemann, Elmar; van de Giesen, Nick

    2017-04-01

    Google Earth Engine (GEE) is a parallel geospatial processing platform, which harmonizes access to petabytes of freely available satellite images. It provides a very rich API, allowing development of dedicated algorithms to extract useful geospatial information from these images. At the same time, modern GPUs provide thousands of computing cores, which are mostly not utilized in this context. In recent years, WebGL has become a popular and well-supported API, allowing fast image processing directly in web browsers. In this work, we will evaluate the applicability of WebGL to enable fast segmentation of satellite images. A new implementation of the Simple Linear Iterative Clustering (SLIC) algorithm using GPU shaders will be presented. SLIC is a simple and efficient method to decompose an image into visually homogeneous regions. It adapts a k-means clustering approach to generate superpixels efficiently. While this approach will be hard to scale, due to the significant amount of data to be transferred to the client, it should significantly improve exploratory possibilities and simplify the development of dedicated algorithms for geoscience applications. Our prototype implementation will be used to improve surface water detection of reservoirs using multispectral satellite imagery.
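
    For readers who want to experiment with the SLIC superpixel idea itself, the sketch below runs the CPU reference implementation from scikit-image on a sample image; it is not the WebGL/GPU-shader implementation described in the abstract, and the segment count and compactness values are arbitrary.

    ```python
    # Requires scikit-image and numpy (pip install scikit-image).
    import numpy as np
    from skimage.data import astronaut          # bundled sample image standing in for a satellite scene
    from skimage.segmentation import slic

    image = astronaut()                          # (512, 512, 3) RGB array
    labels = slic(image, n_segments=400, compactness=10.0, start_label=1)

    n_superpixels = np.unique(labels).size
    print("superpixels generated:", n_superpixels)
    print("mean superpixel size (px):", image.shape[0] * image.shape[1] / n_superpixels)
    ```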

  6. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With the rising demand from diverse end-user communities for geospatial tools to handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats directly across applications; and publish output via REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information, and in increasing data accessibility and usability.
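
    The sketch below shows one common way to open and subset a multidimensional NetCDF product with xarray before handing it to a GIS or WMS publishing step; the file name and variable name are hypothetical stand-ins for an ASDC product, not actual holdings.

    ```python
    # Requires xarray and a NetCDF backend such as netCDF4 (pip install xarray netCDF4).
    import xarray as xr

    # Hypothetical multidimensional product with dimensions (time, level, lat, lon).
    ds = xr.open_dataset("example_tropospheric_product.nc")
    print(ds.data_vars)                      # list the available variables

    # Pick one variable, one time step and one vertical level for mapping.
    field = ds["no2_mixing_ratio"].isel(time=0, level=0)
    field.to_netcdf("no2_surface_t0.nc")     # a 2D slice that most geospatial tools ingest easily
    ```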

  7. Sharing on Web 3D Models of Ancient Theatres. A Methodological Workflow

    NASA Astrophysics Data System (ADS)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not yet exist. This is due both to the complexity and dimensions of the geometric models to be published, on the one hand, and to the excessive costs of hardware and software tools, on the other. Against this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and Carthage, and the surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  8. U.S. Geological Survey and Microsoft Cooperative Research and Development Agreement: Geospatial Data Browsing and Retrieval Site on the World Wide Web

    USGS Publications Warehouse

    ,

    1999-01-01

    In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.

  9. U.S. Geological Survey and Microsoft Cooperative Research and Development Agreement: Geospatial Data Browsing and Retrieval Site on the World Wide Web

    USGS Publications Warehouse

    ,

    1998-01-01

    In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.

  10. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings, GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs ranging from a feature-rich data management system to complex scientific tools and workflows.

  11. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    ERIC Educational Resources Information Center

    Kulo, Violet; Bodzin, Alec

    2013-01-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…

  12. Remote Sensing Information Gateway: A free application and web service for fast, convenient, interoperable access to large repositories of atmospheric data

    NASA Astrophysics Data System (ADS)

    Plessel, T.; Szykman, J.; Freeman, M.

    2012-12-01

    EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high-performance, data integrity and security.

  13. RacerGISOnline: Enhancing Learning in Marketing Classes with Web-Based Business GIS

    ERIC Educational Resources Information Center

    Miller, Fred L.; Mangold, W. Glynn; Roach, Joy; Brockway, Gary; Johnston, Timothy; Linnhoff, Stefan; McNeely, Sam; Smith, Kathy; Holmes, Terence

    2014-01-01

    Geographic Information Systems (GIS) offer geospatial analytical tools with great potential for applications in marketing decision making. However, for various reasons, the rate of adoption of these tools in academic marketing programs has lagged behind that of marketing practitioners. RacerGISOnline is an innovative approach to integrating these…

  14. Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.

    2013-12-01

    Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe resources with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable the discovery of scientific data by geospatial bounds or other criteria.
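
    A minimal sketch of this pattern, using rdflib, is shown below: dataset descriptions are expressed as RDF triples with a hypothetical geo-spatial vocabulary and then retrieved with a SPARQL query filtered on a bounding value. The namespace, property names and dataset URI are invented for illustration and are not the vocabulary proposed in the abstract.

    ```python
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, RDFS

    GEO = Namespace("http://example.org/geo#")   # hypothetical geo-spatial vocabulary

    g = Graph()
    dataset = URIRef("http://example.org/data/toa_flux_2012")
    g.add((dataset, RDF.type, GEO.Dataset))
    g.add((dataset, RDFS.label, Literal("TOA flux, 2012")))
    g.add((dataset, GEO.westBound, Literal(-130.0)))
    g.add((dataset, GEO.eastBound, Literal(-60.0)))

    # Find datasets whose western bound lies west of -100 degrees longitude.
    results = g.query("""
        PREFIX geo: <http://example.org/geo#>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?d ?label WHERE {
            ?d a geo:Dataset ;
               rdfs:label ?label ;
               geo:westBound ?w .
            FILTER(?w < -100.0)
        }""")
    for row in results:
        print(row.d, row.label)
    ```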

  15. DIY Geospatial Web Service Chains: GeoChaining Makes It Easy

    NASA Astrophysics Data System (ADS)

    Wu, H.; You, L.; Gui, Z.

    2011-08-01

    It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge. Geospatial practitioners may lack knowledge about web services and service chains. End users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards may help to lower the learning curve and improve modelling skills, so beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed based on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc. into a unified platform. By employing a Virtual Globe, users can intuitively visualize raw data and the results produced by GeoChaining. All of these features allow users to start using GWSC easily, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSCs.

  16. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  17. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain situational awareness of potential and current conditions, possible impacts on populations and infrastructure, and other key information. E-DECIDER and the Clearinghouse have worked together to address many of these issues and challenges to deliver interoperable, authoritative decision support products.

  18. An Open Source Tool to Test Interoperability

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software is reduced. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via command line, deployed in a web servlet container or integrated in a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions for any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of how to perform local tests. It will also provide information about how to participate in the open source code development of TEAM Engine.

  19. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and obtaining valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries, which have direct or indirect impacts on human life. One applicable solution for controlling and managing air quality in megacities is to use sensor web technology to develop monitoring and early warning systems that consider real-time, up-to-date air quality information gathered by spatially distributed sensors. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The presented system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and to discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure to present an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service chaining pattern for monitoring trends in timely sensor observations.
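
    As a concrete illustration of the AQI step mentioned at the end of the abstract, the sketch below applies the standard piecewise-linear AQI formula to an 8-hour CO concentration. The breakpoint table follows the US EPA convention and is given for illustration only; the study may rely on a different national index table and thresholds.

    ```python
    # Illustrative 8-hour CO breakpoints (ppm) mapped to AQI sub-index ranges, US EPA style.
    CO_BREAKPOINTS = [
        (0.0, 4.4, 0, 50),
        (4.5, 9.4, 51, 100),
        (9.5, 12.4, 101, 150),
        (12.5, 15.4, 151, 200),
        (15.5, 30.4, 201, 300),
        (30.5, 40.4, 301, 400),
        (40.5, 50.4, 401, 500),
    ]

    def co_aqi(concentration_ppm: float) -> int:
        """Linear interpolation of the AQI sub-index: I = (Ihi - Ilo) / (Chi - Clo) * (C - Clo) + Ilo."""
        for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
            if c_lo <= concentration_ppm <= c_hi:
                return round((i_hi - i_lo) / (c_hi - c_lo) * (concentration_ppm - c_lo) + i_lo)
        raise ValueError("CO concentration outside the AQI scale")

    # A warning e-mail could be triggered when the index crosses a threshold, e.g. co_aqi(12.0) >= 101.
    print(co_aqi(12.0))
    ```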

  20. Web-Based Mapping Puts the World at Your Fingertips

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's award-winning Earth Resources Laboratory Applications Software (ELAS) package was developed at Stennis Space Center. Since 1978, ELAS has been used worldwide for processing satellite and airborne sensor imagery data of the Earth's surface into readable and usable information. DATASTAR Inc., of Picayune, Mississippi, has used ELAS software in the DATASTAR Image Processing Exploitation (DIPEx) desktop and Internet image processing, analysis, and manipulation software. The new DIPEx Version III includes significant upgrades and improvements compared to its esteemed predecessor. A true World Wide Web application, this product evolved with worldwide geospatial dimensionality and numerous other improvements that seamlessly support the World Wide Web version.

  1. Regional Geology Web Map Application Development: Javascript v2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, Glenn

    This is a milestone report for the FY2017 continuation of the Spent Fuel, Storage, and Waste Technology (SFSWT) program (formerly the Used Fuel Disposal (UFD) program) development of the Regional Geology Web Mapping Application by the Idaho National Laboratory Geospatial Science and Engineering group. This application was developed for general public use and is an interactive web-based application built in Javascript to visualize, reference, and analyze US geological features pertinent to the SFSWT program. This tool is a version upgrade from Adobe FLEX technology. It is designed to facilitate informed decision making regarding the geology of the continental US relevant to the SFSWT program.

  2. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: to create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics; to develop an extensible library that can combine data from multiple sources and render them using multiple backends; and to build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  3. Brokered virtual hubs for facilitating access and use of geospatial Open Data

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano

    2016-04-01

    Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society in the near future. In particular, geospatial Open Data have huge potential for the Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness, enhancing their usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers that hamper geo-information (GI) usage by end-users and application developers. Data and service heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend considerable effort accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus, beyond a certain extent, heterogeneity is irreducible, especially in interdisciplinary contexts. ENERGIC OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonizing service interfaces, metadata and data models, enabling seamless discovery of and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC OD integrates several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. A first version of the ENERGIC OD brokers has been implemented based on the GI-Suite Brokering Framework developed by CNR-IIA, and complemented with other tools under integration and development. It already enables mediated discovery and harmonized access to different geospatial Open Data sources. It is accessible by users as Software-as-a-Service through a browser. Moreover, open APIs and a Javascript library are available for application developers. Six ENERGIC OD Virtual Hubs have currently been deployed: one at the regional level (the Berlin metropolitan area) and five at the national level (in France, Germany, Italy, Poland and Spain). Each Virtual Hub manager decided on the deployment strategy (local infrastructure or commercial Infrastructure-as-a-Service cloud) and the list of connected Open Data sources. The ENERGIC OD Virtual Hubs are under test and validation through the development of ten different mobile and Web applications.

  4. The semantic web and computer vision: old AI meets new AI

    NASA Astrophysics Data System (ADS)

    Mundy, J. L.; Dong, Y.; Gilliam, A.; Wagner, R.

    2018-04-01

    There has been vast progress in linking semantic information across the billions of web pages through the use of ontologies encoded in the Web Ontology Language (OWL) based on the Resource Description Framework (RDF). A prime example is Wikipedia, where the knowledge contained in its more than four million pages is encoded in an ontological database called DBpedia (http://wiki.dbpedia.org/). Web-based query tools can retrieve semantic information from DBpedia encoded in interlinked ontologies that can be accessed using natural language. This paper will show how this vast context can be used to automate the process of querying images and other geospatial data in support of reporting changes in structures and activities. Computer vision algorithms are selected and provided with context based on natural language requests for monitoring and analysis. The resulting reports provide semantically linked observations from images and 3D surface models.

  5. Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example

    NASA Astrophysics Data System (ADS)

    Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang

    2013-04-01

    By applying the advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with a Service Oriented Architecture (SOA), this study has derived global datasets using tile-based input data and implemented a Virtual Web Map Service (VWMS) and a Virtual Web Coverage Service (VWCS) to provide software tools for the visualization and acquisition of global data. Taking the MODIS Normalized Difference Vegetation Index (NDVI) as an example, this study demonstrates the feasibility and efficiency of the proposed approach and illustrates its features.
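
    A hedged sketch of how GDAL can stitch tile-based inputs into a single BigTIFF-backed global product is given below; the tile file names and output names are hypothetical, and the snippet is not the VWMS/VWCS implementation itself.

    ```python
    # Requires the GDAL Python bindings (osgeo).
    from osgeo import gdal

    gdal.UseExceptions()

    # Hypothetical NDVI tiles covering part of the globe.
    tiles = ["ndvi_tile_h10v05.tif", "ndvi_tile_h11v05.tif", "ndvi_tile_h12v05.tif"]

    # A virtual mosaic avoids copying pixels until they are actually requested.
    vrt = gdal.BuildVRT("global_ndvi.vrt", tiles)

    # Materialize the mosaic as a tiled, compressed BigTIFF for downstream map-service access.
    gdal.Translate(
        "global_ndvi_bigtiff.tif",
        vrt,
        creationOptions=["BIGTIFF=YES", "TILED=YES", "COMPRESS=DEFLATE"],
    )
    vrt = None  # flush and close the dataset
    ```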

  6. LSIViewer 2.0 - A Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have predominantly been applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, data rendering and visualization has been the most prevalent approach in Web GIS. While client devices have become functionally more powerful in recent years, the above model has largely ignored this and is still stuck in a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to that of a desktop GIS application, QGIS, on an identical system.

  7. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    NASA Astrophysics Data System (ADS)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprises and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation, geo search and so on. One of the major problems we face is data acquisition. For us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, which provides online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data comes from these nodes, and the different datasets are heterogeneous. Based on the results of the analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. The technical procedure is then researched, and the method used to process different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.
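
    As a concrete illustration of the geometric-representation and spatial-relationship principles listed above, the sketch below compares two candidate road centrelines from different sources using Shapely. It is only a hedged, minimal example; the coordinates, distance thresholds and helper name are hypothetical and do not come from the NGISP workflow.

    ```python
    from shapely.geometry import LineString

    def roads_probably_match(a: LineString, b: LineString, max_hausdorff_m: float = 25.0) -> bool:
        """Heuristic geometric match: small Hausdorff distance and overlapping buffers.

        Assumes both geometries are in the same projected CRS with metre units.
        """
        close_in_shape = a.hausdorff_distance(b) <= max_hausdorff_m
        overlapping = a.buffer(10.0).intersects(b)  # 10 m tolerance, illustrative only
        return close_in_shape and overlapping

    # Two versions of the same road segment digitised by different producers (made-up coordinates).
    road_source_1 = LineString([(0, 0), (100, 5), (200, 12)])
    road_source_2 = LineString([(1, 2), (101, 6), (199, 14)])
    print(roads_probably_match(road_source_1, road_source_2))  # expected: True
    ```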

  8. Rethinking GIS Towards The Vision Of Smart Cities Through CityGML

    NASA Astrophysics Data System (ADS)

    Guney, C.

    2016-10-01

    Smart cities present a substantial growth opportunity in the coming years. The role of GIS in the smart city ecosystem is to integrate different data acquired by sensors in real time and to provide better decisions, more efficiency and improved collaboration. A semantically enriched vision of GIS will help evolve smart cities into tomorrow's much smarter cities, since geospatial/location data and applications may be recognized as a key ingredient of the smart city vision. However, there is a need for the geospatial information community to debate the question "Is 3D Web and mobile GIS technology ready for smart cities?" This research places an emphasis on the challenges of virtual 3D city models on the road to smarter cities.

  9. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  10. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools dealing with the qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required for modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the quality metadata, which is stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities obtained without running the workflow [6], together with a visualization that points out, on the workflow graph itself, where the workflow needs improving with better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK

  11. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at the ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for a mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  12. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on the usage of modern web and GIS technologies. The paper describes a Web GIS for the complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for the comprehensive study of ongoing and possible future climate change and the analysis of its implications, providing full information and computing support for the study of the economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, which realizes the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, and implements an open API for browser-based client software; this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client, developed according to the "single page application" approach and based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/); it implements the application business logic and provides an intuitive user interface similar to the interface of such popular desktop GIS applications as uDig, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and displays map legends and the corresponding metadata information. The specialized Web GIS client contains three basic tiers: a tier of NetCDF metadata in JSON format; a middleware tier of JavaScript objects implementing methods to work with the NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services; and a graphical user interface tier of JavaScript objects realizing the general application business logic. The developed Web GIS provides the launching of computational processing services to support tasks in the area of environmental monitoring, as well as the presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in the process of solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.

  13. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data and limit data users’ ability to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as the Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools publicizes the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. This increase suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
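
    To illustrate the kind of standards-based access described above, the following Python sketch uses OWSLib to request a spatial subset through a WCS endpoint; the service URL, coverage identifier, output format and bounding box are hypothetical placeholders rather than actual ORNL DAAC values.

```python
# Sketch of a WCS GetCoverage request with OWSLib. The URL, coverage name,
# output format and bounding box are hypothetical placeholders.
from owslib.wcs import WebCoverageService

wcs = WebCoverageService("https://example.ornl.gov/cgi-bin/mapserv?map=example.map",
                         version="1.0.0")

# List the coverages offered by the server.
print(list(wcs.contents))

# Request a GeoTIFF subset of a hypothetical biomass coverage over the Amazon basin.
response = wcs.getCoverage(identifier="above_ground_biomass",
                           bbox=(-80.0, -20.0, -45.0, 10.0),
                           crs="EPSG:4326",
                           format="GeoTIFF",
                           resx=0.05, resy=0.05)

with open("biomass_subset.tif", "wb") as f:
    f.write(response.read())
```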

  14. In-field Access to Geoscientific Metadata through GPS-enabled Mobile Phones

    NASA Astrophysics Data System (ADS)

    Hobona, Gobe; Jackson, Mike; Jordan, Colm; Butchart, Ben

    2010-05-01

    Fieldwork is an integral part of much geosciences research. But whilst geoscientists have physical or online access to data collections in the laboratory or at base stations, equivalent in-field access is not standard or straightforward. The increasing availability of mobile internet and GPS-supported mobile phones, however, now provides the basis for addressing this issue. The SPACER project was commissioned by the Rapid Innovation initiative of the UK Joint Information Systems Committee (JISC) to explore the potential for GPS-enabled mobile phones to access geoscientific metadata collections. Metadata collections within the geosciences and the wider geospatial domain can be disseminated through web services based on the Catalogue Service for the Web (CSW) standard of the Open Geospatial Consortium (OGC) - a global grouping of over 380 private, public and academic organisations aiming to improve interoperability between geospatial technologies. CSW offers an XML-over-HTTP interface for querying and retrieval of geospatial metadata. By default, the metadata returned by CSW is based on the ISO 19115 standard and encoded in XML conformant to ISO 19139. The SPACER project has created a prototype application that enables mobile phones to send queries to CSW containing user-defined keywords and coordinates acquired from the GPS devices built into the phones. The prototype has been developed using the free and open source Google Android platform. The mobile application offers views for listing titles, presenting multiple metadata elements and a Google Map with an overlay of the bounding coordinates of datasets. The presentation will describe the architecture and approach applied in the development of the prototype.
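
    A minimal sketch of the kind of CSW query the mobile client issues is shown below, written in Python with OWSLib for clarity rather than in the project's actual Android (Java) code; the catalogue endpoint, keyword and GPS coordinates are assumptions.

```python
# Sketch of a CSW GetRecords query combining a keyword with a GPS-derived
# bounding box. The catalogue endpoint, keyword and coordinates are hypothetical.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import And, BBox, PropertyIsLike

csw = CatalogueServiceWeb("https://example.ac.uk/geonetwork/srv/eng/csw")

# Hypothetical GPS fix and a ~0.1 degree search window around it.
lat, lon, d = 54.97, -1.61, 0.1
bbox = BBox([lon - d, lat - d, lon + d, lat + d])  # [minx, miny, maxx, maxy]
keyword = PropertyIsLike("csw:AnyText", "%borehole%")

csw.getrecords2(constraints=[And([keyword, bbox])], maxrecords=10)

for identifier, record in csw.records.items():
    print(identifier, "-", record.title)
```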

  15. Serving Satellite Remote Sensing Data to User Community through the OGC Interoperability Protocols

    NASA Astrophysics Data System (ADS)

    di, L.; Yang, W.; Bai, Y.

    2005-12-01

    Remote sensing is one of the major methods for collecting geospatial data. A huge amount of remote sensing data has been collected by space agencies and private companies around the world. For example, NASA's Earth Observing System (EOS) is generating more than 3 TB of remote sensing data per day. The data collected by EOS are processed, distributed, archived, and managed by the EOS Data and Information System (EOSDIS). Currently, EOSDIS is managing several petabytes of data. All of those data are not only valuable for global change research, but also useful for local and regional applications and decision making. How to make the data easily accessible to and usable by the user community is one of the key issues for realizing the full potential of these valuable datasets. In the past several years, the Open Geospatial Consortium (OGC) has developed several interoperability protocols aimed at making geospatial data easily accessible to and usable by the user community through the Internet. The protocols particularly relevant to the discovery, access, and integration of multi-source satellite remote sensing data are the Catalog Service for Web (CS/W) and Web Coverage Service (WCS) specifications. The OGC CS/W specifies the interfaces, HTTP protocol bindings, and a framework for defining application profiles required to publish and access digital catalogues of metadata for geographic data, services, and related resource information. The OGC WCS specification defines the interfaces between web-based clients and servers for accessing on-line multi-dimensional, multi-temporal geospatial coverages in an interoperable way. Based on the definitions by OGC and ISO 19123, coverage data include all remote sensing images as well as gridded model outputs. The Laboratory for Advanced Information Technology and Standards (LAITS), George Mason University, has been working for many years on developing and implementing OGC specifications to better serve NASA Earth science data to the user community. We have developed the NWGISS software package, which implements multiple OGC specifications, including OGC WMS, WCS, CS/W, and WFS. As a part of the NASA REASoN GeoBrain project, the NWGISS WCS and CS/W servers have been extended to provide operational access to NASA EOS data at the data pools through OGC protocols and to make both services chainable in web-service chaining. The extensions in the WCS server include the implementation of WCS 1.0.0 and WCS 1.0.2, and the development of a WSDL description of the WCS services. In order to find the on-line EOS data resources, the CS/W server has been extended at the backend to search metadata in NASA ECHO. This presentation reports on those extensions and discusses lessons learned from the implementation. It also discusses the advantages, disadvantages, and future improvements of OGC specifications, particularly the WCS.

  16. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a geographically accurate mosaic of Landsat 7 scenes at 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.

  17. Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III

    2017-12-01

    Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on a single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may result in a less than optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. As users interact with search engines, a wealth of information is already captured in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process and create training data from web logs in real time; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests in geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
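
    For reference, the sketch below shows one common way NDCG at K can be computed from graded relevance labels; it is a generic formulation, not the specific evaluation code used in this work, and the example relevance values are invented.

```python
# Generic NDCG@K computation from graded relevance labels (not the author's code).
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain using the (2^rel - 1) / log2(rank + 1) form."""
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """NDCG@K: DCG of the ranking divided by the DCG of the ideal ordering."""
    ideal_dcg = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal_dcg if ideal_dcg > 0 else 0.0

# Relevance labels of the top results returned by a hypothetical ranker.
ranked_relevances = [3, 2, 3, 0, 1, 2]
print(f"NDCG@5 = {ndcg_at_k(ranked_relevances, 5):.3f}")
```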

  18. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution providing core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies for building a common platform that enables geoscience communities to share and integrate geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies from existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, and cost estimation for cloud operations, are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists considering the adoption of cloud computing.

  19. A WebGIS Framework for Disseminating Processed Remotely Sensed Data on Land Cover Transformations

    NASA Astrophysics Data System (ADS)

    Caradonna, Grazia; Novelli, Antonio; Tarantino, Eufemia; Cefalo, Raffaela; Fratino, Umberto

    2016-06-01

    Mediterranean regions have experienced significant soil degradation over the past decades. In this context, careful land observation using satellite data is crucial for understanding the long-term usage patterns of natural resources, for facilitating their sustainable management, and for monitoring and evaluating potential degradation. Given the environmental and political interest in this problem, there is an urgent need for a centralized repository and mechanism to share geospatial data, information and maps of land change. Geospatial data collection is one of the most important tasks for many users because there are significant barriers to accessing and using data. This limitation could be overcome by implementing a WebGIS through a combination of existing free and open source software for geographic information systems (FOSS4G). In this paper we first discuss methods for collecting raster data in a geodatabase by processing open multi-temporal and multi-scale satellite data aimed at retrieving indicators for the land degradation phenomenon (i.e. land cover/land use analysis, vegetation indices, trend analysis, etc.). We then describe a methodology for designing a WebGIS framework to disseminate information through maps for territory monitoring. Basic WebGIS functions were extended with the help of the PostGIS database and the OpenLayers library. GeoServer was customized to set up and enhance the website functions, developing various advanced queries using PostgreSQL and tools to carry out multi-layer overlay analysis efficiently. The end product is a simple system that provides the opportunity not only to interactively consult but also to download processed remote sensing data.

  20. A Comprehensive Optimization Strategy for Real-time Spatial Feature Sharing and Visual Analytics in Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Li, W.; Shao, H.

    2017-12-01

    For geospatial cyberinfrastructure enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response and decision-making. Especially for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications, their rich geometry and property information facilitates the development of interactive, efficient and intelligent data analysis and visualization applications. However, the big-data issues of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: 1) pre-computed and on-the-fly generalization, which automatically determines the proper simplification level through the introduction of an appropriate distance tolerance (ADT) to meet various visualization requirements while speeding up simplification; 2) a progressive attribute transmission method to reduce data size and therefore the service response time; and 3) compressed data transmission and dynamic adoption of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to widen the use of web services providing vector data to support real-time spatial feature sharing, visual analytics and decision-making.
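
    As a simple illustration of the generalization step (not the authors' implementation), the sketch below applies Shapely's Douglas-Peucker simplification with a distance tolerance, the same kind of parameter the ADT mechanism would choose automatically; the polyline is synthetic.

```python
# Illustrative line generalization with a distance tolerance (Douglas-Peucker
# via Shapely). The tolerance value stands in for an automatically derived ADT.
from shapely.geometry import LineString

# A densely sampled polyline, e.g. a digitized stream reach (synthetic data).
coords = [(x / 100.0, ((x % 7) - 3) / 50.0) for x in range(500)]
line = LineString(coords)

tolerance = 0.02  # in map units; larger values give coarser geometry
simplified = line.simplify(tolerance, preserve_topology=False)

print(f"original vertices:   {len(line.coords)}")
print(f"simplified vertices: {len(simplified.coords)}")
```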

  1. Renewable Energy Data Explorer User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L; Grue, Nicholas W; Tran, July

    This publication provides a user guide for the Renewable Energy Data Explorer and the technical potential tool within the Explorer. The Renewable Energy Data Explorer is a dynamic, web-based geospatial analysis tool that facilitates renewable energy decision-making, investment, and deployment. It brings together renewable energy resource data and other modeled or measured geographic information system (GIS) layers, including land use, weather, environmental, population density, administrative, and grid data.

  2. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish the results for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door to new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. The system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems, such as handling different file formats and data types or harmonising coordinate projections and temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
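
    A rough sketch of the kind of user-defined, on-demand computation such a service supports is shown below using xarray with lazy, chunked evaluation; the file pattern, variable name and spatial subset are hypothetical, and the snippet is not NCI's actual implementation.

```python
# Sketch of on-demand aggregation over a large gridded collection using xarray
# with lazy, chunked evaluation (requires dask). The file pattern, variable name
# and subset are hypothetical; this is not NCI's actual service code.
import xarray as xr

# Open a multi-file dataset lazily; nothing is loaded until computation is requested.
ds = xr.open_mfdataset("reanalysis/2m_temperature_*.nc", chunks={"time": 120})

# User-defined function: a monthly climatology over an Australian subset
# (assumes descending latitude coordinates, as in many reanalysis products).
subset = ds["t2m"].sel(latitude=slice(-10, -45), longitude=slice(110, 155))
climatology = subset.groupby("time.month").mean(dim="time")

result = climatology.compute()  # parallel evaluation happens here
print(result)
```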

  3. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information is included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services, as is envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool are probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation, by separating value and uncertainty maps, for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, and sliders to interactively move through time, space and uncertainty (thresholds).
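
    To make the "adjacent maps" mode concrete, the sketch below derives a value map and an uncertainty map from an ensemble of realisations using NumPy; the synthetic data, the array shape and the choice of the ensemble standard deviation as the uncertainty measure are illustrative assumptions, not the tool's implementation.

```python
# Illustrative derivation of value and uncertainty maps from realisations, as
# would back an "adjacent maps" view. Data are synthetic; using the ensemble
# standard deviation as the uncertainty measure is an assumption.
import numpy as np

rng = np.random.default_rng(42)

# 100 realisations of a 50 x 80 spatial field (e.g. interpolated PM10 concentration).
realisations = rng.normal(loc=25.0, scale=4.0, size=(100, 50, 80))

value_map = realisations.mean(axis=0)        # map shown in the "value" window
uncertainty_map = realisations.std(axis=0)   # map shown in the "uncertainty" window

# Probability of exceeding a threshold, useful for an interactive threshold slider.
exceedance_prob = (realisations > 30.0).mean(axis=0)

print(value_map.shape, uncertainty_map.shape, exceedance_prob.shape)
```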

  4. NASA's Geospatial Interoperability Office (GIO) Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities, the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as the Earth Science Gateway (ESG), the Space Time Tool Kit and the Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency-level requirements, GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC 211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC 211 is the geographic information/geomatics technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.

  5. WPS mediation: An approach to process geospatial data on different computing backends

    NASA Astrophysics Data System (ADS)

    Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas

    2012-10-01

    The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as the environmental sciences. These problems are usually not, or only partially, solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enable WPS. No implementation allows one to seamlessly execute a geoprocessing calculation, following user requirements, on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept that mediates different geospatial and Grid software packages and proposes an extension of the WPS specification through two optional parameters. The applicability of this approach is demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
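
    The sketch below indicates how such a mediated NDVI process might be invoked through the standard WPS interface using OWSLib; the endpoint, process identifier, input names and the "backend" input (standing in for the kind of optional parameter the paper proposes) are all hypothetical placeholders, not the authors' actual interface.

```python
# Hypothetical WPS Execute request for an NDVI process via OWSLib. The endpoint,
# process identifier and input names are placeholders; the "backend" input stands
# in for the kind of optional parameter proposed for selecting a computing backend.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("https://example.org/wps")

inputs = [
    ("red_band", "https://example.org/data/landsat_red.tif"),
    ("nir_band", "https://example.org/data/landsat_nir.tif"),
    ("backend", "grid"),  # hypothetical: stand-alone GIS server, cluster, or Grid
]

execution = wps.execute("ndvi_calculation", inputs, output="ndvi_result")
monitorExecution(execution)  # poll the asynchronous execution until it completes

for output in execution.processOutputs:
    print(output.identifier, output.reference)
```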

  6. Future Teachers' Dispositions toward Teaching with Geospatial Technologies

    ERIC Educational Resources Information Center

    Jo, Injeong

    2016-01-01

    This study examined the effect of a minimal Web-based GIS experience within a semester-long methods course on enhancing preservice teachers' dispositions regarding the use of geospatial technologies for teaching. Fourteen preservice teachers enrolled in a senior-level methods course offered in geography and focused exclusively on how to teach…

  7. Availability of the OGC geoprocessing standard: March 2011 reality check

    NASA Astrophysics Data System (ADS)

    Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier

    2012-10-01

    This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification provides support for standardized Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check on the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether the advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
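
    A minimal availability probe of the kind a focused crawler could perform is sketched below: it issues a GetCapabilities request and checks that a WPS capabilities document comes back. The candidate URL is a placeholder, and the check is a deliberately crude illustration rather than the authors' crawler logic.

```python
# Minimal WPS availability probe: send GetCapabilities and check for a WPS
# capabilities document. The candidate URL is a placeholder.
import requests

def probe_wps(base_url, timeout=15):
    params = {"service": "WPS", "request": "GetCapabilities"}
    try:
        resp = requests.get(base_url, params=params, timeout=timeout)
    except requests.RequestException:
        return False
    return resp.ok and b"Capabilities" in resp.content and b"WPS" in resp.content

candidates = ["http://example.org/cgi-bin/pywps.cgi"]  # URLs found by a crawler
for url in candidates:
    print(url, "->", "available" if probe_wps(url) else "unavailable")
```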

  8. The QuakeSim Project: Web Services for Managing Geophysical Data and Applications

    NASA Astrophysics Data System (ADS)

    Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet

    2008-04-01

    We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.

  9. First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Gonzalez, J.

    2014-04-01

    We present a first prototype of a Web Map Interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for the Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of Planetary GIS users and define key areas of improvement for the future Web PSA User Interface. Its web map interface will provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It will aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology. A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, will enhance this user experience. In addition, users will be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised and analysed directly from GIS software, along with data from other sources. The accompanying figure illustrates how the PSA web map interface and services will fit into a typical Planetary GIS user working environment.

  10. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes of the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increase access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" refers to the geoscience researchers who work separately from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster, et.al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster, et.al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into cloud computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., the OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. Several recent examples show progress in applying SaaS to the geosciences: - NASA's Earth Data Coherent Web has a goal to improve the science user experience using Web Services (e.g. W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW]. - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS). - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web service based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS. [UAF] Tools to access SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability using the web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV and LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in recent years toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g. defining data schemas that allow for data fusion, and addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web will address this large and currently underserved user community.

  11. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades, two main paradigms for resource sharing have emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disaster Management, Environmental Sciences, etc. On the other hand, in several application areas there is a need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies might be a valuable approach to enable advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, allowing such an integration to be built on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely deployed in Europe and beyond, has proven highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, there are some obstacles to overcome to achieve this harmonization. Firstly, a semantics mismatch must be addressed: gLite handles low-level (i.e. close to the machine) concepts like "file", "data", "instruments", "job", etc., while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", "model", etc. Secondly, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous and has no embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based) and has strong embedded security (based on the VO paradigm).
In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWS. To mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of the OWS Phase 6 (OWS-6); (iii) several national, European and international projects investigated different aspects of this integration, developing demonstrators and proofs of concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, and involves four European projects. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; and iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives, G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC 211, OGC OWS, OGF). In its first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Some proofs of concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG has developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of the Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans and means for involving the ESSI community. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010.

  12. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents the research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, jQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computing environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  13. Cloud/web mapping and geoprocessing services - Intelligently linking geoinformation

    NASA Astrophysics Data System (ADS)

    Veenendaal, Bert; Brovelli, Maria Antonia; Wu, Lixin

    2016-04-01

    We live in a world that is alive with information and geographies. "Everything happens somewhere" (Tosta, 2001). This reality is being exposed in the digital earth technologies providing a multi-dimensional, multi-temporal and multi-resolution model of the planet, based on the needs of diverse actors: from scientists to decision makers, communities and citizens (Brovelli et al., 2015). We are building up a geospatial information infrastructure updated in real time thanks to mobile, positioning and sensor observations. Users can navigate, not only through space but also through time, to access historical data and future predictions based on social and/or environmental models. But how do we find the information about certain geographic locations or localities when it is scattered in the cloud and across the web of data behind a diversity of databases, web services and hyperlinked pages? We need to be able to link geoinformation together in order to integrate it, make sense of it, and use it appropriately for managing the world and making decisions.

  14. The geospatial data quality REST API for primary biodiversity data.

    PubMed

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under the GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
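
    Being a REST/JSON service, the API can be called from any HTTP client. The sketch below illustrates such a call with Python's requests library; the query parameter names used here are assumptions for illustration only and should be checked against the API documentation.

```python
# Illustrative call to the Geospatial Data Quality API with Python's requests.
# The parameter names (decimalLatitude, decimalLongitude, countryCode) are
# assumptions for illustration; consult the API documentation for the real ones.
import requests

API_URL = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

record = {
    "decimalLatitude": "42.331",
    "decimalLongitude": "-83.046",
    "countryCode": "US",
}

resp = requests.get(API_URL, params=record, timeout=30)
resp.raise_for_status()
print(resp.json())  # geospatial quality flags for the submitted record
```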

  15. The use of geospatial web services for exchanging utilities data

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Joanna

    2013-04-01

    Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important for many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. The methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). It is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model that defines a common data structure was also built. This model was transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in the system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on a server. Data access was provided by geospatial network services: data search via the Catalogue Service for the Web (CSW) and data collection via the Web Feature Service (WFS). WFS also provides operations for modifying data, for example allowing the utility administrator to update them. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
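
    As an illustration of the data-collection step, the sketch below issues a WFS GetFeature request with OWSLib; the service URL, feature type name and bounding box are hypothetical placeholders, not the actual GESUT service.

```python
# Sketch of collecting utility features through a WFS GetFeature request with
# OWSLib. The service URL, feature type name and bounding box are hypothetical.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService("https://example.gov.pl/geoserver/gesut/wfs", version="1.1.0")

# Fetch GML for a hypothetical water-supply network layer within a small extent.
response = wfs.getfeature(typename=["gesut:water_supply_network"],
                          bbox=(20.90, 52.15, 21.10, 52.30),
                          srsname="EPSG:4326")

with open("water_supply_network.gml", "wb") as f:
    f.write(response.read())
```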

  16. Development and deployment of a water-crop-nutrient simulation model embedded in a web application

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Coppola, Antonio; Manna, Piero; Orefice, Nadia; Terribile, Fabio

    2016-04-01

    Scientific research on environmental and agricultural issues has long spent considerable effort on the development and application of models for prediction and simulation in spatial and temporal domains. This is done by studying and observing natural processes (e.g. rainfall, water and chemical transport in soils, crop growth) whose spatiotemporal behavior can be reproduced, for instance to predict irrigation and fertilizer requirements and yield quantities/qualities. In this work a mechanistic model to simulate water flow and solute transport in the soil-plant-atmosphere continuum is presented. This desktop computer program was written according to the specific requirements of developing web applications. The model is capable of addressing the following issues together: (a) water balance and (b) solute transport; (c) crop modelling; (d) GIS interoperability; (e) embeddability in web-based geospatial Decision Support Systems (DSS); (f) adaptability at different scales of application; and (g) ease of code modification. We maintained the desktop character in order to further develop (e.g. integrate novel features) and run the key program modules for testing and validation purposes, but we also developed a middleware component to allow the model to run simulations directly over the web, without any software needing to be installed. The GIS capabilities allow the web application to run simulations in a user-defined region of interest (delimited over a geographical map) without the need to specify the proper combination of model parameters. This is possible because the geospatial database collects information on pedology, climate, crop parameters and soil hydraulic characteristics. Pedological attributes include the spatial distribution of key soil data such as soil profile horizons and texture. Further, hydrological parameters are selected according to the knowledge of the spatial distribution of soils. The availability and definition of these attributes in the geospatial domain allow simulation outputs at different spatial scales. Two different applications were implemented using the same framework but with different configurations of the software pieces making up the physically based modelling chain: an irrigation tool simulating water requirements and their timing, and a fertilization tool for optimizing, in particular, mineral nitrogen applications.

  17. Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Thomas, R.; Buck, J. J. H.

    2015-12-01

    As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data both in traditional operational modes and in innovative "Big Data" applications, the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement (SWE) [2]. The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of the ongoing implementation of this strategy. References: 1. World Wide Web Consortium (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7 April 2015. 2. Open Geospatial Consortium (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8 October 2014.
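
    A minimal sketch of retrieving observations from such a Sensor Observation Service over the standard SOS 2.0 key-value-pair binding is shown below; the endpoint, offering and observed-property identifiers are hypothetical placeholders rather than actual BODC identifiers.

```python
# Minimal SOS 2.0 GetObservation request over the key-value-pair (KVP) binding.
# The endpoint, offering and observedProperty identifiers are hypothetical.
import requests

SOS_URL = "https://example.bodc.ac.uk/sos/service"

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "glider_sg510_deployment_3",
    "observedProperty": "sea_water_temperature",
    "temporalFilter": "om:phenomenonTime,2015-06-01T00:00:00Z/2015-06-02T00:00:00Z",
}

resp = requests.get(SOS_URL, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])  # O&M-encoded observations (XML)
```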

  18. Model My Watershed and BiG CZ Data Portal: Interactive geospatial analysis and hydrological modeling web applications that leverage the Amazon cloud for scientists, resource managers and students

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.

    2016-12-01

    The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application that provides scientists with intuitive, high-performance discovery, visualization, access and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for the chosen area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), a high-performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis) and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can apply "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling scientists, resource managers and students alike to share data, analyses and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system for enabling citizen scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems, so that they can be reshared using WaterOneFlow web services.

  19. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal component analysis and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and support machine learning discoveries, rendering diverse types of geospatial data and facilitating interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit and PostgreSQL, as well as other custom-built software modules.
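
    As a generic, library-level illustration of two of the analytics named above (and not the platform's own code), the following sketch applies principal component analysis and k-means clustering to synthetic time series with scikit-learn.

    ```python
    # Generic illustration of two analytics mentioned above (PCA and clustering)
    # applied to synthetic time series; not the EWN platform's code.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # 500 synthetic "grid cells", each with a 120-month time series.
    series = rng.normal(size=(500, 120)) + np.sin(np.linspace(0, 12 * np.pi, 120))

    # Reduce each series to a handful of principal components...
    components = PCA(n_components=5).fit_transform(series)

    # ...then group cells with similar temporal behaviour.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(components)
    print(np.bincount(labels))  # cluster sizes
    ```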

  20. Model My Watershed: A high-performance cloud application for public engagement, watershed modeling and conservation decision support

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.

    2017-12-01

    The Model My Watershed Web app (https://app.wikiwatershed.org/) and the BiG-CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization and analysis of geospatial data in an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. BiG CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest, via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrological Unit (HUC), or uploading a custom polygon. Both Web apps visualize and provide summary statistics of land use, soil groups, streams, climate and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. BiG CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, and also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.
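
    To illustrate the data-sharing pattern described above, the hedged sketch below pulls recent observations for one monitoring site from a generic REST endpoint; the URL, parameters and response shape are assumptions, not the actual EnviroDIY or CUAHSI interfaces.

    ```python
    # Hypothetical sketch (endpoint and parameters are assumptions, not the real
    # EnviroDIY/CUAHSI APIs): pulling recent sensor observations for one site so
    # they can be examined alongside model output.
    import requests

    PORTAL = "https://example.org/api/observations"  # hypothetical endpoint

    resp = requests.get(PORTAL, params={
        "site_code": "ENVIRODIY_DEMO_01",   # hypothetical site identifier
        "variable": "water_temperature",
        "start": "2017-09-01",
        "end": "2017-09-07",
    }, timeout=30)
    resp.raise_for_status()

    for record in resp.json():              # assumed: list of {"time": ..., "value": ...}
        print(record["time"], record["value"])
    ```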

  1. Real-Time Geospatial Data Viewer (RETIGO): Web-Based Tool for Researchers and Citizen Scientists to Explore their Air Measurements

    EPA Science Inventory

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data re...

  2. EnviroAtlas: A New Geospatial Tool to Foster Ecosystem Services Science and Resource Management

    EPA Science Inventory

    In this article we present EnviroAtlas, a web-based, open access tool that seeks to meet a range of needs by bringing together environmental, economic and demographic data in an ecosystem services framework. Within EnviroAtlas, there are three primary types of geospatial data: r...

  3. Bim-Gis Integrated Geospatial Information Model Using Semantic Web and Rdf Graphs

    NASA Astrophysics Data System (ADS)

    Hor, A.-H.; Jadidi, A.; Sohn, G.

    2016-06-01

    In recent years, 3D virtual indoor/outdoor urban modelling has become a key spatial information framework for many civil and engineering applications such as evacuation planning, emergency and facility management. For accomplishing such sophisticated decision tasks, there is a large demand for building multi-scale and multi-sourced 3D urban models. Currently, Building Information Models (BIM) and Geographical Information Systems (GIS) are broadly used as the modelling sources. However, data sharing and the exchange of information between the two modelling domains remain a huge challenge: existing syntactic or semantic approaches do not fully support the exchange of rich semantic and geometric information from BIM into GIS, or vice versa. This paper proposes a novel approach for integrating BIM and GIS using semantic web technologies and Resource Description Framework (RDF) graphs. The novelty of the proposed solution comes from the benefits of integrating BIM and GIS technologies into one unified model, the so-called Integrated Geospatial Information Model (IGIM). The proposed approach consists of three main modules: construction of BIM-RDF and GIS-RDF graphs, integration of the two RDF graphs, and querying of information through the IGIM-RDF graph using SPARQL. The IGIM generates queries from both the BIM and GIS RDF graphs, resulting in a semantically integrated model with entities representing both BIM classes and GIS feature objects with respect to the target client application. The linkage between BIM-RDF and GIS-RDF is achieved through SPARQL endpoints and defined by a query using a set of datasets and entity classes with complementary properties, relationships and geometries. To validate the proposed approach and its performance, a case study was also carried out using the IGIM system design.
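
    The following self-contained sketch illustrates the core idea of querying BIM- and GIS-derived triples through one RDF graph with SPARQL; the namespaces and properties are illustrative placeholders, not the IGIM schema from the paper.

    ```python
    # A minimal, self-contained sketch of the idea (not the paper's IGIM schema):
    # load BIM-derived and GIS-derived triples into one RDF graph and query across
    # them with SPARQL. All namespaces and properties here are illustrative.
    from rdflib import Graph, Namespace, Literal, RDF

    BIM = Namespace("http://example.org/bim#")   # hypothetical vocabulary
    GIS = Namespace("http://example.org/gis#")   # hypothetical vocabulary

    g = Graph()
    # A BIM entity (a door in a building model)...
    g.add((BIM.door_12, RDF.type, BIM.Door))
    g.add((BIM.door_12, BIM.partOf, BIM.building_A))
    # ...and a GIS feature describing the same building's footprint.
    g.add((GIS.footprint_7, RDF.type, GIS.BuildingFootprint))
    g.add((GIS.footprint_7, GIS.represents, BIM.building_A))
    g.add((GIS.footprint_7, GIS.areaSqm, Literal(420.0)))

    # One SPARQL query spanning both "domains" via the shared building resource.
    results = g.query("""
        PREFIX bim: <http://example.org/bim#>
        PREFIX gis: <http://example.org/gis#>
        SELECT ?door ?area WHERE {
            ?door a bim:Door ; bim:partOf ?building .
            ?fp gis:represents ?building ; gis:areaSqm ?area .
        }
    """)
    for door, area in results:
        print(door, area)
    ```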

  4. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    NASA Astrophysics Data System (ADS)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems do not respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise), further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and the long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers about the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, and mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.

  5. A web service for service composition to aid geospatial modelers

    NASA Astrophysics Data System (ADS)

    Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.

    2012-04-01

    The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over recent years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved to community-wide modeling frameworks, to Component-Based Architecture solutions and, more recently, have started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far, the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. Following the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability required by a given modeling workflow may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, users can be freed from the need for a composition infrastructure and relieved of the technicalities of workflow definitions (type matching, identification of external service endpoints, binding issues, etc.), and can focus on their intended application. Moreover, a user may submit an incomplete workflow definition and leverage CaaS recommendations (which may derive from an aggregated knowledge base of user feedback, underpinned by Web 2.0 technologies) to execute it. This is of particular interest for multidisciplinary scientific contexts, where different communities may benefit from each other's knowledge through model chaining. Indeed, the CaaS approach is presented as an attempt to combine the recent advances in service-oriented computing with collaborative research principles, and social network information in general. Arguably, it may be considered a fundamental capability of the Model Web. The CaaS concept is being investigated in several application scenarios identified in the FP7 UncertWeb and EuroGEOSS projects. Key aspects of the described CaaS solution are: it provides a standard WPS interface for invoking Business Processes and allows on-the-fly recursive composition of Business Processes into other Composite Processes; it is designed according to the extended SOA (broker-based) and System-of-Systems approaches, to support the reuse and integration of existing resources, in compliance with the GEOSS Model Web architecture.
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
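
    As a hedged illustration of the standard WPS interface mentioned above (the endpoint and process identifier are placeholders, not the CaaS prototype's), a client can discover and inspect published processes with OWSLib:

    ```python
    # A hedged sketch of the kind of client interaction a CaaS/WPS facade enables;
    # the service URL and process identifier are hypothetical.
    from owslib.wps import WebProcessingService

    wps = WebProcessingService("https://example.org/wps")  # hypothetical endpoint

    # List the processes the service publishes.
    for process in wps.processes:
        print(process.identifier, "-", process.title)

    # Inspect the inputs/outputs of one workflow exposed as a WPS process.
    description = wps.describeprocess("composite:runoff_chain")  # hypothetical id
    for inp in description.dataInputs:
        print("input:", inp.identifier)
    for out in description.processOutputs:
        print("output:", out.identifier)
    ```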

  6. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    NASA Technical Reports Server (NTRS)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    Background: SERVIR -- the Regional Visualization and Monitoring System -- helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions address eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms. Given that the SERVIR regions are experiencing increased stress from climate variability greater than that seen in historical observations, SERVIR provides information to support the development of adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.

  7. Open Technology Approaches to Geospatial Interface Design

    NASA Astrophysics Data System (ADS)

    Crevensten, B.; Simmons, D.; Alaska Satellite Facility

    2011-12-01

    What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth scientists. This demonstration will review developments in web application technologies and recurring patterns of interaction design for the exploration and discovery of geospatial data through Vertex, ASF's data portal interface, a project utilizing current open web application standards and technologies including HTML5, jQuery UI, Backbone.js and the Jasmine unit testing framework.

  8. A Security Architecture for Grid-enabling OGC Web Services

    NASA Astrophysics Data System (ADS)

    Angelini, Valerio; Petronzio, Luca

    2010-05-01

    In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity, in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web based) with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This makes it possible to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS security system, access restrictions are applied making use of the enhanced geospatial capabilities specified by OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks whether the user is entitled to access a Grid infrastructure. In that case his/her identity is translated into a temporary Grid security token using the Short Lived Credential Services (IGTF standard).
In our case, for the specific gLite Grid infrastructure, some information (VOMS attributes) is plugged into the Grid security token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and is also used by the various gLite middleware elements to verify the user's grants. Based on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other Web authentication services such as OpenID, Kerberos and WS-Federation.

  9. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2016-06-01

    In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed through geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available in multi-date, multi-stage, multi-sensor and multi-resolution forms. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in three-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, which is also known as geovisualisation. As a result, there has been increasing research interest directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.
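
    As a generic example of the kind of digital image processing step mentioned above (not the study's actual workflow), the sketch below computes NDVI from the red and near-infrared bands of a satellite scene; the file name and band order are assumptions about the input product.

    ```python
    # Generic illustration of a satellite image processing step (NDVI), not the
    # study's workflow. File path and band order are assumptions.
    import numpy as np
    import rasterio

    with rasterio.open("scene.tif") as src:        # hypothetical multiband scene
        red = src.read(3).astype("float32")        # assumed band order
        nir = src.read(4).astype("float32")

    # Avoid division by zero where both bands are empty.
    ndvi = (nir - red) / np.where((nir + red) == 0, 1, nir + red)
    print("mean NDVI:", float(np.nanmean(ndvi)))
    ```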

  10. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.

    2012-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience are diverse and include research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. BAID has been used to: optimize research site choice; reduce duplication of science effort; discover complementary and potentially detrimental research activities in an area of scientific interest; re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; exchange knowledge across disciplines and generations; facilitate communication between western science and traditional ecological knowledge; provide local residents with access to science data that facilitates adaptation to arctic change; and educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1. Collecting data on research activities, generating geospatial data, and providing mapping support. 2. Maintaining, updating and innovating the existing suite of BAID geobrowsers. 3. Maintaining and updating the aging server hardware supporting BAID. 4. Adding interoperability with other CI using workflows, controlled vocabularies and web services. 5. Linking BAID to data archives at the National Snow and Ice Data Center (NSIDC). 6. Developing a wireless sensor network that provides web-based interaction with near-real-time climate and other data. 7. Training the next generation of environmental and computer scientists and conducting outreach.

  11. Cyber-physical geographical information service-enabled control of diverse in-situ sensors.

    PubMed

    Chen, Nengcheng; Xiao, Changjiang; Pu, Fangling; Wang, Xiaolei; Wang, Chao; Wang, Zhili; Gong, Jianya

    2015-01-23

    Realization of open online control of diverse in-situ sensors is a challenge. This paper proposes a Cyber-Physical Geographical Information Service-enabled method for the control of diverse in-situ sensors, based on location-based instant sensing of the sensors, which provides closed-loop feedback. The method adopts the concepts and technologies of newly developed cyber-physical systems (CPSs) to combine control with sensing, communication, and computation; takes advantage of geographical information services, such as those provided by Tianditu (a basic geographic information service platform in China) and Sensor Web services, to establish geo-sensor applications; and builds well-designed human-machine interfaces (HMIs) to support online and open interactions between human beings and physical sensors through cyberspace. The method was tested with experiments carried out in two geographically distributed scientific experimental fields, the Baoxie Sensor Web Experimental Field in Wuhan city and the Yemaomian Landslide Monitoring Station in the Three Gorges, with three typical sensors chosen as representatives, using the prototype system Geospatial Sensor Web Common Service Platform. The results show that the proposed method is an open, online, closed-loop means of control.
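
    A heavily hedged sketch of the closed-loop idea described above is given below; the endpoints, payloads and thresholds are invented placeholders and do not reflect the paper's platform.

    ```python
    # A hedged, generic sketch of a closed-loop control cycle over the web (all
    # URLs and fields are hypothetical; this is not the paper's platform code):
    # read the latest observation, decide, then send a command back to the sensor.
    import time
    import requests

    OBSERVE = "https://example.org/sensors/station01/latest"    # hypothetical
    COMMAND = "https://example.org/sensors/station01/commands"  # hypothetical

    for _ in range(3):                      # a few control iterations
        obs = requests.get(OBSERVE, timeout=10).json()
        if obs.get("displacement_mm", 0) > 5:          # assumed measurement field
            # Feedback: raise the sampling rate when movement is detected.
            requests.post(COMMAND, json={"sampling_interval_s": 60}, timeout=10)
        time.sleep(5)
    ```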

  12. Cyber-Physical Geographical Information Service-Enabled Control of Diverse In-Situ Sensors

    PubMed Central

    Chen, Nengcheng; Xiao, Changjiang; Pu, Fangling; Wang, Xiaolei; Wang, Chao; Wang, Zhili; Gong, Jianya

    2015-01-01

    Realization of open online control of diverse in-situ sensors is a challenge. This paper proposes a Cyber-Physical Geographical Information Service-enabled method for the control of diverse in-situ sensors, based on location-based instant sensing of the sensors, which provides closed-loop feedback. The method adopts the concepts and technologies of newly developed cyber-physical systems (CPSs) to combine control with sensing, communication, and computation; takes advantage of geographical information services, such as those provided by Tianditu (a basic geographic information service platform in China) and Sensor Web services, to establish geo-sensor applications; and builds well-designed human-machine interfaces (HMIs) to support online and open interactions between human beings and physical sensors through cyberspace. The method was tested with experiments carried out in two geographically distributed scientific experimental fields, the Baoxie Sensor Web Experimental Field in Wuhan city and the Yemaomian Landslide Monitoring Station in the Three Gorges, with three typical sensors chosen as representatives, using the prototype system Geospatial Sensor Web Common Service Platform. The results show that the proposed method is an open, online, closed-loop means of control. PMID:25625906

  13. Statistical Validation of a Web-Based GIS Application and Its Applicability to Cardiovascular-Related Studies.

    PubMed

    Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad

    2015-12-22

    There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for the cardiovascular-related geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile of homes. Walk score was also significantly associated with the diversity index (r = 0.138, p = 0.0023), median household income (r = -0.181; p < 0.0001), and owner-occupied rates (r = -0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, or population growth rate. Our data demonstrate that the geospatial data generated by the web-based application were internally consistent and showed satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on disease and/or the long-term effects of clinical trials.
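
    For reference, the two statistics reported above can be computed as follows on synthetic data (this is not the study's dataset or code): Cronbach's alpha for internal consistency and a Pearson correlation between two geospatial measures.

    ```python
    # Reference implementations of the two statistics reported above, run on
    # synthetic data rather than the study's dataset.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: array of shape (n_subjects, n_items)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=500)
    # Five correlated synthetic "geospatial" items (e.g. walk score, hospitals, parks).
    items = np.column_stack([latent + 0.3 * rng.normal(size=500) for _ in range(5)])

    print("Cronbach's alpha:", round(cronbach_alpha(items), 3))
    print("Pearson r (item 1 vs item 2):",
          round(np.corrcoef(items[:, 0], items[:, 1])[0, 1], 3))
    ```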

  14. Visualization and interaction tools for aerial photograph mosaics

    NASA Astrophysics Data System (ADS)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophoto mosaics. The main applications of this digital spatial library are pointed out and discussed, namely for the education, professional, and tourism markets. Future developments are considered, related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  15. TopoCad - A unified system for geospatial data and services

    NASA Astrophysics Data System (ADS)

    Felus, Y. A.; Sagi, Y.; Regev, R.; Keinan, E.

    2013-10-01

    "E-government" is a leading trend in public sector activities in recent years. The Survey of Israel set as a vision to provide all of its services and datasets online. The TopoCad system is the latest software tool developed in order to unify a number of services and databases into one on-line and user friendly system. The TopoCad system is based on Web 1.0 technology; hence the customer is only a consumer of data. All data and services are accessible for the surveyors and geo-information professional in an easy and comfortable way. The future lies in Web 2.0 and Web 3.0 technologies through which professionals can upload their own data for quality control and future assimilation with the national database. A key issue in the development of this complex system was to implement a simple and easy (comfortable) user experience (UX). The user interface employs natural language dialog box in order to understand the user requirements. The system then links spatial data with alpha-numeric data in a flawless manner. The operation of the TopoCad requires no user guide or training. It is intuitive and self-taught. The system utilizes semantic engines and machine understanding technologies to link records from diverse databases in a meaningful way. Thus, the next generation of TopoCad will include five main modules: users and projects information, coordinates transformations and calculations services, geospatial data quality control, linking governmental systems and databases, smart forms and applications. The article describes the first stage of the TopoCad system and gives an overview of its future development.

  16. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    NASA Astrophysics Data System (ADS)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a two-dimensional, simplified representation of reality, and though they are successfully addressing the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features, such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, therefore allowing traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  17. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    NASA Astrophysics Data System (ADS)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures for its revision. CGD ensures the proper collection, processing, storage, transfer and display of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel, CGD is run on the MoD intranet; for other users outside the MoD, it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and is run on a public web site. CGD is a global set of geospatial information. CGD is a vector computer model which completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken out to the field for inspection and additional gathering of object properties across the whole mapped area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD provides the opportunity to know the territory comprehensively in all three spatial dimensions. Every entity in CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing and cartographic purposes, as well as for various special-purpose mapping, and has the ambition to cover the needs of not only the MoD but also to become a reference model for the national geographical infrastructure.

  18. Geospatial economics of the woody biomass supply in Kansas -- A case study

    Treesearch

    Olga Khaliukova; Darci Paull; Sarah L. Lewis-Gonzales; Nicolas Andre; Larry E. Biles; Timothy M. Young; James H. Perdue

    2017-01-01

    This research assessed the geospatial supply of cellulosic feedstocks for potential mill sites in Kansas (KS), with procurement zones extending to Arkansas (AR), Iowa (IA), Missouri (MO), Oklahoma (OK), and Nebraska (NE). A web-based modeling system, the Kansas Biomass Supply Assessment Tool, was developed to identify least-cost sourcing areas for logging residues and...

  19. Publications - PDF 98-37B v. 1.1 | Alaska Division of Geological &

    Science.gov Websites

    Digital geospatial data: Tanana A-1 and A-2 quadrangles bedrock geology. Keywords: bedrock; bedrock geologic map; bedrock geology; CIPW norms; cerium; dome; economic geology; faults.

  20. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries, to solve geospatial distributed computing problems in heterogeneous GIS environments. First, a geospatial query process schema for distributed computing is presented, together with a method for the equivalent transformation of a global geospatial query into distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment consisting of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
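
    The general idea of rewriting one global query into equivalent local queries and merging the results can be sketched as below; the peer endpoints, SQL schema and response format are assumptions for illustration, not the paper's implementation.

    ```python
    # A hedged sketch of the general idea (the peer API and SQL schema are
    # assumptions, not the paper's system): rewrite one global bounding-box query
    # into equivalent local queries, send them to autonomous peers, merge results.
    import requests

    PEERS = ["https://peer-a.example.org/query",   # hypothetical peer endpoints
             "https://peer-b.example.org/query"]

    def local_sql(bbox):
        # Equivalent local form of the global query, expressed at SQL level.
        return ("SELECT id, name, lon, lat FROM poi "
                f"WHERE lon BETWEEN {bbox[0]} AND {bbox[2]} "
                f"AND lat BETWEEN {bbox[1]} AND {bbox[3]}")

    def global_query(bbox):
        merged = []
        for peer in PEERS:
            resp = requests.post(peer, json={"sql": local_sql(bbox)}, timeout=30)
            resp.raise_for_status()
            merged.extend(resp.json()["rows"])      # assumed response shape
        # De-duplicate features replicated on several peers.
        return {row["id"]: row for row in merged}.values()

    for row in global_query((114.2, 30.4, 114.5, 30.7)):
        print(row["name"])
    ```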

  1. EPA Facility Registry Service (FRS): ICIS

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Integrated Compliance Information System (ICIS). When complete, ICIS will provide a database that will contain integrated enforcement and compliance information across most of EPA's programs. The vision for ICIS is to replace EPA's independent databases that contain enforcement data with a single repository for that information. Currently, ICIS contains all Federal Administrative and Judicial enforcement actions and a subset of the Permit Compliance System (PCS), which supports the National Pollutant Discharge Elimination System (NPDES). ICIS exchanges non-sensitive enforcement/compliance activities, non-sensitive formal enforcement actions and NPDES information with FRS. This web feature service contains the enforcement/compliance activities and formal enforcement action related facilities; the NPDES facilities are contained in the PCS_NPDES web feature service. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on f

  2. Architecture of the local spatial data infrastructure for regional climate change research

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny

    2013-04-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in the modeling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing the search of geoinformation resources (datasets and services) using metadata catalogs, the production of geospatial data selections by their parameters (data access functionality), and the management of services and applications for cartographical visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and the syntactic and semantic differences among various datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. Thus, the architecture presented includes: 1. An effective model (in terms of search, access, retrieval and subsequent statistical processing) for storing big sets of regional georeferenced data, allowing, in particular, the storage of frequently used values (such as monthly and annual climate change indices), thus providing different temporal views of the datasets 2. The general architecture of the corresponding software components handling geospatial datasets within the storage model 3. A metadata catalog describing in detail, using the ISO 19115 and CF-convention standards, the datasets used in climate research, as a basic element of the spatial data infrastructure, as well as its publication according to the OGC CSW (Catalogue Service for the Web) specification 4. Computational and mapping web services for working with geospatial datasets, based on the OWS (OGC Web Services) standards WMS, WFS and WPS 5. A geoportal as a key element of the thematic regional spatial data infrastructure, also providing a software framework for the development of dedicated web applications To realize web mapping services, GeoServer software is used, since it provides a natural WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork opensource (http://geonetwork-opensource.org) product is planned to be used, as it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software has been selected: 1. The OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser 2. The GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interfaces of popular desktop GIS applications such as uDig, QuantumGIS, etc. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
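
    As a minimal client-side illustration of the OWS standards listed above (with a placeholder endpoint and layer name rather than the described geoportal), a WMS map image can be requested with OWSLib:

    ```python
    # A minimal WMS client sketch; the endpoint and layer name are hypothetical.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")

    print(list(wms.contents))               # layers advertised in the capabilities

    img = wms.getmap(
        layers=["climate:annual_mean_temperature"],   # hypothetical layer
        srs="EPSG:4326",
        bbox=(60.0, 50.0, 120.0, 75.0),               # lon/lat extent
        size=(800, 400),
        format="image/png",
    )
    with open("map.png", "wb") as f:
        f.write(img.read())
    ```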

  3. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the Earth and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services made use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS) and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and the layering of lunar map images from the Internet via mobile devices such as Apple's iPad.

  4. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices have started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface that allows requests for geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is solely based on Free and Open Source Software and leans on the JavaScript API WebGL, which allows the interactive rendering of 2D and 3D graphics, by means of GPU-accelerated physics and image processing, as part of the web page canvas without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will ease spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. They will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in the decision processes (e-Governance). Yet the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).

  5. Spatial epidemiology in zoonotic parasitic diseases: insights gained at the 1st International Symposium on Geospatial Health in Lijiang, China, 2007

    PubMed Central

    Zhou, Xiao-Nong; Lv, Shan; Yang, Guo-Jing; Kristensen, Thomas K; Bergquist, N Robert; Utzinger, Jürg; Malone, John B

    2009-01-01

    The 1st International Symposium on Geospatial Health was convened in Lijiang, Yunnan province, People's Republic of China, from 8 to 9 September 2007. The objective was to review progress made with the application of spatial techniques to zoonotic parasitic diseases, particularly in Southeast Asia. The symposium featured 71 presentations covering soil-transmitted and water-borne helminth infections, as well as arthropod-borne diseases such as leishmaniasis, malaria and lymphatic filariasis. The work made public on this occasion is briefly summarized here to highlight the advances made and to put forth research priorities in this area. Approaches such as geographical information systems (GIS), global positioning systems (GPS) and remote sensing (RS), including spatial statistics, web-based GIS and map visualization of field investigations, figured prominently in the presentations. PMID:19193214

  6. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties, we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
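
    A hedged sketch of consuming such a server from a script is shown below; the endpoint and coverage identifier are placeholders, not TOPS layer names.

    ```python
    # A hedged WCS client sketch; the endpoint and coverage identifier are
    # hypothetical, not the TOPS server's actual layer names.
    from owslib.wcs import WebCoverageService

    wcs = WebCoverageService("https://example.org/wcs", version="1.0.0")
    print(list(wcs.contents))                       # advertised coverages

    response = wcs.getCoverage(
        identifier="gross_primary_production",      # hypothetical coverage id
        bbox=(-125.0, 32.0, -114.0, 42.0),
        crs="EPSG:4326",
        format="GeoTIFF",
        resx=0.05, resy=0.05,
    )
    with open("gpp.tif", "wb") as f:
        f.write(response.read())
    ```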

  7. The New USGS Volcano Hazards Program Web Site

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.

    2008-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written, using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: the Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system, and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.

  8. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
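
    As a rough illustration of the kind of spatial query the PES accepts, the sketch below posts a GeoJSON polygon and a target year to a REST endpoint and reads back a population estimate. The URL, parameter names, and response shape are assumptions made for illustration; they are not the documented SEDAC API.

```python
import requests

# Hypothetical endpoint and request schema; the record describes a REST
# interface accepting a user-defined polygon and year but not its exact form.
PES_URL = "https://example.org/sedac/pes/v2/estimate"

polygon = {  # GeoJSON-style polygon covering the area of interest
    "type": "Polygon",
    "coordinates": [[
        [-74.05, 40.68], [-73.90, 40.68],
        [-73.90, 40.82], [-74.05, 40.82],
        [-74.05, 40.68],
    ]],
}

resp = requests.post(PES_URL, json={"geometry": polygon, "year": 2015}, timeout=30)
resp.raise_for_status()
print(resp.json())  # expected: a population estimate plus accuracy measures
```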

  9. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  10. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
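
    The "discover" step of the provide-register-discover-consume paradigm can be exercised against any OGC Catalogue Service. A minimal OWSLib sketch follows; the catalogue URL is a hypothetical placeholder, and the query simply looks for records whose title mentions AIRS.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical catalogue endpoint; the abstract names no specific CSW URL.
csw = CatalogueServiceWeb("https://example.org/catalogue/csw")

# Discover registered records whose title mentions AIRS.
query = PropertyIsLike("dc:title", "%AIRS%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, rec.title)
```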

  11. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    NASA Astrophysics Data System (ADS)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  12. Geospatial Data Fusion and Multigroup Decision Support for Surface Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Osidele, O.; Green, R. T.; Xie, H.

    2010-12-01

    Social networking and social media have gained significant popularity and brought fundamental changes to many facets of our everyday life. With the ever-increasing adoption of GPS-enabled gadgets and technology, location-based content is likely to play a central role in social networking sites. While location-based content is not new to the geoscience community, where geographic information systems (GIS) are extensively used, the delivery of useful geospatial data to targeted user groups for decision support is new. Decision makers and modelers ought to make more effective use of the new web-based tools to expand the scope of environmental awareness education, public outreach, and stakeholder interaction. Environmental decision processes are often rife with uncertainty and controversy, requiring integration of multiple sources of information and compromises between diverse interests. Fusing multisource, multiscale environmental data for multigroup decision support is a challenging task. Toward this goal, a multigroup decision support platform should strive to achieve transparency, impartiality, and timely synthesis of information. The latter criterion often constitutes a major technical bottleneck for traditional GIS-based media, which feature large file or image sizes and require special processing before web deployment. Many tools and design patterns have appeared in recent years to ease the situation somewhat. In this project, we explore the use of Web 2.0 technologies for “pushing” location-based content to multigroups involved in surface water quality management and decision making. In particular, our granular bottom-up approach facilitates effective delivery of information to the most relevant user groups. Our location-based content includes in-situ and remotely sensed data disseminated by NASA and other national and local agencies. Our project is demonstrated for managing the total maximum daily load (TMDL) program in the Arroyo Colorado coastal river basin in Texas. The overall design focuses on assigning spatial information to decision support elements and on efficiently using Web 2.0 technologies to relay scientific information to the nonscientific community. We conclude that (i) social networking, if appropriately used, has great potential for mitigating difficulty associated with multigroup decision making; (ii) all potential stakeholder groups should be involved in creating a useful decision support system; and (iii) environmental decision support systems should be considered a must-have, rather than optional, component of TMDL decision support projects. Acknowledgment: This project was supported by NASA grant NNX09AR63G.

  13. Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Cao, J.

    2016-09-01

    Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These disadvantages of traditional GISs need to be addressed for new applications on the Internet and the Web. Decentralized orchestration offers performance improvements in terms of increased throughput and scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection was carried out to evaluate the proposed method, and the experimental results indicate that the method is effective in producing high-quality solutions at a low communication cost for the geospatial processing service composition problem.
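
    The building block of such a composition is an individual WPS Execute call. A minimal OWSLib sketch of one call is shown below; the service URL, process identifier, and input names are hypothetical, and the paper's contribution lies in chaining such calls in a decentralized way rather than in any single invocation.

```python
from owslib.wps import WebProcessingService, monitorExecution

# Hypothetical WPS endpoint; skip_caps avoids an initial GetCapabilities call.
wps = WebProcessingService("https://example.org/wps", skip_caps=True)

inputs = [("scene_id", "MOD021KM.A2016123.0456"),   # hypothetical input names
          ("threshold", "0.35")]
execution = wps.execute("DustStormDetection", inputs, output="result")

monitorExecution(execution)          # poll until the asynchronous job finishes
print(execution.status, [o.reference for o in execution.processOutputs])
```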

  14. Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices

    NASA Astrophysics Data System (ADS)

    Kröpfl, M.; Buchmüller, D.; Leberl, F.

    2012-07-01

    Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite System (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we will use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and a client software consuming the web service through an API. The application allows an insurance claim specialist to add claims to a cloud-based database including a claim location. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called in order to try to match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split between two developers.

  15. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.

  16. EPA Facility Registry System (FRS): NEPT

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Environmental Performance Track (NEPT) Program dataset. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs
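
    A web feature service of this kind can be queried with any WFS-capable client. The sketch below uses OWSLib to list feature types and fetch facilities inside a bounding box; the endpoint URL and type name are hypothetical placeholders, since the record does not list them.

```python
from owslib.wfs import WebFeatureService

# Hypothetical endpoint and type name; substitute the published FRS WFS values.
wfs = WebFeatureService("https://example.org/geoserver/frs/wfs", version="1.1.0")

print(list(wfs.contents))  # feature types advertised by GetCapabilities

# Fetch NEPT-linked facilities inside a lon/lat bounding box as GML.
resp = wfs.getfeature(typename="frs:NEPT_Facilities",
                      bbox=(-90.3, 29.8, -89.9, 30.1),
                      maxfeatures=50)
data = resp.read()
with open("frs_nept.gml", "wb") as f:
    f.write(data if isinstance(data, bytes) else data.encode("utf-8"))
```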

  17. EPA Facility Registry Service (FRS): NEI

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the National Emissions Inventory (NEI) Program dataset. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs

  18. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. Recently, this map information has been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former function was implemented using OpenLayers. The latter function was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to those distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format that is available in many kinds of Geographical Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study would help create better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file formats.
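
    The cutting step itself can be reduced to a windowed raster translation. A minimal GDAL/Python sketch is given below, assuming the WMTS layer is exposed to GDAL through a driver description file; the file name, window coordinates, and output name are placeholders, not the tool's actual configuration.

```python
from osgeo import gdal

gdal.UseExceptions()

# Hypothetical GDAL WMTS driver description file pointing at the tile service.
src = gdal.Open("seamless_geology_wmts.xml")

# Cut the intended lon/lat window and save it as GeoTIFF, roughly what the
# tool's cutting function does for a user-selected area and zoom level.
gdal.Translate(
    "cutout.tif",
    src,
    format="GTiff",
    projWin=[139.5, 36.0, 140.0, 35.5],   # ulx, uly, lrx, lry
    projWinSRS="EPSG:4326",
)
```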

  19. Sharing Human-Generated Observations by Integrating HMI and the Semantic Sensor Web

    PubMed Central

    Sigüenza, Álvaro; Díaz-Pardo, David; Bernat, Jesús; Vancea, Vasile; Blanco, José Luis; Conejero, David; Gómez, Luis Hernández

    2012-01-01

    Current “Internet of Things” concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound. PMID:22778643

  20. Sharing human-generated observations by integrating HMI and the Semantic Sensor Web.

    PubMed

    Sigüenza, Alvaro; Díaz-Pardo, David; Bernat, Jesús; Vancea, Vasile; Blanco, José Luis; Conejero, David; Gómez, Luis Hernández

    2012-01-01

    Current "Internet of Things" concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound.

  1. 76 FR 78944 - Announcement of National Geospatial Advisory Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-20

    ...The National Geospatial Advisory Committee (NGAC) will meet on January 12, 2012, from 1 p.m. to 4 p.m. EST. The meeting will be held via Web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:

  2. 78 FR 16527 - Announcement of National Geospatial Advisory Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-15

    ...The National Geospatial Advisory Committee (NGAC) will meet on April 3, 2013, from 1:00 p.m. to 5:00 p.m. EST. The meeting will be held via Web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:

  3. 78 FR 71638 - Announcement of National Geospatial Advisory Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ...The National Geospatial Advisory Committee (NGAC) will meet on December 11, 2013, from 1:00 p.m. to 5:00 p.m. EST. The meeting will be held via web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:

  4. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damage to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of a web map and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.

  5. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications solves important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, Multicore, etc. seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storing, distribution, management, processing and security of Geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, Geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures based on application characteristics and user requirements through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project on different use cases such as: the execution of Geospatial Web services both on Web and Grid infrastructures [2] and the execution of SWAT hydrological models both on Grid and Multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; the Cloud is still a paradigm with critical problems to be solved despite the great efforts and investments. Cloud computing comes as a new way of delivering resources while using a large set of old as well as new technologies and tools for providing the necessary functionalities. The main challenges in Cloud computing, most of them also identified in the Open Cloud Manifesto 2009, concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different Geospatial applications on different parallel and distributed architectures such as Grid, Cloud, Multicore, etc., with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, necessary performance, cost support, etc. Execution redirection to a selected architecture is realized through a specialized component and offers a flexible way of achieving the best performance given the existing restrictions.

  6. Economic assessment of the use value of geospatial information

    USGS Publications Warehouse

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.

  7. Towards a semantics-based approach in the development of geographic portals

    NASA Astrophysics Data System (ADS)

    Athanasis, Nikolaos; Kalabokidis, Kostas; Vaitis, Michail; Soulakellis, Nikolaos

    2009-02-01

    As the demand for geospatial data increases, the lack of efficient ways to find suitable information becomes critical. In this paper, a new methodology for knowledge discovery in geographic portals is presented. Based on the Semantic Web, our approach exploits the Resource Description Framework (RDF) in order to describe the geoportal's information with ontology-based metadata. When users traverse from page to page in the portal, they take advantage of the metadata infrastructure to navigate easily through data of interest. New metadata descriptions are published in the geoportal according to the RDF schemas.
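
    To make the idea concrete, the sketch below builds a tiny ontology-based metadata description for one geoportal resource with Python's rdflib. The vocabulary, namespace, and resource URIs are illustrative assumptions, not the schemas used by the authors' portal.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

# Hypothetical application namespace for portal-specific terms.
GEO = Namespace("http://example.org/geoportal#")

g = Graph()
dataset = URIRef("http://example.org/geoportal/dataset/fire-risk-lesvos")
g.add((dataset, RDF.type, GEO.Dataset))
g.add((dataset, DCTERMS.title, Literal("Fire risk map of Lesvos")))
g.add((dataset, DCTERMS.subject, Literal("wildfire")))
g.add((dataset, GEO.coversRegion, URIRef("http://example.org/geoportal/region/lesvos")))

# Serialized RDF like this is what the portal pages would expose for navigation.
print(g.serialize(format="turtle"))
```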

  8. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
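
    A stripped-down sketch of the cascading-KML idea is shown below: a server-side handler returns one tile's ground overlay plus four Region-gated NetworkLinks to its quadtree children, so the client only fetches higher-LOD tiles when the user zooms in. The URL pattern, tile naming, and pixel threshold are assumptions for illustration, not values from the described implementation.

```python
# Generate cascading KML for one quadtree tile: a GroundOverlay for the tile
# plus NetworkLinks to its four children, each gated by a Region/Lod so the
# client only requests them when the region occupies enough screen pixels.
def tile_kml(level: int, north: float, south: float, east: float, west: float) -> str:
    mid_lat, mid_lon = (north + south) / 2.0, (east + west) / 2.0
    children = [
        (north, mid_lat, east, mid_lon), (north, mid_lat, mid_lon, west),
        (mid_lat, south, east, mid_lon), (mid_lat, south, mid_lon, west),
    ]
    links = "".join(f"""
  <NetworkLink>
    <Region>
      <LatLonAltBox><north>{n}</north><south>{s}</south><east>{e}</east><west>{w}</west></LatLonAltBox>
      <Lod><minLodPixels>256</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
    </Region>
    <Link>
      <href>/kml?level={level + 1}&amp;n={n}&amp;s={s}&amp;e={e}&amp;w={w}</href>
      <viewRefreshMode>onRegion</viewRefreshMode>
    </Link>
  </NetworkLink>""" for n, s, e, w in children)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <GroundOverlay>
    <Icon><href>/tiles/{level}/{west}_{south}.png</href></Icon>
    <LatLonBox><north>{north}</north><south>{south}</south><east>{east}</east><west>{west}</west></LatLonBox>
  </GroundOverlay>{links}
</Document></kml>"""


# The launch file is simply the KML for the root tile covering the whole globe.
print(tile_kml(0, 90.0, -90.0, 180.0, -180.0))
```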

  9. Some legal concerns with the use of crowd-sourced Geospatial Information

    NASA Astrophysics Data System (ADS)

    Cho, George

    2014-06-01

    Volunteered geographic Information (VGI), citizens as sensors, crowd-sourcing and 'Wikipedia' of maps have been used to describe activity facilitated by the Internet and the dynamic Web 2.0 environment to collect geographic information (GI). Legal concerns raised in the creation, assembly and dissemination of GI by produsers include: quality, ownership and liability. In detail, accuracy and authoritativeness of the crowd-sourced GI; the ownership and moral rights to the information, and contractual and tort liability are key concerns. A legal framework and governance structure may be necessary whereby technology, networked governance and provision of legal protections may be combined to mitigate geo-liability as a 'chilling' factor in VGI development.

  10. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sources, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, the development of websites targeted at landscape assessment and tourism currently requires many resources in terms of time, cost and IT skills to be implemented at different scales. For this reason these applications are limited to a few cases, mainly focusing on world-famous tourist sites. The future capability to spread the development of web-based multimedia virtual museums based on geospatial data relies on the possibility of discovering the needed geospatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) that perform the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface was implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices such as tablets and smartphones. The general aim of this work, applied to the case study of the Lake of Como (Tremezzina municipality), is to boost the integration of assessment methodologies with digital geo-based map-correlation technologies for a multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels) represented by thematic icons, allowing the richness of the landscape value to be conveyed to both tourists and professionals.

  11. GIS for the Gulf: A reference database for hurricane-affected areas: Chapter 4C in Science and the storms-the USGS response to the hurricanes of 2005

    USGS Publications Warehouse

    Greenlee, Dave

    2007-01-01

    A week after Hurricane Katrina made landfall in Louisiana, a collaboration among multiple organizations began building a database called the Geographic Information System for the Gulf, shortened to "GIS for the Gulf," to support the geospatial data needs of people in the hurricane-affected area. Data were gathered from diverse sources and entered into a consistent and standardized data model in a manner that is Web accessible.

  12. a Public Platform for Geospatial Data Sharing for Disaster Risk Management

    NASA Astrophysics Data System (ADS)

    Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.

    2013-01-01

    Several studies have been conducted in Africa to assist local governments in addressing the risk situation related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change, and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations seeking to reduce the risks and mitigate the impacts of disasters. Nevertheless, this data is not efficiently or widely distributed and often resides in remote storage solutions that are hardly reachable. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making them widely available through the internet. Among these solutions, GeoNode, an open source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface allowing users, with little training, to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to allow access to only specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management, and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study scenario of setting up a Web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data about hydrogeological risk has already been collected, along with other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and data from future disaster risk management projects will be added as well.

  13. Prototyping an online wetland ecosystem services model using open model sharing standards

    USGS Publications Warehouse

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.

  14. Globalization and Mobilization of Earth Science Education with GeoBrain Geospatial Web Service Technology

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2005-12-01

    The need for Earth science education to prepare students as a globally-trained geoscience workforce has increased tremendously with the globalization of the economy. However, current academic programs often have difficulty providing students with world-view training or experiences in a global context due to a lack of resources and suitable teaching technology. This paper presents a NASA-funded project with insights and solutions to this problem. The project aims to establish a geospatial data-rich learning and research environment that enables students, faculty and researchers from institutes all over the world to easily access, analyze and model with the huge amount of NASA EOS data as if they possessed those vast resources locally at their desktops. With the environment, classroom demonstration and training for students to deal with global climate and environment issues for any part of the world are possible in any classroom with an Internet connection. Globalization and mobilization of Earth science education can be truly realized through the environment. This project, named NASA EOS Higher Education Alliance: Mobilization of NASA EOS Data and Information through Web Services and Knowledge Management Technologies for Higher Education Teaching and Research, is built on solid technology and infrastructure foundations including web service technology, NASA EOS data resources, and open interoperability standards. An open, distributed, standards-compliant, interoperable web-based system, called GeoBrain, is being developed by this project to provide a data-rich on-line learning and research environment. The system allows users to dynamically and collaboratively develop interoperable, web-executable geospatial process and analysis modules and models, and run them on-line against any part of the petabyte archives to get back customized information products rather than raw data. The system makes a data-rich, globally capable Earth science learning and research environment, backed by NASA EOS data and computing resources previously unavailable to students and professors, available at their desktops free of charge. To efficiently integrate this new environment into Earth science education and research, a NASA EOS Higher Education Alliance (NEHEA) has been formed. The core members of NEHEA consist of the GeoBrain development team led by LAITS at George Mason University and a group of Earth science educators selected from an open RFP process. NEHEA is an open and free alliance. NEHEA welcomes Earth science educators around the world to join as associate members. NEHEA promotes international research and education collaborations in Earth science. NEHEA core members will provide technical support to NEHEA associate members for incorporating the data-rich learning environment into their teaching and research activities. The responsibilities of NEHEA education members include using the system in their research and teaching, providing feedback and requirements to the development team, exchanging information on the utilization of system capabilities, participating in system development, and developing new curricula and research around the environment provided by GeoBrain.

  15. Evolution of System Architectures: Where Do We Need to Fail Next?

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Alameh, Nadine; Percivall, George

    2013-04-01

    Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested and re-tested. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present an evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. The Common Architecture was a cross-thread theme, to ensure that the Web Mapping and Sensor Web experiments built on a base common architecture. The architecture was based on the three main SOA components: Broker, Requestor and Provider. It proposed a general service model defining service interactions and dependencies; categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction between the different service types: Data Services (e.g. WMS), Application Services (e.g. coordinate transformation) and server-side client applications (e.g. image exploitation). The latest testbed, OGC Web Services phase 9, completed in 2012, had five threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations, and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them, and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities, including the Earth System Science community.

  16. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  17. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    NASA Astrophysics Data System (ADS)

    Li, W.

    2017-12-01

    Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem—providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly in building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
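
    A toy sketch of the crawling idea follows (PolarHub itself is far more elaborate): fetch a seed page, extract candidate links, and probe each one for an OGC GetCapabilities response to decide whether it points at a live geospatial service. The seed URL is a hypothetical placeholder.

```python
import re
import requests
from urllib.parse import urljoin

SEED = "https://example.org/data-portal"          # hypothetical seed page

def looks_like_ogc_service(url: str) -> bool:
    """Probe a URL with a WMS GetCapabilities request and check the response."""
    probe = url + ("&" if "?" in url else "?") + "service=WMS&request=GetCapabilities"
    try:
        r = requests.get(probe, timeout=10)
        return r.ok and b"WMS_Capabilities" in r.content
    except requests.RequestException:
        return False

# Collect links from the seed page and keep those that answer like a WMS.
page = requests.get(SEED, timeout=10).text
links = {urljoin(SEED, href) for href in re.findall(r'href="([^"]+)"', page)}
services = [u for u in links if looks_like_ogc_service(u)]
print(f"found {len(services)} candidate WMS endpoints")
```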

  18. Geospatial Thinking of Information Professionals

    ERIC Educational Resources Information Center

    Bishop, Bradley Wade; Johnston, Melissa P.

    2013-01-01

    Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…

  19. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    of automated processing. 2. Requirements for Geospatial Information: Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of

  20. Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive tasks for processing data. Furthermore, rendering complex 3D-Geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D-Geodata on nearly every device.
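
    The quadtree tile addressing such a globe relies on can be illustrated with the standard web-map tiling scheme: the function below computes which zoom/x/y tile covers a given point. It is a generic sketch of the indexing idea, not code from the OpenWebGlobe 2 code base.

```python
import math

def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int) -> tuple[int, int]:
    """Return the (x, y) tile indices covering a lon/lat point at a zoom level."""
    n = 2 ** zoom                                   # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: which tile contains Basel (7.59 E, 47.56 N) at zoom 12.
print(lonlat_to_tile(7.59, 47.56, 12))
```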

  1. Development of an electronic emergency department-based geo-information injury surveillance system in Hong Kong.

    PubMed

    Chow, C B; Leung, M; Lai, Adela; Chow, Y H; Chung, Joanne; Tong, K M; Lit, Albert

    2012-06-01

    To describe the experience in the development of an electronic emergency department (ED)-based injury surveillance (IS) system in Hong Kong using data-mining and geo-spatial information technology (IT) for a Safe Community setup. This paper describes the phased development of an emergency department-based IS system based on the World Health Organization (WHO) injury surveillance guidelines to support safety promotion and injury prevention in a Safe Community in Hong Kong, starting in 2002. The initial ED database only collected data on name, sex, age, address, eight general categories of injury type (traffic, domestic, common assault, indecent assault, battery, industrial, self-harm and sports) and disposal from the ED. Phase 1--manual collection of International Classification of External Causes of Injury pre-event data; Phase 2--the manual form was converted to electronic format using web-based data mining technology with a built-in data quality monitoring mechanism; Phase 3--integration of injury surveillance data with in-patient hospital information; and Phase 4--geo-spatial information and body mapping were introduced to geo-code the exact place of injury on an electronic map and the site of injury on a body map. It was feasible to develop a geo-spatial IS system at a busy ED to collect valuable information for safety promotion and injury prevention in a Safe Community setting. The keys to successful development and implementation involve engagement of all stakeholders in the design and implementation of the system with injury prevention as the ultimate goal, detailed workflow planning at the front end, support from management, building on existing systems and appropriate utilisation of modern technology. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Geospatial Data Processing for 3d City Model Generation, Management and Visualization

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Nocerino, E.; Remondino, F.; Revolti, A.; Soria, G.; Piffer, S.

    2017-05-01

    Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in "smart city" applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above - http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  3. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment across online VGI sources to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, relationship links among geographic features distributed across diverse VGI platforms are constructed using linked data modelling methods; these semantic-enabled entities are then deployed on the web, eventually forming an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unified data representation model across different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name attribute similarity as the measure for comparing and matching geographic features in different VGI data sets. Our work also focuses on how to apply Markov logic networks to interlink the same entities across different VGI-based linked data sets; the automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. The experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
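
    The mixed matching strategy described above can be illustrated with a small Python sketch that blends a spatial-distance term and a name-similarity term into a single score. The weights, distance threshold and feature schema below are illustrative assumptions, not values taken from the paper.

        import math
        from difflib import SequenceMatcher

        def haversine_m(lon1, lat1, lon2, lat2):
            """Great-circle distance in metres between two WGS84 points."""
            r = 6371000.0
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dphi = p2 - p1
            dlmb = math.radians(lon2 - lon1)
            a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        def match_score(feat_a, feat_b, max_dist_m=200.0, w_space=0.5, w_name=0.5):
            """Blend spatial proximity and name similarity into one score in [0, 1].
            Features are dicts with 'lon', 'lat' and 'name' keys (illustrative schema)."""
            d = haversine_m(feat_a["lon"], feat_a["lat"], feat_b["lon"], feat_b["lat"])
            spatial_sim = max(0.0, 1.0 - d / max_dist_m)
            name_sim = SequenceMatcher(None, feat_a["name"].lower(), feat_b["name"].lower()).ratio()
            return w_space * spatial_sim + w_name * name_sim

        # Two candidate features from different VGI platforms.
        osm_feat = {"lon": 116.3913, "lat": 39.9075, "name": "Tian'anmen Square"}
        other_feat = {"lon": 116.3917, "lat": 39.9073, "name": "Tiananmen Square"}
        print(match_score(osm_feat, other_feat))  # close to 1 -> likely the same feature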

  4. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a web-based geospatial application similar to Google Maps, but oriented toward adversaries' movements. It helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

  5. DOTAGWA: A CASE STUDY IN WEB-BASED ARCHITECTURES FOR CONNECTING SURFACE WATER MODELS TO SPATIALLY ENABLED WEB APPLICATIONS

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...

  6. Using Web GIS for Public Health Education

    ERIC Educational Resources Information Center

    Reed, Rajika E.; Bodzin, Alec M.

    2016-01-01

    An interdisciplinary curriculum unit that used Web GIS mapping to investigate malaria disease patterns and spread in relation to the environment for a high school Advanced Placement Environmental Science course was developed. A feasibility study was conducted to investigate the efficacy of the unit to promote geospatial thinking and reasoning…

  7. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    NASA Technical Reports Server (NTRS)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma is growing every year in the United States; the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues across the state of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma were obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and data on air pollutant concentrations (PM2.5) were collected from USEPA web resources; these are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma with air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS. Patients have been geocoded to their respective zip codes. Potential air pollutant sources such as interstate highways and industries, along with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are being incorporated into the analysis to further understand their proximity and ease of access for patient locations. At the current level of analysis and understanding, the spatial distribution of asthma shows concentrations in the populations of zip code regions on the gulf coast, along the southern interstates, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for health risk assessment and for providing information support to administrators and decision makers when establishing satellite clinics in the future.

  8. Web-client based distributed generalization and geoprocessing

    USGS Publications Warehouse

    Wolf, E.B.; Howe, K.

    2009-01-01

    Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web-accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
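
    For reference, the RDP algorithm itself is compact. The following sketch shows the recursive simplification with a distance tolerance epsilon; it is written in Python for illustration rather than reproducing the paper's JavaScript/OpenLayers implementation.

        def rdp(points, epsilon):
            """Ramer-Douglas-Peucker line simplification.
            points: list of (x, y) tuples; epsilon: tolerance in coordinate units."""
            if len(points) < 3:
                return list(points)

            (x1, y1), (x2, y2) = points[0], points[-1]

            def perp_dist(p):
                # Perpendicular distance from p to the line through the endpoints.
                num = abs((y2 - y1) * p[0] - (x2 - x1) * p[1] + x2 * y1 - y2 * x1)
                den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5
                return num / den if den else ((p[0] - x1) ** 2 + (p[1] - y1) ** 2) ** 0.5

            # Find the vertex farthest from the chord between the endpoints.
            dists = [perp_dist(p) for p in points[1:-1]]
            idx = max(range(len(dists)), key=dists.__getitem__) + 1

            if dists[idx - 1] > epsilon:
                # Keep that vertex and recurse on the two sub-chains.
                left = rdp(points[: idx + 1], epsilon)
                right = rdp(points[idx:], epsilon)
                return left[:-1] + right
            # All intermediate vertices are within tolerance: keep only the endpoints.
            return [points[0], points[-1]]

        line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
        print(rdp(line, epsilon=1.0))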

  9. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the capabilities of the end-to-end application service and virtualized computing framework in HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  10. Developing a Web-based system by integrating VGI and SDI for real estate management and marketing

    NASA Astrophysics Data System (ADS)

    Salajegheh, J.; Hakimpour, F.; Esmaeily, A.

    2014-10-01

    The importance of property in various respects, especially its impact on different sectors of the economy and on the country's macroeconomy, is clear. Because of the real, multi-dimensional and heterogeneous nature of housing as a commodity, the lack of an integrated system containing comprehensive property information, the limited awareness of some actors in this field about such information, and the lack of clear and comprehensive rules and regulations for trading and pricing, several problems arise for the people involved in this field. This research aims to implement a crowd-sourced Web-based real estate support system. Creating a Spatial Data Infrastructure (SDI) within this system for collecting, updating and integrating all official data about property is also an objective of this study. The system uses a Web 2.0 broker and technologies such as Web services and service composition. This work aims to provide comprehensive and diverse information about property from different sources. For this purpose, a five-level real estate support system architecture is used. The PostgreSQL DBMS is used to implement the system. GeoServer is used as the map server and reference implementation of OGC (Open Geospatial Consortium) standards, and an Apache server is used to serve web pages and user interfaces. Integrating the introduced methods and technologies provides a proper environment for various users to use the system and share their information. This goal can only be achieved through cooperation among all organizations involved in real estate, each implementing the required infrastructure as interoperable Web services.

  11. A Story of a Crashed Plane in US-Mexican border

    NASA Astrophysics Data System (ADS)

    Bermudez, Luis; Hobona, Gobe; Vretanos, Peter; Peterson, Perry

    2013-04-01

    A plane has crashed on the US-Mexican border. The search and rescue command center planner needs to find information about the crash site, a mountain, nearby mountains for the establishment of a communications tower, as well as ranches for setting up a local incident center. Events like this one occur all over the world, and exchanging information seamlessly is key to saving lives and preventing further disasters. This abstract describes an interoperability testbed that applied this scenario using technologies based on Open Geospatial Consortium (OGC) standards. The OGC, which has about 500 members, serves as a global forum for the collaboration of developers and users of spatial data products and services, and works to advance the development of international standards for geospatial interoperability. The OGC Interoperability Program conducts international interoperability testbeds, such as OGC Web Services Phase 9 (OWS-9), that encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The Cross-Community Interoperability (CCI) thread in OWS-9 advanced the Web Feature Service for Gazetteers (WFS-G) by providing a Single Point of Entry Global Gazetteer (SPEGG), where a user can submit a single query and access global geographic names data across multiple Federal names databases. Currently users must make two queries with differing input parameters against two separate databases to obtain authoritative cross-border geographic names data. The gazetteers in this scenario included GNIS and GNS. GNIS, the Geographic Names Information System, is managed by USGS; it was first developed in 1964 and contains information about domestic and Antarctic names. GNS, the GEOnet Names Server, provides the Geographic Names Data Base (GNDB) and is managed by the National Geospatial-Intelligence Agency (NGA). GNS has been in service since 1994, and serves names for areas outside the United States and its dependent areas, as well as names for undersea features. The following challenges were addressed: cascading WFS-G servers (allowing multiple WFSs to be queried through a "parent" WFS), query name filters (e.g. fuzzy search, text search), handling of multilingualism and diacritics, advanced spatial constraints (e.g. radial search and nearest neighbor), and semantically mediated feature types (e.g. mountain vs. hill). To enable semantic mediation, a series of semantic mappings were defined between the NGA GNS, USGS GNIS and the Alexandria Digital Library (ADL) Gazetteer. The mappings were encoded in the Web Ontology Language (OWL) to enable them to be used by semantic web technologies. The semantic mappings were then published for ingestion into a semantic mediator that used the mappings to associate location types from one gazetteer with location types in another. The semantic mediator was then able to transform requests on the fly, providing a single point of entry WFS-G to multiple gazetteers. The presentation will include a live demonstration of the work performed, highlight the main developments, and discuss future development.
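
    At the protocol level, a WFS-G query of this kind is an ordinary WFS GetFeature request carrying an OGC Filter Encoding expression for the place-name constraint. The Python sketch below builds such a request with a wildcard (fuzzy-style) name filter; the endpoint URL, feature type and property names are placeholders, not the actual GNIS/GNS schemas.

        import requests

        # Hypothetical WFS-G endpoint and gazetteer schema names; only the OGC
        # WFS 1.1.0 / Filter Encoding 1.1 request structure is standard.
        WFS_URL = "https://example.org/gazetteer/wfs"

        fuzzy_name_filter = (
            '<ogc:Filter xmlns:ogc="http://www.opengis.net/ogc">'
            '<ogc:PropertyIsLike wildCard="*" singleChar="?" escapeChar="\\">'
            "<ogc:PropertyName>featureName</ogc:PropertyName>"
            "<ogc:Literal>San Ysid*</ogc:Literal>"
            "</ogc:PropertyIsLike></ogc:Filter>"
        )

        params = {
            "service": "WFS",
            "version": "1.1.0",
            "request": "GetFeature",
            "typeName": "gaz:NamedPlace",   # placeholder feature type
            "maxFeatures": "20",
            "filter": fuzzy_name_filter,
        }

        response = requests.get(WFS_URL, params=params, timeout=30)
        print(response.status_code, response.headers.get("Content-Type"))
        # response.text would contain a GML feature collection of matching place names.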

  12. Enabling Web-Based GIS Tools for Internet and Mobile Devices To Improve and Expand NASA Data Accessibility and Analysis Functionality for the Renewable Energy and Agricultural Applications

    NASA Astrophysics Data System (ADS)

    Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.

    2014-12-01

    The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry and private sector individuals. The major objectives of this effort include: 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions to improve capability and analysis that produce "on-the-fly" data products, extending these beyond single locations to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and the usage of optimizations; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency and cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the services available to the wider public.

  13. OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets

    NASA Astrophysics Data System (ADS)

    Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa

    2017-04-01

    The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have been quality-controlled. The shot gathers have been cross-checked and a comprehensive errata list has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. A complete documentation of the intermediate processing steps is provided together with guidelines for setting up a computing environment and plotting the data. An open access web service "OpenFIRE" for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, such as an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe)-compliant discovery metadata have been produced and the geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed and the service could be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire.

  14. GEO Label Web Services for Dynamic and Effective Communication of Geospatial Metadata Quality

    NASA Astrophysics Data System (ADS)

    Lush, Victoria; Nüst, Daniel; Bastin, Lucy; Masó, Joan; Lumsden, Jo

    2014-05-01

    We present demonstrations of the GEO label Web services and their integration into a prototype extension of the GEOSS portal (http://scgeoviqua.sapienzaconsulting.com/web/guest/geo_home), the GMU portal (http://gis.csiss.gmu.edu/GADMFS/) and a GeoNetwork catalog application (http://uncertdata.aston.ac.uk:8080/geonetwork/srv/eng/main.home). The GEO label is designed to communicate, and facilitate interrogation of, geospatial quality information with a view to supporting efficient and effective dataset selection on the basis of quality, trustworthiness and fitness for use. The GEO label which we propose was developed and evaluated according to a user-centred design (UCD) approach in order to maximise the likelihood of user acceptance once deployed. The resulting label is dynamically generated from producer metadata in ISO or FGDC format, and incorporates user feedback on dataset usage, ratings and discovered issues, in order to supply a highly informative summary of metadata completeness and quality. The label was easily incorporated into a community portal as part of the GEO Architecture Implementation Programme (AIP-6) and has been successfully integrated into a prototype extension of the GEOSS portal, as well as the popular metadata catalog and editor, GeoNetwork. The design of the GEO label was based on 4 user studies conducted to: (1) elicit initial user requirements; (2) investigate initial user views on the concept of a GEO label and its potential role; (3) evaluate prototype label visualizations; and (4) evaluate and validate physical GEO label prototypes. The results of these studies indicated that users and producers support the concept of a label with drill-down interrogation facility, combining eight geospatial data informational aspects, namely: producer profile, producer comments, lineage information, standards compliance, quality information, user feedback, expert reviews, and citations information. These are delivered as eight facets of a wheel-like label, which are coloured according to metadata availability and are clickable to allow a user to engage with the original metadata and explore specific aspects in more detail. To support this graphical representation and allow for wider deployment architectures we have implemented two Web services, a PHP and a Java implementation, that generate GEO label representations by combining producer metadata (from standard catalogues or other published locations) with structured user feedback. Both services accept encoded URLs of publicly available metadata documents or metadata XML files as HTTP POST and GET requests and apply XPath and XSLT mappings to transform producer and feedback XML documents into clickable SVG GEO label representations. The label and services are underpinned by two XML-based quality models. The first is a producer model that extends ISO 19115 and 19157 to allow fuller citation of reference data, presentation of pixel- and dataset-level statistical quality information, and encoding of 'traceability' information on the lineage of an actual quality assessment. The second is a user quality model (realised as a feedback server and client) which allows reporting and query of ratings, usage reports, citations, comments and other domain knowledge. Both services are Open Source and are available on GitHub at https://github.com/lushv/geolabel-service and https://github.com/52North/GEO-label-java.
The functionality of these services can be tested using our GEO label generation demos, available online at http://www.geolabel.net/demo.html and http://geoviqua.dev.52north.org/glbservice/index.jsf.
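
    The XSLT-based generation step described above can be illustrated with a few lines of Python using lxml; the metadata file and stylesheet names below are placeholders, not the project's actual artefacts, and only sketch the general transform pattern.

        from lxml import etree

        # Illustrative only: file names and the structure of the stylesheet are
        # placeholders, not the actual GEO label service artefacts.
        producer_metadata = etree.parse("dataset_iso19115.xml")   # producer metadata (ISO XML)
        stylesheet = etree.parse("geolabel_to_svg.xsl")           # XSLT mapping metadata -> SVG facets

        transform = etree.XSLT(stylesheet)
        svg_label = transform(producer_metadata)

        with open("geolabel.svg", "wb") as fh:
            fh.write(etree.tostring(svg_label, pretty_print=True))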

  15. Interoperable Data Access Services for NOAA IOOS

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2008-12-01

    The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
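
    For readers unfamiliar with SOS, a GetObservation request is an ordinary HTTP key-value-pair call. The Python sketch below shows its standard OGC SOS 1.0.0 structure; the endpoint, offering and observed property values are placeholders rather than actual NDBC or CO-OPS identifiers.

        import requests

        # Endpoint, offering and observedproperty values are placeholders; only the
        # OGC SOS 1.0.0 key-value-pair request structure is shown.
        SOS_URL = "https://example.gov/sos/server.php"

        params = {
            "service": "SOS",
            "version": "1.0.0",
            "request": "GetObservation",
            "offering": "urn:ioos:station:example:46042",
            "observedproperty": "sea_water_temperature",
            "responseformat": "text/csv",
            "eventtime": "2008-12-01T00:00:00Z/2008-12-02T00:00:00Z",
        }

        resp = requests.get(SOS_URL, params=params, timeout=60)
        resp.raise_for_status()
        print(resp.text[:500])  # first rows of the observation response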

  16. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
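
    The "black box" publishing step described above amounts to wrapping a model behind a standard web endpoint so that callers only see inputs and outputs. Below is a minimal Python/Flask sketch of that idea; the NDVI stand-in function, route path and JSON fields are illustrative and are not part of the paper's framework.

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        def ndvi(red, nir):
            """Stand-in for a wrapped RS/GIS model: here a trivial NDVI calculation."""
            return (nir - red) / (nir + red)

        @app.route("/models/ndvi", methods=["POST"])
        def run_model():
            # The wrapped model is treated as a black box: JSON in, JSON out.
            payload = request.get_json(force=True)
            result = ndvi(payload["red"], payload["nir"])
            return jsonify({"ndvi": result})

        if __name__ == "__main__":
            app.run(port=8080)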

  17. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  18. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    NASA Astrophysics Data System (ADS)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and the geospatial industries in particular. Traditionally, the surveying profession has contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  19. Geospatial datasets for watershed delineation and characterization used in the Hawaii StreamStats web application

    USGS Publications Warehouse

    Rea, Alan; Skinner, Kenneth D.

    2012-01-01

    The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. The geospatial datasets used to delineate and characterize watersheds on the StreamStats website, and the methods used to develop the datasets are described in this report. The datasets for Hawaii were derived primarily from 10 meter resolution National Elevation Dataset (NED) elevation models, and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.

  20. Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet

    NASA Astrophysics Data System (ADS)

    Geilhausen, M.; Otto, J.-C.

    2012-04-01

    With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute to both the dissemination of geomorphological maps and access to geomorphological data, and help to make geomorphological knowledge available to a wider public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread and have increased interest in and access to mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by different capabilities such as zooming, panning or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources and their integration into a Desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example, for map overlays or insets. Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF enables fundamental GIS functionality, turning the formerly static PDF map into an interactive, portable georeferenced PDF map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today, indicating that the Internet has become a medium for displaying geographical information in rich forms and user-friendly interfaces. So, why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards a global dissemination of geomorphological information. This will be exemplified by live demonstrations of i.) existing geomorphological WebGIS applications, ii.) data merging from various sources using web map services, and iii.) free-to-download GeoPDF maps during the presentations.

  1. 71 FR 66315 - Notice of Availability of Invention for Licensing; Government-Owned Invention

    Federal Register 2010, 2011, 2012, 2013, 2014

    2006-11-14

    ... Coating and Method of Formulator.//Navy Case No. 97,486: Processing Semantic Markups in Web Ontology... Rotating Clip.//Navy Case No. 97,886: Adding Semantic Support to Existing UDDI Infrastructure.//Navy Case..., Binding, and Integration of Non-Registered Geospatial Web Services.//Navy Case No. 98,094: Novel, Single...

  2. Geospatial Information and Geographic Information Systems (GIS): Current Issues and Future Challenges

    DTIC Science & Technology

    2009-06-08

    CRS Report for Congress, prepared for Members and Committees of Congress: Geospatial Information and Geographic Information Systems (GIS): Current Issues and Future Challenges.

  3. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  4. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-03-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of our proposed GTUI solutions to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.

  5. QBCov: A Linked Data interface for Discrete Global Grid Systems, a new approach to delivering coverage data on the web

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Toyer, S.; Brizhinev, D.; Ledger, M.; Taylor, K.; Purss, M. B. J.

    2016-12-01

    We are witnessing a rapid proliferation of geoscientific and geospatial data from an increasing variety of sensors and sensor networks. This data presents great opportunities to resolve cross-disciplinary problems. However, working with it often requires an understanding of file formats and protocols seldom used outside of scientific computing, potentially limiting the data's value to other disciplines. In this paper, we present a new approach to serving satellite coverage data on the web, which improves ease-of-access using the principles of linked data. Linked data adapts the concepts and protocols of the human-readable web to machine-readable data; the number of developers familiar with web technologies makes linked data a natural choice for bringing coverages to a wider audience. Our approach to using linked data also makes it possible to efficiently service high-level SPARQL queries: for example, "Retrieve all Landsat ETM+ observations of San Francisco between July and August 2016" can easily be encoded in a single query. We validate the new approach, which we call QBCov, with a reference implementation of the entire stack, including a simple web-based client for interacting with Landsat observations. In addition to demonstrating the utility of linked data for publishing coverages, we investigate the heretofore unexplored relationship between Discrete Global Grid Systems (DGGS) and linked data. Our conclusions are informed by the aforementioned reference implementation of QBCov, which is backed by a hierarchical file format designed around the rHEALPix DGGS. Not only does the choice of a DGGS-based representation provide an efficient mechanism for accessing large coverages at multiple scales, but the ability of DGGS to produce persistent, unique identifiers for spatial regions is especially valuable in a linked data context. This suggests that DGGS has an important role to play in creating sustainable and scalable linked data infrastructures. QBCov is being developed as a contribution to the Spatial Data on the Web working group--a joint activity of the Open Geospatial Consortium and World Wide Web Consortium.
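
    The kind of high-level query mentioned above might look roughly like the following Python sketch, which submits a SPARQL query to a linked-data endpoint. The endpoint URL and vocabulary terms are invented placeholders, since the QBCov vocabulary itself is not reproduced here; only the query pattern is illustrated.

        from SPARQLWrapper import SPARQLWrapper, JSON

        # Endpoint URL and vocabulary terms are illustrative placeholders; the point
        # is that a spatio-temporal coverage request can be phrased as one query.
        sparql = SPARQLWrapper("https://example.org/qbcov/sparql")
        sparql.setReturnFormat(JSON)
        sparql.setQuery("""
        PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
        PREFIX eo:  <http://example.org/vocab/eo#>
        PREFIX geo: <http://example.org/vocab/geo#>

        SELECT ?obs ?time WHERE {
          ?obs eo:platform        "Landsat-7 ETM+" ;
               eo:acquisitionTime ?time ;
               geo:coversPlace    <http://example.org/place/SanFrancisco> .
          FILTER (?time >= "2016-07-01T00:00:00Z"^^xsd:dateTime &&
                  ?time <  "2016-09-01T00:00:00Z"^^xsd:dateTime)
        }
        """)

        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["obs"]["value"], row["time"]["value"])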

  6. Trusted Data Sharing and Imagery Workflow for Disaster Response in Partnership with the State of California

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Aubrey, A. D.; Rosinski, A.; Morentz, J.; Beilin, P.; Jones, D.

    2016-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent 2016 Cascadia Rising NLE, the May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation and plan to participate in upcoming exercises with the National Guard (Vigilant Guard 17) and the USGS (Haywired). Our efforts over the past several years have been to aid in enabling coordination between research scientists, applied scientists and decision makers in order to reduce duplication of effort, maximize information sharing, translate scientific results into actionable information for decision-makers, and increase situational awareness. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse. Products delivered include map layers as part of the common operational data plan for the Clearinghouse delivered through XchangeCore Web Service Data Orchestration and the SpotOnResponse field analysis application. We are exploring new capabilities for real-time collaboration using GeoCollaborate®. XchangeCore allows real-time, two-way information sharing, enabling users to create merged datasets from multiple providers; SpotOnResponse provides web-enabled secure information exchange, collaboration, and field analysis for responders; and GeoCollaborate® enables users to access, share, manipulate, and interact across disparate platforms, connecting public and private sector agencies and organizations rapidly on the same map at the same time, allowing improved collaborative decision making on the same datasets simultaneously.

  7. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to traditional research scientists as well as to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages inter-disciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and in the development of data analysis web applications.
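
    As an illustration of consuming such OGC services, the Python sketch below uses the OWSLib client to request a map image from a WMS endpoint. The service URL and layer name are placeholders rather than actual DAAC endpoints; any OGC-compliant WMS exposing gridded Earth science data could be substituted.

        from owslib.wms import WebMapService

        # Placeholder service URL and layer identifier, for illustration only.
        wms = WebMapService("https://example.nasa.gov/wms", version="1.1.1")

        print(list(wms.contents))                  # layers advertised by the service
        img = wms.getmap(
            layers=["AIRS_Surface_Temperature"],   # hypothetical layer id
            srs="EPSG:4326",
            bbox=(-180, -90, 180, 90),
            size=(720, 360),
            format="image/png",
            transparent=True,
        )

        with open("surface_temperature.png", "wb") as fh:
            fh.write(img.read())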

  8. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising: MapServer, PostGIS, and 52 North with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
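
    As an illustration of the recommended stack, the sketch below issues a spatial query against a PostGIS-enabled database from Python, the scripting language named above. The connection settings and the watersheds table are hypothetical; the query simply shows the kind of server-side geoprocessing PostGIS can handle for a web app.

        import psycopg2

        # Connection settings and the 'watersheds' table are illustrative only.
        conn = psycopg2.connect(dbname="hydrology", user="app",
                                password="secret", host="localhost")

        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT name, ST_Area(geom::geography) / 1e6 AS area_km2
                FROM watersheds
                WHERE ST_Intersects(geom, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
                """,
                (-111.65, 40.25),  # lon, lat of a point of interest
            )
            for name, area_km2 in cur.fetchall():
                print("{}: {:.1f} km^2".format(name, area_km2))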

  9. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of, and access to, the datasets. An online user survey conducted within the project showed that the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualized as an open access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on the OGC-standardized Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.

  10. A feature selection approach towards progressive vector transmission over the Internet

    NASA Astrophysics Data System (ADS)

    Miao, Ru; Song, Jia; Feng, Min

    2017-09-01

    WebGIS is widely used for visualizing and sharing geospatial information over the Internet. To improve the efficiency of client applications, a web-based progressive vector transmission approach is proposed. Important features should be selected and transferred first, so methods for measuring the importance of features must be considered in progressive transmission. However, studies on progressive transmission of large-volume vector data have mostly focused on map generalization in the field of cartography and have rarely discussed the quantitative selection of geographic features. This paper applies information theory to measure the feature importance of vector maps. A measurement model for the amount of information carried by vector features is defined to address feature selection; the model involves a geometry factor, a spatial distribution factor and a thematic attribute factor. Moreover, a Real-time Transport Protocol (RTP)-based progressive transmission method is presented to improve the transmission of vector data. To demonstrate the essential methodology and key techniques, a prototype for web-based progressive vector transmission is presented and an experiment on progressive selection and transmission of vector features is conducted. The experimental results indicate that our approach clearly improves the performance and end-user experience of delivering and manipulating large vector data over the Internet.
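
    A toy version of such an importance measure might combine the three factors as in the Python sketch below; the factor definitions and weights are illustrative assumptions, not the paper's actual measurement model.

        import math

        def feature_information(feature, w_geom=0.4, w_dist=0.3, w_attr=0.3):
            """Toy importance score for one vector feature, combining a geometry factor
            (vertex count), a spatial-distribution factor (local density) and a thematic
            attribute factor (attribute completeness). Weights are illustrative."""
            geom_factor = math.log2(1 + feature["n_vertices"])
            # Sparse regions carry more information per feature than dense ones.
            dist_factor = 1.0 / (1.0 + feature["neighbour_density"])
            attr_factor = feature["attrs_filled"] / max(1, feature["attrs_total"])
            return w_geom * geom_factor + w_dist * dist_factor + w_attr * attr_factor

        features = [
            {"id": "road_12", "n_vertices": 240, "neighbour_density": 0.8,
             "attrs_filled": 6, "attrs_total": 8},
            {"id": "shed_3", "n_vertices": 5, "neighbour_density": 3.5,
             "attrs_filled": 1, "attrs_total": 8},
        ]

        # Transmit the most informative features first.
        for f in sorted(features, key=feature_information, reverse=True):
            print(f["id"], round(feature_information(f), 3))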

  11. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    NASA Astrophysics Data System (ADS)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related framework and set of design principles to guide the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools, including Google Earth with embedded spatial data, to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that the Climate Change curriculum significantly improved urban middle school students' understanding of climate change concepts.

  12. The Urbis Project: Identification and Characterization of Potential Urban Development Areas as a Web-Based Service

    NASA Astrophysics Data System (ADS)

    Manzke, Nina; Kada, Martin; Kastler, Thomas; Xu, Shaojuan; de Lange, Norbert; Ehlers, Manfred

    2016-06-01

    Urban sprawl and the related landscape fragmentation are a Europe-wide challenge in the context of sustainable urban planning. The URBan land recycling Information services for Sustainable cities (URBIS) project aims at the development, implementation, and validation of web-based information services for urban vacant land in European functional urban areas, in order to provide end-users with site-specific characteristics and to facilitate the identification and evaluation of potential development areas. The URBIS services are developed based on open geospatial data. In particular, the Copernicus Urban Atlas thematic layers serve as the main data source for an initial inventory of sites. In combination with remotely sensed data like SPOT5 images and ancillary datasets like OpenStreetMap, detailed site-specific information is extracted. Services are defined for three main categories: i) baseline services, which comprise an initial inventory and typology of urban land, ii) update services, which provide a regular inventory update as well as an analysis of urban land use dynamics and changes, and iii) thematic services, which deliver specific information tailored to end-users' needs.

  13. SWOT analysis on National Common Geospatial Information Service Platform of China

    NASA Astrophysics Data System (ADS)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the international trend in surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, surveying and mapping in China is inevitably shifting from 4D product services to services centered on NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are becoming more and more obvious, owing to the rapidly emerging requirements of e-government construction, the remarkable development of IT, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper analyses the paradigm shift that NCGISPC brings about using a SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  14. The role of visualization in learning from computer-based images

    NASA Astrophysics Data System (ADS)

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-05-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.

  15. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    NASA Astrophysics Data System (ADS)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  16. Integrated web system of geospatial data services for climate research

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach to integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm, is presented. Following this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions, such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  17. Planning and Management of Real-Time Geospatial UAS Missions Within a Virtual Globe Environment

    NASA Astrophysics Data System (ADS)

    Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.

    2011-09-01

    This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the virtual globe technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.

  18. Intelligent services for discovery of complex geospatial features from remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo

    2013-09-01

    Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.
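
    To make the service-chain idea concrete, the following minimal sketch shows the kind of call a workflow step would issue to retrieve elementary features from an OGC Web Feature Service using OWSLib; the endpoint URL, feature type name and bounding box are hypothetical placeholders rather than values from the paper.

      # Minimal sketch: fetch elementary features (e.g., buildings) from a WFS
      # so a downstream workflow step can test spatial relations among them.
      # Endpoint, layer name and bounding box are hypothetical.
      from owslib.wfs import WebFeatureService

      WFS_URL = "https://example.org/geoserver/wfs"   # hypothetical endpoint
      LAYER = "demo:buildings"                        # hypothetical feature type

      wfs = WebFeatureService(url=WFS_URL, version="1.1.0")
      response = wfs.getfeature(typename=[LAYER],
                                bbox=(30.0, 50.0, 30.5, 50.5))  # lon/lat box

      # Persist the GML so a later service in the chain (e.g., a spatial-relation
      # test between buildings, roads and fences) can consume it.
      data = response.read()
      with open("buildings.gml", "wb") as f:
          f.write(data if isinstance(data, bytes) else data.encode("utf-8"))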

  19. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. The data used in this study are based on tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
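
    The core operation behind such a density-map geoprocessing service can be sketched as a kernel density estimate over geocoded case points. The sketch below is illustrative only: the input file, its layout and the grid size are assumptions, not the Barcelona study data.

      # Minimal sketch of the core operation behind a density-map geoprocessing
      # service: a Gaussian kernel density surface over geocoded case points.
      # The input file and column layout are hypothetical, not the study data.
      import numpy as np
      from scipy.stats import gaussian_kde

      # Projected x/y coordinates of geocoded cases (metres), one row per case.
      xy = np.loadtxt("cases_xy.csv", delimiter=",", skiprows=1)  # columns: x, y

      kde = gaussian_kde(xy.T)  # kernel density estimate over the point cloud

      # Evaluate the density on a regular grid covering the study area.
      xg, yg = np.mgrid[xy[:, 0].min():xy[:, 0].max():200j,
                        xy[:, 1].min():xy[:, 1].max():200j]
      density = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

      np.save("density_surface.npy", density)  # handed back to the map client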

  20. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods The data used in this study are based on tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  1. Remote sensing applied to resource management

    Treesearch

    Henry M. Lachowski

    1998-01-01

    Effective management of forest resources requires access to current and consistent geospatial information that can be shared by resource managers and the public. Geospatial information describing our land and natural resources comes from many sources and is most effective when stored in a geospatial database and used in a geographic information system (GIS). The...

  2. Using Airborne Remote Sensing to Increase Situational Awareness in Civil Protection and Humanitarian Relief - the Importance of User Involvement

    NASA Astrophysics Data System (ADS)

    Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.

    2016-06-01

    Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR), monitoring tools are being developed in which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4k camera systems, which allow for RT acquisition and pre-processing of high-resolution aerial imagery, are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both situational assessment and awareness, focusing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments have been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. by aligning it more closely to users' requirements.

  3. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, a thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  4. Geospatial Information Best Practices

    DTIC Science & Technology

    2012-01-01

    26 Spring - 2012 By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint The fact that Geospatial information can be codified and...Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial

  5. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A. G.; Tweedie, C. E.

    2013-12-01

    In 2013, the Barrow Area Information Database (BAID, www.baid.utep.edu) project resumed field operations in Barrow, AK. The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The user community and target audience for BAID are diverse and include research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 11,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Highlights for the 2013 season include the addition of more than 2000 research sites, providing differential global positioning system (dGPS) support to visiting scientists, surveying over 80 miles of coastline to document rates of erosion, training of local GIS personnel, deployment of a wireless sensor network, and substantial upgrades to the BAID website and web mapping applications.

  6. The Challenge of Handling Big Data Sets in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Autermann, Christian; Stasch, Christoph; Jirka, Simon

    2016-04-01

    More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.

  7. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advances in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, as well as advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another open-source tool, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia). In this presentation, we will discuss core features of each of these tools and will present lessons learned on handling large data in the context of data management, analyses and visualization.
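
    As a rough illustration of the style of analysis these tools wrap behind a web API, the sketch below uses GeoPandas directly; the input layer name is a hypothetical placeholder, and the snippet does not use the GeoNotebook or Gaia APIs themselves.

      # Minimal sketch of a GeoPandas operation of the kind a service layer such
      # as Gaia might expose through a unified API: reproject a vector layer and
      # compute a simple per-feature statistic. The input file is hypothetical.
      import geopandas as gpd

      gdf = gpd.read_file("watersheds.geojson")   # hypothetical vector layer
      gdf = gdf.to_crs(epsg=3857)                 # Web Mercator, as used by web maps

      # Rough per-polygon area (Web Mercator distorts area; fine for a sketch).
      gdf["area_km2"] = gdf.geometry.area / 1e6
      print(gdf["area_km2"].describe())           # quick summary statistic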

  8. Construction of a Distributed-network Digital Watershed Management System with B/S Techniques

    NASA Astrophysics Data System (ADS)

    Zhang, W. C.; Liu, Y. M.; Fang, J.

    2017-07-01

    Integrated watershed assessment tools for supporting land management and hydrologic research are becoming established in both basic and applied research. The core of these tools is mainly spatially distributed hydrologic models, as they provide a mechanism for investigating interactions among climate, topography, vegetation, and soil. However, the extensive data requirements and the difficult task of building input parameter files for driving these distributed models have long been an obstacle to the timely and cost-effective use of such complex models by watershed managers and policy-makers. Recently, a web-based geographic information system (GIS) tool to facilitate this process has been developed for the large Jinghe and Weihe catchments located on the Loess Plateau of the Huanghe River basin in north-western China. The web-based GIS provides the framework within which spatially distributed data are collected and used to prepare model input files for these two watersheds and to evaluate model results, as well as providing clients for watershed information query, visualization and assessment analysis. This Web-based Automated Geospatial Watershed Assessment GIS (WAGWA-GIS) tool uses widely available standardized spatial datasets, obtained via an Internet-accessible Oracle databank designed in association with the MapGuide platform, to develop input parameter files for online simulation at different spatial and temporal scales with the Xing’anjiang model and TOPMODEL, which are integrated with the web-based digital watershed. WAGWA-GIS automates the transformation of digital data (remote sensing data, DEMs, land use/cover and soil maps, and meteorological and hydrological station geo-location maps) and of text files containing meteorological and hydrological data obtained from stations of the watershed into hydrological model inputs for online simulation and geospatial analysis, and it provides a visualization tool to help the user interpret results. The utility of WAGWA-GIS in joint hydrologic and ecological investigations has been demonstrated on landscapes as diverse as the Jinghe and Weihe watersheds, and the tool will be extended step by step to other watersheds in China in coming years.

  9. Supporting Timely Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools

    DTIC Science & Technology

    2014-05-22

    attempted to respond to the advances in technology and the growing power of geographical information system (GIS) tools. However, the doctrine...Geospatial intelligence (GEOINT), Geographical information systems (GIS) tools, Humanitarian Assistance/Disaster Relief (HA/DR), 2010 Haiti Earthquake...Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools

  10. PLANNING QUALITY IN GEOSPATIAL PROJECTS

    EPA Science Inventory

    This presentation will briefly review some legal drivers and present a structure for writing geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.

  11. The use of Web-based GIS data technologies in the construction of geoscience instructional materials: examples from the MARGINS Data in the Classroom project

    NASA Astrophysics Data System (ADS)

    Ryan, J. G.; McIlrath, J. A.

    2008-12-01

    Web-accessible geospatial information system (GIS) technologies have advanced in concert with an expansion of data resources that can be accessed and used by researchers, educators and students. These resources facilitate the development of data-rich instructional resources and activities that can transition seamlessly into undergraduate research projects. MARGINS Data in the Classroom (http://serc.carleton.edu/margins/index.html) seeks to engage MARGINS researchers and educators in using the images, datasets, and visualizations produced by NSF MARGINS Program-funded research and related efforts to create Web-deliverable instructional materials for use in undergraduate-level geoscience courses (MARGINS Mini-Lessons). MARGINS science data are managed by the Marine Geosciences Data System (MGDS), and these and all other MGDS-hosted data can be accessed, manipulated and visualized using GeoMapApp (www.geomapapp.org; Carbotte et al., 2004), a freely available geographic information system focused on the marine environment. Both "packaged" MGDS datasets (e.g., global earthquake foci, volcanoes, bathymetry) and "raw" data (seismic surveys, magnetics, gravity) are accessible via GeoMapApp, with WFS linkages to other resources (geodesy from UNAVCO; seismic profiles from IRIS; geochemical and drillsite data from EarthChem, IODP, and others), permitting the comprehensive characterization of many regions of the ocean basins. Geospatially controlled datasets can be imported into GeoMapApp visualizations, and these visualizations can be exported into Google Earth as .kmz image files. Many of the MARGINS Mini-Lessons produced thus far use (or have students use) the varied capabilities of GeoMapApp (e.g., constructing topographic profiles, overlaying varied geophysical and bathymetric datasets, characterizing geochemical data). These materials are available for use and testing from the project webpage (http://serc.carleton.edu/margins/). Classroom testing and assessment of the Mini-Lessons begins this Fall.

  12. VegScape: U.S. Crop Condition Monitoring Service

    NASA Astrophysics Data System (ADS)

    mueller, R.; Yang, Z.; Di, L.

    2013-12-01

    Since 1995, the US Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late season crops. With growing emphasis on more extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new vegetation condition portal, named VegScape, was launched at the start of the 2013 growing season. VegScape delivers interactive vegetation indices based on web mapping services. Users can use an interactive map to explore, query and disseminate current crop conditions. Vegetation indices like the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), and mean, median, and ratio comparisons to prior years can be constructed for analytical purposes and on-demand crop statistics. The NASA MODIS sensor, with 250 meter (15 acre) resolution and thirteen years of data history, provides improved spatial and temporal resolution and delivers detailed, timely (i.e., daily) crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape delivers an agricultural cultivated crop mask and the most recent Cropland Data Layer (CDL) product to exploit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be directly exported to Google Earth for web mashups or delivered via web mapping services for use in other applications. VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. [Figure: VegScape ratio-to-median NDVI map]
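
    The index arithmetic behind a ratio-to-median NDVI product can be sketched in a few lines of NumPy. The array files, shapes and band names below are hypothetical stand-ins, not actual MODIS inputs or the VegScape implementation.

      # Minimal sketch of the arithmetic behind a ratio-to-median NDVI map.
      # Arrays are hypothetical NumPy stacks of red/NIR reflectance, not
      # actual MODIS products.
      import numpy as np

      red = np.load("red_current.npy")             # current composite, (rows, cols)
      nir = np.load("nir_current.npy")
      ndvi_hist = np.load("ndvi_prior_years.npy")  # (years, rows, cols)

      ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)      # avoid divide-by-zero

      median_prior = np.median(ndvi_hist, axis=0)               # per-pixel median
      ratio_to_median = ndvi / np.clip(median_prior, 1e-6, None)

      np.save("ndvi_ratio_to_median.npy", ratio_to_median)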

  13. The research and implementation of coalfield spontaneous combustion of carbon emission WebGIS based on Silverlight and ArcGIS server

    NASA Astrophysics Data System (ADS)

    Zhu, Z.; Bi, J.; Wang, X.; Zhu, W.

    2014-02-01

    As an important sub-topic in the construction of a public information platform for carbon emission data from natural processes, a WebGIS system for carbon emissions from coalfield spontaneous combustion has become an important object of study. Given the data characteristics of coalfield spontaneous combustion carbon emissions (a wide range of rich and complex data) and their geospatial nature, the data are divided into attribute data and spatial data. Based on a full analysis of the data, we completed the detailed design of an Oracle database and stored the data in it. Through Silverlight rich-client technology and extended WCF services, we implemented dynamic web query, retrieval, statistics, analysis and other functions for the attribute data. For the spatial data, we take advantage of ArcGIS Server and the Silverlight-based API to invoke map services, geoprocessing (GP) services, image services and other services published in the background by the GIS server, implementing the display of coalfield spontaneous combustion remote sensing imagery and web map data, data analysis, and thematic map production. The study found that Silverlight rich-client technology, based on an object-oriented framework for WCF services, can be used to construct a WebGIS system efficiently. Combined with the ArcGIS Silverlight API to achieve interactive querying of the attribute and spatial data of coalfield spontaneous combustion emissions, it can greatly improve the performance of the WebGIS system. At the same time, it provides a strong guarantee for the construction of a public information platform for China's carbon emission data.

  14. Teaching And Learning Tectonics With Web-GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Sahagian, D. L.; Bodzin, A.; Teletzke, A. L.; Rutzmoser, S.; Cirucci, L.; Bressler, D.; Burrows, J. E.

    2012-12-01

    Tectonics is a new curriculum enhancement consisting of six Web GIS investigations designed to augment a traditional middle school Earth science curriculum. The investigations are aligned to the Disciplinary Core Ideas: Earth and Space Science from the National Research Council's (2012) Framework for K-12 Science Education and to tectonics benchmark ideas articulated in the AAAS Project 2061 (2007) Atlas of Science Literacy. The curriculum emphasizes geospatial thinking and scientific inquiry and consists of the following modules: Geohazards: which plate boundary is closest to me? How do we recognize plate boundaries? How does thermal energy move around the Earth? What happens when plates diverge? What happens when plates move sideways past each other? What happens when plates collide? The Web GIS interface uses JavaScript for simplicity, intuition, and convenience of implementation on a variety of platforms, making it easier for diverse middle school learners and their teachers to conduct authentic Earth science investigations, including multidisciplinary visualization, analysis, and synthesis of data. Instructional adaptations allow students who are English language learners, have disabilities, or are reluctant readers to perform advanced desktop GIS functions, including spatial analysis, map visualization and query. The Web GIS interface integrates graphics, multimedia, and animation, in addition to newly developed features that allow users to explore and discover geospatial patterns that would not be easily visible using typical classroom instructional materials. The Tectonics curriculum uses a spatial learning design model that incorporates a related set of frameworks and design principles. The framework builds on the work of other successful technology-integrated curriculum projects and includes alignment of materials and assessments with learning goals, casting key ideas in real-world problems, engaging students in scientific practices that foster the use of key ideas, use of geospatial technology, and support for teachers in adopting and implementing GIS and inquiry-based activities.

  15. 78 FR 32635 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-31

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to Add a New System of Records. SUMMARY: The National Geospatial-Intelligence Agency is establishing a new system of... information. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency

  16. User-driven generation of standard data services

    NASA Astrophysics Data System (ADS)

    Díaz, Laura; Granell, Carlos; Gould, Michael; Huerta, Joaquín.

    2010-05-01

    Geospatial information systems are experiencing the shift from monolithic to distributed environments (Bernard et al., 2003). Current research trends for the discovery and access of geospatial resources in these distributed environments are being addressed by the deployment of interconnected Spatial Data Infrastructure (SDI) nodes at different scales to build a global spatial information infrastructure (Masser et al., 2008; Rajabifard et al., 2002). One of the challenges in implementing these global and multiscale SDIs is to agree on common standards given the heterogeneity of the various stakeholders (Masser, 2005). In Europe, the European Commission took the INSPIRE initiative to monitor the development of European SDIs. The INSPIRE Directive addresses the need for web services to discover, view, transform, invoke, and download geospatial resources, which enable various stakeholders to share resources in an interoperable manner (INSPIRE, 2007). Such web services require technical specifications for the interoperability and harmonization of SDIs (INSPIRE, 2007). Moreover, interoperability is ensured by a number of specification efforts, in the geo domain most prominently by ISO/TC 211 and the OpenGIS Consortium (OGC) (Bernard et al., 2003). Other research challenges regarding SDIs are, on the one hand, how the users in charge of maintaining SDIs handle complexity as the SDIs grow, and, on the other hand, the fact that SDI maintenance and evolution should be guided (Béjar et al., 2009). There is therefore motivation to improve the complex deployment mechanisms in SDIs, since deploying resources and integrating them by means of standard services requires expertise and time. In this context we present an architecture following the INSPIRE technical guidelines and therefore based on SDI principles. This architecture supports distributed applications and provides components to assist users in deploying and updating SDI resources. Mechanisms and components for the automatic generation and publication of standard geospatial services are therefore proposed. These mechanisms hide the underlying technology and let stakeholders wrap resources as standard services so that the resources can be shared transparently. These components are integrated in our architecture within the Service Framework node (module). [Figure 1: Architecture components diagram] Figure 1 shows the components of the architecture. The Application Node provides the entry point for users to run distributed applications; this software component contains the user interface and the application logic. The Service Connector component provides the ability to connect to the services available in the middleware layer of the SDI and acts as a socket to OGC Web Services. For instance, the WMS component implements the OGC WMS specification, the standard recommended by the INSPIRE implementing rules as the View Service type. The Service Framework node contains several components. Its main functionality is to assist users in wrapping and sharing geospatial resources, and it implements the proposed mechanisms to improve the availability and visibility of geospatial resources. The main components of this framework are the Data Wrapper, the Process Wrapper and the Service Publisher. The Data Wrapper and Process Wrapper components guide users in wrapping data and tools as standard services in accordance with the INSPIRE implementing rules (availability).
    The Service Publisher component creates service metadata and publishes them in catalogues (visibility). Roughly speaking, all of these components act as a service generator and publisher: they take a resource (data or process) and return an INSPIRE service that is then published in catalogue services.
    References:
    Béjar, R., Latre, M.Á., Nogueras-Iso, J., Muro-Medrano, P.R., Zarazaga-Soria, F.J. (2009). International Journal of Geographical Information Science, 23(3), 271-294.
    Bernard, L., Einspanier, U., Lutz, M., Portele, C. (2003). Interoperability in GI Service Chains - The Way Forward. In: M. Gould, R. Laurini & S. Coulondre (Eds.), 6th AGILE Conference on Geographic Information Science, Lyon: 179-188.
    INSPIRE (2007). Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community.
    Masser, I. (2005). GIS Worlds: Creating Spatial Data Infrastructures. Redlands, California: ESRI Press.
    Masser, I., Rajabifard, A., Williamson, I. (2008). Spatially enabling governments through SDI implementation. International Journal of Geographical Information Science, 22(1), 5-20.
    Rajabifard, A., Feeney, M.-E.F., Williamson, I.P. (2002). Future directions for SDI development. International Journal of Applied Earth Observation and Geoinformation, 4, 11-22.
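
    As an illustration of what a client-side Service Connector call looks like in practice, the sketch below builds a plain OGC WMS GetMap request against an INSPIRE-style View Service; the endpoint and layer name are hypothetical placeholders, not components of the described architecture.

      # Minimal sketch of a Service Connector-style call: an OGC WMS GetMap
      # request against an INSPIRE View Service. Endpoint and layer name are
      # hypothetical placeholders.
      import requests

      WMS_URL = "https://example.org/inspire/view"   # hypothetical View Service

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.3.0",
          "REQUEST": "GetMap",
          "LAYERS": "hy:Watercourse",                # hypothetical layer
          "STYLES": "",
          "CRS": "EPSG:4326",
          "BBOX": "39.0,-1.0,40.5,0.5",              # lat/lon order in WMS 1.3.0
          "WIDTH": "800",
          "HEIGHT": "600",
          "FORMAT": "image/png",
      }

      r = requests.get(WMS_URL, params=params, timeout=30)
      with open("map.png", "wb") as f:
          f.write(r.content)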

  17. Sea Level Rise Data Discovery

    NASA Astrophysics Data System (ADS)

    Quach, N.; Huang, T.; Boening, C.; Gill, K. M.

    2016-12-01

    Research related to sea level rise crosses multiple disciplines from sea ice to land hydrology. The NASA Sea Level Change Portal (SLCP) is a one-stop source for current sea level change information and data, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. The architecture behind the SLCP makes it possible to integrate web content and data relevant to sea level change that are archived across various data centers as well as new data generated by sea level change principal investigators. The Extensible Data Gateway Environment (EDGE) is incorporated into the SLCP architecture to provide a unified platform for web content and science data discovery. EDGE is a data integration platform designed to facilitate high-performance geospatial data discovery and access with the ability to support multi-metadata standard specifications. EDGE has the capability to retrieve data from one or more sources and package the resulting sets into a single response to the requestor. With this unified endpoint, the Data Analysis Tool that is available on the SLCP can retrieve dataset and granule level metadata as well as perform geospatial search on the data. This talk focuses on the architecture that makes it possible to seamlessly integrate and enable discovery of disparate data relevant to sea level rise.

  18. SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy; hide

    2010-01-01

    This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), which is an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept that was developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user control over the flight software modules associated with an on-orbit sensor, and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, which include a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will be presented, along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails into efforts to develop "cloud computing" methods and standards.
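
    For readers unfamiliar with WCPS, the sketch below shows how a ground-side client might submit a WCPS query over a WCS 2.0 ProcessCoverages binding (as exposed by servers such as rasdaman); the endpoint, coverage identifier, band names and binding details are assumptions and do not describe the on-orbit implementation.

      # Minimal sketch of submitting an OGC WCPS query from the ground side.
      # Endpoint, coverage identifier and band names are hypothetical, and the
      # KVP binding shown is typical of WCPS front ends, not the flight system.
      import requests

      WCPS_URL = "https://example.org/rasdaman/ows"   # hypothetical endpoint

      # A band-ratio style product expressed in the WCPS query language.
      query = ('for c in (SENSOR_COVERAGE) '
               'return encode((c.nir - c.red) / (c.nir + c.red), "image/tiff")')

      r = requests.get(WCPS_URL,
                       params={"service": "WCS",
                               "version": "2.0.1",
                               "request": "ProcessCoverages",
                               "query": query},
                       timeout=60)
      with open("product.tif", "wb") as f:
          f.write(r.content)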

  19. Data to Decisions: Valuing the Societal Benefit of Geospatial Information

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Kain, D.

    2016-12-01

    The March 10-11, 2016 GEOValue workshop on "Data to Decisions" was aimed at creating a framework for identification and implementation of best practices that capture the societal value of geospatial information for both public and private uses. The end-to-end information flow starts with the earth observation and data acquisition systems, includes the full range of processes from geospatial information to decisions support systems, and concludes with the end user. Case studies, which will be described in this presentation, were identified for a range of applications. The goal was to demonstrate and compare approaches to valuation of geospatial information and forge a path forward for research that leads to standards of practice.

  20. A SOA-based approach to geographical data sharing

    NASA Astrophysics Data System (ADS)

    Li, Zonghua; Peng, Mingjun; Fan, Wei

    2009-10-01

    In the last few years, large volumes of spatial data have become available in different government departments in China, but these data are mainly used within those departments. With the e-government project initiated, spatial data sharing has become more and more necessary. Currently, the Web is used not only for document searching but also for the provision and use of services, known as Web services, which are published in a directory and may be automatically discovered by software agents. Particularly in the spatial domain, the possibility of accessing these large spatial datasets via Web services has motivated research into the new field of Spatial Data Infrastructure (SDI) implemented using service-oriented architecture. In this paper a Service-Oriented Architecture (SOA) based Geographical Information System (GIS) is proposed, and a prototype system based on Open Geospatial Consortium (OGC) standards is deployed in Wuhan, China, so that all authorized departments can access the spatial data within the government intranet and the spatial data can be easily integrated into many kinds of applications.

  1. Streamlining geospatial metadata in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola

    2016-04-01

    In the geospatial realm, data annotation and discovery rely on a number of ad-hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process; nevertheless, they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has so far not induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans and are duplicated on different systems, seldom consistently. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
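
    A minimal sketch of the "semantic lift" in phase iii) is shown below using rdflib: a metadata record is expressed as RDF and linked to a controlled-vocabulary concept and a researcher identifier. All URIs and property choices are illustrative assumptions, not the project's actual mappings.

      # Minimal sketch of a semantic lift: express a metadata record as RDF and
      # link it to LOD resources (a vocabulary concept and a researcher URI).
      # All URIs are hypothetical placeholders.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DCTERMS, RDF

      DCAT = Namespace("http://www.w3.org/ns/dcat#")

      g = Graph()
      dataset = URIRef("https://example.org/metadata/lake-temperature-2015")

      g.add((dataset, RDF.type, DCAT.Dataset))
      g.add((dataset, DCTERMS.title, Literal("Lake surface temperature, 2015")))
      # Keyword lifted from free text to a controlled-vocabulary concept URI.
      g.add((dataset, DCAT.theme, URIRef("https://example.org/vocab/limnology")))
      # Creator linked to a researcher identifier instead of a plain name string.
      g.add((dataset, DCTERMS.creator, URIRef("https://orcid.org/0000-0000-0000-0000")))

      print(g.serialize(format="turtle"))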

  2. The Value of Information - Accounting for a New Geospatial Paradigm

    NASA Astrophysics Data System (ADS)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, information is clearly their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so the issue is not at the top of the agenda for government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allow public institutions to operate and facilitate the provision of essential services for emergency response and national defense. In this respect, geospatial data are strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciate in the same way that physical infrastructure depreciates: they need to be maintained, otherwise their functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  3. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER imagery or glacier outlines from 2002 only, or from autumn in any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
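
    The time-constrained display and "clip-and-ship" download described above ultimately rest on spatio-temporal queries against PostGIS. The sketch below illustrates such a query from Python; the table and column names are guesses for illustration, not the actual GLIMS schema.

      # Minimal sketch of a PostGIS query of the kind behind time-constrained
      # display and clip-and-ship download. Table and column names are
      # hypothetical, not the actual GLIMS schema.
      import psycopg2

      conn = psycopg2.connect(dbname="glims", user="reader", password="secret",
                              host="localhost")
      cur = conn.cursor()

      # Glacier snapshots from 2002 that intersect a lon/lat bounding box.
      cur.execute("""
          SELECT glacier_id, acq_time, ST_AsGML(outline)
          FROM   glacier_snapshots
          WHERE  acq_time BETWEEN %s AND %s
            AND  outline && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
      """, ("2002-01-01", "2002-12-31", 75.0, 35.0, 77.0, 36.5))

      for glacier_id, acq_time, gml in cur.fetchall():
          print(glacier_id, acq_time, len(gml))

      cur.close()
      conn.close()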

  4. Development of Distributed Research Center for monitoring and projecting regional climatic and environmental changes: first results

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Shiklomanov, Alexander; Okladinikov, Igor; Prusevich, Alex; Titov, Alexander

    2016-04-01

    The description and first results of the cooperative project "Development of Distributed Research Center for monitoring and projecting of regional climatic and environmental changes", recently started by SCERT IMCES and ESRC UNH, are reported. The project is aimed at developing a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes over areas of mutual interest, and at demonstrating the benefits of such collaboration, which complements skills and regional knowledge across the northern extratropics. In the framework of the project, innovative approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platforms of two leading U.S. and Russian institutions involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of thematic international virtual research centers focused on interdisciplinary environmental studies by international research teams. The DRC under development will combine the best features and functionality of the information-computational systems RIMS (http://rims.unh.edu) and CLIMATE (http://climate.scert.ru/), developed earlier by the cooperating teams and widely used in Northern Eurasia environmental studies. The project includes several major directions of research (Tasks) listed below. 1. Development of the architecture and definition of the major hardware and software components of the DRC for monitoring and projecting regional environmental changes. 2. Development of an information database and computing software suite for distributed processing and analysis of large geospatial data hosted at ESRC and IMCES SB RAS. 3. Development of a geoportal, thematic web client and web services providing international research teams with access to "cloud" computing resources at the DRC; two options will be implemented: access through a basic graphical web browser and access using geographic information systems (GIS). 4. Using the output of the first three tasks, compilation of the DRC prototype, its validation, and testing of the DRC's feasibility for analyses of recent regional environmental changes over Northern Eurasia and North America. Results of the first stage of the project implementation are presented. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement № 14.613.21.0037.

  5. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    ERIC Educational Resources Information Center

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  6. Crowdsourcing The National Map

    USGS Publications Warehouse

    McCartney, Elizabeth; Craun, Kari J.; Korris, Erin M.; Brostuen, David A.; Moore, Laurence R.

    2015-01-01

    Using crowdsourcing techniques, the US Geological Survey’s (USGS) Volunteered Geographic Information (VGI) project known as “The National Map Corps (TNMCorps)” encourages citizen scientists to collect and edit data about man-made structures in an effort to provide accurate and authoritative map data for the USGS National Geospatial Program’s web-based The National Map. VGI is not new to the USGS, but past efforts have been hampered by available technologies. Building on lessons learned, TNMCorps volunteers are successfully editing 10 different structure types in all 50 states as well as Puerto Rico and the US Virgin Islands.

  7. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that address other domains with spatial properties. We tested the performance of the platform with a taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
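
    The taxi-trajectory test can be illustrated with a plain PySpark job that bins pick-up points into grid cells and counts them. The file path and schema below are hypothetical, and the snippet uses standard Spark SQL rather than GISpark's own APIs.

      # Minimal sketch of a spatiotemporal Spark job in the style of the
      # taxi-trajectory test: count pick-ups per 0.01-degree grid cell.
      # File path and column names are hypothetical.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("taxi-grid-counts").getOrCreate()

      trips = spark.read.csv("hdfs:///data/taxi_trips.csv", header=True,
                             inferSchema=True)  # columns: pickup_lon, pickup_lat, ...

      cells = (trips
               .withColumn("cell_x", F.floor(F.col("pickup_lon") / 0.01))
               .withColumn("cell_y", F.floor(F.col("pickup_lat") / 0.01))
               .groupBy("cell_x", "cell_y")
               .count()
               .orderBy(F.desc("count")))

      cells.show(10)   # the ten busiest grid cells
      spark.stop()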

  8. A Web-based Visualization System for Three Dimensional Geological Model using Open GIS

    NASA Astrophysics Data System (ADS)

    Nemoto, T.; Masumoto, S.; Nonogaki, S.

    2017-12-01

    A three-dimensional geological model is an important source of information in various fields such as environmental assessment, urban planning, resource development, waste management and disaster mitigation. In this study, we have developed a web-based visualization system for 3D geological models using free and open source software. The system has been successfully implemented by integrating the web mapping engine MapServer and the geographic information system GRASS. MapServer plays the role of mapping horizontal cross sections of the 3D geological model and a topographic map. GRASS provides the core components for management, analysis and image processing of the geological model. Online access to GRASS functions has been enabled using PyWPS, an implementation of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard. The system has two main functions. The two-dimensional visualization function allows users to generate horizontal and vertical cross sections of the 3D geological model. These images are delivered via the OGC WMS (Web Map Service) and WPS standards. Horizontal cross sections are overlaid on the topographic map. A vertical cross section is generated by clicking a start point and an end point on the map. The three-dimensional visualization function allows users to visualize geological boundary surfaces and a panel diagram, viewing them from various angles by mouse operation. WebGL is utilized for the 3D visualization; it is a web technology that brings hardware-accelerated 3D graphics to the browser without installing additional software. The geological boundary surfaces can be downloaded so that the geologic structure can be incorporated into CAD designs and models for various simulations. This study was supported by JSPS KAKENHI Grant Number JP16K00158.
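
    For readers unfamiliar with the OGC interfaces used by the system, the sketch below shows a standard WMS GetMap request for a horizontal cross-section image; the endpoint URL, layer name, and extent are hypothetical assumptions, while the parameter set follows the WMS 1.1.1 specification:

      # Hedged sketch: fetch a horizontal cross-section rendered by a WMS server.
      import requests

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "geomodel_cross_section",   # hypothetical layer name
          "SRS": "EPSG:4326",
          "BBOX": "135.0,34.0,136.0,35.0",      # lon/lat extent of interest
          "WIDTH": 800,
          "HEIGHT": 600,
          "FORMAT": "image/png",
          "TRANSPARENT": "TRUE",
      }

      resp = requests.get("https://example.org/cgi-bin/mapserv", params=params, timeout=30)
      resp.raise_for_status()
      with open("cross_section.png", "wb") as f:
          f.write(resp.content)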

  9. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability and to meet the diversifying requirements of the government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS) and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating the restructuring of their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  10. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production: it works with the Neo4j graph database shell to support the assessment of crop condition on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is fully justified for agricultural holdings that have a wide range of crop types and sizes. In addition, the authors tested the option of using Microsoft Cognitive Services technology within the application, enabling image analysis using the services provided. The presented application was designed using ASP.NET MVC technology and a wide range of leading IT tools.
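
    A minimal sketch of how such a Neo4j-backed crop assessment could be queried from Python is shown below; the node labels, relationship type, properties, and connection details are hypothetical and are not taken from the paper:

      # Hedged sketch (hypothetical graph schema): query parcels linked to raster
      # observations in Neo4j using the official Python driver.
      from neo4j import GraphDatabase

      driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

      CYPHER = """
      MATCH (p:Parcel)-[:HAS_OBSERVATION]->(o:RasterObservation)
      WHERE o.ndvi < $threshold
      RETURN p.name AS parcel, o.acquired AS date, o.ndvi AS ndvi
      ORDER BY o.ndvi
      """

      with driver.session() as session:
          # Parcels whose latest NDVI observation suggests poor crop condition
          for record in session.run(CYPHER, threshold=0.4):
              print(record["parcel"], record["date"], record["ndvi"])

      driver.close()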

  11. Rapid Data Delivery System (RDDS)

    USGS Publications Warehouse

    Cress, Jill J.; Goplen, Susan E.

    2007-01-01

    Since the start of the active 2000 summer fire season, the U. S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC) has been actively engaged in providing crucial and timely support to Federal, State, and local natural hazards monitoring, analysis, response, and recovery activities. As part of this support, RMGSC has developed the Rapid Data Delivery System (RDDS) to provide emergency and incident response teams with timely access to geospatial data. The RDDS meets these needs by combining a simple web-enabled data viewer for the selection and preview of vector and raster geospatial data with an easy to use data ordering form. The RDDS viewer also incorporates geospatial locations for current natural hazard incidents, including wildfires, earthquakes, hurricanes, and volcanoes, allowing incident responders to quickly focus on their area of interest for data selection.

  12. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    ERIC Educational Resources Information Center

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…

  13. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on an accurate flow of information and knowledge concerning the operational theatre: provision of resources, environment evolution, targets' location, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process; horizontal and vertical integration of geospatial information is critical for the decision cycle. Information and knowledge management are fundamental to clarify an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data, collected by human or electronic sensors, to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatial, time-referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure as well as the requirements for future software system development.

  14. A novel web informatics approach for automated surveillance of cancer mortality trends

    PubMed Central

    Tourassi, Georgia; Yoon, Hong-Jun; Xu, Songhua

    2016-01-01

    Cancer surveillance data are collected every year in the United States via the National Program of Cancer Registries (NPCR) and the Surveillance, Epidemiology and End Results (SEER) Program of the National Cancer Institute (NCI). General trends are closely monitored to measure the nation's progress against cancer. The objective of this study was to apply a novel web informatics approach for enabling fully automated monitoring of cancer mortality trends. The approach involves automated collection and text mining of online obituaries to derive the age distribution, geospatial, and temporal trends of cancer deaths in the US. Using breast and lung cancer as examples, we mined 23,850 cancer-related and 413,024 general online obituaries spanning the timeframe 2008–2012. There was high correlation between the web-derived mortality trends and the official surveillance statistics reported by NCI with respect to the age distribution (ρ = 0.981 for breast; ρ = 0.994 for lung), the geospatial distribution (ρ = 0.939 for breast; ρ = 0.881 for lung), and the annual rates of cancer deaths (ρ = 0.661 for breast; ρ = 0.839 for lung). Additional experiments investigated the effect of sample size on the consistency of the web-based findings. Overall, our study findings support web informatics as a promising, cost-effective way to dynamically monitor spatiotemporal cancer mortality trends. PMID:27044930
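
    The reported ρ values are correlations between the web-derived and official series; a minimal sketch of how such a comparison can be computed is given below, assuming Spearman's rank correlation and using invented toy counts purely for illustration:

      # Hedged sketch: compare web-derived mortality counts with official
      # statistics using Spearman's rank correlation (the estimator and the toy
      # numbers are assumptions for illustration only).
      from scipy.stats import spearmanr

      # Hypothetical deaths per age bracket: web-mined obituaries vs. official data
      web_derived = [12, 48, 130, 310, 520, 610, 430]
      official    = [10, 52, 121, 330, 505, 640, 410]

      rho, p_value = spearmanr(web_derived, official)
      print(f"Spearman rho = {rho:.3f} (p = {p_value:.3g})")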

  15. Born semantic: linking data from sensors to users and balancing hardware limitations with data standards

    NASA Astrophysics Data System (ADS)

    Buck, Justin; Leadbetter, Adam

    2015-04-01

    New uses of the growing volume of ocean data, such as 'big data' products and operational data assimilation/ingestion, require data to be readily ingestible. This can be achieved via the application of World Wide Web Consortium (W3C) Linked Data and Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) standards to data management. As part of several Horizon 2020 European projects (SenseOCEAN, ODIP, AtlantOS), the British Oceanographic Data Centre (BODC) is working on combining existing data centre architecture and SWE software, such as Sensor Observation Services, with a Linked Data front end. The standards to enable data delivery are proven and well documented [1,2]. There are practical difficulties when SWE standards are applied to real-time data because of internal hardware bandwidth restrictions and a requirement to constrain data transmission costs. A pragmatic approach is proposed in which sensor metadata and data output in OGC standards are implemented "shore-side", with sensors and instruments transmitting unique resolvable web linkages to persistent OGC SensorML records published at the BODC. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 8th October 2014. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
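
    A minimal sketch of the proposed pattern, in which a transmitted web linkage is resolved to its published SensorML record over plain HTTP, is given below; the sensor URI and the Accept header value are assumptions, not BODC specifics:

      # Hedged sketch: resolve a hypothetical persistent sensor link and retrieve
      # the SensorML record published by the data centre.
      import requests

      sensor_uri = "https://linkedsystems.example.org/sensors/SBE37-12345"  # hypothetical ID
      headers = {"Accept": "application/xml"}  # request the XML (SensorML) representation

      resp = requests.get(sensor_uri, headers=headers, timeout=30)
      resp.raise_for_status()
      print(resp.headers.get("Content-Type"))
      print(resp.text[:500])  # first part of the returned SensorML document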

  16. A novel web informatics approach for automated surveillance of cancer mortality trends

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tourassi, Georgia; Yoon, Hong-Jun; Xu, Songhua

    Cancer surveillance data are collected every year in the United States via the National Program of Cancer Registries (NPCR) and the Surveillance, Epidemiology and End Results (SEER) Program of the National Cancer Institute (NCI). General trends are closely monitored to measure the nation’s progress against cancer. The objective of this study was to apply a novel web informatics approach for enabling fully automated monitoring of cancer mortality trends. The approach involves automated collection and text mining of online obituaries to derive the age distribution, geospatial, and temporal trends of cancer deaths in the US. Using breast and lung cancer as examples, we mined 23,850 cancer-related and 413,024 general online obituaries spanning the timeframe 2008–2012. There was high correlation between the web-derived mortality trends and the official surveillance statistics reported by NCI with respect to the age distribution (ρ = 0.981 for breast; ρ = 0.994 for lung), the geospatial distribution (ρ = 0.939 for breast; ρ = 0.881 for lung), and the annual rates of cancer deaths (ρ = 0.661 for breast; ρ = 0.839 for lung). Additional experiments investigated the effect of sample size on the consistency of the web-based findings. Altogether, our study findings support web informatics as a promising, cost-effective way to dynamically monitor spatiotemporal cancer mortality trends.

  17. A novel web informatics approach for automated surveillance of cancer mortality trends

    DOE PAGES

    Tourassi, Georgia; Yoon, Hong-Jun; Xu, Songhua

    2016-04-01

    Cancer surveillance data are collected every year in the United States via the National Program of Cancer Registries (NPCR) and the Surveillance, Epidemiology and End Results (SEER) Program of the National Cancer Institute (NCI). General trends are closely monitored to measure the nation’s progress against cancer. The objective of this study was to apply a novel web informatics approach for enabling fully automated monitoring of cancer mortality trends. The approach involves automated collection and text mining of online obituaries to derive the age distribution, geospatial, and temporal trends of cancer deaths in the US. Using breast and lung cancer as examples, we mined 23,850 cancer-related and 413,024 general online obituaries spanning the timeframe 2008–2012. There was high correlation between the web-derived mortality trends and the official surveillance statistics reported by NCI with respect to the age distribution (ρ = 0.981 for breast; ρ = 0.994 for lung), the geospatial distribution (ρ = 0.939 for breast; ρ = 0.881 for lung), and the annual rates of cancer deaths (ρ = 0.661 for breast; ρ = 0.839 for lung). Additional experiments investigated the effect of sample size on the consistency of the web-based findings. Altogether, our study findings support web informatics as a promising, cost-effective way to dynamically monitor spatiotemporal cancer mortality trends.

  18. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    NASA Astrophysics Data System (ADS)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly, and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competencies should be included in a specific course and the depth of instruction for each competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curriculum integrating geospatial science and technology into geoscience programs.

  19. The Inter-American Geospatial Data Network— developing a Western Hemisphere geospatial data clearinghouse

    USGS Publications Warehouse

    Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert

    1998-01-01

    The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.

  20. One map policy (OMP) implementation strategy to accelerate mapping of regional spatial planning (RTRW) in Indonesia

    NASA Astrophysics Data System (ADS)

    Hasyim, Fuad; Subagio, Habib; Darmawan, Mulyanto

    2016-06-01

    The preparation of spatial planning documents requires basic geospatial information and accurate thematic data. These issues have recently become important because spatial planning maps are an integral attachment of the draft regional regulation on spatial planning (PERDA). The geospatial information needed in the preparation of spatial planning maps can be divided into two major groups: (i) basic geospatial information (IGD), consisting of Indonesian topographic maps (RBI), coastal and marine environmental maps (LPI), and the geodetic control network, and (ii) thematic geospatial information (IGT). Currently, most local governments in Indonesia have not finished their draft regulations on spatial planning due to several constraints, including technical aspects. Some constraints in spatial planning mapping are as follows: the availability of large-scale basic geospatial information, the availability of mapping guidelines, and human resources. The ideal conditions to be achieved for spatial planning maps are: (i) the availability of updated geospatial information in accordance with the scale needed for spatial planning maps, (ii) mapping guidelines for spatial planning to support local governments in completing their PERDA, and (iii) capacity building of local government human resources to complete spatial planning maps. The OMP strategies formulated to achieve these conditions are: (i) accelerating IGD at scales of 1:50,000, 1:25,000 and 1:5,000, (ii) accelerating the mapping and integration of thematic geospatial information (IGT) through stocktaking of availability and mapping guidelines, (iii) developing mapping guidelines and disseminating spatial utilization, and (iv) training human resources in mapping technology.

  1. Tribal-Focused Environmental Risk and Sustainability Tool (Tribal-FERST) Fact Sheet

    EPA Pesticide Factsheets

    The Tribal-Focused Environmental Risk and Sustainability Tool (Tribal-FERST) is a web-based geospatial decision support tool that will provide tribes with easy access to the best available human health and ecological science.

  2. Mapping a Difference: The Power of Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  3. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model-sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support the sharing and integration of geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
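
    As a hedged illustration of invoking a model published through WPS (the endpoint, process identifier, and inputs below are hypothetical and are not the authors' wetland models), a WPS 1.0.0 key-value-pair Execute request could look like this:

      # Hedged sketch: call a hypothetical geospatial model exposed as a WPS process.
      import requests

      wps_url = "https://example.org/wps"  # hypothetical model-sharing endpoint
      params = {
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "WetlandWaterBalance",                       # hypothetical process id
          "datainputs": "startDate=2008-01-01;endDate=2008-12-31;basinId=PPR-042",
      }

      resp = requests.get(wps_url, params=params, timeout=60)
      resp.raise_for_status()
      print(resp.text[:800])  # WPS ExecuteResponse XML with status and outputs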

  4. Integrated Geospatial Education and Technology Training (iGETT) for Workforce Development

    NASA Astrophysics Data System (ADS)

    Allen, J. E.; Johnson, A.; Headley, R. K.

    2009-12-01

    The increasing availability of no-cost remote sensing data and improvements in analysis software have presented an unprecedented opportunity for the integration of geospatial technologies into a wide variety of disciplines for learning and teaching at community colleges and Tribal colleges. These technologies magnify the effectiveness of problem solving in agriculture, disaster management, environmental sciences, urbanization monitoring, and multiple other domains for societal benefit. This session will demonstrate the approach and lessons learned by federal and private industry partners leading a professional development program, “Integrated Geospatial Education and Technology Training” (iGETT; http://igett.delmar.edu), 2007-2010. iGETT is funded by the National Science Foundation’s Advanced Technological Education Program, (NSF DUE 0703185). 40 participants were selected from a nationwide pool and received training in how to understand, identify, download, and integrate federal land remote sensing data into existing Geographic Information Systems programs to address specific issues of concern to the local workforce. Each participant has authored a “Learning Unit” that covers at least two weeks of class time. All training resources and Learning Units are publicly available on the iGETT Web site. A follow-on project is under consideration to develop core competencies for the remote sensing technician. Authors: Jeannie Allen, Sigma Space Corp. for NASA Landsat, at Goddard Space Flight Center; Ann Johnson, ESRI Higher Education; Rachel Headley, USGS EROS Land Remote Sensing Program

  5. WEB-GIS Decision Support System for CO2 storage

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela

    2013-04-01

    The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess and disseminate types of information that are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and rich with options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly web-based application that uses GIS capabilities. The MDSS can be exploited by experts for CO2 injection and storage in deep saline aquifers. The main objective of the MDSS is to help experts make decisions based on large volumes of structured data and information. In order to achieve this objective, the MDSS has a geospatial object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) Open source - the entire platform (MDSS) is based on open-source technologies: (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, (5) add-ons, etc. (ii) Multiple database connections - the MDSS is capable of connecting to different databases located on different server machines. (iii) Desktop user experience - the MDSS architecture and design follow the structure of desktop software. (iv) Communication - the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify and download data within the application. The architecture of the system involves one database and a modular application composed of: (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The database component is built using the PostgreSQL and PostGIS open source technologies. The visualization module allows the user to view data from CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, (3) 3D visualization. The analysis module allows the user to perform analyses such as injectivity, containment and capacity analysis. The risk assessment module focuses on the site risk matrix approach. The guidelines module contains the methodological guidelines for CO2 injection and storage into deep saline aquifers.
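
    To illustrate the kind of question such a geospatial object-oriented database could answer, a hedged sketch of a PostGIS query run through psycopg2 is given below; the table names, columns, and connection parameters are hypothetical and do not describe the MDSS schema:

      # Hedged sketch (hypothetical schema): wells within 10 km of an injection site.
      import psycopg2

      conn = psycopg2.connect(dbname="mdss", user="mdss_user",
                              password="secret", host="localhost")
      cur = conn.cursor()

      cur.execute("""
          SELECT w.well_id,
                 ST_Distance(w.geom::geography, s.geom::geography) AS dist_m
          FROM wells AS w, injection_sites AS s
          WHERE s.site_id = %s
            AND ST_DWithin(w.geom::geography, s.geom::geography, 10000)
          ORDER BY dist_m;
      """, ("SITE-01",))

      for well_id, dist_m in cur.fetchall():
          print(well_id, round(dist_m))

      cur.close()
      conn.close()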

  6. WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’

    NASA Astrophysics Data System (ADS)

    Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.

    2009-12-01

    The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well integrated into large-scale earth system analyses. A major hurdle is the lack of accessible geospatial data on peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. These data, however, are available at the local scale. Although a comprehensive global database on peatlands probably lags behind similar data on more economically important ecosystems such as forests, grasslands, and croplands, a large amount of field data has been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but the existing data have not been assembled into a single geospatial database that is publicly accessible or that depicts the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art webserver and innovative online mapping technologies and is designed to create such a global database through ‘crowd-sourcing’. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial input of peatland areas via a mapping interface, database editing and querying capabilities, as well as advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets, and it facilitates and encourages research community involvement. A successful effort will make existing peatland data much more useful to the research community, and will help to identify significant data gaps.

  7. Soil Monitor: an open source web application for real-time soil sealing monitoring and assessment

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio

    2016-04-01

    Soil sealing is one of the most important causes of land degradation and desertification. In Europe, the soil area covered by impermeable materials has increased by about 80% since the Second World War, while the population has grown by only one third. There is increasing concern at high political levels about the need to attenuate imperviousness itself and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) targeting zero net land take by 2050. Furthermore, the European Commission also published a report in 2011 providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed an open-source-based Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) named "Soil Monitor". This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion in order to provide real-time assessments of soil sealing at high spatial resolution (20 meters and below) over the whole of Italy. Common open source webGIS packages, such as GeoServer and MapStore, are used to implement both the data management and visualization infrastructures. High-speed geospatial computation is ensured by GPU parallelism using the CUDA (Compute Unified Device Architecture) framework by NVIDIA®. This kind of parallelism required writing, from scratch, all the code needed for the geospatial computation behind the soil sealing toolbox. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention at the Java-CUDA programming interface. As a result, Soil Monitor can perform very time-consuming calculations (for instance, querying an entire Italian administrative region as the area of interest) in less than one minute. The web application runs in a web browser and nothing must be installed before using it. Potentially everybody can use it, but the main targets are the stakeholders dealing with sealing, such as policy makers, land owners and asphalt/cement companies. As a matter of fact, Soil Monitor can be used to improve spatial planning, thereby limiting the progression of disordered soil sealing, which causes both the direct loss of soils due to imperviousness and the indirect loss caused by fragmentation of soils (which has various negative effects on the durability of soil functions, such as habitat corridors). Furthermore, in a future version, Soil Monitor would estimate the best location for a new building or help compensate for soil losses through actions in other areas to offset the drawbacks. The presented SS-GCI dealing with soil sealing, if appropriately scaled, would aid the implementation of best practices for limiting soil sealing or mitigating its effects on soil functions.

  8. Urban networks among Chinese cities along "the Belt and Road": A case of web search activity in cyberspace.

    PubMed

    Zhang, Lu; Du, Hongru; Zhao, Yannan; Wu, Rongwei; Zhang, Xiaolei

    2017-01-01

    "The Belt and Road" initiative has been expected to facilitate interactions among numerous city centers. This initiative would generate a number of centers, both economic and political, which would facilitate greater interaction. To explore how information flows are merged and the specific opportunities that may be offered, Chinese cities along "the Belt and Road" are selected for a case study. Furthermore, urban networks in cyberspace have been characterized by their infrastructure orientation, which implies that there is a relative dearth of studies focusing on the investigation of urban hierarchies by capturing information flows between Chinese cities along "the Belt and Road". This paper employs Baidu, the main web search engine in China, to examine urban hierarchies. The results show that urban networks become more balanced, shifting from a polycentric to a homogenized pattern. Furthermore, cities in networks tend to have both a hierarchical system and a spatial concentration primarily in regions such as Beijing-Tianjin-Hebei, Yangtze River Delta and the Pearl River Delta region. Urban hierarchy based on web search activity does not follow the existing hierarchical system based on geospatial and economic development in all cases. Moreover, urban networks, under the framework of "the Belt and Road", show several significant corridors and more opportunities for more cities, particularly western cities. Furthermore, factors that may influence web search activity are explored. The results show that web search activity is significantly influenced by the economic gap, geographical proximity and administrative rank of the city.

  9. Real-time notification and improved situational awareness in fire emergencies using geospatial-based publish/subscribe

    NASA Astrophysics Data System (ADS)

    Kassab, Ala'; Liang, Steve; Gao, Yang

    2010-12-01

    Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and a large number of clients having heterogeneous and specific interests in data on the other side. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, and pulling mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/ Subscribe, an interaction framework that serves as a middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations of wireless sensors, fire incidents reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
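
    A simplified illustration of geospatial subscription matching, reduced to a topic plus bounding-box test (the paper's actual matching approach is more elaborate and is not reproduced here), might look like the following:

      # Simplified illustration: dispatch incoming geospatial events to clients
      # whose subscriptions match by topic and bounding box of interest.
      from dataclasses import dataclass

      @dataclass
      class Subscription:
          client_id: str
          topic: str
          bbox: tuple  # (min_lon, min_lat, max_lon, max_lat)

      @dataclass
      class GeoEvent:
          topic: str
          lon: float
          lat: float
          payload: dict

      def matches(sub: Subscription, event: GeoEvent) -> bool:
          min_lon, min_lat, max_lon, max_lat = sub.bbox
          return (sub.topic == event.topic
                  and min_lon <= event.lon <= max_lon
                  and min_lat <= event.lat <= max_lat)

      subs = [
          Subscription("fire-truck-7", "fire_incident", (-114.3, 50.8, -113.8, 51.2)),
          Subscription("ops-centre",   "sensor_obs",    (-115.0, 50.0, -113.0, 52.0)),
      ]

      event = GeoEvent("fire_incident", -114.05, 51.02, {"severity": "high"})

      for sub in subs:
          if matches(sub, event):
              print(f"notify {sub.client_id}: {event.payload}")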

  10. Integrating geospatial data and cropping system simulation within a geographic information system to analyze spatial seed cotton yield, water use, and irrigation requirements

    USDA-ARS?s Scientific Manuscript database

    The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...

  11. A discussion for integrating INSPIRE with volunteered geographic information (VGI) and the vision for a global spatial-based platform

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris; Campagna, Michele; Racetin, Ivana; Konecny, Milan

    2017-09-01

    INSPIRE is the EU's authoritative Spatial Data Infrastructure (SDI), in which each Member State provides access to its spatial data across a wide spectrum of data themes to support policy making. In contrast, Volunteered Geographic Information (VGI) is a type of user-generated geographic information in which volunteers use the web and mobile devices to create, assemble and disseminate spatial information. There are similarities and differences between SDIs and VGI initiatives, as well as advantages and disadvantages. Thus, the integration of these two data sources would enhance what is offered to end users, helping decision makers and the wider community solve complex spatial problems, manage emergency situations and obtain useful information for people's daily activities. Although some efforts in this direction have arisen, several key issues need to be considered and resolved. Beyond this integration, the vision is the development of a global integrated GIS platform, which extends the capabilities of a typical data hub by embedding online spatial and non-spatial applications, to deliver both static and dynamic outputs to support planning and decision making. In this context, this paper discusses the challenges of integrating INSPIRE with VGI and outlines a generic framework for creating a global integrated web-based GIS platform. The tremendously rapid evolution of the Web and geospatial technologies suggests that this "super" global geo-system is not far away.

  12. Geospatial Information is the Cornerstone of Effective Hazards Response

    USGS Publications Warehouse

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.

  13. An Optimized Autonomous Space In-situ Sensorweb (OASIS) for Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Song, W.; Shirazi, B.; Lahusen, R.; Chien, S.; Kedar, S.; Webb, F.

    2006-12-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, we are developing a prototype real-time Optimized Autonomous Space In-situ Sensorweb. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been in continuous eruption since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of triggering the other. Sensor-web data acquisition and dissemination will be accomplished through the use of the SensorML language standard for geospatial information. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform.

  14. A Semantic Approach for Geospatial Information Extraction from Unstructured Documents

    NASA Astrophysics Data System (ADS)

    Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre

    Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
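
    A minimal sketch of the intersection test between a geocoded query footprint and geocoded index entries is shown below, using Shapely bounding boxes as a stand-in for the prototype's formal representations (an assumption for illustration, not the PIV implementation):

      # Hedged sketch: test which indexed document footprints intersect the query.
      from shapely.geometry import box

      query_footprint = box(-0.60, 43.25, -0.30, 43.40)   # bbox derived from the user query

      # Hypothetical geocoded footprints extracted from the document collection index
      document_index = {
          "doc_014": box(-0.55, 43.28, -0.45, 43.35),
          "doc_027": box(-1.20, 43.60, -1.00, 43.80),
      }

      for doc_id, footprint in document_index.items():
          if query_footprint.intersects(footprint):
              print(f"{doc_id} matches the geographic criterion")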

  15. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]
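
    A minimal sketch using the public Earth Engine Python API is given below; the collection identifier, region, and date range are assumptions chosen for illustration and do not reproduce the workflows of the cited studies:

      # Hedged sketch with the public Earth Engine Python API; assumes the user has
      # already authenticated (ee.Authenticate()) and has Earth Engine access.
      import ee

      ee.Initialize()

      # Hypothetical region of interest (San Francisco area) and one-year window.
      region = ee.Geometry.Rectangle([-122.6, 37.6, -122.3, 37.9])
      collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                    .filterBounds(region)
                    .filterDate("2016-01-01", "2017-01-01"))

      print("Scenes found:", collection.size().getInfo())

      # A simple median composite clipped to the region, e.g. for display or export.
      median = collection.median().clip(region)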

  16. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Kofoed, K. B.; Copenhaver, W.; Laney, C. M.; Gaylord, A. G.; Collins, J. A.; Tweedie, C. E.

    2014-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Recent advances include the addition of more than 2,000 new research sites, provision of differential global positioning system (dGPS) and Unmanned Aerial Vehicle (UAV) support to visiting scientists, surveying of over 80 miles of coastline to document rates of erosion, training of local GIS personnel to make better use of science in local decision making, deployment of and near-real-time connectivity to a wireless micrometeorological sensor network, links to Barrow area datasets housed at national data archives, and substantial upgrades to the BAID website and web mapping applications.

  17. Using Sensor Web Processes and Protocols to Assimilate Satellite Data into a Forecast Model

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire

    2008-01-01

    The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounder (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecasting (WRF) model over the southeastern United States.

  18. Design and development of linked data from the National Map

    USGS Publications Warehouse

    Usery, E. Lynn; Varanka, Dalia E.

    2012-01-01

    The development of linked data on the World Wide Web provides the opportunity for the U.S. Geological Survey (USGS) to supply its extensive volumes of geospatial data, information, and knowledge in a machine-interpretable form and to reach users and applications that heretofore have been unavailable. To pilot a process that takes advantage of this opportunity, the USGS is developing an ontology for The National Map and converting selected data from nine research test areas to a Semantic Web format to support machine processing and linked data access. In a case study, the USGS has developed initial methods for accessing legacy vector- and raster-formatted geometry, attributes, and spatial relationships in a linked data environment while maintaining the capability to generate graphic or image output from semantic queries. A description of this initial USGS approach to developing the ontology, linked data, and initial query capability from The National Map databases is presented.
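
    A minimal sketch of querying such linked data with SPARQL is given below; the local file name, namespace, and class/property names are hypothetical and do not reflect the actual USGS ontology under development:

      # Hedged sketch: load a hypothetical Turtle extract and run a SPARQL query.
      from rdflib import Graph

      g = Graph()
      g.parse("national_map_sample.ttl", format="turtle")  # hypothetical local file

      query = """
      # Hypothetical namespace, class, and property names for illustration only
      PREFIX tnm: <http://data.usgs.gov/lod/tnm/>
      SELECT ?feature ?name
      WHERE {
        ?feature a tnm:Stream ;
                 tnm:featureName ?name .
      }
      LIMIT 10
      """

      for feature, name in g.query(query):
          print(feature, name)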

  19. The GeoCitizen-approach: community-based spatial planning – an Ecuadorian case study

    PubMed Central

    Atzmanstorfer, Karl; Resl, Richard; Eitzinger, Anton; Izurieta, Xiomara

    2014-01-01

    Over recent years, geospatial web platforms, social media, and volunteered geographic information (VGI) have opened a window of opportunity for traditional Public Participatory GIS (PPGIS) to usher in a new era. Taking advantage of these technological achievements, this paper presents a new approach to a citizen-orientated framework for spatial planning that aims at integrating participatory community work into existing decision-making structures. One major cornerstone of the presented approach is the application of a social geoweb platform (the GeoCitizen platform) that combines geo-web technologies and social media in a single tool, allowing citizens to collaboratively report observations, discuss ideas, and solve and monitor problems in their living environment at the local level. This paper gives an account of an ongoing participatory land-zoning process in the Capital District of Quito, Ecuador, where the GeoCitizen platform is applied in a long-term study. PMID:27019644

  20. Proposal for a Web Encoding Service (WES) for Spatial Data Transaction

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    The use of web services in Spatial Data Infrastructures (SDI) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of the City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). This integration could be extended to other available web services for the efficient handling of spatial data, especially 3D spatial data.

  1. The Hazards Data Distribution System update

    USGS Publications Warehouse

    Jones, Brenda K.; Lamb, Rynn M.

    2010-01-01

    After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/

  2. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper addresses easily accessible, integrated web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox for accessing satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are often a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and remote sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping tools, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-)services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been carried out since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination service provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins that convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on his or her machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of plug-ins and outline our plans to implement other OGC standards, such as WCS and WPS, in the same context. The latter, especially, can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even execution in scalable and on-demand cloud computing environments.

  3. NHDPlusHR: A national geospatial framework for surface-water information

    USGS Publications Warehouse

    Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.

    2016-01-01

    The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.

  4. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded by rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access, an innovative approach, based on standard off-the-shelf techniques, to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. The solution is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.

  5. Design & implementation of distributed spatial computing node based on WPS

    NASA Astrophysics Data System (ADS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper presents a systematic study of the key technologies needed to construct a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is then designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and verified in this environment.
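
    To illustrate the client side of such a node, the sketch below lists and describes the processes advertised by a WPS server using OWSLib. The endpoint URL and the process identifier are assumptions for illustration, not details from the paper:

        from owslib.wps import WebProcessingService  # pip install OWSLib

        # Hypothetical WPS endpoint exposed by a Spatial Computing Node.
        wps = WebProcessingService("https://example.org/wps")

        # Discover the spatial computing processes the node advertises.
        for process in wps.processes:
            print(process.identifier, "-", process.title)

        # Inspect the inputs of one advertised process (identifier assumed).
        description = wps.describeprocess("buffer")
        for wps_input in description.dataInputs:
            print("input:", wps_input.identifier, wps_input.dataType)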

  6. Forest Fire Management: A Comprehensive And Operational Approach

    NASA Astrophysics Data System (ADS)

    Fabrizi, Roberto; Perez, Bruno; Gomez, Antonio

    2013-12-01

    Remote sensing plays an important role in obtaining rapid and complete information on the occurrence and evolution of forest fires in space and time. In this paper, we present a comprehensive study of fire events through Earth Observation data for early warning, crisis monitoring, and post-event damage assessment or synthesis of the fire event, across a wide range of spatial (local to regional) and temporal (short- to long-term) scales. The fire products are stored and distributed by means of a WebGIS and a Geoportal together with auxiliary geospatial data. These products allow fire managers to perform analysis and decision making in a more comprehensive manner.

  7. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    PubMed Central

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption have now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265

  8. A conceptual model of the automated credibility assessment of the volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) for collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to support the development of Digital Earth. The main barriers to the use of these data in supporting this bottom-up approach are the credibility (trust), completeness, accuracy, and quality of both the data inputs and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches for automatically assessing the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed for assessment in the conceptual model - metadata and data. The metadata component comprises indicators for the hosting websites and the sources of the data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess both components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by citizen web providers.
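
    As a concrete illustration of the proposed metadata assessment, the following sketch trains a supervised text classifier on hand-labelled descriptions of hosting websites. The training texts, labels, and model choice are invented for illustration and are not taken from the paper:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy, hand-labelled snippets standing in for text scraped from
        # hosting websites (1 = likely credible host, 0 = questionable host).
        texts = [
            "official national mapping agency open data portal with metadata",
            "community mapping project moderated by local surveyors",
            "anonymous blog with unreferenced maps and no contact details",
            "link farm reposting map screenshots without sources",
        ]
        labels = [1, 1, 0, 0]

        # TF-IDF features feeding a logistic-regression classifier.
        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        model.fit(texts, labels)

        new_site = ["volunteer mapping portal run by the city GIS department"]
        print(model.predict_proba(new_site))  # column 1 = probability of 'credible'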

  9. Developing a WebGIS for Geo-Visualization of Cancer.

    PubMed

    Khoshabi, Mostafa; Taleai, Mohammad; Motlagh, Ali; Hosseini Kamal, Farnaz

    2016-04-01

    In relation to hygiene facilities and the sharing of disease data, considerable attempts have been made by various media to promote public awareness; however, most of the information provided is based on numerical and verbal statistics and may not give people a suitable understanding of the disease situation. The main aim of this study is to design an interactive WebGIS system in which people can simply produce and view their preferred maps of different cancers and environmental parameters. They can use these tools to produce personalized maps and explore various aspects of cancer. A system has been developed using WebGIS so that ordinary users without any knowledge of geospatial information systems (GIS) can view the situation of the diseases and environmental conditions as static and user-produced interactive maps. It also provides the possibility of spatial comparison of arbitrary parameters in the form of bar and pie charts. The system has been designed and launched on the cancer database of Iran, where information from meteorological stations has been embedded as environmental parameters. The innovative idea in this study, which has received less attention in previous works, is the possibility of producing web-based choropleth maps, in which users can easily select the parameters, classification algorithms and interactive coloring to produce their personalized maps. Developing WebGIS tools and increasing public participation, for example by inserting spatial labels on the map to report a disease or contributing views about the causes of a specific cancer in a specific region, may turn the mono-directional flow of information to users into a bi-directional flow. As a result, cancer specialists could use the knowledge of local people and residents of different regions of the country to better analyze the situation of various kinds of cancers.
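
    The server-side step behind a user-produced choropleth map boils down to classifying attribute values and assigning colours. A minimal sketch, with invented incidence values, an equal-interval classifier, and an arbitrary colour palette standing in for the classification and colouring options the system offers:

        # Classify an invented cancer-incidence value per province and map each
        # class to a colour; equal-interval breaks stand in for the user's choice.
        incidence = {"Tehran": 41.2, "Fars": 28.7, "Gilan": 55.9, "Yazd": 33.1}

        def equal_interval_breaks(values, n_classes):
            lo, hi = min(values), max(values)
            step = (hi - lo) / n_classes
            return [lo + step * i for i in range(1, n_classes + 1)]

        def classify(value, breaks):
            for i, upper in enumerate(breaks):
                if value <= upper:
                    return i
            return len(breaks) - 1

        palette = ["#fee5d9", "#fcae91", "#fb6a4a", "#cb181d"]  # light to dark red
        breaks = equal_interval_breaks(list(incidence.values()), len(palette))
        styled = {name: palette[classify(v, breaks)] for name, v in incidence.items()}
        print(styled)  # province name -> fill colour for the choropleth layer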

  10. Caching strategies for improving performance of web-based Geographic applications

    NASA Astrophysics Data System (ADS)

    Liu, M.; Brodzik, M.; Collins, J. A.; Lewis, S.; Oldenburg, J.

    2012-12-01

    The NASA Operation IceBridge mission collects airborne remote sensing measurements to bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) mission and the upcoming ICESat-2 mission. The IceBridge Data Portal from the National Snow and Ice Data Center provides an intuitive web interface for accessing IceBridge mission observations and measurements. Scientists and users usually do not have knowledge about the individual campaigns but are interested in data collected in a specific place. We have developed a high-performance map interface to allow users to quickly zoom to an area of interest and see any Operation IceBridge overflights. The map interface consists of two layers: the user can pan and zoom on the base map layer; the flight line layer that overlays the base layer provides all the campaign missions that intersect with the current map view. The user can click on the flight campaigns and download the data as needed. The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. Web Feature Service (WFS) provides an interface allowing requests for geographical features across the web using platform-independent calls. OpenLayers provides vector support (points, polylines and polygons) to build a WMS/WFS client for displaying both layers on the screen. MapServer, an open source development environment for building spatially enabled internet applications, serves the WMS and WFS spatial data to OpenLayers. Early releases of the portal displayed unacceptably poor load time performance for flight lines and the base map tiles. This issue was caused by long response times from the map server in generating all map tiles and flight line vectors. We resolved the issue by implementing various caching strategies on top of the WMS and WFS services, including the use of Squid (www.squid-cache.org) to cache frequently used content. Our presentation includes the architectural design of the application, and how we use OpenLayers, WMS and WFS with Squid to build a responsive web application capable of efficiently displaying geospatial data to allow the user to quickly interact with the displayed information. We describe the design, implementation and performance improvement of our caching strategies, and the tools and techniques developed to support our data caching strategies.
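
    The caching idea can be illustrated independently of Squid: serve a repeated WMS GetMap request from a local store instead of hitting MapServer again. The sketch below is an in-process stand-in for the reverse-proxy cache described above; the endpoint and layer name are hypothetical:

        import hashlib
        import os

        import requests  # pip install requests

        CACHE_DIR = "tile_cache"
        WMS_URL = "https://example.org/mapserver/wms"  # hypothetical endpoint

        def get_map(params):
            """Return WMS GetMap bytes, serving repeat requests from disk."""
            key = hashlib.sha1(repr(sorted(params.items())).encode()).hexdigest()
            path = os.path.join(CACHE_DIR, key + ".png")
            if os.path.exists(path):  # cache hit: no round trip to MapServer
                with open(path, "rb") as f:
                    return f.read()
            resp = requests.get(WMS_URL, params=params, timeout=30)  # cache miss
            resp.raise_for_status()
            os.makedirs(CACHE_DIR, exist_ok=True)
            with open(path, "wb") as f:
                f.write(resp.content)
            return resp.content

        tile = get_map({
            "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
            "LAYERS": "icebridge_flightlines", "SRS": "EPSG:4326",
            "BBOX": "-60,-75,-30,-60", "WIDTH": "256", "HEIGHT": "256",
            "FORMAT": "image/png",
        })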

  11. Sensor Webs and Virtual Globes: Enabling Understanding of Changes in a partially Glaciated Watershed

    NASA Astrophysics Data System (ADS)

    Heavner, M.; Fatland, D. R.; Habermann, M.; Berner, L.; Hood, E.; Connor, C.; Galbraith, J.; Knuth, E.; O'Brien, W.

    2008-12-01

    The University of Alaska Southeast is currently implementing a sensor web identified as the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research (SEAMONSTER). SEAMONSTER is operating in the partially glaciated Mendenhall and Lemon Creek Watersheds, in the Juneau area, on the margins of the Juneau Icefield. These watersheds are studied for both (1) long-term monitoring of changes and (2) detection and analysis of transient events (such as glacier lake outburst floods). The heterogeneous sensors (meteorological, dual-frequency GPS, water quality, lake level, etc.), power and bandwidth constraints, and competing time scales of interest require autonomous reactivity of the sensor web. They also present challenges for operational management of the sensor web. The harsh conditions on the glaciers provide additional operating constraints. The tight integration of the sensor web and virtual globe enabling technology enhances the project in multiple ways. We are utilizing virtual globe infrastructures to enhance both sensor web management and data access. SEAMONSTER utilizes virtual globes for education and public outreach, sensor web management, data dissemination, and enabling collaboration. Using a PostgreSQL database with GIS extensions coupled to the Open Geospatial Consortium (OGC) compliant GeoServer, we generate near-real-time, auto-updating geobrowser files of the data in multiple OGC standard formats (e.g., KML, WCS). Additionally, embedding wiki pages in this database allows the development of a geospatially aware wiki describing the projects for better public outreach and education. In this presentation we will describe how we have implemented these technologies to date, the lessons learned, and our efforts towards greater OGC standard implementation. A major focus will be on demonstrating how geobrowsers and virtual globes have made this project possible.
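
    The auto-updating geobrowser files amount to serializing current sensor records from the spatial database into KML. A minimal sketch of that step, with hypothetical connection details, table, and column names (the project itself serves such output through GeoServer):

        import psycopg2  # pip install psycopg2-binary

        # Connection details, table and column names are hypothetical.
        conn = psycopg2.connect("dbname=seamonster user=sensorweb")
        cur = conn.cursor()

        # PostGIS serializes geometries directly to KML fragments.
        cur.execute("""
            SELECT station_name, ST_AsKML(geom), last_lake_level_m
            FROM lake_level_stations
        """)

        placemarks = []
        for name, kml_geom, level in cur.fetchall():
            placemarks.append(
                f"<Placemark><name>{name}</name>"
                f"<description>lake level: {level} m</description>{kml_geom}</Placemark>"
            )

        kml = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            + "".join(placemarks) + "</Document></kml>"
        )
        with open("stations.kml", "w") as f:
            f.write(kml)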

  12. Disaster relief through composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Hyde, Brian; Carpenter, Tom; Nichols, Steve

    2012-06-01

    A composite signature is a group of signatures that are related in such a way as to define a target or operational endeavor more completely, at higher fidelity. This paper builds on previous work developing innovative composite signatures associated with civil disasters, including physical, chemical and pattern/behavioral signatures. For the composite signature approach to be successful it requires effective data fusion and visualization, which play a key role both in preparedness and in the response and recovery efforts that are critical to saving lives. Visualization tools enhance the overall understanding of the crisis by pulling together and analyzing the data, and providing a clear and complete analysis of the information to the organizations/agencies dependent on it for a successful operation. One example, Freedom Web, is an easy-to-use data visualization and collaboration solution for use in homeland security, emergency preparedness, situational awareness, and event management. The solution provides a nationwide common operating picture for all levels of government through a web-based map interface. The tool was designed to be used by non-geospatial experts and is easily tailored to the specific needs of its users. Consisting of standard COTS and open source databases and a web server, it lets users view, edit, share, and highlight information easily and quickly through a standard internet browser.

  13. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  14. Ontology Based Quality Evaluation for Spatial Data

    NASA Astrophysics Data System (ADS)

    Yılmaz, C.; Cömert, Ç.

    2015-08-01

    Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that these will be replaced by semantic web services. The quality of the data provided is important for the decision-making process and for the accuracy of transactions, so the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions, and a methodology is required for data quality evaluation. There are existing resources for data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software, such as 1Spatial's 1Validate and ESRI's Data Reviewer, offers quality evaluation based on its own classification of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we identify the technical components needed to devise and implement a rule-based approach with ontologies using free and open source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open source software. To test data against the rules, sample GeoSPARQL queries are created and associated with the specifications.
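
    A rule of the kind described, say that every building footprint must lie within a parcel, can be expressed as a GeoSPARQL query that reports violations. The sketch below assumes a hypothetical GeoSPARQL-enabled endpoint and an invented application ontology; only the query pattern itself is the point:

        from SPARQLWrapper import SPARQLWrapper, JSON  # pip install SPARQLWrapper

        # Hypothetical GeoSPARQL-enabled endpoint holding the dataset under test.
        sparql = SPARQLWrapper("https://example.org/nsdi/sparql")

        # Rule: every building footprint must lie within some parcel polygon;
        # buildings violating the rule are reported as consistency errors.
        sparql.setQuery("""
        PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
        PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
        PREFIX ex:   <http://example.org/nsdi#>

        SELECT ?building WHERE {
          ?building a ex:Building ; geo:hasGeometry/geo:asWKT ?bWkt .
          FILTER NOT EXISTS {
            ?parcel a ex:Parcel ; geo:hasGeometry/geo:asWKT ?pWkt .
            FILTER(geof:sfWithin(?bWkt, ?pWkt))
          }
        }
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print("consistency violation:", row["building"]["value"])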

  15. National hydrography dataset--linear referencing

    USGS Publications Warehouse

    Simley, Jeffrey; Doumbouya, Ariel

    2012-01-01

    Geospatial data normally have a certain set of standard attributes, such as an identification number, the type of feature, and the name of the feature. These standard attributes are typically embedded into the default attribute table, which is directly linked to the geospatial features. However, it is impractical to embed too much information because doing so can create a complex, inflexible, and hard-to-maintain geospatial dataset. Many scientists prefer a modular, or relational, data design in which the information about the features is stored and maintained separately, then linked to the geospatial data. For example, information about the water chemistry of a lake can be maintained in a separate file and linked to the lake. A Geographic Information System (GIS) can then relate the water chemistry to the lake and analyze it as one piece of information. For example, the GIS can select all lakes larger than 50 acres with turbidity greater than 1.5 milligrams per liter.
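
    The lake example translates directly into a join between the feature table and the separately maintained water-chemistry table. A self-contained sketch with an invented schema and invented values:

        import sqlite3

        # Hypothetical schema illustrating the relational design: lake features
        # and water chemistry live in separate tables joined on an identifier.
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
          CREATE TABLE lakes (permanent_id TEXT PRIMARY KEY, name TEXT, area_acres REAL);
          CREATE TABLE water_chemistry (permanent_id TEXT, turbidity_mg_l REAL);
          INSERT INTO lakes VALUES ('L1', 'Clear Lake', 120.0), ('L2', 'Pond A', 12.5);
          INSERT INTO water_chemistry VALUES ('L1', 2.1), ('L2', 0.4);
        """)

        # The selection described above: lakes larger than 50 acres with
        # turbidity greater than 1.5 milligrams per liter.
        rows = conn.execute("""
          SELECT l.name, l.area_acres, w.turbidity_mg_l
          FROM lakes l
          JOIN water_chemistry w ON w.permanent_id = l.permanent_id
          WHERE l.area_acres > 50 AND w.turbidity_mg_l > 1.5
        """).fetchall()
        print(rows)  # [('Clear Lake', 120.0, 2.1)]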

  16. Transformation of HDF-EOS metadata from the ECS model to ISO 19115-based XML

    NASA Astrophysics Data System (ADS)

    Wei, Yaxing; Di, Liping; Zhao, Baohua; Liao, Guangxuan; Chen, Aijun

    2007-02-01

    Nowadays, geographic data, such as NASA's Earth Observation System (EOS) data, are playing an increasing role in many areas, including academic research, government decisions and even people's everyday lives. As the quantity of geographic data becomes increasingly large, a major problem is how to make full use of such data in a distributed, heterogeneous network environment. In order for a user to effectively discover and retrieve the specific information that is useful, geographic metadata should be described and managed properly. Fortunately, the emergence of XML and Web Services technologies greatly promotes information distribution across the Internet. The research effort discussed in this paper presents a method and its implementation for transforming Hierarchical Data Format (HDF)-EOS metadata from the NASA ECS model to ISO 19115-based XML, which will be managed by the Open Geospatial Consortium (OGC) Catalogue Services - Web Profile (CSW). Using XML and international standards rather than domain-specific models to describe the metadata of HDF-EOS data, and further using CSW to manage the metadata, allows metadata to be searched and interchanged more widely and easily, thus promoting the sharing of HDF-EOS data.
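
    The transformation itself can be approached with ordinary XML tooling. The following drastically simplified sketch builds a fragment of ISO 19139 XML from a few ECS-style attributes; the attribute names and values are invented, and a schema-valid ISO 19115 record needs many more elements than shown here:

        import xml.etree.ElementTree as ET

        GMD = "http://www.isotc211.org/2005/gmd"
        GCO = "http://www.isotc211.org/2005/gco"
        ET.register_namespace("gmd", GMD)
        ET.register_namespace("gco", GCO)

        # A few ECS-style core attributes pulled from an HDF-EOS granule
        # (names and values invented for illustration).
        ecs = {"ShortName": "MOD021KM", "GranuleID": "MOD021KM.A2006001.0000.hdf",
               "RangeBeginningDate": "2006-01-01"}

        def char_string(parent, tag, text):
            """Wrap a value in the gco:CharacterString pattern used by ISO 19139."""
            elem = ET.SubElement(parent, f"{{{GMD}}}{tag}")
            ET.SubElement(elem, f"{{{GCO}}}CharacterString").text = text
            return elem

        md = ET.Element(f"{{{GMD}}}MD_Metadata")
        char_string(md, "fileIdentifier", ecs["GranuleID"])
        ident = ET.SubElement(md, f"{{{GMD}}}identificationInfo")
        data_ident = ET.SubElement(ident, f"{{{GMD}}}MD_DataIdentification")
        char_string(data_ident, "abstract", f"HDF-EOS granule of {ecs['ShortName']} "
                                            f"beginning {ecs['RangeBeginningDate']}")

        ET.ElementTree(md).write("granule_iso19139.xml", xml_declaration=True,
                                 encoding="utf-8")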

  17. Geospatial Information System Capability Maturity Models

    DOT National Transportation Integrated Search

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  18. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  19. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  20. LDRD final report:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brost, Randolph C.; McLendon, William Clarence,

    2013-01-01

    Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
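
    The simultaneous spatial and temporal search the report describes can be pictured with a toy graph in which each node carries a geometry and an observation time. This sketch is not the report's graph structure, only an illustration of the query pattern:

        import datetime as dt

        import networkx as nx                     # pip install networkx
        from shapely.geometry import Point, box   # pip install shapely

        # Tiny invented graph: nodes are detected features with a geometry and
        # an observation time; edges encode an "adjacent-to" relationship.
        g = nx.Graph()
        g.add_node("building_1", geom=Point(-106.6, 35.1), time=dt.datetime(2012, 5, 1))
        g.add_node("road_7",     geom=Point(-106.5, 35.2), time=dt.datetime(2012, 6, 3))
        g.add_edge("building_1", "road_7", relation="adjacent-to")

        def spatiotemporal_query(graph, bbox, start, end):
            """Nodes whose geometry falls in bbox and whose time is in [start, end]."""
            region = box(*bbox)
            return [n for n, a in graph.nodes(data=True)
                    if region.contains(a["geom"]) and start <= a["time"] <= end]

        hits = spatiotemporal_query(g, (-107.0, 35.0, -106.55, 35.5),
                                    dt.datetime(2012, 4, 1), dt.datetime(2012, 5, 31))
        print(hits)  # ['building_1']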

  1. Strategic Model for Future Geospatial Education.

    DTIC Science & Technology

    1998-05-18

    There appears to be only one benefit to doing nothing as option one dictates-there are no up front costs to the government for doing nothing. The costs...the government can ensure that US industry and academia benefit from decades of geospatial information expertise. Industry and academia will be...or militarily unique topics. In summary, option two provides more benefits for both the government and the geospatial information community as a

  2. Benefits of using Open Geo-spatial Data for valorization of Cultural Heritage: GeoPan app

    NASA Astrophysics Data System (ADS)

    Cuca, Branka; Previtali, Mattia; Barazzetti, Luigi; Brumana, Raffaella

    2017-04-01

    Experts consider spatial data to be one of the categories of Public Sector Information (PSI) whose exchange is particularly important. At the same time, an initiative with a great vision such as the Digital Agenda for Europe emphasizes intelligent processing of information as an essential factor in tackling the challenges of contemporary society. In this context, Open Data are considered crucial in addressing environmental pressures, energy efficiency, land use and climate change, pollution and traffic management. Furthermore, Open Data are thought to have an important impact on more informed decision making and policy creation in multiple domains, which could be addressed even through the "apps" on our smart devices. Activities performed in the ENERGIC OD project - "European NEtwork for Redistributing Geospatial Information to user Communities - Open Data" - have led to some first conclusions on the use and re-use of geo-spatial Open Data by means of Virtual Hubs, an innovative method for brokering geo-spatial information. This paper illustrates some of the main benefits of using open geo-spatial data for the valorisation of Cultural Heritage through the case of an innovative app called "GeoPan Atl@s". GeoPan, set in the dynamic policy context described, aims to bring together in a common platform all information valuable for sustainable territorial development, in particular material concerning the history and changes of the cultural landscapes of the Lombardy region. Furthermore, this innovative app is used as a test-bed to facilitate and encourage a more active exchange and exploitation of open geo-spatial information for the valorisation of cultural heritage and landscapes. The aim of this practice is also to achieve more active participation of experts, VGI communities and citizens, and a higher awareness of the many possible uses of historic and contemporary geo-spatial information for smarter decision making.

  3. 78 FR 69393 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-19

    .... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...

  4. Automating Mapping Production for the Enterprise: from Contract to Delivery

    NASA Astrophysics Data System (ADS)

    Uebbing, R.; Xie, C.; Beshah, B.; Welter, J.

    2012-07-01

    The ever increasing volume and quality of geospatial data has created new challenges for mapping companies. Due to increased image resolution, fusion of different data sources and more frequent data update requirements, mapping production is forced to streamline the work flow to meet client deadlines. But the data volume alone is not the only barrier for an efficient production work flow. Processing geospatial information traditionally uses domain and vendor specific applications that do not interface with each other, often leading to data duplication and therefore creating sources for error. Also, it creates isolation between different departments within a mapping company resulting in additional communication barriers. North West Geomatics has designed and implemented a data centric enterprise solution for the flight acquisition and production work flow to combat the above challenges. A central data repository containing not only geospatial data in the strictest sense such as images, vector layers and 3D point clouds, but also other information such as product specifications, client requirements, flight acquisition data, production resource usage and much more has been deployed at the company. As there is only one instance of the database shared throughout the whole organization it allows all employees, given they have been granted the appropriate permission, to view the current status of any project with a graphical and table based interface through its life cycle from sales, through flight acquisition, production and product delivery. Not only can users track progress and status of various work flow steps, but the system also allows users and applications to actively schedule or start specific production steps such as data ingestion and triangulation with many other steps (orthorectification, mosaicing, accounting, etc.) in the planning stages. While the complete system is exposed to the users through a web interface and therefore allowing outside customers to also view their data, much of the design and development was focused on work flow automation, scalability and security. Ideally, users will interact with the system to retrieve a specific project status and summaries while the work flow processes are triggered automatically by modeling their dependencies. The enterprise system is built using open source technologies (PostGIS, Hibernate, OpenLayers, GWT and others) and adheres to OGC web services for data delivery (WMS/WFS/WCS) to third party applications.

  5. An Electronic Tree Inventory for Arboriculture Management

    NASA Astrophysics Data System (ADS)

    Tait, Roger J.; Allen, Tony J.; Sherkat, Nasser; Bellett-Travers, Marcus D.

    The integration of Global Positioning System (GPS) technology into mobile devices provides them with an awareness of their physical location. This geospatial context can be employed in a wide range of applications including locating nearby places of interest as well as guiding emergency services to incidents. In this research, a GPS-enabled Personal Digital Assistant (PDA) is used to create a computerised tree inventory for the management of arboriculture. Using the General Packet Radio Service (GPRS), GPS information and arboreal image data are sent to a web-server. An office-based PC running customised Geographical Information Software (GIS) then automatically retrieves the GPS tagged image data for display and analysis purposes. The resulting application allows an expert user to view the condition of individual trees in greater detail than is possible using remotely sensed imagery.

  6. Geospatial Technology Strategic Plan 1997-2000

    USGS Publications Warehouse

    D'Erchia, Frank; D'Erchia, Terry D.; Getter, James; McNiff, Marcia; Root, Ralph; Stitt, Susan; White, Barbara

    1997-01-01

    Executive Summary -- Geospatial technology applications have been identified in many U.S. Geological Survey Biological Resources Division (BRD) proposals for grants awarded through internal and partnership programs. Because geospatial data and tools have become more sophisticated, accessible, and easy to use, BRD scientists frequently are using these tools and capabilities to enhance a broad spectrum of research activities. Bruce Babbitt, Secretary of the Interior, has acknowledged--and lauded--the important role of geospatial technology in natural resources management. In his keynote address to more than 5,500 people representing 87 countries at the Environmental Systems Research Institute Annual Conference (May 21, 1996), Secretary Babbitt stated, '. . .GIS [geographic information systems], if properly used, can provide a lot more than sets of data. Used effectively, it can help stakeholders to bring consensus out of conflict. And it can, by providing information, empower the participants to find new solutions to their problems.' This Geospatial Technology Strategic Plan addresses the use and application of geographic information systems, remote sensing, satellite positioning systems, image processing, and telemetry; describes methods of meeting national plans relating to geospatial data development, management, and serving; and provides guidance for sharing expertise and information. Goals are identified along with guidelines that focus on data sharing, training, and technology transfer. To measure success, critical performance indicators are included. The ability of the BRD to use and apply geospatial technology across all disciplines will greatly depend upon its success in transferring the technology to field biologists and researchers. The Geospatial Technology Strategic Planning Development Team coordinated and produced this document in the spirit of this premise. Individual Center and Program managers have the responsibility to implement the Strategic Plan by working within the policy and guidelines stated herein.

  7. LiDAR Vegetation Investigation and Signature Analysis System (LVISA)

    NASA Astrophysics Data System (ADS)

    Höfle, Bernhard; Koenig, Kristina; Griesbaum, Luisa; Kiefer, Andreas; Hämmerle, Martin; Eitel, Jan; Koma, Zsófia

    2015-04-01

    Our physical environment undergoes constant changes in space and time with strongly varying triggers, frequencies, and magnitudes. Monitoring these environmental changes is crucial to improve our scientific understanding of complex human-environmental interactions and helps us to respond to environmental change by adaptation or mitigation. The three-dimensional (3D) description of the Earth surface features and the detailed monitoring of surface processes using 3D spatial data have gained increasing attention within the last decades, such as in climate change research (e.g., glacier retreat), carbon sequestration (e.g., forest biomass monitoring), precision agriculture and natural hazard management. In all those areas, 3D data have helped to improve our process understanding by allowing us to quantify the structural properties of earth surface features and their changes over time. This advancement has been fostered by technological developments and increased availability of 3D sensing systems. In particular, LiDAR (light detection and ranging) technology, also referred to as laser scanning, has made significant progress and has evolved into an operational tool in environmental research and geosciences. The main result of LiDAR measurements is a highly spatially resolved 3D point cloud. Each point within the LiDAR point cloud has an XYZ coordinate associated with it and often additional information such as the strength of the returned backscatter. The point cloud provided by LiDAR contains rich geospatial, structural, and potentially biochemical information about the surveyed objects. To deal with the inherently unorganized datasets and the large data volume (frequently millions of XYZ coordinates) of LiDAR datasets, a multitude of algorithms for automatic 3D object detection (e.g., of single trees) and physical surface description (e.g., biomass) have been developed. However, so far the exchange of datasets and approaches (i.e., extraction algorithms) among LiDAR users lags behind. We propose a novel concept, the LiDAR Vegetation Investigation and Signature Analysis System (LVISA), which shall enhance sharing of i) reference datasets of single vegetation objects with rich reference data (e.g., plant species, basic plant morphometric information) and ii) approaches for information extraction (e.g., single tree detection, tree species classification based on waveform LiDAR features). We will build an extensive LiDAR data repository for supporting the development and benchmarking of LiDAR-based object information extraction. The LiDAR Vegetation Investigation and Signature Analysis System (LVISA) uses international web service standards (Open Geospatial Consortium, OGC) for geospatial data access and also analysis (e.g., OGC Web Processing Services). This will allow the research community to identify plant-object-specific vegetation features from LiDAR data, while accounting for differences in LiDAR systems (e.g., beam divergence), settings (e.g., point spacing), and calibration techniques. It is the goal of LVISA to develop generic 3D information extraction approaches, which can be seamlessly transferred to other datasets, timestamps and also extraction tasks. The current prototype of LVISA can be visited and tested online via http://uni-heidelberg.de/lvisa. Video tutorials provide a quick overview and entry into the functionality of LVISA.
We will present the current advances of LVISA and we will highlight future research and extensions of LVISA, such as integrating low-cost LiDAR data and datasets acquired by scanning vegetation at high temporal frequency (e.g., continuous measurements). Everybody is invited to join the LVISA development and share datasets and analysis approaches in an interoperable way via the web-based LVISA geoportal.
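
    To give a flavour of the per-object structural features such a repository is meant to share, the sketch below derives a crude canopy-height value per grid cell from a toy XYZ point cloud. It is an illustration only, not an LVISA algorithm:

        import numpy as np

        # Toy point cloud: columns are X, Y, Z (metres). Real LVISA inputs are
        # millions of points with extra attributes such as return intensity.
        points = np.array([
            [0.2, 0.3, 101.0], [0.7, 0.4, 112.5], [1.3, 0.2, 100.8],
            [1.6, 0.9, 118.9], [0.4, 1.5, 101.2], [1.1, 1.8, 121.3],
        ])

        cell_size = 1.0
        cols = np.floor(points[:, 0] / cell_size).astype(int)
        rows = np.floor(points[:, 1] / cell_size).astype(int)

        # Crude per-cell canopy height: highest return minus lowest return.
        heights = {}
        for r, c, z in zip(rows, cols, points[:, 2]):
            zmin, zmax = heights.get((r, c), (z, z))
            heights[(r, c)] = (min(zmin, z), max(zmax, z))

        for (r, c), (zmin, zmax) in sorted(heights.items()):
            print(f"cell ({r},{c}): canopy height ~ {zmax - zmin:.1f} m")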

  8. Center of Excellence for Geospatial Information Science research plan 2013-18

    USGS Publications Warehouse

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  9. Urban networks among Chinese cities along "the Belt and Road": A case of web search activity in cyberspace

    PubMed Central

    Du, Hongru; Zhao, Yannan; Wu, Rongwei; Zhang, Xiaolei

    2017-01-01

    “The Belt and Road” initiative has been expected to facilitate interactions among numerous city centers. This initiative would generate a number of centers, both economic and political, which would facilitate greater interaction. To explore how information flows are merged and the specific opportunities that may be offered, Chinese cities along “the Belt and Road” are selected for a case study. Furthermore, urban networks in cyberspace have been characterized by their infrastructure orientation, which implies that there is a relative dearth of studies focusing on the investigation of urban hierarchies by capturing information flows between Chinese cities along “the Belt and Road”. This paper employs Baidu, the main web search engine in China, to examine urban hierarchies. The results show that urban networks become more balanced, shifting from a polycentric to a homogenized pattern. Furthermore, cities in networks tend to have both a hierarchical system and a spatial concentration primarily in regions such as Beijing-Tianjin-Hebei, Yangtze River Delta and the Pearl River Delta region. Urban hierarchy based on web search activity does not follow the existing hierarchical system based on geospatial and economic development in all cases. Moreover, urban networks, under the framework of “the Belt and Road”, show several significant corridors and more opportunities for more cities, particularly western cities. Furthermore, factors that may influence web search activity are explored. The results show that web search activity is significantly influenced by the economic gap, geographical proximity and administrative rank of the city. PMID:29200421

  10. Progress connecting multi-disciplinary geoscience communities through the VIVO semantic web application

    NASA Astrophysics Data System (ADS)

    Gross, M. B.; Mayernik, M. S.; Rowan, L. R.; Khan, H.; Boler, F. M.; Maull, K. E.; Stott, D.; Williams, S.; Corson-Rikert, J.; Johns, E. M.; Daniels, M. D.; Krafft, D. B.

    2015-12-01

    UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, an EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to address connectivity gaps across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page will show, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. In addition to VIVO's default display, the new database can also be queried using SPARQL, a query language for semantic data. EarthCollab will also extend the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and position information. Additional extensions, including enhanced geospatial capabilities, will be developed following task-centered usability testing.
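
    Once people, datasets, and publications are in the VIVO triple store, connections can be retrieved with SPARQL. The sketch below queries a hypothetical VIVO endpoint; the property path is a simplified stand-in for the VIVO-ISF authorship pattern rather than the exact ontology terms used by the project:

        from SPARQLWrapper import SPARQLWrapper, JSON  # pip install SPARQLWrapper

        # Hypothetical VIVO SPARQL endpoint; the authorship pattern below is a
        # simplified stand-in for the full VIVO-ISF ontology terms.
        sparql = SPARQLWrapper("https://example.org/vivo/sparql")
        sparql.setQuery("""
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        PREFIX foaf: <http://xmlns.com/foaf/0.1/>
        PREFIX vivo: <http://vivoweb.org/ontology/core#>

        SELECT ?datasetLabel ?personLabel WHERE {
          ?dataset a vivo:Dataset ;
                   rdfs:label ?datasetLabel ;
                   vivo:relatedBy ?authorship .
          ?authorship vivo:relates ?person .
          ?person a foaf:Person ; rdfs:label ?personLabel .
        }
        LIMIT 10
        """)
        sparql.setReturnFormat(JSON)
        for row in sparql.query().convert()["results"]["bindings"]:
            print(row["datasetLabel"]["value"], "<-", row["personLabel"]["value"])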

  11. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance, or trace of data sources and analytical methods used in a scientific analysis, for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery format independent data structure, and delivery of PML, ISO, and FGDC documents to clients requesting those products.

  12. Climatic Data Integration and Analysis - Regional Approaches to Climate Change for Pacific Northwest Agriculture (REACCH PNA)

    NASA Astrophysics Data System (ADS)

    Seamon, E.; Gessler, P. E.; Flathers, E.; Sheneman, L.; Gollberg, G.

    2013-12-01

    The Regional Approaches to Climate Change for Pacific Northwest Agriculture (REACCH PNA) is a five-year USDA/NIFA-funded coordinated agriculture project to examine the sustainability of cereal crop production systems in the Pacific Northwest, in relation to ongoing climate change. As part of this effort, an extensive data management system has been developed to enable researchers, students, and the public to upload, manage, and analyze various data. The REACCH PNA data management team has developed three core systems to encompass cyberinfrastructure and data management needs: 1) the reacchpna.org portal (https://www.reacchpna.org) is the entry point for all public and secure information, with secure access by REACCH PNA members for data analysis, uploading, and informational review; 2) the REACCH PNA Data Repository is a replicated, redundant database server environment that allows for file and database storage and access to all core data; and 3) the REACCH PNA Libraries, which are functional groupings of data for REACCH PNA members and the public, based on their access level. These libraries are accessible through our https://www.reacchpna.org portal. The developed system is structured in a virtual server environment (data, applications, web) that includes a geospatial database/geospatial web server for web mapping services (ArcGIS Server), use of ESRI's Geoportal Server for data discovery and metadata management (under the ISO 19115-2 standard), Thematic Realtime Environmental Distributed Data Services (THREDDS) for data cataloging, and Interactive Python notebook server (IPython) technology for data analysis. REACCH systems are housed and maintained by the Northwest Knowledge Network project (www.northwestknowledge.net), which provides data management services to support research. Initial project data harvesting and meta-tagging efforts have resulted in the interrogation and loading of over 10 terabytes of climate model output, regional entomological data, agricultural and atmospheric information, as well as imagery, publications, videos, and other soft content. In addition, the outlined data management approach has focused on the integration and interconnection of hard data (raw data output) with associated publications, presentations, or other narrative documentation - through metadata lineage associations. This harvest-and-consume data management methodology could additionally be applied to other research team environments that involve large and divergent data.

  13. Business logic for geoprocessing of distributed geodata

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian

    2006-12-01

    This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).

  14. Plug-and-play web-based visualization of mobile air monitoring data

    EPA Science Inventory

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data r...

  15. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
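
    The clipping step can be approximated on the client side with netcdf4-python against an OPeNDAP archive. The sketch below subsets a hypothetical temperature variable to a bounding box; OpenClimateGIS itself accepts arbitrary vector geometries and performs this work server-side:

        import numpy as np
        from netCDF4 import Dataset  # pip install netCDF4

        # Hypothetical OPeNDAP URL and variable names.
        url = "https://example.org/thredds/dodsC/cmip/tas_monthly.nc"
        ds = Dataset(url)

        lat = ds.variables["lat"][:]
        lon = ds.variables["lon"][:]

        # Clip to a simple bounding box standing in for a user-supplied
        # area-of-interest polygon (real requests use arbitrary vector geometry).
        min_lon, min_lat, max_lon, max_lat = -90.5, 41.5, -82.0, 48.5  # Great Lakes
        lat_idx = np.where((lat >= min_lat) & (lat <= max_lat))[0]
        lon_idx = np.where((lon >= min_lon) & (lon <= max_lon))[0]

        tas = ds.variables["tas"][0, lat_idx.min():lat_idx.max() + 1,
                                  lon_idx.min():lon_idx.max() + 1]
        print(tas.shape)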

  16. 78 FR 35606 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-13

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System of Records. SUMMARY: The National Geospatial-Intelligence Agency is altering a system of records in.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...

  17. Turning Interoperability Operational with GST

    NASA Astrophysics Data System (ADS)

    Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha

    2013-04-01

    GST - Geosciences in space and time is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine" and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), cf. http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European Geological Surveys acting in a border region, means in particular provision of IT technology to exchange spatially and maybe additionally temporally indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology, and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe, and most recently EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information like GeoSciML, EarthResourceML, BoreholeML, ResqML for reservoir characterization, earth and reservoir models, and many others featuring geoscience information. Several Web Services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards and how it applies and extends services towards interoperability in the Earth sciences.

  18. 75 FR 43497 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency (NGA), DoD. ACTION: Notice to add a system of records. SUMMARY: The National Geospatial-Intelligence Agency (NGA) proposes to add a system of...-3808. SUPPLEMENTARY INFORMATION: The National Geospatial-Intelligence Agency notices for systems of...

  19. Assessing the socioeconomic impact and value of open geospatial information

    USGS Publications Warehouse

    Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan

    2016-03-10

    The workshop included 68 participants coming from international organizations, the U.S. public and private sectors, nongovernmental organizations, and academia. Participants included policy makers and analysts, financial analysts, economists, information scientists, geospatial practitioners, and other discipline experts.

  20. Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program

    ERIC Educational Resources Information Center

    Jeffrey, Scott; Alvarez, Jaime

    2010-01-01

    The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…

  1. Interactive Visualization of Near Real-Time and Production Global Precipitation Mission Data Online Using CesiumJS

    NASA Astrophysics Data System (ADS)

    Lammers, M.

    2016-12-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.
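
    A minimal sketch of the kind of HDF5-to-JSON conversion STORM VG performs is shown below; the file, group and dataset names are assumptions modelled loosely on GPM gridded products, and the coarse subsampling is purely to keep the example output small.

      # Sketch: converting gridded precipitation values from an HDF5 file to JSON
      # for a browser-based globe client. Group/dataset names are assumptions.
      import json
      import h5py

      with h5py.File("3B-HHR.GPM.IMERG.example.HDF5", "r") as f:   # hypothetical file
          precip = f["Grid/precipitationCal"][0, :, :]             # (lon, lat) slice
          lons = f["Grid/lon"][:]
          lats = f["Grid/lat"][:]

      points = []
      for i in range(0, precip.shape[0], 10):        # subsample to keep the JSON small
          for j in range(0, precip.shape[1], 10):
              value = float(precip[i, j])
              if value > 0:                          # skip missing values and zero rain
                  points.append({"lon": float(lons[i]), "lat": float(lats[j]), "rate": value})

      with open("precip_points.json", "w") as out:
          json.dump(points, out)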

  2. Interactive Visualization of Near Real Time and Production Global Precipitation Measurement (GPM) Mission Data Online Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matthew

    2016-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.

  3. The Federal Geospatial Platform a shared infrastructure for publishing, discovering and exploiting public data and spatial applications.

    NASA Astrophysics Data System (ADS)

    Dabolt, T. O.

    2016-12-01

    The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users, as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice, on topics ranging from the 16 National Geospatial Data Asset themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others', or check it out on your own at https://www.geoplatform.gov/

  4. Geospatial considerations for a multiorganizational, landscape-scale program

    USGS Publications Warehouse

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  5. Spatial Information in local society's cultural conservation and research

    NASA Astrophysics Data System (ADS)

    Jang, J.-J.; Liao, H.-M.; Fan, I.-C.

    2015-09-01

    The Center for Geographic Information Science, Research Center for Humanities and Social Sciences, Academia Sinica (GIS center) coordinates short-, medium-, and long-term multidisciplinary research on related topics in the sciences and humanities. Based on the requirements of multidisciplinary research applications, it sustains the collection and construction of unified spatial base data and knowledge and the building of a spatial data infrastructure. Since the 1990s the GIS center has built two geographic information platforms: "Chinese Civilization in Time and Space" (CCTS) and "Taiwan History and Culture in Time and Space" (THCTS). The goal of both systems is to construct an integrated GIS-based application infrastructure covering the spatial extent of China and Taiwan, the timeframe of Chinese and Taiwanese history, and the contents of Chinese and Taiwanese civilization. Building on THCTS, we began to develop the Cultural Resources GIS (CRGIS, http://crgis.rchss.sinica.edu.tw) in 2006 to collect temples, historic monuments, historic buildings, old trees, wind lion gods and other cultural resources in Taiwan, and to provide a platform on which volunteers can add, edit, organize and query all types of tangible and intangible cultural resources via a Content Management System (CMS). CRGIS has aggregated about 13,000 temples and 4,900 churches. On this basis, a variety of maps of religious belief have been drawn, including temple distributions for multiple periods, distributions by principal deity, and church distributions, for example Mazu maps and temple distribution maps for several periods (before 1823, 1823-1895, 1895-1949, 1949-2015) around the Taijiang inner sea area in Tainan. In Taiwan there is a folk religious ritual, lasting from one day to several days, that passes through a specific geospatial range and through particular temples or houses. This important folk activity, somewhat similar to a Western parade, is called "raojing"; its main spirit is that the people within the range passed through receive blessing, and many scholars' and academic experts' folklore research depends on such spatial information. In 2012 the GIS center applied WebGIS and GPS, in cooperation with multiple units, to gather the spatial information of raojing activities, covering seven processions over 22 days that passed through 442 temples. The resulting atlas, "Atlas of the 2012 Religious Processions in the Tainan Region", was published in 2014. We have also taken national cultural resources data from the relevant government authorities and, through metadata design and data processing (geocoding), established cultural geospatial and thematic information, such as 800 monuments, 1,100 historic buildings and 4,300 old trees. In recent years, based on CRGIS technology and operational concepts, experts from different domains and local culture-history researchers and teams have cooperated with us to establish local or thematic cultural resource material.
    In collaboration with local culture-history researchers in Kinmen County in 2012, we documented a Kinmen intangible cultural asset, the Wind Lion God, designing metadata and building attribute data and maps for 122 wind lion gods through field survey; it is worth mentioning that this fieldwork is more complete than the official registration data from Kinmen National Park, recording about 40 more wind lion gods. In 2013 we cooperated with academic experts to establish attribute data and a map of theatres in Taiwan during the Japanese colonial era, 170 theatres in total; and we cooperated with a Japanese scholar, using his 44 detailed field survey records of sugar refineries in Taiwan during the Japanese colonial era, to produce a distribution map of sugar refineries and extend it into a thematic website (http://map.net.tw/), "The Cultural Heritage Maps of Taiwan Sugar Factories in a Century", following the CRGIS concept. The significance of deploying and operating CRGIS lies not only in building thematic GIS systems but also in the concepts it embodies: open data, Wikipedia-style collaboration and public participation, together with an interactive platform combining cultural resource data and geospatial technology. Beyond providing reference material for local cultural education and local cultural recognition, we further cooperate with scholars, academic experts and local culture-history researchers and teams to compile and add value to the rich records and research results accumulated over the years, through spatial database planning, data processing (e.g. geocoding), field survey and the overlaying of geospatial materials.
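
    A minimal sketch of the geocoding step mentioned above (turning textual locations of cultural resources into coordinates) is given below using the geopy library and the public Nominatim service; the record names and addresses are invented for illustration.

      # Sketch of a geocoding step: resolving textual addresses of cultural
      # resources to coordinates. The records below are illustrative only.
      from geopy.geocoders import Nominatim

      geolocator = Nominatim(user_agent="crgis-geocoding-sketch")   # hypothetical agent name

      records = [
          {"name": "Temple A", "address": "Tainan, Taiwan"},
          {"name": "Historic building B", "address": "Kinmen County, Taiwan"},
      ]

      for rec in records:
          location = geolocator.geocode(rec["address"])    # may return None if not found
          if location is not None:
              rec["lat"], rec["lon"] = location.latitude, location.longitude
          print(rec)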

  6. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) also makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step further, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities combined with all the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) or motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in the ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models at user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
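
    To make the KML/Collada placement described above concrete, here is a small Python sketch that writes a KML Placemark positioning a Collada (.dae) model at a given longitude and latitude; the model path and coordinates are placeholders, and this illustrates the standard KML Model element rather than the authors' prototype.

      # Sketch: generating a KML Placemark that positions a Collada model on the
      # terrain. The model path and coordinates are placeholders.
      KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <Placemark>
          <name>{name}</name>
          <Model>
            <altitudeMode>clampToGround</altitudeMode>
            <Location>
              <longitude>{lon}</longitude>
              <latitude>{lat}</latitude>
              <altitude>0</altitude>
            </Location>
            <Orientation><heading>{heading}</heading><tilt>0</tilt><roll>0</roll></Orientation>
            <Scale><x>1</x><y>1</y><z>1</z></Scale>
            <Link><href>{dae_path}</href></Link>
          </Model>
        </Placemark>
      </kml>"""

      def write_model_kml(name, lon, lat, heading, dae_path, out_path):
          """Write a minimal KML file that places a Collada (.dae) model at lon/lat."""
          with open(out_path, "w") as f:
              f.write(KML_TEMPLATE.format(name=name, lon=lon, lat=lat,
                                          heading=heading, dae_path=dae_path))

      write_model_kml("animated_animal", 23.05, 40.63, 90.0, "models/animal.dae", "scene.kml")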

  7. Real-Time Geospatial Data Viewer (RETIGO)

    EPA Science Inventory

    This is a web-based method that allows users to upload their air monitoring data and explore the data on a graphical interface. The method is optimized for mobile monitoring data sets, showing the data on a map, on a time series, and referenced to a hypothesized line and/or poi...

  8. Development of Distributed Research Center for analysis of regional climatic and environmental changes

    NASA Astrophysics Data System (ADS)

    Gordov, E.; Shiklomanov, A.; Okladnikov, I.; Prusevich, A.; Titov, A.

    2016-11-01

    We present an approach and first results of a collaborative project being carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, UNH, USA. Its main objective is the development of a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes in the northern extratropical areas. The DRC should provide specialists working in climate-related sciences and decision-makers with accurate and detailed climatic characteristics for a selected area, and with reliable and affordable tools for their in-depth statistical analysis and for studies of the effects of climate change. Within the framework of the project, new approaches to cloud processing and analysis of large geospatial datasets (big geospatial data) inherent to climate change studies are developed and deployed on the technical platforms of both institutions. We discuss here the state of the art in this domain, describe the web-based information-computational systems developed by the partners, justify the methods chosen to reach the project goal, and briefly list the results obtained so far.

  9. Operational Remote Sensing Services in North Eastern Region of India for Natural Resources Management, Early Warning for Disaster Risk Reduction and Dissemination of Information and Services

    NASA Astrophysics Data System (ADS)

    Raju, P. L. N.; Sarma, K. K.; Barman, D.; Handique, B. K.; Chutia, D.; Kundu, S. S.; Das, R. Kr.; Chakraborty, K.; Das, R.; Goswami, J.; Das, P.; Devi, H. S.; Nongkynrih, J. M.; Bhusan, K.; Singh, M. S.; Singh, P. S.; Saikhom, V.; Goswami, C.; Pebam, R.; Borgohain, A.; Gogoi, R. B.; Singh, N. R.; Bharali, A.; Sarma, D.; Lyngdoh, R. B.; Mandal, P. P.; Chabukdhara, M.

    2016-06-01

    The North Eastern Region (NER) of India, comprising eight states, is considered one of the most unique and challenging regions to govern due to its physiographic conditions, rich biodiversity, proneness to disasters and diverse socio-economic characteristics. Operational remote sensing services in the region have increased manifold with the establishment of the North Eastern Space Applications Centre (NESAC) in the year 2000. Since its inception, NESAC has been providing remote sensing services for generating inventories, planning and developmental activities, management of natural resources and disasters, and dissemination of information and services through geo-web services for the NER. The operational remote sensing services provided by NESAC can be broadly divided into three categories, viz. natural resource planning and developmental services, disaster risk reduction and early warning services, and information dissemination through geo-portal services. As part of the natural resources planning and developmental services, NESAC supports the state forest departments in preparing forest working plans by providing geospatial inputs covering the entire NER, identifying suitable culturable wastelands for cultivation of silkworm food plants, and mapping natural resources such as land use/land cover, wastelands and land degradation on a temporal basis. In the area of disaster risk reduction, NESAC has initiated operational services for early warning and post-disaster assessment, including a flood early warning system (FLEWS) using satellite remote sensing, numerical weather prediction and hydrological modeling; a forest fire alert system with actionable attribute information; a Japanese Encephalitis Early Warning System (JEWS) based on mosquito vector abundance, pig population and historical disease intensity; and agricultural drought monitoring for the region. The large volumes of geospatial databases generated as part of these operational services are made available to administrators and local government bodies for better management, prospective planning and sustainable use of available resources. Knowledge dissemination is done through online web portals where internet access is available, and through offline space-based information kiosks where internet access is unavailable or bandwidth is limited. This paper presents a systematic and comprehensive study of the remote sensing services operational in the NER of India for natural resources management, disaster risk reduction and dissemination of information and services, in addition to outlining future areas and directions of space applications for the region.

  10. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.
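
    As a generic illustration of propagating input uncertainty through a model in such a workflow (not an UncertWeb component), the sketch below draws Monte Carlo samples from probabilistically described inputs, pushes them through a toy model, and summarises the uncertain output; the model and the distribution parameters are invented for illustration.

      # Generic Monte Carlo propagation of input uncertainty through a toy model,
      # illustrating the kind of uncertainty management UncertWeb targets.
      # The model and the distribution parameters are purely illustrative.
      import numpy as np

      rng = np.random.default_rng(42)
      N = 10_000

      # Uncertain inputs described probabilistically (cf. UncertML's probabilistic approach):
      temperature = rng.normal(loc=15.0, scale=1.5, size=N)     # degrees C
      rainfall = rng.lognormal(mean=1.0, sigma=0.4, size=N)     # mm/day

      def toy_runoff_model(temp, rain):
          """A stand-in for a model that would be exposed as a web service in a workflow."""
          return 0.8 * rain - 0.05 * temp + 2.0

      runoff = toy_runoff_model(temperature, rainfall)

      # Summarise the uncertain output for downstream services or visualisation.
      print("mean runoff:", runoff.mean())
      print("95% interval:", np.percentile(runoff, [2.5, 97.5]))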

  11. The Sky's the Limit: Integrating Geospatial Tools with Pre-College Youth Education

    ERIC Educational Resources Information Center

    McGee, John; Kirwan, Jeff

    2010-01-01

    Geospatial tools, which include global positioning systems (GPS), geographic information systems (GIS), and remote sensing, are increasingly driving a variety of applications. Local governments and private industry are embracing these tools, and the public is beginning to demand geospatial services. The U.S. Department of Labor (DOL) reported that…

  12. Expansion and Enhancement of the Wyoming Coalbed Methane Clearinghouse Website to the Wyoming Energy Resources Information Clearinghouse.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulme, Diana; Hamerlinck, Jeffrey; Bergman, Harold

    Energy development is expanding across the United States, particularly in western states like Wyoming. Federal and state land management agencies, local governments, industry and non-governmental organizations have realized the need to access spatially-referenced data and other non-spatial information to determine the geographical extent and cumulative impacts of expanding energy development. The Wyoming Energy Resources Information Clearinghouse (WERIC) is a web-based portal which centralizes access to news, data, maps, reports and other information related to the development, management and conservation of Wyoming's diverse energy resources. WERIC was established in 2006 by the University of Wyoming's Ruckelshaus Institute of Environment and Natural Resources (ENR) and the Wyoming Geographic Information Science Center (WyGISC) with funding from the US Department of Energy (DOE) and the US Bureau of Land Management (BLM). The WERIC web portal originated in concept from a more specifically focused website, the Coalbed Methane (CBM) Clearinghouse. The CBM Clearinghouse effort focused only on coalbed methane production within the Powder River Basin of northeast Wyoming. The CBM Clearinghouse demonstrated a need to expand the effort statewide with a comprehensive energy focus, including fossil fuels and renewable and alternative energy resources produced and/or developed in Wyoming. WERIC serves spatial data to the greater Wyoming geospatial community through the Wyoming GeoLibrary, the WyGISC Data Server and the Wyoming Energy Map. These applications are critical components that support the Wyoming Energy Resources Information Clearinghouse (WERIC). The Wyoming GeoLibrary is a tool for searching and browsing a central repository for metadata. It provides the ability to publish and maintain metadata and geospatial data in a distributed environment. The WyGISC Data Server is an internet mapping application that provides traditional GIS mapping and analysis functionality via the web. It is linked into various state and federal agency spatial data servers, allowing users to visualize multiple themes, such as well locations and core sage grouse areas, in one domain. Additionally, this application gives users the ability to download any of the data being displayed within the web map. The Wyoming Energy Map is the newest mapping application developed directly from this effort. With over 100 different layers accessible via this mapping application, it is the most comprehensive Wyoming energy mapping application available. This application also provides the public with the ability to create cultural and wildlife reports based on any location throughout Wyoming and at multiple scales. The WERIC website also allows users to access links to federal, state, and local natural resource agency websites and map servers; research documents about energy; and educational information, including information on upcoming energy-related conferences. The WERIC website has seen significant use by energy industry consultants, land management agencies, state and local decision-makers, non-governmental organizations and the public. Continued service to these sectors is desirable but some challenges remain in keeping the WERIC site viable. The most pressing issue is finding the human and financial resources to keep the site continually updated.
    Initially, the concept included offering users the ability to maintain the site themselves; however, this has proven not to be a viable option since very few people contributed. Without user contributions, the web page relied on already committed university staff to publish and link to the appropriate documents and web pages. An option that is currently being explored to address this issue is the development of a partnership with the University of Wyoming School of Energy Resources (SER). As part of their outreach program, SER may be able to contribute funding for a full-time position dedicated to maintenance of WERIC.

  13. BAID: The Barrow Area Information Database - an Interactive Web Mapping Portal and Cyberinfrastructure for Science and Land Management in the Vicinity of Barrow on the North Slope of Alaska.

    NASA Astrophysics Data System (ADS)

    Escarzaga, S. M.; Cody, R. P.; Gaylord, A. G.; Kassin, A.; Barba, M.; Aiken, Q.; Nelson, L.; Mazza Ramsay, F. D.; Tweedie, C. E.

    2016-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 16,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Recent advances include provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists; analysis and multitemporal mapping of over 120 km of coastline for erosion monitoring; maintenance of a wireless micrometeorological sensor network; links to Barrow area datasets housed at national data archives; and substantial upgrades to the BAID website. Web mapping applications that have launched to the public include: an Imagery Time Viewer that allows users to compare imagery of the Barrow area between 1949 and the present; a Coastal Erosion Viewer that allows users to view long-term (1955-2015) and recent (2013-2015) rates of erosion for the Barrow area; and a Community Planning Tool that allows users to view and print dynamic reports based on an array of basemaps, including a new 0.5 m resolution wetlands map designed to enhance decision making for development and land management.

  14. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
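
    HydroShare's RESTful interface mentioned above can be exercised with any HTTP client; the sketch below lists resources with the Python requests library. The endpoint path and response field names reflect the publicly documented API as best understood here and should be treated as assumptions.

      # Sketch: listing public HydroShare resources through its REST interface.
      # The endpoint path, query parameter and response fields are assumptions
      # based on the publicly documented API.
      import requests

      resp = requests.get(
          "https://www.hydroshare.org/hsapi/resource/",
          params={"subject": "hydrology"},     # hypothetical filter
          timeout=30,
      )
      resp.raise_for_status()

      for res in resp.json().get("results", []):
          print(res.get("resource_title"), "->", res.get("resource_id"))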

  15. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples

    PubMed Central

    2011-01-01

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world. PMID:22188675

  16. A National Crop Progress Monitoring System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.

    2011-12-01

    Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed, making intensive use of data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratios of change relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with NASS' remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To meet operational requirements and increase the accessibility of data and products for different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of the Open Geospatial Consortium (OGC). Sensor observations and data are accessed through the Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) where available. Products are also served through such open-specification-compliant services. For rendering and presentation, the Web Map Service (WMS) is used. A web-service-based system has been set up and deployed at dss.csiss.gmu.edu/NDVIDownload. Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
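
    The vegetation indices named above follow standard definitions; the sketch below computes NDVI and VCI from reflectance arrays with numpy, and includes a mean-referenced variant as one plausible reading of MVCI (the exact MVCI formula used by the project is not given in the abstract, so treat it as an assumption).

      # Sketch of the vegetation indices named above, computed from surface
      # reflectance arrays; band values and shapes are illustrative.
      import numpy as np

      def ndvi(nir, red):
          """Normalized Difference Vegetation Index."""
          return (nir - red) / (nir + red)

      def vci(ndvi_now, ndvi_min, ndvi_max):
          """Vegetation Condition Index: current NDVI scaled by the historical range."""
          return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

      def mvci(ndvi_now, ndvi_multiyear_mean):
          """Mean-referenced index: departure of current NDVI from the multi-year mean (assumed form)."""
          return 100.0 * (ndvi_now - ndvi_multiyear_mean) / ndvi_multiyear_mean

      # Toy reflectance grids standing in for MOD09 red and near-infrared bands.
      red = np.random.uniform(0.02, 0.2, size=(4, 4))
      nir = np.random.uniform(0.2, 0.6, size=(4, 4))
      current = ndvi(nir, red)
      history = np.stack([current * f for f in (0.8, 0.9, 1.1)])   # fake prior years

      print(vci(current, history.min(axis=0), history.max(axis=0)))
      print(mvci(current, history.mean(axis=0)))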

  17. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples.

    PubMed

    Kamel Boulos, Maged N; Resch, Bernd; Crowley, David N; Breslin, John G; Sohn, Gunho; Burtner, Russ; Pike, William A; Jezierski, Eduardo; Chuang, Kuo-Yu Slayer

    2011-12-21

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  18. Enhancing Earth Observation and Modeling for Tsunami Disaster Response and Management

    NASA Astrophysics Data System (ADS)

    Koshimura, Shunichi; Post, Joachim

    2017-04-01

    In the aftermath of catastrophic natural disasters, such as earthquakes and tsunamis, our society has experienced significant difficulties in assessing disaster impact within the limited amount of time available. In recent years, the quality of satellite sensors and access to and use of satellite imagery and services have greatly improved. More and more space agencies have embraced data-sharing policies that facilitate access to archived and up-to-date imagery. Tremendous progress has been achieved through the continuous development of powerful algorithms and software packages to manage and process geospatial data and to disseminate imagery and geospatial datasets in near-real time via geo-web services, which can be used in disaster-risk management and emergency response efforts. Satellite Earth observations now offer consistent coverage and scope to provide a synoptic overview of large areas, repeated regularly. These can be used to compare risk across different countries, day and night, in all weather conditions, and in trans-boundary areas. On the other hand, with the use of modern computing power and advanced sensor networks, great advances in real-time simulation have been achieved. The data and information derived from satellite Earth observations, integrated with in situ information and simulation modeling, provide unique value and the necessary complement to socio-economic data. Emphasis also needs to be placed on ensuring space-based data and information are used in existing and planned national and local disaster risk management systems, together with other data and information sources, as a way to strengthen the resilience of communities. Through case studies of the 2011 Great East Japan earthquake and tsunami disaster, we aim to discuss how earth observations and modeling, in combination with local, in situ data and information sources, can support the decision-making process before, during and after a disaster strikes.

  19. Web catalog of oceanographic data using GeoNetwork

    NASA Astrophysics Data System (ADS)

    Marinova, Veselka; Stefanov, Asen

    2017-04-01

    Most of the data collected, analyzed and used by the Bulgarian Oceanographic Data Center (BgODC) from scientific cruises, Argo floats, ferry boxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. To meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as those from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). GeoNetwork opensource was ultimately chosen, as it is already widespread. This software is a free, effective and "cheap" solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Populating the catalog involves these practical steps: managing and storing data reliably within an MS SQL spatial database; registering maps and data of various formats and sources in GeoServer (a popular open source geospatial server bundled with GeoNetwork); and adding metadata and publishing geospatial data through the GeoNetwork interface. GeoServer and GeoNetwork are based on Java, so they require the installation of a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive owing to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching for data within the catalog is based upon geographic extent, theme type and free-text search.
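
    The GeoServer registration step listed above can be scripted against GeoServer's REST API; the sketch below creates a workspace and uploads a zipped shapefile with the Python requests library. Host, credentials and layer names are placeholders, and the endpoints follow standard GeoServer REST conventions rather than anything specific to BgODC.

      # Sketch: registering a zipped shapefile as a layer in GeoServer via its REST API.
      # Host, credentials and names below are placeholders.
      import requests

      GEOSERVER = "http://localhost:8080/geoserver/rest"
      AUTH = ("admin", "geoserver")          # default credentials; change in production

      # Create a workspace for the layers (returns 409 if it already exists).
      requests.post(
          f"{GEOSERVER}/workspaces",
          data="<workspace><name>bgodc</name></workspace>",
          headers={"Content-Type": "text/xml"},
          auth=AUTH,
      )

      # Upload a zipped shapefile; GeoServer creates the datastore and layer from it.
      with open("cruise_stations.zip", "rb") as f:
          resp = requests.put(
              f"{GEOSERVER}/workspaces/bgodc/datastores/cruise_stations/file.shp",
              data=f,
              headers={"Content-Type": "application/zip"},
              auth=AUTH,
          )
      print(resp.status_code)   # expect 201 Created on success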

  20. Geospatial characteristics of Florida's coastal and offshore environments: Administrative and political boundaries and offshore sand resources

    USGS Publications Warehouse

    Demopoulos, Amanda W.J.; Foster, Ann M.; Jones, Michal L.; Gualtieri, Daniel J.

    2011-01-01

    The Geospatial Characteristics GeoPDF of Florida's Coastal and Offshore Environments is a comprehensive collection of geospatial data describing the political and natural resources of Florida. This interactive map provides spatial information on bathymetry, sand resources, military areas, marine protected areas, cultural resources, locations of submerged cables, and shipping routes. The map should be useful to coastal resource managers and others interested in the administrative and political boundaries of Florida's coastal and offshore region. In particular, as oil and gas explorations continue to expand, the map may be used to explore information regarding sensitive areas and resources in the State of Florida. Users of this geospatial database will find that they have access to synthesized information in a variety of scientific disciplines concerning Florida's coastal zone. This powerful tool provides a one-stop assembly of data that can be tailored to fit the needs of many natural resource managers.

  1. Sharing and interoperation of Digital Dongying geospatial data

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an

    2006-10-01

    The Digital Dongying project was put forward by Dongying city, Shandong province, and approved by the Ministry of Information Industry, the Ministry of Science and Technology and the Ministry of Construction, P.R. China, in 2002. After five years of construction, the informatization level of Dongying has reached an advanced degree. To advance the building of Digital Dongying and realize geospatial data sharing, geographic information sharing standards have been drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed; it is a highly integrated platform of WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of "Digital Dongying" geospatial data have come into practice and good results have been obtained. However, a sound leadership group is necessary for data sharing and interoperation.

  2. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. Rapid plotting, real-time sharing and mapping of surrounding affairs have taken on great significance for government policy makers and diplomatic staff. However, existing boundary information systems suffer mainly from heavy geospatial data update workloads, a serious lack of plotting tools, and difficulty in sharing geographic events; these problems have seriously hampered the smooth progress of border tasks. The development of geographic information system technology, and of Web GIS in particular, offers the possibility of solving these problems. This paper adopts a four-layer B/S architecture and, with the support of the Google Maps service, uses the free API offered by Google Maps, taking advantage of its openness, ease of use, sharing features and high-resolution imagery, to design and implement a surrounding transaction plotting and management system based on the web development technologies ASP.NET, C# and Ajax. The system can provide decision support for government policy makers, as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has shown that the system offers good usability and strong real-time performance.

  3. Metadata squared: enhancing its usability for volunteered geographic information and the GeoWeb

    USGS Publications Warehouse

    Poore, Barbara S.; Wolf, Eric B.; Sui, Daniel Z.; Elwood, Sarah; Goodchild, Michael F.

    2013-01-01

    The Internet has brought many changes to the way geographic information is created and shared. One aspect that has not changed is metadata. Static spatial data quality descriptions were standardized in the mid-1990s and cannot accommodate the current climate of data creation where nonexperts are using mobile phones and other location-based devices on a continuous basis to contribute data to Internet mapping platforms. The usability of standard geospatial metadata is being questioned by academics and neogeographers alike. This chapter analyzes current discussions of metadata to demonstrate how the media shift that is occurring has affected requirements for metadata. Two case studies of metadata use are presented—online sharing of environmental information through a regional spatial data infrastructure in the early 2000s, and new types of metadata that are being used today in OpenStreetMap, a map of the world created entirely by volunteers. Changes in metadata requirements are examined for usability, the ease with which metadata supports coproduction of data by communities of users, how metadata enhances findability, and how the relationship between metadata and data has changed. We argue that traditional metadata associated with spatial data infrastructures is inadequate and suggest several research avenues to make this type of metadata more interactive and effective in the GeoWeb.

  4. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  5. Using Watershed Boundaries to Map Adverse Health Outcomes: Examples From Nebraska, USA

    PubMed Central

    Corley, Brittany; Bartelt-Hunt, Shannon; Rogan, Eleanor; Coulter, Donald; Sparks, John; Baccaglini, Lorena; Howell, Madeline; Liaquat, Sidra; Commack, Rex; Kolok, Alan S

    2018-01-01

    In 2009, a paper was published suggesting that watersheds provide a geospatial platform for establishing linkages between aquatic contaminants, the health of the environment, and human health. This article is a follow-up to that original article. From an environmental perspective, watersheds segregate landscapes into geospatial units that may be relevant to human health outcomes. From an epidemiologic perspective, the watershed concept places anthropogenic health data into a geospatial framework that has environmental relevance. Research discussed in this article includes information gathered from the literature, as well as recent data collected and analyzed by this research group. It is our contention that the use of watersheds to stratify geospatial information may be both environmentally and epidemiologically valuable. PMID:29398918

  6. United States Geological Survey (USGS) Natural Hazards Response

    USGS Publications Warehouse

    Lamb, Rynn M.; Jones, Brenda K.

    2012-01-01

    The primary goal of U.S. Geological Survey (USGS) Natural Hazards Response is to ensure that the disaster response community has access to timely, accurate, and relevant geospatial products, imagery, and services during and after an emergency event. To accomplish this goal, products and services provided by the National Geospatial Program (NGP) and Land Remote Sensing (LRS) Program serve as a geospatial framework for mapping activities of the emergency response community. Post-event imagery and analysis can provide important and timely information about the extent and severity of an event. USGS Natural Hazards Response will also support the coordination of remotely sensed data acquisitions, image distribution, and authoritative geospatial information production as required for use in disaster preparedness, response, and recovery operations.

  7. Data Democracy and Decision Making: Enhancing the Use and Value of Geospatial Data and Scientific Information

    NASA Astrophysics Data System (ADS)

    Shapiro, C. D.

    2014-12-01

    Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. Data democracy describes a world in which data and information are widely and broadly accessible, understandable, and useable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information. This pillar of data democracy is characterized by methods such as citizen science or crowd sourcing. A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy both on the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use as well as the identification of situations in which participation can be expanded in scientific efforts to enhance the breadth of involvement as well as expanding participation to non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data democracy not only provides benefits to a greater population, it enhances the value of science.

  8. Implementation of Open-Source Web Mapping Technologies to Support Monitoring of Governmental Schemes

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2015-10-01

    Several schemes are undertaken by the government to uplift the social and economic condition of people. These schemes are currently monitored through information technology in which Geographic Information Systems (GIS) are largely absent. To demonstrate the benefits of thematic mapping as a tool for assisting officials in decision making, web mapping applications were built for three government programs: the Mother and Child Tracking System (MCTS), the Telangana State Housing Corporation Limited (TSHCL) and Ground Water Quality Mapping (GWQM). The three applications depict the distribution of various parameters thematically and help identify areas with stronger and weaker distributions. Based on these applications, the study finds that many government schemes share characteristics that lend themselves to thematic mapping, and concludes that this kind of approach can be implemented for other schemes as well. The applications were developed using SharpMap, a free and open-source C# mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantages of this library over proprietary alternatives, and further discusses its advantages over other open-source libraries as well.
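
    SharpMap itself is a C#/.NET library; as a language-neutral illustration of the classification step that any thematic (choropleth-style) renderer performs, the following Python sketch assigns legend colours to hypothetical per-district scheme-coverage values. The district names, values, and class breaks are invented for illustration and are not taken from the applications described above.

    ```python
    # Hypothetical per-district scheme coverage values (percent); illustrative only.
    coverage = {"District A": 92.5, "District B": 61.0, "District C": 34.8, "District D": 78.3}

    # Graduated colour classes, as a thematic map legend might define them.
    CLASSES = [
        (0.0, 40.0, "#d73027"),     # weak coverage
        (40.0, 70.0, "#fee08b"),    # moderate coverage
        (70.0, 100.01, "#1a9850"),  # strong coverage
    ]

    def classify(value):
        """Return the legend colour whose class interval contains the value."""
        for lower, upper, colour in CLASSES:
            if lower <= value < upper:
                return colour
        raise ValueError(f"value outside legend range: {value}")

    for district, value in sorted(coverage.items()):
        print(f"{district}: {value:5.1f}% -> {classify(value)}")
    ```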

  9. Digital Geologic Mapping and Integration with the Geoweb: The Death Knell for Exclusively Paper Geologic Maps

    NASA Astrophysics Data System (ADS)

    House, P. K.

    2008-12-01

    The combination of traditional methods of geologic mapping with rapidly developing web-based geospatial applications ('the geoweb') and the various collaborative opportunities of web 2.0 has the potential to change the nature, value, and relevance of geologic maps and related field studies. Parallel advances in basic GPS technology, digital photography, and related integrative applications provide practicing geologic mappers with greatly enhanced methods for collecting, visualizing, interpreting, and disseminating geologic information. Even a cursory application of available tools can make field and office work more enriching and efficient, whereas more advanced and systematic applications provide new avenues for collaboration, outreach, and public education. Moreover, they ensure a much broader audience among an immense number of internet-savvy end-users with very specific expectations for geospatial data availability. Perplexingly, the geologic community as a whole is not fully exploring this opportunity despite the inevitable revolution it portends. The slow acceptance follows a broad generational trend wherein seasoned professionals are lagging behind geology students and recent graduates in their grasp of and interest in the capabilities of the geoweb and web 2.0 types of applications. Possible explanations for this include: fear of the unknown, fear of the learning curve, lack of interest, lack of academic/professional incentive, and (hopefully not) reluctance toward open collaboration. Although some aspects of the expanding geoweb are cloaked in arcane computer code, others are extremely simple to understand and use. A particularly obvious and simple application to enhance any field study is photo geotagging, the digital documentation of the locations of key outcrops, illustrative vistas, and particularly complicated geologic field relations. Viewing geotagged photos in their appropriate context on a virtual globe with high-resolution imagery can be an extremely useful accompaniment to the compilation of field mapping efforts. It can also complement published geologic maps by vastly improving their comprehensibility when field photos and specific notes can be viewed interactively with them. Other useful applications include GPS tracking/documentation of field traverses; invoking multiple geologic layers; 3-D visualizations of terrain and structure; and online collaboration with colleagues via blogs or wikis. Additional steps towards collaborative geologic mapping on the web may also enhance efficient and open sharing of data and ideas. Geologists are well aware that paper geologic maps can convey tremendous amounts of information. Digital geologic maps linked via a virtual globe with field data, diverse imagery, historical photographs, explanatory diagrams, and 3-D models convey a much greater amount of information and can provide a much richer context for comprehension and interpretation. They can also serve as an efficient, entertaining, and potentially compelling mechanism for fostering inspiration in the minds of budding (and aging) geologists.

  10. Geospatial Brokering - Challenges and Future Directions

    NASA Astrophysics Data System (ADS)

    White, C. E.

    2012-12-01

    An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight to future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
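
    As a sketch of the programmatic (machine-to-machine) access path mentioned above, the snippet below queries a standards-based catalogue endpoint through its CSW interface using the OWSLib client library. The endpoint URL and search term are placeholders, and OWSLib is simply one convenient open-source client, not a library named in the abstract.

    ```python
    # Query a hypothetical brokering catalogue through its OGC CSW interface.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://example.org/geoportal/csw")  # placeholder endpoint

    # Free-text style constraint; '%' acts as the wildcard in PropertyIsLike.
    query = PropertyIsLike("csw:AnyText", "%flood%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for identifier, record in csw.records.items():
        print(identifier, "-", record.title)
    ```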

  11. Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii

    2013-04-01

    Over the last decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood Risk Directive). In order to enable operational flood monitoring and assessment of flood risk, it is necessary to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, integration of the Grid and Sensor Web approaches is proposed [1]. The Grid represents a distributed environment that integrates heterogeneous computing and storage resources administrated by multiple organizations. The Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, the triggering of events by observed or predicted conditions, remote data access, and processing capabilities to generate and deliver data products. The Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding the integration of the Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from integration with a Grid platform like the Globus Toolkit. The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecasts derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor, with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by the Grid and cloud infrastructure it was possible to generate flood maps within 24-48 h after an alert was triggered. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.
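
    To make the role of the SWE services concrete, here is a minimal Python sketch of the kind of key-value GetObservation request a Sensor Web client could send to an SOS endpoint (following the SOS 2.0 KVP binding). The endpoint URL, offering name, observed-property URI, and time window are all placeholders rather than values from the Namibia pilot.

    ```python
    # Request recent observations from a hypothetical OGC Sensor Observation Service.
    import requests

    SOS_URL = "https://example.org/sos"  # placeholder SOS endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "river_gauges",                                # assumed offering id
        "observedProperty": "http://example.org/def/water_level",  # assumed property URI
        # Temporal filter on phenomenon time, expressed as an ISO 8601 interval.
        "temporalFilter": "om:phenomenonTime,2013-04-01T00:00:00Z/2013-04-02T00:00:00Z",
    }

    response = requests.get(SOS_URL, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:500])  # O&M XML to be parsed and turned into flood alerts downstream
    ```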

  12. An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs, and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data with little ground truthing, which creates difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the PostgreSQL database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: (1) an offline-online Android mobile application with advanced geospatial visualisation; (2) easy collection and storage of event information for the applied services; (3) centralized data storage accessible to all the services (smartphone, standard web browser); and (4) improved data management through active participation in hazard event mapping and storage. The application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (via GPS and the Internet), map visualization with satellite image overlays, viewing uploaded images and events as cluster points, and drawing and adding event information. Data can be recorded offline (Android device) or online (any browser) and subsequently uploaded to the server whenever an Internet connection is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to control access to the data and communicate the information. The application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazard such as floods, avalanches, etc. Keywords: Offline, Online, WebGIS Open source, Android, Hazard Event Mapping
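
    The server-side step that stores a synchronised event can be as small as one parameterised insert into the PostGIS-enabled database named above. In the following sketch the connection string, table, and columns are assumptions, and psycopg2 stands in for whatever driver the prototype actually uses.

    ```python
    # Store one geolocated hazard event in a PostGIS table (schema is hypothetical).
    import psycopg2

    event = {
        "category": "landslide",
        "description": "debris flow blocking access road",
        "lon": 85.3240,  # illustrative coordinates
        "lat": 27.7172,
    }

    conn = psycopg2.connect("dbname=hazards user=app password=secret host=localhost")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO hazard_events (category, description, geom)
            VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
            """,
            (event["category"], event["description"], event["lon"], event["lat"]),
        )
    conn.close()  # the 'with conn' block committed the transaction on success
    ```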

  13. Issues on Building Kazakhstan Geospatial Portal to Implement E-Government

    NASA Astrophysics Data System (ADS)

    Sagadiyev, K.; Kang, H. K.; Li, K. J.

    2016-06-01

    A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the process of land management in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements as follows. First, we establish a transparent governmental process, which is one of the main goals of e-government: every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort spent in the governmental process. For example, the procedure for granting a building construction permit has taken more than one year and involved more than 50 steps; it is expected that the geoportal framework would reduce this procedure to two weeks. Third, the geoportal provides a collaborative environment between different governmental structures, whereas conflicts and mismatches between them have been a critical issue in governmental administration processes.
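
    A geoportal built on the stack named above (GeoServer, PostGIS, OpenLayers) typically publishes a database table as a web layer through GeoServer's REST API. The sketch below shows that step in Python; the host, workspace, store, credentials, and table name are placeholders, not details from the Kazakhstan project.

    ```python
    # Publish a PostGIS table as a GeoServer layer via the REST API (names are placeholders).
    import requests

    GEOSERVER = "http://localhost:8080/geoserver"
    AUTH = ("admin", "geoserver")  # default credentials; replace in any real deployment

    resp = requests.post(
        f"{GEOSERVER}/rest/workspaces/egov/datastores/land_db/featuretypes",
        json={"featureType": {"name": "land_parcels", "title": "Land parcels"}},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()  # GeoServer answers 201 Created when the layer is published
    print("published:", resp.status_code)
    ```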

  14. OpenSearch technology for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Papeschi, Fabrizio; Boldrini, Enrico; Mazzetti, Paolo

    2010-05-01

    In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around the same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow the publishing of search results in a format suitable for syndication and aggregation; it is a way for websites and search engines to publish search results in a standard and accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both the mass media and the scientific community. The explosive growth in popularity of Web 2.0 technologies like OpenSearch and the practical application of Service Oriented Architecture (SOA) have resulted in increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA is considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services and compose and use them according to their current needs. A great degree of similarity between SOA and Web 2.0 may be leading to a convergence between the two paradigms. They also expose divergent elements, such as Web 2.0's support for human interaction as opposed to SOA's typical machine-to-machine interaction. Following these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular by taking advantage of OpenSearch technology. A specific GI niche is represented by the OGC Catalogue Service for the Web (CSW), part of the OGC Web Services (OWS) specification suite, which provides a set of services for the discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment, there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogues to support OpenSearch queries. In this change request, an OpenSearch extension is proposed that provides a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called the 'OpenSearch profiler'. Since GI-cat also acts as a clearinghouse catalog, another component called the 'OpenSearch accessor' is added in order to access OpenSearch compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mapping are required: query mapping and response element mapping.
Query mapping fits the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax. The GI-cat internal data model is based on the ISO 19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, they need to be translated from the GI-cat internal data model to the above-mentioned syndication formats in order to be presented; the mapping process is bidirectional. When GI-cat is used to access OpenSearch-compliant services, the CSW query must be mapped to the OpenSearch query, and the response elements must be translated into the GI-cat internal data model. As a result of these extensions, GI-cat provides a user-friendly facade to the complex CSW interface, enabling it to be queried, for example, from a browser toolbar.
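
    To illustrate the query-mapping direction described above, the following Python sketch issues a simple OpenSearch-style request carrying a bounding box and a time window. The endpoint is a placeholder, and the exact parameter names would in practice be read from the service's OpenSearch description document rather than hard-coded.

    ```python
    # A simple OpenSearch-style query with spatial and temporal constraints.
    import requests

    ENDPOINT = "https://example.org/gi-cat/opensearch"  # placeholder OpenSearch endpoint

    params = {
        "searchTerms": "sea surface temperature",
        "bbox": "-10,35,20,60",              # west,south,east,north (OpenSearch Geo extension)
        "start": "2009-01-01T00:00:00Z",     # interval start (OpenSearch Time extension)
        "end": "2009-12-31T23:59:59Z",       # interval end
        "count": 20,                         # number of results per page
    }

    resp = requests.get(ENDPOINT, params=params, timeout=30)
    resp.raise_for_status()
    print(resp.headers.get("Content-Type"))  # typically an Atom or RSS feed of catalogue records
    ```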

  15. Geospatial Analysis and Model Evaluation Software (GAMES): Integrated Web-Based Analysis and Visualization

    DTIC Science & Technology

    2014-04-11

    Particle location files for each source (hours); dti: time step in seconds; horzmix: CONSTANT = use the value of horcon ... if leg lengths are short, extreme values of D/Lo can occur; these are handled by assigning a maximum to the output.

  16. 33 CFR 72.05-10 - Free distribution.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Geospatial-Intelligence Agency's Web site: (http://pollux.nss.nima.mil/pubs/USCGLL/pubs_j_uscgll_list.html). (R.S. 501, as amended, sec. 5, 38 Stat. 75; 44 U.S.C. 82, 84) [CGFR 51-15, 18 FR 13, Jan. 1, 1953, as amended by USCG-2001-10714, 69 FR 24984, May 5, 2004] ...

  17. A Curriculum-Linked Professional Development Approach to Support Teachers' Adoption of Web GIS Tectonics Investigations

    ERIC Educational Resources Information Center

    Bodzin, Alec; Anastasio, David; Sahagian, Dork; Henry, Jill Burrows

    2016-01-01

    A curriculum-linked professional development approach designed to support middle level science teachers' understandings about tectonics and geospatial pedagogical content knowledge was developed. This approach takes into account limited face-to-face professional development time and instead provides pedagogical support within the design of a…

  18. Accessing Geospatial Services in Limited Bandwidth Service-Oriented Architecture (SOA) Environments

    ERIC Educational Resources Information Center

    Boggs, James D.

    2013-01-01

    First responders are continuously moving at an incident site and this movement requires them to access Service-Oriented Architecture services, such as a Web Map Service, via mobile wireless networks. First responders from inside a building often have problems in communicating to devices outside that building due to propagation obstacles. Dynamic…

  19. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analysed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within their property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  20. Creating Actionable Data from an Optical Depth Measurement Network using RDF

    NASA Astrophysics Data System (ADS)

    Freemantle, J. R.; O'Neill, N. T.; Lumb, L. I.; Abboud, I.; McArthur, B.

    2010-12-01

    The AEROCAN sunphotometry network has, for more than a decade, generated optical indicators of aerosol concentration and size on a regional and national scale. We believe this optical information can be rendered more “actionable” for the health care community by developing a technical and interpretative information-sharing geospatial strategy with that community. By actionable data we mean information that is presented in a manner that can be understood and then used in the decision-making process. The decision may be that of a technical professional, a policy maker or a machine. The information leading up to a decision may come from many sources; this means it is particularly important that data are well defined across knowledge fields, in our case atmospheric science and respiratory health science. As part of the AEROCAN operational quality assurance (QA) methodology we have written automatic procedures to make some of the AEROCAN data more accessible or “actionable”. Tim Berners-Lee has advocated making datasets available on the web as “Linked Data”, with a proper structural description (metadata). We have been using RDF (Resource Description Framework) to enhance the utility of our sunphotometer data; the resulting self-describing representation is structured so that it is machine readable. This allows semantically based queries (e.g., via SPARQL) on a dataset that in the past was only viewable as passive Web tables of data.
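
    As a minimal sketch of this Linked Data approach, the Python snippet below expresses a single sunphotometer reading as RDF triples with rdflib and retrieves it with a SPARQL query. The vocabulary namespace, property names, station, and values are invented for illustration and are not the actual AEROCAN schema.

    ```python
    # One observation as self-describing triples, queried with SPARQL.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import XSD

    AER = Namespace("https://example.org/aerocan/vocab#")  # hypothetical vocabulary
    obs = URIRef("https://example.org/aerocan/obs/egbert-2010-07-15")

    g = Graph()
    g.add((obs, AER.station, Literal("Egbert")))
    g.add((obs, AER.aerosolOpticalDepth500nm, Literal(0.12, datatype=XSD.float)))
    g.add((obs, AER.observedAt, Literal("2010-07-15T18:00:00Z", datatype=XSD.dateTime)))

    # A semantic query over the graph: which stations reported AOD above 0.1?
    results = g.query("""
        PREFIX aer: <https://example.org/aerocan/vocab#>
        SELECT ?station ?aod WHERE {
            ?o aer:station ?station ;
               aer:aerosolOpticalDepth500nm ?aod .
            FILTER (?aod > 0.1)
        }
    """)
    for station, aod in results:
        print(station, aod)
    ```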

  1. Incorporating Historic Facility Geospatial Data and Federal Preservation Requirements into SDSFIE/FMSFIE

    DTIC Science & Technology

    2006-11-01

    ... are later based on that information. Despite their general level of power and resolution, Federal data management and accounting tools have not yet ... have begun tracking their historic building and structure inventories using geographic information systems (GISs). A geospatial-referenced data ...

  2. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
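
    A minimal Python sketch of the factory idea follows; the class and field names are illustrative and not the paper's actual design. A single entry point inspects the design specification and returns the appropriate map builder.

    ```python
    # Factory pattern for creating different map products from a design specification.
    from dataclasses import dataclass

    @dataclass
    class MapSpec:
        theme: str                 # e.g. "choropleth" or "heatmap"
        title: str
        crs: str = "EPSG:3857"     # output coordinate reference system

    class ChoroplethMap:
        def render(self, spec: MapSpec) -> str:
            return f"choropleth map '{spec.title}' in {spec.crs}"

    class HeatMap:
        def render(self, spec: MapSpec) -> str:
            return f"heat map '{spec.title}' in {spec.crs}"

    class MapFactory:
        _builders = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

        @classmethod
        def create(cls, spec: MapSpec):
            try:
                return cls._builders[spec.theme]()
            except KeyError:
                raise ValueError(f"no map builder registered for theme '{spec.theme}'")

    spec = MapSpec(theme="heatmap", title="Storm damage density")
    print(MapFactory.create(spec).render(spec))
    ```

    Registering builders in a dictionary keeps the factory open to extension: a new map type only needs a new builder class and one registry entry, which is the reuse benefit the abstract describes.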

  3. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    PubMed Central

    Boulos, Maged N Kamel; Honda, Kiyoshi

    2006-01-01

    Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
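
    For readers who prefer a scripted route to the same result, the Python sketch below retrieves a layer from a remote WMS with the OWSLib client. The WMS URL, layer name, and bounding box are placeholders, and OWSLib is an assumed convenience, not part of the MapServer/MapLab tutorial itself.

    ```python
    # Fetch a map image from a hypothetical remote WMS endpoint.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/wms", version="1.1.1")  # placeholder URL

    img = wms.getmap(
        layers=["health_districts"],        # assumed layer name
        styles=[""],
        srs="EPSG:4326",
        bbox=(-10.0, 35.0, 20.0, 60.0),     # minx, miny, maxx, maxy
        size=(600, 400),
        format="image/png",
        transparent=True,
    )

    with open("health_districts.png", "wb") as out:
        out.write(img.read())               # the PNG can now be overlaid on other layers
    ```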

  4. Common Approach to Geoprocessing of Uav Data across Application Domains

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Reichardt, M.; Taylor, T.

    2015-08-01

    UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems so existing frameworks are applicable. But the diversity of UAVs as platforms along with the diversity of available sensors are presenting challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need for use of existing standards and extending standards. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.

  5. Use of NASA Near Real-Time and Archived Satellite Data to Support Disaster Assessment

    NASA Technical Reports Server (NTRS)

    McGrath, Kevin M.; Molthan, Andrew L.; Burks, Jason E.

    2014-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) Center partners with the NWS to provide near realtime data in support of a variety of weather applications, including disasters. SPoRT supports NASA's Applied Sciences Program: Disasters focus area by developing techniques that will aid the disaster monitoring, response, and assessment communities. SPoRT has explored a variety of techniques for utilizing archived and near real-time NASA satellite data. An increasing number of end-users - such as the NWS Damage Assessment Toolkit (DAT) - access geospatial data via a Web Mapping Service (WMS). SPoRT has begun developing open-standard Geographic Information Systems (GIS) data sets via WMS to respond to end-user needs.

  6. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    NASA Astrophysics Data System (ADS)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open source based, dynamic two-dimensional visualisation library, that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that either an existing taxonomy can be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two dimensional visualisations that enable human interaction in order to assist the user to understand the data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  7. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    NASA Astrophysics Data System (ADS)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and provide suggestions for the most appropriate visualizations based on the statistics of the data. Since there are just a few ways to visualize any given dataset well, and since many users don't go beyond default values, it is imperative to provide smart default color schemes tailored to the dataset rather than static defaults. Multiple functions for automating visualizations are available in the Smart APIs, along with UI elements allowing users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables, allowing the user to choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components that instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or be configured for a variety of datasets from multiple sources.
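
    The statistics-driven idea can be illustrated without any particular vendor API. The plain-Python sketch below, on illustrative data, derives default stops for a continuous colour ramp from the mean and standard deviation of the values and flags likely outliers, which is the essence of the "smart default" behaviour described above.

    ```python
    # Derive continuous-ramp stops and outlier flags from the data's own statistics.
    import statistics

    values = [3.2, 4.1, 4.8, 5.0, 5.3, 6.9, 7.4, 8.8, 21.5]  # illustrative data

    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)

    ramp_stops = {
        "low":  max(min(values), mean - stdev),   # clamp to the observed range
        "mid":  mean,
        "high": min(max(values), mean + stdev),
    }
    outliers = [v for v in values if abs(v - mean) > 2 * stdev]

    print("suggested ramp stops:", ramp_stops)
    print("values to flag as outliers:", outliers)
    ```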

  8. An information model for managing multi-dimensional gridded data in a GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.; Abdul-Kadar, F.; Gao, P.

    2016-04-01

    Earth observation agencies like NASA and NOAA produce huge volumes of historical, near real-time, and forecasting data representing terrestrial, atmospheric, and oceanic phenomena. The data drives climatological and meteorological studies, and underpins operations ranging from weather pattern prediction and forest fire monitoring to global vegetation analysis. These gridded data sets are distributed mostly as files in HDF, GRIB, or netCDF format and quantify variables like precipitation, soil moisture, or sea surface temperature, along one or more dimensions like time and depth. Although the data cube is a well-studied model for storing and analyzing multi-dimensional data, the GIS community remains in need of a solution that simplifies interactions with the data, and elegantly fits with existing database schemas and dissemination protocols. This paper presents an information model that enables Geographic Information Systems (GIS) to efficiently catalog very large heterogeneous collections of geospatially-referenced multi-dimensional rasters—towards providing unified access to the resulting multivariate hypercubes. We show how the implementation of the model encapsulates format-specific variations and provides unified access to data along any dimension. We discuss how this framework lends itself to familiar GIS concepts like image mosaics, vector field visualization, layer animation, distributed data access via web services, and scientific computing. Global data sources like MODIS from USGS and HYCOM from NOAA illustrate how one would employ this framework for cataloging, querying, and intuitively visualizing such hypercubes. ArcGIS—an established platform for processing, analyzing, and visualizing geospatial data—serves to demonstrate how this integration brings the full power of GIS to the scientific community.
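
    A small Python sketch of what "unified access along any dimension" looks like in practice, using the xarray library; the file name, variable name, and dimension names below are assumptions for illustration, not the catalog implementation described in the abstract.

    ```python
    # Slice a multidimensional ocean-temperature grid along depth and time.
    import xarray as xr

    ds = xr.open_dataset("hycom_subset.nc")           # hypothetical netCDF extract
    temp = ds["water_temp"]                           # assumed dims: (time, depth, lat, lon)

    surface = temp.sel(depth=0.0, method="nearest")   # slice along the depth dimension
    january = surface.sel(time="2016-01")             # slice along the time dimension
    print(january.mean(dim="time"))                   # aggregate to a single 2-D field
    ```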

  9. Implementing a Web-Based Decision Support System to Spatially and Statistically Analyze Ecological Conditions of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.

    2014-12-01

    The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
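
    As a sketch of the Plotly-based statistical component, the Python snippet below plots an illustrative subbasin time series and writes it out as a standalone HTML page. The variable, values, and file name are invented and do not come from the Sierra Nevada datasets.

    ```python
    # Plot an illustrative subbasin snow-water-equivalent time series with Plotly.
    import plotly.graph_objects as go

    years = list(range(2000, 2011))
    swe = [610, 585, 540, 575, 520, 498, 530, 470, 455, 480, 440]  # illustrative values (mm)

    fig = go.Figure(go.Scatter(x=years, y=swe, mode="lines+markers", name="April 1 SWE"))
    fig.update_layout(
        title="Illustrative subbasin snow water equivalent",
        xaxis_title="Water year",
        yaxis_title="SWE (mm)",
    )
    fig.write_html("subbasin_swe.html")  # shareable, interactive chart
    ```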

  10. Basic Information

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources-related applications. NHDPlus was developed by the USEPA in partnership with the US Geological Survey.

  11. EPA Facility Registry Service (FRS): RCRA

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of hazardous waste facilities that link to the Resource Conservation and Recovery Act Information System (RCRAInfo). EPA's comprehensive information system in support of the Resource Conservation and Recovery Act (RCRA) of 1976 and the Hazardous and Solid Waste Amendments (HSWA) of 1984, RCRAInfo tracks many types of information about generators, transporters, treaters, storers, and disposers of hazardous waste. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to RCRAInfo hazardous waste facilities once the RCRAInfo data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs

  12. Representing Geospatial Environment Observation Capability Information: A Case Study of Managing Flood Monitoring Sensors in the Jinsha River Basin

    PubMed Central

    Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng

    2016-01-01

    Sensor inquirers cannot understand comprehensive or accurate observation capability information because current observation capability modeling does not consider the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors. PMID:27999247

  13. Representing Geospatial Environment Observation Capability Information: A Case Study of Managing Flood Monitoring Sensors in the Jinsha River Basin.

    PubMed

    Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng

    2016-12-16

    Sensor inquirers cannot understand comprehensive or accurate observation capability information because current observation capability modeling does not consider the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors.

  14. A Spatial Data Infrastructure to Share Earth and Space Science Data

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Mazzetti, P.; Bigagli, L.; Cuomo, V.

    2006-05-01

    Spatial Data Infrastructure (SDI), also known as Geospatial Data Infrastructure, is fundamentally a mechanism to facilitate the sharing and exchange of geospatial data. SDI is a scheme necessary for the effective collection, management, access, delivery and utilization of geospatial data; it is important for objective decision making and sound land-based policy, for supporting economic development, and for encouraging socially and environmentally sustainable development. As far as data models and semantics are concerned, a valuable and effective SDI should be able to cross the boundaries between the Geographic Information System/Science (GIS) and Earth and Space Science (ESS) communities. Hence, an SDI should be able to discover, access and share information and data produced and managed by both the GIS and ESS communities in an integrated way. In other terms, an SDI must be built on a conceptual and technological framework which abstracts the nature and structure of the shared datasets: feature-based data or Imagery, Gridded and Coverage Data (IGCD). ISO TC211 and the Open Geospatial Consortium have provided important artifacts for building up this framework. In particular, the OGC Web Services (OWS) initiatives and several Interoperability Experiments (e.g. the GALEON IE) are extremely useful for this purpose. We present an SDI solution which is able to manage both GIS and ESS datasets. It is based on OWS and other well-accepted or promising technologies, such as UNIDATA netCDF and CDM, ncML and ncML-GML. Moreover, it uses a specific technology to implement a distributed and federated system of catalogues: GI-Cat. This technology performs data model mediation and protocol adaptation tasks. It is used to work out a metadata clearinghouse service, implementing a common (federal) catalogue model which is based on the ISO 19115 core metadata for geo-datasets. Nevertheless, other well-accepted or standard catalogue data models can easily be implemented as the common view (e.g. OGC CS-W, the forthcoming INSPIRE discovery metadata model, etc.). The proposed solution has been conceived and developed for building up the "Lucan SDI", the SDI of the Italian Basilicata Region. It aims to connect the following data providers and users: the National River Basin Authority of Basilicata, the Regional Environmental Agency, the Land Management & Cadastre Regional Authorities, the Prefecture, the Regional Civil Protection Centers, the National Research Council Institutes in Basilicata, the Academia, and several SMEs.

  15. Application of OpenStreetMap (OSM) to Support the Mapping Village in Indonesia

    NASA Astrophysics Data System (ADS)

    Swasti Kanthi, Nurin; Hery Purwanto, Taufik

    2016-11-01

    Geospatial information is important in this era because location information is needed to know the condition of a region. In 2015 the Indonesian government released regulatory standards for detailed mapping at the village level, set forth as Norms, Standards, Procedures and Criteria for Village Mapping (NSPK). Over time, Web and Mobile GIS have been developed with a wide range of applications, yet the combination of detailed village mapping with Web GIS is still rarely performed and not used optimally. OpenStreetMap (OSM) is a Web GIS that can also be used as a Mobile GIS; it provides information detailed enough to represent individual buildings and can therefore be used for village mapping. Village mapping using OSM was conducted with a remote sensing and Geographical Information System (GIS) approach, interpreting the remote sensing imagery available in OSM. The study analysed how far OSM can support village mapping by entering house number data, administrative boundaries, public facilities, and land use into OSM, with reference data and Village Plan imagery. The resulting maps of parts of the villages in OSM were used as reference village maps and analysed for conformance with the NSPK for detailed mapping of the Rukun Warga (RW), which is part of village mapping. The use of OSM greatly assists detailed mapping of a region, with image data sources that can be accessed openly; however, the data source still requires care and regular updating to maintain the validity of the data.
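
    Features entered into OSM at the village level can later be pulled back out programmatically. The Python sketch below queries the public Overpass API for school amenities inside an illustrative bounding box; the coordinates are placeholders, and Overpass is an assumed access route rather than one used in the study.

    ```python
    # Retrieve OSM nodes tagged as schools inside an illustrative bounding box.
    import requests

    OVERPASS_URL = "https://overpass-api.de/api/interpreter"

    # Bounding box order in Overpass QL is (south, west, north, east); values are placeholders.
    query = """
    [out:json][timeout:25];
    node["amenity"="school"](-7.80,110.35,-7.76,110.40);
    out;
    """

    resp = requests.post(OVERPASS_URL, data={"data": query}, timeout=60)
    resp.raise_for_status()
    for element in resp.json().get("elements", []):
        print(element["id"], element.get("tags", {}).get("name", "(unnamed)"))
    ```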

  16. Autonomous Mission Operations for Sensor Webs

    NASA Astrophysics Data System (ADS)

    Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.

    2008-12-01

    We present interim results of a 2005 ROSES AIST project entitled, "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents intends to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. The need for ontological conceptualizations over the agents is to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end user science requests. The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.

  17. Impacts of Geospatial Information for Decision Making

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Coote, A.; Friedl, L.; Stewart, M.

    2012-12-01

    Geospatial information contributes to decisions by both societal and individual decision-makers. More effective use of this information is essential as issues are increasingly complex and consequences can be critical for future economic and social development. To address this, a workshop brought together analysts, communicators, officials, and researchers from academia, government, non-governmental organizations, and the private sector. A range of policy issues, management needs, and resource requirements were discussed and a wide array of analyses, geospatial data, methods of analysis, and metrics were presented for assessing and communicating the value of geospatial information. It is clear that there are many opportunities for integrating science and engineering disciplines with the social sciences for addressing societal issues that would benefit from using geospatial information and earth observations. However, these collaborations must have outcomes that can be easily communicated to decision makers. This generally requires either succinct quantitative statements of value based on rigorous models and/or user testimonials of actual applications that save real money. An outcome of the workshop is to pursue the development of a community of practice or society that encompasses a wide range of scientific, social, management, and communication disciplines and fosters collaboration across specialties, helping to build trust across social and science aspects. A resource base is also necessary. This presentation will address approaches for creating a shared knowledge database, containing a glossary of terms, reference materials and examples of case studies and the potential applications for benefit analyses.

  18. Geospatial Data Sciences | Energy Analysis | NREL

    Science.gov Websites

    ... demographics, and the earth's physical geography to provide the foundation for energy analysis and decision-making. Geospatial Analysis: our geographic information system ...

  19. The U.S. Geological Survey cartographic and geographic information science research activities 2006-2010

    USGS Publications Warehouse

    Usery, E. Lynn

    2011-01-01

    The U.S. Geological Survey (USGS) produces geospatial databases and topographic maps for the United States of America. A part of that mission includes conducting research in geographic information science (GIScience) and cartography to support mapping and improve the design, quality, delivery, and use of geospatial data and topographic maps. The Center of Excellence for Geospatial Information Science (CEGIS) was established by the USGS in January 2006 as a part of the National Geospatial Program Office. CEGIS (http://cegis.usgs.gov) evolved from a team of cartographic researchers at the Mid-Continent Mapping Center. The team became known as the Cartographic Research group and was supported by the Cooperative Topographic Mapping, Geographic Analysis and Monitoring, and Land Remote Sensing programs of the Geography Discipline of the USGS from 1999-2005. In 2006, the Cartographic Research group and its projects (http://carto-research.er.usgs.gov/) became the core of CEGIS staff and research. In 2006, CEGIS research became focused on The National Map (http://nationalmap.gov).

  20. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the Party and government have taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service model based on innovative resource pooling, cloud computing can connect huge resource pools to provide a variety of IT services, and it has become a relatively mature technical pattern backed by extensive study and massive practical application. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically distributed fundamental database, and developed a national integrated information database system supporting the main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  1. Focused sunlight factor of forest fire danger assessment using Web-GIS and RS technologies

    NASA Astrophysics Data System (ADS)

    Baranovskiy, Nikolay V.; Sherstnyov, Vladislav S.; Yankovich, Elena P.; Engel, Marina V.; Belov, Vladimir V.

    2016-08-01

    The Timiryazevskiy forestry of Tomsk region (Siberia, Russia) is the study area of the current research. Forest fire danger assessment is based on a unique technology that uses a probabilistic criterion, statistical data on forest fires, meteorological conditions, a classification of forest sites, and remote sensing data. MODIS products are used to estimate some meteorological conditions and the current forest fire situation. Geoinformation technologies are used for geospatial analysis of the forest fire danger situation over controlled forested territories. The GIS engine provides the means to construct electronic maps with different levels of forest fire probability and supports a raster layer of satellite remote sensing data on current forest fires. A web interface is used to load data onto a dedicated web site and to present forest fire danger data via the World Wide Web. Special web forms provide an interface for choosing the relevant input data in order to process the forest fire danger data and assess the forest fire probability.

  2. Geospatial Information Response Team

    USGS Publications Warehouse

    Witt, Emitt C.

    2010-01-01

    Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, the Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-sciences, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data, information, and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders and land and resource managers, and for scientific analysis. In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of geospatial experts and equipment.

  3. New Generation Sensor Web Enablement

    PubMed Central

    Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob

    2011-01-01

    Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760

  4. Geospatial Authentication

    NASA Technical Reports Server (NTRS)

    Lyle, Stacey D.

    2009-01-01

    A software package has been designed to allow authentication for determining whether a rover is within a set of boundaries or a specific area before it can access critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system only allows those within designated geospatial boundaries or areas into the server.

  5. Geospatial Data Availability for Haiti: An Aid in the Development of GIS-Based Natural Resource Assessments for Conservation Planning.

    Treesearch

    Maya Quinones; William Gould; Carlos D. Rodriguez-Pedraza

    2007-01-01

    This report documents the type and source of geospatial data available for Haiti. It was compiled to serve as a resource for geographic information system (GIS)-based land management and planning. It will be useful for conservation planning, reforestation efforts, and agricultural extension projects. Our study indicates that there is a great deal of geospatial...

  6. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    NASA Astrophysics Data System (ADS)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    Natural resources dependence of mountain communities, rapid social and developmental changes, disaster proneness, and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its distinctive geographic settings and physical and cultural diversity, presents a formidable challenge to collecting and managing data and information and to understanding varied socio-ecological settings. Recent advances in earth observation, near real-time data, and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process, and generate information and how we use such information for societal benefits. Glacier dynamics, land cover changes, disaster risk reduction systems, food security, and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision-making systems over the region. The emergence and adoption of near real-time systems, unmanned aerial vehicles (UAV), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems, and community-based adaptation models tailored to mountain-specific environments. There are differentiated capacities among the ICIMOD regional member countries with regard to utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, a Himalayan GEO supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way towards evolving sustainable Himalayan livelihoods.

  7. Enhancing Discovery, Search, and Access of NASA Hydrological Data by Leveraging GEOSS

    NASA Technical Reports Server (NTRS)

    Teng, William L.

    2015-01-01

    An ongoing NASA-funded project has removed a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for selected variables of the North American and Global Land Data Assimilation Systems (NLDAS and GLDAS, respectively) and other EOSDIS (Earth Observing System Data Information System) data sets (e.g., precipitation, soil moisture). These time series (data rods) are pre-generated. Data rods Web services are accessible through the CUAHSI Hydrologic Information System (HIS) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) but are not easily discoverable by users of other non-NASA data systems. The Global Earth Observation System of Systems (GEOSS) is a logical mechanism for providing access to the data rods. An ongoing GEOSS Water Services project aims to develop a distributed, global registry of water data, map, and modeling services cataloged using the standards and procedures of the Open Geospatial Consortium and the World Meteorological Organization. The ongoing data rods project has demonstrated the feasibility of leveraging the GEOSS infrastructure to help provide access to time series of model grid information or grids of information over a geographical domain for a particular time interval. A recently-begun, related NASA-funded ACCESS-GEOSS project expands on these prior efforts. Current work is focused on both improving the performance of the generation of on-the-fly (OTF) data rods and the Web interfaces from which users can easily discover, search, and access NASA data.
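
    To make the "data rods" idea concrete, the following minimal sketch shows how a point time series can be pulled out of a stack of time-step 2-D grids; the grid dimensions, coordinates, and values are synthetic placeholders, and this is not the project's actual NLDAS/GLDAS access code.

      # Minimal sketch: extract a time series ("data rod") at the grid cell nearest a point.
      import numpy as np

      n_times, n_lat, n_lon = 48, 224, 464            # hypothetical grid dimensions
      lats = np.linspace(25.0, 53.0, n_lat)
      lons = np.linspace(-125.0, -67.0, n_lon)
      grids = np.random.rand(n_times, n_lat, n_lon)   # stands in for archived time-step arrays

      def point_time_series(grids, lats, lons, lat, lon):
          """Return the time series at the grid cell nearest (lat, lon)."""
          i = int(np.argmin(np.abs(lats - lat)))
          j = int(np.argmin(np.abs(lons - lon)))
          return grids[:, i, j]

      rod = point_time_series(grids, lats, lons, 38.9, -77.0)
      print(rod.shape)  # one value per time step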

  8. Incorporating Geographic Information Science in the BSc Environmental Science Program in Botswana

    NASA Astrophysics Data System (ADS)

    Akinyemi, Felicia O.

    2018-05-01

    Critical human capacity in Geographic Information Science (GISc) is developed at the Botswana International University of Science and Technology, a specialized research university. Strategies employed include GISc courses offered each semester to students from various programs, the conduct of field-based projects, enrolment in online courses, geospatial initiatives with external partners, and final-year research projects utilizing geospatial technologies. A review is made of available GISc courses embedded in the Bachelor of Science Environmental Science program. GISc courses are incorporated in three Bachelor degree programs as distinct courses. Geospatial technologies are employed in several other courses. Student research projects apply GIS and Remote Sensing methods to environmental and geological themes. The overarching goals are to equip students in various disciplines to utilize geospatial technologies and to enhance their spatial thinking and reasoning skills.

  9. The Role of Visualization in Learning from Computer-Based Images. Research Report

    ERIC Educational Resources Information Center

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-01-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and…

  10. Broad-Scale Assessment of Fuel Treatment Opportunities

    Treesearch

    Patrick D. Miles; Kenneth E. Skog; Wayne D. Shepperd; Elizabeth D. Reinhardt; Roger D. Fight

    2006-01-01

    The Forest Inventory and Analysis (FIA) program has produced estimates of the extent and composition of the Nation's forests for several decades. FIA data have been used with a flexible silvicultural thinning option, a fire hazard model for preharvest and postharvest fire hazard assessment, a harvest economics model, and geospatial data to produce a Web-based tool to...

  11. EPA Facility Registry System (FRS): NCES

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry System (FRS) for the subset of facilities that link to the National Center for Education Statistics (NCES). The primary federal database for collecting and analyzing data related to education in the United States and other Nations, NCES is located in the U.S. Department of Education, within the Institute of Education Sciences. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to NCES school facilities once the NCES data has been integrated into the FRS database. Additional information on FRS is available at the EPA website http://www.epa.gov/enviro/html/fii/index.html.

  12. Onshore industrial wind turbine locations for the United States up to March 2014

    USGS Publications Warehouse

    Diffendorfer, James E.; Kramer, Louisa; Ancona, Zachary H.; Garrity, Christopher P.

    2015-01-01

    Wind energy is a rapidly growing form of renewable energy in the United States. While summary information on the total amounts of installed capacity is available by state, a free, centralized, national, turbine-level, geospatial dataset useful for scientific research, land and resource management, and other uses did not exist. Available in multiple formats and in a web application, these public domain data provide industrial-scale onshore wind turbine locations in the United States up to March 2014, corresponding facility information, and turbine technical specifications. Wind turbine records have been collected and compiled from various public sources, digitized or position verified from aerial imagery, and quality assured and quality controlled. Technical specifications for turbines were assigned based on the wind turbine make and model as described in public literature. In some cases, turbines were not seen in imagery or turbine information did not exist or was difficult to obtain. Uncertainty associated with these is recorded in a confidence rating.

  13. Onshore industrial wind turbine locations for the United States up to March 2014.

    PubMed

    Diffendorfer, Jay E; Kramer, Louisa A; Ancona, Zach H; Garrity, Christopher P

    2015-11-24

    Wind energy is a rapidly growing form of renewable energy in the United States. While summary information on the total amounts of installed capacity is available by state, a free, centralized, national, turbine-level, geospatial dataset useful for scientific research, land and resource management, and other uses did not exist. Available in multiple formats and in a web application, these public domain data provide industrial-scale onshore wind turbine locations in the United States up to March 2014, corresponding facility information, and turbine technical specifications. Wind turbine records have been collected and compiled from various public sources, digitized or position verified from aerial imagery, and quality assured and quality controlled. Technical specifications for turbines were assigned based on the wind turbine make and model as described in public literature. In some cases, turbines were not seen in imagery or turbine information did not exist or was difficult to obtain. Uncertainty associated with these is recorded in a confidence rating.

  14. EPA Facility Registry Service (FRS): CAMDBS

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Clean Air Markets Division Business System (CAMDBS). Administered by the EPA Clean Air Markets Division, within the Office of Air and Radiation, CAMDBS supports the implementation of market-based air pollution control programs, including the Acid Rain Program and regional programs designed to reduce the transport of ozone. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to CAMDBS facilities once the CAMDBS data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.

  15. Onshore industrial wind turbine locations for the United States up to March 2014

    PubMed Central

    Diffendorfer, Jay E.; Kramer, Louisa A.; Ancona, Zach H.; Garrity, Christopher P.

    2015-01-01

    Wind energy is a rapidly growing form of renewable energy in the United States. While summary information on the total amounts of installed capacity is available by state, a free, centralized, national, turbine-level, geospatial dataset useful for scientific research, land and resource management, and other uses did not exist. Available in multiple formats and in a web application, these public domain data provide industrial-scale onshore wind turbine locations in the United States up to March 2014, corresponding facility information, and turbine technical specifications. Wind turbine records have been collected and compiled from various public sources, digitized or position verified from aerial imagery, and quality assured and quality controlled. Technical specifications for turbines were assigned based on the wind turbine make and model as described in public literature. In some cases, turbines were not seen in imagery or turbine information did not exist or was difficult to obtain. Uncertainty associated with these is recorded in a confidence rating. PMID:26601687

  16. New implementation of OGC Web Processing Service in Python programming language. PyWPS-4 and issues we are facing with processing of large raster data using OGC WPS

    NASA Astrophysics Data System (ADS)

    Čepický, Jáchym; Moreira de Sousa, Luís

    2016-06-01

    The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of those processes, and binding to them within workflows. Data required by a WPS can be delivered across a network or can be available at the server. PyWPS was one of the first implementations of OGC WPS on the server side. It is written in the Python programming language and it tries to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
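
    For orientation, the sketch below shows the general shape of a process written against the PyWPS 4 API (Process, LiteralInput, LiteralOutput and a handler). It follows the patterns in the PyWPS documentation, but it is a hedged sketch: exact constructor arguments can differ between releases, and the "buffer" computation is only a stand-in for a real geospatial operation.

      # Minimal PyWPS 4 style process sketch; the identifier, title and the toy
      # computation are illustrative, not part of the PyWPS-4 codebase itself.
      from pywps import Process, LiteralInput, LiteralOutput

      class Buffer(Process):
          def __init__(self):
              inputs = [LiteralInput("distance", "Buffer distance", data_type="float")]
              outputs = [LiteralOutput("area", "Buffered area", data_type="float")]
              super(Buffer, self).__init__(
                  self._handler,
                  identifier="buffer",
                  title="Toy buffer process",
                  inputs=inputs,
                  outputs=outputs,
              )

          def _handler(self, request, response):
              d = float(request.inputs["distance"][0].data)
              response.outputs["area"].data = 3.14159 * d * d  # placeholder computation
              return response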

  17. Providing Geographic Datasets as Linked Data in Sdi

    NASA Astrophysics Data System (ADS)

    Hietanen, E.; Lehto, L.; Latvala, P.

    2016-06-01

    In this study, a prototype service to provide data from a Web Feature Service (WFS) as linked data is implemented. First, persistent and unique Uniform Resource Identifiers (URIs) are created for all spatial objects in the dataset. The objects are available from those URIs in the Resource Description Framework (RDF) data format. Next, a Web Ontology Language (OWL) ontology is created to describe the dataset information content using the Open Geospatial Consortium's (OGC) GeoSPARQL vocabulary. The existing data model is modified in order to take into account the linked data principles. The implemented service produces an HTTP response dynamically. The data for the response is first fetched from the existing WFS. Then the Geography Markup Language (GML) output of the WFS is transformed on-the-fly into the RDF format. Content negotiation is used to serve the data in different RDF serialization formats. This solution facilitates the use of a dataset in different applications without replicating the whole dataset. In addition, individual spatial objects in the dataset can be referred to with URIs. Furthermore, the needed information content of the objects can be easily extracted from the RDF serializations available from those URIs. A solution for linking data objects to the dataset URI is also introduced by using the Vocabulary of Interlinked Datasets (VoID). The dataset is divided into subsets and each subset is given its own persistent and unique URI. This enables the whole dataset to be explored with a web browser and all individual objects to be indexed by search engines.
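
    The heart of the transformation described above is emitting RDF triples that use the GeoSPARQL vocabulary for each spatial object. The minimal sketch below does this with rdflib for one invented feature; the URIs and geometry are illustrative placeholders, and the real service additionally handles GML parsing and content negotiation.

      # Minimal sketch: describe one spatial object with the GeoSPARQL vocabulary.
      from rdflib import Graph, Literal, Namespace, RDF, URIRef

      GEO = Namespace("http://www.opengis.net/ont/geosparql#")

      g = Graph()
      g.bind("geo", GEO)

      feature = URIRef("http://data.example.org/feature/123")        # hypothetical persistent URI
      geometry = URIRef("http://data.example.org/feature/123/geom")  # hypothetical geometry URI

      g.add((feature, RDF.type, GEO.Feature))
      g.add((feature, GEO.hasGeometry, geometry))
      g.add((geometry, RDF.type, GEO.Geometry))
      g.add((geometry, GEO.asWKT, Literal("POINT(24.94 60.17)", datatype=GEO.wktLiteral)))

      # One possible serialization; content negotiation would choose among several.
      print(g.serialize(format="turtle"))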

  18. Exploring Local Level Factors Shaping the Implementation of a Blended Learning Module for Information and Geospatial Literacy in Ontario

    ERIC Educational Resources Information Center

    Vine, Michelle M.; Chiappetta-Swanson, Catherine; Maclachlan, John; Brodeur, Jason J.; Bagg, Julianne

    2016-01-01

    The objectives of this research study were to examine local level factors shaping the implementation of a blended pedagogical approach for geospatial- and information-literacy, and to understand implementer satisfaction. As such, we addressed the following research questions: What local-level factors shape the implementation of the blended…

  19. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    PubMed Central

    2017-01-01

    Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available in the emerging "Big Data" era. Users are expecting that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that lead to enriching the completeness of OpenStreetMap data with footway crossing points of interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting required information from the existing datasets. PMID:29088125
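
    As a simplified illustration of rule-based derivation of implicit entities, the sketch below fires a single topological rule with shapely: if a footway crosses a road, a "footway crossing" point of interest is derived at the intersection. The geometries and the rule are invented for illustration and are not the paper's broker or its actual rule set.

      # Minimal sketch of one topological rule: footway crosses road -> derive crossing POI.
      from shapely.geometry import LineString

      road = LineString([(0, 0), (10, 0)])
      footway = LineString([(5, -3), (5, 3)])

      derived_pois = []
      if footway.crosses(road):                   # topological predicate
          crossing = footway.intersection(road)   # location where the rule fires
          derived_pois.append({"type": "footway_crossing", "geometry": crossing.wkt})

      print(derived_pois)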

  20. EPA Facility Registry Service (FRS): TRI

    EPA Pesticide Factsheets

    This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Toxic Release Inventory (TRI) System. TRI is a publicly available EPA database reported annually by certain covered industry groups, as well as federal facilities. It contains information about more than 650 toxic chemicals that are being used, manufactured, treated, transported, or released into the environment, and includes information about waste management and pollution prevention activities. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to TRI facilities once the TRI data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.

  1. a Novel Approach of Indexing and Retrieving Spatial Polygons for Efficient Spatial Region Queries

    NASA Astrophysics Data System (ADS)

    Zhao, J. H.; Wang, X. Z.; Wang, F. Y.; Shen, Z. H.; Zhou, Y. C.; Wang, Y. L.

    2017-10-01

    Spatial region queries are more and more widely used in web-based applications. Mechanisms to provide efficient query processing over geospatial data are essential. However, due to the massive geospatial data volume, heavy geometric computation, and high access concurrency, it is difficult to get responses in real time. Spatial indexes are usually used in this situation. In this paper, based on the k-d tree, we introduce a distributed KD-Tree (DKD-Tree) suitable for polygon data, and a two-step query algorithm. The spatial index construction is recursive and iterative, and the query is an in-memory process. Both the index and query methods can be processed in parallel, and are implemented based on HDFS, Spark and Redis. Experiments on a large volume of remote sensing image metadata have been carried out, and the advantages of our method are investigated by comparison with spatial region queries executed on PostgreSQL and PostGIS. Results show that our approach not only greatly improves the efficiency of spatial region queries, but also has good scalability. Moreover, the two-step spatial range query algorithm can also save cluster resources to support a large number of concurrent queries. Therefore, this method is very useful when building large geographic information systems.
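
    The two-step idea (a cheap coarse filter followed by exact geometric refinement) can be illustrated on a single machine as below; this is only a sketch of the filter-and-refine pattern with invented polygons, not the distributed HDFS/Spark/Redis index described in the paper.

      # Minimal filter-and-refine sketch for a spatial region query.
      from shapely.geometry import Polygon, box

      polygons = {
          "a": Polygon([(0, 0), (2, 0), (2, 2), (0, 2)]),
          "b": Polygon([(5, 5), (7, 5), (7, 7), (5, 7)]),
          "c": Polygon([(1, 1), (3, 1), (3, 3), (1, 3)]),
      }
      mbrs = {pid: geom.bounds for pid, geom in polygons.items()}  # (minx, miny, maxx, maxy)

      query = box(0.5, 0.5, 2.5, 2.5)
      qminx, qminy, qmaxx, qmaxy = query.bounds

      # Step 1: coarse filter on minimum bounding rectangles.
      candidates = [
          pid for pid, (minx, miny, maxx, maxy) in mbrs.items()
          if not (maxx < qminx or minx > qmaxx or maxy < qminy or miny > qmaxy)
      ]

      # Step 2: exact refinement with full geometric intersection tests.
      hits = [pid for pid in candidates if polygons[pid].intersects(query)]
      print(candidates, hits)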

  2. Regulating outdoor advertisement boards; employing spatial decision support system to control urban visual pollution

    NASA Astrophysics Data System (ADS)

    Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.

    2016-06-01

    Unmanaged placement, size, location, structure and contents of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, responsible authorities are handicapped in effective implementation due to the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for the regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to understand whether a new board will affect the existing linear density on a particular road and whether it violates any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built with open geospatial tools such as OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets, including the road network, locations of existing billboards, and building parcels with land use information, to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS which can assist in solving space-related iterative decision challenges on outdoor advertisements. Employing such a system will result in effective implementation of regulations, resulting in visual harmony and aesthetic improvement in urban communities.
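
    To show how AHP-derived weights feed a weighted linear combination score, the sketch below computes a priority vector from a pairwise comparison matrix and applies it to one candidate site; the comparison values and criterion scores are invented for illustration and are not the study's data.

      # Minimal AHP + WLC sketch: weights from the principal eigenvector, then a weighted sum.
      import numpy as np

      # Hypothetical pairwise comparisons for three criteria
      # (e.g. road density, land-use buffer, visibility).
      A = np.array([
          [1.0, 3.0, 5.0],
          [1 / 3.0, 1.0, 2.0],
          [1 / 5.0, 1 / 2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = w / w.sum()                          # AHP priority vector

      scores = np.array([0.8, 0.4, 0.6])             # normalized criterion scores for one site
      suitability = float(np.dot(weights, scores))   # weighted linear combination
      print(weights, suitability)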

  3. Prediction of fish and sediment mercury in streams using landscape variables and historical mining.

    PubMed

    Alpers, Charles N; Yee, Julie L; Ackerman, Joshua T; Orlando, James L; Slotton, Darrel G; Marvin-DiPasquale, Mark C

    2016-11-15

    Widespread mercury (Hg) contamination of aquatic systems in the Sierra Nevada of California, U.S., is associated with historical use to enhance gold (Au) recovery by amalgamation. In areas affected by historical Au mining operations, including the western slope of the Sierra Nevada and downstream areas in northern California, such as San Francisco Bay and the Sacramento River-San Joaquin River Delta, microbial conversion of Hg to methylmercury (MeHg) leads to bioaccumulation of MeHg in food webs, and increased risks to humans and wildlife. This study focused on developing a predictive model for THg in stream fish tissue based on geospatial data, including land use/land cover data, and the distribution of legacy Au mines. Data on total mercury (THg) and MeHg concentrations in fish tissue and streambed sediment collected during 1980-2012 from stream sites in the Sierra Nevada, California were combined with geospatial data to estimate fish THg concentrations across the landscape. THg concentrations of five fish species (Brown Trout, Rainbow Trout, Sacramento Pikeminnow, Sacramento Sucker, and Smallmouth Bass) within stream sections were predicted using multi-model inference based on Akaike Information Criteria, using geospatial data for mining history and landscape characteristics as well as fish species and length (r² = 0.61, p < 0.001). Including THg concentrations in streambed sediment did not improve the model's fit; however, including MeHg concentrations in streambed sediment, organic content (loss on ignition), and sediment grain size resulted in an improved fit (r² = 0.63, p < 0.001). These models can be used to estimate THg concentrations in stream fish based on landscape variables in the Sierra Nevada in areas where direct measurements of THg concentration in fish are unavailable. Published by Elsevier B.V.
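
    The multi-model inference step rests on Akaike weights and model averaging; the sketch below shows that arithmetic for a few candidate models with invented AIC values and predictions (it is not the study's fitted models or data).

      # Minimal sketch of AIC-based model averaging.
      import math

      aic = {"m_mining": 412.3, "m_landcover": 414.1, "m_full": 410.8}   # hypothetical AIC values
      pred = {"m_mining": 0.21, "m_landcover": 0.25, "m_full": 0.19}     # hypothetical THg predictions

      aic_min = min(aic.values())
      rel = {m: math.exp(-(a - aic_min) / 2.0) for m, a in aic.items()}  # relative likelihoods
      weights = {m: r / sum(rel.values()) for m, r in rel.items()}       # Akaike weights

      averaged = sum(weights[m] * pred[m] for m in aic)                  # model-averaged estimate
      print(weights, round(averaged, 3))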

  4. Prediction of fish and sediment mercury in streams using landscape variables and historical mining

    USGS Publications Warehouse

    Alpers, Charles N.; Yee, Julie L.; Ackerman, Joshua T.; Orlando, James L.; Slotton, Darrell G.; Marvin-DiPasquale, Mark C.

    2016-01-01

    Widespread mercury (Hg) contamination of aquatic systems in the Sierra Nevada of California, U.S., is associated with historical use to enhance gold (Au) recovery by amalgamation. In areas affected by historical Au mining operations, including the western slope of the Sierra Nevada and downstream areas in northern California, such as San Francisco Bay and the Sacramento River–San Joaquin River Delta, microbial conversion of Hg to methylmercury (MeHg) leads to bioaccumulation of MeHg in food webs, and increased risks to humans and wildlife. This study focused on developing a predictive model for THg in stream fish tissue based on geospatial data, including land use/land cover data, and the distribution of legacy Au mines. Data on total mercury (THg) and MeHg concentrations in fish tissue and streambed sediment collected during 1980–2012 from stream sites in the Sierra Nevada, California were combined with geospatial data to estimate fish THg concentrations across the landscape. THg concentrations of five fish species (Brown Trout, Rainbow Trout, Sacramento Pikeminnow, Sacramento Sucker, and Smallmouth Bass) within stream sections were predicted using multi-model inference based on Akaike Information Criteria, using geospatial data for mining history and landscape characteristics as well as fish species and length (r² = 0.61, p < 0.001). Including THg concentrations in streambed sediment did not improve the model's fit; however, including MeHg concentrations in streambed sediment, organic content (loss on ignition), and sediment grain size resulted in an improved fit (r² = 0.63, p < 0.001). These models can be used to estimate THg concentrations in stream fish based on landscape variables in the Sierra Nevada in areas where direct measurements of THg concentration in fish are unavailable.

  5. Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, T.; Armstrong, E. M.; Moroni, D. F.; Liu, K.; Gui, Z.

    2013-12-01

    We present the Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access across the NASA data systems, the Global Earth Observation System of Systems (GEOSS) Clearinghouse, and Data.gov, to help scientists select Earth observation data that better fit their needs, in four aspects: 1. Integrating and interfacing the SMART system to include the functionality of a) semantic reasoning based on Jena, an open source semantic reasoning engine, b) semantic similarity calculation, c) recommendation based on spatiotemporal, semantic, and user workflow patterns, and d) ranking results based on similarity between search terms and data ontology. 2. Collaborating with data user communities to a) capture science data ontology and record relevant ontology triple stores, b) analyze and mine user search and download patterns, c) integrate SMART into a metadata-centric discovery system for community-wide usage and feedback, and d) customize the data discovery, search, and access user interface to include the ranked results, recommendation components, and semantics-based navigation. 3. Laying the groundwork to interface the SMART system with other data search and discovery systems as an open source data search and discovery solution. The SMART system leverages NASA, GEO, and FGDC data discovery, search, and access for the Earth science community by enabling scientists to readily discover and access data appropriate to their endeavors, increasing the efficiency of data exploration and decreasing the time that scientists must spend on searching, downloading, and processing the datasets most applicable to their research. By incorporating the SMART system, the time devoted to discovering the most applicable dataset is expected to be substantially reduced, thereby reducing the number of user inquiries and likewise the time and resources expended by a data center in addressing them. Keywords: EarthCube; ECHO; DAACs; GeoPlatform; Geospatial Cyberinfrastructure.
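
    As a much-simplified stand-in for the similarity-based ranking component described above, the sketch below ranks dataset descriptions against a query with a bag-of-words cosine score; the real SMART system uses ontology-based reasoning and richer similarity measures, and the records and query here are invented.

      # Minimal sketch of ranking records by textual similarity to a query.
      import math
      from collections import Counter

      def cosine(a, b):
          va, vb = Counter(a.lower().split()), Counter(b.lower().split())
          dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
          norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(sum(v * v for v in vb.values()))
          return dot / norm if norm else 0.0

      records = {
          "ds1": "sea surface temperature global daily grid",
          "ds2": "soil moisture land data assimilation hourly",
      }
      query = "daily sea surface temperature"
      ranked = sorted(records, key=lambda r: cosine(query, records[r]), reverse=True)
      print(ranked)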

  6. Challenges in sharing of geospatial data by data custodians in South Africa

    NASA Astrophysics Data System (ADS)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has as a key objective the sharing of geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing and, if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  7. River Basin Standards Interoperability Pilot

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Masó, Joan; Stasch, Christoph

    2016-04-01

    There is a lot of water information and many tools in Europe to be applied in river basin management, but fragmentation and a lack of coordination between countries still exist. The European Commission and the member states have financed several research and innovation projects in support of the Water Framework Directive. Only a few of them are using the recently emerging hydrological standards, such as OGC WaterML 2.0. WaterInnEU is a Horizon 2020 project focused on creating a marketplace to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to water and to establish suitable conditions for new market opportunities based on these offerings. One of WaterInnEU's main goals is to assess the level of standardization and interoperability of these outcomes as a mechanism to integrate ICT-based tools, incorporate open data platforms and generate a palette of interchangeable components that are able to use the water data emerging from the recently proposed open data sharing processes and data models stimulated by initiatives such as the INSPIRE directive. As part of the standardization and interoperability activities in the project, the authors are designing an experiment (RIBASE, the present work) to demonstrate how current ICT-based tools and water data can work in combination with geospatial web services in the Scheldt river basin. The main structure of this experiment, which is the core of the present work, is composed of the following steps: (1) extraction of information from river gauge data in OGC WaterML 2.0 format using SOS services (preferably compliant with the OGC SOS 2.0 Hydrology Profile Best Practice); (2) flood modelling using WPS 2.0, WaterML 2.0 data and weather forecast models as input; (3) evaluation of the applicability of Sensor Notification Services in water emergencies; and (4) open distribution of the input and output data as OGC web services (WaterML, WCS, WFS) and with visualization utilities (WMS). The architecture tests the combination of gauge data in a WPS that is triggered by a meteorological alert. The data is translated into the OGC WaterML 2.0 time series data format and ingested into a SOS 2.0 service. SOS data is visualized in a SOS client that is able to handle time series. The meteorological forecast data (under the supervision of an operator manipulating the WPS user interface), together with the WaterML 2.0 time series and terrain data, is input for a flood modelling algorithm. The WPS is able to produce flooding datasets in the form of coverages that are offered to clients via a WCS 2.0 service or a WMS 1.3 service, and downloaded and visualized by the respective clients. The WPS triggers a notification or an alert that will be monitored from an emergency control response service. Acronyms: AS, Alert Service; ES, Event Service; ICT, Information and Communication Technology; NS, Notification Service; OGC, Open Geospatial Consortium; RIBASE, River Basin Standards Interoperability Pilot; SOS, Sensor Observation Service; WaterML, Water Markup Language; WCS, Web Coverage Service; WMS, Web Map Service; WPS, Web Processing Service.
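
    Since WaterML 2.0 time series run through the whole experiment, the sketch below extracts time/value pairs from a WaterML 2.0 fragment with the standard library XML parser; the snippet is hand-written for illustration rather than output from an actual gauge service.

      # Minimal sketch of reading time/value pairs from a WaterML 2.0 document.
      import xml.etree.ElementTree as ET

      WML2 = "{http://www.opengis.net/waterml/2.0}"

      doc = """<wml2:MeasurementTimeseries xmlns:wml2="http://www.opengis.net/waterml/2.0">
        <wml2:point>
          <wml2:MeasurementTVP>
            <wml2:time>2016-01-01T00:00:00Z</wml2:time>
            <wml2:value>1.42</wml2:value>
          </wml2:MeasurementTVP>
        </wml2:point>
      </wml2:MeasurementTimeseries>"""

      root = ET.fromstring(doc)
      series = [
          (tvp.find(WML2 + "time").text, float(tvp.find(WML2 + "value").text))
          for tvp in root.iter(WML2 + "MeasurementTVP")
      ]
      print(series)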

  8. Automated Geospatial Watershed Assessment

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool is a Geographic Information Systems (GIS) interface jointly developed by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture (USDA) Agricultural Research Service, and the University of Arizona to a...

  9. Publications - RDF 2015-17 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    Report rdf2015_017.pdf (347.0 K), doi:10.14509/29519; digital geospatial data: "Tonsina geochemistry: DGGS samples".

  10. Geospatial Authentication

    NASA Technical Reports Server (NTRS)

    Lyle, Stacey D.

    2009-01-01

    A software package has been developed that allows authentication for determining whether a rover is within a set of boundaries or a specific area before it can access critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system only allows those within designated geospatial boundaries or areas into the server. The Geospatial Authentication software has two parts: Server and Client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is GUI Windows CE software, or Mobile Graphical Software, that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.
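
    The core geospatial check behind this kind of authentication is a point-in-boundary test on the reported GPS fix. The sketch below shows a plain ray-casting version with an invented boundary and position; the actual package adds the VPN server, client, and GPS-signal handling described above.

      # Minimal sketch: allow access only if the GPS fix lies inside the authorized polygon.
      def point_in_polygon(lon, lat, polygon):
          """Ray-casting point-in-polygon test; polygon is a list of (lon, lat) vertices."""
          inside = False
          n = len(polygon)
          for i in range(n):
              x1, y1 = polygon[i]
              x2, y2 = polygon[(i + 1) % n]
              if (y1 > lat) != (y2 > lat):
                  x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
                  if lon < x_cross:
                      inside = not inside
          return inside

      authorized_area = [(-97.5, 27.6), (-97.2, 27.6), (-97.2, 27.9), (-97.5, 27.9)]  # hypothetical
      gps_fix = (-97.35, 27.75)  # longitude, latitude reported by the mobile device

      if point_in_polygon(gps_fix[0], gps_fix[1], authorized_area):
          print("authenticate: device is inside the designated boundary")
      else:
          print("deny: device is outside the designated boundary")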

  11. A lake-centric geospatial database to guide research and inform management decisions in an Arctic watershed in northern Alaska experiencing climate and land-use changes

    USGS Publications Warehouse

    Jones, Benjamin M.; Arp, Christopher D.; Whitman, Matthew S.; Nigro, Debora A.; Nitze, Ingmar; Beaver, John; Gadeke, Anne; Zuck, Callie; Liljedahl, Anna K.; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido

    2017-01-01

    Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analyzing the geospatial database relative to fish and bird survey data shows relations to lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.

  12. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    PubMed

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the pace of research with spatially referenced human subjects data.
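
    One ingredient of privacy-preserving linkage can be illustrated very simply: registries exchange keyed hashes of record identifiers instead of the identifiers themselves, so overlaps can be detected without revealing names. The sketch below uses a shared HMAC key purely for illustration; it is not the secure multi-party protocol developed in the article, which avoids even sharing such a key.

      # Minimal illustration of hash-based de-duplication between two registries.
      import hashlib
      import hmac

      SHARED_KEY = b"pre-agreed-secret"  # hypothetical key negotiated out of band

      def keyed_hash(identifier):
          return hmac.new(SHARED_KEY, identifier.lower().encode(), hashlib.sha256).hexdigest()

      registry_a = {keyed_hash("1955-03-02|smith|jane")}
      registry_b = {keyed_hash("1955-03-02|smith|jane"), keyed_hash("1970-11-23|doe|john")}

      duplicates = registry_a & registry_b  # records present in both registries
      print(len(duplicates), "potential duplicate(s) found without exchanging identifiers")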

  13. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    NASA Astrophysics Data System (ADS)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information is very different from one geo-portal to another as well as for similar 3D resources in the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Regarding the remaining 49%, which refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices on 3D modeling, and 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are classified in three packages: General and Complementary, on contextual and structural information, and Availability, on the transition from storage to delivery format. The proposed metadata-set is compared with the Canadian Geospatial Data Infrastructure (CGDI) metadata, which is an implementation of the North American Profile of ISO-19115. The comparison analyzes the two metadata sets against three simulated scenarios about discovering needed 3D geospatial datasets. Considering specific metadata about 3D geospatial models, the proposed metadata-set has six additional classes on geometric dimension, level of detail, geometric modeling, topology, and appearance information. In addition, classes on data acquisition, preparation, and modeling, and physical availability have been specialized for 3D geospatial models.

  14. Mobile Traffic Alert and Tourist Route Guidance System Design Using Geospatial Data

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.; Mishra, S.; Gupta, A.

    2017-09-01

    The present study describes an integrated system for traffic data collection and alert warning. Geographical-information-based decision making related to traffic destinations and routes is supported through the design. The system includes a geospatial database holding a profile relating to a user of a mobile device. The processing and understanding of scanned maps and other digital data input leads to route guidance. The system includes a server configured to receive traffic information relating to a route and location information relating to the mobile device. The server is configured to send a traffic alert to the mobile device when the traffic information and the location information indicate that the mobile device is traveling toward traffic congestion. The proposed system has geospatial and mobile data sets pertaining to Bangalore city in India. It is envisaged to be helpful for touristic purposes as a route guidance and alert-relaying information system, notifying tourists of proximity to sites worth seeing in a city they have entered. The system is modular in architecture and the novelty lies in the integration of different modules carrying different technologies for a complete traffic information system. The generic information processing and delivery system has been tested to be functional and speedy under test geospatial domains. In a restricted prototype model with geo-referenced route data, the required information has been delivered correctly over sustained trials to designated cell numbers, with an average time frame of 27.5 seconds, a maximum of 50, and a minimum of 5 seconds. Traffic geo-data set trials testing is underway.
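
    The alert rule itself reduces to a proximity test between the device's reported position and a congestion (or point-of-interest) location. The sketch below uses a haversine distance with invented coordinates and an invented alert radius; it is an illustration of the idea, not the system's implementation.

      # Minimal sketch: trigger an alert when the device is within a threshold distance.
      import math

      def haversine_km(lat1, lon1, lat2, lon2):
          """Great-circle distance in kilometres between two WGS84 points."""
          r = 6371.0
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 2 * r * math.asin(math.sqrt(a))

      device = (12.9716, 77.5946)      # hypothetical device position in Bangalore
      congestion = (12.9780, 77.6000)  # hypothetical congestion report

      if haversine_km(*device, *congestion) < 1.5:  # hypothetical alert radius in km
          print("send traffic alert / site-proximity message to the device")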

  15. a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit a wide utilization of the available geo-data. This paper aims to describe some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation, respectively for Lombardy and for the city of Parma, interoperability is possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component - the virtual hub - based on a brokering framework, copes with the previously listed problems and allows interoperability between different data sources.

  16. NASA World Wind: A New Mission

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Gaskins, T.; Bailey, J. E.

    2008-12-01

    Virtual Globes are well into their first generation, providing increasingly rich and beautiful visualization of more types and quantities of information. However, they are still mostly single and proprietary programs, akin to a web browser whose content and functionality are controlled and constrained largely by the browser's manufacturer. Today Google and Microsoft determine what we can and cannot see and do in these programs. NASA World Wind started out in nearly the same mode, a single program with limited functionality and information content. But as the possibilities of virtual globes became more apparent, we found that while enabling a new class of information visualization, we were also getting in the way. Many users want to provide World Wind functionality and information in their programs, not ours. They want it in their web pages. They want to include their own features. They told us that only with this kind of flexibility, could their objectives and the potential of the technology be truly realized. World Wind therefore changed its mission: from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating one program, we create components to be used in any number of programs. World Wind is NASA open source software. With the source code being fully visible, anyone can readily use it and freely extend it to serve any use. Imagery and other information provided by the World Wind servers is also free and unencumbered, including the server technology to deliver geospatial data. World Wind developers can therefore provide exclusive and custom solutions based on user needs.

  17. Spatial Databases

    DTIC Science & Technology

    2007-09-19

    extended object relations such as boundary, interior, open, closed, within, connected, and overlaps, which are invariant under elastic deformation... is required in a geo-spatial semantic web is challenging because the defining properties of geographic entities are very closely related to space. In... Objects under Primitive will be open (i.e., they will not contain their boundary points) and the objects under Complex will be closed. In addition to

  18. Geospatial Multi-Agency Coordination (GeoMAC) wildland fire perimeters, 2008

    USGS Publications Warehouse

    Walters, Sandra P.; Schneider, Norma J.; Guthrie, John D.

    2011-01-01

    The Geospatial Multi-Agency Coordination (GeoMAC) has been collecting and storing data on wildland fire perimeters since August 2000. The dataset presented via this U.S. Geological Survey Data Series product contains the GeoMAC wildland fire perimeter data for the calendar year 2008, which are based upon input from incident intelligence sources, Global Positioning System (GPS) data, and infrared (IR) imagery. Wildland fire perimeter data are obtained from the incidents, evaluated for completeness and accuracy, and processed to reflect consistent field names and attributes. After a quality check, the perimeters are loaded to GeoMAC databases, which support the GeoMAC Web application for access by wildland fire managers and the public. The wildland fire perimeters are viewed through the Web application. The data are subsequently archived according to year and state and are made available for downloading through the Internet in shapefile and Keyhole Markup Language (KML) format. These wildland fire perimeter data are also retained for historical, planning, and research purposes. The datasets that pertain to this report can be found on the Rocky Mountain Geographic Science Center HTTP site at http://rmgsc.cr.usgs.gov/outgoing/GeoMAC/historic_fire_data/. The links are also provided on the sidebar.

  19. Geospatial Data Science Data and Tools | Geospatial Data Science | NREL

    Science.gov Websites

    Need help sizing a residential photovoltaic system? Want to know what renewable energy resources are available? NREL's geospatial data science tools help users apply NREL's geographic information system expertise to their own projects.

  20. Publications - RDF 2007-1 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    Report rdf2007_001.pdf (443.0 K), doi:10.14509/15759; digital geospatial data: "Fairbanks Mining District Geochemical Data".

  1. Publications - RDF 2011-4 v. 2 | Alaska Division of Geological &

    Science.gov Websites

    Report rdf2011_004.pdf (519.0 K), doi:10.14509/23002; digital geospatial data: "Moran Geochemistry".

  2. Stakeholder Alignment and Changing Geospatial Information Capabilities

    NASA Astrophysics Data System (ADS)

    Winter, S.; Cutcher-Gershenfeld, J.; King, J. L.

    2015-12-01

    Changing geospatial information capabilities can have major economic and social effects on activities such as drought monitoring, weather forecasts, agricultural productivity projections, water and air quality assessments, the effects of forestry practices, and so on. Whose interests are served by such changes? Two common mistakes are assuming stability in the community of stakeholders and consistency in stakeholder behavior. Stakeholder communities can reconfigure dramatically as some leave the discussion, others enter, and circumstances shift - all resulting in dynamic points of alignment and misalignment. New stakeholders can bring new interests, and existing stakeholders can change their positions. Stakeholders and their interests need to be considered as geospatial information capabilities change, but this is easier said than done. New ways of thinking about stakeholder alignment in light of changes in capability are presented.

  3. GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew; Geza, Mengistu

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as "baseline data" for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase as well as other deliverables stored on digital storage media will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for database linkage with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides "baseline" data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides better understanding, through data interpolation and visualization techniques, of the Piceance Basin structure and the spatial distribution of the oil shale resources. The surface water/groundwater models quantify the water shortage and improve understanding of the spatial distribution of the available water resources. The energy resource development systems model reveals the phase shift of water usage and oil shale production, which will facilitate better planning for oil shale development. Detailed descriptions of the key findings from the project activity, major accomplishments, and expected impacts of the research are given in the "ACCOMPLISHMENTS, RESULTS, AND DISCUSSION" section of this report.

  4. An Integrated Service Platform for Remote Sensing Image 3D Interpretation and Draughting based on HTML5

    NASA Astrophysics Data System (ADS)

    LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi

    2016-11-01

    The purpose of this paper is to solve the problems of traditional single-purpose systems for interpretation and draughting, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architecture and a low level of integration. On the basis of a comprehensive analysis of target element composition, map representation and the features of similar systems, an integrated 3D interpretation and draughting service platform for multi-source, multi-scale and multi-resolution geospatial objects is established based on HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also supports the collection, transfer, storage, refreshing and maintenance of geospatial object data, and shows promising prospects and potential for growth.

  5. FASTMap v. 2010.01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bynum, Leo

    FASTMap is a mapping application, available on the web or on mobile devices (iOS and Android), that browses geospatial data and produces detailed reports of objects within any area of analysis. FASTMap can access any geospatial dataset. The software can provide immediate access to the selected data through a fully symbolized interactive mapping interface. FASTMap can load arbitrary contours that represent a region of interest and can dynamically identify and geospatially select objects that reside within the region. The software can produce a report listing the objects and aggregations for the region, as well as producing publication-quality maps. FASTMap also has the ability to post and maintain authored maps, any GIS data included in the map, areas of interest, as well as any titles and labels. These defining ingredients of a map are called map contexts. These map contexts can be instantly broadcast via the internet through any of an infinite number of named channels to small or large numbers of users monitoring the channels being posted to, so a user can author a map and immediately share it with others, whether they are on a traditional desktop computer, laptop, mobile tablet or smartphone. Further, users receiving broadcast maps can alter them or create new ones and publish back to the channel in a collaborative manner. FASTMap can be configured to access virtually any geospatial data.
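
    The "map context over named channels" idea described above can be illustrated with a generic publish/subscribe sketch. The MapContext and ChannelHub classes below are hypothetical stand-ins for the concept and are not FASTMap's actual API.

```python
# Hypothetical sketch of broadcasting a "map context" over named channels.
# This is NOT FASTMap's API, just a minimal publish/subscribe illustration.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class MapContext:
    """A bundle of map state: layers, area of interest, titles and labels."""
    title: str
    layers: List[str]
    area_of_interest: tuple  # (min_lon, min_lat, max_lon, max_lat)
    labels: List[str] = field(default_factory=list)


class ChannelHub:
    """Delivers map contexts to any subscriber monitoring a named channel."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[MapContext], None]]] = {}

    def subscribe(self, channel: str, callback: Callable[[MapContext], None]) -> None:
        self._subscribers.setdefault(channel, []).append(callback)

    def publish(self, channel: str, context: MapContext) -> None:
        for callback in self._subscribers.get(channel, []):
            callback(context)


if __name__ == "__main__":
    hub = ChannelHub()
    hub.subscribe("field-team-7", lambda ctx: print(f"received map: {ctx.title}"))
    hub.publish("field-team-7",
                MapContext(title="Flood extent 2024-05-01",
                           layers=["basemap", "flood_polygons"],
                           area_of_interest=(-106.7, 35.0, -106.4, 35.2)))
```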

  6. Development of a spatial decision support system for flood risk management in Brazil that combines volunteered geographic information with wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Horita, Flávio E. A.; Albuquerque, João Porto de; Degrossi, Lívia C.; Mendiondo, Eduardo M.; Ueyama, Jó

    2015-07-01

    Effective flood risk management requires updated information to ensure that the correct decisions can be made. This can be provided by Wireless Sensor Networks (WSN), which are a low-cost means of collecting updated information about rivers. Another valuable resource is Volunteered Geographic Information (VGI), a comparatively new means of improving the coverage of monitored areas, because it is able to supply supplementary information to the WSN and thus support decision-making in flood risk management. However, there still remains the problem of how to combine WSN data with VGI. In this paper, an attempt is made to investigate AGORA-DS, a Spatial Decision Support System (SDSS) that is able to make flood risk management more effective by combining these data sources, i.e. WSN with VGI. This approach is built on a conceptual model that complies with the interoperable standards laid down by the Open Geospatial Consortium (OGC) - e.g. Sensor Observation Service (SOS) and Web Feature Service (WFS) - and seeks to combine and present unified information in a web-based decision support tool. This work was deployed in a real scenario of flood risk management in the town of São Carlos in Brazil. The evidence obtained from this deployment confirmed that interoperable standards can support the integration of data from distinct data sources. In addition, it shows that VGI is able to provide information about areas of the river basin that lack data because no monitoring station is present there, and hence provides valuable support for the WSN data. It can thus be concluded that AGORA-DS is able to combine information provided by WSN and VGI, and provide useful information for supporting flood risk management.
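
    As a hedged illustration of how sensor and volunteered data might be pulled together over OGC interfaces, the sketch below issues standard SOS GetObservation and WFS GetFeature requests with plain HTTP key-value parameters. The endpoint URLs, offering names, observed properties and feature type names are placeholders, not the actual AGORA-DS services.

```python
# Minimal sketch: pull river-gauge observations from an OGC SOS endpoint and
# volunteered reports from a WFS endpoint. URLs and identifiers are placeholders.
import requests

SOS_URL = "https://example.org/sos"            # hypothetical Sensor Observation Service
WFS_URL = "https://example.org/geoserver/wfs"  # hypothetical Web Feature Service


def get_sensor_observations(offering: str, observed_property: str) -> str:
    """Fetch raw O&M XML from the SOS using standard KVP parameters."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return requests.get(SOS_URL, params=params, timeout=30).text


def get_volunteered_reports(type_name: str) -> dict:
    """Fetch VGI flood reports from the WFS as GeoJSON (if the server supports it)."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typename": type_name,
        "outputFormat": "application/json",
    }
    return requests.get(WFS_URL, params=params, timeout=30).json()


if __name__ == "__main__":
    xml_obs = get_sensor_observations("WATER_LEVEL", "urn:ogc:def:property:water_level")
    vgi = get_volunteered_reports("agora:flood_reports")
    print(len(xml_obs), "bytes of sensor XML;", len(vgi.get("features", [])), "VGI reports")
```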

  7. Visualization and Ontology of Geospatial Intelligence

    NASA Astrophysics Data System (ADS)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by the ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  8. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for science and land management in the vicinity of Utqiaġvik (Barrow) on the North Slope of Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Escarzaga, S. M.; Gaylord, A. G.; Kassin, A.; Barba, M.; Tweedie, C. E.

    2017-12-01

    The Utqiaġvik (Barrow) area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken in the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 18,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Recent advances include provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists; analysis and multitemporal mapping of over 120 km of coastline for erosion monitoring; maintenance of a wireless micrometeorological sensor network; links to Barrow area datasets housed at national data archives; a NOAA-funded citizen science and public outreach program on coastal erosion; and substantial upgrades to the BAID website. Web mapping applications that have been launched to the public include: an Imagery Time Viewer that allows users to compare imagery of the Barrow area between 1948 and the present; a Coastal Erosion Viewer that allows users to view long-term (1955-2015) and recent (2013-2015) rates of erosion for the Barrow area; and a Community Planning tool that allows users to view and print dynamic reports based on an array of basemaps, including a new 0.5 m resolution wetlands map designed to enhance decision making for development and land management.

  9. Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.

    2006-12-01

    An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results as well as over the final portrayal of that data. To meet that goal, experiments have been designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specs to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
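
    A minimal sketch of driving a WPS from Python with OWSLib is shown below, assuming a hypothetical endpoint and a hypothetical subsetting process; the process identifier, input names and coverage name are illustrative, not the actual OWS-4 services.

```python
# Hedged OWSLib sketch: discover WPS processes, describe a (hypothetical)
# subsetting process, and execute it on a model output coverage. The endpoint
# URL, process identifier, input names and coverage name are placeholders.
from owslib.wps import WebProcessingService, monitorExecution

WPS_URL = "https://example.org/wps"  # hypothetical WPS endpoint

wps = WebProcessingService(WPS_URL)               # fetches capabilities on creation
print([p.identifier for p in wps.processes])      # processes offered by the server

proc = wps.describeprocess("subset_coverage")     # placeholder process identifier
print(proc.identifier, "-", proc.abstract)

inputs = [
    ("coverage_id", "GEOS_forecast"),             # placeholder coverage name
    ("bbox", "-125,25,-65,50"),                   # spatial subset (lon/lat)
    ("time", "2006-12-01T00:00:00Z"),             # temporal slice
    ("level", "850hPa"),                          # vertical slice
]
execution = wps.execute("subset_coverage", inputs)
monitorExecution(execution)                       # poll the server until finished
for output in execution.processOutputs:
    print(output.identifier, output.reference)    # URL of the subset result
```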

  10. The Wildland Fire Emissions Information System: Providing information for carbon cycle studies with open source geospatial tools

    NASA Astrophysics Data System (ADS)

    French, N. H.; Erickson, T.; McKenzie, D.

    2008-12-01

    A major goal of the North American Carbon Program is to resolve uncertainties in understanding and managing the carbon cycle of North America. As carbon modeling tools become more comprehensive and spatially oriented, accurate datasets to spatially quantify carbon emissions from fire are needed, and these data resources need to be accessible to users for decision-making. Under a new NASA Carbon Cycle Science project, Drs. Nancy French and Tyler Erickson, of Michigan Technological University's Michigan Tech Research Institute (MTRI), are teaming with specialists from the USDA Forest Service Fire and Environmental Research Applications (FERA) team to provide information for mapping fire-derived carbon emissions to users. The project focus includes development of a web-based system to provide spatially resolved fire emissions estimates for North America in a user-friendly environment. The web-based Decision Support System will be based on a variety of open source technologies. The Fuel Characteristic Classification System (FCCS) raster map of fuels and MODIS-derived burned area vector maps will be processed using the Geographic Data Abstraction Library (GDAL) and OGR Simple Features Library. Tabular and spatial project data will be stored in PostgreSQL/PostGIS, a spatially enabled relational database server. The browser-based user interface will be created using the Django web framework to allow user input for the decision support system. The OpenLayers mapping framework will be used to provide users with interactive maps within the browser. In addition, the data products will be made available in standard open data formats such as KML, to allow for easy integration into other spatial models and data systems.
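
    The kind of GDAL/OGR preprocessing mentioned above can be sketched as follows: rasterize MODIS-derived burned-area polygons onto the FCCS fuels grid and tally burned pixels per fuelbed class. The file names are placeholders, and the pixel tally is a simplification for illustration, not the project's emissions calculation.

```python
# Sketch of overlaying burned-area polygons on a fuels raster with GDAL/OGR and
# counting burned pixels per fuelbed class. File names are placeholders.
import numpy as np
from osgeo import gdal, ogr

fccs = gdal.Open("fccs_fuels.tif")         # placeholder FCCS fuelbed raster
burned = ogr.Open("burned_area_2008.shp")  # placeholder burned-area polygons

# Rasterize the burned-area polygons onto a grid aligned with the fuels raster.
mask_ds = gdal.GetDriverByName("MEM").Create(
    "", fccs.RasterXSize, fccs.RasterYSize, 1, gdal.GDT_Byte)
mask_ds.SetGeoTransform(fccs.GetGeoTransform())
mask_ds.SetProjection(fccs.GetProjection())
gdal.RasterizeLayer(mask_ds, [1], burned.GetLayer(), burn_values=[1])

fuels = fccs.GetRasterBand(1).ReadAsArray()
mask = mask_ds.GetRasterBand(1).ReadAsArray().astype(bool)

# Count burned pixels per fuelbed class (pixel counts only, not emissions).
classes, counts = np.unique(fuels[mask], return_counts=True)
for cls, n in zip(classes, counts):
    print(f"fuelbed {cls}: {n} burned pixels")
```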

  11. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets such as those coming from precipitation satellites viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked with the data-order interface an on-the-fly visualization tool for any GPM/partner satellite orbit. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We also have used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interact with the data.

  12. Integration of Remotely Sensed Data Into Geospatial Reference Information Databases. Un-Ggim National Approach

    NASA Astrophysics Data System (ADS)

    Arozarena, A.; Villa, G.; Valcárcel, N.; Pérez, B.

    2016-06-01

    Remote sensing satellites, together with aerial and terrestrial platforms (mobile and fixed), produce nowadays huge amounts of data coming from a wide variety of sensors. These datasets serve as main data sources for the extraction of Geospatial Reference Information (GRI), constituting the "skeleton" of any Spatial Data Infrastructure (SDI). Since very different situations can be found around the world in terms of geographic information production and management, the generation of global GRI datasets seems extremely challenging. Remotely sensed data, due to its wide availability nowadays, is able to provide fundamental sources for any production or management system present in different countries. After several automatic and semiautomatic processes including ancillary data, the extracted geospatial information is ready to become part of the GRI databases. In order to optimize these data flows for the production of high quality geospatial information and to promote its use to address global challenges several initiatives at national, continental and global levels have been put in place, such as European INSPIRE initiative and Copernicus Programme, and global initiatives such as the Group on Earth Observation/Global Earth Observation System of Systems (GEO/GEOSS) and United Nations Global Geospatial Information Management (UN-GGIM). These workflows are established mainly by public organizations, with the adequate institutional arrangements at national, regional or global levels. Other initiatives, such as Volunteered Geographic Information (VGI), on the other hand may contribute to maintain the GRI databases updated. Remotely sensed data hence becomes one of the main pillars underpinning the establishment of a global SDI, as those datasets will be used by public agencies or institutions as well as by volunteers to extract the required spatial information that in turn will feed the GRI databases. This paper intends to provide an example of how institutional arrangements and cooperative production systems can be set up at any territorial level in order to exploit remotely sensed data in the most intensive manner, taking advantage of all its potential.

  13. Geospatial Database for Strata Objects Based on Land Administration Domain Model (LADM)

    NASA Astrophysics Data System (ADS)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

    Recently in our country, the construction of buildings has become more complex, and a strata objects database is becoming more important for registering the real world as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. This paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future. It also attempts to develop a strata objects database using a standard data model (LADM) and to analyze the developed database against that model. The current cadastre system in Malaysia, including strata titles, is discussed. The problems with the 2D geospatial database are listed, and the need for a 3D geospatial database in the future is also discussed. The processes for designing a strata objects database are conceptual, logical and physical database design. The strata objects database will allow us to find both non-spatial and spatial strata title information and thus show the location of each strata unit. This development may help in handling strata titles and related information.
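
    A minimal sketch of a strata-title schema inspired by core LADM (ISO 19152) classes (LA_Party, LA_BAUnit, LA_RRR, LA_SpatialUnit) is shown below; the column names and the SQLite backend are illustrative simplifications, not the database design developed in the paper.

```python
# Toy LADM-inspired strata schema: parties hold rights (RRR) over basic
# administrative units, which are realised by 3D spatial units on given levels.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE LA_Party      (party_id TEXT PRIMARY KEY, name TEXT);
CREATE TABLE LA_BAUnit     (baunit_id TEXT PRIMARY KEY, strata_title_no TEXT);
CREATE TABLE LA_SpatialUnit(su_id TEXT PRIMARY KEY,
                            baunit_id TEXT REFERENCES LA_BAUnit,
                            level INTEGER,          -- storey of the strata parcel
                            geometry_wkt TEXT);     -- simplified 3D geometry slot
CREATE TABLE LA_RRR        (rrr_id TEXT PRIMARY KEY,
                            party_id TEXT REFERENCES LA_Party,
                            baunit_id TEXT REFERENCES LA_BAUnit,
                            rrr_type TEXT);
""")

# Register one strata unit: a parcel on level 12 owned by a party.
conn.execute("INSERT INTO LA_Party VALUES ('p1', 'Owner A')")
conn.execute("INSERT INTO LA_BAUnit VALUES ('b1', 'STRATA-2016-0042')")
conn.execute("INSERT INTO LA_SpatialUnit VALUES ('s1', 'b1', 12, '<wkt placeholder>')")
conn.execute("INSERT INTO LA_RRR VALUES ('r1', 'p1', 'b1', 'ownership')")

# Query: which strata units (and levels) does each party hold rights over?
for row in conn.execute("""
    SELECT pa.name, ba.strata_title_no, su.level
    FROM LA_RRR r JOIN LA_Party pa ON r.party_id = pa.party_id
                  JOIN LA_BAUnit ba ON r.baunit_id = ba.baunit_id
                  JOIN LA_SpatialUnit su ON su.baunit_id = ba.baunit_id"""):
    print(row)
```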

  14. The Implementation of a Geospatial Information Technology (GIT)-Supported Land Use Change Curriculum with Urban Middle School Learners to Promote Spatial Thinking

    ERIC Educational Resources Information Center

    Bodzin, Alec M.

    2011-01-01

    This study investigated whether a geospatial information technology (GIT)-supported science curriculum helped students in an urban middle school understand land use change (LUC) concepts and enhanced their spatial thinking. Five 8th grade earth and space science classes in an urban middle school consisting of three different ability level tracks…

  15. Geospatial Informational Security Risks and Concerns of the U.S. Air Force GeoBase Program

    DTIC Science & Technology

    2007-03-01

    multiple governmental directives such as the Government Performance and Results Act (GPRA), Paperwork Reduction Act (PRA), and Office of Management and... governments, non-governmental organizations (NGOs), universities, and commercial sector contractors (Lachman, 2006). One command noted that over... Defense, or the United States Government. AFIT/GEM/ENV/07-M1 GEOSPATIAL INFORMATIONAL SECURITY RISKS AND CONCERNS OF THE UNITED STATES

  16. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it defines the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-granular, fundamental geospatial processes have been developed by the GIS community in the past. However, modern use cases or whole workflow processes demand specifications for lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use and a lot of effort for integration. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim will be to generate reproducible results; the workflow should be documented by metadata that can be integrated into Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance, and publishing them as common processes. Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement for the standardized WPS schema. We identified essential requirements of the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and at least an evaluation of the ecological situation in the marine environment. Information techniques adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the expansion and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS processes.
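
    The chaining idea at the heart of such orchestration, feeding the output of one fine-granular sub-process into the next, can be sketched in plain Python as below; this is a conceptual illustration only, not the RichWPS ModelBuilder or its WPS-T deployment mechanism, and the step names are hypothetical.

```python
# Conceptual sketch of composing fine-granular sub-processes into one workflow;
# each step consumes a dict of inputs and yields a dict of outputs.
from typing import Callable, Dict, List

Process = Callable[[Dict], Dict]


def read_netcdf(inputs: Dict) -> Dict:
    # placeholder: would read a NetCDF dataset (e.g. modelled water levels)
    return {"dataset": f"data from {inputs['path']}"}


def harmonize(inputs: Dict) -> Dict:
    # placeholder: would re-grid and unify units of the two datasets
    return {"harmonized": (inputs["a"], inputs["b"])}


def compare(inputs: Dict) -> Dict:
    # placeholder: would compute model-vs-measurement differences
    return {"result": f"comparison of {inputs['harmonized']}"}


def run_chain(steps: List[Process], inputs: Dict) -> Dict:
    """Feed each step's outputs into the next step: the essence of chaining."""
    data = inputs
    for step in steps:
        data = step(data)
    return data


modelled = read_netcdf({"path": "modelled_waterlevels.nc"})
measured = read_netcdf({"path": "measured_waterlevels.nc"})
print(run_chain([harmonize, compare],
                {"a": modelled["dataset"], "b": measured["dataset"]}))
```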

  17. GeoSymbio: a hybrid, cloud-based web application of global geospatial bioinformatics and ecoinformatics for Symbiodinium-host symbioses.

    PubMed

    Franklin, Erik C; Stat, Michael; Pochon, Xavier; Putnam, Hollie M; Gates, Ruth D

    2012-03-01

    The genus Symbiodinium encompasses a group of unicellular, photosynthetic dinoflagellates that are found free living or in hospite with a wide range of marine invertebrate hosts including scleractinian corals. We present GeoSymbio, a hybrid web application that provides an online, easy to use and freely accessible interface for users to discover, explore and utilize global geospatial bioinformatic and ecoinformatic data on Symbiodinium-host symbioses. The novelty of this application lies in the combination of a variety of query and visualization tools, including dynamic searchable maps, data tables with filter and grouping functions, and interactive charts that summarize the data. Importantly, this application is hosted remotely or 'in the cloud' using Google Apps, and therefore does not require any specialty GIS, web programming or data programming expertise from the user. The current version of the application utilizes Symbiodinium data based on the ITS2 genetic marker from PCR-based techniques, including denaturing gradient gel electrophoresis, sequencing and cloning of specimens collected during 1982-2010. All data elements of the application are also downloadable as spatial files, tables and nucleic acid sequence files in common formats for desktop analysis. The application provides a unique tool set to facilitate research on the basic biology of Symbiodinium and expedite new insights into their ecology, biogeography and evolution in the face of a changing global climate. GeoSymbio can be accessed at https://sites.google.com/site/geosymbio/. © 2011 Blackwell Publishing Ltd.

  18. The European Location Framework - from National to European

    NASA Astrophysics Data System (ADS)

    Pauknerova, E.; Sidlichovsky, P.; Urbanas, S.; Med, M.

    2016-06-01

    The European Location Framework (ELF) is a technical infrastructure that will deliver authoritative, interoperable geospatial reference data from all over Europe for analysing and understanding information connected to places and features. The ELF has been developed and set up through the ELF Project, which has been realized by a consortium of partners (public, private and academic organisations) since March 2013. Their number increased from thirty to forty in 2016, together with a project extension from 36 to 44 months. The project is co-funded by the European Commission's Competitiveness and Innovation Framework Programme (CIP) and will end in October 2016. In broad terms, the ELF Project will deliver a unique gateway to the authoritative reference geospatial information for Europe (harmonised pan-European maps, geographic and land information) sourced from the National Mapping and Cadastral Authorities (NMCAs) around Europe and including transparent licensing. This will be provided as an online ELF web service that will deliver an up-to-date topographic base map and also as view & download services for access to the ELF datasets. To develop and build up the ELF, NMCAs are accompanied by and collaborate with several research & academia institutes, a standardisation body, system integrators, software developers and application providers. Harmonisation is in progress, developing and triggering a number of geo-tools such as edge-matching, generalisation, transformation and others. ELF will also provide some centralised tools, such as Geo Locator for searching locations based on geographical names, addresses and administrative units, and GeoProduct Finder for discovering the available web services and licensing them. ELF combines national reference geo-information through the ELF platform. ELF web services will be offered to users and application developers through open source (OSKARI) and proprietary (ArcGIS Online) cloud platforms. Recently, 29 NMCAs, plus EuroGeographics, their pan-European umbrella association, contribute to the ELF through an enrichment of data coverage. As a result, over 20 European countries will be covered with the ELF topo Base Map in 2016. Most countries will also contribute other harmonised thematic data for viewing or downloading. To overcome the heterogeneity of data resources and diversity of languages in tens of European countries, ELF builds on the existing INSPIRE rules and its own coordination and interoperability measures. ELF realisation empowers the implementation of INSPIRE in Europe and complements related activities of European NMCAs, e.g. the Czech Office for Surveying, Mapping and Cadastre (CUZK), which provides a large portfolio of spatial data/services and contributes significantly to the NSDI of the Czech Republic. CUZK is also responsible for the Base Register of Territorial Identification, Addresses and Real Estates (RUIAN), an important pillar of Czech e-Government. CUZK was an early adopter of INSPIRE and provides a number of compliant datasets and web services to the ELF. CUZK and the Polish NMCA (GUGiK) collaborate in the Central-European ELF Pilot (cluster) and test various cross-border prototypes. The presentation combines the national and cross-border view and experiences of CUZK and the European perspective of EuroGeographics.

  19. Web-based Tsunami Early Warning System with instant Tsunami Propagation Calculations in the GPU Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, M.; Spazier, J.; Reißland, S.

    2014-12-01

    Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype that opens up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the concept a whirl and to shape science's future. Further functionality, improvements and possible profound changes will have to be implemented successively based on the users' evolving needs.

  20. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
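
    The two stages described in the abstract can be sketched as follows: build an access correlation matrix from a toy access log, then greedily spread strongly co-accessed image files across different storage nodes so they can be read in parallel. The greedy rule here is a simplification of the paper's heuristic, shown for illustration only.

```python
# Toy sketch: (1) build an access correlation matrix from a historical access
# log, (2) greedily place co-accessed files on different storage nodes.
import numpy as np

n_files, n_nodes = 6, 3
# Each session lists the geospatial image files it touched.
sessions = [[0, 1, 2], [0, 1], [2, 3], [3, 4, 5], [4, 5], [0, 2]]

# Access correlation matrix: how often two files are requested together.
corr = np.zeros((n_files, n_files), dtype=int)
for s in sessions:
    for i in s:
        for j in s:
            if i != j:
                corr[i, j] += 1

# Greedy placement: assign each file to the node with the least accumulated
# correlation to files already stored there, so co-accessed files are split up.
placement = {}
for f in np.argsort(-corr.sum(axis=1)):          # most correlated files first
    load = [sum(corr[f, g] for g, node_of_g in placement.items() if node_of_g == node)
            for node in range(n_nodes)]
    placement[int(f)] = int(np.argmin(load))

print("file -> node:", placement)
```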

  1. 78 FR 43868 - Privacy Act of 1974; System of Records; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... notice altering a Privacy Act System of Records notice (NGA-013, National Geospatial-Intelligence Agency (NGA) Inspector General Investigative and Complaint Files). Subsequent to the publication of that... omission. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...

  2. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    NASA Astrophysics Data System (ADS)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased to the point that an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management and information systems, geospatial information has provided an ever-increasing variety of support to the work of the Organization for planning of operations, decision-making and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles over the decades by reviewing the evolution of selected maps produced by the office, noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization in information management can be reflected, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  3. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    NASA Astrophysics Data System (ADS)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide evidence that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  4. Intergraph video and images exploitation capabilities

    NASA Astrophysics Data System (ADS)

    Colla, Simone; Manesis, Charalampos

    2013-08-01

    The current paper focuses on the capture, fusion and processing of aerial imagery in order to leverage full motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. Unmanned aerial vehicles (UAVs) have provided critical real-time surveillance and operational support to military organizations, and are a key source of intelligence, particularly when integrated with other geospatial data. In the current workflow, the UAV operators first plan the flight using flight planning software. During the flight the UAV sends a live video stream directly to the field, where it is processed by Intergraph software to generate and disseminate georeferenced images through a service-oriented architecture based on the ERDAS Apollo suite. The raw video-based data sources provide the most recent view of a situation and can augment other forms of geospatial intelligence - such as satellite imagery and aerial photos - to provide a richer, more detailed view of the area of interest. To effectively use video as a source of intelligence, however, the analyst needs to seamlessly fuse the video with these other types of intelligence, such as map features and annotations. Intergraph has developed an application that automatically generates mosaicked georeferenced images and tags along the video route, which can then be seamlessly integrated with other forms of static data, such as aerial photos, satellite imagery, or geospatial layers and features. Consumers will finally have the ability to use a single, streamlined system to complete the entire geospatial information lifecycle: capturing geospatial data using sensor technology; processing vector, raster and terrain data into actionable information; managing, fusing, and sharing geospatial data and video together; and finally, rapidly and securely delivering integrated information products, ensuring individuals can make timely decisions.

  5. Deductive Coordination of Multiple Geospatial Knowledge Sources

    NASA Astrophysics Data System (ADS)

    Waldinger, R.; Reddy, M.; Culy, C.; Hobbs, J.; Jarvis, P.; Dungan, J. L.

    2002-12-01

    Deductive inference is applied to choreograph the cooperation of multiple knowledge sources to respond to geospatial queries. When no one source can provide an answer, the response may be deduced from pieces of the answer provided by many sources. Examples of sources include (1) The Alexandria Digital Library Gazetteer, a repository that gives the locations of almost six million place names, (2) The CIA World Factbook, an online almanac with basic information about more than 200 countries, (3) The SRI TerraVision 3D Terrain Visualization System, which provides a flight-simulator-like interactive display of geographic data held in a database, (4) The NASA GDACC WebGIS client for searching satellite and other geographic data available through OpenGIS Consortium (OGC) Web Map Servers, and (5) The Northern Arizona University Latitude/Longitude Distance Calculator. Queries are phrased in English and are translated into logical theorems by the Gemini Natural Language Parser. The theorems are proved by SNARK, a first-order-logic theorem prover, in the context of an axiomatic geospatial theory. The theory embodies a representational scheme that takes into account the fact that the same place may have many names, and the same name may refer to many places. SNARK has built-in procedures (RCC8 and the Allen calculus, respectively) for reasoning about spatial and temporal concepts. External knowledge sources may be consulted by SNARK as the proof is in progress, so that most knowledge need not be stored axiomatically. The Open Agent Architecture (OAA) facilitates communication between sources that may be implemented on different machines in different computer languages. An answer to the query, in the form of text or an image, is extracted from the proof. Currently, three-dimensional images are displayed by TerraVision, but other displays are possible. The combined system is called Geo-Logica. Some example queries that can be handled by Geo-Logica include: (1) show the petrified forests in Oregon north of Portland, (2) show the lake in Argentina with the highest elevation, and (3) show the IGBP land cover classification of Montana, derived using MODIS, for July 2000. Use of a theorem prover allows sources to cooperate even if they adopt different notational conventions and representation schemes and were never designed to work together. New sources can be added without reprogramming the system, by providing axioms that advertise their capabilities. Future directions include entering into a dialogue with the user to clarify ambiguities, elaborate on previous questions, or provide new information necessary to answer the question. Of particular interest is dealing with temporally varying data, with answers displayed as animated images.

  6. Teaching the Geoweb: Interdisciplinary Undergraduate Research in Wireless Sensor Networks, Web Mapping, and Geospatial Data Management

    ERIC Educational Resources Information Center

    Abernathy, David

    2011-01-01

    This article addresses an effort to incorporate wireless sensor networks and the emerging tools of the Geoweb into undergraduate teaching and research at a small liberal arts college. The primary goal of the research was to identify the hardware, software, and skill sets needed to deploy a local sensor network, collect data, and transmit that data…

  7. Best Practices for Making Scientific Data Discoverable and Accessible through Integrated, Standards-Based Data Portals

    NASA Astrophysics Data System (ADS)

    Lucido, J. M.

    2013-12-01

    Scientists in the fields of hydrology, geophysics, and climatology are increasingly using the vast quantity of publicly-available data to address broadly-scoped scientific questions. For example, researchers studying contamination of nearshore waters could use a combination of radar indicated precipitation, modeled water currents, and various sources of in-situ monitoring data to predict water quality near a beach. In discovering, gathering, visualizing and analyzing potentially useful data sets, data portals have become invaluable tools. The most effective data portals often aggregate distributed data sets seamlessly and allow multiple avenues for accessing the underlying data, facilitated by the use of open standards. Additionally, adequate metadata are necessary for attribution, documentation of provenance and relating data sets to one another. Metadata also enable thematic, geospatial and temporal indexing of data sets and entities. Furthermore, effective portals make use of common vocabularies for scientific methods, units of measure, geologic features, chemical, and biological constituents as they allow investigators to correctly interpret and utilize data from external sources. One application that employs these principles is the National Ground Water Monitoring Network (NGWMN) Data Portal (http://cida.usgs.gov/ngwmn), which makes groundwater data from distributed data providers available through a single, publicly accessible web application by mediating and aggregating native data exposed via web services on-the-fly into Open Geospatial Consortium (OGC) compliant service output. That output may be accessed either through the map-based user interface or through the aforementioned OGC web services. Furthermore, the Geo Data Portal (http://cida.usgs.gov/climate/gdp/), which is a system that provides users with data access, subsetting and geospatial processing of large and complex climate and land use data, exemplifies the application of International Standards Organization (ISO) metadata records to enhance data discovery for both human and machine interpretation. Lastly, the Water Quality Portal (http://www.waterqualitydata.us/) achieves interoperable dissemination of water quality data by referencing a vocabulary service for mapping constituents and methods between the USGS and USEPA. The NGWMN Data Portal, Geo Data Portal and Water Quality Portal are three examples of best practices when implementing data portals that provide distributed scientific data in an integrated, standards-based approach.
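
    As a hedged example of programmatic access to one of the portals mentioned above, the snippet below queries the Water Quality Portal's result search endpoint; the path and query parameter names follow the portal's public REST conventions as understood here and should be treated as assumptions to verify against the current documentation.

```python
# Hedged sketch of pulling harmonized USGS/USEPA results from the Water Quality
# Portal; parameter names (statecode, characteristicName, startDateLo, mimeType)
# are assumptions based on the portal's documented REST interface.
import csv
import io
import requests

WQP_RESULTS = "https://www.waterqualitydata.us/data/Result/search"

params = {
    "statecode": "US:55",             # Wisconsin (FIPS 55); assumed encoding
    "characteristicName": "Nitrate",  # constituent name from the shared vocabulary
    "startDateLo": "01-01-2020",      # assumed MM-DD-YYYY date format
    "mimeType": "csv",
}
response = requests.get(WQP_RESULTS, params=params, timeout=120)
response.raise_for_status()

rows = list(csv.DictReader(io.StringIO(response.text)))
print(f"{len(rows)} harmonized result rows returned")
```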

  8. A novel algorithm for fully automated mapping of geospatial ontologies

    NASA Astrophysics Data System (ADS)

    Chaabane, Sana; Jaziri, Wassim

    2018-01-01

    Geospatial information is collected from different sources, which makes spatial ontologies built for the same geographic domain heterogeneous; therefore, different and heterogeneous conceptualizations may coexist. Ontology integration helps create a common repository of geospatial ontologies and allows the heterogeneities between existing ontologies to be removed. Ontology mapping is a process used in ontology integration and consists in finding correspondences between the source ontologies. This paper deals with the "mapping" process for geospatial ontologies, which consists in applying an automated algorithm to find correspondences between concepts based on the definitions of matching relationships. The proposed algorithm, called the "geographic ontologies mapping algorithm", defines three types of mapping: semantic, topological and spatial.
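
    The three kinds of correspondence the algorithm looks for (semantic, topological and spatial) can be illustrated with the toy matcher below, which uses simple string similarity as a stand-in for a WordNet-style semantic distance; this is an illustration of the idea, not the paper's "geographic ontologies mapping algorithm" itself.

```python
# Toy concept matcher combining a semantic score (string similarity stand-in),
# a topological score (shared adjacency relations) and a spatial score (geometry
# type agreement) to pick the best-matching target concept for each source one.
from difflib import SequenceMatcher

source_concepts = {"River": {"adjacent_to": "Bank", "geometry": "line"}}
target_concepts = {"Watercourse": {"adjacent_to": "Riverbank", "geometry": "line"},
                   "Lake": {"adjacent_to": "Shore", "geometry": "polygon"}}


def semantic_score(a: str, b: str) -> float:
    """Stand-in for a lexical/semantic distance measure."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def mapping(src, tgt):
    for s_name, s_props in src.items():
        best = max(tgt, key=lambda t: (
            semantic_score(s_name, t)                                         # semantic
            + semantic_score(s_props["adjacent_to"], tgt[t]["adjacent_to"])   # topological
            + (1.0 if s_props["geometry"] == tgt[t]["geometry"] else 0.0)))   # spatial
        yield s_name, best


print(list(mapping(source_concepts, target_concepts)))  # [('River', 'Watercourse')]
```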

  9. GeoSpatial Workforce Development: enhancing the traditional learning environment in geospatial information technology

    NASA Astrophysics Data System (ADS)

    Lawhead, Pamela B.; Aten, Michelle L.

    2003-04-01

    The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by the foremost industry experts within the remote sensing and GIS industries. Virtual classrooms equipped with the most advanced instructions, computations, communications, course evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focus on recruiting additional industry experts to develop the technical content of the courseware and then utilize state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media: Internet, CD-ROM, DVD, and compressed video, that translates into anywhere, anytime delivery of GeoSpatial Information Technology education.

  10. Monitoring Colonias Development along the United States-Mexico Border: A Process Application using GIS and Remote Sensing in Douglas, Arizona, and Agua Prieta, Sonora

    USGS Publications Warehouse

    Norman, Laura M.; Donelson, Angela J.; Pfeifer, Edwin L.; Lam, Alven H.; Osborn, Kenneth J.

    2004-01-01

    The U.S. Department of Housing and Urban Development (HUD) and the U.S. Geological Survey (USGS) have developed a joint project to create Internet-enabled geographic information systems (GIS) that will help cities along the United States-Mexico border deal with issues related to colonias. HUD defines colonias as rural neighborhoods in the United States-Mexico border region that lack adequate infrastructure or housing and other basic services. They typically have high poverty rates that make it difficult for residents to pay for roads, sanitary water and sewer systems, decent housing, street lighting, and other services through assessment. Many Federal agencies recognize colonias designations and provide funding assistance. It is the intention of this project to empower Arizona-Sonora borderland neighborhoods and community members by recognizing them as colonias. This recognition will result in eligibility for available economic subsidies and accessibility to geospatial tools and information for urban planning. The steps to achieve this goal include delineation of colonia-like neighborhoods, identification of their urbanization over time, development of geospatial databases describing their infrastructure, and establishment of a framework for distributing Web-based GIS decision support systems. A combination of imagery and infrastructure information was used to help delineate colonia boundaries. A land-use change analysis, focused on urbanization in the cities over a 30-year timeframe, was implemented. The results of this project are being served over the Internet, providing data to the public as well as to participating agencies. One of the initial study areas for this project was the City of Douglas, Ariz., and its Mexican sister-city Agua Prieta, Sonora, which are described herein. Because of its location on the border, this twin-cities area is especially well suited to international manufacturing and commerce, which has, in turn, led to an uncontrolled spread of colonias. The USGS worked with local organizations in developing the Web-based GIS database. Community involvement ensured that the database and map server would meet the current and long-term needs of the communities and end users. Partners include Federal agencies, State agencies, county officials, town representatives, universities, and youth organizations, as well as interested local advocacy groups and individuals. A significant component of this project was development of relationships and partnerships in the border towns for facilitating binational approaches to land management.

  11. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    NASA Astrophysics Data System (ADS)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of `Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data to support scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored on distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurers in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.
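
    A small sketch of consuming an OGC-compliant interface of the kind the platform exposes is shown below, using OWSLib's WMS client; the endpoint URL and layer name are placeholders rather than actual eodataservice identifiers.

```python
# Hedged OWSLib WMS sketch: list available layers and request a map image.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/ows", version="1.1.1")  # hypothetical endpoint
print(list(wms.contents))                          # layers offered by the service

img = wms.getmap(layers=["surface_deformation"],   # placeholder layer name
                 srs="EPSG:4326",
                 bbox=(10.0, 42.0, 14.0, 45.0),    # lon/lat window
                 size=(800, 600),
                 format="image/png",
                 transparent=True)
with open("deformation.png", "wb") as f:
    f.write(img.read())
```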

  12. Enhancing Geoscience Research Discovery Through the Semantic Web

    NASA Astrophysics Data System (ADS)

    Rowan, Linda R.; Gross, M. Benjamin; Mayernik, Matthew; Khan, Huda; Boler, Frances; Maull, Keith; Stott, Don; Williams, Steve; Corson-Rikert, Jon; Johns, Erica M.; Daniels, Michael; Krafft, Dean B.; Meertens, Charles

    2016-04-01

    UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, a U.S. National Science Foundation EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to enhance connectivity across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Much of the VIVO ontology was built for the life sciences, so we have added some components of existing geoscience-based ontologies and a few terms from a local ontology that we created. The UNAVCO VIVO instance, connect.unavco.org, utilizes persistent identifiers whenever possible; for example using ORCIDs for people, publication DOIs, data DOIs and unique NSF grant numbers. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page shows, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. In addition to VIVO's default display, the new database can be queried using SPARQL, a query language for semantic data. EarthCollab is extending the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and position information. About half of UNAVCO's membership is international and we hope to connect our data to institutions in other countries with a similar approach. Additional extensions, including enhanced geospatial capabilities, will be developed based on task-centered usability testing.
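
    A hedged sketch of querying such a VIVO instance over SPARQL is given below using SPARQLWrapper; the endpoint path, class and property URIs are assumptions standing in for the deployed VIVO-ISF ontology and its local extensions.

```python
# Hedged SPARQL sketch: list datasets and related entities from a VIVO triple
# store. The endpoint path and the vivo: terms below are placeholders to check
# against the deployed ontology.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://connect.unavco.org/vivo/api/sparqlQuery")  # assumed path

sparql.setQuery("""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX vivo: <http://vivoweb.org/ontology/core#>
SELECT ?dataset ?label ?related
WHERE {
  ?dataset a vivo:Dataset ;            # placeholder class URI
           rdfs:label ?label .
  OPTIONAL { ?dataset vivo:relatedBy ?rel .
             ?rel vivo:relates ?related . }
}
LIMIT 10
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["label"]["value"], "-", row.get("related", {}).get("value", "n/a"))
```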

  13. Task and Progress of IAEG-SDGs WGGI in Monitoring SDGs Through a `Geographic Location' Lens

    NASA Astrophysics Data System (ADS)

    Geng, W.; Chen, J.; Zhang, H. P.; Xu, K.

    2018-04-01

    In September 2015, the 193 Member States of the United Nations (UN) unanimously adopted the 2030 Agenda for Sustainable Development and its 17 Sustainable Development Goals (SDGs), aiming to transform the world over the next 15 years (ESDN, 2016). To meet the ambitions and demands of the 2030 Agenda, it is necessary for the global indicator framework to adequately and systematically address the issue of alternative data sources and methodologies, including geospatial information and Earth observations in the context of geographic location (UN-GGIM, 2016). For this purpose, the Inter-Agency and Expert Group on Sustainable Development Goal Indicators (IAEG-SDGs) created the Working Group on Geospatial Information (IAEG-SDGs: WGGI) to give full play to the role of geospatial data in SDG measurement and monitoring. The Working Group reviewed the global indicators through a `geographic location' lens to identify those whose production geospatial information can significantly support, and analyzed the related methodological and measurement issues. This paper discusses the progress in monitoring the SDGs since the establishment of the IAEG-SDGs: WGGI, as well as the existing problems, appropriate solutions and plans for the next stage of work.

  14. The National Map: New Viewer, Services, and Data Download

    USGS Publications Warehouse

    Dollison, Robert M.

    2010-01-01

    Managed by the U.S. Geological Survey's (USGS) National Geospatial Program, The National Map has transitioned data assets and viewer applications to a new visualization and product and service delivery environment, which includes an improved viewing platform, base map data and overlay services, and an integrated data download service. This new viewing solution expands upon the National Geospatial Intelligence Agency (NGA) Palanterra X3 viewer, providing a solid technology foundation for navigation and basic Web mapping functionality. Building upon the NGA viewer allows The National Map to focus on improving data services, functions, and data download capabilities. Initially released to the public at the 125th anniversary of mapping in the USGS on December 3, 2009, the viewer and services are now the primary distribution point for The National Map data. The National Map Viewer: http://viewer.nationalmap.gov

  15. National Geospatial Program

    USGS Publications Warehouse

    Carswell, William J.

    2011-01-01

    increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure quality control, integration, and public availability of the acquired data. The Emergency Operations Office provides requirements to The National Map and, during emergencies and natural disasters, rapidly disseminates information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.

  16. The geospatial modeling interface (GMI) framework for deploying and assessing environmental models

    USDA-ARS?s Scientific Manuscript database

    Geographical information systems (GIS) software packages have been used for close to three decades as analytical tools in environmental management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of ful...

  17. Publications - MP 150 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    This publication is part of a larger work; please see DDS 3 for more information. Digital Geospatial Data.

  18. Geospatial Technologies and Higher Education in Argentina

    ERIC Educational Resources Information Center

    Leguizamon, Saturnino

    2010-01-01

    The term "geospatial technologies" encompasses a large area of fields involving cartography, spatial analysis, geographic information system, remote sensing, global positioning systems and many others. These technologies should be expected to be available (as "natural tools") for a country with a large surface and a variety of…

  19. Modeling photovoltaic diffusion: an analysis of geospatial datasets

    NASA Astrophysics Data System (ADS)

    Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert

    2014-07-01

    This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information (population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others) to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source, and house age were key variables that had not been previously explored in the literature but are consistent with the expected profile of a PV adopter. The strong relationships with foreclosure indicators and mortgage status have a less intuitive connection to PV adoption, but these variables may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power to models using hundreds of geospatial variables. Further, predictive performance generally decreases at higher resolution (i.e., below the ZIP code level), since several geospatial variables with coarse native resolution become less useful for representing high-resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative of other IOU regions in California, suggesting that a model trained with data from one state may be applicable in another state.
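
    The paper's central comparison, that a handful of well-chosen variables can rival a model with hundreds of them, can be illustrated with a generic cross-validation sketch. The data, model, and variable counts below are synthetic placeholders, not the authors' dataset or method.

      # Illustrative comparison of a small feature subset vs. a large feature set
      # using cross-validation on synthetic data; not the authors' data or model.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 2000
      X_full = rng.normal(size=(n, 200))   # stand-in for hundreds of geospatial variables
      # Assume the outcome depends mainly on a few of those variables plus noise.
      y = X_full[:, :5] @ np.array([1.0, 0.8, 0.6, 0.4, 0.2]) + rng.normal(scale=0.5, size=n)

      X_small = X_full[:, :8]              # a "five to eight variable" subset

      model = RandomForestRegressor(n_estimators=100, random_state=0)
      r2_small = cross_val_score(model, X_small, y, cv=5, scoring="r2").mean()
      r2_full = cross_val_score(model, X_full, y, cv=5, scoring="r2").mean()

      print(f"mean CV R^2, small subset: {r2_small:.3f}")
      print(f"mean CV R^2, full set:     {r2_full:.3f}")

    In a synthetic setting like this the small subset typically matches or beats the full feature set, because the extra variables add variance without adding signal, which mirrors the result reported above.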

  20. Community Needs Assessment and Portal Prototype Development for an Arctic Spatial Data Infrastructure (ASDI)

    NASA Astrophysics Data System (ADS)

    Wiggins, H. V.; Warnick, W. K.; Hempel, L. C.; Henk, J.; Sorensen, M.; Tweedie, C. E.; Gaylord, A. G.

    2007-12-01

    As the creation and use of geospatial data in research, management, logistics, and education applications has proliferated, there is now tremendous potential for advancing science through a variety of cyberinfrastructure applications, including Spatial Data Infrastructure (SDI) and related technologies. SDIs provide a common framework of standards, security, policies, procedures, and technology to support the effective acquisition, coordination, dissemination, and use of geospatial data by multiple, distributed stakeholder and user groups. Despite the numerous research activities in the Arctic, there is no established SDI, and because of this lack of a coordinated infrastructure there is inefficiency, duplication of effort, and reduced data quality and searchability of Arctic geospatial data. The urgency of establishing this framework is heightened by the myriad of data being collected as part of the International Polar Year (IPY) in 2007-2008 and by the current international momentum for an improved and integrated circum-Arctic terrestrial-marine-atmospheric environmental observatories network. The key objective of this project is to lay the foundation for full implementation of an Arctic Spatial Data Infrastructure (ASDI) through an assessment of community needs, readiness, and resources, and through the development of a prototype web-mapping portal.
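
    The discovery function that such an SDI would coordinate can be illustrated with a generic catalogue query. The sketch below searches an OGC Catalogue Service for the Web (CSW) for Arctic records using Python's OWSLib; the endpoint URL is a placeholder, and no ASDI catalogue is implied to exist at it.

      # Generic sketch of catalogue discovery within an SDI: querying an OGC CSW
      # endpoint for records mentioning "Arctic". The URL is a placeholder.
      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      csw = CatalogueServiceWeb("https://catalog.example.org/csw")  # hypothetical endpoint

      query = PropertyIsLike("csw:AnyText", "%Arctic%")
      csw.getrecords2(constraints=[query], maxrecords=10)

      for rec_id, rec in csw.records.items():
          abstract = (rec.abstract or "")[:80]
          print(rec.title, "-", abstract)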
