Sample records for geospatial visual analytics

  1. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  2. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Nagisetty, Venkateswara Ra; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  3. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and support machine learning discoveries, rendering diverse types of geospatial data and facilitating interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, and PostgreSQL, as well as other custom-built software modules.

  4. The geospatial modeling interface (GMI) framework for deploying and assessing environmental models

    USDA-ARS's Scientific Manuscript database

    Geographical information systems (GIS) software packages have been used for close to three decades as analytical tools in environmental management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of ful...

  5. A robust and flexible Geospatial Modeling Interface (GMI) for deploying and evaluating natural resource models

    USDA-ARS's Scientific Manuscript database

    Geographical information systems (GIS) software packages have been used for nearly three decades as analytical tools in natural resource management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of fu...

  6. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research, which is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools that are available, especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or with generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
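
    The stack scripts GDAL through its Perl bindings; as a rough sketch of the same kind of raster-algebra workflow, the following uses GDAL's official Python bindings instead (the file names and the flood threshold are hypothetical):

```python
# Illustrative sketch, not code from the paper: read a DEM with GDAL,
# apply simple raster algebra, and write a georeferenced result.
import numpy as np
from osgeo import gdal

gdal.UseExceptions()

ds = gdal.Open("dem.tif")                      # any GDAL-readable raster
elev = ds.GetRasterBand(1).ReadAsArray().astype(float)

# Raster algebra: flag cells below a (hypothetical) flooding threshold.
flooded = np.where(elev < 10.0, 1, 0).astype(np.uint8)

# Write the result with the same georeferencing as the input.
drv = gdal.GetDriverByName("GTiff")
out = drv.Create("flooded.tif", ds.RasterXSize, ds.RasterYSize, 1, gdal.GDT_Byte)
out.SetGeoTransform(ds.GetGeoTransform())
out.SetProjection(ds.GetProjection())
out.GetRasterBand(1).WriteArray(flooded)
out.FlushCache()
```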

  7. Exploring U.S. Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; Di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research, and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open-standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from the GeoBrain technology developed at George Mason University. This system provides capabilities for online geospatial crop information access, query, and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing, and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables online exploration of U.S. cropland without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.
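
    Because the system exposes standard OGC interfaces, a client request is easy to sketch. The GetMap query below is illustrative only; the endpoint URL and layer name are hypothetical, not the system's actual service:

```python
# Sketch of a standard OGC WMS 1.1.1 GetMap request of the kind such a
# service would answer. Endpoint and layer name are hypothetical.
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "cdl_2009",              # hypothetical CDL layer name
    "SRS": "EPSG:4326",
    "BBOX": "-98.0,40.0,-96.0,42.0",   # lon/lat region of interest
    "WIDTH": 512,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}
resp = requests.get("https://example.org/cdl/wms", params=params, timeout=30)
resp.raise_for_status()
with open("cdl_clip.png", "wb") as f:
    f.write(resp.content)              # a map image clipped to the bbox
```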

  8. Open cyberGIS software for geospatial research and education in the big data era

    NASA Astrophysics Data System (ADS)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) to serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities, CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  9. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
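
    A toy illustration of the geospatial-temporal semantic graph idea (not Sandia's GeoGraphy library): nodes carry entity type, position, and time, and a small template query walks the edges. All names and values are invented:

```python
# Toy spatio-temporal semantic graph with networkx; a query matches a
# pattern such as "vessel observed near a port before a deadline".
import networkx as nx

G = nx.Graph()
G.add_node("vessel_17", kind="vessel", lon=-76.1, lat=36.9, t=1400)
G.add_node("port_A", kind="port", lon=-76.2, lat=36.85)
G.add_edge("vessel_17", "port_A", relation="near", t=1410)

def match(graph, kind_a, relation, kind_b, t_max):
    """Yield node pairs satisfying a simple semantic-temporal template."""
    for u, v, attrs in graph.edges(data=True):
        if attrs.get("relation") == relation and attrs.get("t", 0) <= t_max:
            kinds = {graph.nodes[u]["kind"], graph.nodes[v]["kind"]}
            if kinds == {kind_a, kind_b}:
                yield u, v

print(list(match(G, "vessel", "near", "port", t_max=1440)))
```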

  10. Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays

    NASA Astrophysics Data System (ADS)

    Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko

    The increasing prevalence of distributed human microtasking, crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset, with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra-high-resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real time with, and was guided by, researchers in the remote visual analytics laboratory, who swiftly sifted through incoming crowdsourced data to identify locations flagged as viable archaeological sites.
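
    The paper does not prescribe a particular denoising algorithm; one common approach, shown here as an assumption, is density-based clustering that keeps only the locations enough independent contributors agree on:

```python
# Denoising crowdsourced point markings with DBSCAN: clusters of many
# nearby marks become candidate sites; scattered marks are dropped as noise.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
site = rng.normal([103.5, 45.2], 0.001, size=(40, 2))    # agreement on one spot
noise = rng.uniform([103.4, 45.1], [103.6, 45.3], size=(20, 2))
clicks = np.vstack([site, noise])                        # lon/lat pairs

labels = DBSCAN(eps=0.005, min_samples=10).fit_predict(clicks)
for lab in set(labels) - {-1}:                           # label -1 means noise
    members = clicks[labels == lab]
    print(f"candidate site at {members.mean(axis=0)} from {len(members)} marks")
```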

  11. Forecasting hotspots using predictive visual analytics approach

    DOEpatents

    Maciejewski, Ross; Hafen, Ryan; Rudolph, Stephen; Cleveland, William; Ebert, David

    2014-12-30

    A method for forecasting hotspots is provided. The method may include the steps of receiving input data at an input of the computational device, generating a temporal prediction based on the input data, generating a geospatial prediction based on the input data, and generating output data based on the temporal and geospatial predictions. The output data may be configured to display at least one user interface at an output of the computational device.
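
    A minimal sketch of the recipe as the abstract describes it (my reading, not the claimed implementation): forecast event volume from the time series, then distribute that volume spatially using past event locations:

```python
# Temporal prediction + geospatial prediction combined into a hotspot
# forecast. All data, the smoothing constant, and the grid are invented.
import numpy as np
from scipy.stats import gaussian_kde

events_per_week = np.array([12, 15, 11, 18, 21, 19, 24], dtype=float)
past_xy = np.random.default_rng(1).normal(0, 1, size=(2, 200))  # past event coords

# Temporal prediction: simple exponential smoothing (alpha assumed).
alpha, level = 0.5, events_per_week[0]
for y in events_per_week[1:]:
    level = alpha * y + (1 - alpha) * level
forecast_count = level

# Geospatial prediction: kernel density surface over a grid.
kde = gaussian_kde(past_xy)
gx, gy = np.mgrid[-3:3:60j, -3:3:60j]
intensity = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

# Output: expected events per grid cell next week.
expected = forecast_count * intensity / intensity.sum()
print("hottest cell holds", expected.max(), "expected events")
```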

  12. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Guo, Huadong; Wang, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we examine these issues in depth by studying the components and features of geospatial big data, reviewing relevant scientific research methodologies, and tracing the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and that it is important to explore the generality of growing geospatial data ‘from bottom to top’. In particular, four research areas that most reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future, and some further challenges and thoughts are raised for discussion.

  13. An Interactive Visual Analytics Framework for Multi-Field Data in a Geo-Spatial Context

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyuan; Tong, Xiaonan; McDonnell, Kevin T.

    2013-04-01

    Climate research produces a wealth of multivariate data. These data often have a geospatial reference, and so it is of interest to show them within their geospatial context. One can consider this configuration as a multi-field visualization problem, where the geospace provides the expanse of the field. However, there is a limit on the amount of multivariate information that can be fit within a certain spatial location, and the use of linked multivariate information displays has previously been devised to bridge this gap. In this paper we focus on the interactions in the geographical display, present an implementation that uses Google Earth, and demonstrate it within a tightly linked parallel coordinates display. Several other visual representations, such as pie and bar charts, are integrated into the Google Earth display and can be interactively manipulated. Further, we also demonstrate new brushing and visualization techniques for parallel coordinates, such as fixed-window brushing and correlation-enhanced display. We conceived our system with a team of climate researchers, who have already made a few important discoveries using it. This demonstrates our system's great potential to enable scientific discoveries, possibly also in other domains where data have a geospatial reference.

  14. Increasing the value of geospatial informatics with open approaches for Big Data

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (Esri, DigitalGlobe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases. Collection and Ingest: remotely sensed data processing; data stream processing. Prepare and Structure: SQL and NoSQL databases; data linking; feature identification. Analytics and Visualization: spatial-temporal analytics; machine learning; data exploration. Modeling and Prediction: integrated environmental models; urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes".
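
    A small sketch of the "analytics as a service" and datacube ideas the White Paper points toward: treat a NetCDF product as a cube and reduce it before download. The file name and variable are hypothetical:

```python
# Reduce a NetCDF climate datacube server-side instead of shipping it whole.
# "climate_prediction.nc" and the variable "tas" are invented examples.
import xarray as xr

ds = xr.open_dataset("climate_prediction.nc")   # hypothetical product
tas = ds["tas"]                                 # e.g. surface air temperature

# Spatial subset (a coverage-style slice) followed by temporal aggregation.
arctic = tas.sel(lat=slice(66.5, 90.0))
summary = arctic.mean(dim="time")
summary.to_netcdf("arctic_mean.nc")             # small, analysis-ready result
```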

  15. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library: users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon, as well as advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia). In this presentation, we will discuss core features of each of these tools and will present lessons learned on handling large data in the context of data management, analyses, and visualization.
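
    Loosely the kind of Spark-backed reduction a notebook front end can defer to a cluster: bin a large point archive into grid cells so that only a small aggregate returns to the client. Paths and schema are hypothetical:

```python
# Grid-cell aggregation of a large lon/lat point archive with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("grid-aggregate").getOrCreate()
points = spark.read.csv("s3://bucket/points/*.csv", header=True, inferSchema=True)

cell = 0.25  # grid cell size in degrees
counts = (points
          .withColumn("cx", F.floor(F.col("lon") / cell))
          .withColumn("cy", F.floor(F.col("lat") / cell))
          .groupBy("cx", "cy")
          .count())

small = counts.toPandas()   # the aggregate is small enough to plot client-side
```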

  16. Map LineUps: Effects of spatial structure on graphical inference.

    PubMed

    Beecham, Roger; Dykes, Jason; Meulemans, Wouter; Slingsby, Aidan; Turkay, Cagatay; Wood, Jo

    2017-01-01

    Fundamental to the effective use of visualization as an analytic and descriptive tool is the assurance that presenting data visually provides the capability of making inferences from what we see. This paper explores two related approaches to quantifying the confidence we may have in making visual inferences from mapped geospatial data. We adapt Wickham et al.'s 'Visual Line-up' method as a direct analogy with Null Hypothesis Significance Testing (NHST) and propose a new approach for generating more credible spatial null hypotheses. Rather than using as a spatial null hypothesis the unrealistic assumption of complete spatial randomness, we propose spatially autocorrelated simulations as alternative nulls. We conduct a set of crowdsourced experiments (n=361) to determine the just noticeable difference (JND) between pairs of choropleth maps of geographic units controlling for spatial autocorrelation (Moran's I statistic) and geometric configuration (variance in spatial unit area). Results indicate that people's abilities to perceive differences in spatial autocorrelation vary with baseline autocorrelation structure and the geometric configuration of geographic units. These results allow us, for the first time, to construct a visual equivalent of statistical power for geospatial data. Our JND results add to those provided in recent years by Klippel et al. (2011), Harrison et al. (2014) and Kay & Heer (2015) for correlation visualization. Importantly, they provide an empirical basis for an improved construction of visual line-ups for maps and the development of theory to inform geospatial tests of graphical inference.
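
    The null-generation idea can be sketched directly: simulate a spatially autocorrelated surface and measure its Moran's I. The simultaneous autoregressive (SAR) scheme below is an illustrative choice, not necessarily the authors' simulation method:

```python
# Simulate a spatially autocorrelated null map on a lattice and compute
# Moran's I with a row-standardised rook-adjacency weight matrix.
import numpy as np

n = 10                                   # 10 x 10 lattice of "regions"
N = n * n
rng = np.random.default_rng(42)

W = np.zeros((N, N))
for i in range(n):
    for j in range(n):
        k = i * n + j
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                W[k, (i + di) * n + (j + dj)] = 1.0
W /= W.sum(axis=1, keepdims=True)        # row-standardise

rho = 0.7                                # strength of autocorrelation
y = np.linalg.solve(np.eye(N) - rho * W, rng.normal(size=N))  # SAR field

z = y - y.mean()
morans_i = (N / W.sum()) * (z @ W @ z) / (z @ z)
print(f"Moran's I of simulated null map: {morans_i:.3f}")
```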

  17. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    NASA Astrophysics Data System (ADS)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
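
    A toy version of the dual-cache idea: key tiles by (frame, level, x, y) so one LRU structure serves both the spatial pyramid and the temporal dimension of WAMI playback. The capacity and the loader are assumptions, not KOLAM's internals:

```python
# LRU cache over spatiotemporal pyramid tiles.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity=1024):
        self.capacity = capacity
        self._tiles = OrderedDict()          # (frame, level, x, y) -> bytes

    def get(self, key, loader):
        if key in self._tiles:
            self._tiles.move_to_end(key)     # mark as most recently used
            return self._tiles[key]
        tile = loader(key)                    # fetch from disk / network
        self._tiles[key] = tile
        if len(self._tiles) > self.capacity:
            self._tiles.popitem(last=False)  # evict least recently used
        return tile

cache = TileCache(capacity=4096)
tile = cache.get((120, 3, 5, 7), loader=lambda k: b"...tile bytes...")
```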

  18. The PANTHER User Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus

    2015-09-01

    This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.

  19. A Comprehensive Optimization Strategy for Real-time Spatial Feature Sharing and Visual Analytics in Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Li, W.; Shao, H.

    2017-12-01

    For geospatial-cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response, and decision-making. Especially for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications, their rich geometry and property information facilitates the development of interactive, efficient, and intelligent data analysis and visualization applications. However, the big-data issues of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: 1) pre- and on-the-fly generalization, which automatically determines the proper simplification level through the introduction of an appropriate distance tolerance (ADT) to meet various visualization requirements while speeding up simplification; 2) a progressive attribute transmission method to reduce data size and therefore the service response time; 3) compressed data transmission and dynamic adoption of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to widen the use of web services providing vector data to support real-time spatial feature sharing, visual analytics, and decision-making.
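
    The generalization step can be sketched with an off-the-shelf Douglas-Peucker implementation (shapely's simplify); the zoom-to-tolerance rule below is an illustrative assumption, not the authors' ADT formula:

```python
# Zoom-dependent line simplification: pick a tolerance of roughly one
# on-screen pixel at a given web-map zoom level, then simplify.
from shapely.geometry import LineString

def tolerance_for_zoom(zoom, degrees_per_pixel_at_z0=360 / 256):
    """Approximately one pixel of positional error at the given zoom."""
    return degrees_per_pixel_at_z0 / (2 ** zoom)

coast = LineString([(0, 0), (0.001, 0.0008), (0.002, 0.0011), (0.01, 0.01),
                    (0.02, 0.018), (0.05, 0.03)])
for zoom in (4, 8, 12):
    simplified = coast.simplify(tolerance_for_zoom(zoom), preserve_topology=True)
    print(zoom, len(simplified.coords), "vertices")   # fewer vertices when zoomed out
```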

  20. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Beaver, Justin M; Bogen II, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  21. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as less common tools such as the Unity3D game engine. Game engines provide capabilities not only to explore data but to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data in a way that is often more understandable and plausible to both stakeholders and policy makers than is achieved using traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and their link with emerging research disciplines within Visual Analytics.

  22. The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework

    NASA Astrophysics Data System (ADS)

    Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.

    2016-12-01

    The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During the 6th Session of the UN-GGIM in August 2016 the role of DGGS in the context of the GSGF was formally acknowledged. This paper proposes to highlight the synergies and role of DGGS in the Global Statistical Geospatial Framework and to show examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
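
    A toy illustration of the core DGGS property (not the OGC standard itself): recursive quadrisection of a lon/lat extent gives every cell a unique hierarchical index, and finer cells nest inside coarser ones by prefix:

```python
# Hierarchical cell indexing by recursive quadrisection. Each digit in the
# index picks one quadrant at the next finer level.
def cell_index(lon, lat, resolution):
    """Return a digit string identifying the cell containing (lon, lat)."""
    west, east, south, north = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(resolution):
        mid_lon, mid_lat = (west + east) / 2, (south + north) / 2
        quad = (lon >= mid_lon) + 2 * (lat >= mid_lat)
        digits.append(str(quad))
        west, east = (mid_lon, east) if lon >= mid_lon else (west, mid_lon)
        south, north = (mid_lat, north) if lat >= mid_lat else (south, mid_lat)
    return "".join(digits)

# Nesting by prefix makes aggregation across resolutions a string operation.
idx6 = cell_index(151.2, -33.9, 6)
idx3 = cell_index(151.2, -33.9, 3)
print(idx6, idx3, idx6.startswith(idx3))   # the coarser index is a prefix
```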

  23. PlanetSense: A Real-time Streaming and Spatio-temporal Analytics Platform for Gathering Geo-spatial Intelligence from Open Source Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O

    Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to it the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.

  24. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in the exploration of earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image-dragging functionality on the Web GIS. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  25. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Drouhard, Margaret G; Beaver, Justin M

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term-frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  26. Regulating outdoor advertisement boards; employing spatial decision support system to control urban visual pollution

    NASA Astrophysics Data System (ADS)

    Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.

    2016-06-01

    Unmanaged placement, size, location, structure, and content of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, responsible authorities are handicapped in effective implementation due to the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for the regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to understand whether a new board will affect the existing linear density on a particular road and whether it violates any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built from open geospatial tools such as OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets, including the road network, locations of existing billboards, and building parcels with land use information, to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS that can assist in solving space-related iterative decision challenges on outdoor advertisements. Employing such a system will result in effective implementation of regulations, resulting in visual harmony and aesthetic improvement in urban communities.
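
    The AHP/WLC scoring chain is standard and easy to sketch: derive criterion weights from a pairwise-comparison matrix via its principal eigenvector, then score candidate sites by weighted linear combination. The judgements and scores below are made up, not the study's elicited values:

```python
# AHP weights from a pairwise-comparison matrix, then WLC suitability.
import numpy as np

# Pairwise comparisons among three illustrative criteria:
# road density, land-use buffers, distance to existing boards.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                        # AHP weights, summing to 1

# WLC: suitability of two candidate billboard sites (criteria scaled to 0-1).
sites = np.array([[0.8, 0.4, 0.9],     # site 1
                  [0.3, 0.9, 0.5]])    # site 2
suitability = sites @ w
print(dict(zip(["site 1", "site 2"], suitability.round(3))))
```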

  27. The role of visualization in learning from computer-based images

    NASA Astrophysics Data System (ADS)

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-05-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.

  28. Improving data discoverability, accessibility, and interoperability with the Esri ArcGIS Platform at the NASA Atmospheric Science Data Center (ASDC).

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2017-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility, and interoperability to meet the diversifying user requirements of government, private, public, and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS) while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services provide greater exposure of ASDC data holdings to the GIS community and allow for broader sharing and distribution to various end users. These capabilities provide interactive visualization tools and improved geospatial analytical tools for a mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.

  29. Transduction between worlds: using virtual and mixed reality for earth and planetary science

    NASA Astrophysics Data System (ADS)

    Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C. D.; Benoy, N.

    2017-12-01

    Virtual reality (VR) and augmented reality (AR) have the potential to transform the way we visualize multidimensional geospatial datasets in support of geoscience research, exploration and analysis. The beauty of virtual environments is that they can be built at any scale, users can view them at many levels of abstraction, move through them in unconventional ways, and experience spatial phenomena as if they had superpowers. Similarly, augmented reality allows you to bring the power of virtual 3D data visualizations into everyday spaces. Spliced together, these interface technologies hold incredible potential to support 21st-century geoscience. In my ongoing research, my team and I have made significant advances to connect data and virtual simulations with real geographic spaces, using virtual environments, geospatial augmented reality and mixed reality. These research efforts have yielded new capabilities to connect users with spatial data and phenomena. These innovations include: geospatial x-ray vision; flexible mixed reality; augmented 3D GIS; situated augmented reality 3D simulations of tsunamis and other phenomena interacting with real geomorphology; augmented visual analytics; and immersive GIS. These new modalities redefine the ways in which we can connect digital spaces of spatial analysis, simulation and geovisualization with geographic spaces of data collection, fieldwork, interpretation and communication. In a way, we are talking about transduction between real and virtual worlds; a mixed reality approach lets us link the two. This paper presents a selection of our 3D geovisual interface projects in terrestrial, coastal, underwater and other environments. Using rigorous applied geoscience data, analyses and simulations, our research aims to transform the novelty of virtual and augmented reality interface technologies into game-changing mixed reality geoscience.

  30. Geospatial Perspective: Toward a Visual Political Literacy Project in Education, Health, and Human Services

    ERIC Educational Resources Information Center

    Hogrebe, Mark C.; Tate, William F., IV

    2012-01-01

    In this chapter, "geospatial" refers to geographic space that includes location, distance, and the relative position of things on the earth's surface. Geospatial perspective calls for the addition of a geographic lens that focuses on place and space as important contextual variables. A geospatial view increases one's understanding of…

  31. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis, and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis, and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as open source and through Amazon AWS Marketplace VM images; hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits, and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  32. Plug and Play web-based visualization of mobile air monitoring data (Abstract)

    EPA Science Inventory

    EPA’s Real-Time Geospatial (RETIGO) Data Viewer is a new web-based tool that reduces the technical barrier to visualizing and understanding geospatial air data time series collected using wearable, bicycle-mounted, or vehicle-mounted air sensors. The RETIGO tool, with anticipated...

  33. Airborne single particle mass spectrometers (SPLAT II & miniSPLAT) and new software for data visualization and analysis in a geo-spatial context.

    PubMed

    Zelenyuk, Alla; Imre, Dan; Wilson, Jacqueline; Zhang, Zhiyuan; Wang, Jun; Mueller, Klaus

    2015-02-01

    Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles: two fundamental properties that determine an aerosol's optical properties and its ability to serve as cloud condensation or ice nuclei. Here we present our aircraft-compatible single particle mass spectrometers, SPLAT II and its new, miniaturized version, miniSPLAT, which measure in situ and in real time the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. Although miniSPLAT's size, weight, and power consumption are significantly smaller, its performance is on par with SPLAT II. Both instruments operate in dual data acquisition mode to measure, in addition to single particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single particle mass spectrometers, along with other aerosol and cloud characterization instruments on board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geo-spatial context, using interactive and fully coupled Google Earth and Parallel Coordinates displays. Here we illustrate the utility of ND-Scope to visualize the spatial distribution of atmospheric particles of different compositions and to explore the relationship between individual particle compositions and their activity as cloud condensation nuclei.
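
    Not ND-Scope itself, but a minimal parallel-coordinates view of the kind it couples to Google Earth, drawn with pandas/matplotlib over made-up single-particle records:

```python
# Parallel-coordinates view of synthetic single-particle observables.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(7)
n = 60
df = pd.DataFrame({
    "diameter_nm": rng.normal(300, 60, n),   # invented measurements
    "sulfate": rng.uniform(0, 1, n),
    "organics": rng.uniform(0, 1, n),
    "soot": rng.uniform(0, 1, n),
    "class": rng.choice(["CCN-active", "inactive"], n),
})
parallel_coordinates(df, class_column="class", alpha=0.4)
plt.title("Single-particle records across observables")
plt.show()
```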

  34. Architecture of a spatial data service system for statistical analysis and visualization of regional climate changes

    NASA Astrophysics Data System (ADS)

    Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available on the node. It also contains geospatial data processing services (WPS) based on a modular computing backend realizing statistical processing functionality and thus providing analysis of large datasets, with results available for visualization and export to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in a prototype of the system to provide capabilities to work with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.

  35. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets, such as those coming from precipitation satellites, viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit to the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We have also used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interacting with the data.

  16. Multi-focused geospatial analysis using probes.

    PubMed

    Butkiewicz, Thomas; Dou, Wenwen; Wartell, Zachary; Ribarsky, William; Chang, Remco

    2008-01-01

    Traditional geospatial information visualizations often present views that restrict the user to a single perspective. When zoomed out, local trends and anomalies become suppressed and lost; when zoomed in for local inspection, spatial awareness and comparison between regions become limited. In our model, coordinated visualizations are integrated within individual probe interfaces, which depict the local data in user-defined regions-of-interest. Our probe concept can be incorporated into a variety of geospatial visualizations to empower users with the ability to observe, coordinate, and compare data across multiple local regions. It is especially useful when dealing with complex simulations or analyses where behavior in various localities differs from other localities and from the system as a whole. We illustrate the effectiveness of our technique over traditional interfaces by incorporating it within three existing geospatial visualization systems: an agent-based social simulation, a census data exploration tool, and a 3D GIS environment for analyzing urban change over time. In each case, the probe-based interaction enhances spatial awareness, improves inspection and comparison capabilities, expands the range of scopes, and facilitates collaboration among multiple users.

  17. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the end-to-end application services and virtualized computing framework of HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBs) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will be made available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data, without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  18. Examining the Enactment of Web GIS on Students' Geospatial Thinking and Reasoning and Tectonics Understandings

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Fu, Qiong; Bressler, Denise; Vallera, Farah L.

    2015-01-01

    Geospatially enabled learning technologies may enhance Earth science learning by placing emphasis on geographic space, visualization, scale, representation, and geospatial thinking and reasoning (GTR) skills. This study examined if and how a series of Web geographic information system investigations that the researchers developed improved urban…

  19. Geospatial Services in Special Libraries: A Needs Assessment Perspective

    ERIC Educational Resources Information Center

    Barnes, Ilana

    2013-01-01

    Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken on a growing, central role in information management and visualization. Geospatial services run the gamut of products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…

  20. Airborne Single Particle Mass Spectrometers (SPLAT II & miniSPLAT) and New Software for Data Visualization and Analysis in a Geo-Spatial Context

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zelenyuk, Alla; Imre, D.; Wilson, Jacqueline M.

    2015-02-01

    Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles - two fundamental properties that determine an aerosol's optical properties and its ability to serve as cloud condensation or ice nuclei. Here we present miniSPLAT, our new aircraft-compatible single particle mass spectrometer, which measures, in situ and in real time, the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. miniSPLAT operates in dual data acquisition mode to measure, in addition to single particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. Compared to our previous instrument, SPLAT II, miniSPLAT has been significantly reduced in size, weight, and power consumption without loss in performance. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single particle mass spectrometers, along with other aerosol and cloud characterization instruments on board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geospatial context, using interactive, fully coupled Google Earth and parallel coordinates displays. Here we illustrate the utility of ND-Scope for visualizing the spatial distribution of atmospheric particles of different compositions, and for exploring the relationship between individual particle composition and activity as cloud condensation nuclei.
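
    ND-Scope itself is not reproduced here; the sketch below illustrates its parallel-coordinates display concept with the generic pandas/matplotlib equivalent, using fabricated single-particle records.

    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import parallel_coordinates

    # Toy single-particle records: size plus composition fractions, one row per particle.
    df = pd.DataFrame({
        "size_um":  [0.12, 0.45, 0.30, 0.80],
        "sulfate":  [0.60, 0.10, 0.30, 0.20],
        "organics": [0.30, 0.70, 0.50, 0.10],
        "soot":     [0.10, 0.20, 0.20, 0.70],
        "class":    ["sulfate-rich", "organic-rich", "mixed", "soot-rich"],
    })
    parallel_coordinates(df, "class")   # one polyline per particle, colored by class
    plt.ylabel("normalized observable")
    plt.show()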

  1. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs, together with sophisticated Scalable Vector Graphics (SVG) features, in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, integrating them into other GIS libraries requires that geoinformatics visualizations be constructed manually and separately, or that the code be mixed in an unintuitive way. We developed GeoJS with the following motivations:
    • To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
    • To develop an extensible library that can combine data from multiple sources and render them using multiple backends
    • To build a library that works well with existing scientific visualization tools such as VTK
    We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analyses regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking, selecting, and clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.
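
    GeoJS itself runs in the browser; a typical server-side companion step is converting a NetCDF slice into JSON that a GeoJS feature layer can consume. A minimal Python sketch of that step follows; the file name and variable names are assumptions about a CF-style input.

    import json
    from netCDF4 import Dataset

    nc = Dataset("tas_monthly.nc")          # hypothetical CF-style climate file
    lats = nc.variables["lat"][:]
    lons = nc.variables["lon"][:]
    tas = nc.variables["tas"][0, :, :]      # first time step

    features = []
    for i in range(0, len(lats), 10):       # subsample to keep the layer light
        for j in range(0, len(lons), 10):
            features.append({
                "type": "Feature",
                "geometry": {"type": "Point",
                             "coordinates": [float(lons[j]), float(lats[i])]},
                "properties": {"tas": float(tas[i, j])},
            })

    with open("tas_points.geojson", "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f)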

  2. Estuary Data Mapper: A Stand-Alone Tool for Geospatial Data Access, Visualization and Download for Estuaries and Coastal Watersheds of the United States

    EPA Science Inventory

    The US EPA Estuary Data Mapper (EDM; http://badger.epa.gov/rsig/edm/index.html) has been designed as a free stand-alone tool for geospatial data discovery, visualization, and data download for estuaries and their associated watersheds in the conterminous United States. EDM requi...

  3. Estuary Data Mapper: A Stand-Alone Tool for Geospatial Data Access, Visualization and Download for Estuaries and Coastal Watersheds of the United States. (UNH)

    EPA Science Inventory

    The US EPA Estuary Data Mapper (EDM; http://badger.epa.gov/rsig/edm/index.html) has been designed as a free stand-alone tool for geospatial data discovery, visualization, and data download for estuaries and their associated watersheds in the conterminous United States. EDM requi...

  4. GeoNotebook: Browser based Interactive analysis and visualization workflow for very large climate and geospatial datasets

    NASA Astrophysics Data System (ADS)

    Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.

    2016-12-01

    Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets, and this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets, we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment and that can communicate back with a server, which can perform operations like data subsetting on a cloud-based cluster.
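
    For contrast, here is a minimal sketch of the conventional notebook stack the abstract describes (Geopandas backed by Fiona/GDAL and Shapely, rendered with matplotlib); the shapefile path is a placeholder.

    import geopandas as gpd
    import matplotlib.pyplot as plt

    gdf = gpd.read_file("watersheds.shp")          # Fiona/GDAL does the reading
    gdf = gdf.to_crs(epsg=3857)                    # reproject to web mercator
    gdf["area_km2"] = gdf.geometry.area / 1e6      # Shapely-based geometry ops
    ax = gdf.plot(column="area_km2", legend=True)  # static matplotlib rendering
    ax.set_axis_off()
    plt.show()
    # GeoNotebook replaces this static output with one dynamic map whose
    # selections feed back into the Python kernel.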

  5. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena, and results are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a visualization of the spatial data analyses together with a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction, accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach to implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface geospatial studies (e.g. induced seismicity risk).
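
    A toy sketch of the variable-grid idea (an illustration, not the authors' implementation): cells are subdivided only where sample points are dense, so large output cells visually flag sparsely sampled, more uncertain regions, and each cell carries its own uncertainty attributes.

    import numpy as np

    def variable_grid(pts, vals, x0, y0, size, min_pts=20, min_size=1.0):
        """Return (x0, y0, size, mean, stderr) cells over [x0,x0+size) x [y0,y0+size)."""
        inside = ((pts[:, 0] >= x0) & (pts[:, 0] < x0 + size) &
                  (pts[:, 1] >= y0) & (pts[:, 1] < y0 + size))
        v = vals[inside]
        if v.size == 0:
            return []
        if v.size < min_pts or size <= min_size:
            # Too few samples to justify finer cells: stop here; the large cell
            # itself communicates the uncertainty of this region.
            return [(x0, y0, size, float(v.mean()), float(v.std() / np.sqrt(v.size)))]
        half = size / 2.0
        cells = []
        for dx in (0.0, half):
            for dy in (0.0, half):
                cells += variable_grid(pts, vals, x0 + dx, y0 + dy, half, min_pts, min_size)
        return cells

    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 16, size=(400, 2))        # fabricated sample locations
    vals = pts[:, 0] + rng.normal(0, 1.0, 400)     # eastward trend plus noise
    for cell in variable_grid(pts, vals, 0.0, 0.0, 16.0):
        print(cell)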

  6. Delivery of Forecasted Atmospheric Ozone and Dust for the New Mexico Environmental Public Health Tracking System - An Open Source Geospatial Solution

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Sanchez-Silva, R.; Cavner, J. A.

    2010-12-01

    New Mexico's Environmental Public Health Tracking System (EPHTS), funded by the Centers for Disease Control and Prevention (CDC) Environmental Public Health Tracking Network (EPHTN), aims to improve health awareness and services by linking health effects data with levels and frequency of environmental exposure. As a public health decision-support system, EPHTS includes state-of-the-art statistical analysis tools; geospatial visualization tools; data discovery, extraction, and delivery tools; and environmental/public health linkage information. As part of its mandate, EPHTS issues public health advisories and forecasts of environmental conditions that have consequences for human health. Through a NASA-funded partnership between the University of New Mexico and the University of Arizona, NASA Earth Science results are fused into two existing models (the Dust Regional Atmospheric Model (DREAM) and the Community Multiscale Air Quality (CMAQ) model) in order to improve forecasts of atmospheric dust, ozone, and aerosols. The results and products derived from the outputs of these models are made available to an open source mapping component of the New Mexico EPHTS. In particular, these products are integrated into a Django content management system using GeoDjango, GeoAlchemy, and other OGC-compliant geospatial libraries written in the Python and C++ programming languages. Capabilities of the resulting mapping system include indicator-based thematic mapping, data delivery, and analytical functions. DREAM and CMAQ outputs can be inspected, via REST calls, through temporal and spatial subsetting of the atmospheric concentration data across the analytical units employed by the public health community. This paper describes details of the architecture and the integration of NASA Earth Science results into the EPHTS decision-support system.
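
    A minimal sketch, with hypothetical model and field names, of how such forecast products can be exposed through GeoDjango; this is not the project's actual schema.

    from django.contrib.gis.db import models

    class OzoneForecast(models.Model):
        """One CMAQ ozone forecast value aggregated to a public-health analysis unit."""
        region = models.PolygonField(srid=4326)    # analytical unit geometry
        valid_time = models.DateTimeField()
        ozone_ppb = models.FloatField()

    # Temporal and spatial subsetting analogous to the REST-backed queries described:
    # OzoneForecast.objects.filter(valid_time__date="2010-07-01",
    #                              region__intersects=county_boundary)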

  7. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses, big data and otherwise, of subsurface natural and engineered systems are based on variable-resolution, discontinuous, and often point-driven data that represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms, through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction, accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach to implementing data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.

  8. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    NASA Astrophysics Data System (ADS)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big data in geospatial technology poses a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to providing feasible solutions. However, users have been unable to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". It uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server while other functions run in the native Python environment. It uses functional and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing or HPC environment with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
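
    Arc4nix's exact API is not reproduced here; this sketch only illustrates the decoupled, meta-programming pattern the abstract describes, in which the client assembles Python source containing the arcpy call and ships it to a remote geoprocessing server. The endpoint URL is a hypothetical placeholder.

    import textwrap
    import requests

    def remote_arcpy(call, server="http://gp-server.example.org/run"):
        """Send one arcpy statement to a geoprocessing server; return its result."""
        code = textwrap.dedent("""
            import arcpy
            result = {call}
        """).format(call=call)
        resp = requests.post(server, json={"code": code})
        resp.raise_for_status()
        return resp.json()["result"]

    # The heavy geospatial calculation runs remotely; light logic stays local.
    out = remote_arcpy('arcpy.sa.Slope("dem.tif").save("slope.tif")')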

  9. NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive web applications for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (e.g., METRIC and SWAP, via Linux- and Python-based scripting). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of hydrological data from several sources in one place. The study indicates that Internet GIS, developed using advanced technologies, provides valuable educational potential to users in hydrology and irrigation engineering, and suggests that such a system can support advanced hydrological data access and analysis tools to improve the utility of data in operations. Keywords: Hydrological Information System, NebHydro, water management, data sharing, data visualization, ArcGIS Server.

  10. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, Noel

    2013-04-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment that includes a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment for interactive data and algorithm development and exploration, and a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based datasets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, users can add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it is required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring considerable analytic and technical firepower to bear in a transparent fashion, including AI-based classification via integration with Google's machine learning infrastructure; publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth; and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
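
    A minimal sketch using the public Earth Engine Python API (after authentication); the collection ID and region are illustrative choices, not from the talk.

    import ee

    ee.Initialize()
    ndvi = (ee.ImageCollection("MODIS/006/MOD13A2")    # 16-day, 1 km NDVI
            .filterDate("2012-01-01", "2012-12-31")
            .select("NDVI")
            .mean())
    # Nothing has been computed yet -- evaluation is lazy until a value is requested.
    region = ee.Geometry.Rectangle([-10.0, 35.0, 30.0, 60.0])   # lon/lat box
    stats = ndvi.reduceRegion(reducer=ee.Reducer.mean(),
                              geometry=region, scale=10000)
    print(stats.getInfo())   # this call triggers the distributed computation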

  11. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, N.

    2012-12-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment that includes a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment for interactive data and algorithm development and exploration, and a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based datasets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, users can add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geospatial data. All computation is performed lazily; nothing is computed until it is required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring considerable analytic and technical firepower to bear in a transparent fashion, including AI-based classification via integration with Google's machine learning infrastructure; publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth; and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  12. DIY Geospatial Web Service Chains: GeoChaining Makes It Easy

    NASA Astrophysics Data System (ADS)

    Wu, H.; You, L.; Gui, Z.

    2011-08-01

    It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge, geospatial practitioners may lack knowledge about web services and service chains, and end users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards can lower the learning curve and improve modelling skills, so beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc. into a unified platform. By employing a virtual globe, users can intuitively visualize raw data and the results produced by GeoChaining. All of these features allow users to easily start using GWSC, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSCs.

  13. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) a lack of analytic functions and computing resources (e.g., analysis software, computing models, and high-performance computing systems) with which to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to remove these two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness for building more capable computing infrastructure for ES studies that can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students in their formal or informal educational training, research projects, career development, and lifelong learning.

  14. Geospatial Data Science Applications and Visualizations | Geospatial Data

    Science.gov Websites

    Since before the time of Google Maps, NREL has used the internet to allow stakeholders to view and explore geospatial data; around the world, these maps drive understanding. See NREL's collection of key maps for examples and featured analyses.

  15. Fast Tracking Data to Informed Decisions: An Advanced Information System to Improve Environmental Understanding and Management (Invited)

    NASA Astrophysics Data System (ADS)

    Minsker, B. S.; Myers, J.; Liu, Y.; Bajcsy, P.

    2010-12-01

    Emerging sensing and information technologies are rapidly creating a new paradigm for environmental research and management, in which data from multiple sensors and information sources can guide real-time adaptive observation and decision making. This talk will provide an overview of emerging cyberinfrastructure and three case studies that illustrate its potential: combined sewer overflows in Chicago, hypoxia in Corpus Christi Bay, Texas, and sustainable agriculture in Illinois. An advanced information system for real-time decision making and visual geospatial analytics will be presented as an example of cyberinfrastructure that enables easier implementation of numerous real-time applications.

  16. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and apply any type of texture to them by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities combined with the above visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) and motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds, populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

  17. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    NASA Astrophysics Data System (ADS)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) has been developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles between consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to datasets that are part of global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is its ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data layers based on specific conditions (e.g., analyzing the flooding risk of a property based on topography, the soil's ability to hold water, and forecasted precipitation) or to retrieve information about locations that share similar weather and vegetation patterns during extreme weather events such as heat waves.
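
    A toy sketch of the indexing idea (not IBM's implementation): nest a pixel's location into a resolution-doubling global grid and append the time stamp, so that a key-ordered store such as HBase can answer area/period queries with prefix scans.

    def cell_key(lon, lat, level):
        """Interleave lon/lat bits; each extra level doubles the grid resolution."""
        x = int((lon + 180.0) / 360.0 * (1 << level))
        y = int((lat + 90.0) / 180.0 * (1 << level))
        digits = []
        for bit in range(level - 1, -1, -1):
            digits.append(str(((x >> bit) & 1) | (((y >> bit) & 1) << 1)))
        return "".join(digits)

    def row_key(lon, lat, level, timestamp):
        return "%s:%s" % (cell_key(lon, lat, level), timestamp)

    print(row_key(-73.96, 41.27, 16, "2015-06-01T12:00"))
    # Nearby pixels share key prefixes, so one prefix scan returns a neighborhood.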

  18. LSIViewer 2.0 - A Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have predominantly been applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model, in which the server handles the data and the client handles rendering and visualization, has been the most prevalent approach in Web GIS. While client devices have become functionally more powerful over recent years, the above model has largely ignored them and still follows a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualization system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system supports multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render vector data using LSIViewer is comparable to that of a desktop GIS application, QGIS, on an identical system.

  19. Geo-epidemiologic mapping in the new public health surveillance. The malaria case in Chiapas, Mexico, 2002.

    PubMed

    Castillo-Salgado, Carlos

    2017-01-01

    The new public health surveillance requires, at the global, national and local levels, the use of new authoritative analytical approaches and tools for better recognition of the epidemiologic characteristics of the priority health events and risk factors affecting population health. The identification of events in time and space is of fundamental importance, so that the geospatial description of the disease situation facilitates the identification of social, environmental and health-care-related risks. This assessment examines the application and use of geospatial tools for identifying relevant spatial and epidemiological clusters of malaria in Chiapas, Mexico. The study design was ecological, and the level of aggregation of the collected epidemiological and spatial variables was the municipality. The data were collected in all municipalities of the state of Chiapas, Mexico during the years 2000-2002. The main outcome variable was cases and types of malaria diagnosed by blood smears in weekly reports. Independent variables were age, sex, ethnicity and literacy of the malaria cases, and environmental factors such as altitude and road type and network in the municipalities and cities of Chiapas. The production of thematic maps and the application of geospatial analytical tools, such as Moran's I and local indicator of spatial autocorrelation (LISA) metrics for malaria clustering, allowed the visualization and recognition that the important population risk factors associated with high malaria incidence in Chiapas were a low literacy rate and areas with a high percentage of indigenous population, reflecting the social inequality gaps in health and the great burden of disease affecting this vulnerable group in Chiapas. The presence of road networks allowed greater spatial diffusion of malaria. An important epidemiological and spatial cluster of malaria was identified in the areas and populations in the proximity of the southern border. The use of geospatial metrics in local areas will assist in the epidemiological stratification of malaria for better targeting of more effective and equitable prevention and control interventions. Copyright: © 2017 Secretaría de Salud.
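
    A minimal sketch, using the PySAL ecosystem, of the global and local autocorrelation metrics named above; the lattice and incidence values are toy stand-ins for the Chiapas municipal data.

    import numpy as np
    from libpysal.weights import lat2W
    from esda.moran import Moran, Moran_Local

    w = lat2W(10, 10)                      # contiguity weights on a 10x10 lattice
    y = np.random.default_rng(1).poisson(5, 100).astype(float)  # toy incidence
    mi = Moran(y, w)
    print(mi.I, mi.p_sim)                  # global Moran's I and permutation p-value
    lisa = Moran_Local(y, w)
    hot = (lisa.q == 1) & (lisa.p_sim < 0.05)   # significant high-high clusters
    print(int(hot.sum()), "hotspot units")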

  20. Mining patterns in persistent surveillance systems with smart query and visual analytics

    NASA Astrophysics Data System (ADS)

    Habibi, Mohammad S.; Shirkhodaie, Amir

    2013-05-01

    In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps analysts take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. The need to identify and offset these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require a method of filtration before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from semantically annotated sensor messages. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a temporal Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We present a new visual analytic tool for testing and evaluating group activities detected under this control scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
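
    A plain NumPy sketch of the dynamic-time-warping distance at the core of the described pattern matching; the two event-feature sequences are fabricated.

    import numpy as np

    def dtw(a, b):
        """Classic O(len(a)*len(b)) DTW distance between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    template = np.array([0, 1, 3, 4, 2, 0], dtype=float)   # stored activity pattern
    observed = np.array([0, 1, 1, 3, 4, 4, 2, 0], dtype=float)
    print(dtw(template, observed))   # small distance: same activity, time-warped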

  1. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing; within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow, Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains involving spatial properties. We tested the performance of the platform on taxi trajectory analysis. The results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
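
    A minimal PySpark sketch of the kind of trajectory aggregation behind the taxi test case; the input path and column names are assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("taxi-grid").getOrCreate()
    df = spark.read.csv("taxi_points.csv", header=True, inferSchema=True)  # lon, lat, ts
    cells = (df.withColumn("cx", F.floor(F.col("lon") / 0.005))   # ~500 m bins
               .withColumn("cy", F.floor(F.col("lat") / 0.005))
               .groupBy("cx", "cy")
               .count())
    cells.orderBy(F.desc("count")).show(10)   # densest pick-up/drop-off cells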

  2. Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories

    NASA Astrophysics Data System (ADS)

    Scharl, Arno

    International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example when Web and television coverage of Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.

  3. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead it supports diverse needs, ranging from a feature-rich data management system to complex scientific tools and workflows.

  4. Mapping a Difference: The Power of Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students who want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GSTs to hone their spatial thinking skills and to complete extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem-solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of these technologies on students' spatial thinking skills, and discuss the successes and challenges of developing K-12 partnerships centered on geospatial visualization.

  5. Interactive Visualization of Near Real-Time and Production Global Precipitation Mission Data Online Using CesiumJS

    NASA Astrophysics Data System (ADS)

    Lammers, M.

    2016-12-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL API to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high-impact events observed by GPM constellation satellites.

  6. Interactive Visualization of Near Real Time and Production Global Precipitation Measurement (GPM) Mission Data Online Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matthew

    2016-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL API to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high-impact events observed by GPM constellation satellites.

  7. Geospatial Data Science Research | Geospatial Data Science | NREL

    Science.gov Websites

    NREL web page describing geospatial data science research: data, maps, and tools that determine which energy technologies are viable solutions across the globe, using GIS to manipulate, manage, and analyze multidisciplinary geographic and energy data for a range of applications and visualizations, including renewable energy technical potential analyses.

  8. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data are critically important for a wide scope of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data are created by different organizations using different methods - remote sensing observations, field surveys, model simulations - and stored in various formats. Geospatial data are therefore diverse and heterogeneous, which creates a significant barrier to sharing and using them, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way to discover geospatial information, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But standard mechanisms for data discovery and access are not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address the geospatial data interoperability issue by standardizing data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) that provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization: how diverse and historical data archived at the ORNL DAAC were converted into standard, non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
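
    A minimal sketch of the practice described: writing a variable into NetCDF with CF-convention coordinate metadata so downstream tools can interpret it without guesswork. The file and variable names are chosen for illustration.

    import numpy as np
    from netCDF4 import Dataset

    nc = Dataset("nee_example.nc", "w")
    nc.Conventions = "CF-1.6"
    nc.createDimension("time", None)
    nc.createDimension("lat", 180)
    nc.createDimension("lon", 360)

    lat = nc.createVariable("lat", "f4", ("lat",))
    lat.standard_name = "latitude"
    lat.units = "degrees_north"
    lat[:] = np.arange(-89.5, 90.0, 1.0)

    lon = nc.createVariable("lon", "f4", ("lon",))
    lon.standard_name = "longitude"
    lon.units = "degrees_east"
    lon[:] = np.arange(-179.5, 180.0, 1.0)

    time = nc.createVariable("time", "f8", ("time",))
    time.units = "days since 2000-01-01 00:00:00"
    time.calendar = "standard"

    nee = nc.createVariable("nee", "f4", ("time", "lat", "lon"), fill_value=-9999.0)
    nee.long_name = "net ecosystem exchange"   # illustrative variable
    nee.units = "kgC m-2 s-1"
    nc.close()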

  9. Data Visualization and Geospatial Tools | Geospatial Data Science | NREL

    Science.gov Websites

    Tools for exploring which renewable resources are available in a specific area. General analysis tools include the Renewable Energy Atlas, for viewing the geographic distribution of wind, solar, geothermal, hydropower, and biomass resources in the United States, and the Solar and Wind Energy Resource Assessment (SWERA) model, for access to international renewable energy resource data.

  10. Geospatial Data as a Service: Towards planetary scale real-time analytics

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and different interfaces have been defined to provide data services. Unfortunately, there is considerable variation in the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observations, climate and weather forecasting are examples of these communities which generate large amounts of geospatial data. The NCI has invested significant effort in developing a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use cases that drove GSKY's collaborative design, development and production deployment, and show that our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.
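
    Because delivery is based on OGC standards, a client needs nothing server-specific to pull a rendered subset: an ordinary WMS GetMap query suffices. A sketch in Python; the endpoint and layer name below are hypothetical placeholders, not actual GSKY services:

    ```python
    from urllib.parse import urlencode
    from urllib.request import urlretrieve

    # Hypothetical endpoint standing in for a GSKY-like WMS; any OGC WMS
    # accepts this query shape.
    base = "https://example.org/ows"
    params = {
        "service": "WMS", "version": "1.3.0", "request": "GetMap",
        "layers": "landsat8_nbar_16day",      # hypothetical layer name
        "crs": "EPSG:4326",
        "bbox": "-44.0,113.0,-10.0,154.0",    # WMS 1.3.0 + EPSG:4326: minlat,minlon,maxlat,maxlon
        "width": "1024", "height": "768",
        "format": "image/png",
        "time": "2017-06-01T00:00:00.000Z",   # the time dimension selects the composite
    }
    urlretrieve(f"{base}?{urlencode(params)}", "subset.png")
    ```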

  11. Visualization and Ontology of Geospatial Intelligence

    NASA Astrophysics Data System (ADS)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced by the prevalence of location-based services, afforded by ubiquitous cell phone usage. It is also manifested by the popularity of internet engines such as Google Earth. As we commute to work or travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  12. KML Tours: A New Platform for Exploring and Sharing Geospatial Data

    NASA Astrophysics Data System (ADS)

    Barcay, D. P.; Weiss-Malik, M.

    2009-12-01

    Google Earth and other virtual globes have allowed millions of people to explore the world from their own homes. This technology has also raised the bar for professional visualizations, enabling interactive 3D visualizations to be created from massive datasets and shared using the KML language. For academics and professionals alike, an engaging presentation of your geospatial data is generally expected and can be the most effective form of advertisement. To that end, we released 'Touring' in Google Earth 5.0: a new medium for cinematic expression, visualized in Google Earth and written as extensions to the KML language. In a KML tour, the author has fine-grained control over the entire visual experience: precisely moving the virtual camera through the world while dynamically modifying the content, style, position, and visibility of the displayed data. An author can synchronize audio to this experience, bringing further immersion to a visualization. KML tours can help engage a broad user base and convey subtle concepts that aren't immediately apparent in traditional geospatial content. Unlike a pre-rendered video, a KML tour maintains the rich interactivity of Google Earth, allowing users to continue exploring your content and to mash up other content with your visualization. This session will include conceptual explanations of the Touring feature in Google Earth and the structure of the touring KML extensions, as well as examples of compelling tours.
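
    A tour is ordinary KML plus gx-namespace playlist elements. The sketch below generates a two-stop tour in Python; coordinates, durations, and file names are illustrative only:

    ```python
    # A minimal KML tour: the camera flies between two views while a placemark
    # stays interactive in the globe.
    flyto = """    <gx:FlyTo>
          <gx:duration>{dur}</gx:duration>
          <gx:flyToMode>smooth</gx:flyToMode>
          <LookAt>
            <longitude>{lon}</longitude><latitude>{lat}</latitude>
            <range>{rng}</range><tilt>{tilt}</tilt><heading>0</heading>
          </LookAt>
        </gx:FlyTo>"""

    stops = [dict(dur=5, lon=-122.0, lat=37.0, rng=500000, tilt=0),
             dict(dur=8, lon=-122.08, lat=37.42, rng=2000, tilt=65)]

    kml = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2"
         xmlns:gx="http://www.google.com/kml/ext/2.2">
      <Document>
        <gx:Tour>
          <name>Example tour</name>
          <gx:Playlist>
    {stops}
          </gx:Playlist>
        </gx:Tour>
        <Placemark>
          <name>Target site</name>
          <Point><coordinates>-122.08,37.42,0</coordinates></Point>
        </Placemark>
      </Document>
    </kml>""".format(stops="\n".join(flyto.format(**s) for s in stops))

    with open("tour.kml", "w") as f:
        f.write(kml)
    ```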

  13. WC WAVE - Integrating Diverse Hydrological-Modeling Data and Services Into an Interoperable Geospatial Infrastructure

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Baros, S.; Barrett, H.; Savickas, J.; Erickson, J.

    2015-12-01

    WC WAVE (Western Consortium for Watershed Analysis, Visualization and Exploration) is a collaborative research project between the states of Idaho, Nevada, and New Mexico that is funded under the National Science Foundation's Experimental Program to Stimulate Competitive Research (EPSCoR). The goal of the project is to understand and document the effects of climate change on interactions between precipitation, vegetation growth, soil moisture and other landscape properties. These interactions are modeled within a framework we refer to as a virtual watershed (VW), a computer infrastructure that simulates watershed dynamics by linking scientific modeling, visualization, and data management components into a coherent whole. Developed and hosted at the Earth Data Analysis Center, University of New Mexico, the virtual watershed has a number of core functions which include: a) streamlined access to data required for model initialization and boundary conditions; b) the development of analytic scenarios through interactive visualization of available data and the storage of model configuration options; c) coupling of hydrological models through the rapid assimilation of model outputs into the data management system for access and use by subsequent models. The WC-WAVE virtual watershed accomplishes these functions by providing large-scale vector and raster data discovery, subsetting, and delivery via Open Geospatial Consortium (OGC) and REST web service standards. Central to the virtual watershed is the design and use of an innovative array of metadata elements that permits the stepwise coupling of diverse hydrological models (e.g. ISNOBAL, PRMS, CASiMiR) and input data to rapidly assess variation in outcomes under different climatic conditions. We present details on the architecture and functionality of the virtual watershed, results from three western U.S. watersheds, and discuss the realized benefits to watershed science of employing this integrated solution.

  14. Using a Web GIS Plate Tectonics Simulation to Promote Geospatial Thinking

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Anastasio, David; Sharif, Rajhida; Rutzmoser, Scott

    2016-01-01

    Learning with Web-based geographic information system (Web GIS) can promote geospatial thinking and analysis of georeferenced data. Web GIS can enable learners to analyze rich data sets to understand spatial relationships that are managed in georeferenced data visualizations. We developed a Web GIS plate tectonics simulation as a capstone learning…

  15. Real-Time Geospatial Data Viewer (RETIGO): Web-Based Tool for Researchers and Citizen Scientists to Explore their Air Measurements

    EPA Science Inventory

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data re...

  16. Using Geospatial Analysis to Align Little Free Library Locations with Community Literacy Needs

    ERIC Educational Resources Information Center

    Rebori, Marlene K.; Burge, Peter

    2017-01-01

    We used geospatial analysis tools to develop community maps depicting fourth-grade reading proficiency test scores and locations of facilities offering public access to reading materials (i.e., public libraries, elementary schools, and Little Free Libraries). The maps visually highlighted areas with struggling readers and areas without adequate…

  17. A Compilation of Provisional Karst Geospatial Data for the Interior Low Plateaus Physiographic Region, Central United States

    USGS Publications Warehouse

    Taylor, Charles J.; Nelson, Hugh L.

    2008-01-01

    Geospatial data needed to visualize and evaluate the hydrogeologic framework and distribution of karst features in the Interior Low Plateaus physiographic region of the central United States were compiled during 2004-2007 as part of the Ground-Water Resources Program Karst Hydrology Initiative (KHI) project. Because of their potential usefulness to environmental and water-resources regulators, private consultants, academic researchers, and others, the geospatial data files created during the KHI project are being made available to the public as a provisional regional karst dataset. To enhance accessibility and visualization, the geospatial data files have been compiled as ESRI ArcReader data folders and user-interactive Published Map Files (.pmf files), all of which are catalogued by the boundaries of surface watersheds using U.S. Geological Survey (USGS) eight-digit hydrologic unit codes (HUC-8s). Karst features included in the dataset are mapped sinkhole locations, sinking (or disappearing) streams, internally drained catchments, karst springs inventoried in the USGS National Water Information System (NWIS) database, relic stream valleys, and karst flow paths obtained from results of previously reported water-tracer tests.

  18. Characterization and visualization of the accuracy of FIA's CONUS-wide tree species datasets

    Treesearch

    Rachel Riemann; Barry T. Wilson

    2014-01-01

    Modeled geospatial datasets have been created for 325 tree species across the contiguous United States (CONUS). Effective application of all geospatial datasets depends on their accuracy. Dataset error can be systematic (bias) or unsystematic (scatter), and its magnitude can vary by region and scale. Each of these characteristics affects the locations, scales, uses,...

  19. Automating Geospatial Visualizations with Smart Default Renderers for Data Exploration Web Applications

    NASA Astrophysics Data System (ADS)

    Ekenes, K.

    2017-12-01

    This presentation will outline the process of creating a web application for exploring large amounts of scientific geospatial data using modern automated cartographic techniques. Traditional cartographic methods, including data classification, may inadvertently hide geospatial and statistical patterns in the underlying data. This presentation demonstrates how to use smart web APIs that quickly analyze the data when it loads and suggest the most appropriate visualizations based on the statistics of the data. Since only a few ways of visualizing any given dataset work well, and since many users never go beyond default values, it is imperative to provide smart default color schemes tailored to the dataset rather than static defaults. Multiple functions for automating visualizations are available in the smart APIs, along with UI elements allowing users to create more than one visualization for a dataset, since there isn't a single best way to visualize a given dataset. Because bivariate and multivariate visualizations are particularly difficult to create effectively, this automated approach takes the guesswork out of the process and provides a number of ways to generate multivariate visualizations for the same variables, letting the user choose which visualization is most appropriate for their presentation. The methods used in these APIs and the renderers generated by them are not available elsewhere. The presentation will show how statistics can be used as the basis for automating default visualizations of data along continuous ramps, creating more refined visualizations while revealing the spread and outliers of the data. Adding interactive components to instantaneously alter visualizations allows users to unearth spatial patterns previously unknown among one or more variables. These applications may focus on a single dataset that is frequently updated, or be configurable for a variety of datasets from multiple sources.
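
    The core idea, deriving renderer defaults from the data's own statistics rather than from fixed class breaks, can be sketched in a few lines. The following is an illustrative reimplementation in Python, not the vendor API described above:

    ```python
    import numpy as np

    def smart_stops(values, ramp=("#2b83ba", "#ffffbf", "#d7191c")):
        """Suggest color-ramp stops from the data's own statistics:
        center on the mean and stretch to +/- one standard deviation,
        so a continuous ramp shows spread without outliers washing it out."""
        v = np.asarray(values, dtype=float)
        v = v[np.isfinite(v)]
        mean, sd = v.mean(), v.std()
        lo, hi = max(v.min(), mean - sd), min(v.max(), mean + sd)
        return list(zip((lo, mean, hi), ramp))

    # Skewed sample data: most values small, a few extreme.
    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(10, 2, 950), rng.normal(60, 5, 50)])
    for value, color in smart_stops(data):
        print(f"{value:8.2f} -> {color}")
    ```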

  20. Effects of ensemble and summary displays on interpretations of geospatial uncertainty data.

    PubMed

    Padilla, Lace M; Ruginski, Ian T; Creem-Regehr, Sarah H

    2017-01-01

    Ensemble and summary displays are two widely used methods to represent visual-spatial uncertainty; however, there is disagreement about which is the most effective technique to communicate uncertainty to the general public. Visualization scientists create ensemble displays by plotting multiple data points on the same Cartesian coordinate plane. Despite their use in scientific practice, it is more common in public presentations to use visualizations of summary displays, which scientists create by plotting statistical parameters of the ensemble members. While prior work has demonstrated that viewers make different decisions when viewing summary and ensemble displays, it is unclear what components of the displays lead to diverging judgments. This study aims to compare the salience of visual features - or visual elements that attract bottom-up attention - as one possible source of diverging judgments made with ensemble and summary displays in the context of hurricane track forecasts. We report that salient visual features of both ensemble and summary displays influence participant judgment. Specifically, we find that salient features of summary displays of geospatial uncertainty can be misunderstood as displaying size information. Further, salient features of ensemble displays evoke judgments that are indicative of accurate interpretations of the underlying probability distribution of the ensemble data. However, when participants use ensemble displays to make point-based judgments, they may overweight individual ensemble members in their decision-making process. We propose that ensemble displays are a promising alternative to summary displays in a geospatial context but that decisions about visualization methods should be informed by the viewer's task.

  1. Analytical Hierarchy Process modeling for malaria risk zones in Vadodara district, Gujarat

    NASA Astrophysics Data System (ADS)

    Bhatt, B.; Joshi, J. P.

    2014-11-01

    Malaria is one of the most complex spatial epidemic problems in the world. According to the WHO, an estimated 627,000 deaths occurred due to malaria in 2012. In many developing nations with diverse ecological regions, it is still a large cause of human mortality. Owing to the incompleteness of epidemiological data and their spatial origin, quantifying the disease incidence burden for basic public health planning is a major constraint, especially in developing countries. The present study uses an integrated geospatial and multi-criteria evaluation (Analytical Hierarchy Process, AHP) technique to determine malaria risk zones. The study is conducted in Vadodara district, which comprises 12 talukas, 4 of which are predominantly tribal. Climatic and physical environmental factors, viz. rainfall, hydrogeomorphology, drainage, elevation, and land cover, are scored for their share in the evaluation of malariogenic conditions. The scores were synthesized on the basis of pairwise preferences over the factors, and the total weights of each data layer were computed and visualized. The district was divided into three zones, viz. high, moderate and low risk. It was observed that a geographical area of 1,885.2 sq km, comprising 30.3% of the district, falls in the high-risk zone. The risk zones identified on the basis of these parameters and their assigned weights show a close resemblance to ground conditions: the annual parasite incidence (API) distribution for 2011, when overlaid, corresponds to the risk zones identified. The study demonstrates the significance and promise of integrating geospatial tools and the Analytical Hierarchy Process for mapping malaria risk zones and the dynamics of malaria transmission.
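
    At the heart of AHP is turning a pairwise comparison matrix into factor weights via its principal eigenvector, plus a consistency check. A small sketch with numpy; the comparison values below are made up for illustration, not taken from the study:

    ```python
    import numpy as np

    # Pairwise comparison matrix (Saaty scale) for five illustrative factors:
    # rainfall, hydrogeomorphology, drainage, elevation, land cover.
    # A[i, j] = how much more factor i contributes to malaria risk than factor j.
    A = np.array([
        [1,   3,   3,   5,   7],
        [1/3, 1,   2,   3,   5],
        [1/3, 1/2, 1,   3,   3],
        [1/5, 1/3, 1/3, 1,   2],
        [1/7, 1/5, 1/3, 1/2, 1],
    ])

    # AHP weights are the normalized principal eigenvector of A.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()

    # Consistency ratio (CR < 0.1 is conventionally acceptable);
    # RI = 1.12 is Saaty's random index for n = 5.
    n, RI = A.shape[0], 1.12
    CI = (eigvals.real[k] - n) / (n - 1)
    print("weights:", np.round(w, 3), " CR:", round(CI / RI, 3))
    ```

    The resulting weights would then drive a weighted overlay of the rasterized factor layers to produce the risk surface.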

  2. Bridging the Gap between NASA Hydrological Data and the Geospatial Community

    NASA Technical Reports Server (NTRS)

    Rui, Hualan; Teng, Bill; Vollmer, Bruce; Mocko, David M.; Beaudoing, Hiroko K.; Nigro, Joseph; Gary, Mark; Maidment, David; Hooper, Richard

    2011-01-01

    There is a vast and ever-increasing amount of data on the Earth's interconnected energy and hydrological systems available from NASA remote sensing and modeling systems, and yet one challenge persists: increasing the usefulness of these data for, and thus their use by, the geospatial communities. The Hydrology Data and Information Services Center (HDISC), part of the Goddard Earth Sciences DISC, has continually worked to better understand the hydrological data needs of geospatial end users, so as to better bridge the gap between NASA data and the geospatial communities. This paper will cover some of the hydrological data sets available from HDISC, and the various tools and services developed for data searching, data subsetting, format conversion, online visualization and analysis, interoperable access, etc., to facilitate the integration of NASA hydrological data by end users. The NASA Goddard data analysis and visualization system, Giovanni, is described. Two case examples of user-customized data services are given, involving the EPA BASINS (Better Assessment Science Integrating point & Non-point Sources) project and the CUAHSI Hydrologic Information System, with the common requirement of on-the-fly retrieval of long-duration time series for a geographical point.
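
    On-the-fly retrieval of a long time series at a point is exactly what interoperable access protocols such as OPeNDAP make cheap, since subsetting happens server-side. A sketch; the endpoint, variable name, and coordinates are hypothetical:

    ```python
    from netCDF4 import Dataset, num2date  # netCDF4 opens OPeNDAP URLs directly
    import numpy as np

    # Hypothetical OPeNDAP endpoint for a gridded hydrology product; the server
    # subsets on its side, so only the requested point's series crosses the wire.
    url = "https://example.gov/opendap/hydro/soil_moisture.nc"

    with Dataset(url) as ds:
        lats, lons = ds["lat"][:], ds["lon"][:]
        i = np.abs(lats - 30.5).argmin()       # nearest grid cell to 30.5N, 97.7W
        j = np.abs(lons - (-97.7)).argmin()
        series = ds["soil_moisture"][:, i, j]  # long-duration series at one point
        times = num2date(ds["time"][:], ds["time"].units)

    print(len(series), "time steps; first:", times[0], series[0])
    ```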

  3. Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Hong, J. H.; Su, Y. T.

    2016-06-01

    With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting analyzed results, namely: virtual layers, informative windows, symbol transformation and an augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze the differences in temporality and quality of the geospatial resources, and the transformation of analyzed results into visual aids is executed automatically. It presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  4. Web mapping system for complex processing and visualization of environmental geospatial datasets

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor

    2016-04-01

    Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on SOA and might be considered complexes of interconnected software tools for working with geospatial data. In the report, a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. The GIS web client has three basic tiers: (1) a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; (2) a tier of JavaScript objects implementing methods for handling NetCDF metadata, the task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services; and (3) a graphical user interface (GUI) tier of JavaScript objects implementing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.). The middleware tier of JavaScript objects, which implements methods for handling geospatial metadata, the task XML object, and WMS/WFS cartographical services, interconnects the metadata and GUI tiers. The methods include such procedures as JSON metadata downloading and update, launching and tracking calculation tasks running on remote servers, and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of popular desktop GIS applications such as uDig, QuantumGIS, etc. The web mapping system developed has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
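
    To make the metadata tier concrete, a record of the kind described might look as follows; every field name and value here is a hypothetical illustration, not the system's actual schema:

    ```python
    import json

    # Hypothetical metadata record of the kind the client's metadata tier serves:
    # enough for the GUI to build menus and for the task object to validate requests.
    record = {
        "dataset": "ERA-Interim reanalysis",        # illustrative name
        "spatial_resolution_deg": 0.75,
        "temporal_resolution": "6h",
        "extent": {"lat": [-90, 90], "lon": [0, 360],
                   "time": ["1979-01-01", "2015-12-31"]},
        "parameters": ["air_temperature_2m", "total_precipitation"],
        "valid_methods": ["mean", "trend", "anomaly"],
        "output_formats": ["PNG", "GeoTIFF", "KML", "NetCDF"],
    }
    print(json.dumps(record, indent=2))
    ```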

  5. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from massive and heterogeneous resources. The past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge, for the following reasons. 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt periodic harvesting mechanisms to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution, but integrating them with existing services is challenging due to expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of value-added additional information (such as service quality and user feedback), which conveys important decision-support information, is missing. To address these issues, we prototyped a distributed search engine, GeoSearch, based on a brokering middleware framework to search, integrate and visualize heterogeneous geospatial resources. Specifically: 1) a lightweight discovery broker conducts distributed search, retrieving metadata records for geospatial resources and additional information from dispersed services (portals and catalogues) and other systems on the fly; 2) a quality monitoring and evaluation broker (i.e., QoS Checker) provides quality information for geospatial web services; 3) semantically assisted search and relevance evaluation functions are implemented by loosely interoperating with the ESIP Testbed component; and 4) sophisticated information and data visualization functionalities and tools are assembled to improve user experience and assist resource selection.

  6. Situational Awareness Geospatial Application (iSAGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based or custom data source and display it geospatially; allows user-friendly spatial analysis using custom-developed tools; and searches complex Geographic Information System (GIS) databases and accesses high-resolution imagery. iSAGA has application at the federal, state and local levels of emergency response, consequence management, law enforcement, emergency operations and other decision making, as a tool to provide complete, visual situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  7. Cultivating Research Skills: An interdisciplinary approach in training and supporting energy research

    NASA Astrophysics Data System (ADS)

    Winkler, H.; Carbajales-Dale, P.; Alschbach, E.

    2013-12-01

    Geoscience and energy research have essentially separate and diverse tracks and traditions, making the education process labor-intensive and burdensome. Using a combined-forces approach to training, a multidisciplinary workshop on information and data sources and research skills was developed and offered through several departments at Stanford University. The popular workshops taught required skills to scientists, giving training on new technologies, access to restricted energy-related scientific and government databases, search strategies for data-driven resources, and visualization and geospatial analytics. Feedback and data suggest these workshops were fundamental, as they set the foundation for subsequent learning opportunities for students and faculty. This session looks at the integration of the information workshops within multiple energy and geoscience programs and the importance of formally cultivating research and information skills.

  8. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    NASA Astrophysics Data System (ADS)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for the planning of operations, decision-making and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles used over the decades by reviewing the evolution of selected maps by the office, noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through the presentation and analysis of these maps, the changing dynamics of the Organization's information management can be seen, along with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  9. Geospatial Technology and Geosciences - Defining the skills and competencies in the geosciences needed to effectively use the technology (Invited)

    NASA Astrophysics Data System (ADS)

    Johnson, A.

    2010-12-01

    Maps, spatial and temporal data, and their use in analysis and visualization are integral components of studies in the geosciences. With the emergence of geospatial technology (Geographic Information Systems (GIS), remote sensing and imagery, Global Positioning Systems (GPS) and mobile technologies), scientists and the geosciences user community are now able to more easily access and share data, analyze their data, and present their results. Educators are also incorporating geospatial technology into their geosciences programs, from building awareness of the technology in introductory courses to exploring its capabilities in advanced courses to help answer complex questions in the geosciences. This paper will look at how the new Geospatial Technology Competency Model from the Department of Labor can help ensure that geosciences programs address the skills and competencies identified by the workforce for geospatial technology, and at new tools created by the GeoTech Center to support self- and program assessment.

  10. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. Yet the integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allowing easy integration of climate datasets with geospatial datasets and providing sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application, using NASA World Wind technology, that integrates the analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level, the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the THREDDS Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.

  11. ArcGIS Framework for Scientific Data Analysis and Serving

    NASA Astrophysics Data System (ADS)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

    ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volumes. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g. rasters or images) as a hypercube and enables ArcGIS to handle large-volume and near-real-time scientific data. Built on top of the geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes, and the parallel geoprocessing engine makes data ingestion fast and easy. A raster function, the definition of a raster processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation on variables ingested into the MDMD: for example, aggregating monthly averages from daily data, computing the total rainfall of a year, calculating heat indices for forecast data, or identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis, and the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. In addition, MODIS land cover, the NDFD weather service, and the HYCOM ocean model will be used to illustrate how the ArcGIS platform and MDMD model can facilitate scientific data visualization and analytics, and how analysis results can be shared with a wider audience through ArcGIS Online and Portal.
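
    The hypercube operations described (slicing by time, aggregating daily values to monthly) are easiest to show outside the proprietary stack; the sketch below performs equivalent computations with the open-source xarray library on synthetic data, purely to illustrate the data model, not the ArcGIS API:

    ```python
    import numpy as np
    import pandas as pd
    import xarray as xr  # open-source stand-in for the MDMD + raster-function idea

    # A year of synthetic daily rainfall on a small grid: a (time, y, x) hypercube.
    time = pd.date_range("2014-01-01", "2014-12-31", freq="D")
    rain = xr.DataArray(
        np.random.default_rng(1).gamma(2.0, 1.5, size=(time.size, 4, 4)),
        coords={"time": time}, dims=("time", "y", "x"), name="rainfall")

    monthly_total = rain.resample(time="1MS").sum()   # aggregate daily -> monthly
    annual_total = rain.sum("time")                   # total rainfall of the year
    july_slice = rain.sel(time="2014-07-15")          # one slice of the hypercube

    print(monthly_total.shape, float(annual_total.mean()))
    ```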

  12. The Role of Visualization in Learning from Computer-Based Images. Research Report

    ERIC Educational Resources Information Center

    Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.

    2005-01-01

    Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and…

  13. Learning topography with Tangible Landscape games

    NASA Astrophysics Data System (ADS)

    Petrasova, A.; Tabrizian, P.; Harmon, B. A.; Petras, V.; Millar, G.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Understanding topography and its representations is crucial for the correct interpretation and modeling of surface processes. However, novice earth science and landscape architecture students often find reading topographic maps challenging. As a result, many students struggle to comprehend more complex spatial concepts and processes such as flow accumulation or sediment transport. We developed and tested a new method for teaching hydrology, geomorphology, and grading using Tangible Landscape, a tangible interface for geospatial modeling. Tangible Landscape couples a physical and a digital model of a landscape through a real-time cycle of hands-on modeling, 3D scanning, geospatial computation, and projection. With Tangible Landscape, students can sculpt a projection-augmented topographic model of a landscape with their hands and use a variety of tangible objects to immediately see how they are changing geospatial analytics such as contours, profiles, water flow, or landform types. By feeling and manipulating the shape of the topography, while seeing projected geospatial analytics, students can intuitively learn about 3D topographic form, its representations, and how topography controls physical processes. Tangible Landscape is powered by GRASS GIS, an open-source geospatial platform with extensive libraries for geospatial modeling and analysis. As such, Tangible Landscape can be used to design a wide range of learning experiences across a large number of geoscience disciplines. As part of a graduate-level course that teaches grading, 16 students participated in a series of workshops, which were developed as serious games to encourage learning through structured play. These serious games included 1) diverting rainwater to a specified location with minimal changes to the landscape, 2) building different combinations of landforms, and 3) reconstructing landscapes based on projected contour information, with feedback. In this poster, we will introduce Tangible Landscape and describe the games and their implementation. We will then present preliminary results of a user experience survey we conducted as part of the workshops. All developed materials and software are open source and available online.
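
    Because the analytics run in GRASS GIS, each scan cycle amounts to a few module calls through the GRASS Python scripting API. A sketch of what one cycle's analysis step might look like; the raster names are hypothetical, and the script assumes it is run inside a GRASS session:

    ```python
    # Runs inside a GRASS GIS session (e.g. started with `grass --exec python ...`).
    # Raster names are hypothetical; `scan_dem` stands for the DEM from the latest scan.
    import grass.script as gs

    def analyze_scan(scan="scan_dem"):
        # Contours for reading topographic form.
        gs.run_command("r.contour", input=scan, output="contours",
                       step=1, overwrite=True)
        # Simulated flow accumulation over the sculpted surface.
        gs.run_command("r.watershed", elevation=scan,
                       accumulation="flow_accum", overwrite=True)
        # Landform classification (peaks, ridges, valleys, ...).
        gs.run_command("r.geomorphon", elevation=scan,
                       forms="landforms", overwrite=True)

    analyze_scan()
    ```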

  14. Plug-and-play web-based visualization of mobile air monitoring data

    EPA Science Inventory

    The collection of air measurements in real-time on moving platforms, such as wearable, bicycle-mounted, or vehicle-mounted air sensors, is becoming an increasingly common method to investigate local air quality. However, visualizing and analyzing geospatial air monitoring data r...

  15. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  16. Robert Spencer | NREL

    Science.gov Websites

    Research interests: remote sensing, natural resource modeling, machine learning. Areas of expertise: geospatial analysis, data visualization, algorithm development, modeling.

  17. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and support visualization. Currently, many human activities are moving toward the third dimension, e.g. land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people who are interested in this industry; we would like to introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level-of-detail (LOD) 4 data, i.e. photo-realistically textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we describe the composition and infrastructure of the web-based 3D map service system and present a comparison of V-World with the Google Earth service. We also present Open-API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent invasions of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and advanced geospatial information policy is emphasized in Korea. Thus, further progress in Korea's spatial information industry is expected in the near future.

  18. Analyzing a 35-Year Hourly Data Record: Why So Difficult?

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    At the Goddard Distributed Active Archive Center, we have recently added a 35-year record of output data from the North American Land Data Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers users a variety of data summarization and visualization capabilities that operate at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data have proven surprisingly resistant to the application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations at both the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command-line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These serve as an early warning sign for the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid such issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.
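
    One standard mitigation for the memory-overflow class of failure is streaming: computing summaries incrementally, one granule at a time, instead of materializing the whole record. A sketch; the file pattern and variable name are hypothetical:

    ```python
    import glob
    import numpy as np
    from netCDF4 import Dataset

    # Incremental (streaming) mean over ~307,000 hourly grids: one file in memory
    # at a time, so the climatology never exceeds a single grid's footprint.
    total, count = None, 0
    for path in sorted(glob.glob("NLDAS_*.nc")):   # hypothetical file layout
        with Dataset(path) as ds:
            grid = ds["soil_moisture"][0, :, :].astype("f8")
        total = grid if total is None else total + grid
        count += 1

    climatology = total / count
    print(f"{count} hourly grids averaged; mean = {float(climatology.mean()):.3f}")
    ```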

  19. Estuary Data Mapper (EDM)

    EPA Pesticide Factsheets

    Estuary Data Mapper is a tool for geospatial data discovery, visualization, and data download for any of the approximately 2,000 estuaries and associated watersheds along the five US coastal regions.

  20. Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.

    2009-12-01

    Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly impacts the triggering and behavior of urban flooding. However, no general-purpose tools yet exist for deriving rainfall data and rendering them in real time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e. urban hydrologic unit) scale in real time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach which takes advantage of the scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming-data-aware semantic content management repository, web-based Google Earth/Maps, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. An animated, color-coded rainfall map of the sewershed can be played in real time as a movie, using time-aware KML inside browser-based Google Earth, for visually analyzing the spatiotemporal patterns of rainfall intensity in the sewershed. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Our further work includes incorporating additional data (such as basement flooding event data) or physics-based predictive models that can be used for more integrated data-driven decision support.
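
    Time-aware KML simply attaches a TimeSpan to each feature so Google Earth's time slider drives the animation. A sketch of emitting one color-coded polygon per virtual sensor and time step; the geometry, times, and colors are illustrative:

    ```python
    # A minimal time-aware KML: one color-coded polygon per virtual sensor and
    # time step, with <TimeSpan> so Google Earth's time slider animates it.
    def placemark(coords, start, end, kml_color):
        ring = " ".join(f"{lon},{lat},0" for lon, lat in coords)
        return f"""  <Placemark>
        <TimeSpan><begin>{start}</begin><end>{end}</end></TimeSpan>
        <Style><PolyStyle><color>{kml_color}</color></PolyStyle></Style>
        <Polygon><outerBoundaryIs><LinearRing>
          <coordinates>{ring}</coordinates>
        </LinearRing></outerBoundaryIs></Polygon>
      </Placemark>"""

    sewershed = [(-87.66, 41.88), (-87.63, 41.88), (-87.63, 41.90),
                 (-87.66, 41.90), (-87.66, 41.88)]
    # KML colors are aabbggrr; semi-transparent green, then red as intensity rises.
    frames = [("2009-09-13T14:00:00Z", "2009-09-13T14:05:00Z", "7f00ff00"),
              ("2009-09-13T14:05:00Z", "2009-09-13T14:10:00Z", "7f0000ff")]

    body = "\n".join(placemark(sewershed, b, e, c) for b, e, c in frames)
    kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
           f"{body}\n</Document></kml>")
    with open("rainfall_animation.kml", "w") as f:
        f.write(kml)
    ```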

  1. About Estuary Data Mapper (EDM)

    EPA Pesticide Factsheets

    Estuary Data Mapper is a tool for geospatial data discovery, visualization, and data download for any of the approximately 2,000 estuaries and associated watersheds along the five US coastal regions.

  2. Geovisualization applications to examine and explore high-density and hierarchical critical infrastructure data

    NASA Astrophysics Data System (ADS)

    Edsall, Robert; Hembree, Harvey

    2018-05-01

    The geospatial research and development team in the National and Homeland Security Division at Idaho National Laboratory was tasked with providing tools to derive insight from the substantial amount of data currently available - and continuously being produced - associated with the critical infrastructure of the US. This effort is in support of the Department of Homeland Security, whose mission includes the protection of this infrastructure and the enhancement of its resilience to hazards, both natural and human. We present geovisual-analytics-based approaches for the analysis of vulnerabilities and resilience of critical infrastructure, designed so that decision makers, analysts, and infrastructure owners and managers can manage risk, prepare for hazards, and direct resources before and after an incident that might result in an interruption in service. Our designs are based on iterative discussions with DHS leadership and analysts, who in turn will use these tools to explore and communicate data in partnership with utility providers, law enforcement, and emergency response and recovery organizations, among others. In most cases these partners desire summaries of large amounts of data, but increasingly, our users seek the additional capability of focusing on, for example, a specific infrastructure sector, a particular geographic region or time period, or of examining data at a variety of generalization or aggregation levels. These needs align well with tenets of information-visualization design; in this paper, selected applications among those that we have designed are described and positioned within geovisualization, geovisual analytics, and information visualization frameworks.

  3. Improving the Accessibility and Use of NASA Earth Science Data

    NASA Technical Reports Server (NTRS)

    Tisdale, Matthew; Tisdale, Brian

    2015-01-01

    Many of the NASA Langley Atmospheric Science Data Center (ASDC) Distributed Active Archive Center (DAAC) multidimensional tropospheric and atmospheric chemistry data products are stored in HDF4, HDF5 or NetCDF format, which traditionally have been difficult to analyze and visualize with geospatial tools. With rising demand from diverse end-user communities for geospatial tools that handle multidimensional products, several applications, such as ArcGIS, have refined their software. Many geospatial applications now have new functionalities that enable the end user to: store, serve, and perform analysis on each individual variable, its time dimension, and its vertical dimension; use NetCDF, GRIB, and HDF raster data formats directly across applications; and publish output within REST image services or WMS for time- and space-enabled web application development. During this webinar, participants will learn how to leverage geospatial applications such as ArcGIS, OPeNDAP and ncWMS in the production of Earth science information and in increasing data accessibility and usability.

  4. GeoSpatial Workforce Development: enhancing the traditional learning environment in geospatial information technology

    NASA Astrophysics Data System (ADS)

    Lawhead, Pamela B.; Aten, Michelle L.

    2003-04-01

    The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by the foremost experts within the remote sensing and GIS industries. Virtual classrooms equipped with the most advanced instructional, computational, communication, course evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focuses on recruiting additional industry experts to develop the technical content of the courseware and then utilizing state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media: Internet, CD-ROM, DVD, and compressed video, which translates into anywhere, anytime delivery of GeoSpatial Information Technology education.

  5. Recent Advances in Geospatial Visualization with the New Google Earth

    NASA Astrophysics Data System (ADS)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, worldwide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite KML's ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension that simplifies the display of multi-resolution map tile pyramids, which can be created by analysis platforms like Google Earth Engine or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
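
    A tile pyramid indexes the world at every zoom level, so producers and consumers only need to agree on the indexing arithmetic. A sketch of the standard Web Mercator ("slippy map") scheme used by most tile production pipelines; the test point is arbitrary:

    ```python
    import math

    # Which tile in a multi-resolution pyramid covers a point at zoom level z?
    def tile_for(lat, lon, z):
        n = 2 ** z                                   # tiles per axis at this zoom
        x = int((lon + 180.0) / 360.0 * n)
        lat_r = math.radians(lat)
        y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r))
                 / math.pi) / 2.0 * n)
        return z, x, y

    # The same point at increasing zooms walks down the pyramid.
    for z in (2, 8, 14):
        print(tile_for(37.42, -122.08, z))
    ```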

  6. CELL5M: A geospatial database of agricultural indicators for Africa South of the Sahara.

    PubMed

    Koo, Jawoo; Cox, Cindy M; Bacou, Melanie; Azzarri, Carlo; Guo, Zhe; Wood-Sichra, Ulrike; Gong, Queenie; You, Liangzhi

    2016-01-01

    Recent progress in large-scale georeferenced data collection is widening opportunities for combining multi-disciplinary datasets from biophysical to socioeconomic domains, advancing our analytical and modeling capacity. Granular spatial datasets provide critical information necessary for decision makers to identify target areas, assess baseline conditions, prioritize investment options, set goals and targets, and monitor impacts. However, key challenges in reconciling data across themes, scales and borders restrict our capacity to produce global and regional maps and time series. This paper provides the overview, structure and coverage of CELL5M, an open-access database of geospatial indicators at 5 arc-minute grid resolution, and introduces a range of analytical applications and use cases. CELL5M covers a wide set of agriculture-relevant domains for all countries in Africa South of the Sahara and supports our understanding of the multi-dimensional spatial variability inherent in farming landscapes throughout the region.
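
    For scale: 5 arc-minutes is 1/12 of a degree, so a global grid at this resolution has 4320 x 2160 cells. A sketch of the indexing arithmetic for such a grid; the row/column origin chosen here is an assumption, not CELL5M's documented layout:

    ```python
    # 5 arc-minutes = 1/12 degree; the global grid is 4320 x 2160 cells.
    CELL = 5 / 60.0  # cell size in degrees

    def cell_index(lat, lon):
        col = int((lon + 180.0) / CELL)    # 0 .. 4319, west to east
        row = int((90.0 - lat) / CELL)     # 0 .. 2159, north to south (assumed origin)
        return row, col

    def cell_center(row, col):
        return 90.0 - (row + 0.5) * CELL, -180.0 + (col + 0.5) * CELL

    r, c = cell_index(9.03, 38.74)         # Addis Ababa, illustrative point
    print(r, c, cell_center(r, c))
    ```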

  7. Downloading and Installing Estuary Data Mapper (EDM)

    EPA Pesticide Factsheets

    Estuary Data Mapper is a tool for geospatial data discovery, visualization, and data download for any of the approximately 2,000 estuaries and associated watersheds along the five US coastal regions.

  8. Frequent Questions about Estuary Data Mapper (EDM)

    EPA Pesticide Factsheets

    Estuary Data Mapper is a tool for geospatial data discovery, visualization, and data download for any of the approximately 2,000 estuaries and associated watersheds along the five US coastal regions.

  9. NASA Sea Level Change Portal - It's not just another portal site

    NASA Astrophysics Data System (ADS)

    Huang, T.; Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Moore, B.; Moore, J.; Boeck, A.

    2017-12-01

    The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is designed as a "one-stop" source for current sea level change information, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. With increasing global temperatures warming the ocean and melting ice sheets and glaciers, there is an immediate need both to accelerate sea level change research and to make this research accessible to scientists in disparate disciplines, to the general public, and to policy makers and businesses. The immersive and innovative NASA portal, which debuted at the 2015 AGU meeting, attracts thousands of daily visitors and has over 30K followers on Facebook®. Behind its intuitive interface is an extensible architecture that integrates site content, data from various sources, visualization, horizontally scalable geospatial data analytic technology (called NEXUS), and an interactive 3D simulation platform (called the Virtual Earth System Laboratory). We will present an overview of the portal and some of our architectural decisions, along with a discussion of our open-source, cloud-based data analytic technology that enables on-the-fly analysis of heterogeneous data.

  10. GIS and paleoanthropology: incorporating new approaches from the geospatial sciences in the analysis of primate and human evolution.

    PubMed

    Anemone, R L; Conroy, G C; Emerson, C W

    2011-01-01

    The incorporation of research tools and analytical approaches from the geospatial sciences is a welcome trend for the study of primate and human evolution. The use of remote sensing (RS) imagery and geographic information systems (GIS) allows vertebrate paleontologists, paleoanthropologists, and functional morphologists to study fossil localities, landscapes, and individual specimens in new and innovative ways that recognize and analyze the spatial nature of much paleoanthropological data. Whether one is interested in locating and mapping fossiliferous rock units in the field, creating a searchable and georeferenced database to catalog fossil localities and specimens, or studying the functional morphology of fossil teeth, bones, or artifacts, the new geospatial sciences provide an essential element in modern paleoanthropological inquiry. In this article we review recent successful applications of RS and GIS within paleoanthropology and related fields and argue for the importance of these methods for the study of human evolution in the twenty-first century. We argue that the time has come for the inclusion of geospatial specialists in all interdisciplinary field research in paleoanthropology, and we suggest some promising areas for the development and application of the methods of geospatial science to the science of human evolution. Copyright © 2011 Wiley Periodicals, Inc.

  11. Visualization Beyond the Map: The Challenges of Managing Data for Re-Use

    NASA Astrophysics Data System (ADS)

    Allison, M. D.; Groman, R. C.; Chandler, C. L.; Galvarino, C. R.; Wiebe, P. H.; Glover, D. M.

    2012-12-01

    The Biological and Chemical Oceanography Data Management Office (BCO-DMO) makes data publicly accessible via both a text-based and a geospatial interface, the latter using the Open Geospatial Consortium (OGC) compliant open-source MapServer software originally from the University of Minnesota. Making data available for reuse by the widest variety of users is one of the overriding goals of BCO-DMO and one of our greatest challenges. The biogeochemical, ecological and physical data we manage are extremely heterogeneous. Although it is not possible to be all things to all people, we are actively working on ways to make the data re-usable by the most people. Looking at data in a different way is one of the underpinnings of data re-use, and the easier we can make data accessible, the more the community of users will benefit. We can help users determine usefulness by providing some specific tools. Sufficiently informative metadata can often be enough to determine fitness for purpose, but many times our geospatial interface to the data and metadata is more compelling. Displaying the data visually in as many ways as possible enables the scientist, teacher or manager to decide whether the data are useful, and being able to download the data right away, with no login required, is very attractive. We will present ways of visualizing different kinds of data and discuss using metadata to drive the visualization tools. We will also discuss our attempts to work with data providers to organize their data in ways that make them reusable to the largest audience, and to solicit input from data users about the effectiveness of our solutions.
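
    Because the geospatial interface is OGC-compliant, its layers can in principle be fetched by any standard WMS client. A minimal sketch of a WMS GetMap request in Python follows; the endpoint URL and layer name are illustrative placeholders, not BCO-DMO's actual service:

```python
import requests

# Fetch a rendered map image from an OGC Web Map Service endpoint.
# The URL and layer name below are illustrative assumptions.
WMS_URL = "https://example.org/cgi-bin/mapserv"

params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "cruise_tracks", "STYLES": "",
    "SRS": "EPSG:4326", "BBOX": "-75,35,-60,45",   # lon/lat bounding box
    "WIDTH": 600, "HEIGHT": 400, "FORMAT": "image/png",
}
resp = requests.get(WMS_URL, params=params, timeout=30)
resp.raise_for_status()
with open("cruises.png", "wb") as f:
    f.write(resp.content)
```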

  12. Design and Development of a Framework Based on OGC Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    NASA Astrophysics Data System (ADS)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing 3D large-scale geospatial data streamed over the web. WebGISes have traditionally been bound to a simplified two-dimensional representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and therefore allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  13. Ibmdbpy-spatial: An Open-source implementation of in-database geospatial analytics in Python

    NASA Astrophysics Data System (ADS)

    Roy, Avipsa; Fouché, Edouard; Rodriguez Morales, Rafael; Moehler, Gregor

    2017-04-01

    As the amount of spatial data acquired from several geodetic sources has grown over the years and as data infrastructure has become more powerful, the need for adoption of in-database analytic technology within the geosciences has grown rapidly. In-database analytics on spatial data stored in a traditional enterprise data warehouse enables much faster retrieval and analysis for making better predictions about risks and opportunities, identifying trends, and spotting anomalies. Although a number of open-source spatial analysis libraries like geopandas and shapely are available today, most of them are restricted to the manipulation and analysis of geometric objects, with a dependency on GEOS and similar libraries. We present an open-source software package, written in Python, to fill the gap between spatial analysis and in-database analytics. Ibmdbpy-spatial provides a geospatial extension to the ibmdbpy package, implemented in 2015. It provides an interface for spatial data manipulation and access to in-database algorithms in IBM dashDB, a data warehouse platform with a spatial extender that runs as a service on IBM's cloud platform, Bluemix. Working in-database reduces network overload: the complete dataset need not be replicated onto the user's local system, and only the required subset is fetched into memory at any one time. Ibmdbpy-spatial accelerates Python analytics by seamlessly pushing operations written in Python into the underlying database for execution using the dashDB spatial extender, thereby benefiting from in-database performance-enhancing features such as columnar storage and parallel processing. The package currently supports Python versions 2.7 through 3.4. The basic architecture of the package consists of three main components: 1) a connection to dashDB represented by the instance IdaDataBase, which uses a middleware API (pypyodbc or jaydebeapi) to establish the database connection via ODBC or JDBC, respectively; 2) an instance representing the spatial data stored in the database as a dataframe in Python, called the IdaGeoDataFrame, with a specific geometry attribute that recognises a planar geometry column in dashDB; and 3) Python wrappers for spatial functions such as within, distance, area, buffer, and more, which dashDB currently supports, to make querying from Python much simpler for users. The spatial functions translate well-known geopandas-like syntax into SQL queries, utilising the database connection to perform spatial operations in-database, and can operate on single geometries as well as on geometries from two different IdaGeoDataFrames. The in-database queries strictly follow the standards of the OpenGIS Implementation Specification for Geographic information - Simple feature access for SQL. The results of the operations can thereby be accessed dynamically via interactive Jupyter notebooks from any system that supports Python, without additional dependencies, and can also be combined with other open-source libraries available within Jupyter notebooks, such as matplotlib and folium, for visualization purposes. We built a use case analysing crime hotspots in New York City to validate our implementation and visualized the results as a choropleth map for each borough.
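
    Based on the component names given above, a typical session might look like the following sketch. The IdaDataBase and IdaGeoDataFrame names and the area/buffer wrappers come from the abstract, while the import path, constructor arguments, and exact method signatures are assumptions:

```python
# A minimal sketch of the geopandas-like, in-database workflow described
# above. Class names come from the abstract; the module path, constructor
# arguments, and method signatures are assumptions, not a documented API.
from ibmdbpy import IdaDataBase
from ibmdbpy.geo import IdaGeoDataFrame  # assumed import path

idadb = IdaDataBase(dsn="DASHDB")        # ODBC connection via pypyodbc

boroughs = IdaGeoDataFrame(idadb, "NYC_BOROUGHS", geometry="SHAPE")
crimes = IdaGeoDataFrame(idadb, "NYC_CRIMES", geometry="LOCATION")

# Each call below would be translated into an OGC simple-feature SQL query
# executed inside dashDB; only the result set returns to the client.
boroughs["area_m2"] = boroughs.area()    # ST_Area computed in-database
zones = crimes.buffer(100.0)             # 100 m buffers around incidents
```

    Results fetched this way could then be rendered locally, for example as the borough-level choropleth described in the New York use case.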

  14. Two Contrasting Approaches to Building High School Teacher Capacity to Teach About Local Climate Change Using Powerful Geospatial Data and Visualization Technology

    NASA Astrophysics Data System (ADS)

    Zalles, D. R.

    2011-12-01

    The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF-funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing the results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of the challenges teachers and students encounter in navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that providing a scaffolded, technology-supported process for instructional design, starting from fundamental questions about the content domain, will lead teachers to make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners, building on "starter" lessons that cover focal skills and understandings, will lead to the most effective uses of the technology in the classroom. The projects' goals and strategies for student learning proceed from research suggesting that students are more engaged and better able to utilize prior knowledge when they see the local, and hence personal, relevance of climate change and other pressing contemporary science-related issues. In these projects, the students look for climate change trends in geospatial Earth System data layers from weather stations, satellites, and models in relation to global trends. They examine these data to (1) reify what they are learning in science class about meteorology, climate, and ecology, (2) build inquiry skills by posing and seeking answers to research questions, and (3) build data literacy skills through experience generating appropriate data queries and examining data output on different forms of geospatial representation such as maps, elevation profiles, and time series plots. Teachers are also given the opportunity to have their students look at geospatially represented census data from the tool Social Explorer (http://www.socialexplorer.com/pub/maps/home.aspx) in order to better understand demographic trends in relation to climate change-related trends in the Earth system. Early results about teacher professional development and student learning, gleaned from interviews and observations, will be reported.

  15. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    NASA Technical Reports Server (NTRS)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    Background: SERVIR -- the Regional Visualization and Monitoring System -- helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions address eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms. Given that the SERVIR regions are experiencing increased stress from greater climate variability than historically observed, SERVIR provides information to support the development of adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.

  16. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    NASA Astrophysics Data System (ADS)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging, because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA's) Climate Prediction Center (CPC) climate outlooks, which provide probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1- and 3-month timeframes. Users were asked to complete questions about how they use weather information, how uncertainty is represented, and the design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm that they better meet the needs of users.

  17. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS for scientific data management, processing, and analysis, as well as for creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on-premises or in the cloud), disseminate them dynamically, process and analyze them on-the-fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster functions can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.

  18. Spatial Data Services for Interdisciplinary Applications from the NASA Socioeconomic Data and Applications Center

    NASA Astrophysics Data System (ADS)

    Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.

    2016-12-01

    The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
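
    A spatial query to the PES might look like the following sketch. The abstract confirms only that the service accepts a user-defined polygon or circle and a target year and returns a population estimate with accuracy measures; the endpoint URL and parameter names below are hypothetical:

```python
import requests

# Query a population-estimation service with a user-defined polygon.
# The endpoint URL and JSON parameter names are hypothetical placeholders.
PES_URL = "https://example.org/sedac/pes/v2/estimate"

polygon = {
    "type": "Polygon",  # GeoJSON ring around an illustrative area
    "coordinates": [[[-74.02, 40.70], [-73.97, 40.70],
                     [-73.97, 40.75], [-74.02, 40.75],
                     [-74.02, 40.70]]],
}
resp = requests.post(PES_URL, json={"geometry": polygon, "year": 2015},
                     timeout=30)
resp.raise_for_status()
print(resp.json())  # expected: a population estimate plus accuracy measures
```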

  19. Using Globe Browsing Systems in Planetariums to Take Audiences to Other Worlds.

    NASA Astrophysics Data System (ADS)

    Emmart, C. B.

    2014-12-01

    For the last decade planetariums have been adding "full dome video" systems capable of both movie playback and interactive display. True scientific data visualization has now come to planetarium audiences as a means to display the actual three-dimensional layout of the universe, the time-based array of planets, minor bodies, and spacecraft across the solar system, and now globe-browsing systems that examine planetary bodies to the limits of acquired resolution. Additionally, such planetarium facilities can be networked for simultaneous display across the world, widening audience reach and access to authoritative scientist description and commentary. Data repositories such as NASA's Lunar Mapping and Modeling Project (LMMP), NASA GSFC's LANCE-MODIS, and others conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) protocol make geospatial data available to a growing number of dome-supporting globe visualization systems. The immersive surround graphics of full dome video engage our visual system to create authentic virtual scenes, effectively placing audiences on location, in some cases on other worlds mapped only robotically.

  20. Browsing and Visualization of Linked Environmental Data

    NASA Astrophysics Data System (ADS)

    Nikolaou, Charalampos; Kyzirakos, Kostis; Bereta, Konstantina; Dogani, Kallirroi; Koubarakis, Manolis

    2014-05-01

    Linked environmental data has started to appear on the Web as environmental researchers make use of technologies such as ontologies, RDF, and SPARQL. Many of these datasets have an important geospatial and temporal dimension. The same is true of the Web of data, which is being rapidly populated not only with geospatial information but also with temporal information. As the real-world entities represented in linked geospatial datasets evolve over time, the datasets themselves get updated, and both the spatial and the temporal dimension of the data become significant for users. For example, in the Earth observation and environment domains, data is constantly produced by satellite sensors and is associated with metadata containing, among others, temporal attributes such as the time an image was acquired. In addition, acquisitions are considered valid for specific periods of time, for example until they are superseded by new acquisitions. Satellite acquisitions might be utilized in applications such as the CORINE Land Cover programme operated by the European Environment Agency, which makes the land cover of European areas available as a cartographic product. Periodically, CORINE publishes the changes in the land cover of these areas in the form of changesets. Tools for exploiting this abundance of geospatial information have also started to emerge. However, these tools are designed for browsing a single data source, and they cannot represent the temporal dimension. This is for two reasons: a) the lack of an implementation of a data model and query language with temporal features covering the various semantics associated with the representation of time (e.g., valid and user-defined time), and b) the lack of a standard temporal extension of RDF that practitioners could rely on when publishing RDF data. Recently, we presented the temporal features of the data model stRDF, the query language stSPARQL, and their implementation in the geospatial RDF store Strabon (http://www.strabon.di.uoa.gr/), which, apart from querying geospatial information, can also be used to query both the valid time of a triple and user-defined time. With the aim of filling the aforementioned gaps and going beyond data exploration to map creation and sharing, we have designed and developed SexTant (http://sextant.di.uoa.gr/). SexTant can be used to produce thematic maps by layering spatiotemporal information drawn from a number of data sources, ranging from standard SPARQL endpoints, to SPARQL endpoints following the GeoSPARQL standard defined by the Open Geospatial Consortium (OGC) for the modelling and querying of geospatial information, to other well-adopted geospatial file formats such as KML and GeoJSON. In this work, we pick some real use cases from the environment domain to showcase the usefulness of SexTant to the environmental studies of a domain expert, presenting its browsing and visualization capabilities on a number of environmental datasets that we have published as linked data, as well as other geospatial data sources publicly available on the Web, such as KML files.
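
    As a flavour of the queries such an endpoint answers, here is a sketch using the SPARQLWrapper Python library against a hypothetical Strabon endpoint. The geo:/geof: namespaces are standard OGC GeoSPARQL; the endpoint URL and the clc: land-cover vocabulary are invented for illustration:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical Strabon endpoint; the clc: vocabulary is illustrative only.
endpoint = SPARQLWrapper("http://example.org/strabon/endpoint")
endpoint.setQuery("""
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX clc:  <http://example.org/corine#>

SELECT ?area ?wkt WHERE {
  ?area a clc:LandCoverArea ;
        geo:hasGeometry/geo:asWKT ?wkt .
  # keep only areas inside an illustrative bounding polygon
  FILTER(geof:sfWithin(?wkt, "POLYGON((23.5 37.8, 24.1 37.8, 24.1 38.2, 23.5 38.2, 23.5 37.8))"^^geo:wktLiteral))
}
""")
endpoint.setReturnFormat(JSON)
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["area"]["value"], row["wkt"]["value"])
```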

  1. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  2. Geospatial health: the first five years.

    PubMed

    Utzinger, Jürg; Rinaldi, Laura; Malone, John B; Krauth, Stefanie J; Kristensen, Thomas K; Cringoli, Giuseppe; Bergquist, Robert

    2011-11-01

    Geospatial Health is an international, peer-reviewed scientific journal produced by the Global Network for Geospatial Health (GnosisGIS). This network was founded in 2000, and the inaugural issue of its official journal was published in November 2006 with the aim to cover all aspects of geographical information system (GIS) applications, remote sensing and other spatial analytic tools focusing on human and veterinary health. The University of Naples Federico II is the publisher, producing two issues per year, both as hard copy and an open-access online version. The journal is referenced in major databases, including CABI, ISI Web of Knowledge and PubMed. In 2008, it was assigned its first impact factor (1.47), which has now reached 1.71. Geospatial Health is managed by an editor-in-chief and two associate editors, supported by five regional editors and a 23-member editorial board. This overview takes stock of the first five years of publishing: 133 contributions have been published so far, primarily original research (79.7%), followed by reviews (7.5%), announcements (6.0%), editorials and meeting reports (3.0% each) and a preface in the first issue. A content analysis of all the original research articles and reviews reveals that three quarters of the publications focus on human health, with the remainder dealing with veterinary health. Two thirds of the papers come from Africa, Asia and Europe, with similar numbers of contributions from each continent. Studies of more than 35 different diseases, injuries and risk factors have been presented. Malaria and schistosomiasis were identified as the two most important diseases (11.2% each). Almost half the contributions were based on GIS, one third on spatial analysis, often using advanced Bayesian geostatistics (13.8%), and one quarter on remote sensing. The 120 original research articles, reviews and editorials were produced by 505 authors based at institutions and universities in 52 countries. Importantly, a considerable proportion of the authors come from countries with a low or medium human development index (29.3%). In view of the increasing number of submissions, we are considering publishing more than two issues per year in the future. Finally, our vision is to open up a new section predominantly based on visual presentations, including brief video clips, as discussed in a symposium at the 60th annual meeting of the American Society of Tropical Medicine and Hygiene in December 2011.

  3. Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2016-12-01

    We are extending climate analytics-as-a-service, including: (1) a high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets, using advanced technologies like Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files; (2) a Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections, accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; and (3) an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements to the Reanalysis Ensemble Service, including the following:
    - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services
    - A new reanalysis collections reference model to enable operator design and implementation
    - An enhanced library of sample queries to demonstrate and develop use case scenarios
    - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations
    - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses
    - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management
    - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products
    - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extraction of trends, climatologies, and anomalies
    - The ability to compute and visualize multiple-reanalysis intercomparisons

  4. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  5. Topologically Consistent Models for Efficient Big Geo-Spatio-Temporal Data Distribution

    NASA Astrophysics Data System (ADS)

    Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.

    2017-10-01

    Geo-spatio-temporal topology models are likely to become a key concept for checking the consistency of 3D (spatial) and 4D (spatial + temporal) models in emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply in mega or smart cities. Furthermore, data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach for a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, we conclude with an outlook on our future research toward supporting geo-analytics and simulations in a parallel and distributed system environment.
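
    The paper's model itself is not reproduced here, but the core idea of an incidence graph can be sketched minimally in Python: cells of each dimension record the lower-dimensional cells on their boundary, and a consistency check verifies that every referenced cell exists. The class and method names are illustrative, not DB4GeO's API:

```python
# A minimal incidence-graph sketch: cells[d] maps a cell id of dimension d
# to the ids of its (d-1)-dimensional boundary cells, and consistency
# requires that every referenced boundary cell actually exists.
class IncidenceGraph:
    def __init__(self):
        self.cells = {0: {}, 1: {}, 2: {}, 3: {}}  # vertices..solids

    def add_cell(self, dim, cid, boundary=()):
        self.cells[dim][cid] = tuple(boundary)

    def is_consistent(self):
        for dim in (1, 2, 3):
            for cid, boundary in self.cells[dim].items():
                if any(b not in self.cells[dim - 1] for b in boundary):
                    return False    # dangling reference breaks topology
        return True

g = IncidenceGraph()
g.add_cell(0, "v1")
g.add_cell(0, "v2")
g.add_cell(1, "e1", boundary=("v1", "v2"))  # edge incident to two vertices
print(g.is_consistent())                    # True
```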

  6. An Environmental Decision Support System for Spatial Assessment and Selective Remediation

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates environmental assessment tools for effective problem-solving. The software integrates modules for GIS, visualization, geospatial analysis, statistical analysis, human health and ecolog...

  7. Augmenting Austrian flood management practices through geospatial predictive analytics: a study in Carinthia

    NASA Astrophysics Data System (ADS)

    Ward, S. M.; Paulus, G.

    2013-06-01

    The Danube River basin has long been the location of significant flooding problems across central Europe. The last decade has seen a sharp increase in the frequency, duration and intensity of these flood events, unveiling a dire need for enhanced flood management policy and tools in the region. Located in the southern portion of Austria, the state of Carinthia has experienced a significant volume of intense flood impacts over the last decade. Although the Austrian government has acknowledged these issues, its remedial actions to date have been primarily structural. Continued focus on controlling the natural environment through infrastructure, while disregarding the need to consider alternative forms of assessing flood exposure, will only act as a provisional solution to this inescapable risk. In an attempt to remedy this flaw, this paper highlights the application of geospatial predictive analytics and a spatial recovery index as a proxy for community resilience, as well as the cultural challenges associated with applying foreign models within an Austrian environment.

  8. A Graphical-User Interface for the U. S. Geological Survey's SUTRA Code using Argus ONE (for simulation of variable-density saturated-unsaturated ground-water flow with solute or energy transport)

    USGS Publications Warehouse

    Voss, Clifford I.; Boldt, David; Shapiro, Allen M.

    1997-01-01

    This report describes a Graphical-User Interface (GUI) for SUTRA, the U.S. Geological Survey (USGS) model for saturated-unsaturated, variable-fluid-density ground-water flow with solute or energy transport. The GUI combines a USGS-developed code that interfaces SUTRA with Argus ONE, a commercial software product developed by Argus Interware. This product, known as Argus Open Numerical Environments (Argus ONE™), is a programmable system with geographic-information-system-like (GIS-like) functionality that includes automated gridding and meshing capabilities for linking geospatial information with finite-difference and finite-element numerical model discretizations. The GUI for SUTRA is based on a public-domain Plug-In Extension (PIE) to Argus ONE that automates the use of Argus ONE to: automatically create the appropriate geospatial information coverages (information layers) for SUTRA, provide menus and dialogs for inputting geospatial information and simulation control parameters for SUTRA, and allow visualization of SUTRA simulation results. Following simulation control data and geospatial data input by the user through the GUI, Argus ONE creates text files in the format required for normal input to SUTRA, and SUTRA can be executed within the Argus ONE environment. Then, hydraulic head, pressure, solute concentration, temperature, saturation and velocity results from the SUTRA simulation may be visualized. Although the GUI for SUTRA discussed in this report provides all of the graphical pre- and post-processor functions required for running SUTRA, advanced users can also apply programmable features within Argus ONE to modify the GUI to meet the unique demands of particular ground-water modeling projects.

  9. Disaster protection of transport infrastructure and mobility using flood risk modeling and geospatial visualization.

    DOT National Transportation Integrated Search

    2015-05-01

    infrastructure networks are essential to sustain our economy, society and quality of life. Natural disasters cost lives, infrastructure destruction, and economic losses. In 2013 over 28 million people were displaced worldwide by natural disasters wit...

  10. Automated Geospatial Watershed Assessment (AGWA) Tool for hydrologic modeling and watershed assessment

    EPA Pesticide Factsheets

    Using basic, easily attainable GIS data, AGWA provides a simple, direct, and repeatable methodology for hydrologic model setup, execution, and visualization. AGWA sees activity from users in over 170 countries and has been downloaded over 11,000 times.

  11. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pabian, Frank V

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (e.g., Google Earth, Virtual Earth), which are far better than the previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest, and which can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of states' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad-area searches for undeclared nuclear sites and activities - whether alleged through open-source leads; identified on internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales, which can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (e.g., Google's SketchUp 6, newly available in 2007), when used in conjunction with these digital globes, can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments, including walk-arounds or fly-arounds, and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).

  12. A model of clutter for complex, multivariate geospatial displays.

    PubMed

    Lohrenz, Maura C; Trafton, J Gregory; Beck, R Melissa; Gendron, Marlin L

    2009-02-01

    A novel model of measuring clutter in complex geospatial displays was compared with human ratings of subjective clutter as a measure of convergent validity. The new model is called the color-clustering clutter (C3) model. Clutter is a known problem in displays of complex data and has been shown to affect target search performance. Previous clutter models are discussed and compared with the C3 model. Two experiments were performed. In Experiment 1, participants performed subjective clutter ratings on six classes of information visualizations. Empirical results were used to set two free parameters in the model. In Experiment 2, participants performed subjective clutter ratings on aeronautical charts. Both experiments compared and correlated empirical data to model predictions. The first experiment resulted in a .76 correlation between ratings and C3. The second experiment resulted in a .86 correlation, significantly better than results from a model developed by Rosenholtz et al. Outliers to our correlation suggest further improvements to C3. We suggest that (a) the C3 model is a good predictor of subjective impressions of clutter in geospatial displays, (b) geospatial clutter is a function of color density and saliency (primary C3 components), and (c) pattern analysis techniques could further improve C3. The C3 model could be used to improve the design of electronic geospatial displays by suggesting when a display will be too cluttered for its intended audience.
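
    The exact C3 formulation is not given in this record, so the sketch below is only an illustrative color-density proxy in the same spirit (quantize colors, then count distinct colors per display tile); it is not the published model:

```python
import numpy as np

# An illustrative color-density proxy inspired by the C3 components named
# above (color density and saliency). NOT the published C3 model: quantize
# each pixel's RGB, then measure how many distinct quantized colors occupy
# each fixed-size tile of the display.
def color_density(image, tile=32, levels=4):
    """image: HxWx3 uint8 array; returns mean distinct-color count per tile."""
    q = (image // (256 // levels)).astype(np.uint16)            # quantize RGB
    codes = q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]
    h, w = codes.shape
    counts = [len(np.unique(codes[r:r + tile, c:c + tile]))
              for r in range(0, h, tile) for c in range(0, w, tile)]
    return float(np.mean(counts))

chart = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
print(color_density(chart))  # higher values suggest a more cluttered display
```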

  13. Authoring Tours of Geospatial Data With KML and Google Earth

    NASA Astrophysics Data System (ADS)

    Barcay, D. P.; Weiss-Malik, M.

    2008-12-01

    As virtual globes have become widely adopted by the general public, the use of geospatial data has expanded greatly. With the popularization of Google Earth and other platforms, GIS systems have become virtual reality platforms. Using these platforms, a casual user can easily explore the world, browse massive datasets, create powerful 3D visualizations, and share those visualizations with millions of people using the KML language. This technology has raised the bar for professionals and academics alike. It is now expected that studies and projects will be accompanied by compelling, high-quality visualizations. In this new landscape, a presentation of geospatial data can be the most effective form of advertisement for a project: engaging both the general public and the scientific community in a unified interactive experience. On the other hand, merely dumping a dataset into a virtual globe can be a disorienting, alienating experience for many users. To create an effective, far-reaching presentation, an author must take care to make their data approachable to a wide variety of users with varying knowledge of the subject matter, expertise in virtual globes, and attention spans. To that end, we present techniques for creating self-guided interactive tours of data represented in KML and visualized in Google Earth. Using these methods, we provide the ability to move the camera through the world while dynamically varying the content, style, and visibility of the displayed data. Such tours can automatically guide users through massive, complex datasets: engaging a broad user base, and conveying subtle concepts that aren't immediately apparent when viewing the raw data. To the casual user these techniques result in an extremely compelling experience similar to watching video. Unlike video, though, these techniques maintain the rich interactive environment provided by the virtual globe, allowing users to explore the data in detail and to add other data sources to the presentation.
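
    Tours like these are expressed with KML's standard gx extension elements (gx:Tour, gx:Playlist, gx:FlyTo). A minimal Python sketch that generates a two-stop fly-through; the coordinates, durations, and file names are illustrative:

```python
# Generate a self-guided KML tour using the standard gx extension:
# the camera flies smoothly between a list of waypoints.
waypoints = [  # (name, longitude, latitude, camera altitude in metres)
    ("Overview", -120.0, 37.0, 2_000_000),
    ("Study site", -119.5, 37.3, 20_000),
]

flytos = "\n".join(f"""
    <gx:FlyTo>
      <gx:duration>5.0</gx:duration>
      <gx:flyToMode>smooth</gx:flyToMode>
      <Camera>
        <longitude>{lon}</longitude><latitude>{lat}</latitude>
        <altitude>{alt}</altitude><tilt>45</tilt>
      </Camera>
    </gx:FlyTo>""" for _, lon, lat, alt in waypoints)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <gx:Tour><name>Site tour</name><gx:Playlist>{flytos}</gx:Playlist></gx:Tour>
</kml>"""

with open("tour.kml", "w") as f:
    f.write(kml)   # open in Google Earth and press play on the tour
```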

  14. An extreme events laboratory to provide network centric collaborative situation assessment and decision making

    NASA Astrophysics Data System (ADS)

    Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.

    2009-05-01

    Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.

  15. Using Immersive Visualizations to Improve Decision Making and Enhancing Public Understanding of Earth Resource and Climate Issues

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Raynolds, R. G.; Dechesne, M.

    2008-12-01

    New visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. We have impacted the community through topical policy presentations at both state and city levels, adult education classes at the Denver Museum of Nature and Science (DMNS), and public lectures at DMNS. We have constructed three-dimensional models from well data and surface observations which allow policy makers to better understand the distribution of groundwater in sandstone aquifers of the Denver Basin. Our presentations to local governments in the Denver metro area have allowed resource managers to better project future ground water depletion patterns, and to encourage development of alternative sources. DMNS adult education classes on water resources, geography, and regional geology, as well as public lectures on global issues such as earthquakes, tsunamis, and resource depletion, have utilized the visualizations developed from these research models. In addition to presenting GIS models in traditional lectures, we have also made use of the immersive display capabilities of the digital "fulldome" Gates Planetarium at DMNS. The real-time Uniview visualization application installed at Gates was designed for teaching astronomy, but it can be re-purposed for displaying our model datasets in the context of the Earth's surface. The 17-meter diameter dome of the Gates Planetarium allows an audience to have an immersive experience, similar to the virtual reality CAVEs employed by the oil exploration industry, that would otherwise not be available to the general public. Public lectures in the dome allow audiences of over 100 people to comprehend dynamically-changing geospatial datasets in an exciting and engaging fashion. In our presentation, we will demonstrate how new software tools like Uniview can be used to dramatically enhance and accelerate public comprehension of complex, multi-scale geospatial phenomena.

  16. Open Access to Multi-Domain Collaborative Analysis of Geospatial Data Through the Internet

    NASA Astrophysics Data System (ADS)

    Turner, A.

    2009-12-01

    The internet has provided us with a high-bandwidth, low-latency, globally connected network in which to rapidly share realtime data from sensors, reports, and imagery. In addition, this data has become even easier to obtain, consume, and analyze. Another aspect of the internet has been the increased approachability of complex systems through lightweight interfaces, with additional complex services able to provide more advanced connections into data services. These analyses and discussions have primarily been siloed within single domains, or kept out of the reach of amateur scientists and interested citizens. However, through more open access to analytical tools and data, experts can collaborate with citizens to gather information, provide interfaces for experimenting and querying results, and help make improved insights and feedback for further investigation. For example, farmers in Uganda are able to use their mobile phones to query, analyze, and be alerted to banana crop disease based on agricultural and climatological data. In the U.S., local groups use online social media sharing sites to gather data on storm-water runoff and stream siltation in order to alert wardens and environmental agencies. This talk will present various web-based geospatial visualization and analysis techniques and tools, such as Google Earth and GeoCommons, that have emerged to provide for collaboration between experts of various domains as well as between experts, government, and citizen scientists. Through increased communication and the sharing of data and tools, it is possible to gain broad insight and to develop joint, working solutions to a variety of difficult scientific and policy-related questions.

  17. Knowledge Glyphs: Visualization Theory Development to Support C2 Practice

    DTIC Science & Technology

    2006-03-01

    ...interface's graphic structure (Calder and Linton, 2003); 'Glyphs' as components of a typographical set (Microsoft Typography Standards); 'DataGlyphs' ... (MOOTW) factors ... MIL-STD-2525's symbology set was designed for application in the context of geospatial representations, i.e., geographical maps. It is ... the visual elements used to portray discrete entities. In a conventional windowing environment, such entities are likely to be graphically portrayed ...

  18. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open-source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application developed by Caraga State University to address the needs of local disaster managers in the Caraga Region of Mindanao, Philippines, in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application makes use of various free/open-source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damage to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines, through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or "Geo-SAFER Mindanao" Program.

  19. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
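
    As a flavour of the paradigm, here is a minimal Python sketch of an analytic written to yield meaningful partial results that a front end could render, and steer, between yields; all names and the toy computation are illustrative, not Progressive Insights itself:

```python
import random

# A progressive analytic as a generator: instead of returning one final
# answer, it emits a meaningful partial result every `chunk` items, giving
# the visualization (and the analyst) a chance to react mid-computation.
def progressive_counts(stream, chunk=1000):
    counts = {}
    for i, item in enumerate(stream, 1):
        counts[item] = counts.get(item, 0) + 1
        if i % chunk == 0:
            yield dict(counts)          # snapshot for the UI to render

data = (random.choice("ABCD") for _ in range(5000))
for partial in progressive_counts(data):
    leader = max(partial, key=partial.get)
    print(f"partial leader: {leader} ({partial[leader]})")
    # ...an interactive front end could reprioritize or cancel here...
```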

  20. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
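
    Programmatic access via the RESTful web service interfaces mentioned above might look like the Python sketch below. The API root, endpoint path, query parameter, and response field names are assumptions following common REST conventions, not details given in the abstract; consult the current HydroShare API documentation before relying on them.

      # A minimal sketch of querying HydroShare's REST interface with
      # `requests`; endpoint and field names below are assumed, not confirmed.
      import requests

      BASE = "https://www.hydroshare.org/hsapi"  # assumed API root

      def list_public_resources(keyword: str, max_results: int = 5):
          """Query the (assumed) resource listing endpoint for matches."""
          resp = requests.get(f"{BASE}/resource/", params={"subject": keyword}, timeout=30)
          resp.raise_for_status()
          for item in resp.json().get("results", [])[:max_results]:
              yield item.get("resource_title"), item.get("resource_id")

      if __name__ == "__main__":
          for title, rid in list_public_resources("streamflow"):
              print(rid, title)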

  1. a Geo-Visual Analytics Approach to Biological Shepherding: Modelling Animal Movements and Impacts

    NASA Astrophysics Data System (ADS)

    Benke, K. K.; Sheth, F.; Betteridge, K.; Pettit, C. J.; Aurambout, J.-P.

    2012-07-01

    The lamb industry in Victoria is a significant component of the state economy, with annual exports in the vicinity of $1 billion. GPS and visualisation tools can be used to monitor grazing animal movements at the farm scale and observe interactions with the environment. Modelling the spatial-temporal movements of grazing animals in response to environmental conditions provides input for the design of paddocks with the aim of improving management procedures, animal performance and animal welfare. The term "biological shepherding" is associated with the re-design of environmental conditions and the analysis of responses from grazing animals. The combination of biological shepherding with geo-visual analytics (geo-spatial data analysis with visualisation) provides a framework for improving landscape design and supports research in grazing behaviour in variable landscapes, heat stress avoidance behaviour during summer months, and modelling excreta distributions (with respect to nitrogen emissions and nitrogen return for fertilising the paddock). Nitrogen losses due to excreta are mainly in the form of gaseous emissions to the atmosphere and leaching into the groundwater. In this study, background and context are provided in the case of biological shepherding and tracking animal movements. Examples are provided of recent applications in regional Australia and New Zealand. Based on experimental data and computer simulation, and using data visualisation and feature extraction, it was demonstrated that livestock excreta are not always randomly located, but concentrated around localised gathering points, sometimes separated by the nature of the excretion. Farmers require information on nitrogen losses in order to reduce emissions to meet local and international nitrogen leaching and greenhouse gas targets and to improve the efficiency of nutrient management.

  2. Visualizing dynamic geosciences phenomena using an octree-based view-dependent LOD strategy within virtual globes

    NASA Astrophysics Data System (ADS)

    Li, Jing; Wu, Huayi; Yang, Chaowei; Wong, David W.; Xie, Jibo

    2011-09-01

    Geoscientists build dynamic models to simulate various natural phenomena for a better understanding of our planet. Interactive visualizations of these geoscience models and their outputs through virtual globes on the Internet can help the public understand the dynamic phenomena related to the Earth more intuitively. However, challenges arise when the volume of four-dimensional (4D) data, 3D in space plus time, is huge for rendering. Datasets loaded from geographically distributed data servers require synchronization between ingesting and rendering data. Also, the visualization capability of display clients varies significantly in such an online visualization environment; some may not have high-end graphics cards. To enhance the efficiency of visualizing dynamic volumetric data in virtual globes, this paper proposes a systematic framework, in which an octree-based multiresolution data structure is implemented to organize time series 3D geospatial data to be used in virtual globe environments. This framework includes a view-dependent continuous level of detail (LOD) strategy formulated as a synchronized part of the virtual globe rendering process. Through the octree-based data retrieval process, the LOD strategy enables the rendering of the 4D simulation at a consistent and acceptable frame rate. To demonstrate the capabilities of this framework, data of a simulated dust storm event are rendered in World Wind, an open source virtual globe. The rendering performances with and without the octree-based LOD strategy are compared. The experimental results show that using the proposed data structure and processing strategy significantly enhances the visualization performance when rendering dynamic geospatial phenomena in virtual globes.
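
    The core of a view-dependent LOD strategy can be sketched compactly: traverse the octree from the root and refine a node only while its projected size on screen exceeds an error budget. The Python sketch below, with an invented node layout and constants, illustrates the selection logic only; it is not the authors' World Wind implementation.

      # Toy octree LOD selection: coarse nodes far from the eye, fine near it.
      from dataclasses import dataclass, field

      @dataclass
      class OctreeNode:
          center: tuple          # (x, y, z) in world units
          half_size: float       # half the edge length of this cube
          children: list = field(default_factory=list)  # child octants (empty for leaf)

      def select_lod(node, eye, budget_px, px_per_unit=800.0):
          """Return the nodes to render for this viewpoint."""
          dx, dy, dz = (c - e for c, e in zip(node.center, eye))
          distance = max((dx * dx + dy * dy + dz * dz) ** 0.5, 1e-6)
          projected_px = px_per_unit * (2 * node.half_size) / distance
          if projected_px <= budget_px or not node.children:
              return [node]  # coarse enough, or already a leaf
          selected = []
          for child in node.children:
              selected.extend(select_lod(child, eye, budget_px, px_per_unit))
          return selected

      root = OctreeNode((0, 0, 0), 512.0,
                        [OctreeNode((i * 256 - 128, 0, 0), 256.0) for i in range(2)])
      print(len(select_lod(root, eye=(0, 0, 600), budget_px=256)))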

  3. Feature Geo Analytics and Big Data Processing: Hybrid Approaches for Earth Science and Real-Time Decision Support

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Raad, M.; Hoel, E.; Park, M.; Mollenkopf, A.; Trujillo, R.

    2016-12-01

    Introduced is a new approach for processing spatiotemporal big data by leveraging distributed analytics and storage. A suite of temporally-aware analysis tools summarizes data nearby or within variable windows, aggregates points (e.g., for various sensor observations or vessel positions), reconstructs time-enabled points into tracks (e.g., for mapping and visualizing storm tracks), joins features (e.g., to find associations between features based on attributes, spatial relationships, temporal relationships or all three simultaneously), calculates point densities, finds hot spots (e.g., in species distributions), and creates space-time slices and cubes (e.g., in microweather applications with temperature, humidity, and pressure, or within human mobility studies). These "feature geo analytics" tools run in both batch and streaming spatial analysis modes as distributed computations across a cluster of servers on typical "big" data sets, where static data exist in traditional geospatial formats (e.g., shapefile) locally on a disk or file share, attached as static spatiotemporal big data stores, or streamed in near-real-time. In other words, the approach registers large datasets or data stores with ArcGIS Server, then distributes analysis across a cluster of machines for parallel processing. Several brief use cases will be highlighted based on a 16-node server cluster with 14 GB RAM per node, allowing, for example, the buffering of over 8 million points or thousands of polygons in 1 minute. The approach is "hybrid" in that ArcGIS Server integrates open-source big data frameworks such as Apache Hadoop and Apache Spark on the cluster in order to run the analytics. In addition, the user may devise and connect custom open-source interfaces and tools developed in Python or Python Notebooks; the common denominator being the familiar REST API.
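
    As a single-machine illustration of one of these tools, the pandas sketch below reconstructs tracks from time-enabled points by ordering each entity's observations in time; the distributed version described above runs comparable logic across the server cluster. Column names and values are invented.

      # Toy "reconstruct tracks" analytic: sort observations per entity by
      # time and emit an ordered coordinate list for each track.
      import pandas as pd

      points = pd.DataFrame({
          "vessel_id": ["A", "A", "B", "A", "B"],
          "t": pd.to_datetime(["2016-01-01 00:00", "2016-01-01 00:10",
                               "2016-01-01 00:02", "2016-01-01 00:20",
                               "2016-01-01 00:12"]),
          "lon": [-122.0, -122.1, -121.5, -122.2, -121.6],
          "lat": [47.0, 47.05, 46.9, 47.1, 46.95],
      })

      def reconstruct_tracks(df):
          """Order each entity's observations in time, emit coordinate lists."""
          ordered = df.sort_values(["vessel_id", "t"])
          return (ordered.groupby("vessel_id")
                         .apply(lambda g: list(zip(g.lon, g.lat)))
                         .rename("track"))

      print(reconstruct_tracks(points))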

  4. Public health, GIS, and the internet.

    PubMed

    Croner, Charles M

    2003-01-01

    Internet access and use of georeferenced public health information for GIS application will be an important and exciting development for the nation's Department of Health and Human Services and other health agencies in this new millennium. Technological progress toward public health geospatial data integration, analysis, and visualization of space-time events using the Web portends eventual robust use of GIS by public health and other sectors of the economy. Increasing Web resources from distributed spatial data portals and global geospatial libraries, and a growing suite of Web integration tools, will provide new opportunities to advance disease surveillance, control, and prevention, and ensure public access and community empowerment in public health decision making. Emerging supercomputing, data mining, compression, and transmission technologies will play increasingly critical roles in national emergency, catastrophic planning and response, and risk management. Web-enabled public health GIS will be guided by Federal Geographic Data Committee spatial metadata, OpenGIS Web interoperability, and GML/XML geospatial Web content standards. Public health will become a responsive and integral part of the National Spatial Data Infrastructure.

  5. The Whole World In Your Hands: Using an Interactive Virtual Reality Sandbox for Geospatial Education and Outreach

    NASA Astrophysics Data System (ADS)

    Clucas, T.; Wirth, G. S.; Broderson, D.

    2014-12-01

    Traditional geospatial education tools such as maps and computer screens don't convey the rich topography present on Earth. Translating contour lines on a topo map to relief in a landscape can be a challenging concept to convey. A partnership between Alaska EPSCoR and the Geographic Information Network of Alaska has successfully constructed an Interactive Virtual Reality Sandbox, an education tool that in real time projects and updates topographic contours on the surface of a sandbox. The sandbox has been successfully deployed at public science events as well as professional geospatial and geodesy conferences. Landscape change, precipitation, and evaporation can all be modeled, much to the delight of our enthusiasts, who range in age from 3 to 90. Visually, as well as haptically, demonstrating the effects of events (such as dragging a hand through the sand) on a landscape, as well as the intuitive realization of the meaning of topographic contour lines, has proven to be engaging.

  6. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for how to develop a specialized cloud geospatial application that needs only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, its programming language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services are running on two VMs that are communicating over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will be focused on distributing the cloud application across additional VMs and testing the scalability and availability of services.

  7. RacerGISOnline: Enhancing Learning in Marketing Classes with Web-Based Business GIS

    ERIC Educational Resources Information Center

    Miller, Fred L.; Mangold, W. Glynn; Roach, Joy; Brockway, Gary; Johnston, Timothy; Linnhoff, Stefan; McNeely, Sam; Smith, Kathy; Holmes, Terence

    2014-01-01

    Geographic Information Systems (GIS) offer geospatial analytical tools with great potential for applications in marketing decision making. However, for various reasons, the rate of adoption of these tools in academic marketing programs has lagged behind that of marketing practitioners. RacerGISOnline is an innovative approach to integrating these…

  8. Geo-Spatial Social Network Analysis of Social Media to Mitigate Disasters

    NASA Astrophysics Data System (ADS)

    Carley, K. M.

    2017-12-01

    Understanding the spatial layout of human activity can afford a better understanding of many phenomena, such as local culture, the spread of ideas, and the scope of a disaster. Today, social media is one of the key sensors for acquiring information on socio-cultural activity, some with cues as to the geo-location. We ask: what can be learned by putting such data on maps? For example, are people who chat online more likely to be near each other? Can Twitter data support disaster planning or early warning? In this talk, such issues are examined using data collected via Twitter and analyzed using ORA. ORA is a network analysis and visualization system. It supports not just social networks (who is interacting with whom), but also high dimensional networks with many types of nodes (e.g. people, organizations, resources, activities …) and relations, geo-spatial network analysis, dynamic network analysis, & geo-temporal analysis. Using ORA, lessons learned from five case studies are considered: Arab Spring, Tsunami warning in Padang Indonesia, Twitter around Fukushima in Japan, Typhoon Haiyan (Yolanda), & regional conflict. Using Padang Indonesia data, we characterize the strengths and limitations of social media data to support disaster planning & early warning, identify at-risk areas & issues of concern, and estimate where people are and which areas are impacted. Using Fukushima Japanese data, social media is used to estimate geo-spatial regularities in movement and communication that can inform disaster response and risk estimation. Using Arab Spring data, we find that the spread of bots & extremists varies by country and time, to the extent that using Twitter to understand who is important or what ideas are critical can be compromised. Bots and extremists can exploit disaster messaging to create havoc and facilitate criminal activity, e.g. human trafficking. Event discovery mechanisms that support isolating geo-epicenters for key events become crucial. Spatial inference enables improved country and city identification. Geo-network analytics with and without these inferences reveal that explicitly geo-tagged data may not be representative and that improved location estimation provides better insight into the social condition. These results demonstrate the value of these techniques to mitigate the social impact of disasters.

  9. Application of the AMBUR R package for spatio-temporal analysis of shoreline change: Jekyll Island, Georgia, USA

    NASA Astrophysics Data System (ADS)

    Jackson, Chester W.; Alexander, Clark R.; Bush, David M.

    2012-04-01

    The AMBUR (Analyzing Moving Boundaries Using R) package for the R software environment provides a collection of functions for assisting with analyzing and visualizing historical shoreline change. The package allows import and export of geospatial data in ESRI shapefile format, which is compatible with most commercial and open-source GIS software. The "baseline and transect" method is the primary technique used to quantify distances and rates of shoreline movement, and to detect classification changes across time. Along with the traditional "perpendicular" transect method, two new transect methods, "near" and "filtered," assist with quantifying changes along curved shorelines that are problematic for perpendicular transect methods. Output from the analyses includes data tables, graphics, and geospatial data, which are useful in rapidly assessing trends and potential errors in the dataset. A forecasting function also allows the user to estimate the future location of the shoreline and store the results in a shapefile. Other utilities and tools provided in the package assist with preparing and manipulating geospatial data, error checking, and generating supporting graphics and shapefiles. The package can be customized to perform additional statistical, graphical, and geospatial functions, and it is capable of analyzing the movement of any boundary (e.g., shorelines, glacier terminus, fire edge, and marine and terrestrial ecozones).
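
    AMBUR itself is implemented in R; as a language-neutral illustration of the rate calculation behind the baseline-and-transect method, the Python sketch below computes an end-point rate for one transect from two dated shoreline intersections. The distances and dates are invented.

      # End-point rate along one transect: signed shoreline displacement
      # between two dates, divided by the elapsed years.
      from datetime import date

      def end_point_rate(dist_old_m, dist_new_m, date_old, date_new):
          """Rate of shoreline movement (m/yr) measured along a transect.

          Distances are measured from the baseline to each shoreline
          intersection; the sign convention here treats seaward as positive.
          """
          years = (date_new - date_old).days / 365.25
          return (dist_new_m - dist_old_m) / years

      # Example: shoreline 42.0 m from the baseline in 1974 and 31.5 m in
      # 2006 -> about -0.33 m/yr, i.e., landward movement (erosion).
      print(round(end_point_rate(42.0, 31.5, date(1974, 6, 1), date(2006, 6, 1)), 2))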

  10. Developing Transferrable Geospatial Skills in a Liberal Arts Context

    ERIC Educational Resources Information Center

    Colaianne, Blake A.; Powell, Matthew G.

    2011-01-01

    Geology education usually takes place within the context of a broader curriculum, but specific synergies between disciplines have rarely been explored or exploited. Here, we have assessed the spatial visualization skills of undergraduate students in a variety of disciplines to determine which are most compatible with a geology curriculum. Spatial…

  11. Uses of GIS for Homeland Security and Emergency Management for Higher Education Institutions

    ERIC Educational Resources Information Center

    Murchison, Stuart B.

    2010-01-01

    Geographic information systems (GIS) are a major component of the geospatial sciences, which are also composed of geostatistical analysis, remote sensing, and global positional satellite systems. These systems can be integrated into GIS for georeferencing, pattern analysis, visualization, and understanding spatial concepts that transcend…

  12. Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)

    NASA Astrophysics Data System (ADS)

    Bishop, M. P.; Houser, C.; Lemmons, K.

    2015-12-01

    Traditional learning limits the potential for self-discovery, and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or may not yet have been formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South-Texas Sandsheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification compared to traditional lecture, and promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
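
    For readers unfamiliar with fuzzy cognitive maps, the Python sketch below shows the basic mechanics: concept activations in [0, 1], a signed weight matrix of causal influences, and repeated application of a squashing function until the state settles. The concepts and weights are invented for illustration and are not the expert model used in the course.

      # Toy fuzzy cognitive map: iterate s <- sigmoid(s + s @ W), optionally
      # clamping "driver" concepts to fixed values.
      import numpy as np

      concepts = ["drought", "vegetation cover", "wind erosion", "dune activity"]
      # W[i, j] = influence of concept i on concept j (signs illustrative).
      W = np.array([
          [0.0, -0.7,  0.0, 0.0],   # drought suppresses vegetation
          [0.0,  0.0, -0.8, 0.0],   # vegetation suppresses wind erosion
          [0.0,  0.0,  0.0, 0.9],   # erosion drives dune activity
          [0.0,  0.0,  0.0, 0.0],
      ])

      def run_fcm(state, W, steps=20, clamp=None):
          s = np.asarray(state, dtype=float)
          for _ in range(steps):
              s = 1.0 / (1.0 + np.exp(-(s + s @ W)))  # squashing function
              if clamp:
                  for idx, value in clamp.items():
                      s[idx] = value                   # pin driver concepts
          return s

      final = run_fcm([1.0, 0.5, 0.5, 0.5], W, clamp={0: 1.0})
      for name, activation in zip(concepts, final):
          print(f"{name:18s} {activation:.2f}")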

  13. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web services standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize the data set prior to download. Google Earth visualizations of the data set are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set had a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
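
    Consuming such OGC services is deliberately simple: the Python sketch below issues a standard WMS 1.1.1 GetMap request using the requests library. The endpoint URL and layer name are placeholders, not actual ORNL DAAC identifiers; only the key-value parameters follow the WMS standard.

      # A minimal WMS GetMap request; placeholders for endpoint and layer.
      import requests

      WMS_URL = "https://example.org/wms"   # placeholder endpoint
      params = {
          "service": "WMS",
          "version": "1.1.1",
          "request": "GetMap",
          "layers": "example_layer",        # placeholder layer name
          "styles": "",
          "srs": "EPSG:4326",
          "bbox": "-180,-90,180,90",
          "width": 1024,
          "height": 512,
          "format": "image/png",
      }

      resp = requests.get(WMS_URL, params=params, timeout=60)
      resp.raise_for_status()
      with open("map.png", "wb") as f:
          f.write(resp.content)             # rendered map image, ready to view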

  14. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.

  15. Geospatial Data Stream Processing in Python Using FOSS4G Components

    NASA Astrophysics Data System (ADS)

    McFerren, G.; van Zyl, T.

    2016-06-01

    One viewpoint of current and future IT systems holds that there is an increase in the scale and velocity at which data are acquired and analysed from heterogeneous, dynamic sources. In the earth observation and geoinformatics domains, this process is driven by the increase in number and types of devices that report location and the proliferation of assorted sensors, from satellite constellations to oceanic buoy arrays. Many of these data will be encountered as self-contained messages on data streams - continuous, infinite flows of data. Spatial analytics over data streams concerns the search for spatial and spatio-temporal relationships within and amongst data "on the move". In spatial databases, queries can assess a store of data to unpack spatial relationships; this is not the case on streams, where spatial relationships need to be established with the incomplete data available. Methods for spatially-based indexing, filtering, joining and transforming of streaming data need to be established and implemented in software components. This article describes the usage patterns and performance metrics of a number of well-known FOSS4G Python software libraries within the data stream processing paradigm. In particular, we consider the RTree library for spatial indexing, the Shapely library for geometric processing and transformation and the PyProj library for projection and geodesic calculations over streams of geospatial data. We introduce a message-oriented Python-based geospatial data streaming framework called Swordfish, which provides data stream processing primitives, functions, transports and a common data model for describing messages, based on the Open Geospatial Consortium Observations and Measurements (O&M) and Unidata Common Data Model (CDM) standards. We illustrate how the geospatial software components are integrated with the Swordfish framework. Furthermore, we describe the tight temporal constraints under which geospatial functionality can be invoked when processing high velocity, potentially infinite geospatial data streams. The article discusses the performance of these libraries under simulated streaming loads (size, complexity and volume of messages) and how they can be deployed and utilised with Swordfish under real load scenarios, illustrated by a set of Vessel Automatic Identification System (AIS) use cases. We conclude that the described software libraries are able to perform adequately under geospatial data stream processing scenarios - many real application use cases will be handled sufficiently by the software.
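
    The sketch below combines the three libraries named above in a stream-processing style: an R-tree prefilter for a zone of interest, an exact Shapely containment test, and a PyProj geodesic distance between consecutive positions. The AIS-like messages and the zone are invented; this illustrates the usage pattern, not Swordfish code.

      # Per-message processing with rtree, shapely, and pyproj.
      from rtree import index
      from shapely.geometry import Point, box
      from pyproj import Geod

      geod = Geod(ellps="WGS84")
      zone = box(18.0, -34.5, 19.0, -33.5)   # lon/lat zone of interest
      zones = index.Index()
      zones.insert(0, zone.bounds)            # coarse R-tree prefilter

      last_seen = {}                          # vessel id -> (lon, lat)

      def handle(msg):
          """Process one (vessel_id, lon, lat) message from the stream."""
          vessel, lon, lat = msg
          if last := last_seen.get(vessel):
              _, _, meters = geod.inv(last[0], last[1], lon, lat)
              print(f"{vessel} moved {meters:,.0f} m")
          last_seen[vessel] = (lon, lat)
          if list(zones.intersection((lon, lat, lon, lat))) and zone.contains(Point(lon, lat)):
              print(f"{vessel} is inside the watch zone")

      for message in [("V1", 18.4, -34.0), ("V1", 18.5, -34.1), ("V2", 25.0, -30.0)]:
          handle(message)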

  16. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Kuehnel, F.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  17. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*

    NASA Astrophysics Data System (ADS)

    Hogan, P.; Coughlan, J.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  18. Adding uncertainty to forest inventory plot locations: effects on analyses using geospatial data

    Treesearch

    Alexia A. Sabor; Volker C. Radeloff; Ronald E. McRoberts; Murray Clayton; Susan I. Stewart

    2007-01-01

    The Forest Inventory and Analysis (FIA) program of the USDA Forest Service alters plot locations before releasing data to the public to ensure landowner confidentiality and sample integrity, but using data with altered plot locations in conjunction with other spatially explicit data layers produces analytical results with unknown amounts of error. We calculated the...

  19. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
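
    As a minimal illustration of the model side of this workflow, the Keras sketch below defines and trains a small convolutional classifier on stand-in arrays shaped like exported image patches (assumptions: 6 bands, 32x32 patches, 4 classes). It is not the authors' architecture, and the random arrays stand in for data that would come from Earth Engine.

      # A small CNN for patch classification; shapes and classes are assumed.
      import numpy as np
      import tensorflow as tf

      NUM_BANDS, PATCH, NUM_CLASSES = 6, 32, 4

      model = tf.keras.Sequential([
          tf.keras.layers.Input(shape=(PATCH, PATCH, NUM_BANDS)),
          tf.keras.layers.Conv2D(32, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu"),
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])

      # Stand-in random data in place of patches exported from Earth Engine.
      x = np.random.rand(128, PATCH, PATCH, NUM_BANDS).astype("float32")
      y = np.random.randint(0, NUM_CLASSES, size=128)
      model.fit(x, y, epochs=1, batch_size=32, verbose=2)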

  20. The Matsu Wheel: A Cloud-Based Framework for Efficient Analysis and Reanalysis of Earth Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James; hide

    2016-01-01

    Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework scans for new data and allows batches of analytics to be applied to the data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
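
    The scheduling idea is easy to sketch: one pass of the wheel touches each newly arrived scene once and fans it out to every registered analytic. The Python sketch below is a toy illustration of that pattern with invented analytics and scene payloads, not Project Matsu code.

      # Toy "wheel" pass: preprocess each scene once, run every analytic.
      def anomaly_detector(scene):
          return f"anomaly scan of {scene['id']}"

      def flood_classifier(scene):
          return f"land/water classes for {scene['id']}"

      ANALYTICS = [anomaly_detector, flood_classifier]   # easy to slot in more

      def wheel_pass(new_scenes):
          """One revolution: read each scene once, fan out to all analytics."""
          reports = []
          for scene in new_scenes:
              scene["preprocessed"] = True               # done once per scene
              reports.extend(analytic(scene) for analytic in ANALYTICS)
          return reports

      print(wheel_pass([{"id": "EO1-2015-204"}, {"id": "EO1-2015-205"}]))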

  1. The Human is the Loop: New Directions for Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Hossain, Shahriar H.; Ramakrishnan, Naren

    2014-01-28

    Visual analytics is the science of marrying interactive visualizations and analytic algorithms to support exploratory knowledge discovery in large datasets. We argue for a shift from a ‘human in the loop’ philosophy for visual analytics to a ‘human is the loop’ viewpoint, where the focus is on recognizing analysts’ work processes, and seamlessly fitting analytics into that existing interactive process. We survey a range of projects that provide visual analytic support contextually in the sensemaking loop, and outline a research agenda along with future challenges.

  2. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and for quantitatively interrogating data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.

  3. Geospatial Analysis and Model Evaluation Software (GAMES): Integrated Web-Based Analysis and Visualization

    DTIC Science & Technology

    2014-04-11

    particle location files for each source (hours); dti: time step in seconds; horzmix: CONSTANT = use the value of horcon... However, if leg lengths are short, extreme values of D/Lo can occur. We will handle these by assigning a maximum to the output. This is discussed by

  4. Scales of heterogeneity of water quality in rivers: Insights from high resolution maps based on integrated geospatial, sensor and ROV technologies

    EPA Science Inventory

    While the spatial heterogeneity of many aquatic ecosystems is acknowledged, rivers are often mistakenly described as homogenous and well-mixed. The collection and visualization of attributes like water quality is key to our perception and management of these ecosystems. The ass...

  5. Learning about Urban Ecology through the Use of Visualization and Geospatial Technologies

    ERIC Educational Resources Information Center

    Barnett, Michael; Houle, Meredith; Mark, Sheron; Strauss, Eric; Hoffman, Emily

    2010-01-01

    During the past three years we have been designing and implementing a technology enhanced urban ecology program using geographic information systems (GIS) coupled with technology. Our initial work focused on professional development for in-service teachers and implementation in K-12 classrooms. However, upon reflection and analysis of the…

  6. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products from targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damages, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses were possible with the proposed approach and current technology; for instance, a landowner could easily assess the amount of fallen trees within his property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.
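
    The open-pit volumetric change assessment mentioned above reduces, at its core, to differencing two gridded surface models and integrating over the cell area. The numpy sketch below shows that step with an invented 3x3 grid and an assumed 1 m cell size.

      # DEM differencing: negative cells mean material was removed.
      import numpy as np

      CELL_AREA_M2 = 1.0 * 1.0            # grid resolution: 1 m x 1 m

      dem_before = np.array([[100.0, 100.0, 100.0],
                             [100.0,  98.0, 100.0],
                             [100.0, 100.0, 100.0]])
      dem_after = np.array([[100.0, 100.0, 100.0],
                            [100.0,  95.0,  99.0],
                            [100.0, 100.0, 100.0]])

      diff = dem_after - dem_before
      removed_m3 = -diff[diff < 0].sum() * CELL_AREA_M2
      added_m3 = diff[diff > 0].sum() * CELL_AREA_M2
      print(f"removed: {removed_m3:.1f} m3, added: {added_m3:.1f} m3")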

  7. Data management for geospatial vulnerability assessment of interdependencies in US power generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shih, C.Y.; Scown, C.D.; Soibelman, L.

    2009-09-15

    Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.
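
    The interdependency logic can be illustrated with a small directed graph: mines feed rail links, rail links feed plants, and removing disrupted nodes reveals which plants lose every supply path. The sketch below uses the networkx library with an invented toy network; the study's actual model is built on data warehousing and GIS tooling rather than this simplification.

      # Toy coal-supply network; which plants are cut off by a disruption?
      import networkx as nx

      g = nx.DiGraph()
      g.add_edges_from([
          ("mine_A", "rail_1"), ("mine_B", "rail_1"),
          ("mine_B", "rail_2"), ("rail_1", "plant_X"),
          ("rail_2", "plant_Y"),
      ])

      def plants_cut_off(graph, disrupted):
          """Plants with no remaining supply path after removing nodes."""
          damaged = graph.copy()
          damaged.remove_nodes_from(disrupted)
          plants = [n for n in damaged if n.startswith("plant")]
          mines = [n for n in damaged if n.startswith("mine")]
          return [p for p in plants
                  if not any(nx.has_path(damaged, m, p) for m in mines)]

      print(plants_cut_off(g, ["rail_1"]))   # plant_X loses both mine paths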

  8. Long-Term Audience Impacts of Live Fulldome Planetarium Lectures for Earth Science and Global Change Education

    NASA Astrophysics Data System (ADS)

    Yu, K. C.; Champlin, D. M.; Goldsworth, D. A.; Raynolds, R. G.; Dechesne, M.

    2011-09-01

    Digital Earth visualization technologies, from ArcGIS to Google Earth, have allowed for the integration of complex, disparate data sets to produce visually rich and compelling three-dimensional models of sub-surface and surface resource distribution patterns. The rendering of these models allows the public to quickly understand complicated geospatial relationships that would otherwise take much longer to explain using traditional media. At the Denver Museum of Nature & Science (DMNS), we have used such visualization technologies, including real-time virtual reality software running in the immersive digital "fulldome" Gates Planetarium, to impact the community through topical policy presentations. DMNS public lectures have covered regional issues like water resources, as well as global topics such as earthquakes, tsunamis, and resource depletion. The Gates Planetarium allows an audience to have an immersive experience, similar to virtual reality "CAVE" environments found in academia, that would otherwise not be available to the general public. Public lectures in the dome allow audiences of over 100 people to comprehend dynamically changing geospatial datasets in an exciting and engaging fashion. Surveys and interviews show that these talks are effective in heightening visitor interest in the subjects weeks or months after the presentation. Many visitors take additional steps to learn more, while one was so inspired that she actively worked to bring the same programming to her children's school. These preliminary findings suggest that fulldome real-time visualizations can have a substantial long-term impact on an audience's engagement and interest in science topics.

  9. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  10. VAST Challenge 2016: Streaming Visual Analytics

    DTIC Science & Technology

    2016-10-25

    understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time... received. Mini-Challenge 1: Design Challenge. Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia... Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries of what constitutes a visual analytics system, and to

  11. Exploring the effects of population growth on future land use change in the Las Vegas Wash watershed: an integrated approach of geospatial modeling and analytics

    EPA Science Inventory

    The Las Vegas Valley metropolitan area is one of the fastest growing areas in the southwestern United States. The rapid urbanization has led to many environmental problems. For instance, as population growth and urbanization continue, there will be a problem with water shortage. ...

  12. Developing Guidelines for Assessing Visual Analytics Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    2011-07-01

    In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced some initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement are needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.

  13. Introduction of geospatial perspective to the ecology of fish-habitat relationships in Indonesian coral reefs: A remote sensing approach

    NASA Astrophysics Data System (ADS)

    Sawayama, Shuhei; Nurdin, Nurjannah; Akbar AS, Muhammad; Sakamoto, Shingo X.; Komatsu, Teruhisa

    2015-06-01

    Coral reef ecosystems worldwide are now being harmed by various stresses accompanying the degradation of fish habitats, and thus knowledge of fish-habitat relationships is urgently required. Because conventional research methods were not practical for this purpose due to the lack of a geospatial perspective, we attempted to develop a research method integrating visual fish observation with a seabed habitat map and to expand knowledge to a two-dimensional scale. WorldView-2 satellite imagery of Spermonde Archipelago, Indonesia obtained in September 2012 was analyzed and classified into four typical substrates: live coral, dead coral, seagrass and sand. Overall classification accuracy of this map was 81.3% and considered precise enough for subsequent analyses. Three sub-areas (CC: continuous coral reef, BC: boundary of coral reef and FC: few live coral zone) around reef slopes were extracted from the map. Visual transect surveys for several fish species were conducted within each sub-area in June 2013. As a result, mean density (ind. / 300 m2) of Chaetodon octofasciatus, known as an obligate feeder of corals, was significantly higher at BC than at the others (p < 0.05), implying that this species' density is strongly influenced by the spatial configuration of its habitat, like the "edge effect." This indicates that future conservation procedures for coral reef fishes should consider not only coral cover but also its spatial configuration. The present study also indicates that the introduction of a geospatial perspective derived from remote sensing has great potential to advance conventional ecological studies on coral reef fishes.

  14. ClimatePipes: User-Friendly Data Access, Manipulation, Analysis & Visualization of Community Climate Models

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; DeMarle, D.; Burnett, B.; Harris, C.; Silva, W.; Osmari, D.; Geveci, B.; Silva, C.; Doutriaux, C.; Williams, D. N.

    2013-12-01

    The impact of climate change will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. Long-term coordinated planning, funding, and action are required for climate change adaptation and mitigation. Unfortunately, widespread use of climate data (simulated and observed) in non-climate science communities is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. We present ClimatePipes to address many of these challenges by creating an open source platform that provides state-of-the-art, user-friendly data access, analysis, and visualization for climate and other relevant geospatial datasets, making the climate data available to non-researchers, decision-makers, and other stakeholders. The overarching goals of ClimatePipes are: 1) enable users to explore real-world questions related to climate change; 2) provide tools for data access, analysis, and visualization; and 3) facilitate collaboration by enabling users to share datasets, workflows, and visualizations. ClimatePipes uses a web-based application platform for its widespread support on mainstream operating systems, ease-of-use, and inherent collaboration support. The front-end of ClimatePipes uses HTML5 (WebGL, Canvas2D, CSS3) to deliver state-of-the-art visualization and to provide a best-in-class user experience. The back-end of ClimatePipes is built around Python using the Visualization Toolkit (VTK, http://vtk.org), Climate Data Analysis Tools (CDAT, http://uv-cdat.llnl.gov), and other climate and geospatial data processing tools such as GDAL and PROJ4. The ClimatePipes web interface queries and accesses data from remote sources (such as ESGF); for example, a climate data layer from ESGF can be displayed on top of a map data layer from OpenStreetMap. The ClimatePipes workflow editor provides flexibility and fine-grained control, and uses the VisTrails (http://www.vistrails.org) workflow engine in the backend.

  15. New Technologies for Acquisition and 3-D Visualization of Geophysical and Other Data Types Combined for Enhanced Understandings and Efficiencies of Oil and Gas Operations, Deepwater Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Thomson, J. A.; Gee, L. J.; George, T.

    2002-12-01

    This presentation shows results of a visualization method used to display and analyze multiple data types in a geospatially referenced three-dimensional (3-D) space. The integrated data types include sonar and seismic geophysical data, pipeline and geotechnical engineering data, and 3-D facilities models. Visualization of these data collectively in proper 3-D orientation yields insights and synergistic understandings not previously obtainable. Key technological components of the method are: 1) high-resolution geophysical data obtained using a newly developed autonomous underwater vehicle (AUV), 2) 3-D visualization software that delivers correctly positioned display of multiple data types and full 3-D flight navigation within the data space and 3) a highly immersive visualization environment (HIVE) where multidisciplinary teams can work collaboratively to develop enhanced understandings of geospatially complex data relationships. The initial study focused on an active deepwater development area in the Green Canyon protraction area, Gulf of Mexico. Here several planned production facilities required detailed, integrated data analysis for design and installation purposes. To meet the challenges of tight budgets and short timelines, an innovative new method was developed based on the combination of newly developed technologies. Key benefits of the method include enhanced understanding of geologically complex seabed topography and marine soils yielding safer and more efficient pipeline and facilities siting. Environmental benefits include rapid and precise identification of potential locations of protected deepwater biological communities for avoidance and protection during exploration and production operations. In addition, the method allows data presentation and transfer of learnings to an audience outside the scientific and engineering team. This includes regulatory personnel, marine archaeologists, industry partners and others.

  16. Tsunami vertical-evacuation planning in the U.S. Pacific Northwest as a geospatial, multi-criteria decision problem

    USGS Publications Warehouse

    Wood, Nathan; Jones, Jeanne; Schelling, John; Schmidtlein, Mathew

    2014-01-01

    Tsunami vertical-evacuation (TVE) refuges can be effective risk-reduction options for coastal communities with local tsunami threats but no accessible high ground for evacuations. Deciding where to locate TVE refuges is a complex risk-management question, given the potential for conflicting stakeholder priorities and multiple, suitable sites. We use the coastal community of Ocean Shores (Washington, USA) and the local tsunami threat posed by Cascadia subduction zone earthquakes as a case study to explore the use of geospatial, multi-criteria decision analysis for framing the locational problem of TVE siting. We demonstrate a mixed-methods approach that uses potential TVE sites identified at community workshops, geospatial analysis to model changes in pedestrian evacuation times for TVE options, and statistical analysis to develop metrics for comparing population tradeoffs and to examine influences in decision making. Results demonstrate that no one TVE site can save all at-risk individuals in the community and each site provides varying benefits to residents, employees, customers at local stores, tourists at public venues, children at schools, and other vulnerable populations. The benefit of some proposed sites varies depending on whether or not nearby bridges will be functioning after the preceding earthquake. Relative rankings of the TVE sites are fairly stable under various criteria-weighting scenarios but do vary considerably when comparing strategies to exclusively protect tourists or residents. The proposed geospatial framework can serve as an analytical foundation for future TVE siting discussions.
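
    The core of the multi-criteria ranking described above can be illustrated with a weighted-sum sketch: each candidate refuge gets a score per criterion, weights encode a stakeholder scenario, and rankings are compared across scenarios. The site names, criteria, and scores below are hypothetical stand-ins for the study's modeled evacuation metrics.

        # Hedged sketch of weighted-sum multi-criteria ranking of TVE sites.
        # Sites, criteria, and scores are hypothetical.
        criteria = ["residents_saved", "tourists_saved", "employees_saved"]
        sites = {
            "School roof":   [0.9, 0.2, 0.5],
            "Berm at park":  [0.6, 0.8, 0.4],
            "Parking tower": [0.4, 0.7, 0.9],
        }
        scenarios = {
            "balanced":         [1/3, 1/3, 1/3],
            "protect_tourists": [0.1, 0.8, 0.1],
        }

        for name, weights in scenarios.items():
            scores = {s: sum(w * v for w, v in zip(weights, vals))
                      for s, vals in sites.items()}
            ranking = sorted(scores, key=scores.get, reverse=True)
            print(name, ranking)

    Comparing rankings across weighting scenarios is the kind of stability check the abstract reports: stable under balanced weights, divergent when one population is prioritized exclusively.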

  17. WPS mediation: An approach to process geospatial data on different computing backends

    NASA Astrophysics Data System (ADS)

    Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas

    2012-10-01

    The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as the environmental sciences. These problems are usually not, or only partially, solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enable WPS: no implementation allows one to seamlessly execute a geoprocessing calculation following user requirements on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept that mediates different geospatial and Grid software packages and proposes an extension of the WPS specification through two optional parameters. The applicability of this approach is demonstrated using a mediated Normalized Difference Vegetation Index (NDVI) WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
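
    A rough sketch of what such a backend-selecting Execute request could look like over standard WPS 1.0.0 key-value-pair encoding is shown below. The endpoint URL and the two extension parameters (backend, jobpriority) are hypothetical placeholders; the paper's actual parameter names may differ.

        # Hedged sketch of a WPS Execute request with hypothetical
        # backend-selection parameters; only service/version/request/
        # identifier/datainputs are standard WPS 1.0.0 KVP keys.
        import requests

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "NDVI",
            "datainputs": "red=red.tif;nir=nir.tif",
            "backend": "grid",        # hypothetical: stand-alone | cluster | grid
            "jobpriority": "normal",  # hypothetical
        }
        resp = requests.get("https://example.org/wps", params=params, timeout=60)
        print(resp.status_code)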

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    A new field of research, visual analytics, has recently been introduced. It has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation, and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies that help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.

  19. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying, and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting, and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers, and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparison, the time step, class values for the colour legend, etc., and capture the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, involves multiple windows that interactively respond to user selections, so that when an object is selected or changed in one window, it automatically updates in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data via online access, which also allows remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities, allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs, and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps, etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization, which incorporates the ideas of storytelling, web-based linked views, and other visualization techniques for a 4-hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
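
    The snapshot mechanism can be pictured as serializing the current application state into a shareable hyperlink. The sketch below is a minimal, hypothetical illustration (field names and URL are invented; the GAV framework's actual snapshot format is not shown here).

        # Hedged sketch of a shareable-state "snapshot" link.
        # State fields and URL are hypothetical, not the GAV format.
        import base64, json

        def make_snapshot_link(base_url: str, state: dict) -> str:
            payload = base64.urlsafe_b64encode(json.dumps(state).encode()).decode()
            return f"{base_url}#snapshot={payload}"

        def restore_snapshot(link: str) -> dict:
            payload = link.split("#snapshot=", 1)[1]
            return json.loads(base64.urlsafe_b64decode(payload))

        state = {"time_step": 12, "layers": ["flood", "damage"], "classes": 5}
        link = make_snapshot_link("https://example.org/flood", state)
        assert restore_snapshot(link) == state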

  20. 3D Geospatial Models for Visualization and Analysis of Groundwater Contamination at a Nuclear Materials Processing Facility

    NASA Astrophysics Data System (ADS)

    Stirewalt, G. L.; Shepherd, J. C.

    2003-12-01

    Analysis of hydrostratigraphy and of uranium and nitrate contamination in groundwater at a former nuclear materials processing facility in Oklahoma was undertaken employing 3-dimensional (3D) geospatial modeling software. The models constructed played an important role in the regulatory decision process of the U.S. Nuclear Regulatory Commission (NRC) because they enabled visualization of temporal variations in contaminant concentrations and plume geometry. Three aquifer systems occur at the site, comprising water-bearing fractured shales separated by indurated sandstone aquitards. The uppermost terrace groundwater system (TGWS) aquifer is composed of terrace and alluvial deposits and a basal shale. The shallow groundwater system (SGWS) aquifer is made up of three shale units and two sandstones. It is separated from the overlying TGWS and underlying deep groundwater system (DGWS) aquifer by sandstone aquitards. Spills of nitric acid solutions containing uranium and radioactive decay products around the main processing building (MPB), leakage from storage ponds west of the MPB, and leaching of radioactive materials from discarded equipment and waste containers contaminated both the TGWS and SGWS aquifers during facility operation between 1970 and 1993. Constructing 3D geospatial property models for analysis of groundwater contamination at the site involved use of EarthVision (EV), a 3D geospatial modeling software developed by Dynamic Graphics, Inc. of Alameda, CA. A viable 3D geohydrologic framework model was initially constructed so property data could be spatially located relative to subsurface geohydrologic units. The framework model contained three hydrostratigraphic zones equivalent to the TGWS, SGWS, and DGWS aquifers in which groundwater samples were collected, separated by two sandstone aquitards. Groundwater data collected in the three aquifer systems since 1991 indicated high concentrations of uranium (>10,000 micrograms/liter) and nitrate (>500 milligrams/liter) around the MPB and elevated nitrate (>2000 milligrams/liter) around storage ponds. Vertical connectivity was suggested between the TGWS and SGWS, while the DGWS appeared relatively isolated from the overlying aquifers. Lateral movement of uranium was also suggested over time. For example, lateral migration in the TGWS is suggested along a shallow depression in the bedrock surface trending south-southwest from the southwest corner of the MPB. Another pathway atop the buried bedrock surface, trending west-northwest from the MPB and partially reflected by current surface topography, suggested lateral migration of nitrate in the SGWS. Lateral movement of nitrate in the SGWS was also indicated north, south, and west of the largest storage pond. Definition of contaminant plume movement over time is particularly important for assessing the direction and rate of migration and the potential need for preventive measures to control contamination of groundwater outside facility property lines. The 3D geospatial property models proved invaluable for visualizing and analyzing variations in subsurface uranium and nitrate contamination in space and time within and between the three aquifers at the site. The models were an exceptional visualization tool for illustrating the extent, volume, and quantitative amounts of uranium and nitrate contamination in the subsurface to regulatory decision-makers with regard to site decommissioning issues, including remediation concerns, providing a perspective not possible to achieve with traditional 2D maps. The geohydrologic framework model also provides a conceptual model for consideration in flow and transport analyses.

  1. Remote Sensing of Soils for Environmental Assessment and Management.

    NASA Technical Reports Server (NTRS)

    DeGloria, Stephen D.; Irons, James R.; West, Larry T.

    2014-01-01

    The next generation of imaging systems integrated with complex analytical methods will revolutionize the way we inventory and manage soil resources across a wide range of scientific disciplines and application domains. This special issue highlights those systems and methods for the direct benefit of environmental professionals and students who employ imaging and geospatial information for improved understanding, management, and monitoring of soil resources.

  2. A tool for exploring space-time patterns: an animation user research.

    PubMed

    Ogao, Patrick J

    2006-08-29

    Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have been vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are used in urban growth and census mapping--all critical aspects of a country's socio-economic development. In this paper, user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban, and census mapping. Three types of animation were used, namely passive, interactive, and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media-player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends, or anomalies that may be inherent in the data sets. The test focused on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a think-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation, and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting, and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved using provocative interactive tools, as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment.

  3. A tool for exploring space-time patterns : an animation user research

    PubMed Central

    Ogao, Patrick J

    2006-01-01

    Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have been vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are used in urban growth and census mapping – all critical aspects of a country's socio-economic development. In this paper, user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban, and census mapping. Results Three types of animation were used, namely passive, interactive, and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media-player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends, or anomalies that may be inherent in the data sets. The test focused on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a think-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation, and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting, and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved using provocative interactive tools, as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. Conclusion The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment. PMID:16938138

  4. Data Analytics and Visualization for Large Army Testing Data

    DTIC Science & Technology

    2013-09-01

    [Only fragments of this record's text and bibliography survive:] ...and relationships in the data that would otherwise remain hidden. Bibliography: 1. Goodall, J. R.; Tesone, D. R. Visual Analytics for Network... Software Visualization, 2003, pp 143–149. 3. Goodall, J. R.; Sowul, M. VIAssist: Visual Analytics for Cyber Defense, IEEE Conference on Technologies...

  5. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users like pastoralists, land management specialists, and land policy makers in the use of earth observation data for both day-to-day and seasonal planning needs both interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data centres and outputs visualized on web front-ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of Open Geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS, and WPS, and results from these requests are rendered on a web front-end. In this way, the complexities of data locality and compute execution are masked from the end user. On-the-fly computation of products such as NDVI, Leaf Area Index, vegetation cover, and others from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage, along with flexible front-ends, allows the democratization of data dissemination and, we hope, better outcomes for the planet.
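
    The NDVI product mentioned above is computed as (NIR - Red) / (NIR + Red). A minimal sketch of that on-the-fly step, with synthetic arrays standing in for bands fetched via WCS, follows; it illustrates the formula, not GSKY's implementation.

        # Hedged sketch: NDVI from two reflectance bands (synthetic data).
        import numpy as np

        def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
            red, nir = red.astype("float64"), nir.astype("float64")
            denom = nir + red
            # Guard against division by zero over nodata/ocean pixels.
            return np.where(denom > 0, (nir - red) / denom, np.nan)

        red = np.array([[0.10, 0.12], [0.30, 0.05]])
        nir = np.array([[0.50, 0.55], [0.35, 0.40]])
        print(ndvi(red, nir))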

  6. Geospatial Modelling Approach for Interlinking of Rivers: A Case Study of Vamsadhara and Nagavali River Systems in Srikakulam, Andhra Pradesh

    NASA Astrophysics Data System (ADS)

    Swathi Lakshmi, A.; Saran, S.; Srivastav, S. K.; Krishna Murthy, Y. V. N.

    2014-11-01

    India is prone to several natural disasters such as floods, droughts, cyclones, landslides, and earthquakes on account of its geoclimatic conditions, but the most frequent and prominent disasters are floods and droughts. To reduce the impact of floods and droughts in India, interlinking of rivers is one of the best solutions for transferring surplus flood waters to deficit or drought-prone areas. Geospatial modelling provides a holistic approach to generating probable interlinking routes of rivers based on existing geoinformatics tools and technologies. In the present study, SRTM DEM and AWiFS datasets, coupled with land-use/land-cover, geomorphology, soil, and interpolated rainfall surface maps, have been used to identify potential routes in the geospatial domain for interlinking the Vamsadhara and Nagavali River Systems in Srikakulam district, Andhra Pradesh. The first-order derivatives are derived from the DEM, and road, railway, and drainage networks have been delineated using the satellite data. The inundation map has been prepared using AWiFS-derived Normalized Difference Water Index (NDWI). Drought-prone areas were delineated on the satellite image as per the records declared by the Revenue Department, Srikakulam. Majority Rule Based (MRB) aggregation is performed to optimize the resolution of the obtained data in order to retain the spatial variability of the classes. Analytical Hierarchy Process (AHP) based Multi-Criteria Decision Making (MCDM) is implemented to prioritize parameters such as geomorphology, soil, DEM, slope, and land use/land cover. A likelihood grid has been generated, and all the thematic layers are overlaid to identify the potential grids for routing optimization. To produce a better routing map, an impedance map has been generated and several other constraints are considered; canal construction would incur extra cost in some areas. The developed routing map is published as OGC WMS services using the open-source GeoServer, and the proposed routing service can be visualized over the Bhuvan portal (http://www.bhuvan.nrsc.gov.in/). Thus the obtained routing map of proposed canals focuses on transferring surplus waters to drought-prone areas to solve the problem of water scarcity, to properly utilize flood waters for irrigation purposes, and to help in recharging groundwater. A similar methodology can be adopted for interlinking other river systems.
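
    The AHP weighting step used above can be illustrated compactly: priorities are the principal eigenvector of a pairwise-comparison matrix, with a consistency index as a sanity check. The 3x3 matrix below (e.g., slope vs. land use vs. soil) is hypothetical, not the study's.

        # Hedged sketch of AHP priority weights via the principal eigenvector.
        # The pairwise-comparison matrix is hypothetical.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)               # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                              # normalized priority weights
        ci = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
        print("weights:", w.round(3), "CI:", round(ci, 3))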

  7. Familiarity Vs Trust: A Comparative Study of Domain Scientists' Trust in Visual Analytics and Conventional Analysis Methods.

    PubMed

    Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel

    2017-01-01

    Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question is whether, when domain experts analyze their data, they can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative visual analytics system for biologists, focusing on the relationship between the familiarity of an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluating the system, we present the results of a controlled user study with 34 biologists in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists show an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.

  8. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets (in particular, massively sized datasets) has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics.
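
    The cascading strategy can be sketched as a server-side generator: for the tile being viewed, emit KML NetworkLinks whose Regions trigger requests for the four higher-LOD children once the viewer zooms in far enough. The URL, the simple global tiling scheme, and the 128-pixel Lod threshold below are illustrative assumptions, not the article's actual values.

        # Hedged sketch of server-generated cascading KML for quadtree LOD.
        # Tiling scheme, URL, and Lod threshold are illustrative.
        def tile_bbox(level: int, x: int, y: int):
            """Bounding box of a tile in a simple global quadtree (degrees)."""
            dlon, dlat = 360.0 / 2 ** level, 180.0 / 2 ** level
            west, south = -180.0 + x * dlon, -90.0 + y * dlat
            return west, south, west + dlon, south + dlat

        def child_links(level: int, x: int, y: int, base: str) -> str:
            """KML NetworkLinks for the four children of tile (level, x, y)."""
            links = []
            for dx in (0, 1):
                for dy in (0, 1):
                    cl, cx, cy = level + 1, 2 * x + dx, 2 * y + dy
                    w, s, e, n = tile_bbox(cl, cx, cy)
                    links.append(
                        f"<NetworkLink><Region>"
                        f"<LatLonAltBox><north>{n}</north><south>{s}</south>"
                        f"<east>{e}</east><west>{w}</west></LatLonAltBox>"
                        f"<Lod><minLodPixels>128</minLodPixels></Lod></Region>"
                        f"<Link><href>{base}/tile?l={cl}&amp;x={cx}&amp;y={cy}</href>"
                        f"<viewRefreshMode>onRegion</viewRefreshMode></Link>"
                        f"</NetworkLink>")
            return "<Folder>" + "".join(links) + "</Folder>"

        print(child_links(3, 5, 2, "https://example.org/kml"))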

  9. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, and Multicore offer the functionality needed to solve important problems in the Earth Science domain: storage, distribution, management, processing, and security of geospatial data; execution of complex processing through task and data parallelism; etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing, and visualizing the geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC), and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyses the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project in different use cases, such as the execution of geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing is still a paradigm with critical problems to be solved despite great efforts and investments. Cloud computing comes as a new way of delivering resources while using a large set of old as well as new technologies and tools to provide the necessary functionality. The main challenges in Cloud computing, most of them identified also in the Open Cloud Manifesto 2009, concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different geospatial applications on different parallel and distributed architectures such as Grid, Cloud, and Multicore, with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost, etc. The execution redirection to a selected architecture is realized through a specialized component and has the purpose of offering a flexible way of achieving the best performance under the existing constraints.
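
    The "specialized component" that redirects execution could, in the simplest reading, be a rule-based dispatcher over application characteristics and user requirements. The sketch below is a hypothetical illustration of that idea, not the enviroGRIDS component itself; the thresholds and backend names are invented.

        # Hedged sketch of a rule-based backend dispatcher (hypothetical).
        def select_backend(data_gb: float, parallel_tasks: int,
                           deadline_min: float) -> str:
            if data_gb < 1 and parallel_tasks <= 4:
                return "multicore"   # small jobs stay on the local machine
            if deadline_min < 30:
                return "cloud"       # elastic capacity for tight deadlines
            return "grid"            # large batch workloads

        print(select_backend(data_gb=50, parallel_tasks=64, deadline_min=120))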

  10. RE Data Explorer: Informing Variable Renewable Energy Grid Integration for Low Emission Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    The RE Data Explorer, developed by the National Renewable Energy Laboratory, is an innovative web-based analysis tool that utilizes geospatial and spatiotemporal renewable energy data to visualize, execute, and support analysis of renewable energy potential under various user-defined scenarios. This analysis can inform high-level prospecting, integrated planning, and policy making to enable low emission development.

  11. SERVIR: Environmental Decision Making in the Americas

    NASA Technical Reports Server (NTRS)

    Lapenta, William; Irwin, Dan

    2008-01-01

    SERVIR is a regional visualization and monitoring system for Mesoamerica that integrates satellite and other geospatial data for improved scientific knowledge and decision making by managers, researchers, students, and the general public. SERVIR addresses the nine societal benefit areas of the Global Earth Observation System of Systems (GEOSS). This talk will provide an overview of products and services available through SERVIR.

  12. The Learning Benefits of Using Eye Trackers to Enhance the Geospatial Abilities of Elementary School Students

    ERIC Educational Resources Information Center

    Wang, Hsiao-shen; Chen, Yi-Ting; Lin, Chih-Hung

    2014-01-01

    In this study, we examined the spatial abilities of students using eye-movement tracking devices to identify and analyze their characteristics. For this research, 12 students aged 11-12 years participated as novices and 4 mathematics students participated as experts. A comparison of the visual-spatial abilities of each group showed key factors of…

  13. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high-resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open-source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability. Figure 1: Isosurfaces representing the evolution of the shoreline and the z = 4.5 m contour between 1997 and 2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
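
    A hedged sketch of the kind of GRASS GIS workflow outlined above, run inside a GRASS session: bin each survey's lidar points into a DEM raster, then stack the per-epoch DEMs into a voxel (3D raster) model whose vertical axis is time. The map and file names are hypothetical and module options abbreviated; the project's actual scripts are not reproduced here.

        # Hedged sketch; requires a running GRASS GIS session.
        import grass.script as gs

        years = [1997, 1999, 2005, 2011]
        dems = []
        for y in years:
            dem = f"dem_{y}"
            # Mean-elevation binning of the year's lidar points (hypothetical files).
            gs.run_command("r.in.lidar", input=f"coast_{y}.las",
                           output=dem, method="mean", flags="o")
            dems.append(dem)

        # Stack per-year DEMs into a 3D raster: z-axis = survey epoch.
        gs.run_command("r.to.rast3", input=",".join(dems),
                       output="coast_evolution")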

  14. Teaching And Learning Tectonics With Web-GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Sahagian, D. L.; Bodzin, A.; Teletzke, A. L.; Rutzmoser, S.; Cirucci, L.; Bressler, D.; Burrows, J. E.

    2012-12-01

    Tectonics is a new curriculum enhancement consisting of six Web GIS investigations designed to augment a traditional middle school Earth science curriculum. The investigations are aligned to Disciplinary Core Ideas: Earth and Space Science from the National Research Council's (2012) Framework for K-12 Science Education and to tectonics benchmark ideas articulated in the AAAS Project 2061 (2007) Atlas of Science Literacy. The curriculum emphasizes geospatial thinking and scientific inquiry and consists of the following modules: Geohazards: which plate boundary is closest to me?; How do we recognize plate boundaries?; How does thermal energy move around the Earth?; What happens when plates diverge?; What happens when plates move sideways past each other?; and What happens when plates collide? The Web GIS interface uses JavaScript for simplicity, intuition, and convenience of implementation on a variety of platforms, making it easier for diverse middle school learners and their teachers to conduct authentic Earth science investigations, including multidisciplinary visualization, analysis, and synthesis of data. Instructional adaptations allow students who are English language learners, have disabilities, or are reluctant readers to perform advanced desktop GIS functions, including spatial analysis, map visualization, and query. The Web GIS interface integrates graphics, multimedia, and animation, in addition to newly developed features that allow users to explore and discover geospatial patterns that would not be easily visible using typical classroom instructional materials. The Tectonics curriculum uses a spatial learning design model that incorporates a related set of frameworks and design principles. The framework builds on the work of other successful technology-integrated curriculum projects and includes alignment of materials and assessments with learning goals, casting key ideas in real-world problems, engaging students in scientific practices that foster the use of key ideas, use of geospatial technology, and support for teachers in adopting and implementing GIS and inquiry-based activities.

  15. How bicycle level of traffic stress correlate with reported cyclist accidents injury severities: A geospatial and mixed logit analysis.

    PubMed

    Chen, Chen; Anderson, Jason C; Wang, Haizhong; Wang, Yinhai; Vogt, Rachel; Hernandez, Salvador

    2017-11-01

    Transportation agencies need efficient methods to determine how to reduce bicycle accidents while promoting cycling activities and prioritizing safety improvement investments. Many studies have used standalone methods, such as level of traffic stress (LTS) and bicycle level of service (BLOS), to better understand bicycle mode share and network connectivity for a region. However, in most cases, other studies rely on crash severity models to explain what variables contribute to the severity of bicycle related crashes. This research uniquely correlates bicycle LTS with reported bicycle crash locations for four cities in New Hampshire through geospatial mapping. LTS measurements and crash locations are compared visually using a GIS framework. Next, a bicycle injury severity model, that incorporates LTS measurements, is created through a mixed logit modeling framework. Results of the visual analysis show some geospatial correlation between higher LTS roads and "Injury" type bicycle crashes. It was determined, statistically, that LTS has an effect on the severity level of bicycle crashes and high LTS can have varying effects on severity outcome. However, it is recommended that further analyses be conducted to better understand the statistical significance and effect of LTS on injury severity. As such, this research will validate the use of LTS as a proxy for safety risk regardless of the recorded bicycle crash history. This research will help identify the clustering patterns of bicycle crashes on high-risk corridors and, therefore, assist with bicycle route planning and policy making. This paper also suggests low-cost countermeasures or treatments that can be implemented to address high-risk areas. Specifically, with the goal of providing safer routes for cyclists, such countermeasures or treatments have the potential to substantially reduce the number of fatalities and severe injuries.

  16. Mapping longitudinal scientific progress, collaboration and impact of the Alzheimer's disease neuroimaging initiative.

    PubMed

    Yao, Xiaohui; Yan, Jingwen; Ginda, Michael; Börner, Katy; Saykin, Andrew J; Shen, Li

    2017-01-01

    Alzheimer's disease neuroimaging initiative (ADNI) is a landmark imaging and omics study in AD. ADNI research literature has increased substantially over the past decade, which poses challenges for effectively communicating information about the results and impact of ADNI-related studies. In this work, we employed advanced information visualization techniques to perform a comprehensive and systematic mapping of ADNI scientific growth and impact over a period of 12 years. Citation information for ADNI-related publications from 01/01/2003 to 05/12/2015 was downloaded from the Scopus database. Five fields, including authors, years, affiliations, sources (journals), and keywords, were extracted and preprocessed. Statistical analyses were performed on basic publication data as well as journal and citation information. Science mapping workflows were conducted using the Science of Science (Sci2) Tool to generate geospatial, topical, and collaboration visualizations at the micro (individual) to macro (global) levels, such as geospatial layouts of institutional collaboration networks, keyword co-occurrence networks, and author collaboration networks evolving over time. During the studied period, 996 ADNI manuscripts were published across 233 journals and conference proceedings. The number of publications grew linearly from 2008 to 2015, as did the number of involved institutions. ADNI publications received many more citations than typical papers from the same set of journals. Collaborations were visualized at multiple levels, including authors, institutions, and research areas. The evolution of key ADNI research topics was also plotted over the studied period. Both the statistical and visualization results demonstrate the increasing attention to ADNI research, the strong citation impact of ADNI publications, the expanding collaboration networks among researchers, institutions, and ADNI core areas, and the dynamic evolution of ADNI research topics. The visualizations presented here can help improve daily decision making based on a deep understanding of existing patterns and trends, using proven and replicable data analysis and visualization methods. They have great potential to provide new insights and actionable knowledge for helping translational research in AD.
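
    One of the mapping steps described, the keyword co-occurrence network, is easy to illustrate outside the Sci2 Tool. The sketch below uses networkx as a stand-in, with hypothetical keyword sets in place of the Scopus records.

        # Hedged sketch of a keyword co-occurrence network (networkx stand-in
        # for the Sci2 workflow); keyword sets are hypothetical.
        import itertools
        import networkx as nx

        papers = [
            {"amyloid", "MRI", "biomarkers"},
            {"MRI", "hippocampus", "biomarkers"},
            {"amyloid", "PET"},
        ]

        G = nx.Graph()
        for kw in papers:
            for a, b in itertools.combinations(sorted(kw), 2):
                # Edge weight counts how often two keywords co-occur.
                w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                G.add_edge(a, b, weight=w + 1)

        print(G.edges(data=True))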

  17. Mapping longitudinal scientific progress, collaboration and impact of the Alzheimer’s disease neuroimaging initiative

    PubMed Central

    Yao, Xiaohui; Yan, Jingwen; Ginda, Michael; Börner, Katy; Saykin, Andrew J.

    2017-01-01

    Background Alzheimer’s disease neuroimaging initiative (ADNI) is a landmark imaging and omics study in AD. ADNI research literature has increased substantially over the past decade, which poses challenges for effectively communicating information about the results and impact of ADNI-related studies. In this work, we employed advanced information visualization techniques to perform a comprehensive and systematic mapping of ADNI scientific growth and impact over a period of 12 years. Methods Citation information for ADNI-related publications from 01/01/2003 to 05/12/2015 was downloaded from the Scopus database. Five fields, including authors, years, affiliations, sources (journals), and keywords, were extracted and preprocessed. Statistical analyses were performed on basic publication data as well as journal and citation information. Science mapping workflows were conducted using the Science of Science (Sci2) Tool to generate geospatial, topical, and collaboration visualizations at the micro (individual) to macro (global) levels, such as geospatial layouts of institutional collaboration networks, keyword co-occurrence networks, and author collaboration networks evolving over time. Results During the studied period, 996 ADNI manuscripts were published across 233 journals and conference proceedings. The number of publications grew linearly from 2008 to 2015, as did the number of involved institutions. ADNI publications received many more citations than typical papers from the same set of journals. Collaborations were visualized at multiple levels, including authors, institutions, and research areas. The evolution of key ADNI research topics was also plotted over the studied period. Conclusions Both the statistical and visualization results demonstrate the increasing attention to ADNI research, the strong citation impact of ADNI publications, the expanding collaboration networks among researchers, institutions, and ADNI core areas, and the dynamic evolution of ADNI research topics. The visualizations presented here can help improve daily decision making based on a deep understanding of existing patterns and trends, using proven and replicable data analysis and visualization methods. They have great potential to provide new insights and actionable knowledge for helping translational research in AD. PMID:29095836

  18. Multimedia Analysis plus Visual Analytics = Multimedia Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinchor, Nancy; Thomas, James J.; Wong, Pak C.

    2010-10-01

    Multimedia analysis has focused on images, video, and to some extent audio, and has made progress in single channels, excluding text. Visual analytics has focused on the user's interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is how to combine multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.

  19. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  20. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  1. Visual Analytics 101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  2. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  3. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  4. The National Map: New Viewer, Services, and Data Download

    USGS Publications Warehouse

    Dollison, Robert M.

    2010-01-01

    Managed by the U.S. Geological Survey's (USGS) National Geospatial Program, The National Map has transitioned data assets and viewer applications to a new visualization and product and service delivery environment, which includes an improved viewing platform, base map data and overlay services, and an integrated data download service. This new viewing solution expands upon the National Geospatial-Intelligence Agency (NGA) Palanterra X3 viewer, providing a solid technology foundation for navigation and basic Web mapping functionality. Building upon the NGA viewer allows The National Map to focus on improving data services, functions, and data download capabilities. Initially released to the public at the 125th anniversary of mapping in the USGS on December 3, 2009, the viewer and services are now the primary distribution point for The National Map data. The National Map Viewer: http://viewer.nationalmap.gov

  5. Spatial epidemiology in zoonotic parasitic diseases: insights gained at the 1st International Symposium on Geospatial Health in Lijiang, China, 2007

    PubMed Central

    Zhou, Xiao-Nong; Lv, Shan; Yang, Guo-Jing; Kristensen, Thomas K; Bergquist, N Robert; Utzinger, Jürg; Malone, John B

    2009-01-01

    The 1st International Symposium on Geospatial Health was convened in Lijiang, Yunnan province, People's Republic of China from 8 to 9 September, 2007. The objective was to review progress made with the application of spatial techniques to zoonotic parasitic diseases, particularly in Southeast Asia. The symposium featured 71 presentations covering soil-transmitted and water-borne helminth infections, as well as arthropod-borne diseases such as leishmaniasis, malaria and lymphatic filariasis. The work made public on this occasion is briefly summarized here to highlight the advances made and to put forth research priorities in this area. Approaches such as geographical information systems (GIS), global positioning systems (GPS) and remote sensing (RS), including spatial statistics, web-based GIS and map visualization of field investigations, figured prominently in the presentations. PMID:19193214

  6. Improving the Slum Planning Through Geospatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Shekhar, S.

    2014-11-01

    In India, a number of schemes and programmes have been launched from time to time in order to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have only been partially successful in dealing with these problems. The study of existing policies and programmes also proved that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than of participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display, tabular reporting capabilities and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake a process of decision research - producing a large number of possible decision alternatives - and provide opportunities to involve the community in decision making. The objective is to help decision makers and planners find solutions through a quantitative spatial evaluation and verification process. The study investigates the options for slum development in the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules for realizing the GSDSS were developed using the ArcGIS and CommunityViz software for Gulbarga city.

  7. Cyberspatial mechanics.

    PubMed

    Bayne, Jay S

    2008-06-01

    In support of a generalization of systems theory, this paper introduces a new approach to modeling complex distributed systems. It offers an analytic framework for describing the behavior of interactive cyberphysical systems (CPSs), which are networked stationary or mobile information systems responsible for the real-time governance of physical processes whose behaviors unfold in cyberspace. The framework is predicated on a cyberspace-time reference model comprising three spatial dimensions plus time. The spatial domains include geospatial, infospatial, and sociospatial references, the latter describing relationships among sovereign enterprises (rational agents) that choose voluntarily to organize and interoperate for individual and mutual benefit through geospatial (physical) and infospatial (logical) transactions. Of particular relevance to CPSs are notions of timeliness and value, particularly as they relate to the real-time governance of physical processes and engagements with other cooperating CPSs. Our overarching interest, as with celestial mechanics, is in the formation and evolution of clusters of cyberspatial objects and the federated systems they form.

  8. Changes in Visual/Spatial and Analytic Strategy Use in Organic Chemistry with the Development of Expertise

    ERIC Educational Resources Information Center

    Vlacholia, Maria; Vosniadou, Stella; Roussos, Petros; Salta, Katerina; Kazi, Smaragda; Sigalas, Michael; Tzougraki, Chryssa

    2017-01-01

    We present two studies that investigated the adoption of visual/spatial and analytic strategies by individuals at different levels of expertise in the area of organic chemistry, using the Visual Analytic Chemistry Task (VACT). The VACT allows the direct detection of analytic strategy use without drawing inferences about underlying mental…

  9. Distributed Research Center for Analysis of Regional Climatic Changes and Their Impacts on Environment

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.

    2016-12-01

    Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, exporting data processing results through WMS and WFS services will make them interoperable with other systems. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
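
    The abstract notes that processing results are exported through WMS and WFS services for interoperability. As a minimal sketch of what consuming such a service can look like from Python, the snippet below fetches a map image with OWSLib; the endpoint URL and layer name are placeholders, since the DRC service address is not given in the abstract.

        from owslib.wms import WebMapService

        # Placeholder endpoint and layer; the actual DRC service is not named here.
        wms = WebMapService("http://example.org/drc/wms", version="1.1.1")

        img = wms.getmap(
            layers=["air_temperature_anomaly"],  # hypothetical layer name
            srs="EPSG:4326",
            bbox=(60.0, 50.0, 180.0, 80.0),      # lon/lat window over Northern Eurasia
            size=(800, 400),
            format="image/png",
            transparent=True,
        )
        with open("anomaly.png", "wb") as f:
            f.write(img.read())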

  10. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging, as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed further in the section on user-centered evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI who are interested in visual analytics will find this information useful as well, along with a discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and of analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term “users” in almost all situations, there is a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  11. The case for visual analytics of arsenic concentrations in foods.

    PubMed

    Johnson, Matilda O; Cohly, Hari H P; Isokpehi, Raphael D; Awofolu, Omotayo R

    2010-05-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic-contaminated water may result in manifestations of toxicity in all systems of the body. Visual analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary across foods, making it impractical to set a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species.

  12. The Case for Visual Analytics of Arsenic Concentrations in Foods

    PubMed Central

    Johnson, Matilda O.; Cohly, Hari H.P.; Isokpehi, Raphael D.; Awofolu, Omotayo R.

    2010-01-01

    Arsenic is a naturally occurring toxic metal and its presence in food could be a potential risk to the health of both humans and animals. Prolonged ingestion of arsenic-contaminated water may result in manifestations of toxicity in all systems of the body. Visual analytics is a multidisciplinary field that is defined as the science of analytical reasoning facilitated by interactive visual interfaces. The concentrations of arsenic vary across foods, making it impractical to set a regulatory limit for each food. This review article presents a case for the use of visual analytics approaches to provide comparative assessment of arsenic in various foods. The topics covered include (i) metabolism of arsenic in the human body; (ii) arsenic concentrations in various foods; (iii) factors affecting arsenic uptake in plants; (iv) introduction to visual analytics; and (v) benefits of visual analytics for comparative assessment of arsenic concentration in foods. Visual analytics can provide an information superstructure of arsenic in various foods to permit insightful comparative risk assessment of the diverse and continually expanding data on arsenic in food groups in the context of country of study or origin, year of study, method of analysis and arsenic species. PMID:20623005

  13. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
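
    The abstract describes parallelizing the text processing engine so that it scales nearly linearly across a cluster. A toy map/reduce-style sketch of that idea, using Python multiprocessing on a single machine, is shown below; it is illustrative only and not PNNL's engine.

        from collections import Counter
        from multiprocessing import Pool

        def term_frequencies(docs):
            """Count terms in one shard of documents."""
            counts = Counter()
            for doc in docs:
                counts.update(doc.lower().split())
            return counts

        def parallel_tf(all_docs, workers=4):
            # Split the corpus into one shard per worker, count in parallel,
            # then merge the partial counts - the same map/reduce shape that
            # yields near-linear scaling on a cluster.
            shards = [all_docs[i::workers] for i in range(workers)]
            with Pool(workers) as pool:
                partials = pool.map(term_frequencies, shards)
            total = Counter()
            for p in partials:
                total.update(p)
            return total

        if __name__ == "__main__":
            corpus = ["visual analytics of text", "massive text collections"] * 1000
            print(parallel_tf(corpus).most_common(3))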

  14. The Diverse Data, User Driven Services and the Power of Giovanni at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Shen, Suhung

    2017-01-01

    This presentation provides an overview of remote sensing and model data at the GES (Goddard Earth Sciences) DISC (Data and Information Services Center); an overview of data services at the GES DISC (registration with the NASA data system; searching and downloading data); Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), an online data exploration tool; and the NASA Earth Data and Information System.

  15. Beyond Control Panels: Direct Manipulation for Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Bradel, Lauren; North, Chris

    2013-07-19

    Information visualization strives to provide visual representations through which users can think about and gain insight into information. By leveraging the visual and cognitive systems of humans, complex relationships and phenomena occurring within datasets can be uncovered by exploring information visually. Interaction metaphors for such visualizations are designed to give users direct control over the filters, queries, and other parameters controlling how the data is visually represented. Through the evolution of information visualization, more complex mathematical and data analytic models are being used to visualize relationships and patterns in data, creating the field of visual analytics. However, the expectations for how users interact with these visualizations have remained largely unchanged, focused primarily on the direct manipulation of parameters of the underlying mathematical models. In this article we present an opportunity to evolve the methodology for user interaction from the direct manipulation of parameters through visual control panels to interactions designed specifically for visual analytic systems. Instead of focusing on traditional direct manipulation of mathematical parameters, the evolution of the field can be realized through direct manipulation within the visual representation, where users can not only gain insight but also interact. This article describes future directions and research challenges that fundamentally change the meaning of direct manipulation with regard to visual analytics, advancing the science of interaction.

  16. Description of Existing Data for Integrated Landscape Monitoring in the Puget Sound Basin, Washington

    USGS Publications Warehouse

    Aiello, Danielle P.; Torregrosa, Alicia; Jason, Allyson L.; Fuentes, Tracy L.; Josberger, Edward G.

    2008-01-01

    This report summarizes existing geospatial data and monitoring programs for the Puget Sound Basin in northwestern Washington. This information was assembled as a preliminary data-development task for the U.S. Geological Survey (USGS) Puget Sound Integrated Landscape Monitoring (PSILM) pilot project. The PSILM project seeks to support natural resource decision-making by developing a 'whole system' approach that links ecological processes at the landscape level to the local level (Benjamin and others, 2008). Part of this effort will include building the capacity to provide cumulative information about impacts that cross jurisdictional and regulatory boundaries, such as cumulative effects of land-cover change and shoreline modification, or region-wide responses to climate change. The PSILM project study area is defined as the 23 HUC-8 (hydrologic unit code) catchments that comprise the watersheds that drain into Puget Sound and their near-shore environments. The study area includes 13 counties and more than four million people. One goal of the PSILM geospatial database is to integrate spatial data collected at multiple scales across the Puget Sound Basin marine and terrestrial landscape. The PSILM work plan specifies an iterative process that alternates between tasks associated with data development and tasks associated with research or strategy development. For example, an initial work-plan goal was to delineate the study area boundary. Geospatial data required to address this task included data from ecological regions, watersheds, jurisdictions, and other boundaries. This assemblage of data provided the basis for identifying larger research issues and delineating the study-area boundary based on these research needs. Once the study-area boundary was agreed upon, the next iteration between data development and research activities was guided by questions about data availability, data extent, data abundance, and data types. This report is not intended as an exhaustive compilation of all available geospatial data; rather, it is a collection of information about geospatial data that can be used to help answer the suite of questions posed after the study-area boundary was defined. This information will also be useful to the PSILM team for future project tasks, such as assessing monitoring gaps, exploring monitoring-design strategies, identifying and deriving landscape indicators and metrics, and visual geographic communication. The two main geospatial data types referenced in this report - base-reference layers and monitoring data - originated from numerous and varied sources. In addition to collecting information and metadata about the base-reference layers, the data themselves were also collected for project needs, such as developing maps for visual communication among team members and with outside groups. In contrast, only information about the data was typically required for the monitoring data. The information on base-reference layers and monitoring data included in this report is only as detailed as what was readily available from the sources themselves. Although this report may appear to lack consistency between data records, the varying degree of detail contained in this report is merely a reflection of varying source detail. This compilation is just a beginning. All data listed also are being catalogued in spreadsheets and knowledge-management systems. Our efforts are continual as we develop a geospatial catalog for the PSILM pilot project.

  17. Interaction Junk: User Interaction-Based Evaluation of Visual Analytic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; North, Chris

    2012-10-14

    With the growing need for visualization to aid users in understanding large, complex datasets, the ability for users to interact with and explore these datasets is critical. As visual analytic systems have advanced to leverage powerful computational models and data analytics capabilities, the modes by which users engage and interact with the information are limited. Often, users are taxed with directly manipulating parameters of these models through traditional GUIs (e.g., using sliders to directly manipulate the value of a parameter). However, the purpose of user interaction in visual analytic systems is to enable visual data exploration, where users can focus on their task as opposed to the tool or system. As a result, users can engage freely in data exploration and decision-making for the purpose of gaining insight. In this position paper, we discuss how evaluating visual analytic systems can be approached through user interaction analysis, where the goal is to minimize the cognitive translation between the visual metaphor and the mode of interaction (i.e., reducing the “interaction junk”). We motivate this concept through a discussion of traditional GUIs used in visual analytics for direct manipulation of model parameters, and the importance of designing interactions that support visual data exploration.

  18. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    NASA Astrophysics Data System (ADS)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly, and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation-supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competencies should be included in a specific course and the depth of instruction for each competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curriculum integrating geospatial science and technology into geoscience programs.

  19. Importance of the spatial data and the sensor web in the ubiquitous computing area

    NASA Astrophysics Data System (ADS)

    Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.

    2014-08-01

    Spatial data has become a critical issue in recent years. In past years, nearly three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web: a discoverable collection of geographically related web standards and key features that constitutes a global network of geospatial data and employs the World Wide Web to process textual data. In addition, the geospatial web is used to bring together spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology, and many have more than one mobile device. The considerable number of sensors and the different types of data positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily, for numerous purposes, with some level of credibility. The principal goal is to connect mobile and non-mobile devices together in the sensor web platform to serve and collect information from people.

  20. A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data

    DOE PAGES

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...

    2016-01-01

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people per 1,000 ft² for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.

    Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people per 1,000 ft² for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
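
    PDT's internal model is not specified in these abstracts, but the flavor of a Bayesian occupancy estimate can be shown with a toy conjugate Gamma-Poisson update: a prior over the occupancy rate, standing in for expert judgment, is updated by survey counts. All numbers below are illustrative assumptions, not PDT inputs or outputs.

        # Toy Gamma-Poisson update for ambient occupancy (people per 1,000 ft^2).
        # Prior from hypothetical expert judgment; observations are invented.
        prior_shape, prior_rate = 2.0, 1.0          # Gamma(shape, rate), prior mean = 2.0

        # Each observation: (people counted, floor area in units of 1,000 ft^2).
        observations = [(12, 5.0), (30, 11.0), (8, 4.5)]

        people = sum(n for n, _ in observations)
        exposure = sum(a for _, a in observations)

        # Conjugate update: posterior is Gamma(shape + counts, rate + exposure).
        post_shape = prior_shape + people
        post_rate = prior_rate + exposure

        mean = post_shape / post_rate
        print(f"Posterior mean occupancy: {mean:.2f} people per 1,000 ft^2")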

  2. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  3. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    PubMed

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  4. Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast number of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D geodata on nearly every device.
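
    Serving tiled map data at planetary scale depends on a deterministic tile-addressing scheme. The sketch below shows the standard Web Mercator ("slippy map") formula that many such pipelines use to map a coordinate to a tile index; it is a generic illustration, not OpenWebGlobe's actual code.

        import math

        def latlon_to_tile(lat_deg, lon_deg, zoom):
            """Standard Web-Mercator (slippy map) tile index for a coordinate."""
            lat = math.radians(lat_deg)
            n = 2 ** zoom
            x = int((lon_deg + 180.0) / 360.0 * n)
            y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
            return x, y

        # Tile covering Basel at zoom level 12 (illustrative coordinates).
        print(latlon_to_tile(47.56, 7.59, 12))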

  5. Digital geospatial presentation of geoelectrical and geotechnical data for the lower American River and flood plain, east Sacramento, California

    USGS Publications Warehouse

    Ball, Lyndsay B.; Burton, Bethany L.; Powers, Michael H.; Asch, Theodore H.

    2015-01-01

    To characterize the extent and thickness of lithologic units that may have differing scour potential, the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, has performed several geoelectrical surveys of the lower American River channel and flood plain between Cal Expo and the Rio Americano High School in east Sacramento, California. Additional geotechnical data have been collected by the U.S. Army Corps of Engineers and its contractors. Data resulting from these surveys have been compiled into similar database formats and converted to uniform geospatial datums and projections. These data have been visualized in a digital three-dimensional framework project that can be viewed using freely available software. These data facilitate a comprehensive analysis of the resistivity structure underlying the lower American River corridor and assist in levee system management.
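
    The compilation step described here, converting heterogeneous survey data to uniform datums and projections, is routinely scripted. A minimal sketch with pyproj follows; the EPSG codes are illustrative stand-ins (NAD83 / UTM zone 10N covers the Sacramento area), not necessarily the datums used in the report.

        from pyproj import Transformer

        # Illustrative only: reproject UTM zone 10N (NAD83) survey coordinates
        # to WGS84 longitude/latitude; the report's actual datums may differ.
        transformer = Transformer.from_crs("EPSG:26910", "EPSG:4326", always_xy=True)

        easting, northing = 630084.0, 4268736.0   # invented example coordinates
        lon, lat = transformer.transform(easting, northing)
        print(f"{lon:.6f}, {lat:.6f}")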

  6. VegScape: U.S. Crop Condition Monitoring Service

    NASA Astrophysics Data System (ADS)

    Mueller, R.; Yang, Z.; Di, L.

    2013-12-01

    Since 1995, the US Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public, published weekly during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late season crops. With growing emphasis on more extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new vegetation condition portal, named VegScape, was initiated at the start of the 2013 growing season. VegScape delivers interactive vegetation indices based on web mapping services. Users can use an interactive map to explore, query and disseminate current crop conditions. Vegetation indices like the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), and mean, median, and ratio comparisons to prior years can be constructed for analytical purposes and on-demand crop statistics. The NASA MODIS instrument, with 250-meter (15-acre) resolution and thirteen years of data history, provides improved spatial and temporal resolutions and delivers detailed, timely (i.e., daily), crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape delivers an agricultural cultivated crop mask and the most recent Cropland Data Layer (CDL) product to exploit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be directly exported to Google Earth for web mashups or delivered via web mapping services for use in other applications. VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. (Figure: VegScape Ratio to Median NDVI.)
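
    For reference, the two indices named above have simple closed forms: NDVI = (NIR - Red)/(NIR + Red), and the VCI rescales the current NDVI against its multi-year minimum and maximum for the same location and period. A small numpy sketch, with invented reflectance values, follows.

        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index."""
            return (nir - red) / (nir + red)

        def vci(ndvi_now, ndvi_min, ndvi_max):
            """Vegetation Condition Index: current NDVI scaled by the
            multi-year min/max for the same pixel and period, in percent."""
            return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

        # Invented near-infrared and red reflectances for two pixels.
        nir = np.array([0.45, 0.60])
        red = np.array([0.10, 0.08])
        now = ndvi(nir, red)
        print(vci(now, ndvi_min=0.2, ndvi_max=0.8))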

  7. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussion is not meant to be inclusive of all differences in laws between states and countries.

  8. Visualization Development of the Ballistic Threat Geospatial Optimization

    DTIC Science & Technology

    2015-07-01

    topographic globes, Keyhole Markup Language (KML), and Collada files. World Wind gives the user the ability to import 3-D models and navigate...present. After the first-person view window is closed, the images stored in memory are then converted to a QuickTime movie (.MOV). The video will be...

  9. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    EPA Pesticide Factsheets

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil and Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available: AGWA 1.5 for

  10. Multi-class geospatial object detection and geographic image classification based on collection of part detectors

    NASA Astrophysics Data System (ADS)

    Cheng, Gong; Han, Junwei; Zhou, Peicheng; Guo, Lei

    2014-12-01

    The rapid development of remote sensing technology has facilitated the acquisition of remote sensing images with higher and higher spatial resolution, but automatically understanding the image contents is still a big challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used for the detection of objects or recurring spatial patterns within a certain range of orientation. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors where each part detector corresponds to a particular viewpoint of an object class, so the collection of them provides a solution for rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts from images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with some state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework.
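
    Each part detector in COPD is a linear SVM over patch descriptors. The toy scikit-learn sketch below trains one such detector on random features standing in for image-patch descriptors; it illustrates the "one linear SVM per part and orientation range" idea only, and is not the authors' pipeline.

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(1)

        # Stand-ins for HOG-like descriptors of image patches (invented data):
        # positives are patches of one object part within one orientation range.
        pos = rng.normal(loc=1.0, size=(200, 128))
        neg = rng.normal(loc=0.0, size=(600, 128))
        X = np.vstack([pos, neg])
        y = np.array([1] * len(pos) + [0] * len(neg))

        # One linear SVM = one part detector; a collection of these, trained
        # per viewpoint, is what yields rotation-invariant detection.
        detector = LinearSVC(C=1.0).fit(X, y)
        scores = detector.decision_function(rng.normal(loc=1.0, size=(5, 128)))
        print(scores)  # high scores indicate the part is present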

  11. Measuring the Carolina Bays Using Archetype Template Overlays on the Google Earth Virtual Globe; Planform Metrics for 25,000 Bays Extracted from LiDAR and Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Davias, M. E.; Gilbride, J. L.

    2011-12-01

    Aerial photographs of Carolina bays taken in the 1930s sparked the initial research into their geomorphology. Satellite imagery available today through the Google Earth virtual globe facility expands the regions available for interrogation, but reveals only part of their unique planforms. Digital Elevation Maps (DEMs), using Light Detection And Ranging (LiDAR) remote sensing data, accentuate the visual presentation of these aligned ovoid shallow basins by emphasizing their robust circumferential rims. To support a geospatial survey of Carolina bay landforms in the continental USA, 400,000 km2 of HSV-shaded DEMs were created as KML-JPEG tile sets. A majority of these DEMs were generated with LiDAR-derived data. We demonstrate the tile generation process and their integration into Google Earth, where the DEMs augment available photographic imagery for the visualization of bay planforms. While the generic Carolina bay planform is considered oval, we document subtle regional variations. Using a small set of empirically derived planform shapes, we created corresponding Google Earth overlay templates. We demonstrate the analysis of an individual Carolina bay by placing an appropriate overlay onto the virtual globe, then orienting, sizing and rotating it by edit handles such that it satisfactorily represents the bay's rim. The resulting overlay data element is extracted from Google Earth's object directory and programmatically processed to generate metrics such as geographic location, elevation, major and minor axes and inferred orientation. Utilizing a virtual globe facility for data capture may result in higher-quality data compared to methods that reference flat maps, where the geospatial shape and orientation of the bays could be skewed and distorted in the orthographic projection process. Using the methodology described, we have measured over 25k distinct Carolina bays. We discuss the Google Fusion geospatial data repository facility, through which these data have been assembled and made web-accessible to other researchers. Preliminary findings from the survey are discussed, such as how bay surface area, eccentricity and orientation vary across ~800 1/4° × 1/4° grid elements. Future work includes measuring 25k additional bays, as well as interrogation of the orientation data to identify any possible systematic geospatial relationships.
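
    Once an overlay's major axis, minor axis and rotation are read out of the Google Earth object directory, the planform metrics follow from elementary ellipse geometry. A sketch of that post-processing step is below; the function and field names are hypothetical, and the input values are invented.

        import math

        def bay_metrics(major_m, minor_m, rotation_deg):
            """Planform metrics for one fitted bay overlay (axes in metres)."""
            a, b = major_m / 2.0, minor_m / 2.0     # semi-axes
            eccentricity = math.sqrt(1.0 - (b / a) ** 2)
            area_km2 = math.pi * a * b / 1e6
            bearing = rotation_deg % 180.0          # orientation is axial (0-180 deg)
            return {"eccentricity": eccentricity,
                    "area_km2": area_km2,
                    "orientation_deg": bearing}

        print(bay_metrics(major_m=1200.0, minor_m=900.0, rotation_deg=145.0))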

  12. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
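
    The RESTful interface is the decoupling point between the browser and the HPC back end. A minimal Flask sketch of such an endpoint is shown below; the route, parameter names, and stand-in diagnostic function are all invented for illustration, not the authors' API.

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        # Invented stand-in for a back-end diagnostic over CLM/CAM output.
        def run_diagnostic(variable, region):
            # Toy placeholder values, clearly not real model output.
            return {"variable": variable, "region": region,
                    "annual_means": [287.1, 287.3, 287.6]}

        @app.route("/api/diagnostics/<variable>")
        def diagnostics(variable):
            # The RESTful layer keeps the Web client thin: the browser only
            # ever sees small, aggregated JSON responses, never raw model data.
            region = request.args.get("region", "global")
            return jsonify(run_diagnostic(variable, region))

        if __name__ == "__main__":
            app.run(port=8080)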

  13. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for nonproliferation and arms control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of arms control and nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, which mine for known patterns. Also, with a human in the loop, visual analytics tools can bring in domain knowledge and subject matter expertise. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the arms control and nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  14. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
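
    Both case studies compare a parallel-coordinates visual analytics workflow against a machine-learning baseline that builds decision trees with information theory. As a reminder of what that baseline optimizes, here is a short worked sketch of information gain, the entropy reduction a split achieves, computed over toy labels.

        import math
        from collections import Counter

        def entropy(labels):
            """Shannon entropy of a list of class labels, in bits."""
            n = len(labels)
            return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

        def information_gain(labels, groups):
            """Entropy reduction from splitting `labels` into `groups`."""
            n = len(labels)
            remainder = sum(len(g) / n * entropy(g) for g in groups)
            return entropy(labels) - remainder

        # Toy split of 10 class labels into two branches.
        labels = list("AAAABBBBBB")
        left, right = list("AAAB"), list("ABBBBB")
        print(information_gain(labels, [left, right]))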

  15. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
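
    The architecture distinguishes analytic scopes such as incremental, rolling-window, and global. A minimal sketch of the rolling-window case, the kind of incremental aggregate such a pipeline recomputes as each item arrives, is shown below; it is generic, not the authors' reference implementation.

        from collections import deque

        class RollingMean:
            """Rolling-window aggregate of the kind a streaming
            visual analytics pipeline recomputes on every arrival."""
            def __init__(self, window):
                self.window = window
                self.values = deque()
                self.total = 0.0

            def push(self, x):
                # Add the new value, evict the oldest once the window is full,
                # and return the current windowed mean in O(1) time.
                self.values.append(x)
                self.total += x
                if len(self.values) > self.window:
                    self.total -= self.values.popleft()
                return self.total / len(self.values)

        stream = [3, 5, 4, 8, 10, 2]
        rm = RollingMean(window=3)
        print([round(rm.push(x), 2) for x in stream])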

  16. Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies

    DTIC Science & Technology

    2011-10-01

    is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System...and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and...analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies

  17. WEB-GIS Decision Support System for CO2 storage

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela

    2013-04-01

    The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess and disseminate some types of information that are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and richer in options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly, web-based application that uses GIS capabilities. MDSS can be used by experts in CO2 injection and storage in deep saline aquifers. The main objective of the MDSS is to help experts make decisions based on large volumes of structured data and information. To achieve this objective, the MDSS has a geospatial, object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) Open source - the entire platform (MDSS) is based on open-source technologies: (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, (5) add-ons, etc. (ii) Multiple database connections - MDSS is capable of connecting to different databases located on different server machines. (iii) Desktop user experience - the MDSS architecture and design follow the structure of desktop software. (iv) Communication - the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify and download data within the application. The architecture of the system involves one database and a modular application composed of (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The database component is built using the PostgreSQL and PostGIS open-source technologies. The visualization module allows the user to view data for CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, (3) 3D visualization. The analysis module will allow the user to perform analyses such as injectivity, containment and capacity analysis. The risk assessment module focuses on the site risk matrix approach. The guidelines module contains guideline methodologies for CO2 injection and storage in deep saline aquifers.
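
    As a concrete illustration of the kind of retrieval a PostGIS-backed visualization module might issue, the following minimal Python sketch queries injection sites within a radius of a point. The connection settings and the `injection_sites` table and columns are hypothetical, not from the MDSS:

```python
import psycopg2  # PostgreSQL driver; PostGIS must be enabled in the database

# Hypothetical connection and schema, for illustration only.
conn = psycopg2.connect(host="localhost", dbname="mdss", user="mdss_user")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT site_name, ST_AsGeoJSON(geom)
        FROM injection_sites
        WHERE ST_DWithin(geom::geography,
                         ST_MakePoint(%s, %s)::geography,
                         %s)  -- radius in metres
        """,
        (25.0, 45.0, 50000),
    )
    for name, geojson in cur.fetchall():
        print(name, geojson)  # feed into the geospatial visualization layer
```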

  18. Model My Watershed: A high-performance cloud application for public engagement, watershed modeling and conservation decision support

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.

    2017-12-01

    The Model My Watershed Web app (https://app.wikiwatershed.org/) and the BiG-CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization and analysis of geospatial data in an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. BiG CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrological Unit (HUC), or uploading a custom polygon. Both Web apps visualize and provide summary statistics of land use, soil groups, streams, climate and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. BiG CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, which also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.

  19. An information model for managing multi-dimensional gridded data in a GIS

    NASA Astrophysics Data System (ADS)

    Xu, H.; Abdul-Kadar, F.; Gao, P.

    2016-04-01

    Earth observation agencies like NASA and NOAA produce huge volumes of historical, near real-time, and forecasting data representing terrestrial, atmospheric, and oceanic phenomena. The data drives climatological and meteorological studies, and underpins operations ranging from weather pattern prediction and forest fire monitoring to global vegetation analysis. These gridded data sets are distributed mostly as files in HDF, GRIB, or netCDF format and quantify variables like precipitation, soil moisture, or sea surface temperature, along one or more dimensions like time and depth. Although the data cube is a well-studied model for storing and analyzing multi-dimensional data, the GIS community remains in need of a solution that simplifies interactions with the data, and elegantly fits with existing database schemas and dissemination protocols. This paper presents an information model that enables Geographic Information Systems (GIS) to efficiently catalog very large heterogeneous collections of geospatially-referenced multi-dimensional rasters—towards providing unified access to the resulting multivariate hypercubes. We show how the implementation of the model encapsulates format-specific variations and provides unified access to data along any dimension. We discuss how this framework lends itself to familiar GIS concepts like image mosaics, vector field visualization, layer animation, distributed data access via web services, and scientific computing. Global data sources like MODIS from USGS and HYCOM from NOAA illustrate how one would employ this framework for cataloging, querying, and intuitively visualizing such hypercubes. ArcGIS—an established platform for processing, analyzing, and visualizing geospatial data—serves to demonstrate how this integration brings the full power of GIS to the scientific community.
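
    The dimension-agnostic slicing described here can be illustrated with the open-source xarray library, a stand-in for the concept rather than the paper's implementation; the file and variable names below are hypothetical:

```python
import xarray as xr

# Sketch of unified, format-agnostic access to a multidimensional raster:
# one API to slice along any dimension, regardless of the on-disk layout.
# "sea_surface_temperature.nc" and the "sst" variable are hypothetical.
ds = xr.open_dataset("sea_surface_temperature.nc")

# Slice along the time dimension, then along space, without caring how the
# data are stored.
january = ds["sst"].sel(time="2016-01")
north_atlantic = january.sel(lat=slice(30, 60), lon=slice(-60, 0))
print(north_atlantic.mean().item())
```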

  20. TimeBench: a data model and software library for visual analytics of time-oriented data.

    PubMed

    Rind, Alexander; Lammarsch, Tim; Aigner, Wolfgang; Alsallakh, Bilal; Miksch, Silvia

    2013-12-01

    Time-oriented data play an essential role in many Visual Analytics scenarios such as extracting medical insights from collections of electronic health records or identifying emerging problems and vulnerabilities in network traffic. However, many software libraries for Visual Analytics treat time as a flat numerical data type and insufficiently tackle the complexity of the time domain such as calendar granularities and intervals. Therefore, developers of advanced Visual Analytics designs need to implement temporal foundations in their application code over and over again. We present TimeBench, a software library that provides foundational data structures and algorithms for time-oriented data in Visual Analytics. Its expressiveness and developer accessibility have been evaluated through application examples demonstrating a variety of challenges with time-oriented data and long-term developer studies conducted in the scope of research and student projects.

  1. a Public Platform for Geospatial Data Sharing for Disaster Risk Management

    NASA Astrophysics Data System (ADS)

    Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.

    2013-01-01

    Several studies have been conducted in Africa to assist local governments in addressing the risk situation related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations seeking to reduce the risks and mitigate the impacts of disasters. Nevertheless, this data is not widely or efficiently distributed and often resides in remote storage that is hard to reach. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making it widely available through the internet. Among these solutions, GeoNode, an open-source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface, allowing users, with little training, to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to allow access to only specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management, and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study of setting up a Web platform based on GeoNode: a public platform called MASDAP, promoted by the Government of Malawi in order to support development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and other data from future disaster risk management projects will be added as well.

  2. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. Each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead of dividing the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain, in scenarios requiring human judgment.

  3. Visualizing Uncertainty for Data Fusion Graphics: Review of Selected Literature and Industry Approaches

    DTIC Science & Technology

    2015-06-09

    anomaly detection , which is generally considered part of high level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual...manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem

  4. Visual and Analytic Strategies in Geometry

    ERIC Educational Resources Information Center

    Kospentaris, George; Vosniadou, Stella; Kazic, Smaragda; Thanou, Emilian

    2016-01-01

    We argue that there is an increasing reliance on analytic strategies compared to visuospatial strategies, which is related to geometry expertise and not to individual differences in cognitive style. A Visual/Analytic Strategy Test (VAST) was developed to investigate the use of visuo-spatial and analytic strategies in geometry in 30 mathematics…

  5. Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information

    NASA Astrophysics Data System (ADS)

    Tuxen-Bettman, K.

    2016-12-01

    Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet one of the challenges that has always faced the Earth science community is how to present its data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which rich vector and raster datasets can be viewed using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, the collaborative mapping of My Maps, and the storytelling of Tour Builder and more, making Google's Geo Applications a coherent suite of tools for exploring our planet.
    https://earth.google.com/
    https://earthengine.google.com/
    https://mymaps.google.com/
    https://tourbuilder.withgoogle.com/
    https://www.google.com/streetview/
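
    As a small illustration of the KML mechanism mentioned above, the following Python sketch uses the third-party simplekml package (the coordinates, names, and file name are examples) to write a placemark that Google Earth can open:

```python
import simplekml  # third-party package: pip install simplekml

# Minimal sketch of publishing a data point into Google Earth via KML.
kml = simplekml.Kml()
kml.newpoint(name="Sample station",
             description="Sea-surface temperature: 14.2 C",
             coords=[(-122.0839, 37.4220)])  # (longitude, latitude)
kml.save("station.kml")  # open this file in Google Earth to see the placemark
```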

  6. Geo-spatial reporting for monitoring of household immunization coverage through mobile phones: Findings from a feasibility study.

    PubMed

    Kazi, A M; Ali, M; K, Ayub; Kalimuddin, H; Zubair, K; Kazi, A N; A, Artani; Ali, S A

    2017-11-01

    The addition of a Global Positioning System (GPS) to a mobile phone makes it a very powerful tool for surveillance and for monitoring the coverage of health programs. This technology enables transfer of data directly into computer applications and cross-referencing to Geographic Information Systems (GIS) maps, which enhances assessment of coverage and trends. Utilization of these systems in low- and middle-income countries is currently limited, particularly for immunization coverage assessments and polio vaccination campaigns. We piloted the use of this system and discuss its potential to improve the efficiency of field-based health providers and health managers in monitoring the immunization program. Using the WHO "30×7" sampling technique, a survey of children less than five years of age was conducted in random clusters of Karachi, Pakistan, in three high-risk towns where a polio case was detected in 2011. The center point of each cluster was calculated by the application on the mobile phone. Data and location coordinates were collected through a mobile phone. These data were linked with an automated mHealth-based system for monitoring of Supplementary Immunization Activities (SIAs) in Karachi. After each SIA, a visual report was generated according to the coordinates collected from the survey. A total of 3535 participants consented to answer a baseline survey. We found that mobile phones integrated with GIS maps can improve the efficiency of health providers through real-time reporting and by replacing paper-based questionnaires for household-level data collection. Visual maps generated from the data and geospatial analysis can also give a better assessment of immunization coverage and polio vaccination campaigns. The study supports a model system in resource-constrained settings that allows routine capture of individual-level data through GPS-enabled mobile phones, providing actionable information and geospatial maps to local public health managers, policy makers and study staff monitoring immunization coverage. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Flexible Environmental Modeling with Python and Open - GIS

    NASA Astrophysics Data System (ADS)

    Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann

    2015-04-01

    Numerical modeling now represents a prominent task in environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical interfaces that allow efficient management of many case studies, but they suffer from a lack of flexibility, and closed-source policies impede source-code review and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for coupling various numerical models, associating, for instance, groundwater flow modeling with multi-species geochemical reactions. Researchers have produced hundreds of powerful open-source command-line programs, but a flexible graphical user interface is still needed for efficient processing of the geospatial data that accompanies any environmental study. Here, we present the advantages of using the free and open-source QGIS platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. The Python scripting language is then employed for further input data processing, calls to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces with the flexibility of the Python scripting language for data processing and model calls. The numerous Python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once input data have been prepared with the graphical user interface, models may be run thousands of times from the command line, with sequential or parallel calls. We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several Python libraries that facilitate pre- and post-processing operations.
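
    A minimal sketch of the scripted model-driving workflow described above: once inputs are prepared in the GIS, runs can be dispatched sequentially or in parallel from Python. The `groundwater_model` executable and its flags are hypothetical placeholders for whatever command-line model a study uses:

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_model(conductivity):
    """Run one model realisation and return its scalar output.

    The executable name and flag are hypothetical; substitute the
    command-line model used in a given study.
    """
    result = subprocess.run(
        ["groundwater_model", "--conductivity", str(conductivity)],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())

if __name__ == "__main__":
    # A simple parameter sweep, e.g. for a sensitivity analysis.
    parameter_sweep = [1e-5 * k for k in range(1, 101)]
    with ProcessPoolExecutor() as pool:  # parallel calls, as the paper notes
        outputs = list(pool.map(run_model, parameter_sweep))
    print(min(outputs), max(outputs))
```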

  8. Multidimensional poverty in rural Mozambique: a new metric for evaluating public health interventions.

    PubMed

    Victor, Bart; Blevins, Meridith; Green, Ann F; Ndatimana, Elisée; González-Calvo, Lázaro; Fischer, Edward F; Vergara, Alfredo E; Vermund, Sten H; Olupona, Omo; Moon, Troy D

    2014-01-01

    Poverty is a multidimensional phenomenon, and unidimensional measurements have proven inadequate to the challenge of assessing its dynamics. The dynamic between poverty and public health intervention is among the most difficult yet important problems faced in development. We sought to demonstrate how multidimensional poverty measures can be utilized in the evaluation of public health interventions, and to create geospatial maps of poverty deprivation to aid implementers in prioritizing program planning. Survey teams interviewed a representative sample of 3,749 female heads of household in 259 enumeration areas across Zambézia in August-September 2010. We estimated a multidimensional poverty index (MPI), which can be disaggregated into context-specific indicators. We produced an MPI comprising 3 dimensions and 11 weighted indicators selected from the survey. Households were identified as "poor" if they were deprived in >33% of indicators. Our MPI is an adjusted headcount, calculated by multiplying the proportion identified as poor (headcount) by the poverty gap (average deprivation). Geospatial visualizations of poverty deprivation were created as a contextual baseline for future evaluation. Among our rural (96%) and urban (4%) interviewees, the 33% deprivation cut-off suggested 58.2% of households were poor (29.3% of urban vs. 59.5% of rural). Among the poor, households experienced an average deprivation of 46%; thus the MPI/adjusted headcount is 0.27 (= 0.58 × 0.46). Of households where a local language was the primary language, 58.6% were considered poor, versus Portuguese-speaking households, where 73.5% were considered non-poor. Living standard is the dominant deprivation, followed by health and then education. Multidimensional poverty measurement can be integrated into program design for public health interventions, and geospatial visualization helps examine the impact of intervention deployment within the context of distinct poverty conditions. Both permit program implementers to focus resources and critically explore linkages between poverty and its social determinants, thus deriving useful findings for evidence-based planning.
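
    The adjusted-headcount arithmetic reported in the abstract can be verified in a few lines; the values below are taken directly from the text:

```python
# Adjusted headcount: MPI = H (proportion of households that are poor)
# multiplied by A (average share of indicators in which the poor are deprived).
H = 0.582   # headcount: 58.2% of households deprived in >33% of indicators
A = 0.46    # intensity: average deprivation among the poor
MPI = H * A
print(round(MPI, 2))  # 0.27, the adjusted headcount reported in the study
```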

  9. Multidimensional Poverty in Rural Mozambique: A New Metric for Evaluating Public Health Interventions

    PubMed Central

    Victor, Bart; Blevins, Meridith; Green, Ann F.; Ndatimana, Elisée; González-Calvo, Lázaro; Fischer, Edward F.; Vergara, Alfredo E.; Vermund, Sten H.; Olupona, Omo; Moon, Troy D.

    2014-01-01

    Background Poverty is a multidimensional phenomenon, and unidimensional measurements have proven inadequate to the challenge of assessing its dynamics. The dynamic between poverty and public health intervention is among the most difficult yet important problems faced in development. We sought to demonstrate how multidimensional poverty measures can be utilized in the evaluation of public health interventions; and to create geospatial maps of poverty deprivation to aid implementers in prioritizing program planning. Methods Survey teams interviewed a representative sample of 3,749 female heads of household in 259 enumeration areas across Zambézia in August-September 2010. We estimated a multidimensional poverty index (MPI), which can be disaggregated into context-specific indicators. We produced an MPI comprising 3 dimensions and 11 weighted indicators selected from the survey. Households were identified as “poor” if they were deprived in >33% of indicators. Our MPI is an adjusted headcount, calculated by multiplying the proportion identified as poor (headcount) by the poverty gap (average deprivation). Geospatial visualizations of poverty deprivation were created as a contextual baseline for future evaluation. Results Among our rural (96%) and urban (4%) interviewees, the 33% deprivation cut-off suggested 58.2% of households were poor (29.3% of urban vs. 59.5% of rural). Among the poor, households experienced an average deprivation of 46%; thus the MPI/adjusted headcount is 0.27 (= 0.58 × 0.46). Of households where a local language was the primary language, 58.6% were considered poor, versus Portuguese-speaking households, where 73.5% were considered non-poor. Living standard is the dominant deprivation, followed by health and then education. Conclusions Multidimensional poverty measurement can be integrated into program design for public health interventions, and geospatial visualization helps examine the impact of intervention deployment within the context of distinct poverty conditions. Both permit program implementers to focus resources and critically explore linkages between poverty and its social determinants, thus deriving useful findings for evidence-based planning. PMID:25268951

  10. Lack of habituation of evoked visual potentials in analytic information processing style: evidence in healthy subjects.

    PubMed

    Buonfiglio, Marzia; Toscano, M; Puledda, F; Avanzini, G; Di Clemente, L; Di Sabato, F; Di Piero, V

    2015-03-01

    Habituation is considered one of the most basic mechanisms of learning. A habituation deficit to several sensory stimulations has been described as a trait of the migraine brain and has also been observed in other disorders. On the other hand, an analytic information processing style is characterized by the habit of continually evaluating stimuli, and it has been associated with migraine. We investigated a possible correlation between lack of habituation of visual evoked potentials and analytic cognitive style in healthy subjects. According to the Sternberg-Wagner self-assessment inventory, 15 healthy volunteers (HV) with high analytic scores and 15 HV with high global scores were recruited. Both groups underwent visual evoked potential recordings after psychological evaluation. We observed a significant lack of habituation in analytic individuals compared to the global group. In conclusion, reduced habituation of visual evoked potentials was observed in analytic subjects. Our results suggest that further research should be undertaken on the relationship between analytic cognitive style and lack of habituation in both physiological and pathophysiological conditions.

  11. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for the development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  12. Development of Analytical Plug-ins for ENSITE: Version 1.0

    DTIC Science & Technology

    2017-11-01

    ENSITE’s core-software platform builds upon leading geospatial platforms already in use by the Army and is designed to offer an easy-to-use, customized set of workflows for CB planners. Within this platform are added software compo…

  13. Applying a Geospatial Visualization Based on USSD Messages to Real Time Identification of Epidemiological Risk Areas in Developing Countries: A Case of Study of Paraguay.

    PubMed

    Ochoa, Silvia; Talavera, Julia; Paciello, Julio

    2015-01-01

    The identification of epidemiological risk areas is one of the major problems in public health. Information management strategies are needed to facilitate prevention and control of disease in the affected areas. This paper presents a model to optimize geographical data collection of suspected or confirmed disease occurrences using Unstructured Supplementary Service Data (USSD) mobile technology, considering its wide adoption even in developing countries such as Paraguay. A Geographic Information System (GIS) is proposed for visualizing potential epidemiological risk areas in real time, with the aim of supporting decision making and the implementation of prevention or contingency programs for public health.

  14. 3D Online Visualization and Synergy of NASA A-Train Data Using Google Earth

    NASA Technical Reports Server (NTRS)

    Chen, Aijun; Kempler, Steven; Leptoukh, Gregory; Smith, Peter

    2010-01-01

    This poster presentation reviews the use of Google Earth to assist in three-dimensional online visualization of NASA Earth science and geospatial data. The NASA A-Train satellite constellation is a succession of seven sun-synchronous-orbit satellites: (1) OCO-2 (Orbiting Carbon Observatory) (will launch in Feb. 2013), (2) GCOM-W1 (Global Change Observation Mission), (3) Aqua, (4) CloudSat, (5) CALIPSO (Cloud-Aerosol Lidar & Infrared Pathfinder Satellite Observations), (6) Glory, (7) Aura. The A-Train makes possible a synergy of information from multiple sources, so that more information about Earth's condition is obtained from the combined observations than would be possible from the sum of the observations taken independently.

  15. EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics

    NASA Astrophysics Data System (ADS)

    Baumann, P.; Rossi, A. P.

    2016-12-01

    The unprecedented increase in imagery, in-situ measurements, and simulation data produced by Earth (and Planetary) Science observation missions bears a rich, yet not fully leveraged, potential for gaining insights by integrating such diverse datasets and transforming scientific questions into actual queries against data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services rely exclusively on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer advanced scalable array database technology into 150+ TB services. Currently, Petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106 [2] Hogan, P. (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. [3] Baumann, P., et al. (2014) In Proc. 10th ICDM, 194-201. [4] Dumitru, A., et al. (2014) In Proc. ACM SIGMOD Workshop on Data Analytics in the Cloud (DanaC'2014), 1-4.
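
    A hedged sketch of what slicing such a datacube through a WCS 2.0 GetCoverage request can look like from Python; the endpoint and coverage name are hypothetical, though the key-value encoding with repeated subset parameters follows the standard:

```python
import requests

# Hypothetical WCS endpoint and coverage name; the request structure follows
# OGC WCS 2.0 key-value-pair encoding.
params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "AvgLandTemp",
    # requests encodes a list as repeated subset= parameters, one per axis.
    "subset": ['Lat(30,60)', 'Long(-10,40)', 'ansi("2015-01","2015-12")'],
    "format": "application/netcdf",
}
response = requests.get("https://example.org/rasdaman/ows", params=params)
response.raise_for_status()
with open("slice.nc", "wb") as f:
    f.write(response.content)  # a spatio-temporal slice of the datacube
```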

  16. Geospatial Data Science Modeling | Geospatial Data Science | NREL

    Science.gov Websites

    NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers. Geospatial modeling at NREL often produces the …

  17. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
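
    The two atomic operators named in the abstract, selection and aggregation, can be illustrated with a small networkx sketch; this is an illustration of the concepts, not the authors' formalism:

```python
import networkx as nx

# Toy attributed graph; node attributes stand in for data dimensions.
G = nx.Graph()
G.add_node("a", region="east")
G.add_node("b", region="east")
G.add_node("c", region="west")
G.add_edges_from([("a", "b"), ("b", "c")])

# Selection: keep only the nodes satisfying a predicate.
east = G.subgraph(n for n, d in G.nodes(data=True) if d["region"] == "east")

# Aggregation: collapse nodes sharing an attribute into one super-node,
# keeping edges between distinct groups.
agg = nx.Graph()
for u, v in G.edges():
    ru, rv = G.nodes[u]["region"], G.nodes[v]["region"]
    if ru != rv:
        agg.add_edge(ru, rv)

print(list(east.nodes()), list(agg.edges()))
```

    Composing such operators is what lets a visual front end scale: the view renders the small selected or aggregated graph rather than the full dataset.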

  18. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies

    PubMed Central

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-01-01

    Background The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity to shape the next generation of injury surveillance. Objective To introduce the conceptual bases of data visualisation, and to propose a visual analytic and visualisation platform for public health surveillance for injury prevention and control. Methods The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results Application of the visual analytic and visualisation platform is presented as a solution to improve access to heterogeneous data sources, enhance data exploration and analysis, communicate data effectively, and support decision-making. Conclusions Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006

  19. The Value of Information and Geospatial Technologies for the analysis of tidal current patterns in the Guanabara Bay (Rio de Janeiro)

    NASA Astrophysics Data System (ADS)

    Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime

    2016-04-01

    The study and validation of tidal current patterns rely on the combination of several data sources, such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. Assessing the accuracy and reliability of the produced patterns and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide diffusion of geospatial equipment such as GPS, current drifters and aerial photogrammetry allows data to be collected in the field using mobile and portable devices with relatively limited time and economic resources. These real-time measurements are essential for validating the models, and specifically for assessing the skill of a model during critical environmental conditions. Moreover, the considerable development of remote sensing technologies, cartographic services and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This contribution of information and geospatial technologies can benefit many decision-makers, including high-level sport athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Therefore, integrating data collected in the field into a GIS allows immediate visualization of the performed analyses on geographic maps. This visualization is a particularly effective way to communicate current pattern assessment results and the uncertainty in the information, increasing confidence in the forecast. The aim of this paper is to present the methodology, set up in collaboration with the Austrian Sailing Federation, for the study of the tidal current patterns of Guanabara Bay, venue of the sailing competitions of the Rio 2016 Olympic Games. The methodology relies on the integration into a GIS of a substantial amount of data collected in the field, hydrodynamic model output, cartography and "key signs" visible on the water, proving particularly useful for simplifying the final information, aiding the learning process and improving decision making.

  20. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 4 : web-based bridge information database--visualization analytics and distributed sensing.

    DOT National Transportation Integrated Search

    2012-03-01

    This report introduces the design and implementation of a Web-based bridge information visual analytics system. This : project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result : combines a GIS ...

  1. Visualisation and Analytic Strategies for Anticipating the Folding of Nets

    ERIC Educational Resources Information Center

    Wright, Vince

    2016-01-01

    Visual and analytic strategies are features of students' schemes for spatial tasks. The strategies used by six students to anticipate the folding of nets were investigated. Evidence suggested that visual and analytic strategies were strongly connected in competent performance.

  2. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Peng; Gong, Jianya; Di, Liping

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered by extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and made discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
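
    The flavor of semantics-enhanced discovery can be illustrated with a small RDF/SPARQL sketch in Python using rdflib; the vocabulary below is hypothetical and far simpler than the ebRIM-based registry described in the paper:

```python
from rdflib import Graph, Namespace, RDF

# Hypothetical vocabulary: typed metadata about a geoprocessing service.
EX = Namespace("http://example.org/geo#")
g = Graph()
g.add((EX.ndvi_service, RDF.type, EX.GeoprocessingService))
g.add((EX.ndvi_service, EX.consumes, EX.RedBand))
g.add((EX.ndvi_service, EX.consumes, EX.NearInfraredBand))
g.add((EX.ndvi_service, EX.produces, EX.VegetationIndex))

# Discovery by semantics: find services that can produce a vegetation index,
# following typed relationships rather than matching keywords.
results = g.query("""
    PREFIX ex: <http://example.org/geo#>
    SELECT ?s
    WHERE { ?s a ex:GeoprocessingService ; ex:produces ex:VegetationIndex . }
""")
for (service,) in results:
    print(service)
```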

  3. Opportunities for GEOGLAM to contribute to Food Systems Sustainability

    NASA Astrophysics Data System (ADS)

    LeZaks, D.; Jahn, M.

    2013-12-01

    Since the GEO Global Agricultural Monitoring (GEOGLAM) community of practice was formed, there has been much interest in how this community can be leveraged to address a series of challenges that have received recognition from a variety of stakeholder groups across academia, government, the private sector and multilateral international organizations. This talk will review the collaborative network that has formed around the ongoing and planned activities of GEOGLAM, how future research and development activities within and around GEOGLAM can contribute to the innovation ecosystem around agricultural monitoring, and how monitoring activities can inform the decision processes of stakeholders ranging from farmers to policy-makers. These collaborative activities revolve around sharing data, information, knowledge, analytics, improved reflections of risk, and opportunities related to humanity's sustainable provisioning at the land/water/energy nexus. The goal of extending GEOGLAM's collaborative activities is to mobilize aligned assets and commitments to set up more ordered approaches to describing and managing the dynamics of food systems, viewed more holistically as sets of nested, geospatially and temporally explicit processes. A special focus will be given to how information assets originating from within GEOGLAM can be used to support a coherent visualization of the world's food systems, along with improving representation of the resource bases upon which our survival depends.

  4. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  5. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Comparative study of cocoa black ants temporal population distribution utilizing geospatial analysis

    NASA Astrophysics Data System (ADS)

    Adnan, N. A.; Bakar, S.; Mazlan, A. H.; Yusoff, Z. Mohd; Rasam, A. R. Abdul

    2018-02-01

    Cocoa plantations are also subject to disease and pest infestation. Some pests not only reduce the yield but also inhibit the growth of trees. Therefore, the Malaysia Cocoa Board (MCB) has explored Cocoa Black Ants (CBA) as one of its biological control mechanisms to reduce pest infestation by the Cocoa Pod Borer (CPB). CPB is capable of damaging cocoa beans, which in turn reduces the quality of dried cocoa beans. This study integrates geospatial analysis into the understanding of CBA population distribution patterns to enhance their capability in controlling CPB infestation. The two objectives of the study are (i) to generate temporal CBA distributions for two different blocks of a cocoa plantation, and (ii) to compare the CBA population distribution patterns visually with the aid of geospatial techniques. The study found that the CBA population showed a modest, low distribution in February 2007, reached its highest levels in September 2007, and had decreased by the end of 2009 in both blocks (10B and 18A). The use of GIS is therefore important for explaining the CBA population pattern in a mature cocoa field. This finding might be used as an indicator of the optimum distribution of CBA needed as a biological control agent against CPB in the future.

  7. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. In order to use these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a construct termed the Equivalent Distributed Program of global geospatial queries, to solve geospatial distributed computing problems in heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, together with a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, is presented to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment consisting of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex distributed geospatial data queries are presented to illustrate the procedure of global geospatial information processing.
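
    The Equivalent Distributed Program idea, rewriting a global query as per-peer local queries whose results are then merged, can be sketched as follows; the peer list, SQL, and transport are hypothetical placeholders, not the paper's implementation:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical autonomous peers and the local query equivalent to one
# global spatial query.
PEERS = ["http://peer-a/query", "http://peer-b/query", "http://peer-c/query"]
LOCAL_SQL = "SELECT id, geom FROM roads WHERE ST_Intersects(geom, %(aoi)s)"

def run_local_query(peer_url, sql, params):
    """Placeholder for sending one local query to an autonomous peer,
    e.g. via requests.post(peer_url, json={"sql": sql, "params": params})."""
    return []  # each peer would return its matching rows

def global_query(aoi_wkt):
    # Fan the equivalent local query out to every peer, then merge the
    # partial results into one answer to the global query.
    with ThreadPoolExecutor() as pool:
        partials = pool.map(
            lambda peer: run_local_query(peer, LOCAL_SQL, {"aoi": aoi_wkt}),
            PEERS)
    return [row for part in partials for row in part]

print(len(global_query("POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))")))
```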

  8. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can only achieve geospatial interoperation at the syntactic level. However, in most cases it is necessary to resolve differences in spatial cognition first, so ontology was introduced to describe geospatial information and services. But it is difficult and inappropriate to ask users to find, match and compose services themselves, especially when complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technologies (e.g., OWL, SWRL), the focus of geospatial information interoperation has shifted from the syntactic level to the semantic and even the automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  9. The use of Web-based GIS data technologies in the construction of geoscience instructional materials: examples from the MARGINS Data in the Classroom project

    NASA Astrophysics Data System (ADS)

    Ryan, J. G.; McIlrath, J. A.

    2008-12-01

    Web-accessible geospatial information system (GIS) technologies have advanced in concert with an expansion of data resources that can be accessed and used by researchers, educators and students. These resources facilitate the development of data-rich instructional resources and activities that can transition seamlessly into undergraduate research projects. MARGINS Data in the Classroom (http://serc.carleton.edu/margins/index.html) seeks to engage MARGINS researchers and educators in using the images, datasets, and visualizations produced by NSF-MARGINS Program-funded research and related efforts to create Web-deliverable instructional materials for use in undergraduate-level geoscience courses (MARGINS Mini-Lessons). MARGINS science data is managed by the Marine Geosciences Data System (MGDS), and these and all other MGDS-hosted data can be accessed, manipulated and visualized using GeoMapApp (www.geomapapp.org; Carbotte et al., 2004), a freely available geographic information system focused on the marine environment. Both "packaged" MGDS datasets (i.e., global earthquake foci, volcanoes, bathymetry) and "raw" data (seismic surveys, magnetics, gravity) are accessible via GeoMapApp, with WFS linkages to other resources (geodesy from UNAVCO; seismic profiles from IRIS; geochemical and drill-site data from EarthChem, IODP, and others), permitting the comprehensive characterization of many regions of the ocean basins. Geospatially controlled datasets can be imported into GeoMapApp visualizations, and these visualizations can be exported into Google Earth as .kmz image files. Many of the MARGINS Mini-Lessons produced thus far use (or have students use) the varied capabilities of GeoMapApp (e.g., constructing topographic profiles, overlaying varied geophysical and bathymetric datasets, characterizing geochemical data). These materials are available for use and testing from the project webpage (http://serc.carleton.edu/margins/). Classroom testing and assessment of the Mini-Lessons begins this Fall.

  10. Discovery of Marine Datasets and Geospatial Metadata Visualization

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.; Brennan, R. T.; Sellars, J.; Smith, S.

    2009-12-01

    NOAA's National Geophysical Data Center (NGDC) provides the deep archive of US multibeam sonar hydrographic surveys. NOAA stores the data as Bathymetric Attributed Grids (BAG; http://www.opennavsurf.org/), which are HDF5-formatted files containing gridded bathymetry, gridded uncertainty, and XML metadata. While NGDC provides the deep store and a basic ESRI ArcIMS interface to the data, additional tools need to be created to increase the frequency with which researchers discover hydrographic surveys that might be beneficial for their research. Using open-source tools, we have created a draft Google Earth visualization of NOAA's complete collection of BAG files as of March 2009. Each survey is represented as a bounding box, an optional preview image of the survey data, and a pop-up placemark. The placemark contains a brief summary of the metadata and links to directly download the BAG survey files and the complete metadata file. Each survey is time-tagged so that users can search both in space and time for surveys that meet their needs. By creating this visualization, we aim to make the entire process of data discovery, validation of relevance, and download much more efficient for research scientists who may not be familiar with NOAA's hydrographic survey efforts or the BAG format. In the process of creating this demonstration, we have identified a number of improvements that can be made to the hydrographic survey process in order to make the results easier to use, especially with respect to metadata generation. With the combination of the NGDC deep archiving infrastructure, a Google Earth virtual globe visualization, and GeoRSS feeds of updates, we hope to increase the utilization of this high-quality gridded bathymetry. This workflow applies equally well to LIDAR topography and bathymetry. Additionally, with proper referencing and geotagging in journal publications, we hope to close the loop and help the community create a true “Geospatial Scholar” infrastructure.
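
    Because BAG files are HDF5 containers, their contents can be inspected with generic tools. A hedged Python sketch using h5py follows; the `BAG_root` layout shown reflects the Open Navigation Surface convention but should be verified against actual files:

```python
import h5py

# "H12345.bag" is a hypothetical survey file name. The group/dataset names
# follow the Open Navigation Surface BAG layout; verify against your files.
with h5py.File("H12345.bag", "r") as bag:
    elevation = bag["BAG_root/elevation"][:]      # gridded bathymetry
    uncertainty = bag["BAG_root/uncertainty"][:]  # gridded uncertainty
    xml_metadata = bag["BAG_root/metadata"][:].tobytes().decode("utf-8")
    print(elevation.shape, xml_metadata[:120])
```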

  11. Establishment of the Northeast Coastal Watershed Geospatial Data Network (NECWGDN)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hannigan, Robyn

    The goals of NECWGDN were to establish integrated geospatial databases that interfaced with existing open-source environmental data server technologies (e.g., HydroDesktop) and included ecological and human data to enable evaluation, prediction, and adaptation in coastal environments to climate- and human-induced threats to the coastal marine resources within the Gulf of Maine. We have completed the development and testing of a "test bed" architecture that is compatible with HydroDesktop and have identified key metadata structures that will enable seamless integration and delivery of environmental, ecological, and human data, as well as models to predict threats to end-users. Uniquely, this database integrates point as well as model data, and so offers capacities to end-users that are unique among databases. Future efforts will focus on the development of integrated environmental-human dimension models that can serve, in near real time, visualizations of threats to coastal resources and habitats.

  12. Remote measurement methods for 3-D modeling purposes using BAE Systems' Software

    NASA Astrophysics Data System (ADS)

    Walker, Stewart; Pietrzak, Arleta

    2015-06-01

    Efficient, accurate data collection from imagery is the key to an economical generation of useful geospatial products. Incremental developments of traditional geospatial data collection and the arrival of new image data sources cause new software packages to be created and existing ones to be adjusted to enable such data to be processed. In the past, BAE Systems' digital photogrammetric workstation, SOCET SET®, met fin de siècle expectations in data processing and feature extraction. Its successor, SOCET GXP®, addresses today's photogrammetric requirements and new data sources. SOCET GXP is an advanced workstation for mapping and photogrammetric tasks, with automated functionality for triangulation, Digital Elevation Model (DEM) extraction, orthorectification and mosaicking, feature extraction and creation of 3-D models with texturing. BAE Systems continues to add sensor models to accommodate new image sources, in response to customer demand. New capabilities added in the latest version of SOCET GXP facilitate modeling, visualization and analysis of 3-D features.

  13. Geospatial Technology in Disease Mapping, E-Surveillance and Health Care for Rural Population in South India

    NASA Astrophysics Data System (ADS)

    Praveenkumar, B. A.; Suresh, K.; Nikhil, A.; Rohan, M.; Nikhila, B. S.; Rohit, C. K.; Srinivas, A.

    2014-11-01

    Providing healthcare to rural populations has been a challenge for medical service providers, especially in developing countries. For this to be effective, scalable and sustainable, certain strategic decisions have to be taken during the planning phase. Also, there is a big gap between the services needed and the availability of doctors and medical resources in rural areas. Use of information technology can help address this deficiency to a good extent. In this paper, a mobile application has been developed to gather data from the field. A cloud-based interface has been developed to store the data in the cloud for effective usage and management of the data. A decision-tree-based solution developed in this paper helps in diagnosing a patient based on their health parameters, as sketched below. Interactive geospatial maps have been developed to provide an effective data visualization facility. This will help both the user community and decision makers to carry out long-term strategy planning.
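
    A minimal sketch of the kind of decision-tree diagnosis step described above, using scikit-learn. The vital-sign features, triage labels, and values are hypothetical stand-ins, not the authors' model.

    ```python
    # Sketch: a decision-tree triage model over field-collected health
    # parameters. Features and labels below are hypothetical examples.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Each row: [body_temp_C, systolic_bp, heart_rate]; labels are triage classes.
    X = [[36.8, 118, 72], [39.2, 102, 110], [37.1, 145, 88], [40.0, 95, 125]]
    y = ["routine", "urgent", "monitor", "urgent"]

    model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(export_text(model, feature_names=["temp", "sys_bp", "hr"]))
    print(model.predict([[38.5, 110, 95]]))   # classify a new field reading
    ```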

  14. High Performance Visualization using Query-Driven Visualization and Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach to high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
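
    A toy illustration of the query-driven idea, not the authors' system: evaluate a compound range query over a large record set first, and hand only the matching subset to the visualization stage. The synthetic flow records and thresholds are hypothetical.

    ```python
    # Toy query-driven selection: evaluate the range query first, render later.
    # Synthetic flow records stand in for the paper's network-traffic data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    duration = rng.exponential(2.0, n)         # flow duration, seconds
    bytes_tx = rng.lognormal(8.0, 2.0, n)      # bytes transferred per flow

    # "Find the needles": only long-lived, high-volume flows pass the query.
    mask = (duration > 30.0) & (bytes_tx > 1e6)
    needles = bytes_tx[mask]
    print(f"{needles.size} of {n} records selected for visualization")

    # A plot-ready summary of just the selected subset, not the full dataset.
    hist, edges = np.histogram(np.log10(needles), bins=40)
    ```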

  15. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
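
    For readers unfamiliar with the OGC services mentioned above, the sketch below assembles a WMS GetMap request in Python. The endpoint URL, layer name, and TIME value are hypothetical; only the WMS 1.1.1 parameter names come from the standard.

    ```python
    # Sketch: requesting a rendered map image from an OGC WMS endpoint of the
    # kind the TOPS server exposes. URL and layer name are hypothetical.
    import requests

    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": "tops:gross_primary_production",  # hypothetical layer
        "styles": "",
        "srs": "EPSG:4326",
        "bbox": "-125,32,-114,42",                  # lon/lat box over California
        "width": 512,
        "height": 512,
        "format": "image/png",
        "time": "2008-07-01",                       # temporal dimension, if served
    }
    resp = requests.get("https://example.nasa.gov/wms", params=params, timeout=30)
    resp.raise_for_status()
    open("gpp_2008-07-01.png", "wb").write(resp.content)
    ```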

  16. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies.

    PubMed

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-04-01

    The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity to shape the next generation of injury surveillance. This paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Applications of data visualisation concepts and visual analytic platforms could play a key role in shaping the next generation of injury surveillance, improving data use and analytic capacity, and strengthening the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand their use in injury prevention and control.

  17. Simultaneous Visualization of Different Utility Networks for Disaster Management

    NASA Astrophysics Data System (ADS)

    Semm, S.; Becker, T.; Kolbe, T. H.

    2012-07-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for reaching appropriate decisions. Utility networks are among the most complex and most needed systems within a city. This paper addresses the visualization of utility infrastructure in crisis situations and provides a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  18. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    EPA Science Inventory

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  19. Improving the User Experience of Finding and Visualizing Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Rauch, S.; Allison, M. D.; Groman, R. C.; Chandler, C. L.; Galvarino, C.; Gegg, S. R.; Kinkade, D.; Shepherd, A.; Wiebe, P. H.; Glover, D. M.

    2013-12-01

    Searching for and locating data of interest can be a challenge to researchers as increasing volumes of data are made available online through various data centers, repositories, and archives. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) is keenly aware of this challenge and, as a result, has implemented features and technologies aimed at improving data discovery and enhancing the user experience. BCO-DMO was created in 2006 to manage and publish data from research projects funded by the Division of Ocean Sciences (OCE) Biological and Chemical Oceanography Sections and the Division of Polar Programs (PLR) Antarctic Sciences Organisms and Ecosystems Program (ANT) of the US National Science Foundation (NSF). The BCO-DMO text-based and geospatial-based data access systems provide users with tools to search, filter, and visualize data in order to efficiently find data of interest. The geospatial interface, developed using a suite of open-source software (including MapServer [1], OpenLayers [2], ExtJS [3], and MySQL [4]), allows users to search and filter/subset metadata based on program, project, or deployment, or by using a simple word search. The map responds based on user selections, presents options that allow the user to choose specific data parameters (e.g., a species or an individual drifter), and presents further options for visualizing those data on the map or in "quick-view" plots. The data managed and made available by BCO-DMO are very heterogeneous in nature, from in-situ biogeochemical, ecological, and physical data, to controlled laboratory experiments. Due to the heterogeneity of the data types, a 'one size fits all' approach to visualization cannot be applied. Datasets are visualized in a way that will best allow users to assess fitness for purpose. An advanced geospatial interface, which contains a semantically-enabled faceted search [5], is also available. These search facets are highly interactive and responsive, allowing users to construct their own custom searches by applying multiple filters. New filtering and visualization tools are continually being added to the BCO-DMO system as new data types are encountered and as we receive feedback from our data contributors and users. As our system becomes more complex, teaching users about the many interactive features becomes increasingly important. Tutorials and videos are made available online. Recent in-person classroom-style tutorials have proven useful for both demonstrating our system to users and for obtaining feedback to further improve the user experience. References: [1] University of Minnesota. MapServer: Open source web mapping. http://www.mapserver.org [2] OpenLayers: Free Maps for the Web. http://www.openlayers.org [3] Sencha. ExtJS. http://www.sencha.com/products/extjs [4] MySQL. http://www.mysql.com/ [5] Maffei, A. R., Rozell, E. A., West, P., Zednik, S., and Fox, P. A. 2011. Open Standards and Technologies in the S2S Framework. Abstract IN31A-1435 presented at American Geophysical Union 2011 Fall Meeting, San Francisco, CA, 7 December 2011.

  20. A Visual Analytics Paradigm Enabling Trillion-Edge Graph Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Haglin, David J.; Gillen, David S.

    We present a visual analytics paradigm and a system prototype for exploring web-scale graphs. A web-scale graph is described as a graph with ~one trillion edges and ~50 billion vertices. While there is an aggressive R&D effort in processing and exploring web-scale graphs among internet vendors such as Facebook and Google, visualizing a graph of that scale still remains an underexplored R&D area. The paper describes a nontraditional peek-and-filter strategy that facilitates the exploration of a graph database of unprecedented size for visualization and analytics. We demonstrate that our system prototype can 1) preprocess a graph with ~25 billion edges in less than two hours and 2) support database query and visualization on the processed graph database afterward. Based on our computational performance results, we argue that we most likely will achieve the one trillion edge mark (a computational performance improvement of 40 times) for graph visual analytics in the near future.

  1. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entirety of undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by realizing variables capable of enhancing human perception and cognition of complex curriculum data. The positive results, derived from our small-scale evaluation of a medical curriculum, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a new and promising direction in medical education informatics research.

  2. EPA Geospatial Quality Council Promoting Quality Assurance in the Geospatial Community

    EPA Science Inventory

    After establishing a foundation for the EPA National Geospatial Program, the EPA Geospatial Quality Council (GQC) is, in part, focusing on improving administrative efficiency in the geospatial community. To realize this goal, the GQC is developing Standard Operating Procedures (S...

  3. Combining photorealistic immersive geovisualization and high-resolution geospatial data to enhance human-scale viewshed modelling

    NASA Astrophysics Data System (ADS)

    Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.

    2017-12-01

    Viewshed modelling, the process of defining, parsing and analysing the structure of landscape visual space within GIS, has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether and to what extent these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human-scale level requires the incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., landcover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant error in predicting visual attributes. This poster illustrates how photorealistic Immersive Virtual Environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human-scale level. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution (0.5 m) landcover derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation in visual characteristics was selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GigaPan robot. We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random presentation of the selected environments via a head-mounted display (Oculus Rift CV1), and rated each location on perceived openness, naturalness and complexity. Regression models were used to correlate model outputs with participants' responses. The results indicated strong, significant correlations for openness and naturalness, and a moderate correlation for complexity estimations.
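
    A minimal sketch of the per-viewpoint step described above, using GRASS GIS's r.viewshed through the grass.script API. It assumes an existing GRASS session; the raster map names, coordinates, and observer height are hypothetical.

    ```python
    # Sketch: a single-viewpoint binary viewshed with GRASS GIS's r.viewshed.
    # Map names ("lidar_dsm", "viewshed_p001") and coordinates are hypothetical.
    import grass.script as gs

    def binary_viewshed(dsm, out, x, y, observer_height=1.7):
        """Compute a boolean (-b) viewshed for one observer location."""
        gs.run_command(
            "r.viewshed",
            input=dsm,                  # 3-D surface model raster (lidar DSM)
            output=out,                 # with -b: 1 = visible, 0 = not visible
            coordinates=(x, y),
            observer_elevation=observer_height,
            flags="b",
            overwrite=True,
        )

    binary_viewshed("lidar_dsm", "viewshed_p001", 637500.0, 222400.0)
    ```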

  4. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  5. Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.

    PubMed

    Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min

    2013-12-01

    Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be semantically annotated sufficiently, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing the need to watch videos, and visualizing aggregated information about the search results. We demonstrate the system for searching spatiotemporal attributes in sports video to identify key instances of team and player performance.

  6. Lateral flow devices

    DOEpatents

    Mazumdar, Debapriya; Liu, Juewen; Lu, Yi

    2010-09-21

    An analytical test for an analyte comprises (a) a base, having a reaction area and a visualization area, (b) a capture species, on the base in the visualization area, comprising nucleic acid, and (c) analysis chemistry reagents, on the base in the reaction area. The analysis chemistry reagents comprise (i) a substrate comprising nucleic acid and a first label, and (ii) a reactor comprising nucleic acid. The analysis chemistry reagents can react with a sample comprising the analyte and water, to produce a visualization species comprising nucleic acid and the first label, and the capture species can bind the visualization species.

  7. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type for quantifying and characterizing geospatial processes. Visualization of these data, due to their overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities that once existed only in desktop software ubiquitously available. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  8. Investigating Methods for Serving Visualizations of Vertical Profiles

    NASA Astrophysics Data System (ADS)

    Roberts, J. T.; Cechini, M. F.; Lanjewar, K.; Rodriguez, J.; Boller, R. A.; Baynes, K.

    2017-12-01

    Several geospatial web servers, web service standards, and mapping clients exist for the visualization of two-dimensional raster and vector-based Earth science data products. However, data products with a vertical component (i.e., vertical profiles) do not have the same mature set of technologies and pose a greater technical challenge when it comes to visualizations. There are a variety of tools and proposed standards, but no obvious solution that can handle the variety of visualizations found with vertical profiles. An effort is being led by members of the NASA Global Imagery Browse Services (GIBS) team to gather a list of technologies relevant to existing vertical profile data products and user stories. The goal is to find a subset of technologies, standards, and tools that can be used to build publicly accessible web services that can handle the greatest number of use cases for the widest audience possible. This presentation will describe results of the investigation and offer directions for moving forward with building a system that is capable of effectively and efficiently serving visualizations of vertical profiles.

  9. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also lets researchers focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  10. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew; Geza, Mengistu

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as “baseline data” for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and the data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for linking the database with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides “baseline” data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides, through data interpolation and visualization techniques, a better understanding of the Piceance Basin structure and the spatial distribution of the oil shale resources. The surface water/groundwater models quantify the water shortage and improve understanding of the spatial distribution of the available water resources. The energy resource development systems model reveals the phase shift between water usage and oil shale production, which will facilitate better planning for oil shale development. Detailed descriptions of the key findings from the project activity, major accomplishments, and expected impacts of the research are given in the “ACCOMPLISHMENTS, RESULTS, AND DISCUSSION” section of this report.

  11. Giovanni - The Bridge Between Data and Science

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Acker, James

    2017-01-01

    This article describes new features in the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni), a user-friendly online tool that enables visualization, analysis, and assessment of NASA Earth science data sets without downloading data and software. Since the satellite era began, data collected from Earth-observing satellites have been widely used in research and applications; however, using satellite-based data sets can still be a challenge to many. To facilitate data access and evaluation, as well as scientific exploration and discovery, the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) has developed Giovanni for a wide range of users around the world. This article describes the latest capabilities of Giovanni with examples, and discusses future plans for this innovative system.

  12. Visual Analytics and Storytelling through Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Perrine, Kenneth A.; Mackey, Patrick S.

    2005-10-31

    This paper supplements a video clip submitted to the Video Track of IEEE Symposium on Information Visualization 2005. The original video submission applies a two-way storytelling approach to demonstrate the visual analytics capabilities of a new visualization technique. The paper presents our video production philosophy, describes the plot of the video, explains the rationale behind the plot, and finally, shares our production experiences with our readers.

  13. Geospatial considerations for a multiorganizational, landscape-scale program

    USGS Publications Warehouse

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  14. Modeling and visualizing borehole information on virtual globes using KML

    NASA Astrophysics Data System (ADS)

    Zhu, Liang-feng; Wang, Xi-feng; Zhang, Bing

    2014-01-01

    Advances in virtual globes and Keyhole Markup Language (KML) are providing the Earth scientists with the universal platforms to manage, visualize, integrate and disseminate geospatial information. In order to use KML to represent and disseminate subsurface geological information on virtual globes, we present an automatic method for modeling and visualizing a large volume of borehole information. Based on a standard form of borehole database, the method first creates a variety of borehole models with different levels of detail (LODs), including point placemarks representing drilling locations, scatter dots representing contacts and tube models representing strata. Subsequently, the level-of-detail based (LOD-based) multi-scale representation is constructed to enhance the efficiency of visualizing large numbers of boreholes. Finally, the modeling result can be loaded into a virtual globe application for 3D visualization. An implementation program, termed Borehole2KML, is developed to automatically convert borehole data into KML documents. A case study of using Borehole2KML to create borehole models in Shanghai shows that the modeling method is applicable to visualize, integrate and disseminate borehole information on the Internet. The method we have developed has potential use in societal service of geological information.
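
    In the spirit of Borehole2KML (though not the authors' tool), the sketch below converts simple borehole records into KML point placemarks with the simplekml package. The borehole IDs, coordinates, and depths are hypothetical.

    ```python
    # Sketch: converting borehole records to KML point placemarks for use in
    # any virtual globe. The records below are hypothetical examples.
    import simplekml

    boreholes = [
        # (id, lon, lat, depth_m)
        ("BH-001", 121.47, 31.23, 55.0),
        ("BH-002", 121.52, 31.25, 62.5),
    ]

    kml = simplekml.Kml(name="Borehole locations")
    for bid, lon, lat, depth in boreholes:
        pnt = kml.newpoint(name=bid, coords=[(lon, lat)])
        pnt.description = f"Total depth: {depth} m"
    kml.save("boreholes.kml")   # load the result in a virtual globe viewer
    ```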

  15. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    USGS Publications Warehouse

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the precision of access to polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search finds coarse candidate services on the web, and ontology reasoning refines the results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  16. Data analytics approach to create waste generation profiles for waste management and collection.

    PubMed

    Niska, Harri; Serkkola, Ari

    2018-04-30

    Extensive monitoring data on waste generation are increasingly collected in order to implement cost-efficient and sustainable waste management operations. In addition, geospatial data from different registries of society are being opened for free use. Novel data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as the basis for waste management and collection. In this paper, a data-based approach combining the self-organizing map (SOM) and the k-means algorithm is developed for creating a set of waste generation type profiles. The approach is demonstrated using the extensive container-level waste weighting data collected in the metropolitan area of Helsinki, Finland. The results highlight the potential of advanced data analytics approaches in producing more detailed waste generation information, e.g., as the basis for tailored feedback services for waste producers and for the planning and optimization of waste collection and recycling.
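
    A minimal sketch of the SOM-plus-k-means profiling pipeline described above, using the minisom package and scikit-learn. The container-level data are synthetic, and the map size and cluster count are hypothetical choices, not the authors' settings.

    ```python
    # Sketch: profile waste-generation series with a self-organizing map,
    # then cluster the SOM codebook with k-means to form profile types.
    import numpy as np
    from minisom import MiniSom
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Hypothetical features per container: weekly kg collected over 12 weeks.
    X = rng.gamma(shape=2.0, scale=10.0, size=(500, 12))

    som = MiniSom(8, 8, X.shape[1], sigma=1.5, learning_rate=0.5, random_seed=1)
    som.train_random(X, 5000)                       # fit the 8x8 map

    codebook = som.get_weights().reshape(-1, X.shape[1])
    profiles = KMeans(n_clusters=5, n_init=10, random_state=1).fit(codebook)

    # Assign each container to a profile via its best-matching SOM unit.
    units = np.array([som.winner(x) for x in X])
    labels = profiles.labels_[units[:, 0] * 8 + units[:, 1]]
    print(np.bincount(labels))                      # containers per profile
    ```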

  17. Reducing the Analytical Bottleneck for Domain Scientists: Lessons from a Climate Data Visualization Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Poco, Jorge; Bertini, Enrico

    2016-01-01

    The gap between the rate of large-scale data production and the rate of generation of data-driven scientific insights has led to an analytical bottleneck in scientific domains like climate, biology, etc. This is primarily due to the lack of innovative analytical tools that can help scientists efficiently analyze and explore alternative hypotheses about the data, and communicate their findings effectively to a broad audience. In this paper, by reflecting on a set of successful collaborative research efforts between a group of climate scientists and visualization researchers, we examine how interactive visualization can help reduce the analytical bottleneck for domain scientists.

  18. How NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet the diversifying government, private, public and academic communities' driven requirements.

    NASA Astrophysics Data System (ADS)

    Tisdale, M.

    2016-12-01

    NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability, and to meet the evolving requirements of the government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS) and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and Geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.

  19. USGS Geospatial Fabric and Geo Data Portal for Continental Scale Hydrology Simulations

    NASA Astrophysics Data System (ADS)

    Sampson, K. M.; Newman, A. J.; Blodgett, D. L.; Viger, R.; Hay, L.; Clark, M. P.

    2013-12-01

    This presentation describes use of United States Geological Survey (USGS) data products and server-based resources for continental-scale hydrologic simulations. The USGS Modeling of Watershed Systems (MoWS) group provides a consistent national geospatial fabric built on NHDPlus. They have defined more than 100,000 hydrologic response units (HRUs) over the continental United States based on points of interest (POIs) and split into left and right bank based on the corresponding stream segment. Geophysical attributes are calculated for each HRU that can be used to define parameters in hydrologic and land-surface models. The Geo Data Portal (GDP) project at the USGS Center for Integrated Data Analytics (CIDA) provides access to downscaled climate datasets and processing services via web-interface and python modules for creating forcing datasets for any polygon (such as an HRU). These resources greatly reduce the labor required for creating model-ready data in-house, contributing to efficient and effective modeling applications. We will present an application of this USGS cyber-infrastructure for assessments of impacts of climate change on hydrology over the continental United States.

  20. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2016-06-01

    In the recent past, there has been large emphasis on the extraction of geospatial information from satellite imagery. Geospatial information is processed through geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest multi-date, multi-stage, multi-sensor, and multi-resolution geospatial satellite imagery. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, together with the latest geographic information systems, for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, monitoring and the development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.

  1. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 4 : use of knowledge integrated visual analytics system in supporting bridge management.

    DOT National Transportation Integrated Search

    2009-12-01

    The goals of integration should be: supporting domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on providing interactive data exploration for bridge management. ...

  2. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  3. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    NASA Astrophysics Data System (ADS)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it is the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., which was acquired by Google in 2004, KML was handed over to the Open Geospatial Consortium (OGC) in 2007. It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and the development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, has been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML, with little or no knowledge of computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.

  4. Merging and Visualization of Archived Oceanographic Acoustic, Optical, and Sensor Data to Support Improved Access and Interpretation

    NASA Astrophysics Data System (ADS)

    Malik, M. A.; Cantwell, K. L.; Reser, B.; Gray, L. M.

    2016-02-01

    Marine researchers and managers routinely rely on interdisciplinary data sets collected using hull-mounted sonars, towed sensors, or submersible vehicles. These data sets can be broadly categorized into acoustic remote sensing, imagery-based observations, water property measurements, and physical samples. The resulting raw data sets are overwhelmingly large and complex, and often require specialized software and training to process. To address these challenges, NOAA's Office of Ocean Exploration and Research (OER) is developing tools to improve the discoverability of raw data sets and the integration of quality-controlled processed data, in order to facilitate re-use of archived oceanographic data. The majority of recently collected OER raw oceanographic data can be retrieved from national data archives (e.g., NCEI and the NOAA central library). Merging of disparate data sets by scientists with diverse expertise, however, remains problematic. Initial efforts at OER have focused on merging geospatial acoustic remote sensing data with imagery and water property measurements that typically lack direct geo-referencing. OER has developed 'smart' ship and submersible tracks that provide a synopsis of the geospatial coverage of various data sets. Tools under development enable scientists to quickly assess the relevance of archived OER data to their respective research or management interests, and enable quick access to the desired raw and processed data sets. Pre-processing and visualization to combine various data sets also offer benefits by streamlining data quality assurance and quality control efforts.

  5. Geospatial Modelling for Micro Zonation of Groundwater Regime in Western Assam, India

    NASA Astrophysics Data System (ADS)

    Singh, R. P.

    2016-12-01

    Water, the most precious natural resource on earth, is vital to sustaining the natural system and human civilisation. The Assam state, located in the north-eastern part of India, has relatively good sources of groundwater due to its geographic and physiographic location, but deterioration of groundwater quality is causing major health problems in the area. In this study, an integrated approach combining remote sensing, GIS and chemical analysis of groundwater samples was used to throw light on the groundwater regime and provide information for decision makers to support sustainable water resource management. The geospatial modelling was performed by integrating hydrogeomorphic features. Geomorphology, lineament, drainage and landuse/landcover layers were generated through visual interpretation of satellite imagery (LISS III) based on the tone, texture, shape, size, and arrangement of features. The slope layer was prepared using the SRTM DEM data set. The LULC of the area was categorised into six classes: agricultural field, forest area, river, settlement, tree-clad area and wetlands. The geospatial modelling was performed through the weightage-and-rank method in GIS, depending on the influence of each feature on the groundwater regime (see the sketch below). To assess the groundwater quality of the area, 45 groundwater samples were collected from the field and chemically analysed using standard laboratory methods. The overall groundwater quality of the area was analysed through the Water Quality Index, and about 70% of the samples were found not potable for drinking purposes due to higher concentrations of arsenic, fluoride and iron. It appears that all these pollutants are geologically and geomorphologically derived. The interpolated Water Quality Index layer and the geospatially modelled groundwater potential layer provide a holistic view of the groundwater scenario and give direction for better planning and groundwater resource management. The study will be discussed in detail during the conference.
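
    The weightage-and-rank overlay referenced above reduces to a weighted sum of reclassified thematic layers. The sketch below shows that arithmetic in plain numpy; the layer ranks and weights are hypothetical stand-ins for the interpreted layers.

    ```python
    # Sketch: weightage-and-rank overlay for groundwater-potential mapping.
    # Layer rasters, ranks, and weights below are hypothetical examples.
    import numpy as np

    rows, cols = 4, 4
    rng = np.random.default_rng(2)

    # Each thematic layer is already reclassified to ranks (1 = poor ... 5 = good).
    layers = {
        "geomorphology": rng.integers(1, 6, (rows, cols)),
        "lineament":     rng.integers(1, 6, (rows, cols)),
        "drainage":      rng.integers(1, 6, (rows, cols)),
        "slope":         rng.integers(1, 6, (rows, cols)),
        "landuse":       rng.integers(1, 6, (rows, cols)),
    }
    # Weights reflect each layer's assumed influence on recharge; they sum to 1.
    weights = {"geomorphology": 0.30, "lineament": 0.20, "drainage": 0.20,
               "slope": 0.15, "landuse": 0.15}

    potential = sum(weights[k] * layers[k].astype(float) for k in layers)
    print(np.round(potential, 2))   # higher score = higher groundwater potential
    ```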

  6. Challenges in sharing of geospatial data by data custodians in South Africa

    NASA Astrophysics Data System (ADS)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has as a key objective to share geospatial data. The collection and maintenance of geospatial data is expensive and time consuming and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geo-spatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not what are the reasons for this. Through an international comparative assessment it then determines if the compliance with the SDI Act is not too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  7. SemantGeo: Powering Ecological and Environment Data Discovery and Search with Standards-Based Geospatial Reasoning

    NASA Astrophysics Data System (ADS)

    Seyed, P.; Ashby, B.; Khan, I.; Patton, E. W.; McGuinness, D. L.

    2013-12-01

    Recent efforts to create and leverage standards for geospatial data specification and inference include the GeoSPARQL standard, Geospatial OWL ontologies (e.g., GAZ, Geonames), and RDF triple stores that support GeoSPARQL (e.g., AllegroGraph, Parliament) and use RDF instance data for geospatial features of interest. However, there remains a gap on how best to fuse software engineering best practices and GeoSPARQL within semantic web applications to enable flexible search driven by geospatial reasoning. In this abstract we introduce the SemantGeo module for the SemantEco framework, which helps fill this gap, enabling scientists to find data using geospatial semantics and reasoning. SemantGeo provides multiple types of geospatial reasoning for SemantEco modules. The server-side implementation uses the Parliament SPARQL Endpoint accessed via a Tomcat servlet. SemantGeo uses the Google Maps API for user-specified polygon construction and JsTree for providing containment and categorical hierarchies for search. SemantGeo uses GeoSPARQL for spatial reasoning alone and in concert with RDFS/OWL reasoning capabilities to determine, e.g., what geofeatures are within, partially overlap with, or are within a certain distance from, a given polygon. We also leverage qualitative relationships defined by the Gazetteer ontology that are composites of spatial relationships as well as administrative designations or geophysical phenomena. We provide multiple mechanisms for exploring data, such as polygon (map-based) and named-feature (hierarchy-based) selection, that enable flexible search constraints using boolean combinations of selections. JsTree-based hierarchical search facets present named features and include a 'part of' hierarchy (e.g., measurement-site-01, Lake George, Adirondack Region, NY State) and type hierarchies (e.g., nodes in the hierarchy for WaterBody, Park, MeasurementSite), depending on the 'axis of choice' option selected. Using GeoSPARQL and the aforementioned ontology, these hierarchies are constrained based on polygon selection, where the corresponding polygons of the contained features are visually rendered to assist exploration. Once measurement sites are plotted based on an initial search, subsequent searches using JsTree selections can extend the previous one based on nearby waterbodies in some semantic relationship of interest. For example, 'tributary of' captures water bodies that flow into the current one, and extending the original search to include tributaries of the observed water body is useful to environmental scientists for isolating the source of characteristic levels, including pollutants. Ultimately any SemantEco module can leverage SemantGeo's underlying APIs, which are used in a deployment of SemantEco that combines EPA and USGS water quality data, and in one customized for searching data available from the Darrin Freshwater Institute. Future work will address generating RDF geometry data from shape files, aligning RDF data sources to better leverage qualitative and spatial relationships, and validating newly generated RDF data adhering to the GeoSPARQL standard.
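
    To make the GeoSPARQL mechanics concrete, the sketch below issues the kind of containment query SemantGeo relies on against a SPARQL endpoint such as Parliament. The endpoint URL and polygon are hypothetical; the geo: and geof: terms come from the OGC GeoSPARQL standard.

    ```python
    # Sketch: find measurement sites whose geometry falls inside a polygon,
    # via a GeoSPARQL sfWithin filter. Endpoint and polygon are hypothetical.
    from SPARQLWrapper import SPARQLWrapper, JSON

    query = """
    PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
    PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
    SELECT ?site WHERE {
      ?site geo:hasGeometry ?geom .
      ?geom geo:asWKT ?wkt .
      FILTER(geof:sfWithin(?wkt,
        "POLYGON((-74.2 43.5, -73.4 43.5, -73.4 43.9, -74.2 43.9, -74.2 43.5))"^^geo:wktLiteral))
    }
    """

    endpoint = SPARQLWrapper("http://example.org/parliament/sparql")
    endpoint.setQuery(query)
    endpoint.setReturnFormat(JSON)
    for row in endpoint.query().convert()["results"]["bindings"]:
        print(row["site"]["value"])   # measurement sites inside the polygon
    ```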

  8. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  9. Modelling a suitable location for Urban Solid Waste Management using AHP method and GIS - A geospatial approach and MCDM Model

    NASA Astrophysics Data System (ADS)

    Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.

    2016-12-01

    Multi-Criteria Decision Making (MCDM) is an advanced analytical method for deriving an appropriate result or decision from multiple, often conflicting, criteria. Geospatial approaches (e.g., remote sensing and GIS) are likewise advanced technical means of collecting, processing, and analyzing varied spatial data. GIS and remote sensing combined with MCDM techniques can therefore provide a strong platform for solving complex decision-making problems, and the combination has been used very effectively in site selection for solid waste management in urban policy. The most popular MCDM technique is the Weighted Linear Combination (WLC) method, while the Analytical Hierarchy Process (AHP) is another popular and consistent technique used worldwide for dependable decision making. Consequently, the main objective of this study is to develop an AHP model as an MCDM technique within a Geographic Information System (GIS) to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste needs an appropriate landfill site selected with regard to the environmental, geological, social, and technical aspects of the region. An MCDM model was generated from five criterion classes related to these environmental, geological, social, and technical aspects using the AHP method, and the resulting weights were input into GIS to produce the final suitability map for urban solid waste management. The final suitable locations amount to 12.2% of the total study area, corresponding to 22.89 km2. The study area is Keraniganj sub-district of Dhaka district in Bangladesh, a densely populated area that currently has an unmanaged waste management system and, in particular, lacks suitable landfill sites for waste dumping.
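
    The core of the AHP step described above can be sketched in a few lines: derive criterion weights as the principal eigenvector of a pairwise comparison matrix, check Saaty's consistency ratio, and combine co-registered criterion rasters by weighted linear combination. The criterion names (including the fifth one), comparison values, and rasters below are illustrative only, not the study's data.

    ```python
    # Sketch of AHP weighting plus weighted linear combination (WLC).
    import numpy as np

    criteria = ["environmental", "geological", "social", "technical", "accessibility"]
    # Saaty 1-9 scale pairwise comparisons (reciprocal matrix, invented values).
    A = np.array([
        [1,   3,   5,   3,   2],
        [1/3, 1,   3,   2,   1],
        [1/5, 1/3, 1,   1/2, 1/3],
        [1/3, 1/2, 2,   1,   1],
        [1/2, 1,   3,   1,   1],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                      # principal-eigenvector weights

    # Consistency ratio: CI = (lambda_max - n) / (n - 1); RI from Saaty's table.
    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)
    RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    print("weights:", dict(zip(criteria, weights.round(3))))
    print("CR =", round(CI / RI, 3))              # accepted when CR < 0.1

    # Weighted linear combination over co-registered 0-1 suitability rasters.
    rasters = np.random.rand(n, 100, 100)          # placeholder criterion layers
    suitability = np.tensordot(weights, rasters, 1)  # per-pixel landfill score
    ```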

  10. Geospatial Data Science Research Staff | Geospatial Data Science | NREL

    Science.gov Websites

    [Staff directory listing: names, titles, and contact information for NREL Geospatial Data Science researchers, including Ricardo Oliveira (Researcher II, Geospatial Science, Ricardo.Oliveira@nrel.gov, 303-275-3272) and Nicholas Grue (Researcher III, Geospatial Science).]

  11. PLANNING QUALITY IN GEOSPATIAL PROJECTS

    EPA Science Inventory

    This presentation will briefly review some legal drivers and present a structure for writing geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.

  12. Automatic geospatial information Web service composition based on ontology interface matching

    NASA Astrophysics Data System (ADS)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be exposed as geospatial information services, helping to overcome information isolation in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, therefore plays the key role in taking full advantage of geospatial information services. This paper proposes an automatic geospatial information Web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among interfaces. By matching input/output parameters and the semantic meanings of pairs of service interfaces, a geospatial information Web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented; its results show the feasibility of the algorithm and its great promise for the emerging demand for geospatial information Web service composition.
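
    A minimal sketch of the semantic-distance idea, assuming the NLTK interface to WordNet: an output parameter of one service is judged chainable to an input parameter of another when their best synset similarity exceeds a threshold. The parameter names and the threshold are invented; the paper's actual matching algorithm is not reproduced here.

    ```python
    # WordNet-based similarity between two interface parameter names.
    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    def similarity(term_a: str, term_b: str) -> float:
        """Best path similarity over all synset pairs of the two terms."""
        best = 0.0
        for s1 in wn.synsets(term_a):
            for s2 in wn.synsets(term_b):
                score = s1.path_similarity(s2)
                if score is not None:          # None for incomparable synsets
                    best = max(best, score)
        return best

    def chainable(output_param: str, input_param: str, threshold: float = 0.5) -> bool:
        # One upstream output feeds one downstream input when terms are close.
        return similarity(output_param, input_param) >= threshold

    print(similarity("elevation", "altitude"))   # high: likely chainable
    print(similarity("elevation", "rainfall"))   # low: poor interface match
    print(chainable("elevation", "altitude"))
    ```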

  13. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  14. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    ERIC Educational Resources Information Center

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors…

  15. Literature and Product Review of Visual Analytics for Maritime Awareness

    DTIC Science & Technology

    2009-10-28

    the user's knowledge and experience.
    • Riveiro et al [107] provide a useful discussion of the cognitive process of anomaly detection based on... changes over time can be seen visually.
    • Wilkinson et al [140] suggest that we need visual analytics for three principal purposes: checking raw data... Predictions within the Current Plot
    • Yue et al [146] describe an AI blackboard-based agent that leverages interactive visualization and mixed...

  16. Big data in medical informatics: improving education through visual analytics.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    A continuous effort to improve healthcare education today is driven by the need to create competent health professionals able to meet healthcare demands. Limited research reports how manipulation of educational data can help improve healthcare education. The emerging research field of visual analytics has the advantage of combining big data analysis and manipulation techniques, information and knowledge representation, and the human cognitive strength to perceive and recognise visual patterns. The aim of this study was therefore to explore novel ways of representing curriculum and educational data using visual analytics. Three approaches to the visualization and representation of educational data were presented. Five competencies addressed in courses at the undergraduate medical programme level were identified as corresponding inaccurately to higher education board competencies. Different visual representations appear to have potential to affect the ability to perceive entities and connections in curriculum data.

  17. 75 FR 6056 - National Geospatial Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...

  18. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow: a sequence of operations that uses a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services to implement asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) initiative, illustrates the implementation of asynchronous geospatial processing workflows and the challenges of using the Web Services Business Process Execution Language (WS-BPEL) to develop them.
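
    The submit-then-poll pattern at the heart of such workflows can be sketched with Python's asyncio. The URLs are placeholders and, for brevity, the status document is assumed to be JSON, whereas a real OGC WPS returns an XML ExecuteResponse with a statusLocation.

    ```python
    # Asynchronous client sketch: submit a long-running geoprocess, poll its
    # status URL, and run independent workflow branches concurrently.
    import asyncio
    import aiohttp

    async def run_process(session, execute_url, params):
        async with session.get(execute_url, params=params) as resp:
            job = await resp.json()            # assumed JSON with a status link
        status_url = job["statusLocation"]

        while True:                            # poll instead of blocking
            async with session.get(status_url) as resp:
                status = await resp.json()
            if status["state"] in ("succeeded", "failed"):
                return status
            await asyncio.sleep(5)             # other work may proceed meanwhile

    async def main():
        async with aiohttp.ClientSession() as session:
            results = await asyncio.gather(    # two branches run concurrently
                run_process(session, "https://example.org/wps", {"identifier": "reproject"}),
                run_process(session, "https://example.org/wps", {"identifier": "clip"}),
            )
            print(results)

    asyncio.run(main())
    ```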

  19. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
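
    A hedged sketch of calling a WPS-compliant analytics endpoint with the standard OGC WPS 1.0.0 key-value encoding; the service URL, process identifier, and inputs are placeholders, not the actual NASA climate data analytics services or CDS API.

    ```python
    # Standard WPS 1.0.0 Execute request via key-value pairs (KVP).
    import requests

    WPS_URL = "https://example.org/climate/wps"  # hypothetical endpoint

    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "average",                 # hypothetical process name
        "datainputs": "variable=tas;collection=MERRA2;start=1980-01;end=2015-12",
        "storeExecuteResponse": "true",          # ask for asynchronous execution
        "status": "true",
    }
    resp = requests.get(WPS_URL, params=params, timeout=60)
    print(resp.status_code)
    print(resp.text[:500])  # XML ExecuteResponse containing a statusLocation
    ```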

  20. Geospatial analysis of spaceborne remote sensing data for assessing disaster impacts and modeling surface runoff in the built-environment

    NASA Astrophysics Data System (ADS)

    Wodajo, Bikila Teklu

    Every year, coastal disasters such as hurricanes and floods claim hundreds of lives and severely damage homes, businesses, and lifeline infrastructure. This research was motivated by the 2005 Hurricane Katrina disaster, which devastated the Mississippi and Louisiana Gulf Coast. The primary objective was to develop a geospatial decision-support system for extracting built-up surfaces and estimating disaster impacts using spaceborne remote sensing satellite imagery. Pre-Katrina 1-m Ikonos imagery of a 5 km x 10 km area of Gulfport, Mississippi, was used as source data to develop the built-up area and natural surfaces (BANS) classification methodology. Autocorrelation values of 0.6 or higher for the spectral reflectance of ground-truth pixels were used to select spectral bands and to establish the BANS decision criteria as unique ranges of reflectance values. Surface classification results for Gulfport sample areas, produced with GeoMedia Pro geospatial analysis using the BANS criteria and manually drawn polygons, were within +/-7% of the ground truth. The difference between the BANS results and the ground truth was not statistically significant. BANS is a significant improvement over other supervised classification methods, which yielded only 50% correctly classified pixels. The storm debris and erosion estimation (SDE) methodology was developed from analysis of pre- and post-Katrina surface classification results for the Gulfport samples. The SDE severity-level criteria considered hurricane and flood damage and the vulnerability of the inhabited built environment. A linear regression model, with a Pearson correlation of +0.93, was developed for predicting SDE as a function of pre-disaster percent built-up area. SDE predictions for the Gulfport sample areas used for validation were within +/-4% of calculated values. The damage cost model considered maintenance, rehabilitation, and reconstruction costs related to infrastructure damage and community impacts of Hurricane Katrina. The developed models were implemented for a study area along I-10, considering the predominantly flood-induced damage in New Orleans. The BANS methodology was calibrated for 0.6-m QuickBird2 multispectral imagery of the Karachi Port area in Pakistan; the results were accurate within +/-6% of the ground truth. Due to its computational simplicity, the unit hydrograph method is recommended for geospatial visualization of surface runoff in the built environment using BANS surface classification maps and elevation data. Keywords: geospatial analysis, satellite imagery, built-environment, hurricane, disaster impacts, runoff.
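
    The flavor of a per-band reflectance-range decision rule like the BANS criteria can be sketched as follows; the band ranges here are invented, whereas the study derived its ranges from autocorrelation with ground-truth pixels.

    ```python
    # Per-band reflectance-window classification sketch (built-up vs. natural).
    import numpy as np

    # image: (bands, rows, cols) array of reflectance values (placeholder data).
    image = np.random.rand(4, 512, 512)

    # Hypothetical "built-up" reflectance windows per band as (min, max).
    BUILTUP_RANGES = [(0.25, 0.60), (0.20, 0.55), (0.18, 0.50), (0.30, 0.70)]

    mask = np.ones(image.shape[1:], dtype=bool)
    for band, (lo, hi) in zip(image, BUILTUP_RANGES):
        mask &= (band >= lo) & (band <= hi)   # pixel must satisfy every band rule

    print(f"Estimated built-up area: {mask.mean():.1%}")
    ```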

  1. Visualization and interaction tools for aerial photograph mosaics

    NASA Astrophysics Data System (ADS)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophotos mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  2. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use, and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata attached to the data and processes, and particularly metadata on their quality, is essential for assessing the reliability of the scientific model that a workflow represents [2]. Management tools that deal with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing one, for example, to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions for workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow; the resulting metadata quality is stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the quality information retrieved either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization pointing out, on the workflow graph itself, where the workflow needs improving with better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK

  3. Architecture of the local spatial data infrastructure for regional climate change research

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny

    2013-04-01

    Georeferenced datasets (meteorological databases, modelling and reanalysis results, etc.) are actively used in the modelling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which can reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support based on the SDI approach. A dedicated architecture for a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing searches for geoinformation resources (datasets and services) using metadata catalogs, producing geospatial data selections by their parameters (data access functionality), and managing services and applications for cartographic visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, developing services for environmental geodata access, processing, and visualization is quite a complex task. These circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented therefore includes: 1. A model for storing big sets of regional georeferenced data that is effective in terms of search, access, retrieval, and subsequent statistical processing, allowing in particular the storage of frequently used values (such as monthly and annual climate change indices), thus providing different temporal views of the datasets. 2. A general architecture for the corresponding software components handling geospatial datasets within the storage model. 3. A metadata catalog, a basic element of the spatial data infrastructure, describing the datasets used in climate research in detail using the ISO 19115 and CF-convention standards, published according to the OGC CSW (Catalog Service for the Web) specification. 4. Computational and mapping web services for working with geospatial datasets based on the OWS (OGC Web Services) standards: WMS, WFS, WPS. 5. A geoportal as the key element of the thematic regional spatial data infrastructure, also providing a software framework for the development of dedicated web applications. To realize the web mapping services, GeoServer is used, since it provides a native WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork Opensource product (http://geonetwork-opensource.org) is planned to be used, as it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software has been selected: 1. The OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser. 2. The GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interfaces of popular desktop GIS applications such as uDig and QuantumGIS. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
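
    A thin client's interaction with such an SDI can be sketched with OWSLib, fetching a rendered map from the GeoServer WMS endpoint; the URL, layer name, and extent are placeholders.

    ```python
    # Sketch: request a map image from an SDI's OGC WMS service via OWSLib.
    from owslib.wms import WebMapService

    wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")

    img = wms.getmap(
        layers=["climate:annual_mean_temperature"],  # hypothetical layer
        srs="EPSG:4326",
        bbox=(60.0, 50.0, 110.0, 70.0),              # lon/lat extent (illustrative)
        size=(800, 400),
        format="image/png",
        transparent=True,
    )
    with open("temperature.png", "wb") as f:
        f.write(img.read())                          # PNG rendered by the server
    ```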

  4. Geospatial Service Platform for Education and Research

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.

    2014-04-01

    We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held in the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries and train them to use online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as instances to explain specifically how to build service chains for different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.

  5. An Intelligent Polar Cyberinfrastructure to Support Spatiotemporal Decision Making

    NASA Astrophysics Data System (ADS)

    Song, M.; Li, W.; Zhou, X.

    2014-12-01

    In the era of big data, the polar sciences face an urgent demand for intelligent approaches to support precise and effective spatiotemporal decision-making. Service-oriented cyberinfrastructure has the advantages of seamlessly integrating distributed computing resources and aggregating a variety of geospatial data derived from Earth observation networks. This paper focuses on building a smart service-oriented cyberinfrastructure to support intelligent question answering over polar datasets. The innovations of this polar cyberinfrastructure include: (1) a problem-solving environment that parses geospatial questions in natural language, builds geoprocessing rules, composes atomic processing services, and executes the entire workflow; (2) a self-adaptive spatiotemporal filter that is capable of refining query constraints through semantic analysis; (3) a dynamic visualization strategy to support animation of results and statistics in multiple spatial reference systems; and (4) a user-friendly online portal to support collaborative decision-making. By means of this polar cyberinfrastructure, we intend to facilitate the integration of distributed and heterogeneous Arctic datasets and the comprehensive analysis of multiple environmental elements (e.g. snow, ice, permafrost) to provide a better understanding of environmental variation in circumpolar regions.

  6. EPA GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team - EPA/600/R-00/009) was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...

  7. Geospatial Thinking of Information Professionals

    ERIC Educational Resources Information Center

    Bishop, Bradley Wade; Johnston, Melissa P.

    2013-01-01

    Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…

  8. Route visualization using detail lenses.

    PubMed

    Karnick, Pushpak; Cline, David; Jeschke, Stefan; Razdan, Anshuman; Wonka, Peter

    2010-01-01

    We present a method designed to address some limitations of typical route map displays of driving directions. The main goal of our system is to generate a printable version of a route map that shows the overview and detail views of the route within a single, consistent visual frame. Our proposed visualization provides a more intuitive spatial context than a simple list of turns. We present a novel multifocus technique to achieve this goal, where the foci are defined by points of interest (POI) along the route. A detail lens that encapsulates the POI at a finer geospatial scale is created for each focus. The lenses are laid out on the map to avoid occlusion with the route and each other, and to optimally utilize the free space around the route. We define a set of layout metrics to evaluate the quality of a lens layout for a given route map visualization. We compare standard lens layout methods to our proposed method and demonstrate the effectiveness of our method in generating aesthetically pleasing layouts. Finally, we perform a user study to evaluate the effectiveness of our layout choices.
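
    A toy sketch of the greedy flavor of such a layout: for each POI, candidate lens rectangles are tried around the focus, and the first one that avoids the route and the previously placed lenses is kept. The paper's actual layout metrics and optimization are considerably richer.

    ```python
    # Greedy detail-lens placement sketch over a 2D route polyline.
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float
        def intersects(self, o: "Rect") -> bool:
            return not (self.x + self.w < o.x or o.x + o.w < self.x or
                        self.y + self.h < o.y or o.y + o.h < self.y)

    def near_route(rect, route, margin=5.0):
        # Conservative test: any route vertex inside the margin-expanded rect.
        return any(rect.x - margin <= px <= rect.x + rect.w + margin and
                   rect.y - margin <= py <= rect.y + rect.h + margin
                   for px, py in route)

    def layout(pois, route, size=40.0):
        placed = []
        offsets = [(15, 15), (15, -55), (-55, 15), (-55, -55)]  # candidate corners
        for px, py in pois:
            for dx, dy in offsets:
                cand = Rect(px + dx, py + dy, size, size)
                if not near_route(cand, route) and not any(
                        cand.intersects(r) for r in placed):
                    placed.append(cand)          # first non-occluding candidate wins
                    break
        return placed

    route = [(i * 10.0, (i % 7) * 12.0) for i in range(30)]   # toy polyline
    print(len(layout([(50, 40), (120, 30), (200, 60)], route)))
    ```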

  9. Physiological and Anatomical Visual Analytics (PAVA) Background

    EPA Pesticide Factsheets

    The need to efficiently analyze human chemical disposition data from in vivo studies or in silico PBPK modeling efforts, and to see complex disposition data in a logical manner, has created a unique opportunity for visual analytics applied to PAD.

  10. EPA Geospatial Quality Council Strategic and Implementation Plan 2010 to 2015

    EPA Science Inventory

    The EPA Geospatial Quality Council (GQC) was created to promote and provide Quality Assurance guidance for the development, use, and products of geospatial science. The GQC was created when the gap between the EPA Quality Assurance (QA) and Geospatial communities was recognized. ...

  11. US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS

    EPA Science Inventory

    This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:

    o Guidance for Geospatial Data Quality Assurance Project Plans.

    o GPS - Tec...

  12. Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example

    NASA Astrophysics Data System (ADS)

    Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang

    2013-04-01

    By applying the advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with a Service Oriented Architecture (SOA), this study derived global datasets from tile-based input data and implemented a Virtual Web Map Service (VWMS) and a Virtual Web Coverage Service (VWCS) to provide software tools for the visualization and acquisition of global data. Taking the MODIS Normalized Difference Vegetation Index (NDVI) as an example, the study demonstrates the feasibility and efficiency of the proposed approach and illustrates its features.
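
    The core mosaicking mechanism the study builds on can be sketched with GDAL's Python bindings: a virtual mosaic (VRT) over the input tiles, materialized as a single BigTIFF to get past the classic 4 GB TIFF limit. The tile filenames are placeholders.

    ```python
    # Stitch tile-based inputs into one global BigTIFF with GDAL.
    from osgeo import gdal

    tiles = ["ndvi_h10v05.tif", "ndvi_h11v05.tif", "ndvi_h12v05.tif"]  # placeholders

    # A VRT references the tiles without copying pixels ...
    vrt = gdal.BuildVRT("global_ndvi.vrt", tiles)

    # ... then one materialized output; BIGTIFF=YES lifts the 4 GB TIFF limit.
    out = gdal.Translate(
        "global_ndvi.tif",
        vrt,
        creationOptions=["BIGTIFF=YES", "COMPRESS=DEFLATE", "TILED=YES"],
    )
    out = None   # close datasets to flush data to disk
    vrt = None
    ```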

  13. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics

    PubMed Central

    2016-01-01

    Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304

  14. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    PubMed

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.

  15. Just-in-time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of Earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete, and typically the simulation results are not analyzed until the model runs have ended. During the course of a simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time needed to complete a simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
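
    The monitoring loop described above can be sketched as a small watcher that submits a co-scheduled plotting job whenever the simulation writes a new output file; the paths and batch command are placeholders, not Bellerophon's actual interface.

    ```python
    # Watch a run's output directory and co-schedule visualization jobs.
    import subprocess
    import time
    from pathlib import Path

    OUTPUT_DIR = Path("/lustre/run42/output")   # hypothetical run directory
    seen = set()

    while True:
        for nc in sorted(OUTPUT_DIR.glob("*.nc")):
            if nc not in seen:
                seen.add(nc)
                # Submit a separate analysis job; the simulation keeps running.
                subprocess.run(["sbatch", "render_plots.sh", str(nc)], check=False)
        time.sleep(60)  # poll once a minute for newly produced files
    ```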

  16. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to now include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  17. The National Geospatial Technical Operations Center

    USGS Publications Warehouse

    Craun, Kari J.; Constance, Eric W.; Donnelly, Jay; Newell, Mark R.

    2009-01-01

    The United States Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) provides geospatial technical expertise in support of the National Geospatial Program in its development of The National Map, National Atlas of the United States, and implementation of key components of the National Spatial Data Infrastructure (NSDI).

  18. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics

    PubMed Central

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-01-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most “useful” or “interesting”. The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics. PMID:26779379

  19. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics.

    PubMed

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-09-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most "useful" or "interesting". The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics.
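
    The deviation-based utility idea can be sketched as follows: a candidate visualization is an aggregate view (a group-by dimension plus a measure), and its utility is the distance between the view's normalized distribution on the target subset and on the whole table. The column names, data, and choice of L1 distance are illustrative; SeeDB considers several distance functions.

    ```python
    # Deviation-based utility of one aggregate view (group-by + measure).
    import numpy as np
    import pandas as pd

    def utility(df, target_mask, dimension, measure):
        p = df[target_mask].groupby(dimension)[measure].mean()  # target subset
        q = df.groupby(dimension)[measure].mean()               # reference (all rows)
        p = p.reindex(q.index, fill_value=0)
        p, q = p / p.sum(), q / q.sum()            # normalize to distributions
        return float(np.abs(p - q).sum())          # L1 deviation (one option)

    df = pd.DataFrame({
        "genre":  np.random.choice(["action", "drama", "comedy"], 1000),
        "budget": np.random.exponential(50, 1000),
    })
    target = df["budget"] > 100                    # subset under study
    print(utility(df, target, "genre", "budget"))  # higher = more "interesting"
    ```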

  20. The Geospatial Web and Local Geographical Education

    ERIC Educational Resources Information Center

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  1. Effects of Using Dynamic Mathematics Software on Preservice Mathematics Teachers' Spatial Visualization Skills: The Case of Spatial Analytic Geometry

    ERIC Educational Resources Information Center

    Kösa, Temel

    2016-01-01

    The purpose of this study was to investigate the effects of using dynamic geometry software on preservice mathematics teachers' spatial visualization skills and to determine whether spatial visualization skills can be a predictor of success in learning analytic geometry of space. The study used a quasi-experimental design with a control group.…

  2. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  3. Explore Earth Science Datasets for STEM with the NASA GES DISC Online Visualization and Analysis Tool, GIOVANNI

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Acker, J. G.; Kempler, S. J.

    2016-12-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of twelve NASA Science Mission Directorate (SMD) Data Centers that provide Earth science data, information, and services to research scientists, applications scientists, applications users, and students around the world. The GES DISC is the home (archive) of NASA Precipitation and Hydrology, as well as Atmospheric Composition and Dynamics, remote sensing data and information. To facilitate Earth science data access, the GES DISC has been developing user-friendly data services for users at different levels. Among them, the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI, http://giovanni.gsfc.nasa.gov/) allows users to explore satellite-based data with sophisticated analyses and visualizations without downloading data or software, which makes it particularly suitable for novices using NASA datasets in STEM activities. In this presentation, we will briefly introduce GIOVANNI and recommend datasets for STEM. Examples of using these datasets in STEM activities will be presented as well.

  4. Explore Earth Science Datasets for STEM with the NASA GES DISC Online Visualization and Analysis Tool, Giovanni

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Acker, J.; Kempler, S.

    2016-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is one of twelve NASA Science Mission Directorate (SMD) Data Centers that provide Earth science data, information, and services to users around the world, including research and application scientists, students, citizen scientists, etc. The GES DISC is the home (archive) of remote sensing datasets for NASA Precipitation and Hydrology, Atmospheric Composition and Dynamics, etc. To facilitate Earth science data access, the GES DISC has been developing user-friendly data services for users at different levels in different countries. Among them, the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni, http://giovanni.gsfc.nasa.gov) allows users to explore satellite-based datasets using sophisticated analyses and visualization without downloading data and software, which is particularly suitable for novices (such as students) using NASA datasets in STEM (science, technology, engineering and mathematics) activities. In this presentation, we will briefly introduce Giovanni along with examples for STEM activities.

  5. Sensing and Virtual Worlds - A Survey of Research Opportunities

    NASA Technical Reports Server (NTRS)

    Moore, Dana

    2012-01-01

    Virtual Worlds (VWs) have been used effectively in live and constructive military training. An area that remains fertile ground for exploration and a new vision involves integrating various traditional and now non-traditional sensors into virtual worlds. In this paper, we will assert that the benefits of this integration are several. First, we maintain that virtual worlds offer improved sensor deployment planning through improved visualization and stimulation of the model, using geo-specific terrain and structure. Secondly, we assert that VWs enhance the mission rehearsal process, and that using a mix of live avatars, non-player characters, and live sensor feeds (e.g. real time meteorology) can help visualization of the area of operations. Finally, tactical operations are improved via better collaboration and integration of real world sensing capabilities, and in most situations, 3D VWs improve the state of the art over current "dots on a map" 2D geospatial visualization. However, several capability gaps preclude a fuller realization of this vision. In this paper, we identify many of these gaps and suggest research directions.

  6. 3D Visibility Analysis in Urban Environment - Cognition Research Based on VGE

    NASA Astrophysics Data System (ADS)

    Lin, T. P.; Lin, H.; Hu, M. Y.

    2013-09-01

    This research attempts to establish a measurable relationship between the physical environment and human visual perception, including distance, visual-angle impact, and the visual field (a 3D isovist concept), as they bear on human cognition, using a 3D visibility analysis method built on the Virtual Geographic Environment (VGE) platform. The project is carried out on the CUHK campus (The Chinese University of Hong Kong), using a virtual 3D model of the whole campus together with a survey in the real world. A possible model for the simulation of human cognition in urban spaces is expected as the output of this research: what humans perceive from the environment, how they feel and behave, and how they affect the surrounding world. Kevin Lynch raised five elements of urban design in the 1960s: "vitality, sense, fit, access and control". As urban design has developed, several problems concerning human cognition and behaviour have emerged. Owing to the limited knowledge of sensing in urban spaces, the "sense" and "fit" of urban design have received little attention in recent decades. The geospatial cognition field came into being in 1997 and has developed over the past 15 years, contributing substantially to way-finding and urban behaviour simulation on GIS (geographic information system) or VGE platforms. The VGE platform is recognized as a proper tool for analysing human perception in urban places because of its efficient 3D spatial data management and excellent 3D visualization of output results. This article describes the visibility analysis method based on the 3D VGE platform. Given the uncertainty and variety of human perception involved in this research, the author arranges an observer survey to validate the analysis results. Four measures relating space and human perception are of primary concern in this proposal: openness, permeability, environmental pressure and visibility; these are also used to identify different types of spaces. Overall, the author aims to contribute a possible way of understanding human cognition in geospatial environments and to provide the related research fields with an effective mathematical model linking spatial information and visual perception.
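
    The 2D core of an isovist-style openness measure can be sketched by casting rays from a viewpoint over an occupancy grid and averaging the free-sight distance; a real 3D VGE analysis adds height, visual angle, and field of view. The grid and viewpoint below are toy inputs.

    ```python
    # Ray-casting sketch of a 2D isovist "openness" measure on an occupancy grid.
    import numpy as np

    def isovist_openness(grid, vx, vy, n_rays=360, max_dist=100.0, step=0.5):
        total = 0.0
        for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
            dx, dy = np.cos(theta), np.sin(theta)
            d = 0.0
            while d < max_dist:
                x, y = int(vx + d * dx), int(vy + d * dy)
                if not (0 <= x < grid.shape[1] and 0 <= y < grid.shape[0]):
                    break                      # ray leaves the study area
                if grid[y, x] == 1:
                    break                      # ray hits a building facade
                d += step
            total += d                         # ray length ~ visible extent
        return total / n_rays                  # mean free-sight distance

    grid = np.zeros((200, 200), dtype=int)
    grid[80:120, 80:120] = 1                   # one solid city block
    print(isovist_openness(grid, 50, 50))      # openness at viewpoint (50, 50)
    ```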

  7. 4-D Visualization of Seismic and Geodetic Data of the Big Island of Hawai'i

    NASA Astrophysics Data System (ADS)

    Burstein, J. A.; Smith-Konter, B. R.; Aryal, A.

    2017-12-01

    For decades Hawai'i has served as a natural laboratory for studying complex interactions between magmatic and seismic processes. Investigating characteristics of these processes, as well as the crustal response to major Hawaiian earthquakes, requires a synthesis of seismic and geodetic data and models. Here, we present a 4-D visualization of the Big Island of Hawai'i that investigates geospatial and temporal relationships of seismicity, seismic velocity structure, and GPS crustal motions to known volcanic and seismically active features. Using the QPS Fledermaus visualization package, we compile 90 m resolution topographic data from NASA's Shuttle Radar Topography Mission (SRTM) and 50 m resolution bathymetric data from the Hawaiian Mapping Research Group (HMRG) with a high-precision earthquake catalog of more than 130,000 events from 1992-2009 [Matoza et al., 2013] and a 3-D seismic velocity model of Hawai'i [Lin et al., 2014] based on seismic data from the Hawaiian Volcano Observatory (HVO). Long-term crustal motion vectors are integrated into the visualization from HVO GPS time-series data. These interactive data sets reveal well-defined seismic structure near the summit areas of Mauna Loa and Kilauea volcanoes, where high Vp and high Vp/Vs anomalies at 5-12 km depth, as well as clusters of low magnitude (M < 3.5) seismicity, are observed. These areas of high Vp and high Vp/Vs are interpreted as mafic dike complexes and the surrounding seismic clusters are associated with shallow magma processes. GPS data are also used to help identify seismic clusters associated with the steady crustal detachment of the south flank of Kilauea's East Rift Zone. We also investigate the fault geometry of the 2006 M6.7 Kiholo Bay earthquake event by analyzing elastic dislocation deformation modeling results [Okada, 1985] and HVO GPS and seismic data of this event. We demonstrate the 3-D fault mechanisms of the Kiholo Bay main shock as a combination of strike-slip and dip-slip components (net slip 0.55 m) delineating a 30 km east-west striking, southward-dipping fault plane, occurring at 39 km depth. This visualization serves as a resource for advancing scientific analyses of Hawaiian seismic processes, as well as an interactive educational tool for demonstrating the geospatial and geophysical structure of the Big Island of Hawai'i.
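
    One layer of such a visualization, a 3D scatter of hypocenters colored by magnitude, can be sketched with matplotlib; the catalog below is randomly generated for illustration, and the study itself used QPS Fledermaus rather than matplotlib.

    ```python
    # 3D hypocenter scatter sketch (longitude, latitude, depth; color = magnitude).
    import matplotlib.pyplot as plt
    import numpy as np
    import pandas as pd

    cat = pd.DataFrame({                       # placeholder catalog, not HVO data
        "lon": np.random.uniform(-156.0, -154.8, 500),
        "lat": np.random.uniform(18.9, 20.3, 500),
        "depth_km": np.random.uniform(0, 40, 500),
        "mag": np.random.exponential(0.8, 500) + 1,
    })

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    sc = ax.scatter(cat.lon, cat.lat, -cat.depth_km, c=cat.mag, s=4, cmap="viridis")
    ax.set_xlabel("Longitude")
    ax.set_ylabel("Latitude")
    ax.set_zlabel("Depth (km, downward)")
    fig.colorbar(sc, label="Magnitude")
    plt.show()
    ```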

  8. Looking back to inform the future: The role of cognition in forest disturbance characterization from remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel Anne

    Remotely sensed images have become a ubiquitous part of our daily lives. From novice users aiding in search and rescue missions with tools such as TomNod, to trained analysts synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identifying land cover and land use change. The analysts participating in this research are currently working as part of a national-level analysis of land use change and are well versed in the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts by improving their awareness of the mental processes they use during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.

  9. Empirical Analysis of the Subjective Impressions and Objective Measures of Domain Scientists’ Visual Analytic Judgments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Burrows, Susannah M.; Han, Kyungsik

    2017-05-08

    Scientists often use specific data analysis and presentation methods familiar within their domain. But does high familiarity drive better analytical judgment? This question is especially relevant when familiar methods themselves can have shortcomings: many visualizations used conventionally for scientific data analysis and presentation do not follow established best practices. This necessitates new methods that might be unfamiliar yet prove to be more effective. But there is little empirical understanding of the relationships between scientists’ subjective impressions about familiar and unfamiliar visualizations and objective measures of their visual analytic judgments. To address this gap and to study these factors, we focus on visualizations used for comparison of climate model performance. We report on a comprehensive survey-based user study with 47 climate scientists and present an analysis of: (i) relationships among scientists’ familiarity, their perceived levels of comfort, confidence, and accuracy, and objective measures of accuracy; and (ii) relationships among domain experience, visualization familiarity, and post-study preference.

  10. a Kml-Based Approach for Distributed Collaborative Interpretation of Remote Sensing Images in the Geo-Browser

    NASA Astrophysics Data System (ADS)

    Huang, L.; Zhu, X.; Guo, W.; Xiang, L.; Chen, X.; Mei, Y.

    2012-07-01

    Existing implementations of collaborative image interpretation have many limitations for very large satellite images, such as inefficient browsing and slow transmission. This article presents a KML-based approach to support distributed, real-time, synchronous collaborative interpretation of remote sensing images in the geo-browser. As an OGC standard, KML (Keyhole Markup Language) has the advantage of organizing various types of geospatial data (including imagery, annotations, and geometry) in the geo-browser. Existing KML elements can be used to describe simple interpretation results indicated by vector symbols. To broaden its application, this article extends KML with elements that describe complex image processing operations, including band combination, grey-level transformation, and geometric correction. The improved KML is employed to describe and share interpretation operations and results among interpreters. Further, this article develops several collaboration-related services: a collaboration launch service, a perceiving service, and a communication service. The launch service creates a collaborative interpretation task and provides a unified interface for all participants. The perceiving service lets interpreters share collaboration awareness. The communication service provides interpreters with text-based communication. Finally, the GeoGlobe geo-browser (an extensible and flexible geospatial platform developed at LIESMARS) is selected to perform experiments in collaborative image interpretation. The geo-browser, which manages and visualizes massive geospatial information, provides distributed users with fast browsing and transmission. Within the geo-browser, GIS data (for example, DEMs, DTMs, and thematic maps) can also be integrated to help improve interpretation accuracy. Results show that the proposed method supports distributed collaborative interpretation of remote sensing images.
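
    A minimal sketch of how such a KML extension might look, using Python's standard xml.etree module to emit a Placemark whose ExtendedData block records an image-processing operation alongside the interpreted geometry. The element names ("operation", "bands", "interpreter") are hypothetical stand-ins; the paper's actual extended schema is not reproduced here.

        import xml.etree.ElementTree as ET

        KML_NS = "http://www.opengis.net/kml/2.2"
        ET.register_namespace("", KML_NS)

        def q(tag):
            # helper: qualify a tag with the KML namespace
            return f"{{{KML_NS}}}{tag}"

        kml = ET.Element(q("kml"))
        doc = ET.SubElement(kml, q("Document"))
        pm = ET.SubElement(doc, q("Placemark"))
        ET.SubElement(pm, q("name")).text = "Interpreted feature"

        # record the processing operation applied before interpretation
        ext = ET.SubElement(pm, q("ExtendedData"))
        for name, value in [("operation", "bandCombination"),
                            ("bands", "4,3,2"),
                            ("interpreter", "analyst01")]:
            data = ET.SubElement(ext, q("Data"), attrib={"name": name})
            ET.SubElement(data, q("value")).text = value

        point = ET.SubElement(pm, q("Point"))
        ET.SubElement(point, q("coordinates")).text = "114.35,30.58,0"

        ET.ElementTree(kml).write("interpretation.kml",
                                  xml_declaration=True, encoding="UTF-8")

    Because ExtendedData is part of standard KML, a file like this stays loadable in any geo-browser even when a client ignores the extra operation metadata.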

  11. Database Organisation in a Web-Enabled Free and Open-Source Software (foss) Environment for Spatio-Temporal Landslide Modelling

    NASA Astrophysics Data System (ADS)

    Das, I.; Oberai, K.; Sarathi Roy, P.

    2012-07-01

    Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth's surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spread of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historical records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS drastically reduces project costs, as proprietary software is expensive. Landslide data generated for the period 1982 to 2009 were compiled along the national highway corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the backend for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an understanding of the landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
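
    Because the susceptibility layer is exposed as an OGC WFS, any client can fetch features with a plain GetFeature request. A minimal sketch using only the Python standard library; the endpoint, map file, and type name are hypothetical placeholders, not the project's published service.

        from urllib.parse import urlencode
        from urllib.request import urlopen

        # hypothetical UMN MapServer endpoint and layer name
        base = "http://example.org/cgi-bin/mapserv?map=landslide.map"
        params = {
            "SERVICE": "WFS",
            "VERSION": "1.1.0",
            "REQUEST": "GetFeature",
            "TYPENAME": "landslide_susceptibility",
            "OUTPUTFORMAT": "GML2",
        }
        url = base + "&" + urlencode(params)

        with urlopen(url) as resp:          # returns GML features
            with open("susceptibility.gml", "wb") as f:
                f.write(resp.read())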

  12. Geospatial Information System Analysis of Healthcare Need and Telemedicine Delivery in California.

    PubMed

    Kaufman, Taylor; Geraghty, Estella M; Dullet, Navjit; King, Jesse; Kissee, Jamie; Marcin, James P

    2017-05-01

    Geospatial Information Systems (GIS) superimpose data on geographical maps to provide visual representations of data by region. Few studies have used GIS data to investigate whether telemedicine services are preferentially provided to the communities of greatest need. This study compared the healthcare needs of communities with and without telemedicine services from a university-based telemedicine program. Originating sites for all telemedicine consultations between July 1996 and December 2013 were geocoded using ArcGIS software. ZIP Code Tabulation Areas (ZCTAs) were extracted from the 2010 U.S. Census Bureau's Topologically Integrated Geographic Encoding and Referencing file and assigned a community needs index (CNI) score to reflect each ZCTA community's healthcare needs based on evidence-based barriers to healthcare access. CNI scores were compared across communities with and without active telemedicine services. One hundred ninety-four originating telemedicine clinic sites in California were evaluated. The mean CNI score for ZCTAs with at least one telemedicine clinic (3.32 ± 0.84) was significantly higher than for those without a telemedicine site (2.95 ± 0.99) and higher than the mean for all California ZCTAs (2.99 ± 1.01). Of the 194 telemedicine clinics, 71.4% were located in communities with above-average need and 33.2% were located in communities with very high need. Originating sites receiving telemedicine services from a university-based telemedicine program were located in regions with significantly higher community healthcare needs. Leveraging a geospatial information system to understand community healthcare needs provides an opportunity for payers, hospitals, and patients to be strategic in the allocation of telemedicine services.
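
    The study's core comparison is a difference of mean CNI scores between ZCTA groups. A sketch of that style of comparison with SciPy, on synthetic scores drawn to match the reported group means and standard deviations; this is illustrative only, not the study's data or its exact statistical test.

        import numpy as np
        from scipy import stats

        # synthetic CNI scores shaped to the reported summary statistics
        cni_with_tm = np.random.default_rng(0).normal(3.32, 0.84, 194)
        cni_without = np.random.default_rng(1).normal(2.95, 0.99, 1500)

        # Welch's t-test: group variances are not assumed equal
        t, p = stats.ttest_ind(cni_with_tm, cni_without, equal_var=False)
        print(f"mean CNI with telemedicine:    {cni_with_tm.mean():.2f}")
        print(f"mean CNI without telemedicine: {cni_without.mean():.2f}")
        print(f"Welch t = {t:.2f}, p = {p:.3g}")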

  13. Infrastructure for the Geospatial Web

    NASA Astrophysics Data System (ADS)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  14. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become one of the most critical local, regional, and global issues of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: (1) a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; (4) development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. TOOLS: Education provides information on water issues and risks at the local, state, national, and global scales. Visualizations comprise data analytics and visualization tools based on the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive analytics draws on publicly available water databases, using machine learning to develop water availability forecasting tools and time-lapse images to support city and urban planning.

  15. Geospatial Data Curation at the University of Idaho

    ERIC Educational Resources Information Center

    Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.

    2012-01-01

    The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…

  16. Geospatial Engineering

    DTIC Science & Technology

    2017-02-22

    manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards

  17. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the Catalogue Service for the Web (CSW) profile, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of, and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards: the International Organization for Standardization (ISO) 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share geospatial resources and make them interoperable using Grid technology, and it extends Grid technology into the geoscience communities.
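
    Clients typically discover resources in such a catalogue through CSW GetRecords queries. A sketch using OWSLib, a widely used open source OGC client library; the endpoint URL is a hypothetical placeholder, and the actual GMU/CSISS service may expose different queryable properties.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        # hypothetical CSW endpoint standing in for the real catalogue
        csw = CatalogueServiceWeb("http://example.org/csw")

        # full-text search across record metadata
        query = PropertyIsLike("csw:AnyText", "%land cover%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, rec in csw.records.items():
            print(rec_id, "-", rec.title)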

  18. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce, so ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification programmes are already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, and ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways: an online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with in total eleven sub-categories, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be complemented by proof of professional career paths and achievements, which requires a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  19. Visualization and Analytics Software Tools for Peregrine System |

    Science.gov Websites

    Learn about the visualization and analytics software tools available for the Peregrine system. R is a language and environment for statistical computing and graphics; see the R web site for more information. FastX provides remote visualization for OpenGL-based applications; see the FastX page for more information. ParaView is an open-source, multi-platform data analysis and visualization application.

  20. Addressing fundamental architectural challenges of an activity-based intelligence and advanced analytics (ABIAA) system

    NASA Astrophysics Data System (ADS)

    Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.

    2015-06-01

    The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.

  1. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general-purpose web services based upon the REST service model and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance (a trace of the data sources and analytical methods used in a scientific analysis) for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery-format-independent data structure, and the delivery of PML, ISO, and FGDC documents to clients requesting those products.
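
    The pattern described (one repository record, several metadata serializations) maps naturally onto REST URLs that vary only in a format component. A hypothetical sketch of such a client; the base URL, dataset identifier, and format names are illustrative placeholders, not Gstore's documented API.

        from urllib.request import urlopen

        # hypothetical Gstore-style REST pattern for metadata delivery
        BASE = "https://gstore.example.edu/apps/dataset"
        DATASET_ID = "1234"

        for fmt in ("iso-19115", "fgdc", "pml"):
            url = f"{BASE}/{DATASET_ID}/metadata.{fmt}.xml"
            with urlopen(url) as resp:
                with open(f"{DATASET_ID}.{fmt}.xml", "wb") as f:
                    f.write(resp.read())   # same record, three serializations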

  2. Geospatial Data Processing for 3d City Model Generation, Management and Visualization

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Nocerino, E.; Remondino, F.; Revolti, A.; Soria, G.; Piffer, S.

    2017-05-01

    Recent developments of 3D technologies and tools have increased availability and relevance of 3D data (from 3D points to complete city models) in the geospatial and geo-information domains. Nevertheless, the potential of 3D data is still underexploited and mainly confined to visualization purposes. Therefore, the major challenge today is to create automatic procedures that make best use of available technologies and data for the benefits and needs of public administrations (PA) and national mapping agencies (NMA) involved in "smart city" applications. The paper aims to demonstrate a step forward in this process by presenting the results of the SENECA project (Smart and SustaiNablE City from Above - http://seneca.fbk.eu). State-of-the-art processing solutions are investigated in order to (i) efficiently exploit the photogrammetric workflow (aerial triangulation and dense image matching), (ii) derive topologically and geometrically accurate 3D geo-objects (i.e. building models) at various levels of detail and (iii) link geometries with non-spatial information within a 3D geo-database management system accessible via web-based client. The developed methodology is tested on two case studies, i.e. the cities of Trento (Italy) and Graz (Austria). Both spatial (i.e. nadir and oblique imagery) and non-spatial (i.e. cadastral information and building energy consumptions) data are collected and used as input for the project workflow, starting from 3D geometry capture and modelling in urban scenarios to geometry enrichment and management within a dedicated webGIS platform.

  3. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a "shared-nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for processing and visualizing geospatial data, and provides programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
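
    The caching and precalculation approaches can be illustrated with a small sketch using the netCDF4 library: a derived product (here a monthly climatology) is computed once per node and reused from a local cache afterwards. The variable layout and the whole-years time axis are simplifying assumptions, not details from the paper.

        import os
        import numpy as np
        from netCDF4 import Dataset

        def monthly_climatology(path, var, cache_dir="cache"):
            """Return a cached 12-month climatology, computing it on first use."""
            os.makedirs(cache_dir, exist_ok=True)
            cache = os.path.join(cache_dir, f"{os.path.basename(path)}.{var}.npy")
            if os.path.exists(cache):              # reuse a previous product
                return np.load(cache)
            with Dataset(path) as nc:              # one pass over the source file
                # assumed shape (time, lat, lon); mask handling omitted for brevity
                data = np.asarray(nc.variables[var][:])
            # assumes the time axis covers whole years (length divisible by 12)
            clim = data.reshape(-1, 12, *data.shape[1:]).mean(axis=0)
            np.save(cache, clim)
            return clim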

  4. Visual Analytics for MOOC Data.

    PubMed

    Qu, Huamin; Chen, Qing

    2015-01-01

    With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.

  5. Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.

    2013-12-01

    Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with OWL (Web Ontology Language), RDF, and a proposed geo-spatial vocabulary. We present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bounds or other criteria.
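
    A minimal sketch of that workflow using rdflib: geospatial bounds are attached to a dataset resource as RDF triples and then queried with SPARQL. The vocabulary namespace and property names are hypothetical, since the proposed geo-spatial vocabulary is not spelled out in this record.

        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import RDF, XSD

        # hypothetical geo-spatial vocabulary
        GEO = Namespace("http://example.org/geo-vocab#")

        g = Graph()
        ds = URIRef("http://example.org/dataset/ceres-001")
        g.add((ds, RDF.type, GEO.Dataset))
        g.add((ds, GEO.minLatitude, Literal(25.0, datatype=XSD.double)))
        g.add((ds, GEO.maxLatitude, Literal(50.0, datatype=XSD.double)))

        # discover datasets by geospatial bound
        results = g.query("""
            PREFIX geo: <http://example.org/geo-vocab#>
            SELECT ?d WHERE {
                ?d a geo:Dataset ;
                   geo:minLatitude ?lat .
                FILTER (?lat >= 20.0)
            }""")
        for row in results:
            print(row.d)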

  6. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  7. Geospatial analytics to evaluate point-of-dispensing sites for mass immunizations in Allegheny County, Pennsylvania.

    PubMed

    Everett, Kibri H; Potter, Margaret A; Wheaton, William D; Gleason, Sherrianne M; Brown, Shawn T; Lee, Bruce Y

    2013-01-01

    Public health agencies use mass immunization locations to quickly administer vaccines to protect a population against an epidemic. The selection of such locations is frequently determined by available staffing levels, and in some places not all potential sites can be opened, often because of a lack of resources. Public health agencies need assistance in determining which n sites are the prime ones to open, given available staff, to minimize travel time and travel distance for those in the population who need to reach a site to receive treatment. We employed geospatial analytical methods to identify the prime n locations from a predetermined set of potential locations (eg, schools) and to determine which locations may not achieve the throughput necessary to reach the herd immunity threshold under varying R0 values. Spatial location-allocation algorithms were used to select the ideal n mass vaccination locations. Allegheny County, Pennsylvania, served as the study area. The most favorable sites were selected, and the number of individuals required to be vaccinated to achieve the herd immunity threshold was determined for R0 values ranging from 1.5 to 7. Locations that did not meet the Centers for Disease Control and Prevention throughput recommendation for smallpox were identified. At R0 = 1.5, all mass immunization locations met the required throughput to achieve the herd immunity threshold within 5 days. As R0 increased from 2 to 7, an increasing number of sites were inadequate to meet throughput requirements. Identifying the top n sites and categorizing those with throughput challenges allows health departments to adjust staffing, shift length, or the number of sites. This method has the potential to be expanded to select immunization locations under a number of additional scenarios.
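
    The underlying location-allocation idea can be sketched as a greedy p-median selection: repeatedly add the candidate site that most reduces the population's total travel time. The study used spatial location-allocation algorithms in a GIS rather than this exact procedure, so treat this as an illustration of the concept.

        import numpy as np

        def select_sites(travel_time, n_sites):
            # travel_time: (num_zctas, num_candidates) matrix of travel times
            # from each population area to each candidate dispensing site
            num_zctas = travel_time.shape[0]
            best = np.full(num_zctas, 1e9)        # "no service yet" placeholder
            chosen = []
            for _ in range(n_sites):
                # total travel-time reduction gained by adding each candidate
                gain = np.maximum(best[:, None] - travel_time, 0).sum(axis=0)
                gain[chosen] = -1.0               # never re-pick a chosen site
                pick = int(np.argmax(gain))
                chosen.append(pick)
                best = np.minimum(best, travel_time[:, pick])
            return chosen

        # toy example: 6 areas, 4 candidate sites, open the best 2
        rng = np.random.default_rng(42)
        times = rng.uniform(5, 60, size=(6, 4))
        print(select_sites(times, 2))

    Greedy selection is a standard heuristic for p-median problems; exact solvers exist but scale poorly as the candidate set grows.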

  8. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered by a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, and parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform, allowing the research process to continue wherever you are.

  9. A geospatial search engine for discovering multi-format geospatial data across the web

    Treesearch

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease at which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with servers and websites where such data exist. The objective of this paper is to present a publicly...

  10. The Use of Geospatial Technologies Instruction within a Student/Teacher/Scientist Partnership: Increasing Students' Geospatial Skills and Atmospheric Concept Knowledge

    ERIC Educational Resources Information Center

    Hedley, Mikell Lynne; Templin, Mark A.; Czaljkowski, Kevin; Czerniak, Charlene

    2013-01-01

    Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…

  11. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    NASA Astrophysics Data System (ADS)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally, the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today increasingly develop into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way, modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  12. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of geospatial analysis processes on this platform, by combining component Web Services, presents some open issues, and the automated construction of these compositions is an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, represent more precisely the situations of nonconformity with geodata quality requirements that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, improving process reliability when working with chains of geospatial Web Services.
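
    The rule-based quality requirements can be pictured as predicates evaluated against geodata metadata, with each nonconformity feeding a branch of the conditional plan. A toy sketch under that assumption; the paper's semantic rule encoding and planner are not reproduced here, and all names below are invented for illustration.

        def check_geodata_quality(metadata, rules):
            """Evaluate rule-based quality requirements, returning nonconformities."""
            failures = []
            for attribute, predicate, description in rules:
                value = metadata.get(attribute)
                if value is None or not predicate(value):
                    failures.append(description)   # a planner would branch here
            return failures

        # hypothetical quality rules in (attribute, predicate, description) form
        rules = [
            ("positional_accuracy_m", lambda v: v <= 10.0,
             "positional accuracy worse than 10 m"),
            ("cloud_cover_pct", lambda v: v <= 20.0,
             "cloud cover above 20%"),
        ]

        meta = {"positional_accuracy_m": 4.5, "cloud_cover_pct": 35.0}
        print(check_geodata_quality(meta, rules))   # -> ['cloud cover above 20%']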

  13. Economic assessment of the use value of geospatial information

    USGS Publications Warehouse

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
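
    Stated as a formula, the definition in this record amounts to a discounted difference of expected net benefits. The notation below is ours, not the authors': NB_t is the net benefit realized in year t, r is the discount rate, and T is the decision horizon.

        \[
        \mathrm{VOI}
          \;=\; \sum_{t=0}^{T}
          \frac{\mathbb{E}\!\left[\mathit{NB}_t \mid \text{with information}\right]
                \;-\;
                \mathbb{E}\!\left[\mathit{NB}_t \mid \text{without information}\right]}
               {(1+r)^{t}}
        \]

    A positive VOI means the geospatial data are worth at most that amount to the decision maker, which is how the three example applications frame the comparison.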

  14. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    NASA Astrophysics Data System (ADS)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014; Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System bodies will become progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally, or globally. The new PlanetServer system uses the open source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine, and the array database Rasdaman Community Edition as its core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu. All its code base is going to be available on GitHub, at https://github.com/planetserver References: Baumann, P., et al. (2015) Big Data Analytics for Earth Sciences: the EarthServer approach, International Journal of Digital Earth, doi: 10.1080/17538947.2014.1003106. Cantini, F. et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784. Gaddis, L., and T. Hare (2015), Status of tools and data for planetary research, Eos, 96, doi: 10.1029/2015EO041125. Hogan, P., 2011. NASA World Wind: Infrastructure for Spatial Data. Technical report. Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications ACM. Oosthoek, J.H.P, et al. (2013) Advances in Space Research. doi: 10.1016/j.asr.2013.07.002. Rossi, A. P., et al. (2014) PlanetServer/EarthServer: Big Data analytics in Planetary Science. Geophysical Research Abstracts, Vol. 16, #EGU2014-5149.
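
    Such queries are ordinary WCPS expressions submitted over HTTP. A sketch of a normalized band-ratio request in the rasdaman key-value style; the coverage name, band names, and exact endpoint are hypothetical placeholders, and the live PlanetServer service may differ.

        from urllib.parse import urlencode
        from urllib.request import urlopen

        endpoint = "http://planetserver.eu/rasdaman/ows"  # may differ from live service

        # hypothetical coverage and band identifiers for a normalized band index
        wcps = ('for c in (crism_cube_0001) return encode('
                '(c.band_233 - c.band_13) / (c.band_233 + c.band_13), "png")')

        url = endpoint + "?" + urlencode({"service": "WCS", "version": "2.0.1",
                                          "request": "ProcessCoverages",
                                          "query": wcps})
        with urlopen(url) as resp:
            with open("band_index.png", "wb") as f:
                f.write(resp.read())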

  15. EPA National Geospatial Data Policy

    EPA Pesticide Factsheets

    The National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA.

  16. The Inter-American Geospatial Data Network— developing a Western Hemisphere geospatial data clearinghouse

    USGS Publications Warehouse

    Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert

    1998-01-01

    The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.

  17. Streaming Visual Analytics Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Burtner, Edwin R.; Kritzstein, Brian P.

    How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and a set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst’s rapid orientation and reorientation as situations change. It provides an environment to support the analyst’s critical thinking. It infers tasks and interests based on the analyst’s interactions. The system works as both an assistant and a devil’s advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.

  18. EPA Geospatial Applications

    EPA Pesticide Factsheets

    EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.

  19. Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.

    PubMed

    Endert, A; Fiaux, P; North, C

    2012-12-01

    Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
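
    The keyword-weighting idea can be caricatured in a few lines: when the user drags two documents together, terms the documents share gain weight in the similarity model that drives the spatial layout. This is a loose sketch of the concept only, not ForceSPIRE's actual update rule.

        import numpy as np

        def update_weights(weights, doc_a, doc_b, learning_rate=0.1):
            """Nudge keyword weights up for terms shared by two documents the
            user dragged together; renormalize so weights stay comparable."""
            shared = doc_a * doc_b            # nonzero where both docs use a term
            weights = weights + learning_rate * shared
            return weights / weights.sum()

        # toy term vectors over a 5-keyword vocabulary
        doc_a = np.array([1, 0, 1, 0, 1], dtype=float)
        doc_b = np.array([1, 1, 1, 0, 0], dtype=float)
        weights = np.full(5, 0.2)
        print(update_weights(weights, doc_a, doc_b))

    Re-running the layout with the updated weights then pulls other documents sharing those terms closer, which is the "co-creating a spatialization" loop the study describes.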

  20. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and the Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics, the systematic use of data combined with quantitative and qualitative analysis to make decisions, can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences.

  1. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    NASA Astrophysics Data System (ADS)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals, and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities (e.g., ITC) and international organizations, such as the ISPRS, the ICA, and the OGC. Through close cooperation with African organizations, such as AARSE, RCMRD, and RECTAS, the network and exchange of ideas, experiences, technology, and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility in Tunis with approximately 30 part- and full-time general staff and lecturers during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture, and remote sensing training will be the main components of the practical training courses offered, to build geospatial capacity and ensure that AGSI graduates will have the appropriate skill sets required for employment in the geospatial industry. Geospatial management courses and high-level seminars will be targeted at decision makers in government and industry to build awareness of geospatial applications and benefits. Online education will be developed together with international partners, and internet-based activities will involve the public to familiarize people with geospatial data and its many applications.

  2. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education

    PubMed Central

    Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research. PMID:25469323

  3. Visual analytics in healthcare education: exploring novel ways to analyze and represent big data in undergraduate medical education.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    Introduction. The big data present in the medical curriculum that informs undergraduate medical education is beyond human abilities to perceive and analyze. The medical curriculum is the main tool used by teachers and directors to plan, design, and deliver teaching and assessment activities and student evaluations in medical education in a continuous effort to improve it. Big data remains largely unexploited for medical education improvement purposes. The emerging research field of visual analytics has the advantage of combining data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognize visual patterns. Nevertheless, there is a lack of research on the use and benefits of visual analytics in medical education. Methods. The present study is based on analyzing the data in the medical curriculum of an undergraduate medical program as it concerns teaching activities, assessment methods and learning outcomes in order to explore visual analytics as a tool for finding ways of representing big data from undergraduate medical education for improvement purposes. Cytoscape software was employed to build networks of the identified aspects and visualize them. Results. After the analysis of the curriculum data, eleven aspects were identified. Further analysis and visualization of the identified aspects with Cytoscape resulted in building an abstract model of the examined data that presented three different approaches; (i) learning outcomes and teaching methods, (ii) examination and learning outcomes, and (iii) teaching methods, learning outcomes, examination results, and gap analysis. Discussion. This study identified aspects of medical curriculum that play an important role in how medical education is conducted. The implementation of visual analytics revealed three novel ways of representing big data in the undergraduate medical education context. It appears to be a useful tool to explore such data with possible future implications on healthcare education. It also opens a new direction in medical education informatics research.
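
    The network-building step described in both records is reproducible with standard tools: construct the aspect graph programmatically and export it in a format Cytoscape opens directly. A sketch with networkx, using invented curriculum aspects standing in for the study's eleven.

        import networkx as nx

        # hypothetical curriculum aspects; the study's eleven are not all listed here
        g = nx.Graph()
        g.add_edges_from([
            ("lecture", "learning_outcome_1"),
            ("seminar", "learning_outcome_1"),
            ("written_exam", "learning_outcome_1"),
            ("lecture", "learning_outcome_2"),
            ("osce_exam", "learning_outcome_2"),
        ])

        # GraphML files can be imported straight into Cytoscape for visual analysis
        nx.write_graphml(g, "curriculum.graphml")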

  4. Geospatial Data Science Analysis | Geospatial Data Science | NREL

    Science.gov Websites

    Geospatial data science analysis at NREL examines renewable energy potential for different technologies across the nation and at different levels of technology maturity. Featured analysis products cover renewable energy resources and related geospatial analyses.

  5. Geospatial Information is the Cornerstone of Effective Hazards Response

    USGS Publications Warehouse

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.

  6. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large-scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large-scale web application development. We present the preliminary design and implementation of an open-source platform for the Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, reuse, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks, including raw data input and representation, interactive visualization, and analysis. PMID:29630069

  7. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large-scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large-scale web application development. We present the preliminary design and implementation of an open-source platform for the Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, reuse, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks, including raw data input and representation, interactive visualization, and analysis.
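
    The multi-level modularity that both SOCRAT records describe is easiest to see as a publish/subscribe module pattern. The Python sketch below is only an analogy for that design idea, under the assumption that modules interact through messages rather than direct calls; SOCRAT itself is a JavaScript platform and its real module API differs.

        from typing import Callable, Dict, List

        class MessageBus:
            """Decouples modules: they interact via published messages, not direct calls."""
            def __init__(self) -> None:
                self._subscribers: Dict[str, List[Callable]] = {}

            def subscribe(self, topic: str, handler: Callable) -> None:
                self._subscribers.setdefault(topic, []).append(handler)

            def publish(self, topic: str, payload) -> None:
                for handler in self._subscribers.get(topic, []):
                    handler(payload)

        bus = MessageBus()
        # A "visualization" module reacts to data loaded by a "data management" module.
        bus.subscribe("data.loaded", lambda rows: print(f"render {len(rows)} rows"))
        bus.publish("data.loaded", [{"x": 1}, {"x": 2}])  # -> render 2 rows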

  8. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the Earth, and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC) Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
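
    As a hedged illustration of the OGC interfaces listed above, the snippet below issues a WMS GetMap request with the OWSLib Python client; the endpoint URL and layer name are placeholders, since the portal's actual service addresses are not given in this abstract.

        from owslib.wms import WebMapService

        # Placeholder endpoint and layer; substitute the portal's real WMS address.
        wms = WebMapService("http://example.org/lmmp/wms", version="1.1.1")
        img = wms.getmap(layers=["lunar_mosaic"],
                         srs="EPSG:4326",
                         bbox=(-180, -90, 180, 90),
                         size=(1024, 512),
                         format="image/png")
        with open("moon.png", "wb") as f:
            f.write(img.read())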

  9. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. It is now generally accepted that client applications forming integrated elements of such an infrastructure should be built on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political, and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, which realizes the interaction with the computational core backend and the WMS/WFS/WPS cartographical services, and implements an open API for browser-based client software; this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part: a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs), and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig and QuantumGIS; the Boundless/OpenGeo architecture was used as the basis for the Web GIS client development. In line with general INSPIRE requirements for data visualization, the Web GIS provides standard functionality such as data overview, image navigation, scrolling, scaling and graphical overlay, and display of map legends and corresponding metadata. The specialized Web GIS client contains three basic tiers: a tier of NetCDF metadata in JSON format; a middleware tier of JavaScript objects implementing methods to work with NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services; and a graphical user interface tier of JavaScript objects realizing the general application business logic. The Web GIS supports launching computational processing services for tasks in the area of environmental monitoring, and presents calculation results as WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
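
    A minimal sketch of the kind of server-side NetCDF processing such a computational backend performs, written with the xarray Python library; the file name and the variable name ("tas") are assumptions for illustration.

        import xarray as xr

        ds = xr.open_dataset("tas_monthly.nc")               # gridded NetCDF input
        clim = ds["tas"].groupby("time.month").mean("time")  # monthly climatology
        anom = ds["tas"].groupby("time.month") - clim        # anomalies vs. climatology
        anom.to_netcdf("tas_anomaly.nc")                     # binary NetCDF output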

  10. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    NASA Astrophysics Data System (ADS)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology called the Open Agent Architecture™ (OAA™), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA™ achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline available, and it will also provide statistical pedigree data. This pedigree data provides both the uncertainties associated with the information and an audit trail cataloging the raw data sources and the processing/exploitation applied to derive the final product. Collaboration provides for a close union between the information producer(s)/exploiter(s) and the information user(s), as well as between local and remote producer(s)/exploiter(s). From a military operational perspective, IMAGES is a step toward further uniting NIMA with its customers and further blurring the dividing line between operational command and control (C2) and its supporting intelligence activities. IMAGES also provides a foundation for reachback to remote data sources, data stores, application software, and computational resources for achieving 'just-in-time' information delivery -- all of which is transparent to the analyst or operator employing the system.

  11. Immersive Visual Analytics for Transformative Neutron Scattering Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Daniel, Jamison R; Drouhard, Margaret

    The ORNL Spallation Neutron Source (SNS) provides the most intense pulsed neutron beams in the world for scientific research and development across a broad range of disciplines. SNS experiments produce large volumes of complex data that are analyzed by scientists with varying degrees of experience using 3D visualization and analysis systems. However, it is notoriously difficult to achieve proficiency with 3D visualizations. Because 3D representations are key to understanding the neutron scattering data, scientists are unable to analyze their data in a timely fashion, resulting in inefficient use of the limited and expensive SNS beam time. We believe a more intuitive interface for exploring neutron scattering data can be created by combining immersive virtual reality technology with high performance data analytics and human interaction. In this paper, we present our initial investigations of immersive visualization concepts as well as our vision for an immersive visual analytics framework that could lower the barriers to 3D exploratory data analysis of neutron scattering data at the SNS.

  12. A Software Developer’s Guide to Informal Evaluation of Visual Analytics Environments Using VAST Challenge Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Scholtz, Jean; Whiting, Mark A.

    The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is spent discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions, and submissions. This paper details the various types of evaluations that may be conducted using the Repository information and describes how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.

  13. GEOSPATIAL QA

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  14. The geospatial data quality REST API for primary biodiversity data

    PubMed Central

    Otegui, Javier; Guralnick, Robert P.

    2016-01-01

    Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340

  15. The geospatial data quality REST API for primary biodiversity data.

    PubMed

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
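
    A sketch of how such a JSON-over-REST quality check might be called from Python; the Darwin Core-style field names and the exact request format are assumptions here, so consult the GitHub repository above for the real schema.

        import requests

        record = {  # hypothetical occurrence record with Darwin Core-style fields
            "decimalLatitude": 42.33,
            "decimalLongitude": -3.70,
            "countryCode": "ES",
            "scientificName": "Lynx pardinus",
        }
        resp = requests.post(
            "http://api-geospatial.vertnet-portal.appspot.com/geospatial",
            json=record,
            timeout=30,
        )
        print(resp.json())  # expected: geospatial quality flags for the record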

  16. Cartographic symbol library considering symbol relations based on anti-aliasing graphic library

    NASA Astrophysics Data System (ADS)

    Mei, Yang; Li, Lin

    2007-06-01

    Cartographic visualization represents geographic information in map form, which enables us to retrieve useful geospatial information. In a digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of a Geographic Information System as well. Existing cartographic symbol libraries have two flaws: display quality and the adjustment of symbol relations. Statistics presented in this paper indicate that aliasing is a major factor degrading symbol display quality on graphic display devices. Effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are therefore presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, rather than only displaying individual features, but current cartographic symbol libraries lack this capability. This paper creates a cartographic symbol design model to implement symbol-relation adjustment, so that a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. The anti-aliasing graphic library and the cartographic symbol library were both tested on sample symbols, and the results show that the two libraries offer better efficiency and visual quality.
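
    The paper's own algorithm is not reproduced in this abstract; as a generic illustration of what anti-aliasing buys, the numpy sketch below uses plain supersampling: rasterize an edge at 4x resolution, then average each 4x4 block into one output pixel.

        import numpy as np

        def edge_mask(n: int, scale: int = 4) -> np.ndarray:
            """Rasterize a hard diagonal edge (y < x) on a supersampled grid."""
            s = n * scale
            y, x = np.mgrid[0:s, 0:s]
            return (y < x).astype(float)

        n, scale = 8, 4
        hi = edge_mask(n, scale)                               # 32x32 binary mask
        aa = hi.reshape(n, scale, n, scale).mean(axis=(1, 3))  # 8x8, fractional
        print(aa[0])  # boundary pixels take values between 0 and 1, not hard 0/1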

  17. SmartAdP: Visual Analytics of Large-scale Taxi Trajectories for Selecting Billboard Locations.

    PubMed

    Liu, Dongyu; Weng, Di; Li, Yuhong; Bao, Jie; Zheng, Yu; Qu, Huamin; Wu, Yingcai

    2017-01-01

    The problem of formulating solutions immediately and comparing them rapidly for billboard placements has plagued advertising planners for a long time, owing to the lack of efficient tools for in-depth analyses to make informed decisions. In this study, we attempt to employ visual analytics, combining state-of-the-art mining and visualization techniques, to tackle this problem using large-scale GPS trajectory data. In particular, we present SmartAdP, an interactive visual analytics system that deals with two major challenges: finding good solutions in a huge solution space and comparing the solutions in a visual and intuitive manner. An interactive framework that integrates a novel visualization-driven data mining model enables advertising planners to effectively and efficiently formulate good candidate solutions. In addition, we propose a set of coupled visualizations: a solution view with metaphor-based glyphs to visualize the correlation between different solutions; a location view to display billboard locations in a compact manner; and a ranking view to present multi-typed rankings of the solutions. This system has been demonstrated using case studies with a real-world dataset and domain-expert interviews. Our approach can be adapted for other location selection problems, such as selecting locations of retail stores or restaurants using trajectory data.
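
    SmartAdP's mining model is only summarized above; as a hedged baseline for the same task, the sketch below treats candidate selection as greedy maximum coverage over trajectories (the data shapes are hypothetical), a common starting point for such location-selection problems.

        def greedy_billboards(candidates: dict, k: int) -> list:
            """candidates maps billboard_id -> set of trajectory ids passing it."""
            chosen, covered = [], set()
            for _ in range(k):
                best = max(candidates, key=lambda b: len(candidates[b] - covered))
                if not candidates[best] - covered:
                    break  # nothing new can be covered
                chosen.append(best)
                covered |= candidates[best]
            return chosen

        demo = {"b1": {1, 2, 3}, "b2": {3, 4}, "b3": {5}}
        print(greedy_billboards(demo, k=2))  # -> ['b1', 'b2']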

  18. Fast segmentation of satellite images using SLIC, WebGL and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Gorelick, Noel; Eisemann, Elmar; van de Giesen, Nick

    2017-04-01

    Google Earth Engine (GEE) is a parallel geospatial processing platform which harmonizes access to petabytes of freely available satellite images. It provides a very rich API, allowing development of dedicated algorithms to extract useful geospatial information from these images. At the same time, modern GPUs provide thousands of computing cores which are mostly not utilized in this context. In recent years, WebGL has become a popular and well-supported API, allowing fast image processing directly in web browsers. In this work, we evaluate the applicability of WebGL to enable fast segmentation of satellite images. A new implementation of the Simple Linear Iterative Clustering (SLIC) algorithm using GPU shaders will be presented. SLIC is a simple and efficient method to decompose an image into visually homogeneous regions; it adapts a k-means clustering approach to generate superpixels efficiently. While this approach will be hard to scale, owing to the significant amount of data that must be transferred to the client, it should significantly improve exploratory possibilities and simplify development of dedicated algorithms for geoscience applications. Our prototype implementation will be used to improve surface water detection of reservoirs using multispectral satellite imagery.
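
    The WebGL shader implementation is the paper's contribution and is not reproduced here; for reference, the same SLIC algorithm ships in scikit-image, so a CPU version of the segmentation step looks like this (with a bundled sample image standing in for a satellite scene):

        from skimage import data
        from skimage.segmentation import slic

        img = data.astronaut()                                # stand-in RGB image
        segments = slic(img, n_segments=500, compactness=10)  # superpixel label per pixel
        print(segments.max() + 1, "superpixels")              # roughly the requested 500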

  19. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    NASA Astrophysics Data System (ADS)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.

  20. Leveraging geospatial data, technology, and methods for improving the health of communities: priorities and strategies from an expert panel convened by the CDC.

    PubMed

    Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L

    2010-04-01

    In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research, and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training, and education; and engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur in parallel or multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.

  1. The Role of Visual Learning in Improving Students' High-Order Thinking Skills

    ERIC Educational Resources Information Center

    Raiyn, Jamal

    2016-01-01

    Various concepts have been introduced to improve students' analytical thinking skills based on problem-based learning (PBL). This paper introduces a new concept to increase students' analytical thinking skills based on a visual learning strategy. Such a strategy has three fundamental components: a teacher, a student, and a learning process. The…

  2. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and Automated Rotational Center Hurricane Eye Retrieval (ARCHER) tools. In this presentation, we will compare the enabling technologies we tested and discuss which ones we selected for integration into the TCIS' data analysis tool architecture. We will also show how these techniques have been automated to provide access to NRT data through our analysis tools.
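
    As an illustration of the sub-second footprint searches being benchmarked, the sketch below runs a PostGIS bounding-box query from Python with psycopg2; the connection string, table, and column names are hypothetical, and a GiST index on the geometry column is what such queries rely on for speed.

        import psycopg2

        conn = psycopg2.connect("dbname=tcis user=analyst")
        with conn, conn.cursor() as cur:
            # assumes: CREATE INDEX ON swath_footprints USING GIST (footprint);
            cur.execute(
                """
                SELECT granule_id, observed_at
                FROM swath_footprints
                WHERE footprint && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
                """,
                (-80.0, 20.0, -60.0, 35.0),  # lon/lat box around a storm region
            )
            files = cur.fetchall()
        print(f"{len(files)} matching swath files")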

  3. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and deriving valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused certain problems in developing countries which have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality in mega cities is to use real-time, up-to-date air quality information gathered by spatially distributed sensors, with sensor web technology used to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of a geospatial information system as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The presented system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time; using this standard framework makes it possible to overcome interoperability challenges. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station, where the data are analysed and processed. The extracted air quality status is checked for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users by warning e-mails in emergency cases. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework, and the system can retrieve SOS observations using WPS in a cascaded service-chaining pattern to monitor trends in timely sensor observations.
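
    The AQI for a single pollutant is a piecewise-linear interpolation between breakpoint concentrations. The sketch below implements that formula with the US EPA breakpoints for 8-hour CO in ppm; whether this system uses these exact breakpoints is an assumption.

        CO_BREAKPOINTS = [  # (conc_lo, conc_hi, aqi_lo, aqi_hi)
            (0.0, 4.4, 0, 50), (4.5, 9.4, 51, 100), (9.5, 12.4, 101, 150),
            (12.5, 15.4, 151, 200), (15.5, 30.4, 201, 300),
            (30.5, 40.4, 301, 400), (40.5, 50.4, 401, 500),
        ]

        def co_aqi(conc_ppm: float) -> int:
            for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
                if c_lo <= conc_ppm <= c_hi:
                    return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ppm - c_lo) + i_lo)
            raise ValueError("concentration outside the AQI scale")

        print(co_aqi(6.3))  # -> 69 ("Moderate"): no warning e-mail would be sent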

  4. A Dynamic Information Framework: A Multi-Sector, Geospatial Gateway for Environmental Conservation and Adaptation to Climate Change

    NASA Astrophysics Data System (ADS)

    Fernandes, E. C.; Norbu, C.; Juizo, D.; Wangdi, T.; Richey, J. E.

    2011-12-01

    Landscapes, watersheds, and their downstream coastal and lacustrine zones are facing a series of challenges critical to their future, centered on the availability and distribution of water. Management options cover a range of issues, from bringing safe water to local villages for the rural poor, developing adaptation strategies for both rural and urban populations and large infrastructure, and sustaining environmental flows and ecosystem services needed for natural and human-dominated ecosystems. These targets represent a very complex set of intersecting issues of scale, cross-sector science and technology, education, politics, and economics, and the desired sustainable development is closely linked to how the nominally responsible governmental Ministries respond to the information they have. In practice, such information, and even such perspectives, are virtually absent in much of the developing world. A Dynamic Information Framework (DIF) is therefore being designed as a knowledge platform whereby decision-makers in information-sparse regions can consider rigorous scenarios of alternative futures and obtain decision support for complex environmental and economic decisions. The DIF is a geospatial gateway, with functional components of base data layers; directed data layers focused on synthetic objectives; geospatially explicit, process-based, cross-sector simulation models (requiring data from the directed data layers); facilitated input/output (including visualizations); and decision-support and scenario-testing capabilities. A fundamental aspect of a DIF is not only the convergence of multi-sector information, but how that information can be (a) integrated, (b) used for robust simulations and projections, and (c) conveyed to policymakers and stakeholders in the most compelling, and visual, manner. Examples are given of emerging applications. The ZambeziDIF was used to establish baselines for agriculture, biodiversity, and water resources in the lower Zambezi valley of Mozambique. The DrukDIF for Bhutan is moving from a test of concept to an operational phase, with uses from extending local biodiversity to computing how much energy can be sold tomorrow based on waterflows today. The AralDIF is being developed to serve as a neutral and transparent platform, a catalyst for open discussion on water and energy linkages in central Asia. The ImisoziDIF is now being ramped up in Rwanda to help guide scaling up of agricultural practices and biodiversity from sites to the country. The Virtual Mekong Basin "tells the story" of the multiple issues facing the Mekong Basin.

  5. GEO Label: User and Producer Perspectives on a Label for Geospatial Data

    NASA Astrophysics Data System (ADS)

    Lush, V.; Lumsden, J.; Masó, J.; Díaz, P.; McCallum, I.

    2012-04-01

    One of the aims of the Science and Technology Committee (STC) of the Group on Earth Observations (GEO) was to establish a GEO Label, a label to certify geospatial datasets and their quality. As proposed, the GEO Label will be used as a value indicator for geospatial data and datasets accessible through the Global Earth Observation System of Systems (GEOSS). It is suggested that the development of such a label will significantly improve user recognition of the quality of geospatial datasets and that its use will help promote trust in datasets that carry the established GEO Label. Furthermore, the GEO Label is seen as an incentive to data providers. At the moment GEOSS contains a large amount of data and is constantly growing. Taking this into account, a GEO Label could assist in searching by providing users with visual cues of dataset quality and possibly relevance; a GEO Label could effectively stand as a decision support mechanism for dataset selection. Currently our project, GeoViQua, together with EGIDA and ID-03, is undertaking research to define and evaluate the concept of a GEO Label. The development and evaluation process will be carried out in three phases. In Phase I we have conducted an online survey (GEO Label Questionnaire) to identify the initial user and producer views on a GEO Label and its potential role. In Phase II we will conduct a further study presenting some GEO Label examples based on the results of Phase I, eliciting feedback on these examples under controlled conditions. In Phase III we will create physical prototypes which will be used in a human subject study, and the most successful prototypes will then be put forward as potential GEO Label options. At the moment we are in Phase I, where we have developed an online questionnaire to collect the initial GEO Label requirements and to identify the role that a GEO Label should serve from the user and producer standpoint. The GEO Label Questionnaire consists of generic questions to identify whether users and producers believe a GEO Label is relevant to geospatial data; whether they want a single "one-for-all" label or separate labels that serve particular roles; the function that would be most relevant for a GEO Label to carry; and the functionality that users and producers would like to see from the common rating and review systems they use. To distribute the questionnaire, relevant user and expert groups were contacted at meetings or by email. At this stage we have successfully collected over 80 valid responses from geospatial data users and producers. This communication will provide a comprehensive analysis of the survey results, indicating to what extent the users surveyed in Phase I value a GEO Label and suggesting in what directions a GEO Label may develop. Potential GEO Label examples based on the results of the survey will be presented for use in Phase II.

  6. A prototype system based on visual interactive SDM called VGC

    NASA Astrophysics Data System (ADS)

    Jia, Zelu; Liu, Yaolin; Liu, Yanfang

    2009-10-01

    In many application domains, data are collected and referenced by geospatial location. Spatial data mining, the discovery of interesting patterns in such databases, is an important capability in the development of database systems. Spatial data mining has recently emerged in a number of real applications, such as real-estate marketing, urban planning, weather forecasting, medical image analysis, and road traffic accident analysis, and it demands efficient solutions to many new, expensive, and complicated problems. For spatial data mining of large data sets to be effective, it is also important to include humans in the data exploration process and combine their flexibility, creativity, and general knowledge with the enormous storage capacity and computational power of today's computers. Visual spatial data mining applies human visual perception to the exploration of large data sets. Presenting data in an interactive, graphical form often fosters new insights, encouraging the formation and validation of new hypotheses to the end of better problem-solving and deeper domain knowledge. In this paper a visual interactive spatial data mining prototype system (Visual Geo-Classify, VGC) based on VC++ 6.0 and MapObjects 2.0 is designed and developed. The basic algorithms used for spatial data mining are decision trees and Bayesian networks, and data classification is realized by training, learning, and the integration of the two. The result indicates that it is a practical and extensible visual interactive spatial data mining tool.
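
    The abstract names decision trees and Bayesian networks as VGC's core algorithms; as a generic stand-in (not VGC's own VC++ code), the sketch below trains a decision tree classifier on made-up spatial attributes with scikit-learn.

        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical training parcels: [slope_deg, dist_to_road_m, price_index]
        X = [[2, 50, 0.9], [15, 900, 0.2], [3, 120, 0.8], [20, 1500, 0.1]]
        y = ["urban", "rural", "urban", "rural"]

        clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
        print(clf.predict([[5, 200, 0.7]]))  # -> ['urban']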

  7. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  8. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  9. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for effective sharing and use of remote sensing continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2, and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC and its partner government, research, and industrial organizations continue to advance the state of geospatial standards for full use of photogrammetry and remote sensing.

  10. Unlocking Proteomic Heterogeneity in Complex Diseases through Visual Analytics

    PubMed Central

    Bhavnani, Suresh K.; Dang, Bryant; Bellala, Gowtham; Divekar, Rohit; Visweswaran, Shyam; Brasier, Allan; Kurosky, Alex

    2015-01-01

    Despite years of preclinical development, biological interventions designed to treat complex diseases like asthma often fail in phase III clinical trials. These failures suggest that current methods to analyze biomedical data might be missing critical aspects of biological complexity, such as the assumption that cases and controls come from homogeneous distributions. Here we discuss why and how methods from the rapidly evolving field of visual analytics can help translational teams (consisting of biologists, clinicians, and bioinformaticians) to address the challenge of modeling and inferring heterogeneity in the proteomic and phenotypic profiles of patients with complex diseases. Because a primary goal of visual analytics is to amplify the cognitive capacities of humans for detecting patterns in complex data, we begin with an overview of the cognitive foundations of the field of visual analytics. Next, we organize the primary ways in which a specific form of visual analytics called networks has been used to model and infer biological mechanisms, which helps to identify the properties of networks that are particularly useful for the discovery and analysis of proteomic heterogeneity in complex diseases. We describe one such approach, called subject-protein networks, and demonstrate its application on two proteomic datasets. This demonstration provides insights to help translational teams overcome theoretical, practical, and pedagogical hurdles to the widespread use of subject-protein networks for analyzing molecular heterogeneities, with the translational goal of designing biomarker-based clinical trials and accelerating the development of personalized approaches to medicine. PMID:25684269

  11. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  12. The science of visual analysis at extreme scale

    NASA Astrophysics Data System (ADS)

    Nowell, Lucy T.

    2011-01-01

    Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high performance computing systems will have as many as a million cores by 2020 and support 10-billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially, and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.

  13. Geographic Information System Technology Leveraged for Crisis Planning, Emergency Response, and Disaster Management

    NASA Astrophysics Data System (ADS)

    Ross, A.; Little, M. M.

    2013-12-01

    NASA's Atmospheric Science Data Center (ASDC) is piloting the use of Geographic Information System (GIS) technology that can be leveraged for crisis planning, emergency response, and disaster management/awareness. Many different organizations currently use GIS tools and geospatial data during a disaster event. ASDC datasets have not been fully utilized by this community in the past due to the incompatible data formats in which ASDC holdings are archived. Through the successful implementation of this pilot effort and continued collaboration with the larger Homeland Defense and Department of Defense emergency management community through the Homeland Infrastructure Foundation-Level Data Working Group (HIFLD WG), our data will be easily accessible to those using GIS, increasing the ability to plan, respond, manage, and provide awareness during disasters. The HIFLD WG partnership has expanded to include more than 5,900 mission partners representing the 14 executive departments, 98 agencies, 50 states (and 3 territories), and more than 700 private sector organizations to directly enhance the federal, state, and local governments' ability to support domestic infrastructure data gathering, sharing and protection, visualization, and spatial knowledge management. The HIFLD WG executive membership is led by representatives from the Department of Defense (DoD) Office of the Assistant Secretary of Defense for Homeland Defense and Americas' Security Affairs - OASD (HD&ASA); the Department of Homeland Security (DHS), National Protection and Programs Directorate's Office of Infrastructure Protection (NPPD IP); the National Geospatial-Intelligence Agency (NGA) Integrated Working Group - Readiness, Response and Recovery (IWG-R3); the Department of the Interior (DOI) United States Geological Survey (USGS) National Geospatial Program (NGP); and the DHS Federal Emergency Management Agency (FEMA).

  14. Role of geospatial technology in identifying natural habitat of malarial vectors in South Andaman, India.

    PubMed

    Shankar, Shiva; Agrawal, Deepak Kumar

    2016-03-01

    Malaria is a serious disease which has repeatedly threatened Andaman, an island territory of India. Uncharted dense vegetation and inaccessibility are the salient features of the area, so major portions were covered using remotely sensed data to identify the malaria vector's natural habitat. The present investigation appraises the role of geospatial technologies in identifying the natural habitat of malarial vectors. The base map was prepared from Survey of India toposheets, and the land use map was prepared using index techniques such as the normalised difference vegetation index (NDVI), normalised difference water index (NDWI), modified normalised difference water index (MNDWI), normalised difference pond index (NDPI), and normalised difference turbidity index (NDTI), in conjunction with visual interpretation. The soil moisture content map was reproduced from the soil atlas of the Andaman and Nicobar Islands, followed by generation of an aspect profile from ASTER GDEM satellite data. Both the land use map and the aspect profile were validated for accuracy in the field. A weighted overlay analysis of the land use, soil moisture, and aspect classes of the study area resulted in identification of the potential natural habitat of the malaria vector surrounding the hamlets of Tushnabad, Garacharma, Manglutan, Chouldari, Ferrargunj, and Wimberlygunj. The natural habitat map indicated that these hamlets lie within 2.5 km of the prime habitat locations and have a higher number of malaria-positive cases. These hamlets are also surrounded by dense evergreen forest, and the land surface is draped by clay-loam and clay soils exhibiting high moisture content, supporting high rates of survival and proliferation of the vector and ensuring resurgence of malaria every year. It is thus concluded that the application of geospatial technologies plays an important role in identifying the natural habitat of the malaria vector.
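
    The band-ratio indices listed above follow standard formulas; a minimal numpy sketch, with random arrays standing in for the actual satellite bands, looks like this:

        import numpy as np

        def ndvi(nir, red):      # vegetation: (NIR - Red) / (NIR + Red)
            return (nir - red) / (nir + red + 1e-9)

        def ndwi(green, nir):    # open water (McFeeters): (Green - NIR) / (Green + NIR)
            return (green - nir) / (green + nir + 1e-9)

        def mndwi(green, swir):  # modified NDWI: (Green - SWIR) / (Green + SWIR)
            return (green - swir) / (green + swir + 1e-9)

        rng = np.random.default_rng(0)
        green, red, nir, swir = rng.random((4, 128, 128))  # placeholder bands
        water = ndwi(green, nir) > 0.0  # candidate standing water (vector habitat)
        print(f"{water.mean():.1%} of pixels flagged as water")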

  15. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    NASA Astrophysics Data System (ADS)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development of advanced smart communication tools with good-quality, high-resolution video cameras, audio, and GPS devices in the last few years will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will contribute greatly to the study of fauna and flora in the near future, particularly of the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers has been derived for the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the intelligent management of environmental data with tagged contextual geospatial information generated by multiple operators in communities (using smartphones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimation. The returned identification of species is further improved using ground truth corrections and learning by the specific enablers.

  16. Model My Watershed and BiG CZ Data Portal: Interactive geospatial analysis and hydrological modeling web applications that leverage the Amazon cloud for scientists, resource managers and students

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.

    2016-12-01

    The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application for scientists providing intuitive, high-performance map-based discovery, visualization, access, and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for a selected area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), a high-performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis), and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can use "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling scientists, resource managers, and students alike to share data, analyses, and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system for enabling citizen scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems, so that the data can be reshared using WaterOneFlow web services.

  17. Predicting the spatial extent of liquefaction from geospatial and earthquake-specific parameters

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.

    2014-01-01

    The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high-resolution, remotely sensed, and spatially continuous data as proxies for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities; the percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and the results show that there is a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
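
    A hedged sketch of the modeling approach (logistic regression on globally available geospatial proxies plus a shaking-intensity term) using scikit-learn on synthetic data; the feature list mirrors the abstract, but the data and coefficients here are illustrative, not the authors' fitted model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        X = np.column_stack([
            rng.uniform(0.05, 0.8, n),  # peak ground acceleration (g)
            rng.uniform(0.0, 20.0, n),  # DEM-derived wetness proxy (hypothetical)
            rng.uniform(0.0, 5000, n),  # distance to water body (m, hypothetical)
        ])
        # Synthetic labels: liquefaction more likely with strong shaking, near water.
        logit = 4.0 * X[:, 0] + 0.1 * X[:, 1] - 0.0005 * X[:, 2] - 2.0
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        model = LogisticRegression().fit(X, y)
        prob = model.predict_proba([[0.4, 12.0, 300.0]])[0, 1]
        print(f"P(liquefaction) = {prob:.2f}")  # per-cell probability -> spatial extent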

  18. Developing Visual Thinking in the Electronic Health Record.

    PubMed

    Boyd, Andrew D; Young, Christine D; Amatayakul, Margret; Dieter, Michael G; Pawola, Lawrence M

    2017-01-01

    The purpose of this vision paper is to identify how data visualization could transform healthcare. Electronic Health Records (EHRs) are maturing, with new technology and tools being applied. Researchers are reaping the benefits of data visualization to better access compilations of EHR data for enhanced clinical research. Data visualization, while still primarily the domain of clinical researchers, is beginning to show promise for other stakeholders. A non-exhaustive review of the literature indicates that, relative to the growth and development of the EHR, the maturity of data visualization in healthcare is in its infancy. Visual analytics has been only cursorily applied to healthcare. A fundamental issue contributing to fragmentation and poor coordination of healthcare delivery is that each member of the healthcare team, including patients, has a different view. Summarizing all of this care comprehensively for any member of the healthcare team is a "wickedly hard" visual analytics and data visualization problem to solve.

  19. LEARNERS: Interdisciplinary Learning Technology Projects Provide Visualizations and Simulations for Use of Geospatial Data in the Classroom

    NASA Astrophysics Data System (ADS)

    Farrell, N.; Hoban, S.

    2001-05-01

    The NASA Leading Educators to Applications, Research and NASA-related Educational Resources in Science (LEARNERS) initiative supports seven projects for enhancing kindergarten-to-high school science, geography, technology and mathematics education through Internet-based products derived from content from NASA missions. Topics incorporated in LEARNERS projects include remote sensing of the Earth for agriculture and weather/climate studies, and virtual exploration of remote worlds using robotics and digital imagery. Learners are engaged in inquiry or problem-based learning, often assuming the role of an expert scientist as part of an interdisciplinary science team, to study and explain practical problems using real-time NASA data. The presentation/poster will demonstrate novel uses of remote sensing data for K-12 and post-secondary students, including the use of visualizations, tools for educators, datasets, and classroom scenarios.

  20. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  1. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  2. Visual Analytics of Surveillance Data on Foodborne Vibriosis, United States, 1973–2010

    PubMed Central

    Sims, Jennifer N.; Isokpehi, Raphael D.; Cooper, Gabrielle A.; Bass, Michael P.; Brown, Shyretha D.; St John, Alison L.; Gulig, Paul A.; Cohly, Hari H.P.

    2011-01-01

    Foodborne illnesses caused by microbial and chemical contaminants in food are a substantial health burden worldwide. In 2007, human vibriosis (non-cholera Vibrio infections) became a notifiable disease in the United States. In addition, Vibrio species are among the 31 major known pathogens transmitted through food in the United States. Diverse surveillance systems for foodborne pathogens also track outbreaks, illnesses, hospitalization and deaths due to non-cholera vibrios. Considering the recognition of vibriosis as a notifiable disease in the United States and the availability of diverse surveillance systems, there is a need for the development of easily deployed visualization and analysis approaches that can combine diverse data sources in an interactive manner. Current efforts to address this need are still limited. Visual analytics is an iterative process conducted via visual interfaces that involves collecting information, data preprocessing, knowledge representation, interaction, and decision making. We have utilized public domain outbreak and surveillance data sources covering 1973 to 2010, as well as visual analytics software to demonstrate integrated and interactive visualizations of data on foodborne outbreaks and surveillance of Vibrio species. Through the data visualization, we were able to identify unique patterns and/or novel relationships within and across datasets regarding (i) causative agent; (ii) foodborne outbreaks and illness per state; (iii) location of infection; (iv) vehicle (food) of infection; (v) anatomical site of isolation of Vibrio species; (vi) patients and complications of vibriosis; (vii) incidence of laboratory-confirmed vibriosis and V. parahaemolyticus outbreaks. The additional use of emerging visual analytics approaches for interaction with data on vibriosis, including non-foodborne related disease, can guide disease control and prevention as well as ongoing outbreak investigations. PMID:22174586
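
    A toy version of the data-integration step behind such visualizations is sketched below: merging outbreak records with surveillance counts and pivoting by state and causative agent. The files and column names are hypothetical, and pandas stands in for the visual analytics software used in the study.

        # Combine two surveillance sources and pivot for an interactive display.
        import pandas as pd

        outbreaks = pd.read_csv("outbreaks.csv")      # year, state, species, illnesses
        surveillance = pd.read_csv("vibriosis.csv")   # year, state, confirmed_cases

        merged = outbreaks.merge(surveillance, on=["year", "state"], how="outer")
        by_agent = merged.pivot_table(index="state", columns="species",
                                      values="illnesses", aggfunc="sum")
        print(by_agent.head())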

  3. Urban Typologies: Towards an ORNL Urban Information System (UrbIS)

    NASA Astrophysics Data System (ADS)

    KC, B.; King, A. W.; Sorokine, A.; Crow, M. C.; Devarakonda, R.; Hilbert, N. L.; Karthik, R.; Patlolla, D.; Surendran Nair, S.

    2016-12-01

    Urban environments differ in a large number of key attributes; these include infrastructure, morphology, demography, and economic and social variables, among others. These attributes determine many urban properties such as energy and water consumption, greenhouse gas emissions, air quality, public health, sustainability, and vulnerability and resilience to climate change. Characterization of urban environments by a single property such as population size does not sufficiently capture this complexity. In addressing this multivariate complexity one typically faces such problems as disparate and scattered data, challenges of big data management, spatial searching, insufficient computational capacity for data-driven analysis and modelling, and the lack of tools to quickly visualize the data and compare analytical results across different cities and regions. We have begun the development of an Urban Information System (UrbIS) to address these issues, one that embraces the multivariate "big data" of urban areas and their environments across the United States utilizing the Big Data as a Service (BDaaS) concept. With technological roots in High-Performance Computing (HPC), BDaaS is based on the idea of outsourcing computations to different computing paradigms, scalable up to supercomputers. UrbIS aims to incorporate federated metadata search, integrated modeling and analysis, and geovisualization into a single seamless workflow. The system includes web-based 2D/3D visualization with an iGlobe interface, fast cloud-based and server-side data processing and analysis, and a metadata search engine based on the Mercury data search system developed at Oak Ridge National Laboratory (ORNL). Results of analyses will be made available through web services. We are implementing UrbIS in ORNL's Compute and Data Environment for Science (CADES) and are leveraging ORNL experience in complex data and geospatial projects. The development of UrbIS is being guided by an investigation of urban heat islands (UHI) that uses high-dimensional clustering and statistics to define urban typologies (types of cities) and to examine how UHI vary with urban type across the United States.
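
    The typology idea reduces to clustering cities on standardized multivariate attributes. The sketch below is a minimal stand-in, assuming a hypothetical attribute table and using k-means in place of whatever high-dimensional clustering method UrbIS ultimately adopts.

        # Cluster cities into "urban types" from multivariate attributes.
        import pandas as pd
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        cities = pd.read_csv("urban_attributes.csv")   # hypothetical: city, density, ...
        X = StandardScaler().fit_transform(cities.drop(columns=["city"]))

        cities["urban_type"] = KMeans(n_clusters=6, n_init=10).fit_predict(X)
        print(cities.groupby("urban_type").mean(numeric_only=True))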

  4. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, two barriers currently limit wider usage of such approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available, so users usually share their files in the portal but have no means to visually explore them without leaving the portal environment with which they are familiar. We developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet standard 2.0 (JSR-286), and it is currently deployable in one of the most popular open source portal frameworks, Liferay. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool using example cases in which drought and storm events with both time and space dimensions can be explored in this web-based KML animation portlet. The tool provides an easy-to-use, web browser-based portal environment for multiple users to collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
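
    The time-aware KML that such a portlet animates can be produced in a few lines. The sketch below uses the simplekml package rather than the authors' kml-soc-ncsa toolkit, and the event records are invented; each feature's TimeSpan is what drives the Google Earth time slider.

        # Write a time-aware KML track that a time-slider animation can play back.
        import simplekml

        events = [  # hypothetical (name, lon, lat, begin, end) storm positions
            ("storm t0", -91.0, 40.1, "2008-06-01", "2008-06-02"),
            ("storm t1", -90.6, 40.4, "2008-06-02", "2008-06-03"),
        ]

        kml = simplekml.Kml()
        for name, lon, lat, begin, end in events:
            p = kml.newpoint(name=name, coords=[(lon, lat)])
            p.timespan.begin = begin   # features with TimeSpans animate over time
            p.timespan.end = end
        kml.save("events_timeaware.kml")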

  5. Along the Virtuality Continuum - Two Showcases on how xR Technologies Transform Geoscience Research and Education

    NASA Astrophysics Data System (ADS)

    Klippel, A.; Zhao, J.; Masrur, A.; Wallgruen, J. O.; La Femina, P. C.

    2017-12-01

    We present work along the virtuality continuum showcasing both AR and VR environments for geoscience applications and research. The AR/VR project focuses on one of the most prominent landmarks on the Penn State campus, which is at the same time a representation of the geology of Pennsylvania. The Penn State Obelisk is a 32-foot-high, 51 ton monument composed of 281 rocks collected from across Pennsylvania. While information about its origins and composition is scattered across articles and some web databases, we compiled all the available data from the web and archives and curated them as a basis for an immersive xR experience. Tabular data were amended by xR data such as 360° photos, videos, and 3D models (e.g., of the Obelisk). Our xR (both AR and VR) prototype provides an immersive analytical environment that supports interactive data visualization and virtual navigation in a natural environment (a campus model of today and of 1896, the year of the Obelisk's installation). This work-in-progress project can provide an interactive immersive learning platform (specifically, for K-12 and introductory-level geosciences students) where the learning process is enhanced through seamless navigation between 3D data space and physical space. The second, VR-focused application creates and empirically evaluates virtual reality (VR) experiences for geosciences research, specifically an interactive volcano experience based on LiDAR and image data of Iceland's Thrihnukar volcano. The prototype addresses the lack of content and tools for immersive virtual reality (iVR) in geoscientific education and research, and how to make it easier to integrate iVR into research and classroom experiences. It makes use of environmentally sensed data such that interaction and linked content can be integrated into a single experience. We discuss our workflows as well as methods and authoring tools for iVR analysis and the creation of virtual experiences. These methods and tools aim to enhance the utility of geospatial data from repositories such as OpenTopography.org by unlocking treasure-troves of geospatial data for VR applications. Their enhanced accessibility in education and research for the geosciences and beyond will benefit geoscientists and educators who cannot be expected to be VR and 3D application experts.

  6. Assessing Embedded Geospatial Student Learning Outcomes

    ERIC Educational Resources Information Center

    Carr, John David

    2012-01-01

    Geospatial tools and technologies have become core competencies for natural resource professionals due to the monitoring, modeling, and mapping capabilities they provide. To prepare students with needed background, geospatial instructional activities were integrated across Forest Management; Natural Resources; Fisheries, Wildlife, &…

  7. UASs for geospatial data

    USDA-ARS?s Scientific Manuscript database

    Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e.g., sampling of atmospheric constituents). The term geospatial data r...

  8. Integrated Use of Remote Sensed Data and Numerical Cartography for the Generation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Bitelli, G.; Girelli, V. A.; Lambertini, A.

    2018-05-01

    3D city models are becoming increasingly popular and important because they constitute the base for all visualization, planning and management operations regarding urban infrastructure. These data are, however, not available for the majority of cities: in this paper, the possibility of using geospatial data of various kinds to generate 3D models of urban environments is investigated. In 3D modelling work, the starting data are frequently 3D point clouds, which can nowadays be collected by different sensors mounted on different platforms: LiDAR, imagery from satellite, airborne or unmanned aerial vehicles, and mobile mapping systems that integrate several sensors. The processing of the acquired data, and consequently the ability to obtain models that provide geometric accuracy and good visual impact, is limited by time, cost and logistic constraints. Nowadays, more and more innovative hardware and software solutions offer municipalities and public authorities the possibility of using available geospatial data, acquired for diverse aims, to generate 3D models of buildings and cities characterized by different levels of detail. In the paper two case studies are presented, both regarding surveys carried out in the Emilia-Romagna region, Italy, where 2D or 2.5D numerical maps are available. The first concerns the use of oblique aerial images acquired by the Municipality for systematic documentation of the built environment; the second concerns the use of LiDAR data acquired for other purposes. In the two tests, these data were used in conjunction with large-scale numerical maps to produce 3D city models.

  9. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    NASA Astrophysics Data System (ADS)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
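
    A hedged sketch of the kind of join the Earth Engine APIs enable is shown below: reducing a gridded population layer and a surface-water layer over one region, entirely server-side. The asset IDs follow the public data catalog but should be treated as illustrative, and the snippet assumes prior authentication.

        # Summarize population and surface-water occurrence over a region in Earth Engine.
        import ee

        ee.Initialize()                                          # assumes ee auth is set up
        region = ee.Geometry.Rectangle([33.0, -2.0, 35.0, 0.0])  # arbitrary example AOI

        pop = ee.ImageCollection("CIESIN/GPWv411/GPW_Population_Count").first()
        water = ee.Image("JRC/GSW1_4/GlobalSurfaceWater").select("occurrence")

        stats = pop.addBands(water).reduceRegion(
            reducer=ee.Reducer.mean(), geometry=region, scale=1000, maxPixels=1e9)
        print(stats.getInfo())                                   # pulls only the summary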

  10. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  11. Implementing a Web-Based Decision Support System to Spatially and Statistically Analyze Ecological Conditions of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.

    2014-12-01

    The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
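
    The Plotly-based statistics step can be sketched briefly: plot a hydrologic variable over time for one selected subbasin without downloading the full dataset. The data file, column names, and subbasin name below are hypothetical.

        # Graph a subbasin time series with Plotly, as the Statistical Analysis
        # Portal does interactively.
        import pandas as pd
        import plotly.express as px

        df = pd.read_csv("sierra_subbasin_stats.csv")   # subbasin, year, snowmelt_mm
        subset = df[df["subbasin"] == "Upper Kern"]

        fig = px.line(subset, x="year", y="snowmelt_mm",
                      title="Modeled snowmelt, Upper Kern subbasin")
        fig.show()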

  12. Quality Assessment and Accessibility Applications of Crowdsourced Geospatial Data: A Report on the Development and Extension of the George Mason University Geocrowdsourcing Testbed

    DTIC Science & Technology

    2014-09-01

    Approved for public release; distribution is unlimited. Prepared for the U.S. Army Engineer Research and Development Center (ERDC), U.S. Army Corps of Engineers, under Data Level Enterprise Tools; monitored by the ERDC Geospatial Research Laboratory, 7701 Telegraph Road, Alexandria, VA.

  13. An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight

    NASA Astrophysics Data System (ADS)

    Petters, J.; Coleman, S.; Andrea, O.

    2016-12-01

    A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.

  14. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    PubMed

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine the distribution of commercial geospatial data resources relative to health needs, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial data, these data resources are a potential resource for examining healthcare utilisation that requires further evaluation. In countries such as Sierra Leone where there is high mortality but minimal commercial geospatial data, alternative approaches such as open data use are needed in quantifying patient travel times, geocoding patient addresses, and characterising patients' neighbourhoods.
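
    The relative concentration index used here has a convenient covariance form, C = 2*cov(y, r)/mean(y), where r is the fractional rank of countries ordered by the exposure variable (the geospatial data index). The sketch below applies it to invented numbers; a negative C indicates mortality concentrated among countries with low data availability, matching the inverse relationship reported.

        # Relative concentration index via the covariance formula (toy data).
        import numpy as np

        geo_index = np.array([0.2, 0.4, 0.5, 0.7, 0.9])   # data availability, low to high
        mortality = np.array([9.1, 7.4, 6.0, 5.2, 4.0])   # age-standardised rate

        order = np.argsort(geo_index)
        rank = (np.arange(len(geo_index)) + 0.5) / len(geo_index)  # fractional ranks
        y = mortality[order]

        C = 2 * np.cov(y, rank, bias=True)[0, 1] / y.mean()
        print(f"relative concentration index: {C:.3f}")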

  15. 78 FR 69393 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-19

    .... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...

  16. 77 FR 5820 - National Geospatial Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY... that the Secretary of the Interior has renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations to the Federal Geographic Data Committee (FGDC), through...

  17. THE NEVADA GEOSPATIAL DATA BROWSER

    EPA Science Inventory

    The Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) has developed the Nevada Geospatial Data Browser, a spatial data archive to centralize and distribute the geospatial data used to create the land cover, vertebrate habitat models, and land o...

  18. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    ... of automated processing. ... Requirements for Geospatial Information: Accurate, timely geospatial information is critical for many military... this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of...

  19. Nick Grue | NREL

    Science.gov Websites

    Areas of expertise: geospatial data analysis using parallel processing; high-performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products. Research interests: rapid, web-based renewable resource analysis.

  20. Geospatial Information Best Practices

    DTIC Science & Technology

    2012-01-01

    Spring 2012. By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint. The fact that Geospatial information can be codified and... Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial...

  1. SnapShot: Visualization to Propel Ice Hockey Analytics.

    PubMed

    Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T

    2012-12-01

    Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data; given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
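
    The radial heat map lends itself to a compact reconstruction: bin shots by angle and distance from the goal, then shade bin counts on a polar grid. The sketch below uses randomly generated placeholder shots rather than SnapShot's data, and matplotlib rather than SnapShot's own rendering.

        # Radial heat map: shot density binned by angle and shot length.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(7)
        angle = rng.uniform(-np.pi / 2, np.pi / 2, 500)   # direction relative to goal
        dist = rng.gamma(4.0, 8.0, 500)                   # shot length in feet

        counts, a_edges, r_edges = np.histogram2d(
            angle, dist, bins=[18, 12], range=[[-np.pi / 2, np.pi / 2], [0, 90]])

        ax = plt.subplot(projection="polar")
        A, R = np.meshgrid(a_edges, r_edges, indexing="ij")
        ax.pcolormesh(A, R, counts, cmap="hot")
        ax.set_title("Shot density by angle and length (synthetic)")
        plt.show()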

  2. US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOSPATIAL SOLUTIONS

    EPA Science Inventory

    In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...

  3. Searches over graphs representing geospatial-temporal remote sensing data

    DOEpatents

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
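
    The representation is easy to render concretely: nodes for observed objects, paired directed edges standing in for undirected spatial relationships, and directed edges for time. The toy below uses networkx with invented objects; the patent's actual encoding and search algorithms are more elaborate.

        # Toy geospatial-temporal graph: spatial "near" links plus temporal edges.
        import networkx as nx

        G = nx.DiGraph()
        G.add_node("building_t0", kind="building", t=0)
        G.add_node("road_t0", kind="road", t=0)
        G.add_node("building_t1", kind="building", t=1)

        # Spatial relationship at one time step, labelled with distance; a pair of
        # directed edges stands in for one undirected edge.
        G.add_edge("building_t0", "road_t0", rel="near", dist=20)
        G.add_edge("road_t0", "building_t0", rel="near", dist=20)

        # Temporal edge: the same object observed at the next time step.
        G.add_edge("building_t0", "building_t1", rel="time")

        # A search is then pattern matching over the graph, e.g. objects that persist.
        persistent = [(u, v) for u, v, d in G.edges(data=True) if d["rel"] == "time"]
        print(persistent)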

  4. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  5. 78 FR 32635 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-31

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to Add a New System of Records. SUMMARY: The National Geospatial-Intelligence Agency is establishing a new system of... information. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency [[Page 32636

  6. 78 FR 35606 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-13

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System of Records. SUMMARY: The National Geospatial-Intelligence Agency is altering a system of records in.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...

  7. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  8. A Paper-Based Electrochromic Array for Visualized Electrochemical Sensing.

    PubMed

    Zhang, Fengling; Cai, Tianyi; Ma, Liang; Zhan, Liyuan; Liu, Hong

    2017-01-31

    We report a battery-powered, paper-based electrochromic array for visualized electrochemical sensing. The paper-based sensing system consists of six parallel electrochemical cells, which are powered by an aluminum-air battery. Each single electrochemical cell uses a Prussian Blue spot electrodeposited on an indium-doped tin oxide thin film as the electrochromic indicator. Each electrochemical cell is preloaded with increasing amounts of analyte. The sample activates the battery for the sensing. Both the preloaded analyte and the analyte in the sample initiate the color change of Prussian Blue to Prussian White. With a reaction time of 60 s, the number of electrochemical cells with complete color changes is correlated to the concentration of analyte in the sample. As a proof-of-concept analyte, lactic acid was detected semi-quantitatively using the naked eye.

  9. Insight solutions are correct more often than analytic solutions

    PubMed Central

    Salvi, Carola; Bricolo, Emanuela; Kounios, John; Bowden, Edward; Beeman, Mark

    2016-01-01

    How accurate are insights compared to analytical solutions? In four experiments, we investigated how participants’ solving strategies influenced their solution accuracies across different types of problems, including one that was linguistic, one that was visual and two that were mixed visual-linguistic. In each experiment, participants’ self-judged insight solutions were, on average, more accurate than their analytic ones. We hypothesised that insight solutions have superior accuracy because they emerge into consciousness in an all-or-nothing fashion when the unconscious solving process is complete, whereas analytic solutions can be guesses based on conscious, prematurely terminated, processing. This hypothesis is supported by the finding that participants’ analytic solutions included relatively more incorrect responses (i.e., errors of commission) than timeouts (i.e., errors of omission) compared to their insight responses. PMID:27667960

  10. SWOT analysis on National Common Geospatial Information Service Platform of China

    NASA Astrophysics Data System (ADS)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product services to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are increasingly obvious, due to the emerging requirements of e-government construction, the remarkable development of IT technology, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services, and an API for further development, to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  11. Real-time notification and improved situational awareness in fire emergencies using geospatial-based publish/subscribe

    NASA Astrophysics Data System (ADS)

    Kassab, Ala'; Liang, Steve; Gao, Yang

    2010-12-01

    Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and serving a large number of clients having heterogeneous and specific interests in the data on the other side. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, pull-mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations from wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
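
    The matching step at the core of such a framework can be sketched as a toy broker: subscriptions carry a topic and a bounding box, and an incoming geospatial event is delivered to every subscriber whose box contains the event's location. The names and structure below are invented stand-ins for the RFERS middleware.

        # Minimal geospatial publish/subscribe broker with bounding-box matching.
        from dataclasses import dataclass, field

        @dataclass
        class Subscription:
            topic: str
            bbox: tuple        # (min_lon, min_lat, max_lon, max_lat)
            deliver: object    # callable invoked with the event payload

        @dataclass
        class Broker:
            subs: list = field(default_factory=list)

            def subscribe(self, sub):
                self.subs.append(sub)

            def publish(self, topic, lon, lat, payload):
                for s in self.subs:
                    x0, y0, x1, y1 = s.bbox
                    if s.topic == topic and x0 <= lon <= x1 and y0 <= lat <= y1:
                        s.deliver(payload)

        broker = Broker()
        broker.subscribe(Subscription("fire_incident", (-115.0, 49.0, -110.0, 54.0),
                                      lambda p: print("ALERT:", p)))
        broker.publish("fire_incident", -113.2, 51.0, {"severity": "high"})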

  12. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  13. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  14. GIS-based Geospatial Infrastructure of Water Resource Assessment for Supporting Oil Shale Development in Piceance Basin of Northwestern Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew D; Mattson, Earl D

    Oil shale deposits of the Green River Formation (GRF) in Northwestern Colorado, Southwestern Wyoming, and Northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich fine-grained sedimentary rock that contains significant amounts of kerogen from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand for and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Due to these potential environmental impacts of oil shale development, water usage issues need to be further studied. A basin-wide baseline of oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models including, but not limited to, three-dimensional (3D) geologic, energy resource development system, and surface water models. Such a centralized geospatial infrastructure made it possible to directly generate model inputs from the same database and to indirectly couple the different models through inputs/outputs. This ensures consistency of analyses conducted by researchers from different institutions, and helps decision makers to balance the water budget based on the spatial distribution of the oil shale and water resources and the spatial variations of the geologic, topographic, and hydrogeological characterization of the basin. This endeavor encountered many technical challenges and had not been done in the past for any oil shale basin. The database built during this study remains valuable for any future studies involving oil shale and water resource management in the Piceance Basin. The methodology applied in the development of the GIS-based geospatial infrastructure can be readily adapted by other professionals to develop database structures for other similar basins.

  15. US EPA GLOBAL POSITIONING SYSTEMS - TECHNICAL IMPLEMENTATION GUIDANCE

    EPA Science Inventory

    The U.S. EPA Geospatial Quality Council (GQC) was formed in 1998 to provide Quality Assurance guidance for the development, use, and products of geospatial activities and research. The long-term goals of the GQC are expressed in a living document, currently the EPA Geospatial Qua...

  16. Integration of Geospatial Science in Teacher Education

    ERIC Educational Resources Information Center

    Hauselt, Peggy; Helzer, Jennifer

    2012-01-01

    One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…

  17. 75 FR 43497 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ...; System of Records AGENCY: National Geospatial-Intelligence Agency (NGA), DoD. ACTION: Notice to add a system of records. SUMMARY: The National Geospatial-Intelligence Agency (NGA) proposes to add a system of...-3808. SUPPLEMENTARY INFORMATION: The National Geospatial-Intelligence Agency notices for systems of...

  18. Indigenous knowledges driving technological innovation

    Treesearch

    Lilian Alessa; Carlos Andrade; Phil Cash Cash; Christian P. Giardina; Matt Hamabata; Craig Hammer; Kai Henifin; Lee Joachim; Jay T. Johnson; Kekuhi Kealiikanakaoleohaililani; Deanna Kingston; Andrew Kliskey; Renee Pualani Louis; Amanda Lynch; Daryn McKenny; Chels Marshall; Mere Roberts; Taupouri Tangaro; Jyl Wheaton-Abraham; Everett Wingert

    2011-01-01

    This policy brief explores the use and expands the conversation on the ability of geospatial technologies to represent Indigenous cultural knowledge. Indigenous peoples' use of geospatial technologies has already proven to be a critical step for protecting tribal self-determination. However, the ontological frameworks and techniques of Western geospatial...

  19. Regional Geology Web Map Application Development: Javascript v2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russell, Glenn

    This is a milestone report for the FY2017 continuation of the Spent Fuel, Storage, and Waste Technology (SFSWT) program (formerly the Used Fuel Disposal (UFD) program) development of the Regional Geology Web Mapping Application by the Idaho National Laboratory Geospatial Science and Engineering group. This application was developed for general public use and is an interactive web-based application built in JavaScript to visualize, reference, and analyze pertinent US geological features for the SFSWT program. This tool is a version upgrade from Adobe Flex technology. It is designed to facilitate informed decision making on the geology of the continental US relevant to the SFSWT program.

  20. Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program

    ERIC Educational Resources Information Center

    Jeffrey, Scott; Alvarez, Jaime

    2010-01-01

    The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…

  1. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  2. Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools

    ERIC Educational Resources Information Center

    Zalles, Daniel R.; Manitakos, James

    2016-01-01

    Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…

  3. Fostering 21st Century Learning with Geospatial Technologies

    ERIC Educational Resources Information Center

    Hagevik, Rita A.

    2011-01-01

    Global positioning systems (GPS) receivers and other geospatial tools can help teachers create engaging, hands-on activities in all content areas. This article provides a rationale for using geospatial technologies in the middle grades and describes classroom-tested activities in English language arts, science, mathematics, and social studies.…

  4. EPA GEOSPATIAL QUALITY COUNCIL STRATEGY PLAN FY-02

    EPA Science Inventory

    The EPA Geospatial Quality Council (GQC), previously known as the EPA GIS-QA Team - EPA/600/R-00/009, was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA...

  5. Mapping and monitoring potato cropping systems in Maine: geospatial methods and land use assessments

    USDA-ARS?s Scientific Manuscript database

    Geospatial frameworks and GIS-based approaches were used to assess current cropping practices in potato production systems in Maine. Results from the geospatial integration of remotely-sensed cropland layers (2008-2011) and soil datasets for Maine revealed a four-year potato systems footprint estima...

  6. The Virginia Geocoin Adventure: An Experiential Geospatial Learning Activity

    ERIC Educational Resources Information Center

    Johnson, Laura; McGee, John; Campbell, James; Hays, Amy

    2013-01-01

    Geospatial technologies have become increasingly prevalent across our society. Educators at all levels have expressed a need for additional resources that can be easily adopted to support geospatial literacy and state standards of learning, while enhancing the overall learning experience. The Virginia Geocoin Adventure supports the needs of 4-H…

  7. Geospatial Technology

    ERIC Educational Resources Information Center

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  8. A Geospatial Online Instruction Model

    ERIC Educational Resources Information Center

    Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi

    2012-01-01

    The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…

  9. lawn: An R client for the Turf JavaScript Library for Geospatial Analysis

    EPA Science Inventory

    lawn is an R package to provide access to the geospatial analysis capabilities in the Turf JavaScript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON providing an easier method for conducting geospatial analyses on thes...
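
    The record above concerns an R client, but the underlying pattern (GeoJSON in, GeoJSON out) is easy to illustrate. Below is a minimal Python analogue of a Turf-style buffer-and-measure step using the shapely library; the coordinates and buffer distance are illustrative and not taken from the lawn package.

        from shapely.geometry import shape, mapping

        feature = {   # a GeoJSON Feature, as Turf and lawn would consume it
            "type": "Feature",
            "properties": {"name": "monitoring site"},
            "geometry": {"type": "Point", "coordinates": [-77.03, 38.89]},
        }

        geom = shape(feature["geometry"])    # GeoJSON dict -> shapely geometry
        buffered = geom.buffer(0.01)         # buffer distance in degrees (illustrative)
        print(buffered.area)                 # area of the buffered polygon
        print(mapping(buffered)["type"])     # back to a GeoJSON-style dict: 'Polygon'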

  10. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    PubMed Central

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-01-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928

  11. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  12. Visualizing Earth and Planetary Remote Sensing Data Using JMARS

    NASA Astrophysics Data System (ADS)

    Dickenshied, S.; Christensen, P. R.; Carter, S.; Anwar, S.; Noss, D.

    2014-12-01

    JMARS (Java Mission-planning and Analysis for Remote Sensing) is a free geospatial application developed by the Mars Space Flight Facility at Arizona State University. Originally written as a mission planning tool for the THEMIS instrument on board the MARS Odyssey Spacecraft, it was released as an analysis tool to the general public in 2003. Since then it has expanded to be used for mission planning and scientific data analysis by additional NASA missions to Mars, the Moon, and Vesta, and it has come to be used by scientists, researchers and students of all ages from more than 40 countries around the world. The public version of JMARS now also includes remote sensing data for Mercury, Venus, Earth, the Moon, Mars, and a number of the moons of Jupiter and Saturn. Additional datasets for asteroids and other smaller bodies are being added as they become available and time permits. JMARS fuses data from different instruments in a geographical context. One core strength of JMARS is that it provides access to geospatially registered data via a consistent interface. Such data include global images (graphical and numeric), local mosaics, individual instrument images, spectra, and vector-oriented data. By hosting these products, users are able to avoid searching for, downloading, decoding, and projecting data on their own using a disparate set of tools and procedures. The JMARS team processes, indexes, and reorganizes data to make it quickly and easily accessible in a consistent manner. JMARS leverages many open-source technologies and tools to accomplish these data preparation steps. In addition to visualizing multiple datasets in context with one another, JMARS allows a user to find data products from differing missions that intersect the same geographical location, time range, or observational parameters. Any number of georegistered datasets can then be viewed or analyzed simultaneously with one another. A user can easily create a mosaic of graphic data, plot numeric data, or project any arbitrary scene over surface topography. All of these visualization options can be exported for use in presentations, publications, or for further analysis in other tools.

  13. Geospatial Education: Working with the NASA Airborne Science Program

    NASA Astrophysics Data System (ADS)

    Lockwood, C. M.; Handley, L.; Handley, N.

    2010-12-01

    WETMAAP (Wetland Education Through Maps and Aerial Photography), a program of CNL World, supports the NASA Strategic Goals and Objectives for Education by providing classroom teachers and formal and informal educators with professional development. WETMAAP promotes science by inquiry through the use of a building-block process, comparative analysis, and analytical observations. Through the WETMAAP workshops and website, educators receive the concepts necessary to provide students with a basic understanding of maps, aerial photography, and satellite and airborne imagery that focus on the study of wetlands and wetland change. The program targets educators, Grades 5-12, in earth science, environmental science, biology, geography, and mathematics, and emphasizes a comprehensive curriculum approach.

  14. Large-Scale Overlays and Trends: Visually Mining, Panning and Zooming the Observable Universe.

    PubMed

    Luciani, Timothy Basil; Cherinka, Brian; Oliphant, Daniel; Myers, Sean; Wood-Vasey, W Michael; Labrinidis, Alexandros; Marai, G Elisabeta

    2014-07-01

    We introduce a web-based computing infrastructure to assist the visual integration, mining and interactive navigation of large-scale astronomy observations. Following an analysis of the application domain, we design a client-server architecture to fetch distributed image data and to partition local data into a spatial index structure that allows prefix-matching of spatial objects. In conjunction with hardware-accelerated pixel-based overlays and an online cross-registration pipeline, this approach allows the fetching, displaying, panning and zooming of gigabit panoramas of the sky in real time. To further facilitate the integration and mining of spatial and non-spatial data, we introduce interactive trend images: compact visual representations for identifying outlier objects and for studying trends within large collections of spatial objects of a given class. In a demonstration, images from three sky surveys (SDSS, FIRST and simulated LSST results) are cross-registered and integrated as overlays, allowing cross-spectrum analysis of astronomy observations. Trend images are interactively generated from catalog data and used to visually mine astronomy observations of similar type. The front-end of the infrastructure uses the web technologies WebGL and HTML5 to enable cross-platform, web-based functionality. Our approach attains interactive rendering frame rates; its power and flexibility enable it to serve the needs of the astronomy community. Evaluation on three case studies, as well as feedback from domain experts, emphasizes the benefits of this visual approach to the observational astronomy field and its potential benefits to large-scale geospatial visualization in general.
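
    A sketch of the prefix-matching idea behind the spatial index described above, assuming a quadtree-style string key: objects that share a key prefix fall in the same cell at that depth, so a coarse-cell query is a string prefix test. The coordinate normalization, depth, and object names are illustrative assumptions, not the paper's implementation.

        def quadkey(x, y, depth=8, extent=(0.0, 0.0, 1.0, 1.0)):
            """Quadtree key for a point; shared prefixes mean shared cells."""
            x0, y0, x1, y1 = extent
            key = ""
            for _ in range(depth):
                xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
                key += str((2 if y >= ym else 0) + (1 if x >= xm else 0))
                x0, x1 = (xm, x1) if x >= xm else (x0, xm)
                y0, y1 = (ym, y1) if y >= ym else (y0, ym)
            return key

        # Index objects by full-depth key; prefix matching then finds everything
        # in a coarser cell without any geometry tests.
        index = {quadkey(0.120, 0.340): "object A", quadkey(0.123, 0.345): "object B"}
        cell = quadkey(0.120, 0.340, depth=4)
        print([v for k, v in index.items() if k.startswith(cell)])   # both objects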

  15. Leveraging multidisciplinarity in a visual analytics graduate course.

    PubMed

    Elmqvist, Niklas; Ebert, David S

    2012-01-01

    Demand is growing in engineering, business, science, research, and industry for students with visual analytics expertise. However, teaching VA is challenging owing to the multidisciplinary nature of the topic, students' diverse backgrounds, and the corresponding requirements for instructors. This article reports best practices from a VA graduate course at Purdue University, where instructors leveraged these challenges to their advantage instead of trying to mitigate them.

  16. Open and scalable analytics of large Earth observation datasets: From scenes to multidimensional arrays using SciDB and GDAL

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Lahn, Florian; Buytaert, Wouter; Pebesma, Edzer

    2018-04-01

    Earth observation (EO) datasets are commonly provided as collections of scenes, where individual scenes represent a temporal snapshot and cover a particular region on the Earth's surface. Using these data in complex spatiotemporal modeling becomes difficult as soon as data volumes exceed a certain capacity or analyses include many scenes, which may spatially overlap and may have been recorded at different dates. In order to facilitate analytics on large EO datasets, we combine and extend the geospatial data abstraction library (GDAL) and the array-based data management and analytics system SciDB. We present an approach to automatically convert collections of scenes to multidimensional arrays and use SciDB to scale computationally intensive analytics. We evaluate the approach in three case studies on national-scale land use change monitoring with Landsat imagery, global empirical orthogonal function analysis of daily precipitation, and combining historical climate model projections with satellite-based observations. Results indicate that the approach can be used to represent various EO datasets and that analyses in SciDB scale well with available computational resources. To simplify analyses of higher-dimensional datasets such as climate model output, however, a generalization of the GDAL data model might be needed. All parts of this work have been implemented as open-source software and we discuss how this may facilitate open and reproducible EO analyses.
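
    A minimal sketch of the GDAL-side step described above: reading a collection of scenes and stacking them into one multidimensional array before loading into an array database. The file names are placeholders, the scenes are assumed to share grid, extent and projection, and the SciDB loading itself is not shown.

        import numpy as np
        from osgeo import gdal

        # Placeholder file names; scenes are assumed co-registered on one grid.
        scene_paths = ["scene_2016_01.tif", "scene_2016_02.tif", "scene_2016_03.tif"]

        bands = [gdal.Open(p).GetRasterBand(1).ReadAsArray() for p in scene_paths]
        cube = np.stack(bands)                  # (time, y, x) array, ready for loading
        print(cube.shape, float(cube.mean()))   # a trivial reduction over the cube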

  17. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    ERIC Educational Resources Information Center

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  18. Estimating Prediction Uncertainty from Geographical Information System Raster Processing: A User's Manual for the Raster Error Propagation Tool (REPTool)

    USGS Publications Warehouse

    Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.

    2009-01-01

    The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
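
    REPTool's central mechanism, Latin Hypercube Sampling of input error propagated through a geospatial model to yield a per-cell output uncertainty distribution, can be sketched in a few lines of NumPy. The one-coefficient model y = a * x and all error magnitudes below are stand-ins for illustration, not REPTool's own model or defaults.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        def latin_hypercube_normal(n, mean, sd):
            """n stratified draws from N(mean, sd): one per equal-probability stratum."""
            u = (np.arange(n) + rng.uniform(size=n)) / n   # one point per stratum
            rng.shuffle(u)                                 # decorrelate stratum order
            return mean + sd * norm.ppf(u)                 # inverse-CDF transform

        n = 200
        x = np.array([[10.0, 12.0], [8.0, 11.0]])          # tiny input raster (2x2 cells)
        x_err_sd = 0.5                                     # spatially invariant input error
        a = latin_hypercube_normal(n, mean=2.0, sd=0.1)    # uncertain model coefficient

        # Propagate: each realization perturbs the raster and the coefficient,
        # then runs the (stand-in) model y = a * x.
        runs = np.stack([ai * (x + rng.normal(0.0, x_err_sd, size=x.shape)) for ai in a])
        print(runs.mean(axis=0))   # per-cell expected model output
        print(runs.std(axis=0))    # per-cell prediction uncertainty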

  19. An updated geospatial liquefaction model for global application

    USGS Publications Warehouse

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.

    2017-01-01

    We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
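
    A sketch of the model class and scoring used in this kind of study: logistic regression of liquefaction occurrence on geospatial proxies, evaluated with ROC AUC and the Brier score. The data below are synthetic stand-ins, and the coefficients do not reflect the paper's fitted model.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, brier_score_loss

        rng = np.random.default_rng(1)
        n = 500
        X = np.column_stack([
            rng.normal(400, 100, n),   # slope-derived Vs30 (m/s)
            rng.uniform(0, 20, n),     # modeled water table depth (m)
            rng.uniform(0, 50, n),     # distance to closest water body (km)
            rng.normal(30, 10, n),     # peak ground velocity (cm/s)
        ])
        # Synthetic truth: shallow water table plus strong shaking raises the odds.
        logit = -2.0 - 0.15 * X[:, 1] + 0.08 * X[:, 3]
        y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p = model.predict_proba(X)[:, 1]
        print("AUC:", roc_auc_score(y, p), "Brier:", brier_score_loss(y, p))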

  20. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    NASA Astrophysics Data System (ADS)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of a geo portal designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition, and for us, integrating multi-source geospatial data is the main means of acquiring it. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services over the internet, the e-government network and a classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data comes from these nodes, and the different datasets are heterogeneous. Based on an analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. We then develop the technical procedure and, based on these principles, propose the method used to process different categories of features such as roads, railways, boundaries, rivers, settlements and buildings. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.

  1. Remote sensing applied to resource management

    Treesearch

    Henry M. Lachowski

    1998-01-01

    Effective management of forest resources requires access to current and consistent geospatial information that can be shared by resource managers and the public. Geospatial information describing our land and natural resources comes from many sources and is most effective when stored in a geospatial database and used in a geographic information system (GIS). The...

  2. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    ERIC Educational Resources Information Center

    Kulo, Violet; Bodzin, Alec

    2013-01-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…

  3. Introduction to the Complex Geospatial Web in Geographical Education

    ERIC Educational Resources Information Center

    Papadimitriou, Fivos

    2010-01-01

    The Geospatial Web is emerging in the geographical education landscape in all its complexity. How will geographers and educators react? What are the most important facets of this development? After reviewing the possible impacts on geographical education, it can be conjectured that the Geospatial Web will eventually replace the usual geographical…

  4. Dylan Hettinger | NREL

    Science.gov Websites

    Dylan Hettinger is a Geospatial Data Scientist on the Geospatial Data Science team within the Systems Modeling & Geospatial Data Science Group in NREL's Strategic Energy Analysis Center (Dylan.Hettinger@nrel.gov | 303-275-3750).

  5. The Impact of Professional Development in Natural Resource Investigations Using Geospatial Technologies

    ERIC Educational Resources Information Center

    Hanley, Carol D.; Davis, Hilarie B.; Davey, Bradford T.

    2012-01-01

    As use of geospatial technologies has increased in the workplace, so has interest in using these technologies in the K-12 classroom. Prior research has identified several reasons for using geospatial technologies in the classroom, such as developing spatial thinking, supporting local investigations, analyzing changes in the environment, and…

  6. The Sky's the Limit: Integrating Geospatial Tools with Pre-College Youth Education

    ERIC Educational Resources Information Center

    McGee, John; Kirwan, Jeff

    2010-01-01

    Geospatial tools, which include global positioning systems (GPS), geographic information systems (GIS), and remote sensing, are increasingly driving a variety of applications. Local governments and private industry are embracing these tools, and the public is beginning to demand geospatial services. The U.S. Department of Labor (DOL) reported that…

  7. Using the Geospatial Web to Deliver and Teach Giscience Education Programs

    NASA Astrophysics Data System (ADS)

    Veenendaal, B.

    2015-05-01

    Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to enabling and enhancing GIScience education, the web also serves as the infrastructure for communication and collaboration among geospatial data and users. The web becomes both the means and the content for a geospatial education program. However, the web does not replace the traditional face-to-face environment; rather, it is a means to enhance it, expand it and enable an authentic and real world learning environment. This paper outlines the use of the web in both the delivery and content of the GIScience program at Curtin University. The teaching of the geospatial web, web- and cloud-based mapping, and geospatial web services forms a key component of the program, and the web and online learning are important means of delivering it. Some examples of authentic and real world learning environments are provided, including joint learning activities with partner universities.

  8. A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs

    PubMed Central

    Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav

    2015-01-01

    With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265
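
    The enrichment step described above can be sketched as follows, assuming a toy in-memory gazetteer standing in for the external geospatial data sources; the feature names, coordinates and search radius are illustrative only.

        from math import hypot

        GAZETTEER = [  # (name, type, lon, lat): illustrative entries only
            ("Botanic Gardens", "park", -5.93, 54.60),
            ("City Hall", "building", -5.92, 54.59),
        ]

        def enrich(lon, lat, radius=0.02):
            """Semantic metadata for known features near a photo's GPS position."""
            nearby = [
                {"name": n, "type": t, "distance": hypot(lon - x, lat - y)}
                for n, t, x, y in GAZETTEER
                if hypot(lon - x, lat - y) <= radius
            ]
            nearby.sort(key=lambda f: f["distance"])
            return {"geo": {"lon": lon, "lat": lat}, "features": nearby}

        print(enrich(-5.928, 54.598))   # metadata a photo-search index could store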

  9. Citing geospatial feature inventories with XML manifests

    NASA Astrophysics Data System (ADS)

    Bose, R.; McGarva, G.

    2006-12-01

    Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
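
    A minimal sketch of such a manifest, built with Python's standard library; the element and attribute names, and the feature identifiers, are illustrative rather than the schema proposed in the paper.

        import xml.etree.ElementTree as ET

        manifest = ET.Element("featureManifest",
                              source="OS MasterMap", retrieved="2006-10-01")
        # Illustrative feature identifiers and version numbers.
        for toid, version in [("osgb1000000000001", "3"), ("osgb1000000000002", "7")]:
            ET.SubElement(manifest, "feature", id=toid, version=version)

        ET.dump(manifest)   # an XML fragment a citation or metadata record could carry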

  10. High resolution pollutant measurements in complex urban ...

    EPA Pesticide Factsheets

    Measuring air pollution in real-time using an instrumented vehicle platform has been an emerging strategy to resolve air pollution trends at a very fine spatial scale (10s of meters). Achieving second-by-second data representative of urban air quality trends requires advanced instrumentation, such as a quantum cascade laser utilized to resolve carbon monoxide and real-time optical detection of black carbon. An equally challenging area of development is processing and visualization of complex geospatial air monitoring data to decipher key trends of interest. EPA’s Office of Research and Development (ORD) staff have applied air monitoring to evaluate community air quality in a variety of environments, including assessing air quality surrounding rail yards, evaluating noise wall or tree stand effects on roadside and on-road air quality, and surveying of traffic-related exposure zones for comparison with land-use regression estimates. ORD has ongoing efforts to improve mobile monitoring data collection and interpretation, including instrumentation testing, evaluating the effect of post-processing algorithms on derived trends, and developing a web-based tool called Real-Time Geospatial Data Viewer (RETIGO) allowing for a simple plug-and-play of mobile monitoring data. Example findings from mobile data sets include an estimated 50% reduction in roadside ultrafine particle levels when immediately downwind of a noise barrier, and increases in neighborhood-wide black carbon levels (3...

  11. Mobile membrane introduction tandem mass spectrometry for on-the-fly measurements and adaptive sampling of VOCs around oil and gas projects in Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Krogh, E.; Gill, C.; Bell, R.; Davey, N.; Martinsen, M.; Thompson, A.; Simpson, I. J.; Blake, D. R.

    2012-12-01

    The release of hydrocarbons into the environment can have significant environmental and economic consequences. The evolution of smaller, more portable mass spectrometers to the field can provide spatially and temporally resolved information for rapid detection, adaptive sampling and decision support. We have deployed a mobile platform membrane introduction mass spectrometer (MIMS) for the in-field simultaneous measurement of volatile and semi-volatile organic compounds. In this work, we report instrument and data handling advances that produce geographically referenced data in real-time and preliminary data where these improvements have been combined with high precision ultra-trace VOCs analysis to adaptively sample air plumes near oil and gas operations in Alberta, Canada. We have modified a commercially available ion-trap mass spectrometer (Griffin ICX 400) with an in-house temperature controlled capillary hollow fibre polydimethylsiloxane (PDMS) polymer membrane interface and in-line permeation tube flow cell for a continuously infused internal standard. The system is powered by 24 VDC for remote operations in a moving vehicle. Software modifications include the ability to run continuous, interlaced tandem mass spectrometry (MS/MS) experiments for multiple contaminants/internal standards. All data are time and location stamped with on-board GPS and meteorological data to facilitate spatial and temporal data mapping. Tandem MS/MS scans were employed to simultaneously monitor ten volatile and semi-volatile analytes, including benzene, toluene, ethylbenzene and xylene (BTEX), reduced sulfur compounds, halogenated organics and naphthalene. Quantification was achieved by calibrating against a continuously infused deuterated internal standard (toluene-d8). Time referenced MS/MS data were correlated with positional data and processed using Labview and Matlab to produce calibrated, geographical Google Earth data-visualizations that enable adaptive sampling protocols. This real-time approach has been employed in a moving vehicle to identify and track downwind plumes of fugitive VOC emissions near hydrocarbon upgrading and chemical processing facilities in Fort Saskatchewan, Alberta. This information was relayed to a trailing vehicle, which collected stationary grab samples in evacuated canisters for ultra trace analysis of over seventy VOC analytes. In addition, stationary time series data were collected and compared with grab samples co-located with our sampling line. Spatially and temporally resolved, time referenced MS/MS data for several air contaminants associated with oil and gas processing were processed in real time to produce geospatial data for visualization in Google Earth. This information was used to strategically locate grab samples for high precision, ultra trace analysis.
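
    The quantification step, calibrating an analyte signal against the continuously infused internal standard, is simple arithmetic; a worked example follows, with illustrative signal intensities and an assumed relative response factor rather than measured values.

        # All numbers below are illustrative, not measured values.
        signal_analyte = 4.2e5   # benzene MS/MS ion signal (counts)
        signal_istd = 2.1e5      # toluene-d8 ion signal (counts)
        conc_istd = 5.0          # continuously infused standard concentration (ppbv)
        rrf = 0.9                # assumed relative response factor (analyte vs. istd)

        conc_analyte = (signal_analyte / signal_istd) * conc_istd / rrf
        print(f"benzene ~ {conc_analyte:.1f} ppbv")   # -> 11.1 ppbv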

  12. A Visual Analytics Approach for Station-Based Air Quality Data

    PubMed Central

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-01-01

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support. PMID:28029117

  13. A Visual Analytics Approach for Station-Based Air Quality Data.

    PubMed

    Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui

    2016-12-24

    With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support.
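
    The calendar view at the heart of both records reduces to a date-by-hour pivot of the station time series; a minimal pandas sketch follows, with a synthetic PM2.5 series standing in for real sensor data.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(2)
        idx = pd.date_range("2016-01-01", periods=24 * 28, freq="h")   # 4 weeks, hourly
        pm25 = pd.Series(50 + 20 * rng.standard_normal(len(idx)), index=idx)

        # One row per calendar date, one column per hour of day: the grid a
        # calendar-view heatmap renders.
        calendar = pm25.groupby([pm25.index.date, pm25.index.hour]).mean().unstack()
        print(calendar.shape)   # (28, 24)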

  14. Introduction to geospatial semantics and technology workshop handbook

    USGS Publications Warehouse

    Varanka, Dalia E.

    2012-01-01

    The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.

  15. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
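
    A sketch of the classifier side of such a framework: aggregated alert events as feature vectors, with a supervised and a semi-supervised learner trained side by side. The features, labels, and the choice of LabelPropagation are illustrative assumptions, not ORCA's actual components.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.semi_supervised import LabelPropagation

        rng = np.random.default_rng(3)
        X = rng.normal(size=(300, 4))              # e.g. alert rate, ports, bytes, hosts
        y = (X[:, 0] + X[:, 2] > 1).astype(int)    # synthetic "malicious" ground truth

        supervised = RandomForestClassifier(random_state=0).fit(X, y)

        y_partial = y.copy()
        y_partial[rng.uniform(size=len(y)) < 0.8] = -1   # leave ~80% of events unlabeled
        semi = LabelPropagation().fit(X, y_partial)

        print(supervised.score(X, y), semi.score(X, y))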

  16. TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.

    PubMed

    Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas

    2017-01-01

    Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
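
    A sketch of the two computations TopicLens performs on the fly, topic modeling followed by a 2D embedding, using off-the-shelf scikit-learn NMF and t-SNE in place of the paper's accelerated variants; the tiny corpus is illustrative.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import NMF
        from sklearn.manifold import TSNE

        docs = [
            "geospatial analysis of health data",
            "visual analytics for document collections",
            "topic modeling of large text corpora",
            "interactive maps for health surveillance",
            "embedding documents in two dimensions",
            "lens interfaces for data exploration",
        ]

        tfidf = TfidfVectorizer().fit_transform(docs)
        W = NMF(n_components=2, init="nndsvda").fit_transform(tfidf)   # doc-topic weights
        xy = TSNE(n_components=2, perplexity=3.0).fit_transform(W)     # 2D screen layout
        print(xy.shape)   # (6, 2): one position per document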

  17. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  18. Mapping and Analysis of Forest and Land Fire Potential Using Geospatial Technology and Mathematical Modeling

    NASA Astrophysics Data System (ADS)

    Suliman, M. D. H.; Mahmud, M.; Reba, M. N. M.; S, L. W.

    2014-02-01

    Forest and land fires can have negative implications for forest ecosystems, biodiversity, air quality and soil structure. These implications can, however, be minimized through an effective disaster management system, and effective disaster management mechanisms can be developed through an appropriate early warning system and an efficient delivery system. This study focuses on two aspects: mapping forest and land fire potential, and delivering that information to users through a WebGIS application. Geospatial technology and mathematical modeling were used to identify, classify and map areas with burning potential. The mathematical model used is the Analytical Hierarchy Process (AHP), while the geospatial technologies involved include remote sensing, geographic information systems (GIS) and digital field data collection. The entire state of Selangor was chosen as the study area based on the number of cases reported there over the last two decades. The AHP model weighs pairwise comparisons among the three main criteria of fuel, topography and human factors. Contributions from experts directly involved in forest and land fire fighting operations, comprising officials from the Fire and Rescue Department Malaysia, were also evaluated in the model. The study found that about 32.83 square kilometers of the total area of Selangor state have extreme fire potential; the extreme-potential areas identified are in Bestari Jaya and Kuala Langat High Ulu. The resulting forest and land fire potential information is displayed in a WebGIS application on the internet. Presenting information through WebGIS applications is a better approach to support the decision-making process, offering a high level of confidence and conditions that approximate reality. Agencies involved in disaster management, such as the Jawatankuasa Pengurusan Dan Bantuan Bencana (JPBB) at the district, state and national levels under the National Security Division, and the Fire and Rescue Department Malaysia, can use the results of this study in preparing for land and forest fires in the future.
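
    The AHP step described above reduces to a pairwise comparison matrix, its principal-eigenvector priority weights, and a consistency check; a NumPy sketch follows, with illustrative judgment values rather than the experts' actual comparisons.

        import numpy as np

        # A[i, j] = how much more important criterion i is than criterion j
        # (order: fuel, topography, human factors); values are illustrative.
        A = np.array([
            [1.0, 3.0, 5.0],
            [1 / 3, 1.0, 3.0],
            [1 / 5, 1 / 3, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                  # priority vector, sums to 1

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
        cr = ci / 0.58                            # random index for n = 3 is 0.58
        print(weights, cr)                        # CR < 0.1: acceptably consistent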

  19. The Value of Information - Accounting for a New Geospatial Paradigm

    NASA Astrophysics Data System (ADS)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, it is clearly their principal asset. The problem is exacerbated for public sector organizations: those that are information-centric rather than information-enabled are relatively few (statistics, archiving and mapping agencies are perhaps the only examples), so the issue is not at the top of the agenda for government. It is, however, a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates: it needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  20. Understanding needs and barriers to using geospatial tools for public health policymaking in China.

    PubMed

    Kim, Dohyeong; Zhang, Yingyuan; Lee, Chang Kil

    2018-05-07

    Despite growing popularity of using geographical information systems and geospatial tools in public health fields, these tools are only rarely implemented in health policy management in China. This study examines the barriers that could prevent policy-makers from applying such tools to actual managerial processes related to public health problems that could be assisted by such approaches, e.g. evidence-based policy-making. A questionnaire-based survey of 127 health-related experts and other stakeholders in China revealed that there is a consensus on the needs and demands for the use of geospatial tools, which shows that there is a more unified opinion on the matter than so far reported. Respondents pointed to lack of communication and collaboration among stakeholders as the most significant barrier to the implementation of geospatial tools. Comparison of survey results to those emanating from a similar study in Bangladesh revealed different priorities concerning the use of geospatial tools between the two countries. In addition, the follow-up in-depth interviews highlighted the political culture specific to China as a critical barrier to adopting new tools in policy development. Other barriers included concerns over the limited awareness of the availability of advanced geospatial tools. Taken together, these findings can facilitate a better understanding among policy-makers and practitioners of the challenges and opportunities for widespread adoption and implementation of a geospatial approach to public health policy-making in China.
