NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the high-resolution National Hydrography Dataset, the Watershed Boundary Dataset, and elevation from the 3D Elevation Program, and will provide an authoritative, high-precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of, water information.
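The reach-network indexing described above lends itself to simple graph computations. As an illustrative sketch only (the reach IDs, areas, and flow table below are invented, not NHDPlusHR structures), accumulating upstream catchment area over a to-node topology can be done recursively:

```python
# Hypothetical sketch: accumulating upstream drainage area over a reach
# network. Reach IDs, areas, and topology are invented for illustration.

def upstream_accumulation(local_area, downstream_of):
    """Return total upstream area (including local) for each reach.

    local_area: {reach_id: incremental catchment area}
    downstream_of: {reach_id: id of the reach immediately downstream, or None}
    """
    # Build the reverse topology: which reaches flow directly into each reach.
    inflows = {r: [] for r in local_area}
    for r, d in downstream_of.items():
        if d is not None:
            inflows[d].append(r)

    total = {}

    def accumulate(r):
        if r not in total:
            total[r] = local_area[r] + sum(accumulate(u) for u in inflows[r])
        return total[r]

    for r in local_area:
        accumulate(r)
    return total

# Two headwater reaches (1, 2) join at reach 3, which drains to outlet 4.
areas = {1: 5.0, 2: 3.0, 3: 2.0, 4: 1.0}
topo = {1: 3, 2: 3, 3: 4, 4: None}
print(upstream_accumulation(areas, topo))  # reach 4 accumulates 11.0
```

Maintaining feature identity across generalization levels, as the abstract describes, is what makes this kind of traversal stable when the network representation changes.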
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructure, reconnaissance-level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify hydropower opportunities that were previously overlooked. To enhance the characterization of higher-energy-density stream-reaches, this paper explored the sensitivity of geospatial resolution on the identification of hydropower stream-reaches using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulations were conducted at eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher-energy-density stream-reaches can be identified with increasing spatial resolution. Both Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be paid when selecting the discretization resolution for future hydropower resource assessments.
2016-01-07
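For background, assessments of this kind ultimately rest on the standard potential-power relation P = ρ g Q H η. A minimal sketch with invented flow, head, and efficiency values (these are not parameters of the GMM-HRA model):

```python
# Illustrative only: the reach-level power equation commonly used in
# reconnaissance hydropower assessment. The sample values and the
# efficiency figure are assumptions.

RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydropower_potential_kw(flow_m3s, head_m, efficiency=0.85):
    """Potential power P = rho * g * Q * H * eta, returned in kW."""
    return RHO * G * flow_m3s * head_m * efficiency / 1000.0

# A steep reach: 12 m^3/s of flow over a 20 m drop.
print(round(hydropower_potential_kw(12.0, 20.0), 1))  # 2001.2 kW
```

Doubling either flow or head doubles the potential, which is why steep, high-gradient reaches dominate energy-density rankings.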
Commercial observation satellites: broadening the sources of geospatial data
NASA Astrophysics Data System (ADS)
Baker, John C.; O'Connell, Kevin M.; Venzor, Jose A.
2002-09-01
Commercial observation satellites promise to broaden substantially the sources of imagery data available to potential users of geospatial data and related information products. We examine the new trend toward private firms acquiring and operating high-resolution imagery satellites. These commercial observation satellites build on the substantial experience in Earth observation operations provided by government-owned imaging satellites for civilian and military purposes. However, commercial satellites will require governments and companies to reconcile public and private interests in allowing broad public access to high-resolution satellite imagery data without creating national security risks or placing the private firms at a disadvantage compared with other providers of geospatial data.
Modeling photovoltaic diffusion: an analysis of geospatial datasets
NASA Astrophysics Data System (ADS)
Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert
2014-07-01
This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source, and house age were key variables that had not been previously explored in the literature, but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status has less of an intuitive connection to PV adoption, but these variables may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California, and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power to models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution (i.e., below the ZIP code level), since several geospatial variables with coarse native resolution become less useful for representing high resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative of other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state.
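The variable-screening idea in this abstract can be sketched simply: rank candidate predictors by absolute correlation with observed adoption and keep a small subset. The variable names and toy data below are invented, and the study's actual models are far richer:

```python
# A minimal sketch of predictor screening: keep the k geospatial variables
# most correlated (in absolute value) with adoption. Data are invented.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def top_predictors(candidates, adoption, k=2):
    """Return the k variable names most correlated with adoption."""
    ranked = sorted(candidates,
                    key=lambda v: -abs(pearson(candidates[v], adoption)))
    return ranked[:k]

adoption = [1, 2, 4, 8, 9]          # PV adoptions per toy ZIP code
candidates = {
    "rooms":     [3, 4, 6, 9, 10],  # tracks adoption closely
    "house_age": [50, 40, 30, 20, 10],
    "random":    [5, 1, 4, 1, 5],   # noise
}
print(top_predictors(candidates, adoption))  # ['rooms', 'house_age']
```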
NASA Astrophysics Data System (ADS)
Jordan, T. R.; Madden, M.; Sharma, J. B.; Panda, S. S.
2012-07-01
In an innovative collaboration among government, university, and private industry, researchers at the University of Georgia and Gainesville State College are working with Photo Science, Inc. to acquire, process, and quality-control check lidar and orthoimages of forest areas in the Southern Appalachian Mountains of the United States. Funded by the U.S. Geological Survey, this project meets the objectives of the ARRA initiative by creating jobs, preserving jobs, and training students for high-skill positions in geospatial technology. Leaf-off lidar data were acquired at 1-m resolution for the Tennessee portion of the Great Smoky Mountains National Park (GRSM) and the adjacent Foothills Parkway. This 1400-sq.-km area is of high priority for national and global interests due to its biodiversity, rare and endangered species, and protection of some of the last remaining virgin forest in the U.S. High-spatial-resolution (30 cm) leaf-off 4-band multispectral orthoimages also were acquired for both the Chattahoochee National Forest in north Georgia and the entire GRSM. The data are intended to augment the National Elevation Dataset and orthoimage database of The National Map with information that can be used by many researchers in applications of lidar point clouds, high-resolution DEMs, and orthoimage mosaics. Graduate and undergraduate students were involved at every stage of the workflow in order to provide them with high-level technical educational and professional experience in preparation for entering the geospatial workforce. This paper will present geospatial workflow strategies, multi-team coordination, distance-learning training, and industry-academia partnership.
Hu, Hao; Hong, Xingchen; Terstriep, Jeff; Liu, Yan; Finn, Michael P.; Rush, Johnathan; Wendel, Jeffrey; Wang, Shaowen
2016-01-01
Geospatial data, often embedded with geographic references, are important to many application and science domains, and represent a major type of big data. The increased volume and diversity of geospatial data have caused serious usability issues for researchers in various scientific domains, which call for innovative cyberGIS solutions. To address these issues, this paper describes a cyberGIS community data service framework to facilitate geospatial big data access, processing, and sharing based on a hybrid supercomputer architecture. Through collaboration between the CyberGIS Center at the University of Illinois at Urbana-Champaign (UIUC) and the U.S. Geological Survey (USGS), a community data service for accessing, customizing, and sharing digital elevation model (DEM) data and derived datasets from the 10-meter National Elevation Dataset, namely TopoLens, was created to demonstrate the workflow integration of geospatial big data sources, the computation and analysis needed to customize the original dataset for end-user needs, and a friendly online user environment. TopoLens provides online access to precomputed and on-demand computed high-resolution elevation data by exploiting the ROGER supercomputer. The usability of this prototype service has been acknowledged in community evaluation.
Evaluation of High Resolution Imagery and Elevation Data
2009-06-01
the value of cutting-edge geospatial tools while keeping the data constant, the present experiment evaluated the effect of higher resolution imagery...and elevation data while keeping the tools constant. The high resolution data under evaluation was generated from TEC’s Buckeye system, an...results. As researchers and developers provide increasingly advanced tools to process data more quickly and accurately, it is necessary to assess each
Situational Awareness Geospatial Application (iSAGA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sher, Benjamin
Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based or custom data source and display it geospatially; allows user-friendly spatial analysis using custom-developed tools; and searches complex Geographic Information System (GIS) databases and accesses high resolution imagery. iSAGA has application at the federal, state, and local levels of emergency response, consequence management, law enforcement, and emergency operations, and for other decision makers, as a tool to provide complete, visual situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.
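iSAGA's extraction algorithm is not described further here, but the general idea of pulling location data out of free-text feeds can be illustrated with a simple pattern match. The regular expression and feed text below are invented for illustration and are not part of iSAGA:

```python
# A toy version of the location-extraction idea: find decimal lat/lon pairs
# in free-text feed items and sanity-check their ranges. The pattern and
# the feed string are invented.
import re

COORD_RE = re.compile(r"(-?\d{1,2}\.\d+)\s*,\s*(-?\d{1,3}\.\d+)")

def extract_coords(text):
    """Return plausible (lat, lon) pairs found in a text snippet."""
    pairs = []
    for lat_s, lon_s in COORD_RE.findall(text):
        lat, lon = float(lat_s), float(lon_s)
        if -90 <= lat <= 90 and -180 <= lon <= 180:
            pairs.append((lat, lon))
    return pairs

feed = "Road closure reported near 34.0522, -118.2437; crew staged at 35.1, -117.9."
print(extract_coords(feed))  # [(34.0522, -118.2437), (35.1, -117.9)]
```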
Geospatial Image Mining For Nuclear Proliferation Detection: Challenges and New Opportunities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L; Cheriyadat, Anil M
2010-01-01
With the increasing understanding and availability of nuclear technologies, and the pursuit of nuclear technologies by several new countries, it is increasingly important to monitor nuclear proliferation activities. There is a great need for technologies to automatically or semi-automatically detect nuclear proliferation activities using remote sensing. Images acquired from earth observation satellites are an important source of information for detecting proliferation activities. High-resolution remote sensing images are highly useful in verifying the correctness, as well as the completeness, of any nuclear program. DOE national laboratories are interested in detecting nuclear proliferation by developing advanced geospatial image mining algorithms. In this paper we describe the current understanding of geospatial image mining techniques, enumerate key gaps, and identify future research needs in the context of nuclear proliferation.
NASA Astrophysics Data System (ADS)
Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.
2013-12-01
The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. 
eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial photography from 2011.
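Object-based classification can be illustrated in miniature: group adjacent above-threshold pixels into objects, then work with whole objects rather than individual pixels. The tiny image and threshold below are invented; real OBIA in eCognition uses richer spectral, spatial, textural, and contextual rules:

```python
# A heavily simplified object-based step: label 4-connected components of
# bright pixels, producing "objects" that later rules would classify.
# Image values and the brightness threshold are invented.

def segment(image, threshold):
    """Label 4-connected components of pixels whose value exceeds threshold."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and labels[r][c] == 0:
                current += 1
                stack = [(r, c)]
                while stack:  # flood fill one object
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and labels[y][x] == 0 and image[y][x] > threshold:
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

# 0 = vegetation-like, high values = bright impervious surface.
img = [
    [9, 9, 0, 0],
    [9, 9, 0, 8],
    [0, 0, 0, 8],
]
labels, n = segment(img, threshold=5)
print(n)       # 2 separate bright objects
print(labels)  # [[1, 1, 0, 0], [1, 1, 0, 2], [0, 0, 0, 2]]
```

Rule sets would then classify each object from its aggregate properties (mean brightness, shape, texture) instead of pixel by pixel.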
Integrated Geospatial Education and Technology Training for High School Age Youth (HiGETT)
NASA Astrophysics Data System (ADS)
Allen, J. E.
2012-12-01
The Landsat series of satellites provides high quality, consistent, 30 m resolution data for studies of landscape-scale change over time at no cost to the user. The availability of the Landsat data archive and the effectiveness and ease of its use to solve practical societal problems, particularly integrated with Geographic Information Systems (GIS), has been a key factor in a movement to bring remote sensing education to community colleges (as in the "iGETT" program funded by the National Science Foundation, 2007-2011) and now to younger students of high school age. "Integrated Geospatial Education and Technology Training for High School Age Youth (HiGETT)" was a two-day meeting convened April 4-5, 2011 to explore and articulate effective means of reaching teens with geospatial technology education and career awareness. Participants represented industry, government, academia, and informal education organizations such as 4-H and Girl Scouts. This poster will summarize a report on that meeting.
2006-06-01
angle Imaging SpectroRadiometer MODIS Moderate Resolution Imaging Spectroradiometer NGA National Geospatial Intelligence Agency POI Principles of...and µ , the cosine of the viewing zenith angle and the effect of the variation of each of these variables on total optical depth. Extraterrestrial ...Eq. (34). Additionally, solar zenith angle also plays a role in the third term on the RHS of Eq. (34) by modifying extraterrestrial spectral solar
Solar and Wind Resource Assessments for Afghanistan and Pakistan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Renne, D. S.; Kelly, M.; Elliott, D.
2007-01-01
The U.S. National Renewable Energy Laboratory (NREL) has recently completed the production of high-resolution wind and solar energy resource maps and related data products for Afghanistan and Pakistan. The resource data have been incorporated into a geospatial toolkit (GsT), which allows the user to manipulate the resource information along with country-specific geospatial information such as highway networks, power facilities, transmission corridors, protected land areas, etc. The toolkit allows users to then transfer resource data for specific locations into NREL's micropower optimization model known as HOMER.
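For context, wind resource maps of this kind are commonly expressed as wind power density, the average of 0.5 ρ v³ over observed speeds. A minimal sketch with an assumed sea-level air density and invented speeds:

```python
# Illustrative wind power density calculation. The density formula is
# standard; the sample wind speeds and air density are assumptions.

RHO_AIR = 1.225  # sea-level air density, kg/m^3

def wind_power_density(speeds_m_s):
    """Mean wind power density in W/m^2: average of 0.5 * rho * v^3."""
    return 0.5 * RHO_AIR * sum(v ** 3 for v in speeds_m_s) / len(speeds_m_s)

# Hourly mean speeds for a few hours at a windy site.
print(round(wind_power_density([6.0, 8.0, 7.0]), 1))  # 218.7 W/m^2
```

Because power scales with the cube of speed, averaging the cubes (rather than cubing the average speed) is essential to a correct resource estimate.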
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; Graham, William D.
2007-01-01
In the aftermath of Hurricane Katrina and in response to the needs of SSC (Stennis Space Center), NASA required the generation of decision support products with a broad range of geospatial inputs. Applying a systems engineering approach, the NASA ARTPO (Applied Research and Technology Project Office) at SSC evaluated the Center's requirements and source data quality. ARTPO identified data and information products that had the potential to meet decision-making requirements; included were remotely sensed data ranging from high-spatial-resolution aerial images through high-temporal-resolution MODIS (Moderate Resolution Imaging Spectroradiometer) products. Geospatial products, such as FEMA's (Federal Emergency Management Agency's) Advisory Base Flood Elevations, were also relevant. Where possible, ARTPO applied SSC calibration/validation expertise to both clarify the quality of various data source options and to validate that the inputs that were finally chosen met SSC requirements. ARTPO integrated various information sources into multiple decision support products, including two maps: Hurricane Katrina Inundation Effects at Stennis Space Center (highlighting surge risk posture) and Vegetation Change In and Around Stennis Space Center: Katrina and Beyond (highlighting fire risk posture).
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. 
In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
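The conservative interpolation mentioned above has a simple one-dimensional analogue: destination cell values are overlap-weighted averages of source cells, so the integral of the field is preserved. The grids below are invented; ESMF implements the multidimensional, unstructured-grid version:

```python
# A 1-D sketch of conservative regridding: each destination cell takes the
# overlap-length-weighted average of the source cells it intersects, which
# preserves the integral of the field. Grids and values are invented.

def conservative_regrid(src_edges, src_vals, dst_edges):
    """Regrid cell-averaged values conservatively from src to dst cells."""
    dst_vals = []
    for d0, d1 in zip(dst_edges, dst_edges[1:]):
        total = 0.0
        for (s0, s1), v in zip(zip(src_edges, src_edges[1:]), src_vals):
            overlap = max(0.0, min(d1, s1) - max(d0, s0))
            total += v * overlap
        dst_vals.append(total / (d1 - d0))
    return dst_vals

src = [0.0, 1.0, 2.0]        # two source cells
vals = [10.0, 30.0]
dst = [0.0, 0.5, 1.5, 2.0]   # three destination cells
out = conservative_regrid(src, vals, dst)
print(out)  # [10.0, 20.0, 30.0]
# Integral is conserved: 10*0.5 + 20*1.0 + 30*0.5 == 10*1 + 30*1 == 40
```

The chunked approach described in the abstract applies the same operation to spatial subsets so that no single task must hold the full high-resolution grid in memory.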
NASA Astrophysics Data System (ADS)
Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei
2018-04-01
Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and accurately detecting those objects is a critical problem. Owing to the powerful feature extraction and representation capability of deep learning, integrated frameworks that combine deep-learning-based region proposal generation with object detection have greatly promoted the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, because the convolution operations of a convolutional neural network (CNN) are translation-invariant, the classification stage is seldom affected, but the localization accuracy of the predicted bounding boxes in the detection stage is easily degraded. This dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage has not been addressed for HSR remote sensing imagery, and it limits the positional accuracy of multi-class geospatial object detection with region proposal generation and object detection. To further improve the performance of the integrated framework for HSR remote sensing imagery object detection, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection from HSR remote sensing imagery. The proposed PSB framework takes full advantage of the fully convolutional network (FCN), on the basis of a residual network, to resolve the dilemma between translation-invariance in the classification stage and translation-variance in the object detection stage.
In addition, a pre-training mechanism is utilized to accelerate the training procedure and increase the robustness of the proposed algorithm. The proposed algorithm is validated with a publicly available 10-class object detection dataset.
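The localization accuracy discussed above is conventionally scored with intersection-over-union (IoU) between predicted and ground-truth bounding boxes. A standard computation, with invented boxes:

```python
# Standard IoU for axis-aligned boxes given as (x0, y0, x1, y1).
# The example boxes are invented; detection benchmarks typically count a
# prediction as correct when IoU exceeds a threshold such as 0.5.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

truth = (0, 0, 10, 10)
pred = (5, 0, 15, 10)    # shifted by half a box width
print(iou(truth, pred))  # 0.3333... -> a miss at the usual 0.5 threshold
```

This is exactly why translation-variance matters in the detection stage: a classifier can still recognize the shifted object while the box itself fails the IoU criterion.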
Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping
NASA Astrophysics Data System (ADS)
Tampubolon, W.; Reinhardt, W.
2015-03-01
Normally, in order to provide high-resolution three-dimensional (3D) geospatial data, large scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially during legal administration procedures, i.e., security clearance from the military/defense ministry. This often causes additional time delays besides technical constraints such as weather and limited aircraft availability for airborne campaigns. Of course, geospatial data quality is an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to achieve the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, which is expressed by the term "Rapid Mapping". In this paper we present first results of on-going research to integrate different data sources such as space borne radar and optical platforms. Initially, the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e., SPOT-6, has been done as a continuous process to the DEM generation using TerraSAR-X/TanDEM-X data. The role of Ground Control Points (GCPs) from GNSS surveys is mandatory in order to fulfil geometrical accuracy. In addition, this research aims at providing a suitable processing algorithm for space borne data for large scale topographical mapping, as described in section 3.2. Recently, radar space borne data have been used for medium scale topographical mapping, e.g., at the 1:50.000 map scale, in Indonesian territories. The goal of this on-going research is to increase the accuracy of remote sensing data through different activities, e.g.,
the integration of different data sources (optical and radar) or the usage of GCPs in both the optical and the radar satellite data processing. Finally, these results will be used in the future as a reference for further geospatial data acquisitions to support topographical mapping at even larger scales, up to the 1:10.000 map scale.
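The geometric-accuracy role of GCPs can be made concrete with the usual check: root-mean-square error of residuals between GNSS-surveyed coordinates and their positions in the orthorectified product. The coordinates below are invented:

```python
# Horizontal RMSE over check points: a standard accuracy measure for
# orthorectified imagery. Surveyed and measured coordinates are invented.
from math import sqrt

def horizontal_rmse(gcps, image_pts):
    """RMSE of planimetric residuals between surveyed and image coordinates."""
    sq = [(gx - ix) ** 2 + (gy - iy) ** 2
          for (gx, gy), (ix, iy) in zip(gcps, image_pts)]
    return sqrt(sum(sq) / len(sq))

surveyed = [(100.0, 200.0), (150.0, 260.0), (90.0, 300.0)]  # GNSS, metres
measured = [(101.0, 200.0), (150.0, 258.0), (90.0, 301.0)]  # from the image
print(round(horizontal_rmse(surveyed, measured), 2))  # 1.41 m
```

Map-accuracy standards typically require the RMSE to stay under a fraction of the target scale's plottable error, which is why denser or better-distributed GCPs enable larger-scale mapping.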
Effective environmental stewardship requires timely geospatial information about ecology and
environment for informed environmental decision support. Unprecedented public access to high resolution
imagery from earth-looking sensors via online virtual earth browsers ...
NASA Astrophysics Data System (ADS)
Mongeau, R.; Baudouin, Y.; Cavayas, F.
2017-10-01
Ville de Montreal wanted to develop a system to identify heat islands and microparticles at the urban scale and to study their formation. UQAM and UdeM universities have joined their expertise under the framework "Observatoire Spatial Urbain" to create a representative geospatial database of thermal and atmospheric parameters collected during the summer months. They innovated in the development of a methodology for processing high resolution hyperspectral images (1-2 m). In partnership with Ville de Montreal, they integrated 3D geospatial data (topography, transportation and meteorology) in the process. The 3D mapping of intraurban heat islands as well as air micro-particles makes it possible, initially, to identify the problematic situations for future civil protection interventions during extreme heat. Moreover, it will be used as a reference for the Ville de Montreal to establish a strategy for public domain tree planting and in the analysis of urban development projects.
NASA Astrophysics Data System (ADS)
Trofymow, J. A.; Gougeon, F.
2015-12-01
After forest harvest, significant amounts of woody residues are left dispersed on site, and some are subsequently piled and burned. Quantification of residues is required for estimating C budgets, billable waste, harvest efficiency, bioenergy potential, and smoke emissions. Trofymow et al. (2014, CJFR) compared remote sensing methods to ground-based waste and residue survey (WRS) methods for residue piles in 4 cutblocks in the Oyster River (OR) area in coastal BC. Compared to geospatial methods using 15 cm orthophotos and LiDAR acquired in 2011 by helicopter, the WRS method underestimated pile wood by 30% to 50%, while a USFS volume method overestimated pile wood by 50% if site-specific packing ratios were not used. A geospatial method was developed in PCI Geomatica to analyze 2-bit images of logs >15 cm in diameter to determine dispersed wood residues in OR and compare to WRS methods. Across blocks, geospatial and WRS method wood volumes were correlated (R2=0.69); however, volumes were 2.5 times larger for the geospatial vs. the WRS method. Methods for dispersed residues could not be properly compared, as individual WRS plots were not georeferenced, only 12 plots were sampled in total, and low-resolution images poorly resolved logs. Thus, a new study in 2 cutblocks in the Northwest Bay (NWB) area acquired 2 cm resolution RGB air-photography in 2014-15 using an Aeryon SkyRanger UAV prior to and after burn pile construction. A total of 57 dispersed WRS plots and 24 WRS pile or accumulation plots were georeferenced and measured. Stereo pairs were used to generate point clouds for pile bulk volumes. Images processed to 8-bit grey scale are being analyzed with a revised PCI method that better accounts for log overlaps. WRS methods depend on a good sample of plots and accurate determination of stratum (dispersed, roadside, piles, accumulations) areas. Analysis of NWB blocks shows WRS field methods for stratum area differ by 5-20% from that determined using orthophotos.
Plot-level wood volumes in each plot and stratum determined by geospatial and WRS methods will be compared. While geospatial methods for residue determination are a 100% sample, compared to the sample-based WRS method, difficulties in resolving logs in the images may mean the best method for determining residues requires a combination of geospatial and ground plot measurements.
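For reference, ground surveys of this kind typically convert individual log measurements to volume with a standard formula such as Smalian's (the mean of the two end cross-sectional areas times length). The sample log below is invented:

```python
# Smalian's formula for log volume: average the two end areas and multiply
# by length. The formula is standard; the example log is invented.
from math import pi

def smalian_volume(d_small_m, d_large_m, length_m):
    """Log volume (m^3) as the mean of the two end areas times length."""
    a1 = pi * (d_small_m / 2.0) ** 2
    a2 = pi * (d_large_m / 2.0) ** 2
    return (a1 + a2) / 2.0 * length_m

# A 6 m residue log tapering from 30 cm down to 20 cm diameter.
print(round(smalian_volume(0.20, 0.30, 6.0), 3))  # 0.306 m^3
```

Summing such per-log volumes over a plot, then expanding by stratum area, is what makes accurate stratum areas so consequential for the WRS totals discussed above.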
NASA Astrophysics Data System (ADS)
Jawak, Shridhar D.; Luis, Alvarinho J.
2016-04-01
An accurate spatial mapping and characterization of land cover features in cryospheric regions is an essential procedure for many geoscientific studies. A novel semi-automated method was devised by coupling spectral index ratios (SIRs) and geographic object-based image analysis (OBIA) to extract cryospheric geospatial information from very high resolution WorldView-2 (WV-2) satellite imagery. The present study addresses the development of multiple rule sets for OBIA-based classification of WV-2 imagery to accurately extract land cover features in the Larsemann Hills, east Antarctica. A multilevel segmentation process was applied to the WV-2 image to generate different sizes of geographic image objects corresponding to various land cover features with respect to scale parameter. Several SIRs were applied to geographic objects at different segmentation levels to classify land mass, man-made features, snow/ice, and water bodies. We focus on the water body class to identify water areas at the image level, considering their uneven appearance on landmass and ice. The results illustrated that synergetic usage of SIRs and OBIA can provide accurate means to identify land cover classes, with an overall classification accuracy of ≈97%. In conclusion, our results suggest that OBIA is a powerful tool for carrying out automatic and semiautomatic analysis for most cryospheric remote-sensing applications, and its synergetic coupling with pixel-based SIRs is found to be a superior method for mining geospatial information.
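A spectral index ratio of the kind used here can be illustrated with an NDWI-style green/near-infrared ratio for flagging water. The band values and the 0.3 threshold below are invented; the study's actual rule sets are tuned to WorldView-2 bands and segmentation levels:

```python
# An NDWI-style spectral index ratio: water reflects green light but absorbs
# near-infrared, so (G - NIR) / (G + NIR) is high over water. The threshold
# and sample band values are invented.

def ndwi(green, nir):
    """Normalized difference water index for one pixel (or object mean)."""
    return (green - nir) / (green + nir)

def is_water(green, nir, threshold=0.3):
    """Classify as water when the index exceeds an assumed threshold."""
    return ndwi(green, nir) > threshold

print(is_water(green=80, nir=20))  # True: water-like spectrum
print(is_water(green=60, nir=90))  # False: vegetated land (high NIR)
```

In the OBIA setting the same ratio is evaluated on object means rather than single pixels, which suppresses the salt-and-pepper noise of per-pixel thresholding.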
Garcia, Adriana; Masbruch, Melissa D.; Susong, David D.
2014-01-01
The U.S. Geological Survey, as part of the Department of the Interior’s WaterSMART (Sustain and Manage America’s Resources for Tomorrow) initiative, compiled published estimates of groundwater discharge to streams in the Upper Colorado River Basin as a geospatial database. For the purpose of this report, groundwater discharge to streams is the baseflow portion of streamflow that includes contributions of groundwater from various flow paths. Reported estimates of groundwater discharge were assigned as attributes to stream reaches derived from the high-resolution National Hydrography Dataset. A total of 235 estimates of groundwater discharge to streams were compiled and included in the dataset. Feature class attributes of the geospatial database include groundwater discharge (acre-feet per year), method of estimation, citation abbreviation, defined reach, and 8-digit hydrologic unit code(s). Baseflow index (BFI) estimates of groundwater discharge were calculated using an existing streamflow characteristics dataset and were included as an attribute in the geospatial database. A comparison of the BFI estimates to the compiled estimates of groundwater discharge found that the BFI estimates were greater than the reported groundwater discharge estimates.
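The baseflow index (BFI) values above come from an existing streamflow-characteristics dataset, but the underlying idea can be sketched with a one-parameter digital filter of the Lyne-Hollick form: split the hydrograph into quickflow and baseflow, then take BFI as baseflow volume over total flow. The alpha value and hydrograph below are invented:

```python
# An illustrative one-parameter baseflow filter (Lyne-Hollick form).
# The filter parameter and the sample hydrograph are invented; this is a
# sketch of the concept, not the USGS procedure behind the dataset.

def baseflow_index(q, alpha=0.925):
    """BFI = baseflow volume / total flow for a daily flow series q."""
    quick = 0.0
    base = [q[0]]  # assume the record starts at pure baseflow
    for i in range(1, len(q)):
        # Quickflow responds to the change in total flow; it decays by alpha.
        quick = alpha * quick + (1 + alpha) / 2.0 * (q[i] - q[i - 1])
        base.append(q[i] - max(quick, 0.0))
    # Constrain baseflow to 0 <= baseflow <= streamflow.
    base = [min(max(b, 0.0), f) for b, f in zip(base, q)]
    return sum(base) / sum(q)

# A storm peak riding on a steady baseflow of about 10 units.
flow = [10, 10, 50, 30, 20, 15, 12, 11, 10, 10]
print(round(baseflow_index(flow), 2))
```

A groundwater-dominated reach keeps a high BFI through storms, which is why BFI serves as a proxy for the groundwater discharge compiled in the report.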
2005-09-30
Barbara, Newport, and San Diego Harbor. The MSI instrumentation was provided under a cooperative agreement with the Applanix Systems Integration Group...part of the Applanix Position and Orientation System (POS AV; http://www.applanix.com/products/posav_index.php), which was being evaluated as a
2007-03-01
instrumentation was provided under a cooperative agreement with the Applanix Systems Integration Group (ASIG), a subsidiary of the Trimble Corporation. This MSI...system (Digital Sensor System; http://www.applanix.com/products/dss index.php) was provided as part of the Applanix Position and Orientation System (POS
While the spatial heterogeneity of many aquatic ecosystems is acknowledged, rivers are often mistakenly described as homogeneous and well mixed. The collection and visualization of attributes like water quality are key to our perception and management of these ecosystems. The ass...
Refining FIA plot locations using LiDAR point clouds
Charlie Schrader-Patton; Greg C. Liknes; Demetrios Gatziolis; Brian M. Wing; Mark D. Nelson; Patrick D. Miles; Josh Bixby; Daniel G. Wendt; Dennis Kepler; Abbey Schaaf
2015-01-01
Forest Inventory and Analysis (FIA) plot location coordinate precision is often insufficient for use with high resolution remotely sensed data, thereby limiting the use of these plots for geospatial applications and reducing the validity of models that assume the locations are precise. A practical and efficient method is needed to improve coordinate precision. To...
Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays
NASA Astrophysics Data System (ADS)
Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko
The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset, with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra-high-resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real time with, and was guided by, researchers in the remote visual analytics laboratory, swiftly sifting through incoming crowdsourced data to identify target locations flagged as viable archaeological sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Umakant; Drewniak, Beth; Jastrow, Julie D.
Soil properties such as soil organic carbon (SOC) stocks and active-layer thickness are used in earth system models (ESMs) to predict anthropogenic and climatic impacts on soil carbon dynamics, future changes in atmospheric greenhouse gas concentrations, and associated climate changes in the permafrost regions. Accurate representation of the spatial and vertical distribution of these soil properties in ESMs is a prerequisite for reducing existing uncertainty in predicting carbon-climate feedbacks. We compared the spatial representation of SOC stocks and active-layer thicknesses predicted by the Coupled Model Intercomparison Project Phase 5 (CMIP5) ESMs with those from geospatial predictions based on observation data for the state of Alaska, USA. For the geospatial modeling, we used soil profile observations (585 for SOC stocks and 153 for active-layer thickness) and environmental variables (climate, topography, land cover, and surficial geology types) and generated fine-resolution (50-m spatial resolution) predictions of SOC stocks (to 1-m depth) and active-layer thickness across Alaska. We found a large inter-quartile range (2.5-5.5 m) in predicted active-layer thickness of CMIP5 modeled results and a small inter-quartile range (11.5-22 kg m-2) in predicted SOC stocks. The spatial coefficients of variability of active-layer thickness and SOC stocks were lower in CMIP5 predictions compared to our geospatial estimates when gridded at similar spatial resolutions (24.7 compared to 30% and 29 compared to 38%, respectively). However, prediction errors, when calculated for independent validation sites, were several times larger in ESM predictions compared to geospatial predictions. Primary factors leading to the observed differences were (1) lack of spatial heterogeneity in ESM predictions, (2) differences in assumptions concerning environmental controls, and (3) the absence of pedogenic processes in ESM model structures. Our results suggest that efforts to incorporate these factors in ESMs should reduce current uncertainties associated with ESM predictions of carbon-climate feedbacks.
Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India
NASA Astrophysics Data System (ADS)
Mohan, M.
2016-06-01
In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed with geospatial technologies that play important roles in the development of smart cities, particularly in developing countries of the world like India. The study is based on the latest multi-date, multi-stage, multi-sensor, and multi-resolution geospatial satellite imagery available. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping, and geospatial analysis for the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time, and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional, and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.
OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Web Browser
NASA Astrophysics Data System (ADS)
Christen, M.
2016-06-01
Providing worldwide high resolution data for virtual globes involves compute- and storage-intensive tasks for processing data. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and for providing 2D and 3D map data to a large number of (mobile) web clients. In this paper the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2" is shown, which displays 3D geodata on nearly every device.
NASA Astrophysics Data System (ADS)
Flinders, Ashton F.; Mayer, Larry A.; Calder, Brian A.; Armstrong, Andrew A.
2014-05-01
We document a new high-resolution multibeam bathymetry compilation for the Canada Basin and Chukchi Borderland in the Arctic Ocean - United States Arctic Multibeam Compilation (USAMBC Version 1.0). The compilation preserves the highest native resolution of the bathymetric data, allowing for more detailed interpretation of seafloor morphology than has been previously possible. The compilation was created from multibeam bathymetry data available through openly accessible government and academic repositories. Much of the new data was collected during dedicated mapping cruises in support of the United States effort to map extended continental shelf regions beyond the 200 nm Exclusive Economic Zone. Data quality was evaluated using nadir-beam crossover-error statistics, making it possible to assess the precision of multibeam depth soundings collected from a wide range of vessels and sonar systems. Data were compiled into a single high-resolution grid through a vertical stacking method, preserving the highest quality data source in any specific grid cell. The crossover-error analysis and method of data compilation can be applied to other multi-source multibeam data sets, and is particularly useful for government agencies targeting extended continental shelf regions but with limited hydrographic capabilities. Both the gridded compilation and an easily distributed geospatial PDF map are freely available through the University of New Hampshire's Center for Coastal and Ocean Mapping (ccom.unh.edu/theme/law-sea). The geospatial PDF is a full-resolution, small-file-size product that supports interpretation of Arctic seafloor morphology without the need for specialized gridding/visualization software.
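The per-cell "vertical stacking" described above can be sketched simply: order the source grids best quality first, and let each output cell take its value from the highest-ranked grid that covers it. This is a sketch of the idea, not the USAMBC production code; NaN marks cells with no data.

```python
import numpy as np

def stack_grids(grids):
    """Vertical stacking: grids ordered best quality first; each output
    cell keeps the value from the first grid that covers it (NaN = gap)."""
    out = np.full(grids[0].shape, np.nan)
    for g in grids:
        # Only fill cells still empty, so earlier (better) grids win
        fill = np.isnan(out) & ~np.isnan(g)
        out[fill] = g[fill]
    return out

# The higher-quality survey wins where both overlap; the lower-quality
# grid only fills gaps.
best = np.array([[100.0, np.nan], [np.nan, np.nan]])
worse = np.array([[90.0, 120.0], [130.0, np.nan]])
merged = stack_grids([best, worse])
```

Because ranking happens per cell, a single low-quality but wide-coverage survey can still contribute everywhere the dedicated mapping cruises did not reach.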
2006-11-01
29 3.2.4 National Register Information System Model ............................................................... 30 3.3 Summary of...are later based on that information . Despite their general level of power and resolution, Federal data management and accounting tools have not yet...have begun tracking their historic building and structure inven- tories using geographic information systems (GISs). A geospatial-referenced data
Ghandehari, Masoud; Emig, Thorsten; Aghamohamadnia, Milad
2018-02-02
Despite decades of research seeking to derive the urban energy budget, the dynamics of thermal exchange in the densely constructed environment are not yet well understood. Using New York City as a study site, we present a novel hybrid experimental-computational approach for a better understanding of the radiative heat transfer in complex urban environments. The aim of this work is to contribute to the calculation of the urban energy budget, particularly the stored energy. We will focus our attention on surface thermal radiation. Improved understanding of urban thermodynamics incorporating the interaction of various bodies, particularly in high rise cities, will have implications for energy conservation at the building scale and for human health and comfort at the urban scale. The platform presented is based on longwave hyperspectral imaging of nearly 100 blocks of Manhattan, in addition to a geospatial radiosity model that describes the collective radiative heat exchange between multiple buildings. Despite assumptions in surface emissivity and thermal conductivity of building walls, the close comparison of temperatures derived from measurements and computations is promising. Results imply that the presented geospatial thermodynamic model of urban structures can enable accurate and high resolution analysis of instantaneous urban surface temperatures.
John Hogland; Nathaniel Anderson; Woodam Chung
2018-01-01
Adequate biomass feedstock supply is an important factor in evaluating the financial feasibility of alternative site locations for bioenergy facilities and for maintaining profitability once a facility is built. We used newly developed spatial analysis and logistics software to model the variables influencing feedstock supply and to estimate and map two components of...
NASA Astrophysics Data System (ADS)
Belica, L.; Mitasova, H.; Caldwell, P.; McCarter, J. B.; Nelson, S. A. C.
2017-12-01
Thermal regimes of forested headwater streams continue to be an area of active research as climatic, hydrologic, and land cover changes can influence water temperature, a key aspect of aquatic ecosystems. Widespread monitoring of stream temperatures has provided an important data source, yielding insights on the temporal and spatial patterns and the underlying processes that influence stream temperature. However, small forested streams remain challenging to model due to the high spatial and temporal variability of stream temperatures and the climatic and hydrologic conditions that drive them. Technological advances and increased computational power continue to provide new tools and measurement methods and have allowed spatially explicit analyses of dynamic natural systems at greater temporal resolutions than previously possible. With the goal of understanding how current stream temperature patterns and processes may respond to changing land cover and hydroclimatological conditions, we combined high-resolution, spatially explicit geospatial modeling with deterministic heat flux modeling approaches using data sources that ranged from traditional hydrological and climatological measurements to emerging remote sensing techniques. Initial analyses of stream temperature monitoring data revealed that high temporal resolution (5 minutes) and measurement resolution (<0.1°C) were needed to adequately describe diel stream temperature patterns and capture the differences between paired 1st order and 4th order forest streams draining north and south facing slopes. This finding, along with geospatial models of subcanopy solar radiation and channel morphology, was used to develop hypotheses and guide field data collection for further heat flux modeling. By integrating multiple approaches and optimizing data resolution for the processes being investigated, small, but ecologically significant differences in stream thermal regimes were revealed. 
In this case, multi-approach research contributed to the identification of the dominant mechanisms driving stream temperature in the study area and advanced our understanding of the current thermal fluxes and how they may change as environmental conditions change in the future.
Experiences with Acquiring Highly Redundant Spatial Data to Support Driverless Vehicle Technologies
NASA Astrophysics Data System (ADS)
Koppanyi, Z.; Toth, C. K.
2018-05-01
As vehicle technology is moving towards higher autonomy, the demand for highly accurate geospatial data is rapidly increasing, as accurate maps have huge potential to increase safety. In particular, high-definition 3D maps, including road topography and infrastructure, as well as city models along transportation corridors, represent the necessary support for driverless vehicles. In this effort, a vehicle equipped with high-, medium- and low-resolution active and passive cameras acquired data in a typical traffic environment, represented here by the OSU campus, where GPS/GNSS data are available along with other navigation sensor data streams. The data streams can be used for two purposes. First, high-definition 3D maps can be created by integrating all the sensory data, and Data Analytics/Big Data methods can be tested for automatic object space reconstruction. Second, the data streams can support algorithmic research for driverless vehicle technologies, including object avoidance, navigation/positioning, detecting pedestrians and bicyclists, etc. Crucial cross-performance analyses of map database resolution and accuracy with respect to sensor performance metrics can be derived to achieve an economical solution for accurate driverless vehicle positioning. These, in turn, could provide essential information for optimizing the choice of geospatial map databases and sensor quality to support driverless vehicle technologies. The paper reviews the data acquisition and primary data processing challenges and performance results.
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyber infrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open standards based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. 
Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches needed to improve the application of climate science research results to urban climate resilience.
Rea, Alan; Skinner, Kenneth D.
2012-01-01
The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. The geospatial datasets used to delineate and characterize watersheds on the StreamStats website, and the methods used to develop the datasets are described in this report. The datasets for Hawaii were derived primarily from 10 meter resolution National Elevation Dataset (NED) elevation models, and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.
Planning and Management of Real-Time Geospatial UAS Missions Within a Virtual Globe Environment
NASA Astrophysics Data System (ADS)
Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.
2011-09-01
This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the Virtual Globe Technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.
NASA Astrophysics Data System (ADS)
Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.
2014-04-01
Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, that is, their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for large countries covering more than several million square kilometers. This paper presents the research and technological development that support national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of a continuous updating workflow. The use of many data sources, and the integration of these data to form a high-accuracy, quality-checked product, was required. This in turn required up-to-date techniques for image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image-matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic places in the country.
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
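The raster-specific adaptation the passage alludes to is commonly handled by inverse mapping: instead of forward-projecting source points (which leaves holes in the output raster), each output cell center is mapped back through the inverse projection and the nearest source cell is sampled. A minimal sketch with a caller-supplied inverse transform follows; the grid layout conventions and the identity transform used in the demo are assumptions for illustration, not MapIMG's actual implementation.

```python
def reproject_raster(src, src_origin, src_cell, dst_shape,
                     dst_origin, dst_cell, inverse):
    """Inverse-mapping raster reprojection with nearest-neighbor
    sampling. `inverse` maps projected (x, y) back to source
    coordinates; cells with no source coverage stay None."""
    rows, cols = dst_shape
    out = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Center of the destination cell in projected coordinates
            x = dst_origin[0] + (c + 0.5) * dst_cell
            y = dst_origin[1] - (r + 0.5) * dst_cell
            sx, sy = inverse(x, y)
            # Nearest source cell (origin is the grid's upper-left corner)
            sc = int((sx - src_origin[0]) / src_cell)
            sr = int((src_origin[1] - sy) / src_cell)
            if 0 <= sr < len(src) and 0 <= sc < len(src[0]):
                out[r][c] = src[sr][sc]
    return out

# With an identity "projection" the raster is reproduced unchanged.
src = [[1, 2], [3, 4]]
same = reproject_raster(src, (0.0, 2.0), 1.0, (2, 2),
                        (0.0, 2.0), 1.0, lambda x, y: (x, y))
```

Nearest-neighbor sampling also preserves categorical values (land cover classes, for example), which is why it suits the categorical raster datasets discussed above better than interpolating resamplers.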
Delparte, D; Gates, RD; Takabayashi, M
2015-01-01
The structural complexity of coral reefs plays a major role in the biodiversity, productivity, and overall functionality of reef ecosystems. Conventional metrics with 2-dimensional properties are inadequate for characterization of reef structural complexity. A 3-dimensional (3D) approach can better quantify topography, rugosity and other structural characteristics that play an important role in the ecology of coral reef communities. Structure-from-Motion (SfM) is an emerging low-cost photogrammetric method for high-resolution 3D topographic reconstruction. This study utilized SfM 3D reconstruction software tools to create textured mesh models of a reef at French Frigate Shoals, an atoll in the Northwestern Hawaiian Islands. The reconstructed orthophoto and digital elevation model were then integrated with geospatial software in order to quantify metrics pertaining to 3D complexity. The resulting data provided high-resolution physical properties of coral colonies that were then combined with live cover to accurately characterize the reef as a living structure. The 3D reconstruction of reef structure and complexity can be integrated with other physiological and ecological parameters in future research to develop reliable ecosystem models and improve capacity to monitor changes in the health and function of coral reef ecosystems. PMID:26207190
GSKY: A scalable distributed geospatial data server on the cloud
NASA Astrophysics Data System (ADS)
Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben
2017-04-01
Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is infeasible, and so such data manipulation must be performed on the fly using efficient, high performance techniques. Ideally this should be performed using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows, handling different file formats and data types or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. 
This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections either locally or remotely by extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready accessibility for users of the data via Web Map Services (WMS), Web Processing Services (WPS) or raw data arrays using Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
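The OGC-standard interface such a server exposes can be illustrated by constructing a WMS 1.3.0 GetMap request, the kind of call a client issues to render a map tile on demand. The endpoint URL, layer name, and time value below are placeholders for illustration, not a real GSKY deployment.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width, height,
                   time=None, crs="EPSG:4326", fmt="image/png"):
    """Assemble an OGC WMS 1.3.0 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    if time is not None:
        params["TIME"] = time  # temporal dimension for time-enabled layers
    return base + "?" + urlencode(params)

# Hypothetical server and layer, for illustration only.
url = wms_getmap_url("https://example.org/ows", "landsat8:ndvi",
                     (-44.0, 110.0, -10.0, 155.0), 512, 512,
                     time="2016-01-01T00:00:00Z")
```

Because every rendering parameter travels in the request, the server can reproject, resample, and composite on the fly without storing a pre-rendered copy, which is the design point the abstract emphasizes.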
Archaeological Remote Sensing: Searching for Fort Clatsop from Space
NASA Technical Reports Server (NTRS)
Karsmizki, Kenneth W.; Spruce, Joe; Giardino, Marco
2002-01-01
The Columbia Gorge Discovery Center and NASA's Stennis Space Center have teamed up to use high-resolution aerial and satellite-based remote sensing in the search for Lewis and Clark expedition campsites. A Space Act Agreement between NASA and the Discovery Center has evolved into a study that employs remote sensing, plus modern and historical map data for relocating several Lewis and Clark encampments. Satellite data being studied include 30-meter Landsat Thematic Mapper and 1-meter Space Imaging IKONOS data. This paper includes an overview of the working relationship between NASA and the Discovery Center. It also reports on geospatial analyses of the Fort Clatsop site to demonstrate the ways geospatial technologies interface with the written and cartographic records of the expedition and how they are applied to the search for Lewis and Clark campsites.
NASA Astrophysics Data System (ADS)
Langford, Z. L.; Kumar, J.; Hoffman, F. M.
2015-12-01
Observations indicate that over the past several decades, landscape processes in the Arctic have been changing or intensifying. A dynamic Arctic landscape has the potential to alter ecosystems across a broad range of scales. Accurate characterization is useful to understand the properties and organization of the landscape, optimal sampling network design, measurement and process upscaling, and to establish a landscape-based framework for multi-scale modeling of ecosystem processes. This study seeks to delineate the landscape at the Seward Peninsula of Alaska into ecoregions using large volumes (terabytes) of high spatial resolution satellite remote-sensing data. Defining high-resolution ecoregion boundaries is difficult because many ecosystem processes in Arctic ecosystems occur at small local to regional scales, which are often unresolved by coarse-resolution satellites (e.g., MODIS). We seek to use data-fusion techniques and data analytics algorithms applied to Phased Array type L-band Synthetic Aperture Radar (PALSAR), Interferometric Synthetic Aperture Radar (IFSAR), Satellite for Observation of Earth (SPOT), WorldView-2, WorldView-3, and QuickBird-2 to develop high-resolution (˜5m) ecoregion maps for multiple time periods. Traditional analysis methods and algorithms are insufficient for analyzing and synthesizing such large geospatial data sets, and those algorithms rarely scale out onto large distributed-memory parallel computer systems. We seek to develop computationally efficient algorithms and techniques using high-performance computing for characterization of Arctic landscapes. We will apply a variety of data analytics algorithms, such as cluster analysis, complex object-based image analysis (COBIA), and neural networks. We also propose to use representativeness analysis within the Seward Peninsula domain to determine optimal sampling locations for fine-scale measurements. 
This methodology should provide an initial framework for analyzing dynamic landscape trends in Arctic ecosystems, such as shrubification and disturbances, and integration of ecoregions into multi-scale models.
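The cluster-analysis step mentioned above can be illustrated with a minimal k-means sketch. This is plain NumPy on synthetic two-band data, not the authors' PALSAR/WorldView pipeline; the "tundra"/"shrub" spectra are invented for demonstration:

```python
import numpy as np

def kmeans_ecoregions(pixels, k, iters=20):
    """Lloyd's k-means over per-pixel feature vectors; each cluster id
    serves as an ecoregion label."""
    # deterministic init: spread starting centers across the pixel array
    centers = pixels[:: max(1, len(pixels) // k)][:k].astype(float).copy()
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # distance of every pixel to every center, then nearest-center label
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels

# synthetic 2-band scene: two spectrally distinct surface types
rng = np.random.default_rng(1)
tundra = rng.normal([0.1, 0.8], 0.02, size=(100, 2))
shrub = rng.normal([0.6, 0.2], 0.02, size=(100, 2))
labels = kmeans_ecoregions(np.vstack([tundra, shrub]), k=2)
```

At terabyte scale the distance computation would be distributed across nodes, which is exactly the scaling problem the abstract describes.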
NASA Astrophysics Data System (ADS)
Carroll, M.; McCarty, J. L.; Neigh, C. S. R.; Wooten, M.
2016-12-01
Very high resolution (VHR) satellite data collection is growing rapidly, producing petabytes of remotely sensed data per year. The WorldView constellation, operated by DigitalGlobe, images over 1.2 billion km2 annually at < 2 m spatial resolution. Due to computation, data cost, and methodological concerns, VHR satellite data have mainly been used to produce geospatial information for site-specific phenomena. This project produced a VHR, spatiotemporally explicit, wall-to-wall cropland area map for the rainfed residential cropland mosaic of the Tigray Region, Ethiopia, which is comprised entirely of smallholder farms. Moderate-resolution satellite data are not adequate in spatial or temporal resolution to capture the total area occupied by smallholder farms, i.e., farms with agricultural fields of ≤ 45 × 45 m in dimension. In order to accurately map smallholder crop area over a large region, hundreds of VHR images spanning two or more years are needed. Sub-meter WorldView-1 and WorldView-2 segmentation results were combined with median phenology amplitude from Landsat 8 data. VHR WorldView-1 and -2 data were obtained from the U.S. National Geospatial-Intelligence Agency (NGA) Commercial Archive Data at NASA Goddard Space Flight Center (GSFC) (http://cad4nasa.gsfc.nasa.gov/). Over 2,700 scenes were processed from raw imagery to a completed crop map in one week using a semi-automated method on the large computing capacity of the Advanced Data Analytics Platform (ADAPT) at NASA GSFC's NCCS (http://www.nccs.nasa.gov/services/adapt). This methodology is extensible to any land cover type and can easily be expanded to run on much larger regions.
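The core fusion rule, VHR segmentation combined with Landsat phenology amplitude, can be sketched as a per-pixel mask intersection. The toy arrays and the 0.3 amplitude threshold below are illustrative, not the project's actual values:

```python
import numpy as np

# toy 4x4 scene: segments from VHR imagery (1 = field-like, 0 = other)
segments = np.array([[1, 1, 0, 0],
                     [1, 1, 0, 0],
                     [0, 0, 1, 1],
                     [0, 0, 0, 0]])

# per-pixel NDVI through the season (3 synthetic Landsat acquisitions)
peak = np.array([[0.7, 0.7, 0.7, 0.2],
                 [0.7, 0.7, 0.2, 0.2],
                 [0.2, 0.2, 0.7, 0.7],
                 [0.2, 0.2, 0.2, 0.2]])
ndvi = np.stack([np.full((4, 4), 0.2), peak, np.full((4, 4), 0.25)])

# phenology amplitude: seasonal max minus min, per pixel
amplitude = ndvi.max(axis=0) - ndvi.min(axis=0)

# cropland = field-like segment AND strong seasonal greening signal
cropland = (segments == 1) & (amplitude > 0.3)
```

The intersection rejects pixels that green up seasonally but lack field-like texture (and vice versa), which is what lets sub-meter segmentation compensate for Landsat's coarse grid.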
Parallel Geospatial Data Management for Multi-Scale Environmental Data Analysis on GPUs
NASA Astrophysics Data System (ADS)
Wang, D.; Zhang, J.; Wei, Y.
2013-12-01
As the spatial and temporal resolutions of Earth observatory data and Earth system simulation outputs get higher, in-situ and/or post-processing of such large amounts of geospatial data increasingly becomes a bottleneck in scientific inquiries into Earth systems and their human impacts. Existing geospatial techniques based on outdated computing models (e.g., serial algorithms and disk-resident systems), as implemented in many commercial and open-source packages, are incapable of processing large-scale geospatial data at the desired level of performance. In this study, we have developed a set of parallel data structures and algorithms capable of utilizing the massively data-parallel computing power available on commodity Graphics Processing Units (GPUs) for a popular geospatial technique called Zonal Statistics. Given two input datasets, one representing measurements (e.g., temperature or precipitation) and the other representing polygonal zones (e.g., ecological or administrative zones), Zonal Statistics computes major statistics (or complete distribution histograms) of the measurements in all zones. Our technique has four steps, and each step can be mapped to GPU hardware by identifying its inherent data parallelism. First, the raster is divided into blocks and per-block histograms are derived. Second, the Minimum Bounding Rectangles (MBRs) of polygons are computed and spatially matched with raster blocks; matched polygon-block pairs are tested, and blocks that are either inside or intersect with polygons are identified. Third, per-block histograms are aggregated to polygons for blocks that are completely within polygons. Finally, for blocks that intersect polygon boundaries, all raster cells within the blocks are examined using a point-in-polygon test, and cells that fall within polygons are used to update the corresponding histograms.
As the task becomes I/O-bound after applying spatial indexing and GPU hardware acceleration, we have developed a GPU-based data compression technique by reusing our previous work on Bitplane Quadtree (BPQ-Tree) based indexing of binary bitmaps. Results show that our GPU-based parallel Zonal Statistics technique, applied to 3,000+ US counties over 20+ billion NASA SRTM 30-meter-resolution Digital Elevation Model (DEM) raster cells, achieves impressive end-to-end runtimes: 101 seconds and 46 seconds on a low-end workstation equipped with an Nvidia GTX Titan GPU using cold and hot cache, respectively; 60-70 seconds using a single OLCF Titan computing node; and 10-15 seconds using 8 nodes. Our experimental results clearly show the potential of using high-end computing facilities for large-scale geospatial processing.
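A serial CPU analogue of the Zonal Statistics computation (per-zone means and histograms) can be sketched in a few lines; the GPU scheme above parallelizes the same reduction over raster blocks. The toy rasters and the 4-bin histogram layout are illustrative:

```python
import numpy as np

def zonal_stats(values, zones, nbins=4, vmax=1.0):
    """Per-zone histogram and mean for a measurement raster.
    `values`: measurements; `zones`: integer zone-id raster, same shape."""
    out = {}
    for z in np.unique(zones):
        v = values[zones == z]                     # cells belonging to zone z
        hist, _ = np.histogram(v, bins=nbins, range=(0.0, vmax))
        out[int(z)] = {"mean": float(v.mean()), "hist": hist}
    return out

values = np.array([[0.1, 0.2, 0.9],
                   [0.1, 0.8, 0.9],
                   [0.1, 0.8, 0.9]])
zones = np.array([[0, 0, 1],
                  [0, 1, 1],
                  [0, 1, 1]])
stats = zonal_stats(values, zones)
```

The block decomposition in the paper avoids this cell-by-cell masking for blocks wholly inside a polygon, where a precomputed per-block histogram can be added directly to the zone's total.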
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the problems of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science-oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution, and data computation), an infrastructure capable of managing all these problems becomes an important requirement. The Grid promotes and facilitates secure interoperation of heterogeneous, distributed geospatial data, supports the creation and management of large distributed computational jobs, and assures a certificate-based security level for communication and message transfer. This presentation analyzes and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by Grid technology, such as secure interoperability between distributed geospatial resources; and the issues introduced by integrating distributed geospatial data in a secure environment: data and service discovery, management, access, and computation. The enviroGRIDS project proposes a new architecture that allows a flexible and scalable approach to integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data, management, and computation levels. The analysis is carried out for OGC Web service interoperability in general, with specific details for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS), and Catalogue Service for the Web (CSW). Issues regarding the mapping and interoperability between OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized for geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as complex geospatial functionality combined with the high-power computation and security of the Grid, high spatial model resolution and broad geographical coverage, and flexible combination and interoperability of geographical models.
In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each of the main functionalities is exposed through the enviroGRIDS Portal and, consequently, to end-user applications such as Decision Maker/Citizen-oriented applications. The enviroGRIDS portal is the user's single point of entry into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
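The OGC services named above share a key-value-pair request convention; a WMS GetMap request, for instance, is just a parameterized URL. The sketch below builds one using the standard WMS 1.1.1 parameters (the base URL and layer name are placeholders, not a real enviroGRIDS endpoint):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def wms_getmap_url(base, layer, bbox, size, fmt="image/png"):
    """Assemble a WMS 1.1.1 GetMap request URL from its required
    key-value parameters."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(c) for c in bbox),   # minx,miny,maxx,maxy
        "WIDTH": str(size[0]), "HEIGHT": str(size[1]),
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

# hypothetical catchment land-cover layer over the Black Sea region
url = wms_getmap_url("http://example.org/wms", "catchment:landcover",
                     (27.0, 40.0, 42.0, 52.0), (800, 600))
```

Because the whole request is self-describing text, such calls are straightforward to wrap as Grid jobs or proxy through gLite-secured gateways, which is the interoperability pattern the project explores.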
US agricultural policy, land use change, and biofuels: are we driving our way to the next dust bowl?
NASA Astrophysics Data System (ADS)
Wright, Christopher K.
2015-05-01
Lark et al (2015 Environ. Res. Lett. 10 044003) analyze recent shifts in US agricultural land use (2008-2012) using newly available, high-resolution geospatial information, the Cropland Data Layer. The cropland expansion documented by Lark et al suggests the need to reform national agricultural policies in the wake of an emerging new era of US agriculture characterized by rapid land-cover/land-use change.
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya; Wei, Yaxing
2009-02-01
Recent advances in Sensor Web geospatial data capture, such as high-resolution satellite imagery and Web-ready data processing and modeling technologies, have led to the generation of large numbers of datasets from real-time or near-real-time observations and measurements. Finding which sensor or dataset complies with criteria such as specific times, locations, and scales has become a bottleneck for Sensor Web-based applications, especially remote-sensing observations. In this paper, an architecture for integrating the Sensor Observation Service (SOS) with the Open Geospatial Consortium (OGC) Catalogue Service-Web profile (CSW) is put forward. The architecture consists of a distributed geospatial sensor observation service, a geospatial catalogue service based on the ebXML Registry Information Model (ebRIM), SOS search and registry middleware, and a geospatial sensor portal. The SOS search and registry middleware finds candidate SOS instances, generates data granule information, and inserts the records into the CSW. The contents and sequence of the services, the available observations, and the metadata of the observation registry are described. A prototype system is designed and implemented using the service middleware technology and a standard interface and protocol. The feasibility and response times of observation registry and retrieval are evaluated using a realistic Earth Observing-1 (EO-1) SOS scenario. Extracting information from the SOS requires the same execution time as record generation for the CSW. The average data-retrieval response time in SOS+CSW mode is 17.6% of that in SOS-alone mode. The proposed architecture offers greater advantages for SOS search and observation-data retrieval than existing Sensor Web-enabled systems.
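The middleware's harvesting step, turning SOS observation offerings into catalogue records, amounts to flattening per-offering metadata into searchable entries. A minimal sketch (the field names are illustrative, not the actual ebRIM slots, and the offering dictionaries are invented):

```python
def offerings_to_csw_records(sos_url, offerings):
    """Flatten SOS observation offerings into minimal catalogue-style
    records that a CSW could index by id, bbox, time, and property."""
    records = []
    for off in offerings:
        records.append({
            "identifier": f"{sos_url}#{off['id']}",
            "type": "observation",
            "bbox": off["bbox"],                     # (minx, miny, maxx, maxy)
            "time": (off["begin"], off["end"]),
            "observedProperty": off["property"],
        })
    return records

# hypothetical EO-1 offering harvested from a GetCapabilities response
offerings = [
    {"id": "EO1_HYPERION", "bbox": (-90.0, 30.0, -89.0, 31.0),
     "begin": "2008-01-01", "end": "2008-12-31", "property": "radiance"},
]
records = offerings_to_csw_records("http://example.org/sos", offerings)
```

Once such records sit in the catalogue, a time/location/scale query hits the CSW index instead of interrogating every SOS, which is where the reported response-time gain comes from.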
POLARIS: A 30-meter probabilistic soil series map of the contiguous United States
Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P
2016-01-01
A new complete map of soil-series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9-billion-grid-cell database is made possible by available high-performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high-resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that play the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests that including additional covariates is pivotal to improving POLARIS's accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance the availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.
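The probabilistic character of POLARIS comes from an ensemble idea: classify each cell many times under perturbed inputs and report class frequencies as probabilities. The sketch below imitates that with a toy nearest-centroid classifier; it is not the DSMART-HPC algorithm, and the covariates, centroids, and noise level are invented:

```python
import numpy as np

def probabilistic_soil_map(covariates, centroids, runs=50, noise=0.05, seed=0):
    """Ensemble classification: perturb covariates, reclassify, and tally
    per-cell class frequencies as soil-series probabilities."""
    rng = np.random.default_rng(seed)
    n_cells, n_classes = covariates.shape[0], centroids.shape[0]
    votes = np.zeros((n_cells, n_classes), dtype=int)
    for _ in range(runs):
        x = covariates + rng.normal(0.0, noise, covariates.shape)
        d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
        votes[np.arange(n_cells), d.argmin(axis=1)] += 1
    return votes / runs  # rows sum to 1: per-cell class probabilities

covariates = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 0.5]])  # e.g. scaled elevation, K
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])               # two soil-series prototypes
probs = probabilistic_soil_map(covariates, centroids)
```

Cells near a prototype get probabilities near 1, while the ambiguous middle cell splits its votes, mirroring the validation finding that accuracy is best where misclassification chance is low.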
A Geospatial Database for Wind and Solar Energy Applications: The Kingdom of Bahrain Study Case
NASA Astrophysics Data System (ADS)
Al-Joburi, Khalil; Dahman, Nidal
2017-11-01
This research is aimed at designing, implementing, and testing a geospatial database for wind and solar energy applications in the Kingdom of Bahrain. All decision making needed to determine economic feasibility and establish site locations for wind turbines or solar panels depends primarily on geospatial feature-theme information and non-spatial (attribute) data for wind, solar, rainfall, temperature, and weather characteristics of a particular region. Spatial data include, but are not limited to, digital elevation, slopes, land use, zoning, parks, population density, road utility maps, and other related information. Digital elevations for over 450,000 spots at 50 m horizontal spatial resolution, plus field surveying and GPS measurements (at selected locations), were obtained from the Surveying and Land Registration Bureau (SLRB). Road, utility, and population-density data were obtained from the Central Information Organization (CIO). Land-use zoning, recreational parks, and other data were obtained from the Ministry of Municipalities and Agricultural Affairs. Wind, solar, humidity, rainfall, and temperature data were obtained from the Ministry of Transportation, Civil Aviation Section. Landsat and other satellite images were obtained from NASA and online sources, respectively. The collected geospatial data were geo-referenced to Ain el-Abd UTM Zone 39 North. A 3D Digital Elevation Model (DEM) at 50 m spatial resolution was created using the SLRB spot elevations. Slope and aspect maps were generated based on the DEM. Supervised image classification to identify open spaces was performed using the satellite images. Other geospatial data were converted to raster format with the same cell resolution. Non-spatial data were entered as attributes of spatial features. To eliminate ambiguous solutions, a multi-criteria GIS model was developed based on vector (discrete point, line, and polygon representations) as well as raster (continuous representation) models.
The model was tested on the proposed Al-Areen project, a relatively small area (15 km2). The optimum spatial locations for wind turbines and solar panels were determined, and initial results indicate that the combination of wind and solar energy would be sufficient to meet the project's energy demand at the present per-capita consumption rate.
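A multi-criteria raster model of this kind typically reduces to a weighted overlay of normalized criterion rasters with an exclusion mask. A minimal sketch, with invented 2×2 rasters and illustrative weights rather than the study's actual criteria:

```python
import numpy as np

# toy criterion rasters normalized to [0, 1]
wind = np.array([[0.9, 0.2], [0.6, 0.8]])     # wind-speed suitability
solar = np.array([[0.7, 0.9], [0.5, 0.6]])    # solar-irradiance suitability
exclusion = np.array([[1, 1], [0, 1]])        # 0 = park/zoning exclusion

# weighted overlay, zeroed where development is excluded
weights = {"wind": 0.6, "solar": 0.4}
score = (weights["wind"] * wind + weights["solar"] * solar) * exclusion

# cell index of the highest-scoring candidate site
best = np.unravel_index(score.argmax(), score.shape)
```

In practice each input layer (DEM-derived slope, land-use zoning, classified open space) is resampled to the common 50 m grid first, so the overlay is a straightforward cell-wise operation.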
Konrad, Christopher P.
2015-01-01
Ecological functions and flood-related risks were assessed for floodplains along the 17 major rivers flowing into Puget Sound Basin, Washington. The assessment addresses five ecological functions and five components of flood-related risk at two spatial resolutions, fine and coarse. The fine-resolution assessment compiled spatial attributes of floodplains from existing, publicly available sources and integrated the attributes into 10-meter rasters for each function, hazard, or exposure. The raster values generally represent different types of floodplains with regard to each function, hazard, or exposure rather than the degree of function, hazard, or exposure. The coarse-resolution assessment tabulates attributes from the fine-resolution assessment for larger floodplain units, which are floodplains associated with 0.1- to 21-kilometer-long segments of major rivers. The coarse-resolution assessment also derives indices that can be used to compare function or risk among different floodplain units and to develop normative standards (based on observed distributions). The products of the assessment are available online as geospatial datasets (Konrad, 2015; http://dx.doi.org/10.5066/F7DR2SJC).
2014-08-15
challenges. ERDC develops innovative solutions in civil and military engineering, geospatial sciences, water resources, and environmental sciences for... GRL TR-14-1 iv Abstract: Orthoimages are used to produce image-map products for navigation and planning, and serve as source data for advanced... resulting mosaic covers a wider area and contains fewer visible seams, which makes the map easier to understand. RPCs replace the actual sensor model while
NASA Astrophysics Data System (ADS)
Voss, M.; Blundell, B.
2015-12-01
Characterization of urban environments is a high priority for the U.S. Army as battlespaces have transitioned from the predominantly open spaces of the 20th century to urban areas where soldiers have reduced situational awareness due to the diversity and density of their surroundings. Creating high-resolution urban-terrain geospatial information will improve mission planning and soldier effectiveness. In this effort, super-resolution true-color imagery was collected with an Altivan NOVA unmanned aerial system over the Muscatatuck Urban Training Center near Butlerville, Indiana, on September 16, 2014. Multispectral texture analysis using different algorithms was conducted for urban surface characterization at a variety of scales. Training samples were extracted from the true-color and texture images. These data were processed using a variety of meta-algorithms with a decision-tree classifier to create a high-resolution urban-features map. In addition to improving accuracy over traditional image-classification methods, this technique allowed determination of the most significant textural scales for creating urban terrain maps for tactical exploitation.
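One of the simplest texture measures underlying this kind of analysis is moving-window variance: smooth surfaces (pavement, rooftops) score low, heterogeneous surfaces (rubble, vegetation) score high. A toy sketch, not the paper's actual texture algorithms:

```python
import numpy as np

def local_variance(img, radius=1):
    """Moving-window variance (3x3 for radius=1): a basic texture band
    that can be stacked with spectral bands for classification."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            win = img[max(0, i - radius):i + radius + 1,
                      max(0, j - radius):j + radius + 1]
            out[i, j] = win.var()
    return out

# toy scene: smooth "pavement" (left 3 cols) vs heterogeneous cover (right)
img = np.hstack([np.full((6, 3), 0.5),
                 np.tile(np.array([0.1, 0.9]), (6, 2))[:, :3]])
texture = local_variance(img)
rough = texture > 0.05   # crude two-class texture map
```

Computing this band at several window radii and feeding all of them to the classifier is what allows the most discriminative textural scales to be identified afterwards.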
NASA Technical Reports Server (NTRS)
Lewis, Adam; Lymburner, Leo; Purss, Matthew B. J.; Brooke, Brendan; Evans, Ben; Ip, Alex; Dekker, Arnold G.; Irons, James R.; Minchin, Stuart; Mueller, Norman
2015-01-01
The effort and cost required to convert satellite Earth Observation (EO) data into meaningful geophysical variables has prevented the systematic analysis of all available observations. To overcome these problems, we utilise an integrated High Performance Computing and Data environment to rapidly process, restructure and analyse the Australian Landsat data archive. In this approach, the EO data are assigned to a common grid framework that spans the full geospatial and temporal extent of the observations - the EO Data Cube. This approach is pixel-based and incorporates geometric and spectral calibration and quality assurance of each Earth surface reflectance measurement. We demonstrate the utility of the approach with rapid time-series mapping of surface water across the entire Australian continent using 27 years of continuous, 25 m resolution observations. Our preliminary analysis of the Landsat archive shows how the EO Data Cube can effectively liberate high-resolution EO data from their complex sensor-specific data structures and revolutionise our ability to measure environmental change.
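Once all observations sit on a common grid, continental-scale time-series mapping reduces to per-pixel array reductions over the cube. A toy sketch of water-frequency mapping (the NDWI-like index values and zero threshold are illustrative, not the actual Australian workflow):

```python
import numpy as np

# toy data cube: (time, y, x) layers of a water index, one per acquisition
cube = np.array([
    [[0.4, -0.2], [-0.3, -0.1]],
    [[0.5, -0.1], [ 0.2, -0.2]],
    [[0.6, -0.3], [ 0.3, -0.2]],
    [[0.5, -0.2], [ 0.1, -0.1]],
])

# flag open water per observation, then summarize across the archive
water = cube > 0.0
frequency = water.mean(axis=0)   # fraction of observations flagged as water
```

Scaling the same reduction over 27 years of 25 m Landsat tiles is purely a data-partitioning problem, which is what the HPC Data Cube environment addresses.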
Digital Mapping and Environmental Characterization of National Wild and Scenic River Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Bosnall, Peter; Hetrick, Shelaine L
2013-09-01
Spatially accurate geospatial information is required to support decision-making regarding sustainable future hydropower development. Under a memorandum of understanding among several federal agencies, a pilot study was conducted to map a subset of National Wild and Scenic Rivers (WSRs) at a higher resolution and provide a consistent methodology for mapping WSRs across the United States and across agency jurisdictions. A subset of rivers (segments falling under the jurisdiction of the National Park Service) was mapped at high resolution using the National Hydrography Dataset (NHD). The spatial extent and representation of river segments mapped at NHD scale were compared with the prevailing geospatial coverage mapped at a coarser scale. Accurately digitized river segments were linked to environmental attribution datasets housed within the Oak Ridge National Laboratory's National Hydropower Asset Assessment Program database to characterize the environmental context of WSR segments. The results suggest that both the spatial scale of hydrography datasets and adherence to written policy descriptions are critical to accurately mapping WSRs. The environmental characterization provided information to deduce generalized trends in either the uniqueness or the commonness of environmental variables associated with WSRs. Although WSRs occur in a wide range of human-modified landscapes, environmental data layers suggest that they provide habitats important to terrestrial and aquatic organisms and recreation important to humans. Ultimately, the research findings herein suggest a need for accurate, consistent mapping of the National WSRs across the agencies responsible for administering each river. Geospatial applications examining potential landscape and energy development require accurate sources of information, such as data layers that portray realistic spatial representations.
Joseph St. Peter; John Hogland; Nathaniel Anderson; Jason Drake; Paul Medley
2018-01-01
Land cover classification provides valuable information for prioritizing management and conservation operations across large landscapes. Current regional scale land cover geospatial products within the United States have a spatial resolution that is too coarse to provide the necessary information for operations at the local and project scales. This paper describes a...
Mapping Land Use/Land Cover in the Ambos Nogales Study Area
Norman, Laura M.; Wallace, Cynthia S.A.
2008-01-01
The Ambos Nogales watershed, which surrounds the twin cities of Nogales, Arizona, United States and Nogales, Sonora, Mexico, has a history of problems related to flooding. This paper describes the process of creating a high-resolution, binational land-cover dataset to be used in modeling the Ambos Nogales watershed. The Automated Geospatial Watershed Assessment tool will be used to model the Ambos Nogales watershed to identify focal points for planning efforts and to anticipate ramifications of implementing detention reservoirs at certain watershed planes.
Goddard Space Flight Center's Partnership with Florida International University
NASA Astrophysics Data System (ADS)
Rishe, N. D.; Graham, S. C.; Gutierrez, M. E.
2004-12-01
NASA's Goddard Space Flight Center (GSFC) has been collaborating with Florida International University's High Performance Database Research Center (FIU HPDRC) for nearly ten years. Much of this collaboration was funded through a NASA Institutional Research Award (IRA). That award involved research in the Internet dissemination of geospatial data, and in recruiting and training student researchers. FIU's TerraFly web service presently serves more than 10,000 unique users per day by providing an easy-to-use mechanism for exploring geospatial data and imagery. IRA-supported students have received 47 Bachelor's degrees, 20 Master's degrees, and 2 Doctoral degrees at FIU. FIU leveraged IRA funding into over $19 million in other funding and donations for its research and training activities and has published nearly 150 scientific papers acknowledging the NASA IRA award. GSFC has worked closely with FIU HPDRC in the development of their geospatial data storage and dissemination research. TerraFly presents many NASA datasets, such as the nationwide mosaic of Landsat 5, the PRISM precipitation model, and worldwide TRMM accumulated rainfall, as well as nationwide USGS aerial photography at 30 cm to 1 m resolutions, demographic data, Ikonos satellite imagery, and many more. Our presentation will discuss the lessons learned during the collaboration between GSFC and FIU as well as our current research projects.
Pendleton, E.A.; Baldwin, W.E.; Danforth, W.W.; DeWitt, N.T.; Forde, A.S.; Foster, D.S.; Kelso, K.W.; Pfeiffer, W.R.; Turecek, A.M.; Flocks, J.G.; Twichell, D.C.
2011-01-01
This report contains the geophysical and geospatial data that were collected along the western offshore side of the Gulf Islands of Mississippi on the research vessel Tommy Munro during two cruises in 2010. Geophysical data were collected by the U.S. Geological Survey in Woods Hole, Massachusetts, and St. Petersburg, Florida, in cooperation with the U.S. Army Corps of Engineers Mobile District. Bathymetric-sonar, sidescan-sonar, and chirp seismic-reflection data were acquired with the following equipment, respectively: Systems Engineering and Assessment, Ltd., SwathPlus interferometric sonars; Klein 3000 and 3900 dual-frequency sidescan sonars; and an EdgeTech 512i chirp sub-bottom profiling system. The long-term goals of this mapping effort are to produce high-quality, high-resolution geologic maps and interpretations that can be used to identify sand resources within the region, to better understand the Holocene evolution of the area, and to anticipate future changes in this coastal system. Processed geospatial data files and the geophysical data provided in this report help attain these goals.
The Hazards Data Distribution System update
Jones, Brenda K.; Lamb, Rynn M.
2010-01-01
After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/
Application of geo-spatial technology in schistosomiasis modelling in Africa: a review.
Manyangadze, Tawanda; Chimbari, Moses John; Gebreslasie, Michael; Mukaratirwa, Samson
2015-11-04
Schistosomiasis continues to impact socio-economic development negatively in sub-Saharan Africa. The advent of spatial technologies, including geographic information systems (GIS), Earth observation (EO) and global positioning systems (GPS) assist modelling efforts. However, there is increasing concern regarding the accuracy and precision of the current spatial models. This paper reviews the literature regarding the progress and challenges in the development and utilization of spatial technology with special reference to predictive models for schistosomiasis in Africa. Peer-reviewed papers identified through a PubMed search using the following keywords: geo-spatial analysis OR remote sensing OR modelling OR earth observation OR geographic information systems OR prediction OR mapping AND schistosomiasis AND Africa were used. Statistical uncertainty, low spatial and temporal resolution satellite data and poor validation were identified as some of the factors that compromise the precision and accuracy of the existing predictive models. The need for high spatial resolution of remote sensing data in conjunction with ancillary data viz. ground-measured climatic and environmental information, local presence/absence intermediate host snail surveys as well as prevalence and intensity of human infection for model calibration and validation are discussed. The importance of a multidisciplinary approach in developing robust, spatial data capturing, modelling techniques and products applicable in epidemiology is highlighted.
A Python Geospatial Language Toolkit
NASA Astrophysics Data System (ADS)
Fillmore, D.; Pletzer, A.; Galloy, M.
2012-12-01
The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP) for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
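The phrase-to-operation mapping described above can be sketched with a tiny fixed grammar; this is a toy stand-in for the prototype module, with invented directive forms and output fields:

```python
import re

def parse_directive(phrase):
    """Map a small set of English directives onto geospatial operations
    (a buffer query and a layer intersection), returning a structured spec."""
    text = phrase.lower()
    m = re.match(r"within (\d+(?:\.\d+)?) ?km of (.+)", text)
    if m:   # e.g. "within 10 km of the Missouri River" -> buffer operation
        return {"op": "buffer", "distance_km": float(m.group(1)),
                "around": m.group(2)}
    m = re.match(r"intersection of (.+) and (.+)", text)
    if m:   # e.g. "intersection of wetlands and counties"
        return {"op": "intersect", "a": m.group(1), "b": m.group(2)}
    raise ValueError(f"unrecognized directive: {phrase}")

q = parse_directive("within 10 km of the Missouri River")
```

A production system would resolve the named regions against a gazetteer and hand the structured spec to computational-geometry routines, which is the division of labor the poster describes.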
Geospatial resources for the geologic community: The USGS National Map
Witt, Emitt C.
2015-01-01
Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study, because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base-map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30-m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option: any prestaged data that intersect the requested bounding box will be, in their entirety, part of the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist's first stop for geospatial information and data.
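The whole-tile delivery rule means a request's tile set is determined purely by bounding-box intersection with the tile grid. A minimal sketch for a 1° grid (the function and grid origin are illustrative, not a National Map API):

```python
import math

def tiles_for_bbox(bbox, tile_deg=1.0):
    """Lower-left corners of the tiles whose extent intersects the request
    bbox; whole tiles only, mirroring the no-partial-tile policy."""
    minx, miny, maxx, maxy = bbox
    xs = range(math.floor(minx / tile_deg), math.ceil(maxx / tile_deg))
    ys = range(math.floor(miny / tile_deg), math.ceil(maxy / tile_deg))
    return [(x * tile_deg, y * tile_deg) for x in xs for y in ys]

# a request box straddling two 1-degree tile columns and two rows
tiles = tiles_for_bbox((-92.3, 37.6, -91.8, 38.2))
```

Even a half-degree request box that crosses tile boundaries pulls four full tiles, so download volume depends on where the box falls on the grid, not just its area.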
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big-data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput, wide-format video, also known as wide-area motion imagery (WAMI). KOLAM was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. It is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to, feasibly, petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, it supports a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very-large-format video frames using a temporal cache of tiled pyramid data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging), the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results, and apply geospatial visual analytic tools to the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
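The tiled-pyramid layers mentioned above rest on simple arithmetic: each coarser level halves the image dimensions until one tile suffices. The sketch below shows that arithmetic for a generic power-of-two pyramid; KOLAM's actual dual-cache structure is more elaborate, and the function names here are invented for illustration.

```python
import math

def pyramid_levels(width, height, tile=256):
    """Number of power-of-two pyramid levels needed so the coarsest
    level fits in a single tile (a common tiled-pyramid layout)."""
    levels = 1
    while width > tile or height > tile:
        width = math.ceil(width / 2)
        height = math.ceil(height / 2)
        levels += 1
    return levels

def tile_count(width, height, tile=256):
    """Total number of tiles across all levels of such a pyramid."""
    total = 0
    while True:
        total += math.ceil(width / tile) * math.ceil(height / tile)
        if width <= tile and height <= tile:
            return total
        width = math.ceil(width / 2)
        height = math.ceil(height / 2)

# A 100,000 x 100,000 pixel image needs 10 levels of 256-pixel tiles.
print(pyramid_levels(100_000, 100_000))  # 10
```

Because only the tiles covering the current viewport at the current level are fetched, roam and zoom latency stays roughly constant regardless of total image size.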
NCI's Distributed Geospatial Data Server
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes infeasible, and often requires additional work to publish for others to use. Recent developments in distributed computing, using interactive access to significant cloud infrastructure, open the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data-wrangling problems, such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed.
The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
Water sources and mixing in riparian wetlands revealed by tracers and geospatial analysis.
Lessels, Jason S; Tetzlaff, Doerthe; Birkel, Christian; Dick, Jonathan; Soulsby, Chris
2016-01-01
Mixing of waters within riparian zones has been identified as an important influence on runoff generation and water quality. Improved understanding of the controls on the spatial and temporal variability of water sources and how they mix in riparian zones is therefore of both fundamental and applied interest. In this study, we have combined topographic indices derived from a high-resolution Digital Elevation Model (DEM) with repeated, spatially high-resolution synoptic sampling of multiple tracers to investigate such dynamics of source water mixing. We use geostatistics to estimate concentrations of three different tracers (deuterium, alkalinity, and dissolved organic carbon) across an extended riparian zone in a headwater catchment in NE Scotland, to identify spatial and temporal influences on mixing of source waters. The various biogeochemical tracers and stable isotopes helped constrain the sources of runoff and their temporal dynamics. Results show that spatial variability in all three tracers was evident in all sampling campaigns, but more pronounced in warmer, drier periods. The extent of mixing areas within the riparian area reflected strong hydroclimatic controls and showed large degrees of expansion and contraction that were not strongly related to topographic indices. The integrated approach of using multiple tracers, geospatial statistics, and topographic analysis allowed us to classify three main riparian source areas and mixing zones. This study underlines the importance of riparian zones for mixing soil water and groundwater and introduces a novel approach for quantifying this mixing and assessing its effect on downstream chemistry.
Bridging the Gap Between Surveyors and the Geo-Spatial Society
NASA Astrophysics Data System (ADS)
Müller, H.
2016-06-01
For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.
Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim
2018-05-23
Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine this distribution, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and the relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes.
In countries such as South Africa, where there is high mortality but also comparatively rich commercial geospatial data, these data are a potential resource for examining healthcare utilisation that requires further evaluation. In countries such as Sierra Leone, where there is high mortality but minimal commercial geospatial data, alternative approaches such as the use of open data are needed to quantify patient travel times, geocode patient addresses, and characterise patients' neighbourhoods.
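One of the inequality metrics named above, the relative concentration index, can be computed with the standard covariance formula C = 2·cov(y, r)/μ, where r is the fractional rank of each unit on the ordering variable. The sketch below is an illustrative implementation under that textbook formula, with invented toy numbers; it is not the paper's analysis code.

```python
def concentration_index(need, rank_var):
    """Relative concentration index of `need` (e.g. mortality) when units
    are ranked by `rank_var` (e.g. a geospatial data-availability index).
    Uses C = 2*cov(y, r)/mean(y), with r the fractional rank. Negative
    values mean need is concentrated among units ranked low on rank_var."""
    n = len(need)
    order = sorted(range(n), key=lambda i: rank_var[i])
    frac_rank = [0.0] * n
    for pos, i in enumerate(order):
        frac_rank[i] = (pos + 0.5) / n
    mu = sum(need) / n
    mean_r = sum(frac_rank) / n  # always 0.5
    cov = sum((need[i] - mu) * (frac_rank[i] - mean_r) for i in range(n)) / n
    return 2 * cov / mu

# Toy data: mortality highest where data availability is lowest,
# so need is concentrated among data-poor units and the index is negative.
mortality = [10, 8, 6, 4, 2]
data_availability = [1, 2, 3, 4, 5]
print(round(concentration_index(mortality, data_availability), 3))  # -0.267
```

A value of zero would indicate mortality is distributed independently of data availability; the negative toy value mirrors the inverse relationship the study reports.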
High-Resolution Satellite Data Open for Government Research
NASA Technical Reports Server (NTRS)
Neigh, Christopher S. R.; Masek, Jeffrey G.; Nickeson, Jaime E.
2013-01-01
U.S. satellite commercial imagery (CI) with resolution finer than 1 meter is a common geospatial reference used by the public through Web applications, mobile devices, and the news media. However, CI use in the scientific community has not kept pace, even though those who are performing U.S. government research have access to these data at no cost. Previously, studies using multiple CI acquisitions from IKONOS-2, QuickBird-2, GeoEye-1, WorldView-1, and WorldView-2 would have been cost-prohibitive. Now, with near-global submeter coverage and online distribution, opportunities abound for future scientific studies. This archive is already quite extensive (examples are shown in Figure 1) and is being used in many novel applications.
NASA Astrophysics Data System (ADS)
Jarihani, B.
2015-12-01
Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modeling of environmental processes. Pre-processing DEMs and extracting watershed characteristics (e.g., stream networks, catchment delineation, surface and subsurface flow paths) is essential for hydrological and geomorphic analysis and for modelling sediment transport. This study investigates the status of current remotely-sensed DEMs in providing advanced morphometric information on drainage basins, particularly in data-sparse regions. Here we assess the accuracy of three available DEMs: (i) the hydrologically corrected "H-DEM" of Geoscience Australia derived from the Shuttle Radar Topography Mission (SRTM) data; (ii) the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) version 2, 1-arc-second (~30 m) data; and (iii) the 9-arc-second national GEODATA DEM-9S ver3 from Geoscience Australia and the Australian National University. We used ESRI's geospatial data model, Arc Hydro, and HEC-GeoHMS, designed for building hydrologic information systems that synthesize geospatial and temporal water resources data to support hydrologic modeling and analysis. A coastal catchment in northeast Australia was selected as the study site, where very high resolution LiDAR data are available for parts of the area as reference data to assess the accuracy of the lower-resolution datasets. This study provides morphometric information for drainage basins as part of broader research on sediment flux from coastal basins to the Great Barrier Reef, Australia. After applying geo-referencing and elevation corrections, streams and sub-basins were delineated for each DEM. Then physical characteristics for streams (i.e., length, upstream and downstream elevation, and slope) and sub-basins (i.e., longest flow lengths, area, relief and slopes) were extracted and compared with reference datasets from LiDAR.
Results showed that, in the absence of high-precision and high-resolution DEM data, ASTER GDEM or SRTM DEM can be used to extract common morphometric relationships, which are widely used for hydrological and geomorphological modelling.
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling.
During the 6th Session of the UN-GGIM in August 2016, the role of DGGS in the context of the GSGF was formally acknowledged. This paper highlights the synergies and role of DGGS in the Global Statistical Geospatial Framework and shows examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
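The hierarchical cell indexing described above can be illustrated with a naive quadtree over a lon/lat rectangle: each refinement level splits the current cell into 2×2 children and appends one base-4 digit to the index. This is only an illustration of the indexing idea; real DGGS use equal-area tessellations of the globe, not this simple grid.

```python
def cell_index(lon, lat, resolution):
    """Hierarchical cell index for a point: one base-4 digit per level,
    each level splitting the current cell into 2x2 children. Illustrative
    quadtree sketch of the DGGS indexing concept, not an actual DGGS."""
    west, east, south, north = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(resolution):
        mid_lon = (west + east) / 2
        mid_lat = (south + north) / 2
        d = 0
        if lon >= mid_lon:
            d += 1
            west = mid_lon
        else:
            east = mid_lon
        if lat >= mid_lat:
            d += 2
            south = mid_lat
        else:
            north = mid_lat
        digits.append(str(d))
    return "".join(digits)

# Indices share a prefix exactly when one cell contains the other,
# which makes hierarchical aggregation a cheap string operation.
print(cell_index(151.2, -33.9, 4))  # a fine cell near Sydney
print(cell_index(151.2, -33.9, 2))  # its coarser parent cell
```

The prefix property is what makes aggregation and decomposition efficient: rolling data up to a coarser cell is just truncating the index.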
NASA Astrophysics Data System (ADS)
Quesnel, K.; Ajami, N.; Urata, J.; Marx, A.
2017-12-01
Infrastructure modernization, information technology, and the internet of things are impacting urban water use. Advanced metering infrastructure (AMI), also known as smart meters, is one forthcoming technology that holds the potential to fundamentally shift the way customers use water and utilities manage their water resources. Broadly defined, AMI is a system and process used to measure, communicate, and analyze water use data at high resolution intervals at the customer or sub-customer level. There are many promising benefits of AMI systems, but there are also many challenges; consequently, AMI in the water sector is still in its infancy. In this study we provide insights into this emerging technology by taking advantage of the higher temporal and spatial resolution of water use data provided by these systems. We couple daily water use observations from AMI with monthly and bimonthly billing records to investigate water use trends, patterns, and drivers using a case study of the City of Redwood City, CA from 2007 through 2016. We look across sectors, with a particular focus on water use for urban irrigation. Almost half of Redwood City's irrigation accounts use recycled water, and we take this unique opportunity to investigate if the behavioral response for recycled water follows the water and energy efficiency paradox in which customers who have upgraded to more efficient devices end up using more of the commodity. We model potable and recycled water demand using geospatially explicit climate, demographic, and economic factors to gain insight into various water use drivers. Additionally, we use high resolution remote sensing data from the National Agricultural Imaging Program (NAIP) to observe how changes in greenness and impervious surface are related to water use. 
Using a series of statistical and unsupervised machine learning techniques, we find that water use has changed dramatically over the past decade corresponding to varying climatic regimes and drought cycles. Yet, these changes in demand are complex, and vary depending on sector, water type, and neighborhood norms.
OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products
NASA Astrophysics Data System (ADS)
Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.
2013-12-01
This presentation introduces OnEarth, a server-side software package originally developed at the Jet Propulsion Laboratory (JPL) that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC)-compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open-source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open-source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points involved in transitioning OnEarth to a suitable open-source paradigm, including repository and licensing choices, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
Assessment of Near-Source Air Pollution at a Fine Spatial ...
Mobile monitoring is an emerging strategy to characterize spatially and temporally variable air pollution in areas near sources. EPA's Geospatial Monitoring of Air Pollution (GMAP) vehicle, an all-electric vehicle measuring real-time concentrations of particulate and gaseous pollutants, was used to map air pollution levels near the Port of Charleston in South Carolina. High-resolution monitoring was performed along driving routes near several port terminals and rail yard facilities, recording geospatial coordinates and concentrations of pollutants including black carbon, size-resolved particle count ranging from ultrafine to coarse (6 nm to 20 µm), carbon monoxide, carbon dioxide, and nitrogen dioxide. Additionally, a portable meteorological station was used to characterize local conditions. The primary objective of this work is to characterize the impact of port facilities on local-scale air quality. It is found that elevated concentration measurements of black carbon and particulate matter correlate with periods of increased port activity, and a significant elevation in concentration is observed downwind of ports. However, limitations in study design prevent a more complete analysis of the port effect. As such, we discuss the ways in which this study is limited and how future work could be improved.
NASA Astrophysics Data System (ADS)
Gupta, S.; Paar, G.; Muller, J. P.; Tao, Y.; Tyler, L.; Traxler, C.; Hesina, G.; Huber, B.; Nauschnegg, B.
2014-12-01
The FP7-SPACE project PRoViDE has assembled a major portion of the imaging data gathered so far from rover vehicles, landers and probes on extra-terrestrial planetary surfaces into a unique database, bringing them into a common planetary geospatial context and providing access to a complete set of 3D vision products. One major aim of PRoViDE is the fusion of orbiter and rover image products. To close the gap between HiRISE imaging resolution (down to 25 cm for the OrthoRectified Image (ORI), down to 1 m for the DTM) and surface vision products, images from multiple HiRISE acquisitions are combined into a super-resolution data set (Tao & Muller, 2014), increasing the resolution of the ortho images to 5 cm. Furthermore, shape-from-shading is applied to one of the ORIs at its original resolution for refinement of the HiRISE DTM, leading to DTM ground resolutions of up to 25 cm. After texture-based co-registration with these refined orbiter 3D products, MER PanCam and NavCam 3D image products can be smoothly pasted into a multi-resolution 3D data representation. Typical results from the MER mission are presented using a dedicated real-time rendering tool fed by a hierarchical 3D data structure that is able to cope with all involved scales, from global planetary scale down to close-up reconstructions in the mm range. This allows us to explore and analyze the geological characteristics of rock outcrops, for example the detailed geometry and internal features of sedimentary rock layers, to aid paleoenvironmental interpretation. This integrated approach enables more efficient development of geological models of Martian rock outcrops. The rendering tool also provides measurement tools to obtain geospatial data of surface points and distances between them. We report on novel scientific use cases and the added-value potential of the resultant high-quality data set and presentation means to support further geologic investigations.
The research leading to these results has received funding from the EC's 7th Framework Programme (FP7/2007-2013) under grant agreement n° 312377.
GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases
NASA Astrophysics Data System (ADS)
Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz
2015-07-01
Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from
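The two building blocks named above, scene signatures and distance functions, can be illustrated with the simplest possible pair: a class-composition histogram as the signature and the Jensen-Shannon divergence as the dissimilarity. GeoPAT's actual signatures also encode spatial arrangement of pixels, so this sketch (with invented function names and toy scenes) only conveys the idea.

```python
from collections import Counter
import math

def scene_signature(scene, classes):
    """Normalized class-composition histogram of a scene (a patch of
    categorical raster cells) -- the simplest kind of pattern signature."""
    counts = Counter(v for row in scene for v in row)
    total = sum(counts.values())
    return [counts.get(c, 0) / total for c in classes]

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: a symmetric, bounded dissimilarity
    between two signatures, a common choice for comparing histograms."""
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

urban = [[1, 1], [1, 2]]    # mostly class 1 ("built-up")
forest = [[2, 2], [2, 1]]   # mostly class 2 ("forest")
classes = [1, 2]
sig_u = scene_signature(urban, classes)
sig_f = scene_signature(forest, classes)
print(jensen_shannon(sig_u, sig_u))        # 0.0 for identical scenes
print(jensen_shannon(sig_u, sig_f) > 0.0)  # True for dissimilar scenes
```

Query and segmentation over a grid of scenes then reduce to nearest-signature searches and merging of scenes whose pairwise dissimilarity falls below a threshold.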
NASA Astrophysics Data System (ADS)
Chaudhary, A.
2017-12-01
Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA-Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia).
In this presentation, we will discuss core features of each of these tools and will present lessons learned on handling large data in the context of data management, analyses and visualization.
Pendleton, Elizabeth A.; Twichell, David C.; Foster, David S.; Worley, Charles R.; Irwin, Barry J.; Danforth, William W.
2011-01-01
Geophysical and geospatial data were collected in the nearshore area surrounding the western Elizabeth Islands, Massachusetts, on the U.S. Geological Survey research vessel Rafael during September 2010 in a collaborative effort between the U.S. Geological Survey and the Massachusetts Office of Coastal Zone Management. This report describes the results of the short-term goals of this collaborative effort, which were to map the geology of the inner shelf zone of the western Elizabeth Islands and study the geologic processes that have contributed to its evolution. Data collected during the survey include bathymetric and sidescan-sonar data, chirp seismic-reflection data, sound velocity profiles, and navigation data. The long-term goals of this project are to provide high-resolution geophysical data that will support research on the influence of sea-level change and sediment supply on coastal evolution, and to inventory subtidal marine habitat types and distribution within the coastal zone of Massachusetts.
Proceedings of the 2006 Civil Commercial Imagery Evaluation Workshop
NASA Technical Reports Server (NTRS)
Stanley, Thomas; Pagnutti, Mary
2007-01-01
The Joint Agency Commercial Imagery Evaluation (JACIE) team is a collaborative interagency working group formed to leverage different government agencies' capabilities for the characterization of commercial remote sensing products. The team is composed of staff from the National Aeronautics and Space Administration (NASA), the National Geospatial-Intelligence Agency (NGA), and the U.S. Geological Survey (USGS). Each JACIE agency has a vested interest in the purchase and use of commercial imagery to support government research and operational applications. The intent of the 2006 workshop is to exchange information regarding the characterization and application of commercial imagery used by the government. The main focus of previous workshops has been on high-resolution satellite imagery from systems such as IKONOS (Space Imaging, Inc.), QuickBird (DigitalGlobe, Inc.), and OrbView-3 (ORBIMAGE). This workshop is being expanded to cover all civil medium- and high-resolution commercial imagery used by the government.
Web Map Services (WMS) Global Mosaic
NASA Technical Reports Server (NTRS)
Percivall, George; Plesea, Lucian
2003-01-01
The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate, with 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
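Using the mosaic as a layer comes down to issuing standard WMS GetMap requests. The sketch below builds such a request URL under the WMS 1.3.0 parameter set; the base URL and layer name are placeholders, not the actual mosaic's endpoint.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width, height,
               crs="EPSG:4326", fmt="image/jpeg"):
    """Build a WMS 1.3.0 GetMap request URL. Any WMS client retrieves a
    rendered map image with this kind of request."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        # bbox as min,min,max,max in the axis order mandated by the CRS
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = getmap_url("https://example.org/wms", "global_mosaic",
                 (34.0, -120.0, 36.0, -118.0), 512, 512)
print(url)
```

Because the interface is standardized, the same request shape works against any conformant WMS server, which is exactly what makes the mosaic reusable as a layer across organizations.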
Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan
2018-01-01
In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of lives. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground-failure susceptibility. Although calibrated on historical events of moderate magnitude, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework. This study has high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.
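Models of this family typically combine shaking intensity and susceptibility proxies in a logistic regression. The sketch below shows that general form with invented proxies and coefficients; the operational Swiss model uses its own calibrated covariates, which are not reproduced here.

```python
import math

def landslide_probability(pga_g, slope_deg, wetness, coeffs):
    """Probability of earthquake-induced landsliding from a logistic model
    over peak ground acceleration (in g) and geospatial susceptibility
    proxies. Proxies and coefficients here are illustrative placeholders."""
    b0, b_pga, b_slope, b_wet = coeffs
    z = b0 + b_pga * math.log(pga_g) + b_slope * slope_deg + b_wet * wetness
    return 1.0 / (1.0 + math.exp(-z))

coeffs = (-6.0, 1.5, 0.08, 0.5)  # hypothetical calibration, not the paper's
p_flat = landslide_probability(0.1, 5.0, 0.2, coeffs)    # gentle, dry, weak shaking
p_steep = landslide_probability(0.4, 35.0, 0.8, coeffs)  # steep, wet, strong shaking
print(p_flat, p_steep)  # the steep, strongly shaken cell is far likelier to fail
```

Evaluating such a function per ShakeMap grid cell is what makes near-real-time, map-wide likelihood estimates feasible immediately after an event.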
Information from imagery: ISPRS scientific vision and research agenda
NASA Astrophysics Data System (ADS)
Chen, Jun; Dowman, Ian; Li, Songnian; Li, Zhilin; Madden, Marguerite; Mills, Jon; Paparoditis, Nicolas; Rottensteiner, Franz; Sester, Monika; Toth, Charles; Trinder, John; Heipke, Christian
2016-05-01
With the increased availability of very high-resolution satellite imagery, terrain based imaging and participatory sensing, inexpensive platforms, and advanced information and communication technologies, the application of imagery is now ubiquitous, playing an important role in many aspects of life and work today. As a leading organisation in this field, the International Society for Photogrammetry and Remote Sensing (ISPRS) has been devoted to effectively and efficiently obtaining and utilising information from imagery since its foundation in the year 1910. This paper examines the significant challenges currently facing ISPRS and its communities, such as providing high-quality information, enabling advanced geospatial computing, and supporting collaborative problem solving. The state-of-the-art in ISPRS related research and development is reviewed and the trends and topics for future work are identified. By providing an overarching scientific vision and research agenda, we hope to call on and mobilise all ISPRS scientists, practitioners and other stakeholders to continue improving our understanding and capacity on information from imagery and to deliver advanced geospatial knowledge that enables humankind to better deal with the challenges ahead, posed for example by global change, ubiquitous sensing, and a demand for real-time information generation.
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the Tagged Image File Format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements the tiled WMS protocol and super-overlay KML for high-performance client application programs.
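At the protocol level, a WMS server such as the one described above is driven by HTTP GET requests with standardized parameters. A minimal sketch of composing a GetMap request; the endpoint and layer name here are hypothetical placeholders, not the actual OnMoon configuration:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   fmt="image/jpeg", srs="EPSG:4326"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the given spatial reference system.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("http://onmoon.example/wms", "lunar_base_mosaic",
                     (-180, -90, 180, 90), 1024, 512)
```

Any WMS client, including the KML applications mentioned elsewhere in these abstracts, ultimately issues requests of this shape.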
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
Earthquake Damage Assessment Using Very High Resolution Satelliteimagery
NASA Astrophysics Data System (ADS)
Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.
In recent years, various studies have applied satellite imagery to assess damage from natural hazards, most of them analyzing floods, hurricanes or landslides. For earthquakes, the medium or low spatial resolution data available in the recent past did not allow reliable identification of damage, because the elements at risk (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable detection of damage to the elements at risk possible. Remote sensing techniques applied to IKONOS (1-meter resolution) and IRS (5-meter resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to rescue teams deployed in the affected zone, to better coordinate emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.
Zelenyuk, Alla; Imre, Dan; Wilson, Jacqueline; Zhang, Zhiyuan; Wang, Jun; Mueller, Klaus
2015-02-01
Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles: two fundamental properties that determine an aerosol's optical properties and its ability to serve as cloud condensation or ice nuclei. Here we present our aircraft-compatible single-particle mass spectrometers, SPLAT II and its new, miniaturized version, miniSPLAT, which measure, in situ and in real time, the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. Although miniSPLAT's size, weight, and power consumption are significantly smaller, its performance is on par with SPLAT II. Both instruments operate in dual data acquisition mode to measure, in addition to single-particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single-particle mass spectrometers, along with other aerosol and cloud characterization instruments on board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geospatial context, using the interactive and fully coupled Google Earth and Parallel Coordinates displays. Here we illustrate the utility of ND-Scope to visualize the spatial distribution of atmospheric particles of different compositions, and to explore the relationship between individual particle compositions and their activity as cloud condensation nuclei.
NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION
NASA Astrophysics Data System (ADS)
Hogan, P.; Kuehnel, F.
2006-12-01
NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).
NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*
NASA Astrophysics Data System (ADS)
Hogan, P.; Coughlan, J.
2006-12-01
NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).
NASA Astrophysics Data System (ADS)
Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.
2015-12-01
A big data geospatial analytics platform: Physical Analytics Information Repository and Services (PAIRS). Fernando Marianno, Levente Klein, Siyuan Lu, Conrad Albrecht, Marcus Freitag, Nigel Hinds, Hendrik Hamann, IBM T.J. Watson Research Center, Yorktown Heights, NY 10598. A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Information and Services (PAIRS) was developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive levels. Each pixel on the PAIRS grid has an index that combines location and time stamp. The indexing allows quick access to data sets that are part of a global data layer, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is its ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies.
Multi-layer queries enable PAIRS to filter different data layers based on specific conditions (e.g., analyzing the flooding risk of a property based on topography, the soil's ability to hold water, and forecasted precipitation) or to retrieve information about locations that share similar weather and vegetation patterns during extreme weather events such as heat waves.
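The pixel-index idea sketched above (a sortable key combining a spatial cell with a time stamp, on nested grids whose resolution doubles per level) can be illustrated as follows. The actual PAIRS key layout is not given in the abstract, so this encoding is purely an assumed illustration:

```python
def grid_cell(lat, lon, level):
    """Map a lat/lon to integer cell coordinates on a nested grid whose
    resolution doubles with each level (level 0 = 1 cell per degree)."""
    cells_per_degree = 2 ** level
    row = int((lat + 90.0) * cells_per_degree)
    col = int((lon + 180.0) * cells_per_degree)
    return row, col

def pixel_key(lat, lon, level, timestamp):
    """Combine spatial cell and time stamp into a single sortable key.

    Illustrative only; not the actual PAIRS index format. Keys for the
    same cell sort chronologically, so a range scan over one cell
    retrieves only the time window of interest.
    """
    row, col = grid_cell(lat, lon, level)
    return f"{level:02d}-{row:06d}-{col:06d}-{timestamp:012d}"
```

In a column-oriented store like HBase, row keys built this way cluster a cell's observations together on disk, which is what makes "retrieve only the data of interest" cheap.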
Solar Data | Geospatial Data Science | NREL
Coordinate system: WGS_1984. Coverage: Lower 48 and Hawaii DNI, 10-km resolution, 1998-2009 | File: Zip, 9.6 MB | Last updated: 09/12/2012 | Metadata: Direct Normal.xml | KMZ file: Direct Normal.kmz. Coverage: Lower 48 and Hawaii GHI, 10-km : Coordinate system: GCS_North_American_1983. Coverage: Lower 48 DNI, 10-km resolution, 1998-2005 | File: Zip, 9.1 MB | Last updated: 12
NASA Astrophysics Data System (ADS)
Postadjian, T.; Le Bris, A.; Sahbi, H.; Mallet, C.
2017-05-01
Semantic classification is a core remote sensing task as it provides the fundamental input for land-cover map generation. The very recent literature has shown the superior performance of deep convolutional neural networks (DCNN) for many classification tasks, including the automatic analysis of Very High Spatial Resolution (VHR) geospatial images. Most recent initiatives have focused on very high discrimination capacity combined with accurate object boundary retrieval. Therefore, current architectures are perfectly tailored to urban areas over restricted extents, but are not designed for large-scale purposes. This paper presents an end-to-end automatic processing chain, based on DCNNs, that aims at performing large-scale classification of VHR satellite images (here SPOT 6/7). Since this work assesses, through various experiments, the potential of DCNNs for country-scale VHR land-cover map generation, a simple yet effective architecture is proposed that efficiently discriminates the main classes of interest (namely buildings, roads, water, crops, and vegetated areas) by exploiting existing VHR land-cover maps for training.
Mapping informal small-scale mining features in a data-sparse tropical environment with a small UAS
Chirico, Peter G.; Dewitt, Jessica D.
2017-01-01
This study evaluates the use of a small unmanned aerial system (UAS) to collect imagery over artisanal mining sites in West Africa. The purpose of this study is to consider how very high-resolution imagery and digital surface models (DSMs) derived from structure-from-motion (SfM) photogrammetric techniques from a small UAS can fill the gap in geospatial data collection between satellite imagery and data gathered during field work to map and monitor informal mining sites in tropical environments. The study compares both wide-angle and narrow field of view camera systems in the collection and analysis of high-resolution orthoimages and DSMs of artisanal mining pits. The results of the study indicate that UAS imagery and SfM photogrammetric techniques permit DSMs to be produced with a high degree of precision and relative accuracy, but highlight the challenges of mapping small artisanal mining pits in remote and data sparse terrain.
NASA Astrophysics Data System (ADS)
Tabrizian, P.; Petrasova, A.; Baran, P.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.
2017-12-01
Viewshed modelling (the process of defining, parsing, and analysing the structure of landscape visual space within a GIS) has been commonly used in applications ranging from landscape planning and ecosystem services assessment to geography and archaeology. However, less effort has been made to understand whether, and to what extent, these objective analyses predict the actual on-the-ground perception of a human observer. Moreover, viewshed modelling at the human-scale level requires incorporation of fine-grained landscape structure (e.g., vegetation) and patterns (e.g., land cover) that are typically omitted from visibility calculations or unrealistically simulated, leading to significant error in predicting visual attributes. This poster illustrates how photorealistic immersive virtual environments and high-resolution geospatial data can be used to integrate objective and subjective assessments of visual characteristics at the human-scale level. We performed viewshed modelling for a systematically sampled set of viewpoints (N=340) across an urban park using open-source GIS (GRASS GIS). For each point a binary viewshed was computed on a 3D surface model derived from high-density leaf-off LIDAR (QL2) points. The viewshed map was combined with high-resolution land cover (0.5 m) derived through fusion of orthoimagery, lidar vegetation, and vector data. Geostatistics and landscape structure analysis were performed to compute topological and compositional metrics for visual scale (e.g., openness), complexity (pattern, shape and object diversity), and naturalness. Based on the viewshed model output, a sample of 24 viewpoints representing the variation of visual characteristics were selected and geolocated. For each location, 360° imagery was captured using a DSLR camera mounted on a GigaPan robot.
We programmed a virtual reality application through which human subjects (N=100) immersively experienced a random selection of the captured environments via a head-mounted display (Oculus Rift CV1), and rated each location on perceived openness, naturalness, and complexity. Regression models were fit to correlate model outputs with participants' responses. The results indicated strong, significant correlations for openness and naturalness, and a moderate correlation for complexity estimations.
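In its simplest form, a compositional metric like the openness discussed above reduces to the fraction of cells marked visible in the binary viewshed around a viewpoint. A minimal pure-Python sketch of that reduction (the study itself used GRASS GIS viewshed rasters and fuller landscape-structure metrics, not this toy grid):

```python
def openness(viewshed):
    """Fraction of visible cells in a binary viewshed grid.

    viewshed: list of rows, each a list of 0/1 visibility flags,
    such as might be exported from a GIS binary viewshed raster.
    Returns 0.0 for an empty grid.
    """
    total = sum(len(row) for row in viewshed)
    visible = sum(sum(row) for row in viewshed)
    return visible / total if total else 0.0

# Toy 3x4 binary viewshed: 6 of 12 cells visible from the viewpoint.
vs = [[1, 1, 0, 0],
      [1, 1, 0, 0],
      [1, 1, 0, 0]]
```

Metrics of this kind, computed per viewpoint, are what the regression models correlate against the human ratings.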
NASA Astrophysics Data System (ADS)
Miles, B.; Chepudira, K.; LaBar, W.
2017-12-01
The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) support, STA is also well suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful in environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale, high-resolution, real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing customization by adopters, who can choose to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers such as Azure, Amazon, and Google.
The scalable, flexible, and cloud-friendly architecture of Kinota makes it ideal for use in next-generation large-scale, high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as in management applications such as flood early warning and regulatory enforcement.
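The OData-based query language mentioned above lets a client filter Observations server-side rather than downloading whole time series. A sketch of composing such a SensorThings request; the host and Datastream id are hypothetical placeholders:

```python
from urllib.parse import quote

def sta_observations_url(base, datastream_id, since_iso, top=100):
    """Build a SensorThings API request for recent Observations of a
    Datastream, using an OData $filter on phenomenonTime."""
    flt = f"phenomenonTime ge {since_iso}"
    return (f"{base}/Datastreams({datastream_id})/Observations"
            f"?$filter={quote(flt)}"
            f"&$orderby=phenomenonTime%20desc&$top={top}")

# Hypothetical endpoint; 42 is an assumed Datastream id.
url = sta_observations_url("https://sta.example/v1.0", 42,
                           "2017-06-01T00:00:00Z")
```

A GET on this URL returns a JSON collection of Observations; the same entities can also be pushed to subscribers over MQTT, per the STA specification.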
A Synopsis of Global Mapping of Freshwater Habitats and Biodiversity: Implications for Conservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A.; Griffiths, Natalie A.; DeRolph, Christopher R.
Accurately mapping freshwater habitats and biodiversity at high resolution across the globe is essential for assessing the vulnerability of and threats to freshwater organisms and for prioritizing conservation efforts. Since the 2000s, extensive efforts have been devoted to mapping global freshwater habitats (rivers, lakes, and wetlands), the spatial representation of which has changed dramatically over time with new geospatial data products and improved remote sensing technologies. Some of these mapping efforts, however, are still coarse representations of actual conditions. Likewise, the resolution and scope of global freshwater biodiversity compilation efforts have also increased, but have yet to mirror the spatial resolution and fidelity of mapped freshwater environments. In our synopsis, we find that efforts to map freshwater habitats have been conducted independently of those for freshwater biodiversity; consequently, there is little congruence in the spatial representation and resolution of the two efforts. We suggest that global species distribution models are needed to fill this information gap; however, limited data on habitat characteristics at scales that complement freshwater habitats have prohibited global high-resolution biogeography efforts. Emerging research trends, such as mapping habitat alteration in freshwater ecosystems and trait biogeography, show great promise in mechanistically linking global anthropogenic stressors to freshwater biodiversity decline and extinction risk.
Birdsong, Timothy W.; Bean, Megan; Grabowski, Timothy B.; Hardy, Thomas B.; Heard, Thomas; Holdstock, Derrick; Kollaus, Kristy; Magnelia, Stephan J.; Tolman, Kristina
2015-01-01
Low-cost unmanned aerial systems (UAS) have recently gained increasing attention in natural resources management due to their versatility and demonstrated utility in collection of high-resolution, temporally-specific geospatial data. This study applied low-cost UAS to support the geospatial data needs of aquatic resources management projects in four Texas rivers. Specifically, a UAS was used to (1) map invasive salt cedar (multiple species in the genus Tamarix) that have degraded instream habitat conditions in the Pease River, (2) map instream meso-habitats and structural habitat features (e.g., boulders, woody debris) in the South Llano River as a baseline prior to watershed-scale habitat improvements, (3) map enduring pools in the Blanco River during drought conditions to guide smallmouth bass removal efforts, and (4) quantify river use by anglers in the Guadalupe River. These four case studies represent an initial step toward assessing the full range of UAS applications in aquatic resources management, including their ability to offer potential cost savings, time efficiencies, and higher quality data over traditional survey methods.
Johnson, Samuel Y.; Cochrane, Guy R.; Golden, Nadine; Dartnell, Peter; Hartwell, Stephen; Cochran, Susan; Watt, Janet
2017-01-01
The California Seafloor and Coastal Mapping Program (CSCMP) is a collaborative effort to develop comprehensive bathymetric, geologic, and habitat maps and data for California's State Waters. CSCMP began in 2007 when the California Ocean Protection Council (OPC) and the National Oceanic and Atmospheric Administration (NOAA) allocated funding for high-resolution bathymetric mapping, largely to support the California Marine Life Protection Act and to update nautical charts. Collaboration and support from the U.S. Geological Survey and other partners has led to development and dissemination of one of the world's largest seafloor-mapping datasets. CSCMP provides essential science and data for ocean and coastal management, stimulates and enables research, and raises public education and awareness of coastal and ocean issues. Specific applications include:•Delineation and designation of marine protected areas•Characterization and modeling of benthic habitats and ecosystems•Updating nautical charts•Earthquake hazard assessments•Tsunami hazard assessments•Planning offshore infrastructure•Providing baselines for monitoring change•Input to models of sediment transport, coastal erosion, and coastal flooding•Regional sediment management•Understanding coastal aquifers•Providing geospatial data for emergency response
KML Super Overlay to WMS Translator
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This translator is a server-based application that automatically generates the KML super-overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it can also generate a super-overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
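The load-on-demand subdivision that makes super overlays scale is a quadtree over the bounding box: each region splits into four children, each fetched (as a KML NetworkLink pointing at a WMS tile) only when the viewer zooms far enough in. A sketch of the subdivision step alone, with the KML serialization omitted:

```python
def subdivide(bbox):
    """Split a (west, south, east, north) bounding box into the four
    quadrants of a super-overlay quadtree, ordered NW, NE, SW, SE."""
    west, south, east, north = bbox
    midx = (west + east) / 2.0
    midy = (south + north) / 2.0
    return [
        (west, midy, midx, north),  # NW quadrant
        (midx, midy, east, north),  # NE quadrant
        (west, south, midx, midy),  # SW quadrant
        (midx, south, east, midy),  # SE quadrant
    ]

# One subdivision of the whole globe yields the four hemispheric quadrants.
children = subdivide((-180.0, -90.0, 180.0, 90.0))
```

Applied recursively, each level doubles the effective resolution while the client only ever holds the tiles for the current view.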
ERIC Educational Resources Information Center
Gaudet, Cyndi; Annulis, Heather; Kmiec, John
2010-01-01
The Geospatial Technology Apprenticeship Program (GTAP) pilot was designed as a replicable and sustainable program to enhance workforce skills in geospatial technologies to best leverage a $30 billion market potential. The purpose of evaluating GTAP was to ensure that investment in this high-growth industry was adding value. Findings from this…
NASA Astrophysics Data System (ADS)
Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.
2012-04-01
The rapid development of advanced smart communication tools with good-quality, high-resolution video cameras, audio and GPS devices in the last few years will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnection of these "Future Internet Things" forms a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will contribute greatly to the study of fauna and flora in the near future, particularly of the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is derived, with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the intelligent management of environmental data with tagged contextual geospatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimation. The returned identification of species is further improved using ground-truth corrections and learning by the specific enablers.
Geophysical Data from Offshore of the Chandeleur Islands, Eastern Mississippi Delta
Baldwin, Wayne E.; Pendleton, Elizabeth A.; Twichell, David C.
2009-01-01
This report contains the geophysical and geospatial data that were collected during two cruises on the R/V Acadiana along the eastern, offshore side of the Chandeleur Islands in 2006 and 2007. Data were acquired with the following equipment: a Systems Engineering and Assessment, Ltd., SwathPlus interferometric sonar; a Klein 3000 dual-frequency sidescan sonar; and an EdgeTech 512i chirp sub-bottom profiling system. The long-term goal of this mapping effort is to produce high-quality, high-resolution geologic maps and geophysical interpretations that can be utilized to investigate the impact of Hurricane Katrina, identify sand resources within the region, and make predictions regarding the future evolution of this coastal system.
Geospatial analysis based on GIS integrated with LADAR.
Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim
2013-10-07
In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.
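The line-of-sight analysis described above can be illustrated by sampling terrain heights along the ray between observer and target and checking whether any intermediate cell rises above the sight line. A simplified sketch on a toy 1-D height profile (the actual system works on 3-D LADAR-derived surfaces, not this reduction):

```python
def has_line_of_sight(heights, i, j, eye=2.0):
    """Check visibility between positions i and j along a 1-D terrain
    profile, with the observer's eye `eye` units above the ground at i.

    Returns False if any intermediate terrain sample rises above the
    straight sight line (linear interpolation between endpoints).
    """
    if i > j:
        i, j = j, i
    h0 = heights[i] + eye
    h1 = heights[j]
    for k in range(i + 1, j):
        t = (k - i) / (j - i)            # fractional distance along the ray
        line_h = h0 + t * (h1 - h0)      # sight-line height at sample k
        if heights[k] > line_h:
            return False
    return True

profile = [0, 1, 5, 1, 0]  # a ridge at index 2 blocks the long view
```

A path planner queries checks like this cell-by-cell to score candidate routes for exposure or concealment.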
The development of effective measures to stabilize atmospheric CO2 concentration and mitigate negative impacts of climate change requires accurate quantification of the spatial variation and magnitude of the terrestrial carbon (C) flux. However, the spatial pattern and strengt...
Merging climate and multi-sensor time-series data in real-time drought monitoring across the U.S.A.
Brown, Jesslyn F.; Miura, T.; Wardlow, B.; Gu, Yingxin
2011-01-01
Droughts occur repeatedly in the United States, resulting in billions of dollars of damage. Monitoring and reporting on drought conditions is a necessary function of government agencies at multiple levels. A team of Federal and university partners developed a drought decision-support tool with higher spatial resolution relative to traditional climate-based drought maps. The Vegetation Drought Response Index (VegDRI) indicates general canopy vegetation condition through assimilation of climate, satellite, and biophysical data via geospatial modeling. In VegDRI, complementary drought-related data are merged to provide a comprehensive, detailed representation of drought stress on vegetation. Time-series data from daily polar-orbiting Earth observing systems [Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS)] providing global measurements of land surface conditions are ingested into VegDRI. Inter-sensor compatibility is required to extend multi-sensor data records; thus, translations were developed using overlapping observations to create consistent, long-term time series.
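An inter-sensor translation of the kind mentioned above amounts to fitting a transfer function on overlapping observations, e.g. regressing one sensor's vegetation index on the other's over the period both sensors flew. A minimal ordinary-least-squares sketch in pure Python; the NDVI values below are illustrative made-up numbers, not actual AVHRR/MODIS data:

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit ys ~ a*xs + b, the simplest form of an
    inter-sensor translation function."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Illustrative overlapping observations from two sensors.
avhrr = [0.20, 0.35, 0.50, 0.65, 0.80]
modis = [0.25, 0.41, 0.55, 0.72, 0.86]
a, b = fit_linear(avhrr, modis)

# Translate the older sensor's record onto the newer sensor's scale.
translated = [a * v + b for v in avhrr]
```

Operational translations are typically more elaborate (per-biome, per-season, or nonlinear), but the principle of calibrating on the overlap period is the same.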
NASA Astrophysics Data System (ADS)
Johnson, S. Y.; Cochrane, G. R.; Golden, N. E.; Dartnell, P.; Hartwell, S. R.; Cochran, S. A.; Watt, J. T.
2017-12-01
The California Seafloor Mapping Program (CSMP) is a collaborative effort to develop comprehensive bathymetric, geologic, and habitat maps and data for California's State Waters, which extend for 1,350 km from the shoreline to 5.6 km offshore. CSMP began in 2007 when the California Ocean Protection Council and NOAA allocated funding for high-resolution bathymetric mapping to support the California Marine Life Protection Act and update nautical charts. Collaboration and support from the USGS and other partners has led to development and dissemination of one of the world's largest seafloor-mapping datasets. CSMP data collection includes: (1) high-resolution bathymetric and backscatter mapping using swath sonar sensors; (2) "ground-truth" imaging from a sled mounted with video and still cameras; and (3) high-resolution seismic-reflection profiling at 1-km line spacing. Processed data are all publicly available. Additionally, 25 USGS maps and datasets covering one third of California's coast have been published. Each publication contains 9 to 12 PDF map sheets (1:24,000 scale), an explanatory pamphlet, and a catalog of digital geospatial data layers (about 15 to 25 per map area) with web services. Map sheets display bathymetry, backscatter, perspective views, habitats, ground-truth imagery, seismic profiles, sediment distribution and thickness, and onshore-offshore geology. The CSMP goal is to serve a large constituency, ranging from senior GIS analysts in large agencies, to local governments with limited resources, to non-governmental organizations, the private sector, and concerned citizens. CSMP data and publications provide essential science and data for ocean and coastal management, stimulate and enable research, and raise public education and awareness of coastal and ocean issues.
Specific applications include: delineation and designation of marine protected areas; characterization and modeling of benthic habitats and ecosystems; updating nautical charts; earthquake hazard assessments; tsunami hazard assessments; planning and developing offshore infrastructure; providing baselines for monitoring change; input to models of sediment transport, coastal erosion, and coastal flooding; regional sediment management; understanding coastal aquifers; and emergency (e.g., oil spill) response.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zelenyuk, Alla; Imre, D.; Wilson, Jacqueline M.
2015-02-01
Understanding the effect of aerosols on climate requires knowledge of the size and chemical composition of individual aerosol particles, two fundamental properties that determine an aerosol's optical properties and ability to serve as cloud condensation or ice nuclei. Here we present miniSPLAT, our new aircraft-compatible single particle mass spectrometer, which measures in situ and in real time the size and chemical composition of individual aerosol particles with extremely high sensitivity, temporal resolution, and sizing precision on the order of a monolayer. miniSPLAT operates in a dual data acquisition mode to measure, in addition to single particle size and composition, particle number concentrations, size distributions, density, and asphericity with high temporal resolution. Compared to our previous instrument, SPLAT II, miniSPLAT has been significantly reduced in size, weight, and power consumption without loss of performance. We also present ND-Scope, our newly developed interactive visual analytics software package. ND-Scope is designed to explore and visualize the vast amount of complex, multidimensional data acquired by our single particle mass spectrometers, along with other aerosol and cloud characterization instruments on board aircraft. We demonstrate that ND-Scope makes it possible to visualize the relationships between different observables and to view the data in a geospatial context, using the interactive and fully coupled Google Earth and Parallel Coordinates displays. Here we illustrate the utility of ND-Scope to visualize the spatial distribution of atmospheric particles of different compositions, and to explore the relationship between individual particle composition and activity as cloud condensation nuclei.
Data for Renewable Energy Planning, Policy, and Investment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sarah L
Reliable, robust, and validated data are critical for informed planning, policy development, and investment in the clean energy sector. The Renewable Energy (RE) Explorer was developed to support data-driven renewable energy analysis that can inform key renewable energy decisions globally. This document presents the types of geospatial and other data at the core of renewable energy analysis and decision making. Individual data sets used to inform decisions vary in spatial and temporal resolution, quality, and overall usefulness. From Data to Decisions, a complementary geospatial data and analysis decision guide, provides an in-depth view of these and other considerations to enable data-driven planning, policymaking, and investment. Data support a wide variety of renewable energy analyses and decisions, including technical and economic potential assessment, renewable energy zone analysis, grid integration, risk and resiliency identification, electrification, and distributed solar photovoltaic potential. This fact sheet provides information on the types of data that are important for renewable energy decision making using the RE Data Explorer or similar geospatial analysis tools.
High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals
NASA Astrophysics Data System (ADS)
WANG, X.; Huang, G.
2017-12-01
Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer-resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult due to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, how to understand and interpret these large-volume climate data and how to use them to drive impact assessment and adaptation studies are still challenging for both impact researchers and decision makers. Such difficulties have become major barriers preventing informed climate change adaptation planning at regional scales. Therefore, this research will explore new GIS- and web-based technologies to help visualize large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named the Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers functions of visual representation through geospatial maps and data downloading for a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25 to 50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.
Airborne multicamera system for geo-spatial applications
NASA Astrophysics Data System (ADS)
Bachnak, Rafic; Kulkarni, Rahul R.; Lyle, Stacey; Steidley, Carl W.
2003-08-01
Airborne remote sensing has many applications that include vegetation detection, oceanography, marine biology, geographical information systems, and environmental coastal science analysis. Remotely sensed images, for example, can be used to study the aftermath of episodic events such as the hurricanes and floods that occur year-round in the coastal bend area of Corpus Christi. This paper describes an Airborne Multi-Spectral Imaging System that uses digital cameras to provide high-resolution imagery at very high rates. The software is based on Delphi 5.0 and IC Imaging Control's ActiveX controls. Both time and GPS coordinates are recorded. Three successful test flights have been conducted so far. The paper presents flight test results and discusses the issues being addressed to fully develop the system.
NASA Astrophysics Data System (ADS)
Tsai, F.; Chen, L.-C.
2014-04-01
During the past decade, Taiwan has experienced unusually fast growth in the mapping, remote sensing, and spatial information industries and related markets. A successful space program and dozens of advanced airborne and ground-based remote sensing instruments, as well as mobile mapping systems, have been implemented and put into operation to support the vast demand for geospatial data acquisition. Moreover, in addition to government agencies and research institutes, there are tens of companies in the private sector providing geospatial data and services. However, the fast-developing industry also poses a great challenge to the education sector in Taiwan, especially higher education for geospatial information. Facing this fast-developing industry, the demands for skilled professionals and for new technologies to address diversified needs are indubitably high. Consequently, while benefiting from the expansion and prosperity of the fast-growing industry, fulfilling these demands has become a challenge for the remote sensing and spatial information disciplines in Taiwan's higher education institutes. This paper provides a brief insight into the status of the remote sensing and spatial information industry in Taiwan, as well as the challenges of education and technology transfer in supporting the increasing demands and ensuring the continuous development of the industry. In addition to reporting the current status of remote sensing and spatial information courses and programs in colleges and universities, current and potential threats and possible resolutions are discussed from different points of view.
NASA Astrophysics Data System (ADS)
Clark, E. P.; Cosgrove, B.; Salas, F.
2016-12-01
As a significant step forward in transforming NOAA's water prediction services, NOAA plans to implement a new National Water Model (NWM) Version 1.0 in August 2016. A continental-scale water resources model, the NWM is an evolution of the WRF-Hydro architecture developed by the National Center for Atmospheric Research (NCAR). The NWM will provide analyses and forecasts of flow for the 2.7 million stream reaches nationwide in the National Hydrography Dataset Plus v2 (NHDPlusV2), jointly developed by the USGS and EPA. The NWM also produces high-resolution water budget variables of snow, soil moisture, and evapotranspiration on a 1-km grid. NOAA's stakeholders require additional decision support applications to be built on these data. The Geo-intelligence Division of the Office of Water Prediction is building new products and services that integrate output from the NWM with geospatial datasets, such as infrastructure and demographics, to better estimate the impacts of dynamic water resource states on community resiliency. This presentation will detail the methods and underlying information used to produce prototype water resources intelligence that is timely, actionable, and credible. Moreover, it will explore the NWM's capability to support sector-specific decision support services.
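The integration the abstract describes, joining NWM reach forecasts with infrastructure layers, can be sketched minimally as a keyed join and threshold check (the reach IDs, flows, thresholds, and asset counts below are all hypothetical):

```python
# Hypothetical sketch: flag stream reaches (keyed by NHDPlusV2-style IDs) where
# forecast flow exceeds a flood threshold, and join in an asset count to gauge
# potential impact. All values are illustrative, not NWM output.
forecast_cms = {1001: 250.0, 1002: 12.0, 1003: 480.0}    # forecast flow per reach
flood_threshold_cms = {1001: 300.0, 1002: 10.0, 1003: 450.0}
assets_near_reach = {1001: 4, 1002: 27, 1003: 9}          # e.g., bridges, intakes

impacted = {
    rid: assets_near_reach[rid]
    for rid, q in forecast_cms.items()
    if q > flood_threshold_cms[rid]
}
# Reaches 1002 and 1003 exceed their thresholds in this toy example.
```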
ArcticDEM: A Publicly Available, High-Resolution Elevation Model of the Arctic
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Bates, Brian; Willamson, Cathleen; Peterman, Kennith
2016-04-01
A Digital Elevation Model (DEM) of the Arctic is needed for a large number of reasons, including measuring and understanding rapid, ongoing changes to the Arctic landscape resulting from climate change and human use, and mitigation and adaptation planning for Arctic communities. The topography of the Arctic is more poorly mapped than most other regions of Earth due to logistical costs and the limits of satellite missions with low-latitude inclinations. A convergence of civilian high-quality sub-meter stereo imagery, petascale computing, and open-source photogrammetry software has made it possible to produce a complete, very high resolution (2- to 8-meter posting) elevation model of the Arctic. A partnership between the US National Geospatial-Intelligence Agency and a team led by the US National Science Foundation-funded Polar Geospatial Center is using stereo imagery from DigitalGlobe's WorldView-1, 2, and 3 satellites and the Ohio State University's Surface Extraction with TIN-based Search-space Minimization (SETSM) software running on the University of Illinois's Blue Waters supercomputer to address this challenge. The final product will be a seamless, 2-m posting digital surface model mosaic of the entire Arctic above 60° North, including all of Alaska, Greenland, and Kamchatka. We will also make available the more than 300,000 individual time-stamped DSM strip pairs that were used to assemble the mosaic. The ArcticDEM will have a vertical precision of better than 0.5 m and can be used to examine changes in land surfaces such as those caused by permafrost degradation or the evolution of arctic rivers and floodplains. The data set can also be used to highlight changing geomorphology due to Earth surface mass transport processes occurring in active volcanic and glacial environments. When complete, the ArcticDEM will catapult the Arctic from the worst to among the best mapped regions on Earth.
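The time-stamped DSM strips lend themselves to change detection by simple differencing against the stated ~0.5 m vertical precision; a minimal numpy sketch (the elevation values are illustrative, and this is not the ArcticDEM team's workflow):

```python
import numpy as np

# Sketch: difference two co-registered DEM strips and keep only changes that
# exceed the ~0.5 m vertical precision stated for the ArcticDEM products.
# Elevation values (meters) are illustrative.
dem_t1 = np.array([[100.0, 101.2], [99.5, 102.0]])
dem_t2 = np.array([[100.1, 100.4], [99.6, 103.1]])

diff = dem_t2 - dem_t1
significant = np.where(np.abs(diff) > 0.5, diff, 0.0)  # suppress sub-precision noise
```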
NASA Astrophysics Data System (ADS)
Noh, M. J.; Howat, I. M.; Porter, C. C.; Willis, M. J.; Morin, P. J.
2016-12-01
The Arctic is undergoing rapid change associated with climate warming. Digital Elevation Models (DEMs) provide critical information for change measurement and infrastructure planning in this vulnerable region, yet the existing quality and coverage of DEMs in the Arctic are poor. Low-contrast and repeatedly-textured surfaces such as snow and glacial ice, as well as mountain shadows, all common in the Arctic, challenge existing stereo-photogrammetric techniques. Submeter-resolution stereoscopic satellite imagery with high geometric and radiometric quality and wide spatial coverage is becoming increasingly accessible to the scientific community. To utilize this imagery for extracting DEMs at a large scale over glaciated and high-latitude regions, we developed the Surface Extraction from TIN-based Search-space Minimization (SETSM) algorithm. SETSM is fully automatic (i.e., no search parameter settings are needed) and uses only the satellite rational polynomial coefficients (RPCs). Using SETSM, we have generated a large number of DEMs (more than 100,000 scene pairs) from WorldView, GeoEye, and QuickBird stereo images collected by DigitalGlobe Inc. and archived by the Polar Geospatial Center (PGC) at the University of Minnesota through an academic licensing program maintained by the US National Geospatial-Intelligence Agency (NGA). SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM program, with the objective of generating high-resolution (2-8 m) topography for the entire Arctic landmass, including seamless DEM mosaics and repeat DEM strips for change detection. ArcticDEM is a collaboration between multiple US universities, governmental agencies, and private companies, as well as international partners assisting with quality control and registration. ArcticDEM is being produced using the petascale Blue Waters supercomputer at the National Center for Supercomputing Applications at the University of Illinois.
In this paper, we introduce the SETSM algorithm and the processing system used for the ArcticDEM project, as well as provide notable examples of ArcticDEM products.
NASA Astrophysics Data System (ADS)
LIU, Yiping; XU, Qing; ZhANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi
2016-11-01
The purpose of this paper is to solve the problems of traditional single-purpose systems for interpretation and draughting, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architectures, and low levels of integration. Based on a comprehensive analysis of target-element composition, map representation, and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale, and multi-resolution geospatial objects is established using HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display, and test evaluation, but also supports the collection, transfer, storage, refreshing, and maintenance of geospatial object data, showing promise and potential for growth.
Revelation of `Hidden' Balinese Geospatial Heritage on A Map
NASA Astrophysics Data System (ADS)
Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.
2018-05-01
Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including 'hidden' geospatial heritage. Tri Hita Karana is a Hinduism concept of life consisting of the human relations to God, to other humans, and to nature (Parahiyangan, Pawongan and Palemahan). Based on it, in terms of geospatial aspects, the Balinese derived their spatial orientation, spatial planning and layout, and measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises of how to reveal this unique and highly valuable geospatial heritage on a map that can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout, and toponymy are well known as elements of a map. There is an opportunity to apply Balinese geospatial heritage in representing these map elements.
Hamel, Perrine; Falinski, Kim; Sharp, Richard; Auerbach, Daniel A; Sánchez-Canales, María; Dennedy-Frank, P James
2017-02-15
Geospatial models are commonly used to quantify sediment contributions at the watershed scale. However, the sensitivity of these models to variation in hydrological and geomorphological features, in particular to land use and topography data, remains uncertain. Here, we assessed the performance of one such model, the InVEST sediment delivery model, for six sites comprising a total of 28 watersheds varying in area (6 to 13,500 km²), climate (tropical, subtropical, Mediterranean), topography, and land use/land cover. For each site, we compared uncalibrated and calibrated model predictions with observations and alternative models. We then performed correlation analyses between model outputs and watershed characteristics, followed by sensitivity analyses on the digital elevation model (DEM) resolution. Model performance varied across sites (overall r² = 0.47), but estimates of the magnitude of specific sediment export were as or more accurate than those of global models. We found significant correlations between metrics of sediment delivery and watershed characteristics, including erosivity, suggesting that empirical relationships may ultimately be developed for ungauged watersheds. Model sensitivity to DEM resolution varied across and within sites, but did not correlate with other observed watershed variables. These results were corroborated by sensitivity analyses performed on synthetic watersheds ranging in mean slope and DEM resolution. Our study provides modelers using InVEST or similar geospatial sediment models with practical insights into model behavior and structural uncertainty: first, comparison of model predictions across regions is possible even when environmental conditions differ significantly; second, local knowledge of the sediment budget is needed for calibration; and third, model outputs often show significant sensitivity to DEM resolution. Copyright © 2016 Elsevier B.V. All rights reserved.
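The DEM-resolution sensitivity the authors report can be illustrated generically: block-aggregating a fine DEM to a coarser grid smooths relief and reduces computed slope, which propagates into sediment estimates. A minimal numpy sketch on a synthetic surface (this is not the InVEST workflow itself):

```python
import numpy as np

def aggregate(dem, factor):
    """Block-average a square DEM by an integer factor (i.e., coarsen resolution)."""
    n = dem.shape[0] // factor
    return dem[:n * factor, :n * factor].reshape(n, factor, n, factor).mean(axis=(1, 3))

def mean_abs_slope(dem, cell_size):
    """Mean absolute slope (rise/run) from simple forward differences."""
    dz_x = np.abs(np.diff(dem, axis=1)) / cell_size
    dz_y = np.abs(np.diff(dem, axis=0)) / cell_size
    return (dz_x.mean() + dz_y.mean()) / 2.0

# Synthetic rough terrain: coarsening smooths relief, flattening computed slope.
rng = np.random.default_rng(0)
fine = rng.normal(0.0, 10.0, size=(64, 64))   # hypothetical 10 m cells
coarse = aggregate(fine, 4)                    # aggregated to 40 m cells
s_fine = mean_abs_slope(fine, 10.0)
s_coarse = mean_abs_slope(coarse, 40.0)
```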
Mapping and monitoring of crop intensity, calendar and irrigation using multi-temporal MODIS data
NASA Astrophysics Data System (ADS)
Xiao, X.; Boes, S.; Mulukutla, G.; Proussevitch, A.; Routhier, M.
2005-12-01
Agriculture is the most extensive land use and water use on the Earth. Because of the diverse range of natural environments and human needs, agriculture is also the most complicated land use and water use system, which poses an enormous challenge to the scientific community, the public and decision-makers. Updated and geo-referenced information on crop intensity (number of crops per year), calendar (planting date, harvesting date) and irrigation is critically needed to better understand the impacts of agriculture on biogeochemical cycles (e.g., carbon, nitrogen, trace gases), water and climate dynamics. Here we present an effort to develop a novel approach for mapping and monitoring crop intensity, calendar and irrigation, using multi-temporal Moderate Resolution Imaging Spectroradiometer (MODIS) image data. Our algorithm employed three vegetation indices that are sensitive to the seasonal dynamics of leaf area index, light absorption by leaf chlorophyll and land surface water content. Our objective is to generate geospatial databases of crop intensity, calendar and irrigation at 500-m spatial resolution and at 8-day temporal resolution. In this presentation, we report a preliminary geospatial dataset of paddy rice crop intensity, calendar and irrigation in Asia, which is developed from the 8-day composite images of MODIS in 2002. The resultant dataset could be used in many applications, including hydrological and climate modeling.
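The abstract does not name the three vegetation indices, but indices sensitive to chlorophyll absorption, leaf area dynamics, and land surface water content are commonly NDVI, EVI, and LSWI; assuming that trio, the standard formulas from surface reflectance are:

```python
# Standard vegetation-index formulas from surface reflectance, as commonly
# paired with MODIS blue, red, NIR (~858 nm), and SWIR (~1640 nm) bands.
# The specific index trio is an assumption; the abstract does not name them.

def ndvi(red, nir):
    return (nir - red) / (nir + red)

def evi(blue, red, nir):
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def lswi(nir, swir):
    return (nir - swir) / (nir + swir)

# Typical green-vegetation reflectances: low red/blue, high NIR, moderate SWIR.
v_ndvi = ndvi(red=0.05, nir=0.45)
v_evi = evi(blue=0.04, red=0.05, nir=0.45)
v_lswi = lswi(nir=0.45, swir=0.25)
```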
NASA Astrophysics Data System (ADS)
Kumar, Pawan; Katiyar, Swati; Rani, Meenu
2016-07-01
We are living in an age of rapidly growing population and changing environmental conditions, with advanced technical capacity. This has resulted in widespread land cover change. One of the main causes of increasing urban heat is that more than half of the world's population lives in rapidly growing urbanized environments. Satellite data can be highly useful for mapping change in land cover and other environmental phenomena over time. Among several human-induced environmental problems, urban thermal problems are reported to negatively affect urban residents in many ways. The built-up structures in urbanized areas considerably alter land cover, thereby affecting thermal energy flow, which leads to the development of elevated surface and air temperatures. The Urban Heat Island (UHI) phenomenon implies an 'island' of high temperature in cities, surrounded by relatively lower temperatures in rural areas. The UHI for the temporal period is estimated using geospatial techniques, which are then utilized to assess the impact on the climate of the surrounding regions and how it reduces the sustainability of natural resources such as air and vegetation. The present paper describes a methodology for resolving dynamic urban heat island change and its climatic impact using a geospatial approach. NDVI maps were generated using daytime LANDSAT ETM+ images from 1990, 2000, and 2013, and the temperature of various land use and land cover categories was estimated. Keywords: NDVI, surface temperature, dynamic changes.
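Surface temperature from ETM+ thermal data is typically retrieved by converting digital numbers to spectral radiance and then to at-sensor brightness temperature; a hedged sketch using the published Landsat 7 thermal constants (K1 ≈ 666.09, K2 ≈ 1282.71), with illustrative gain/bias values, not the calibration of any particular scene:

```python
import math

# Sketch of the usual two-step ETM+ band 6 retrieval:
#   DN -> spectral radiance -> at-sensor brightness temperature (Kelvin).
K1 = 666.09    # W/(m^2 sr um), Landsat 7 ETM+ thermal constant
K2 = 1282.71   # Kelvin, Landsat 7 ETM+ thermal constant

def dn_to_radiance(dn, gain=0.067087, bias=-0.06709):
    """Linear DN-to-radiance conversion; gain/bias here are illustrative."""
    return gain * dn + bias

def brightness_temperature(radiance):
    return K2 / math.log(K1 / radiance + 1.0)

# A mid-range DN yields a plausible warm-surface temperature (~304 K).
t_kelvin = brightness_temperature(dn_to_radiance(150))
```

Emissivity correction (often NDVI-based) would follow to convert brightness temperature to land surface temperature.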
Path Network Recovery Using Remote Sensing Data and Geospatial-Temporal Semantic Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
William C. McLendon III; Brost, Randy C.
Remote sensing systems produce large volumes of high-resolution images that are difficult to search. The GeoGraphy (pronounced Geo-Graph-y) framework [2, 20] encodes remote sensing imagery into a geospatial-temporal semantic graph representation to enable high-level semantic searches. Typically, scene objects such as buildings and trees tend to be shaped like blocks with few holes, but shapes generated from path networks tend to have a large number of holes and can span a large geographic region due to their connectedness. For example, we have a dataset covering the city of Philadelphia in which there is a single road network node spanning a 6 mile x 8 mile region. Even a simple question such as "find two houses near the same street" might give unexpected results. More generally, nodes arising from networks of paths (roads, sidewalks, trails, etc.) require additional processing to make them useful for searches in GeoGraphy. We have assigned the term Path Network Recovery to this process. Path Network Recovery is a three-step process involving (1) partitioning the network node into segments, (2) repairing broken path segments interrupted by occlusions or sensor noise, and (3) adding path-aware search semantics into GeoQuestions. This report covers the path network recovery process, how it is used, and some example use cases of the current capabilities.
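Step (2), repairing segments broken by occlusions, can be illustrated by bridging segment endpoints that fall within a small tolerance; a toy sketch (coordinates, tolerance, and the greedy merge strategy are hypothetical, not the GeoGraphy implementation):

```python
import math

# Toy sketch of gap repair for path segments: join segments whose endpoints
# lie within a tolerance, as occlusions (trees, shadows) often split one road
# into fragments. Coordinates and tolerance are illustrative.
def repair_gaps(segments, tol=5.0):
    """Each segment is (start, end) with (x, y) tuples. Greedily merge a
    segment with the previous one when the gap between them is <= tol."""
    merged = [segments[0]]
    for seg in segments[1:]:
        last_start, last_end = merged[-1]
        if math.dist(last_end, seg[0]) <= tol:
            merged[-1] = (last_start, seg[1])   # bridge the occlusion
        else:
            merged.append(seg)
    return merged

# Two road fragments separated by a 3-unit occlusion become one segment;
# the distant third fragment stays separate.
roads = [((0, 0), (10, 0)), ((13, 0), (25, 0)), ((100, 50), (120, 50))]
repaired = repair_gaps(roads)
```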
Grill, Günther; Khan, Usman; Lehner, Bernhard; Nicell, Jim; Ariwi, Joseph
2016-01-15
Chemicals released into freshwater systems threaten ecological functioning and may put aquatic life and the health of humans at risk. We developed a new contaminant fate model (CFM) that follows simple, well-established methodologies and is unique in its cross-border, seamless hydrological and geospatial framework, including lake routing, a critical component in northern environments. We validated the model using the pharmaceutical Carbamazepine and predicted eco-toxicological risk for 15 pharmaceuticals in the Saint-Lawrence River Basin, Canada. The results indicated negligible to low environmental risk for the majority of tested chemicals, while two pharmaceuticals showed elevated risk in up to 13% of rivers affected by municipal effluents. As an integrated model, our CFM is designed for application at very large scales with the primary goal of detecting high risk zones. In regulatory frameworks, it can help screen existing or new chemicals entering the market regarding their potential impact on human and environmental health. Due to its high geospatial resolution, our CFM can also facilitate the prioritization of actions, such as identifying regions where reducing contamination sources or upgrading treatment plants is most pertinent to achieve targeted pollutant removal or to protect drinking water resources. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
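The CFM's internal equations are not given in the abstract; as a generic illustration, simple contaminant fate models commonly route a load downstream with first-order decay per reach. A hedged sketch (rate constant and travel times are hypothetical):

```python
import math

# Hypothetical sketch of downstream contaminant routing with first-order
# decay, a common backbone of simple contaminant fate models (not the
# authors' code): C_out = C_in * exp(-k * tau) per reach.
def route_load(load_kg_day, reach_travel_times_days, k_per_day):
    """Attenuate a load through a chain of reaches."""
    for tau in reach_travel_times_days:
        load_kg_day *= math.exp(-k_per_day * tau)
    return load_kg_day

# A 1 kg/day effluent load passing through three reaches (0.5 d each) with
# k = 0.3 /day retains exp(-0.45), roughly 64% of its mass.
remaining = route_load(1.0, [0.5, 0.5, 0.5], k_per_day=0.3)
```

Lake routing, which the abstract highlights as critical in northern environments, would add much longer residence times to such a chain.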
Study on analysis of error sources for airborne LIDAR
NASA Astrophysics Data System (ADS)
Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.
2016-11-01
With the advancement of aerial photogrammetry, airborne LIDAR provides a new technical means of obtaining geospatial information of high spatial and temporal resolution, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of Earth observation technology: mounted on an aviation platform, it emits laser pulses to obtain high-precision, high-density three-dimensional point cloud coordinates and intensity information. This paper briefly describes airborne laser radar systems, analyzes in detail the sources of error in airborne LIDAR data, and puts forward corresponding methods to avoid or eliminate them. Taking practical engineering applications into account, recommendations are developed that have crucial theoretical and practical significance for airborne LIDAR data processing.
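One of the dominant airborne-LIDAR error sources is attitude (angular) error, whose ground effect scales with flying height; a small sketch of that rule of thumb (the flying height and angular error are illustrative, and real error budgets also include GPS, ranging, and boresight terms):

```python
import math

# Rule-of-thumb propagation of an angular (attitude) error into horizontal
# ground error for airborne LIDAR: error ~ flying_height * tan(delta_angle).
# Flying height and angular error are illustrative values.
def horizontal_error(flying_height_m, angle_error_deg):
    return flying_height_m * math.tan(math.radians(angle_error_deg))

# A 0.01 degree attitude error at 1000 m altitude displaces the laser
# footprint by roughly 17 cm on the ground.
err_m = horizontal_error(1000.0, 0.01)
```

The same relation shows why flying lower, or improving IMU accuracy, tightens the horizontal error budget.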
Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe
NASA Astrophysics Data System (ADS)
Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun
2013-04-01
The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. Moreover, integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allowing easy integration of climate datasets with geospatial datasets and providing sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods, and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data.
It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific datasets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and THREDDS Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia, and Geoscience Australia.
Geospatial Information is the Cornerstone of Effective Hazards Response
Newell, Mark
2008-01-01
Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS.
The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.
ERIC Educational Resources Information Center
Annulis, Heather M.; Gaudet, Cyndi H.
2007-01-01
A shortage of a qualified and skilled workforce exists to meet the demands of the geospatial industry (NASA, 2002). Solving today's workforce issues requires new and innovative methods and techniques for this high growth, high technology industry. One tool to support workforce development is a competency model which can be used to build a…
GIS applications for military operations in coastal zones
Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.
2009-01-01
In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed is discussed: selection of data sources (including high-resolution commercial images and Lidar); establishment of analysis/modeling parameters; conduct of vehicle mobility analysis; development of models; and generation of products, such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina.
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. ?? 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
GIS applications for military operations in coastal zones
NASA Astrophysics Data System (ADS)
Fleming, S.; Jordan, T.; Madden, M.; Usery, E. L.; Welch, R.
In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed-selection of data sources (including high resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels)-is discussed. Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. 
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations.
The road to NHDPlus — Advancements in digital stream networks and associated catchments
Moore, Richard B.; Dewald, Thomas A.
2016-01-01
A progression of advancements in Geographic Information Systems techniques for hydrologic network and associated catchment delineation has led to the production of the National Hydrography Dataset Plus (NHDPlus). NHDPlus is a digital stream network for hydrologic modeling with catchments and a suite of related geospatial data. Digital stream networks with associated catchments provide a geospatial framework for linking and integrating water-related data. Advancements in the development of NHDPlus are expected to continue to improve the capabilities of this national geospatial hydrologic framework. NHDPlus is built upon the medium-resolution NHD and, like NHD, was developed by the U.S. Environmental Protection Agency and U.S. Geological Survey to support the estimation of streamflow and stream velocity used in fate-and-transport modeling. Catchments included with NHDPlus were created by integrating vector information from the NHD and from the Watershed Boundary Dataset with the gridded land surface elevation as represented by the National Elevation Dataset. NHDPlus is an actively used and continually improved dataset. Users recognize the importance of a reliable stream network and associated catchments. The NHDPlus spatial features and associated data tables will continue to be improved to support regional water quality and streamflow models and other user-defined applications.
SWOT analysis on National Common Geospatial Information Service Platform of China
NASA Astrophysics Data System (ADS)
Zheng, Xinyan; He, Biao
2010-11-01
Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the Geospatial One-Stop (GOS) of the U.S. Under this circumstance, surveying and mapping in China is inevitably shifting from 4D product services to services centered on the NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping (SBSM) of China already provides a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, given the emerging requirements of e-government construction, the remarkable development of IT, and growing online geospatial service demands across lines of business. NCGISPC, which aims to provide authoritative, online, one-stop geospatial information services, and APIs for further development, to government, business, and the public, is now the strategic core of the SBSM. This paper focuses on the paradigm shift that NCGISPC brings, using a SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis comparing it to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode for geospatial information in China, and it will surely have a great impact not only on the construction of digital China but also on the way everyone uses geospatial information services.
Possibilities of Use of UAVs for Technical Inspection of Buildings and Constructions
NASA Astrophysics Data System (ADS)
Banaszek, Anna; Banaszek, Sebastian; Cellmer, Anna
2017-12-01
In recent years, unmanned aerial vehicles (UAVs) have been used in various sectors of the economy, owing to the development of new technologies for acquiring and processing geospatial data. The paper presents the results of experiments using a UAV equipped with a high-resolution digital camera for a visual assessment of the technical condition of a building roof and for an inventory of energy infrastructure and its surroundings. The usefulness of digital images obtained from the UAV deck is presented through concrete examples. The use of UAVs offers new opportunities for technical inspection owing to the detail and accuracy of the data, low operating costs, and fast data acquisition.
NASA Astrophysics Data System (ADS)
Oeldenberger, S.; Khaled, K. B.
2012-07-01
The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa by providing facilities for geospatial project and management training to regional government employees, university graduates, private individuals, and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training that provides actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g., ITC, and international organizations such as the ISPRS, the ICA, and the OGC. Through close cooperation with African organizations such as the AARSE, the RCMRD, and RECTAS, this network and exchange of ideas, experiences, technology, and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer AGSI operations and ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility in Tunis with approximately 30 part- and full-time general staff and lecturers during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture.
Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses offered, to build geospatial capacity and ensure that AGSI graduates will have the appropriate skill-sets required for employment in the geospatial industry. Geospatial management courses and high-level seminars will be targeted at decision makers in government and industry to build awareness for geospatial applications and benefits. Online education will be developed together with international partners and internet-based activities will involve the public to familiarize them with geospatial data and its many applications.
Benjamin A. Crabb; James A. Powell; Barbara J. Bentz
2012-01-01
Forecasting spatial patterns of mountain pine beetle (MPB) population success requires spatially explicit information on host pine distribution. We developed a means of producing spatially explicit datasets of pine density at 30-m resolution using existing geospatial datasets of vegetation composition and structure. Because our ultimate goal is to model MPB population...
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing the associated uncertainties. Spatial analyses of subsurface natural and engineered systems, big data and otherwise, are based on variable-resolution, discontinuous, and often point-driven data used to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geoprocessing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. We will show how we utilize Hadoop to store data and perform spatial analyses, efficiently consuming large geospatial data in custom analytical algorithms through Spark and MapReduce applications that incorporate the ESRI Hadoop libraries. The team will present custom "Big Data" geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for data reduction, accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach to implementing data reduction and topology generation via custom multi-step Hadoop applications, along with performance benchmarking comparisons and Hadoop-centric opportunities for greater parallelization of geospatial operations.
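The data-reduction idea behind the Variable Grid Method can be illustrated with a quadtree-style subdivision: cells shrink where samples are dense and stay coarse where they are sparse, and each leaf carries a summary value plus a spread that conveys uncertainty. This is a minimal sketch of the general concept, not NETL's actual VGM implementation; the function name and parameters are illustrative.

```python
import numpy as np

def variable_grid(points, values, x0, y0, size, max_pts=4, min_size=1.0):
    """Recursively subdivide a square cell while it holds more than
    `max_pts` samples: well-sampled areas get small cells, sparse areas
    keep large ones. Returns (x, y, size, mean, std) per leaf cell;
    the std column is a simple stand-in for per-cell uncertainty."""
    inside = [(p, v) for p, v in zip(points, values)
              if x0 <= p[0] < x0 + size and y0 <= p[1] < y0 + size]
    if len(inside) <= max_pts or size <= min_size:
        vals = [v for _, v in inside]
        mean = float(np.mean(vals)) if vals else float("nan")
        std = float(np.std(vals)) if vals else float("nan")
        return [(x0, y0, size, mean, std)]
    half = size / 2
    cells = []
    for dx in (0, half):
        for dy in (0, half):
            cells += variable_grid(points, values, x0 + dx, y0 + dy,
                                   half, max_pts, min_size)
    return cells

# Example: a cluster of samples forces finer cells near the cluster only.
rng = np.random.default_rng(0)
pts = np.vstack([rng.uniform(0, 2, (20, 2)), rng.uniform(0, 16, (5, 2))])
vals = pts[:, 0] + pts[:, 1]
cells = variable_grid(list(pts), list(vals), 0.0, 0.0, 16.0)
sizes = sorted({c[2] for c in cells})
print(len(cells), sizes[0], sizes[-1])
```

The leaves form a non-overlapping, multi-resolution tiling of the domain, which is the kind of attributed topology the abstract describes feeding into later geostatistical processing.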
NASA Astrophysics Data System (ADS)
Kayi, A.; Erdogan, M.; Yilmaz, A.
2014-11-01
An earthquake occurred at Van City on 23 October 2011 at 13:41 local time. The local magnitude, moment magnitude, and depth of the earthquake were Ml 6.7, Mw 7.0, and 19.07 km, respectively. The Van city centre and its surrounding villages were affected by this destructive earthquake; many buildings were destroyed and approximately 600 people died. Acquisition and use of geospatial data are crucial for the management of such natural disasters. In this paper, the role of national and international geospatial data in the management of the Van earthquake is investigated. Through international collaboration with the Charter, pre- and post-earthquake satellite images were acquired within 24 hours of the earthquake. The General Command of Mapping (GCM), the national mapping agency of Turkey, also produced high-resolution multispectral orthophotos of the region. The Charter delivered its orthophotos during 26-28 October 2011. Immediately after the earthquake, GCM planned flights over the 1296 km2 disaster area to acquire aerial photos. The aerial photos were acquired on 24 October 2011 (one day after the earthquake) with an UltraCamX large-format digital aerial camera: 152 images were taken at 30 cm ground sample distance (GSD) with 30% sidelap and 60% overlap. On the evening of the same flight day, orthophotos were produced without ground control points by direct georeferencing, and GCM supplied them to the disaster management authorities. Archive orthophotos at 45 cm GSD, acquired in 2010, were used as a reference to determine the effects of the disaster. The views expressed here do not represent those of the Turkish Armed Forces.
Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.
2014-01-01
The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies, techniques that are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. By using high-resolution, remotely sensed, spatially continuous data as proxies for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within ranges of predicted probability, where the percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and results show a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
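The kind of model described above can be sketched as a logistic regression on shaking intensity and geospatial proxies for soil density and saturation. The predictor set and coefficient values below are illustrative placeholders, not the fitted coefficients from this study.

```python
import math

def liquefaction_probability(pga, vs30, cti, coeffs):
    """Logistic-regression probability of liquefaction from earthquake
    shaking (PGA, in g) and geospatial proxies: Vs30 (m/s, a soil-density
    proxy) and the compound topographic index (CTI, a wetness proxy).
    `coeffs` = (intercept, b_pga, b_vs30, b_cti), all illustrative."""
    b0, b1, b2, b3 = coeffs
    z = b0 + b1 * math.log(pga) + b2 * math.log(vs30) + b3 * cti
    return 1.0 / (1.0 + math.exp(-z))

# Made-up coefficients chosen so probability rises with shaking and
# wetness and falls with stiffer (denser) soils.
demo = (24.1, 2.0, -4.2, 0.3)
p_soft = liquefaction_probability(pga=0.4, vs30=180, cti=12, coeffs=demo)
p_stiff = liquefaction_probability(pga=0.4, vs30=600, cti=12, coeffs=demo)
print(round(p_soft, 3), round(p_stiff, 3))
```

Mapping this function over gridded PGA and the (remotely sensed) proxy rasters is what produces the spatially continuous probability surfaces the abstract refers to.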
Lin, Yun-Bin; Lin, Yu-Pin; Deng, Dong-Po; Chen, Kuan-Wei
2008-02-19
In Taiwan, earthquakes have long been recognized as a major cause of landslides, which are then spread widely by the floods brought by subsequent typhoons. Distinguishing between landslide spatial patterns in different disturbance regimes is fundamental for disaster monitoring, management, and land-cover restoration. To circumscribe landslides, this study adopts the normalized difference vegetation index (NDVI), which can be determined by simple mathematical operations on near-infrared and visible-red spectral data immediately after remotely sensed data are acquired. In real-time disaster monitoring, the NDVI is more effective than land-cover classifications generated from remotely sensed data, as land-cover classification tasks are extremely time consuming. Directional two-dimensional (2D) wavelet analysis has an advantage over traditional spectrum analysis in that it determines localized variations along a specific direction when identifying dominant modes of change, and where those modes are located, in multi-temporal remotely sensed images. Open geospatial techniques comprise a series of solutions developed on Open Geospatial Consortium specifications that can be applied to encode data for interoperability and to develop open geospatial services for sharing data. This study presents a novel approach and framework that uses directional 2D wavelet analysis of real-time NDVI images to effectively identify landslide patterns and share the resulting patterns via open geospatial techniques. As a case study, this study analyzed NDVI images derived from SPOT HRV images before and after the Chi-Chi earthquake (7.3 on the Richter scale) that hit the Chenyulan basin in Taiwan, as well as images after two large typhoons (Xangsane and Toraji), to delineate the spatial patterns of landslides caused by major disturbances.
The disturbed spatial patterns of landslides that followed these events were successfully delineated using 2D wavelet analysis, and the results of landslide pattern recognition were distributed simultaneously to other agents using Geography Markup Language. Real-time information allows successive platforms (agents) to work with local geospatial data for disaster management. Furthermore, the proposed approach is suitable for detecting landslides on continental, regional, and local scales using remotely sensed data at various resolutions derived from SPOT HRV, IKONOS, and QuickBird multispectral images.
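The NDVI calculation underlying this approach is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red), which is why it can be computed immediately on image acquisition. A minimal sketch, with a made-up change threshold for flagging landslide scars (the study's actual delineation uses directional 2D wavelet analysis on top of such images):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI from near-infrared and visible-red reflectance, in [-1, 1].
    Vegetated pixels score high; fresh landslide scars (bare soil/rock)
    drop sharply, which makes pre/post-event differencing useful."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

pre = ndvi(np.array([[0.5, 0.6]]), np.array([[0.1, 0.1]]))
post = ndvi(np.array([[0.5, 0.15]]), np.array([[0.1, 0.12]]))
scar = (pre - post) > 0.3   # illustrative threshold flags the changed pixel
print(scar)
```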
Pendleton, Elizabeth A.; Andrews, Brian D.; Danforth, William W.; Foster, David S.
2014-01-01
Geophysical and geospatial data were collected in Buzzards Bay, in the shallow-water areas of Vineyard Sound, and in the nearshore areas off the eastern Elizabeth Islands and northern coast of Martha's Vineyard, Massachusetts, on the U.S. Geological Survey research vessel Rafael between 2007 and 2011, in a collaborative effort between the U.S. Geological Survey and the Massachusetts Office of Coastal Zone Management. This report describes results of this collaborative effort, which include mapping the geology of the inner shelf zone of the Elizabeth Islands and the sand shoals of Vineyard Sound and studying geologic processes that contribute to the evolution of this area. Data collected during these surveys include bathymetry, acoustic backscatter, seismic-reflection profiles, sound velocity profiles, and navigation. The long-term goals of this project are (1) to provide high-resolution geophysical data that will support research on the influence of sea-level change and sediment supply on coastal evolution and (2) to inventory subtidal marine habitats and their distribution within the coastal zone of Massachusetts.
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows, especially in its capabilities for handling the Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade; two scenarios are discussed in detail to demonstrate its capabilities. The study showed a standards-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancements at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
Robinson, Nathaniel; Allred, Brady; Jones, Matthew; ...
2017-08-21
Satellite derived vegetation indices (VIs) are broadly used in ecological research, ecosystem modeling, and land surface monitoring. The Normalized Difference Vegetation Index (NDVI), perhaps the most utilized VI, has countless applications across ecology, forestry, agriculture, wildlife, biodiversity, and other disciplines. Calculating satellite derived NDVI is not always straightforward, however, as satellite remote sensing datasets are inherently noisy due to cloud and atmospheric contamination, data processing failures, and instrument malfunction. Readily available NDVI products that account for these complexities are generally at coarse resolution; high resolution NDVI datasets are not conveniently accessible, and developing them often presents numerous technical and methodological challenges. Here, we address this deficiency by producing a Landsat derived, high resolution (30 m), long-term (30+ years) NDVI dataset for the conterminous United States. We use Google Earth Engine, a planetary-scale cloud-based geospatial analysis platform, for processing the Landsat data and distributing the final dataset. We use a climatology driven approach to fill missing data and validate the dataset with established remote sensing products at multiple scales. We provide access to the composites through a simple web application, allowing users to customize key parameters appropriate for their application, question, and region of interest.
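The climatology-driven gap filling can be sketched as replacing missing pixels with their per-pixel multi-year mean. This simplified NumPy stand-in illustrates the idea only; the actual product is built in Google Earth Engine with a more elaborate climatology.

```python
import numpy as np

def climatology_fill(stack):
    """Fill missing (NaN) pixels in a (year, row, col) NDVI stack with
    the per-pixel mean across years at that location, a minimal version
    of climatology-driven gap filling."""
    clim = np.nanmean(stack, axis=0)              # per-pixel climatology
    return np.where(np.isnan(stack), clim, stack)

# Three "years" of a 1x2 NDVI image with one gap in each pixel's record.
stack = np.array([[[0.2, np.nan]],
                  [[0.4, 0.6]],
                  [[np.nan, 0.8]]])
filled = climatology_fill(stack)
print(filled[2, 0, 0], filled[0, 0, 1])
```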
NASA Astrophysics Data System (ADS)
Cropp, E. L.; Hazenberg, P.; Castro, C. L.; Demaria, E. M.
2017-12-01
In the southwestern US, the summertime North American Monsoon (NAM) provides about 60% of the region's annual precipitation. Recent research using high-resolution atmospheric model simulations and retrospective predictions has shown that since the 1950s, and more specifically in the last few decades, mean daily precipitation in the southwestern U.S. during the NAM has followed a decreasing trend, while days with more extreme precipitation have intensified. The current work focuses on the impact of these long-term changes on the observed small-scale spatial variability of intense precipitation. Because limited long-term high-resolution observational data exist to support such climatologically induced spatial changes in precipitation frequency and intensity, the current work utilizes observations from the USDA-ARS Walnut Gulch Experimental Watershed (WGEW) in southeastern Arizona. Within this 150 km^2 catchment, over 90 rain gauges have been installed since the 1950s, measuring at sub-hourly resolution. We have applied geospatial analyses and the kriging interpolation technique to identify long-term changes in the spatial and temporal correlation and anisotropy of intense precipitation. The observed results will be compared with the previously simulated model results, as well as related to large-scale variations in climate patterns, such as the El Niño Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO).
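The spatial-correlation structure that kriging interpolates from is commonly summarized by an empirical semivariogram over gauge pairs: gamma(h) = 0.5 * mean[(z_i - z_j)^2] for pairs whose separation falls in each distance bin. A minimal sketch with synthetic gauge data (coordinates, values, and bins are all illustrative, not WGEW observations):

```python
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """Empirical semivariance per distance bin over all station pairs."""
    n = len(coords)
    d, sq = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(np.linalg.norm(coords[i] - coords[j]))
            sq.append((values[i] - values[j]) ** 2)
    d, sq = np.array(d), np.array(sq)
    gamma = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Synthetic "gauges" sampling a smooth spatial trend plus noise:
# semivariance should grow with separation distance.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, (40, 2))
values = coords[:, 0] + coords[:, 1] + 0.1 * rng.standard_normal(40)
gamma = empirical_semivariogram(coords, values, bins=np.array([0, 2, 5, 10]))
print(gamma)
```

Directional (anisotropic) variants, binning pairs by bearing as well as distance, are the natural extension for the anisotropy analysis the abstract mentions.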
O'Connor, Ben L; Hamada, Yuki; Bowen, Esther E; Grippo, Mark A; Hartmann, Heidi M; Patton, Terri L; Van Lonkhuyzen, Robert A; Carr, Adrianne E
2014-11-01
Large areas of public lands administered by the Bureau of Land Management and located in arid regions of the southwestern United States are being considered for the development of utility-scale solar energy facilities. Land-disturbing activities in these desert, alluvium-filled valleys have the potential to adversely affect the hydrologic and ecologic functions of ephemeral streams. Regulation and management of ephemeral streams typically falls under a spectrum of federal, state, and local programs, but scientifically based guidelines for protecting ephemeral streams with respect to land-development activities are largely nonexistent. This study developed an assessment approach for quantifying the sensitivity to land disturbance of ephemeral stream reaches located in proposed solar energy zones (SEZs). The ephemeral stream assessment approach used publicly available geospatial data on hydrology, topography, surficial geology, and soil characteristics, as well as high-resolution aerial imagery. These datasets were used to inform a professional judgment-based score index of potential land disturbance impacts on selected critical functions of ephemeral streams, including flow and sediment conveyance, ecological habitat value, and groundwater recharge. The total sensitivity scores (sum of scores for the critical stream functions of flow and sediment conveyance, ecological habitats, and groundwater recharge) were used to identify highly sensitive stream reaches to inform decisions on developable areas in SEZs. Total sensitivity scores typically reflected the scores of the individual stream functions; some exceptions pertain to groundwater recharge and ecological habitats. The primary limitations of this assessment approach were the lack of high-resolution identification of ephemeral stream channels in the existing National Hydrography Dataset, and the lack of mechanistic processes describing potential impacts on ephemeral stream functions at the watershed scale.
The primary strength of this assessment approach is that it allows watershed-scale planning for low-impact development in arid ecosystems; the qualitative scoring of potential impacts can also be adjusted to accommodate new geospatial data, and to allow for expert and stakeholder input into decisions regarding the identification and potential avoidance of highly sensitive stream reaches.
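The score-index arithmetic described above is straightforward: per-function scores are summed into a total sensitivity score, and the total is compared against a threshold to flag highly sensitive reaches. The score values and threshold below are illustrative placeholders, not the study's actual rubric.

```python
def reach_sensitivity(scores, threshold=10):
    """Total sensitivity of an ephemeral stream reach as the sum of
    judgment-based scores for its critical functions (flow/sediment
    conveyance, ecological habitat, groundwater recharge). Returns the
    total and whether the reach is flagged as highly sensitive.
    The function keys and threshold are hypothetical."""
    required = ("flow_sediment", "habitat", "recharge")
    total = sum(scores[k] for k in required)
    return total, total >= threshold

total, sensitive = reach_sensitivity(
    {"flow_sediment": 4, "habitat": 5, "recharge": 3})
print(total, sensitive)
```

Keeping the per-function scores alongside the total preserves the detail the abstract notes is sometimes masked by totals (e.g., a high recharge score inside a middling total).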
Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing
NASA Astrophysics Data System (ADS)
Tang, Jingyin; Matyas, Corene J.
2018-02-01
Big Data in geospatial technology poses a grand challenge for processing capacity. The ability to run GIS-based geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to providing feasible solutions. However, users have lacked the ability to migrate existing research tools to Cloud Computing or HPC-based environments because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server and other functions to run in the native Python environment. It uses functional programming and metaprogramming to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in Cloud Computing and HPC environments with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
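The decoupled client-server pattern described above can be sketched as: the client records a geospatial call as data, the server executes it, and a serialized result comes back. Here the "server" is an in-process dispatcher and the "Buffer" operation on a point is a made-up stand-in, not the real arcpy/arc4nix API.

```python
import json

# Server-side registry of operations. The fake "Buffer" returns the
# bounding box of a square buffer around a point; purely illustrative.
SERVER_OPS = {
    "Buffer": lambda x, y, dist: {"xmin": x - dist, "ymin": y - dist,
                                  "xmax": x + dist, "ymax": y + dist},
}

def client_call(op, *args):
    """Serialize a call specification (what would go over the wire),
    execute it on the 'server', and deserialize the response."""
    request = json.dumps({"op": op, "args": args})
    payload = json.loads(request)
    result = SERVER_OPS[payload["op"]](*payload["args"])  # "remote" side
    return json.loads(json.dumps(result))                 # serialized reply

bbox = client_call("Buffer", 10.0, 20.0, 2.5)
print(bbox)
```

Because calls travel as plain data, the same client script can target a local dispatcher or a remote Windows host running the actual geoprocessing stack, which is the portability the abstract emphasizes.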
Prosser, Diann J.; Hungerford, Laura L.; Erwin, R. Michael; Ottinger, Mary Ann; Takekawa, John Y.; Newman, Scott H.; Xiao, Xianming; Ellis, Erie C.
2016-01-01
One of the longest-persisting avian influenza viruses in history, highly pathogenic avian influenza virus (HPAIV) A(H5N1), continues to evolve after 18 years, advancing the threat of a global pandemic. Wild waterfowl (family Anatidae) are reported as secondary transmitters of HPAIV and primary reservoirs for low-pathogenic avian influenza viruses, yet spatial inputs for disease risk modeling for this group have been lacking. Using GIS and Monte Carlo simulations, we developed geospatial indices of waterfowl abundance at 1 and 30 km resolutions and for the breeding and wintering seasons for China, the epicenter of H5N1. Two spatial layers were developed: cumulative waterfowl abundance (WAB), a measure of predicted abundance across species, and cumulative abundance weighted by H5N1 prevalence (WPR), whereby abundance for each species was adjusted based on prevalence values and then totaled across species. Spatial patterns of the model output differed between seasons, with higher WAB and WPR in the northern and western regions of China for the breeding season and in the southeast for the wintering season. Uncertainty measures indicated the highest error in southeastern China for both WAB and WPR. We also explored the effect of resampling waterfowl layers from 1 km to 30 km resolution for multi-scale risk modeling. Results indicated low average difference (less than 0.16 and 0.01 standard deviations for WAB and WPR, respectively), with the greatest differences in the north for the breeding season and the southeast for the wintering season. This work provides the first geospatial models of waterfowl abundance available for China. The indices provide important inputs for modeling disease transmission risk at the interface of poultry and wild birds. These models are easily adaptable, have broad utility for both disease and conservation needs, and will be available to the scientific community for advanced modeling applications.
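The two indices reduce to simple per-cell arithmetic: WAB sums predicted abundance across species, while WPR first weights each species' abundance by its H5N1 prevalence and then sums. The species abundances and prevalence values below are made up for illustration.

```python
import numpy as np

# Rows are species, columns are grid cells. Numbers are illustrative.
abundance = np.array([[120.0,  40.0],    # species A abundance per cell
                      [ 30.0, 200.0]])   # species B abundance per cell
prevalence = np.array([0.02, 0.10])      # per-species H5N1 prevalence

wab = abundance.sum(axis=0)                          # cumulative abundance
wpr = (abundance * prevalence[:, None]).sum(axis=0)  # prevalence-weighted
print(wab, wpr)
```

Note how the two layers can rank cells differently: the first cell has more total waterfowl, but the second cell dominates once prevalence weighting is applied.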
Geospatial Technologies: Real Projects in Real Classrooms
ERIC Educational Resources Information Center
Kolvoord, Bob
2008-01-01
Geospatial technologies (geographic information systems, global positioning systems, and remote sensing) underpin projects that evoke an unexpected drive and devotion from high school students in Virginia. Their integration into different curricular areas lets students focus on understanding their community and the many issues that…
Soil Monitor: an open source web application for real-time soil sealing monitoring and assessment
NASA Astrophysics Data System (ADS)
Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio
2016-04-01
Soil sealing is one of the most important causes of land degradation and desertification. In Europe, soil covered by impermeable materials has increased by about 80% since the Second World War, while population has grown by only one third. There is increasing concern at high political levels about the need to attenuate imperviousness itself and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) under which net land take would reach zero by 2050. The Commission also published a report in 2011 providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed "Soil Monitor", an open-source Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) built on open-source components. This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion to provide real-time assessments of soil sealing at high spatial resolution (20 meters and below) over the whole of Italy. Common open-source webGIS packages, such as GeoServer and MapStore, implement both the data management and visualization infrastructures. High-speed geospatial computation is ensured by GPU parallelism using NVIDIA's CUDA (Compute Unified Device Architecture) framework. This kind of parallelism required writing, from scratch, all the code behind the soil sealing toolbox's geospatial computations. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention to the Java-CUDA programming interface. As a result, Soil Monitor is responsive: computations that would otherwise be very time-consuming (for instance, querying an entire Italian administrative region as the area of interest) complete in less than one minute. The application runs entirely in a web browser, with nothing to install before using it.
Potentially anyone can use it, but the main targets are stakeholders dealing with sealing, such as policy makers, landowners, and asphalt and cement companies. Soil Monitor can be used to improve spatial planning, thereby limiting the progression of disordered soil sealing, which causes both the direct loss of soil to imperviousness and the indirect loss caused by fragmentation of soils (which has various negative effects on the durability of soil functions, for example by severing habitat corridors). Further, a future version of Soil Monitor could estimate the best location for a new building or help compensate for soil losses through offsetting actions in other areas. The presented SS-GCI, if suitably scaled, would aid the implementation of best practices for limiting soil sealing or mitigating its effects on soil functions.
High resolution pollutant measurements in complex urban ...
Measuring air pollution in real-time using an instrumented vehicle platform has been an emerging strategy to resolve air pollution trends at a very fine spatial scale (10s of meters). Achieving second-by-second data representative of urban air quality trends requires advanced instrumentation, such as a quantum cascade laser utilized to resolve carbon monoxide and real-time optical detection of black carbon. An equally challenging area of development is processing and visualization of complex geospatial air monitoring data to decipher key trends of interest. EPA’s Office of Research and Development staff have applied air monitoring to evaluate community air quality in a variety of environments, including assessing air quality surrounding rail yards, evaluating noise wall or tree stand effects on roadside and on-road air quality, and surveying of traffic-related exposure zones for comparison with land-use regression estimates. ORD has ongoing efforts to improve mobile monitoring data collection and interpretation, including instrumentation testing, evaluating the effect of post-processing algorithms on derived trends, and developing a web-based tool called Real-Time Geospatial Data Viewer (RETIGO) allowing for a simple plug-and-play of mobile monitoring data. Example findings from mobile data sets include an estimated 50% in roadside ultrafine particle levels when immediately downwind of a noise barrier, increases in neighborhood-wide black carbon levels (3
NASA Astrophysics Data System (ADS)
Schoessow, F. S.; Li, Y.; Howe, P. D.
2016-12-01
Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves to a population can be more comprehensively assessed by coupling the traditional examination of natural hazards using remote sensing and geospatial analysis techniques with human vulnerability factors and individual perceptions of hazards. By analyzing remote-sensed and empirical survey data alongside national hazards advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat wave risk perceptions remains relatively unexplored, despite the intensification of extreme heat events. The use of "human sensors" - georeferenced and timestamped individual response data - provides invaluable contextualized data at a high spatial resolution, which will enable policy-makers to more effectively implement targeted strategies for risk prevention, mitigation, and communication. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.
Assessment of near-source air pollution at a fine spatial scale ...
Mobile monitoring is an emerging strategy to characterize spatially and temporally variable air pollution in areas near sources. EPA’s Geospatial Monitoring of Air Pollution (GMAP) vehicle, an all-electric vehicle measuring real-time concentrations of particulate and gaseous pollutants, was utilized to map air pollution trends near the Port of Charleston in South Carolina. High-resolution monitoring was performed along driving routes near several port terminals and rail yard facilities, recording geospatial coordinates and measurements of pollutants including black carbon, size-resolved particle count ranging from ultrafine to coarse (6 nm to 20 µm), carbon monoxide, carbon dioxide, and nitrogen dioxide. Additionally, a portable meteorological station was used to characterize local meteorology. Port activity data was provided by the Port Authority of Charleston and includes counts of ships and trucks, and port service operations such as cranes and forklifts during the sampling time periods. Measurements are supplemented with modeling performed with AERMOD and RLINE in order to characterize the impact of the various terminals at the Port of Charleston on local air quality. Specifically, the data are used to determine the magnitude of the increase in local, near-port pollutant concentrations as well as the spatial extent to which concentration is elevated above background. These effects are studied in relation to a number of potentially significant factors such
Conterminous U.S. and Alaska Forest Type Mapping Using Forest Inventory and Analysis Data
B. Ruefenacht; M.V. Finco; M.D. Nelson; R. Czaplewski; E.H. Helmer; J. A. Blackard; G.R. Holden; A.J. Lister; D. Salajanu; D. Weyermann; K. Winterberger
2008-01-01
Classification-trees were used to model forest type groups and forest types for the conterminous United States and Alaska. The predictor data were a geospatial data set with a spatial resolution of 250 m developed by the U.S. Department of Agriculture Forest Service (USFS). The response data were plot data from the USFS Forest Inventory and Analysis program. Overall...
3D geospatial visualizations: Animation and motion effects on spatial objects
NASA Astrophysics Data System (ADS)
Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos
2018-02-01
Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial markup languages (e.g. KML) and open programming libraries (JavaScript) also makes it possible to create 3D spatial objects and give them the appearance of any type of texture by utilizing open 3D representation models (e.g. Collada). One step further, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities combined with all the above-mentioned visualization capabilities - for example, animation effects on selected areas of the terrain texture (e.g. sea waves) and motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean) - are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models at user-specified locations, paths, and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
Estimating Vegetation Height from WorldView-02 and ArcticDEM Data for Broad Ecological Applications
NASA Astrophysics Data System (ADS)
Meddens, A. J.; Vierling, L. A.; Eitel, J.; Jennewein, J. S.; White, J. C.; Wulder, M.
2017-12-01
Boreal and arctic regions are warming at an unprecedented rate, higher than other regions across the globe. Ecological processes are highly responsive to temperature, and substantial changes in these northern ecosystems are therefore expected. Recently, NASA initiated the Arctic-Boreal Vulnerability Experiment (ABoVE), a large-scale field campaign that aims to gain a better understanding of how the Arctic responds to environmental change. High-resolution data products that quantify vegetation structure and function will improve efforts to assess these environmental change impacts. Our objective was to develop and test an approach for mapping vegetation height at a 5 m grid cell resolution across the ABoVE domain. To accomplish this, we selected three study areas across a north-south gradient in Alaska, representing an area of approximately 130 km2. We developed a RandomForest modeling approach for predicting vegetation height using the ArcticDEM (a digital surface model produced across the Arctic by the Polar Geospatial Center) and high-resolution multispectral satellite data (WorldView-2) in conjunction with aerial lidar data for calibration and validation. Vegetation height was successfully predicted across the three study areas and evaluated using an independent dataset, with R2 ranging from 0.58 to 0.76 and RMSEs ranging from 1.8 to 2.4 m. This predicted vegetation height dataset also led to the development of a digital terrain model, derived from the ArcticDEM digital surface model by removing canopy heights from the surface heights. Our results show potential to establish a high-resolution pan-arctic vegetation height map, which will provide useful information to a broad range of ongoing and future ecological research in high northern latitudes.
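The modeling step described above can be sketched with scikit-learn's RandomForestRegressor on synthetic data. The predictor set, values, and model settings below are assumptions for illustration, not the authors' configuration; only the overall pattern (fit height against DSM plus spectral predictors, then subtract predicted canopy height from the DSM to approximate a terrain model) follows the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical predictors per 5 m cell: an ArcticDEM-style surface height
# and two WorldView-2-style spectral bands (all values synthetic)
n = 500
dsm = rng.uniform(0, 100, n)   # digital surface model height (m)
red = rng.uniform(0, 1, n)     # red reflectance
nir = rng.uniform(0, 1, n)     # near-infrared reflectance

# Synthetic "lidar" vegetation height used as the calibration target
veg_height = 5 * nir - 2 * red + 0.01 * dsm + rng.normal(0, 0.3, n)

X = np.column_stack([dsm, red, nir])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, veg_height)
pred = model.predict(X)

# A digital terrain model then follows by removing predicted canopy
# height from the surface height
dtm = dsm - pred
```

In practice the calibration/validation split against independent lidar data, as in the paper, is what makes the reported R2 and RMSE meaningful.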
Yang, Limin; Huang, Chengquan; Homer, Collin G.; Wylie, Bruce K.; Coan, Michael
2003-01-01
A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data on urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota; Richmond, Virginia; and the Chesapeake Bay area of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
High resolution global gridded data for use in population studies
NASA Astrophysics Data System (ADS)
Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.
2017-01-01
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely, and the harmonisation of data layers is therefore a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) slope. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. The datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.
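One harmonisation step implied by the archive's two resolutions - resampling a 3 arc-second layer to 30 arc-seconds - can be sketched as block aggregation, a factor of 10 per axis. The grid values and function name below are purely illustrative; production pipelines would also handle nodata, coastlines, and partial blocks.

```python
import numpy as np

def aggregate(grid, factor, how="mean"):
    """Aggregate a fine raster to a coarser one by block statistics.

    A 3 arc-second grid becomes a 30 arc-second grid with factor=10.
    Grid dimensions must be exact multiples of the factor.
    """
    h, w = grid.shape
    blocks = grid.reshape(h // factor, factor, w // factor, factor)
    if how == "mean":
        return blocks.mean(axis=(1, 3))  # e.g. mean elevation or slope
    return blocks.sum(axis=(1, 3))       # e.g. population counts

fine = np.arange(400, dtype=float).reshape(20, 20)  # toy 3 arc-second layer
coarse = aggregate(fine, 10)                        # 30 arc-second equivalent
```

Whether a layer should be averaged (elevation, slope) or summed (counts) when coarsened is exactly the kind of per-layer decision harmonisation has to record.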
NASA Astrophysics Data System (ADS)
Venäläinen, Ari; Laapas, Mikko; Pirinen, Pentti; Horttanainen, Matti; Hyvönen, Reijo; Lehtonen, Ilari; Junila, Päivi; Hou, Meiting; Peltola, Heli M.
2017-07-01
The bioeconomy has an increasing role to play in climate change mitigation and the sustainable development of national economies. In Finland, a forested country, over 50 % of the current bioeconomy relies on the sustainable management and utilization of forest resources. Wind storms are a major risk to which forests are exposed, and high-spatial-resolution analysis of the most vulnerable locations can support risk assessment in forest management planning. In this paper, we examine the feasibility of the wind multiplier approach for downscaling maximum wind speed, using the 20 m spatial resolution CORINE land-use dataset and high-resolution digital elevation data. A coarse-spatial-resolution estimate of the 10-year return level of maximum wind speed was obtained from ERA-Interim reanalysis data. Using a geospatial re-mapping technique, the data were downscaled to 26 meteorological station locations representing very diverse environments. Comparison shows that the downscaled 10-year return levels capture 66 % of the observed variation among the stations examined. In addition, the spatial variation in the wind-multiplier-downscaled 10-year return level wind was compared with WAsP model-simulated wind. The heterogeneous test area was situated in northern Finland, and the major features of the spatial variation were found to be similar, although in some locations there were relatively large differences. The results indicate that the wind multiplier method offers a pragmatic and computationally feasible tool for identifying, at high spatial resolution, the locations with the highest forest wind damage risks. It can also provide the necessary wind climate information for wind damage risk model calculations, making it possible to estimate the probability of exceeding threshold wind speeds for wind damage and, consequently, the probability (and amount) of wind damage for given forest stand configurations.
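At its core, wind-multiplier downscaling amounts to scaling a coarse-resolution return-level wind by local multipliers for terrain (topographic speed-up) and surface roughness (land use). A minimal sketch, with all numeric values hypothetical rather than taken from the paper:

```python
def downscale_wind(coarse_speed_ms, terrain_multiplier, roughness_multiplier):
    """Local 10-year return-level wind = coarse wind x local multipliers.

    terrain_multiplier: topographic speed-up from elevation data
    roughness_multiplier: land-use effect, e.g. from a CORINE-style class
    """
    return coarse_speed_ms * terrain_multiplier * roughness_multiplier

# Example: a reanalysis-style coarse estimate of 18 m/s at a hilltop cell
# (terrain speed-up 1.15) covered by open land (roughness multiplier 1.05)
local = downscale_wind(18.0, 1.15, 1.05)
```

Because the multipliers come from 20 m land-use and elevation grids, the same coarse value fans out into a high-resolution field, which is what makes per-stand damage-risk mapping feasible.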
NASA Astrophysics Data System (ADS)
Spanò, A.; Chiabrando, F.; Sammartano, G.; Teppati Losè, L.
2018-05-01
The paper explores the suitability and applicability of advanced integrated surveying techniques, mainly image-based approaches compared with, and integrated into, range-based ones, developed with cutting-edge solutions tested in the field. The investigated techniques integrate both technological devices for 3D data acquisition and editing and management systems for handling metric models and multi-dimensional data in a geospatial perspective, in order to innovate and speed up the extraction of information during archaeological excavation activities. These techniques were tested in the outstanding site of the ancient city of Hierapolis in Phrygia (Turkey), following the 2017 surveying missions, to produce high-scale metric deliverables: high-detail Digital Surface Models (DSM), 3D continuous surface models, and high-resolution orthoimage products. In particular, the potential of UAV platforms for low-altitude aerial photogrammetric acquisition, together with terrestrial panoramic acquisition (Trimble V10 imaging rover), was investigated and compared against consolidated Terrestrial Laser Scanning (TLS) measurements. One of the main purposes of the paper is to evaluate the results offered by these technologies used independently and in integrated approaches. A section of the study is specifically dedicated to experimenting with the union of dense clouds from different sensors: dense clouds derived from UAV imagery were integrated with terrestrial lidar clouds to evaluate their fusion. Different test cases were considered, representing typical situations encountered in archaeological sites.
CELL5M: A geospatial database of agricultural indicators for Africa South of the Sahara.
Koo, Jawoo; Cox, Cindy M; Bacou, Melanie; Azzarri, Carlo; Guo, Zhe; Wood-Sichra, Ulrike; Gong, Queenie; You, Liangzhi
2016-01-01
Recent progress in large-scale georeferenced data collection is widening opportunities for combining multi-disciplinary datasets from biophysical to socioeconomic domains, advancing our analytical and modeling capacity. Granular spatial datasets provide critical information necessary for decision makers to identify target areas, assess baseline conditions, prioritize investment options, set goals and targets, and monitor impacts. However, key challenges in reconciling data across themes, scales, and borders restrict our capacity to produce global and regional maps and time series. This paper provides the overview, structure, and coverage of CELL5M, an open-access database of geospatial indicators at 5 arc-minute grid resolution, and introduces a range of analytical applications and use cases. CELL5M covers a wide set of agriculture-relevant domains for all countries in Africa South of the Sahara and supports our understanding of the multi-dimensional spatial variability inherent in farming landscapes throughout the region.
NASA Astrophysics Data System (ADS)
Foumelis, Michael
2017-01-01
The applicability of the normalized difference water index (NDWI) to the delineation of dam failure-induced floods is demonstrated for the case of the Sparmos dam (Larissa, Central Greece). The approach followed was based on the differencing of NDWI maps to accurately define the extent of the inundated area over different time spans using multimission Earth observation optical data. Besides Landsat data, for which the index was initially designed, higher-spatial-resolution data from the Sentinel-2 mission were also successfully exploited. A geospatial analysis approach was then introduced to rapidly identify potentially affected segments of the road network. This allowed further correlation with actual damage in the subsequent damage assessment and remediation activities. The proposed combination of geographic information systems and remote sensing techniques can be easily implemented by local authorities and civil protection agencies for mapping and monitoring flood events.
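The NDWI-differencing idea can be sketched as follows. NDWI (McFeeters) is (Green - NIR) / (Green + NIR), water is commonly taken as NDWI above zero, and the flood extent is where water appears after the event but was absent before; the band values and threshold below are illustrative, not taken from the paper.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized difference water index: (G - NIR) / (G + NIR)."""
    return (green - nir) / (green + nir)

# Toy pre- and post-event reflectance grids (values illustrative)
green_pre  = np.array([[0.10, 0.30], [0.12, 0.25]])
nir_pre    = np.array([[0.40, 0.05], [0.35, 0.30]])
green_post = np.array([[0.30, 0.30], [0.28, 0.25]])
nir_post   = np.array([[0.05, 0.05], [0.06, 0.30]])

threshold = 0.0  # NDWI > 0 is a common water cutoff; tuned per scene
water_pre  = ndwi(green_pre, nir_pre) > threshold
water_post = ndwi(green_post, nir_post) > threshold

# Flood extent: water after the event where there was none before
flooded = water_post & ~water_pre
```

Intersecting the `flooded` mask with a road-network layer would then flag potentially affected segments, as in the paper's geospatial analysis step.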
Mapping a Difference: The Power of Geospatial Visualization
NASA Astrophysics Data System (ADS)
Kolvoord, B.
2015-12-01
Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model-sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
Implementing a High School Level Geospatial Technologies and Spatial Thinking Course
ERIC Educational Resources Information Center
Nielsen, Curtis P.; Oberle, Alex; Sugumaran, Ramanathan
2011-01-01
Understanding geospatial technologies (GSTs) and spatial thinking is increasingly vital to contemporary life including common activities and hobbies; learning in science, mathematics, and social science; and employment within fields as diverse as engineering, health, business, and planning. As such, there is a need for a stand-alone K-12…
1-Meter Digital Elevation Model specification
Arundel, Samantha T.; Archuleta, Christy-Ann M.; Phillips, Lori A.; Roche, Brittany L.; Constance, Eric W.
2015-10-21
In January 2015, the U.S. Geological Survey National Geospatial Technical Operations Center began producing the 1-Meter Digital Elevation Model data product. This new product was developed to provide high-resolution bare-earth digital elevation models from light detection and ranging (lidar) elevation data and other elevation data collected over the conterminous United States (lower 48 States), Hawaii, and potentially Alaska and the U.S. territories. The 1-Meter Digital Elevation Model consists of hydroflattened, topographic bare-earth raster digital elevation models with a 1-meter x 1-meter cell size, available in 10,000-meter x 10,000-meter square blocks with a 6-meter overlap. This report details the specifications required for the production of the 1-Meter Digital Elevation Model.
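The block layout in the specification (10,000 m x 10,000 m blocks, each padded by a 6 m overlap) can be illustrated with a small helper that computes tile extents. The function name and coordinate handling are assumptions for illustration; the actual product definition is given in the report itself.

```python
def tile_extents(x_min, y_min, x_max, y_max, block=10000, overlap=6):
    """Extents (xmin, ymin, xmax, ymax) for DEM blocks with a fixed overlap.

    Mirrors the 1-Meter DEM layout: 10,000 m x 10,000 m blocks, each
    extended by a 6 m overlap on every side (coordinates in meters).
    """
    tiles = []
    y = y_min
    while y < y_max:
        x = x_min
        while x < x_max:
            tiles.append((x - overlap, y - overlap,
                          x + block + overlap, y + block + overlap))
            x += block
        y += block
    return tiles

# Two adjacent blocks spanning a 20 km x 10 km area
tiles = tile_extents(0, 0, 20000, 10000)
```

The overlap means neighboring tiles share a 12 m strip, which lets downstream mosaicking avoid edge artifacts.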
Foreword to the theme issue on geospatial computer vision
NASA Astrophysics Data System (ADS)
Wegner, Jan Dirk; Tuia, Devis; Yang, Michael; Mallet, Clement
2018-06-01
Geospatial Computer Vision has become one of the most prevalent emerging fields of investigation in Earth Observation in the last few years. In this theme issue, we aim to showcase a number of works at the interface between remote sensing, photogrammetry, image processing, computer vision, and machine learning. In light of recent sensor developments - both from the ground and from above - an unprecedented (and ever-growing) quantity of geospatial data is available for tackling challenging and urgent tasks such as environmental monitoring (deforestation, carbon sequestration, climate change mitigation), disaster management, autonomous driving, or the monitoring of conflicts. The new bottleneck for serving these applications is the extraction of relevant information from such large amounts of multimodal data. These data stem from multiple sensors of distinct physical nature, with heterogeneous quality and spatial, spectral, and temporal resolutions. They are as diverse as multi-/hyperspectral satellite sensors, color cameras on drones, laser scanning devices, existing open land-cover geodatabases, and social media. Such core data processing is mandatory to generate semantic land-cover maps, accurate detections and trajectories of objects of interest, and by-products of superior added value: georeferenced data, images with enhanced geometric and radiometric qualities, or Digital Surface and Elevation Models.
NASA Astrophysics Data System (ADS)
Cheng, Gong; Han, Junwei; Zhou, Peicheng; Guo, Lei
2014-12-01
The rapid development of remote sensing technology has facilitated the acquisition of remote sensing images with higher and higher spatial resolution, but how to automatically understand the image contents is still a big challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used for the detection of objects or recurring spatial patterns within a certain range of orientation. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors, where each part detector corresponds to a particular viewpoint of an object class, so the collection of them provides a solution for rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts from images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with some state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework.
Center of Excellence for Geospatial Information Science research plan 2013-18
Usery, E. Lynn
2013-01-01
The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.
NLCD tree canopy cover (TCC) maps of the contiguous United States and coastal Alaska
Robert Benton; Bonnie Ruefenacht; Vicky Johnson; Tanushree Biswas; Craig Baker; Mark Finco; Kevin Megown; John Coulston; Ken Winterberger; Mark Riley
2015-01-01
A tree canopy cover (TCC) map is one of three elements in the National Land Cover Database (NLCD) 2011 suite of nationwide geospatial data layers. In 2010, the USDA Forest Service (USFS) committed to creating the TCC layer as a member of the Multi-Resolution Land Cover (MRLC) consortium. A general methodology for creating the TCC layer was reported at the 2012 FIA...
Automated Generation of the Alaska Coastline Using High-Resolution Satellite Imagery
NASA Astrophysics Data System (ADS)
Roth, G.; Porter, C. C.; Cloutier, M. D.; Clementz, M. E.; Reim, C.; Morin, P. J.
2015-12-01
Previous campaigns to map Alaska's coast at high resolution have relied on airborne, marine, or ground-based surveying and manual digitization. The coarse temporal resolution, inability to scale geographically, and high cost of field data acquisition in these campaigns are inadequate for the scale and speed of recent coastal change in Alaska. Here, we leverage the Polar Geospatial Center (PGC) archive of DigitalGlobe, Inc. satellite imagery to produce a state-wide coastline at 2-meter resolution. We first select multispectral imagery based on time and quality criteria. We then extract the near-infrared (NIR) band from each processed image and classify each pixel as water or land with a pre-determined NIR threshold value. Processing continues with vectorizing the water-land boundary, removing extraneous data, and attaching metadata. Final coastline raster and vector products maintain the original accuracy of the orthorectified satellite data, which is often within the local tidal range. The repeat frequency of coastline production can range from 1 month to 3 years, depending on factors such as satellite capacity, cloud cover, and floating ice. Shadows from trees or structures complicate the output and merit further data cleaning. The PGC's imagery archive, unique expertise, and computing resources enabled us to map the Alaskan coastline in a few months. The DigitalGlobe archive allows us to update this coastline as new imagery is acquired and provides baseline data for studies of coastal change and for improving topographic datasets. Our result is not simply a one-time coastline, but rather a system for producing multi-temporal, automated coastlines. Workflows and tools produced in this project can be freely distributed and used globally. Researchers and government agencies must now consider how they can incorporate and quality-control these high-frequency, high-resolution data to meet their mapping standards and research objectives.
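The threshold-and-trace step can be sketched with NumPy. The reflectance values and the 0.05 threshold below are placeholders for illustration, not the pre-determined value the workflow actually uses, and real processing would operate on orthorectified image tiles via a raster library.

```python
import numpy as np

def classify_water(nir, threshold=0.05):
    """Classify each pixel as water (True) or land (False) by NIR reflectance.
    Water absorbs strongly in the near-infrared, so pixels below the
    threshold are labelled water."""
    return nir < threshold

def boundary_mask(water):
    """Mark water pixels with a land pixel in their 4-neighbourhood --
    a raster approximation of the water-land boundary before vectorizing."""
    pad = np.pad(water, 1, constant_values=True)  # treat image edge as water
    land_neighbor = (~pad[:-2, 1:-1] | ~pad[2:, 1:-1] |
                     ~pad[1:-1, :-2] | ~pad[1:-1, 2:])
    return water & land_neighbor

# toy 3x3 NIR band: low values (water) on the left, high (land) on the right
nir = np.array([[0.01, 0.02, 0.30],
                [0.02, 0.03, 0.40],
                [0.01, 0.25, 0.50]])
water = classify_water(nir)
coast = boundary_mask(water)
```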
Geospatial Data Science Modeling | Geospatial Data Science | NREL
NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers.
ERIC Educational Resources Information Center
Metoyer, Sandra; Bednarz, Robert
2017-01-01
This article provides a description and discussion of an exploratory research study that examined the effects of using geospatial technology (GST) on high school students' spatial skills and spatial-relations content knowledge. It presents results that support the use of GST to teach spatially dependent content. It also provides indication of an…
Raster Data Partitioning for Supporting Distributed GIS Processing
NASA Astrophysics Data System (ADS)
Nguyen Thai, B.; Olasz, A.
2015-08-01
Big data concepts have already had an impact in the geospatial sector. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data; other studies treat geospatial data as if it had always been big data (Lee and Kang, 2015). In any case, data acquisition methods have improved substantially, not only in the amount of raw data but also in its spectral, spatial, and temporal resolution. A significant portion of big data is geospatial, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). Of the increasing volume of raw data, produced in differing formats and representations and for differing purposes, only the information derived from these data sets represents real value. However, computing capability and processing speed run up against limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Recently, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing, moreover, requires appropriate processing algorithms to be distributed to handle geospatial big data. The MapReduce programming model and distributed file systems have proven their capability to process non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms for the MapReduce model, and GIS data cannot be partitioned like text-based data, by line or by byte. Hence, we seek an alternative solution for data partitioning, data distribution, and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of partitioning, distributing, and processing GIS raster data with existing GIS algorithms.
A proof-of-concept implementation has been made for raster data partitioning, distribution, and processing. First performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, so data partitioning can be considered a preprocessing step applied before processing services operate on the data. As a proof of concept, we implemented a simple tile-based partitioning method that splits an image into a grid of smaller tiles (N x M) and compared the processing time against existing methods using an NDVI calculation. The concept is demonstrated using our own open-source processing framework.
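The tile-based partitioning and per-tile NDVI step can be sketched as follows, with NumPy arrays standing in for the red and NIR raster bands. This is an illustrative single-process version; the framework described above would ship each tile to a worker node.

```python
import numpy as np

def partition(raster, n, m):
    """Split a 2-D raster into an n x m grid of tiles (row-major order).
    Edge tiles absorb any remainder rows/columns."""
    h, w = raster.shape
    rows = np.array_split(np.arange(h), n)
    cols = np.array_split(np.arange(w), m)
    return [raster[np.ix_(r, c)] for r in rows for c in cols]

def ndvi(red, nir):
    """Normalized Difference Vegetation Index, guarded against divide-by-zero."""
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

# process each tile independently -- the unit of work a distributed
# framework would hand to a worker -- then the results can be mosaicked
red = np.full((4, 6), 0.1)
nir = np.full((4, 6), 0.3)
red_tiles = partition(red, 2, 3)
nir_tiles = partition(nir, 2, 3)
results = [ndvi(r, n) for r, n in zip(red_tiles, nir_tiles)]
```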
High-fidelity national carbon mapping for resource management and REDD+
2013-01-01
Background: High-fidelity carbon mapping has the potential to greatly advance national resource management and to encourage international action toward climate change mitigation. However, carbon inventories based on field plots alone cannot capture the heterogeneity of carbon stocks, and thus remote sensing-assisted approaches are critically important to carbon mapping at regional to global scales. We advanced a high-resolution, national-scale carbon mapping approach applied to the Republic of Panama, one of the first UN-REDD+ partner countries. Results: Integrating measurements of vegetation structure collected by airborne Light Detection and Ranging (LiDAR) with field inventory plots, we report LiDAR-estimated aboveground carbon stock errors of ~10% on any 1-ha land parcel across a wide range of ecological conditions. Critically, this shows that LiDAR provides a highly reliable replacement for inventory plots in areas lacking field data, both in humid tropical forests and among drier tropical vegetation types. We then scale up a systematically aligned LiDAR sampling of Panama using satellite data on topography, rainfall, and vegetation cover to model carbon stocks at 1-ha resolution with estimated average pixel-level uncertainty of 20.5 Mg C ha⁻¹ nationwide. Conclusions: The national carbon map revealed strong abiotic and human controls over Panamanian carbon stocks, and the new level of detail with estimated uncertainties for every individual hectare in the country sets Panama at the forefront in high-resolution ecosystem management. With this repeatable approach, carbon resource decision-making can be made on a geospatially explicit basis, enhancing human welfare and environmental protection. PMID:23866822
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and approved by the Ministry of Information Industry, the Ministry of Science and Technology, and the Ministry of Construction of P.R. China in 2002. After five years of development, the informatization level of Dongying has become advanced. To further the development of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. In addition, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed: a highly integrated WebGIS platform combining 3S technologies (GIS, GPS, RS), an object-oriented RDBMS, the Internet, DCOM, and more. It provides an indispensable foundation for sharing and interoperation of Digital Dongying geospatial data. Following the standards, and based on the platform, sharing and interoperation of Digital Dongying geospatial data have come into practice with good results. A strong leadership group, however, remains necessary for data sharing and interoperation.
NASA Astrophysics Data System (ADS)
Kulo, Violet; Bodzin, Alec
2013-02-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Peng; Gong, Jianya; Di, Liping
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services, but the approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered by extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service that follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and made discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described, and semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
Poppenga, Sandra K.; Worstell, Bruce B.; Stoker, Jason M.; Greenlee, Susan K.
2010-01-01
Digital elevation data commonly are used to extract surface flow features. One source of high-resolution elevation data is light detection and ranging (lidar). Lidar can capture a vast amount of topographic detail because of its fine-scale ability to digitally capture the surface of the earth. Because elevation is a key factor in extracting surface flow features, high-resolution lidar-derived digital elevation models (DEMs) provide the detail needed to consistently integrate hydrography with elevation, land cover, structures, and other geospatial features. The U.S. Geological Survey has developed selective drainage methods to extract continuous surface flow from high-resolution lidar-derived digital elevation data. The lidar-derived continuous surface flow network contains valuable information for water resource management involving flood hazard mapping, flood inundation, and coastal erosion. DEMs used in hydrologic applications typically are processed to remove depressions by filling them. High-resolution DEMs derived from lidar can capture much more detail of the land surface than coarser elevation data; therefore, they contain more depressions caused by obstructions such as roads, railroads, and other elevated structures. Filling these depressions can significantly and adversely affect DEM-derived surface flow routing and terrain characteristics. In this report, selective draining methods that modify the elevation surface to drain a depression through an obstruction are presented. If such obstructions are not removed from the elevation data, filling depressions to create continuous surface flow can cause the flow to spill over an obstruction in the wrong location. Using the modified elevation surface improves the quality of derived surface flow and retains more of the true surface characteristics by correcting large filled depressions.
A reliable flow surface is necessary for deriving a consistently connected drainage network, which is important in understanding surface water movement and developing applications for surface water runoff, flood inundation, and erosion. Improved methods are needed to extract continuous surface flow features from high-resolution elevation data based on lidar.
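The depression-filling step that selective draining improves upon can be sketched as a priority-flood fill. This is the standard textbook algorithm, not necessarily the USGS implementation: every cell is raised to the lowest elevation at which it can drain to the grid edge, which is exactly the behavior that spills flow over road fills in the wrong place when obstructions are left in the DEM.

```python
import heapq

def fill_depressions(dem):
    """Priority-flood depression filling on a 2-D list-of-lists DEM.
    Returns a new grid; the input is not modified."""
    nrows, ncols = len(dem), len(dem[0])
    filled = [row[:] for row in dem]
    seen = [[False] * ncols for _ in range(nrows)]
    heap = []
    # seed the queue with all edge cells, which can always drain outward
    for i in range(nrows):
        for j in range(ncols):
            if i in (0, nrows - 1) or j in (0, ncols - 1):
                heapq.heappush(heap, (filled[i][j], i, j))
                seen[i][j] = True
    # grow inward from the lowest seen elevation, raising pit cells
    while heap:
        z, i, j = heapq.heappop(heap)
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < nrows and 0 <= nj < ncols and not seen[ni][nj]:
                filled[ni][nj] = max(filled[ni][nj], z)
                seen[ni][nj] = True
                heapq.heappush(heap, (filled[ni][nj], ni, nj))
    return filled

# a pit (1) ringed by a rim of 5s with one pour point (4): the pit fills to 4
dem = [[5, 5, 5],
       [5, 1, 4],
       [5, 5, 5]]
out = fill_depressions(dem)
```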
4-D Visualization of Seismic and Geodetic Data of the Big Island of Hawai'i
NASA Astrophysics Data System (ADS)
Burstein, J. A.; Smith-Konter, B. R.; Aryal, A.
2017-12-01
For decades Hawai'i has served as a natural laboratory for studying complex interactions between magmatic and seismic processes. Investigating characteristics of these processes, as well as the crustal response to major Hawaiian earthquakes, requires a synthesis of seismic and geodetic data and models. Here, we present a 4-D visualization of the Big Island of Hawai'i that investigates geospatial and temporal relationships of seismicity, seismic velocity structure, and GPS crustal motions to known volcanic and seismically active features. Using the QPS Fledermaus visualization package, we compile 90 m resolution topographic data from NASA's Shuttle Radar Topography Mission (SRTM) and 50 m resolution bathymetric data from the Hawaiian Mapping Research Group (HMRG) with a high-precision earthquake catalog of more than 130,000 events from 1992-2009 [Matoza et al., 2013] and a 3-D seismic velocity model of Hawai'i [Lin et al., 2014] based on seismic data from the Hawaiian Volcano Observatory (HVO). Long-term crustal motion vectors are integrated into the visualization from HVO GPS time-series data. These interactive data sets reveal well-defined seismic structure near the summit areas of Mauna Loa and Kilauea volcanoes, where high Vp and high Vp/Vs anomalies at 5-12 km depth, as well as clusters of low magnitude (M < 3.5) seismicity, are observed. These areas of high Vp and high Vp/Vs are interpreted as mafic dike complexes and the surrounding seismic clusters are associated with shallow magma processes. GPS data are also used to help identify seismic clusters associated with the steady crustal detachment of the south flank of Kilauea's East Rift Zone. We also investigate the fault geometry of the 2006 M6.7 Kiholo Bay earthquake event by analyzing elastic dislocation deformation modeling results [Okada, 1985] and HVO GPS and seismic data of this event. 
We demonstrate the 3-D fault mechanisms of the Kiholo Bay main shock as a combination of strike-slip and dip-slip components (net slip 0.55 m) delineating a 30 km east-west striking, southward-dipping fault plane, occurring at 39 km depth. This visualization serves as a resource for advancing scientific analyses of Hawaiian seismic processes, as well as an interactive educational tool for demonstrating the geospatial and geophysical structure of the Big Island of Hawai'i.
NASA Astrophysics Data System (ADS)
Molinario, G.; Hansen, M.; Potapov, P.
2016-12-01
High-resolution satellite imagery obtained from the National Geospatial-Intelligence Agency through NASA was used to photo-interpret sample areas within the DRC. The area sampled is a stratification of forest cover loss from circa 2014 that occurred either completely within the previously mapped homogeneous area of the Rural Complex, at its interface with primary forest, or in isolated forest perforations. Previous research produced a map of these areas that contextualizes forest loss by where it occurs and with what spatial density, leading to a better understanding of the real impacts of livelihood shifting cultivation on forest degradation. The stratified random sampling of these areas allows characterization of the constituent land cover types within them and of their variability throughout the DRC. Shifting cultivation has a variable forest degradation footprint in the DRC depending on the many factors that drive it, but its role in forest degradation and deforestation has been disputed, leading us to investigate and quantify clearing and reuse rates within the strata throughout the country.
Modelling the distribution of domestic ducks in Monsoon Asia
Van Bockel, Thomas P.; Prosser, Diann; Franceschini, Gianluca; Biradar, Chandra; Wint, William; Robinson, Tim; Gilbert, Marius
2011-01-01
Domestic ducks are considered to be an important reservoir of highly pathogenic avian influenza (HPAI), as shown by a number of geospatial studies in which they have been identified as a significant risk factor associated with disease presence. Despite their importance in HPAI epidemiology, their large-scale distribution in Monsoon Asia is poorly understood. In this study, we created a spatial database of domestic duck census data in Asia and used it to train statistical distribution models for domestic ducks at a spatial resolution of 1 km. The method is based on a modelling framework used by the Food and Agriculture Organisation to produce the Gridded Livestock of the World (GLW) database, and relies on stratified regression models between domestic duck densities and a set of agro-ecological explanatory variables. We evaluated different ways of stratifying the analysis and of combining the predictions to optimize their goodness of fit. We found that domestic duck density could be predicted with reasonable accuracy (the mean RMSE and correlation coefficient between log-transformed observed and predicted densities being 0.58 and 0.80, respectively) using a stratification based on livestock production systems. We tested the use of artificially degraded data on duck distributions in Thailand and Vietnam as training data, and compared the modelled outputs with the original high-resolution data. This showed, for these two countries at least, that these approaches can be used to accurately disaggregate provincial-level (administrative level 1) statistical data into high-resolution modelled distributions.
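The two goodness-of-fit measures quoted above (RMSE and correlation between log-transformed observed and predicted densities) can be computed as follows on toy numbers. The +1 offset before the log is an assumption made here to keep zero-density cells finite; the paper does not state its exact transform.

```python
import math

def log_rmse_and_corr(observed, predicted, eps=1.0):
    """RMSE and Pearson correlation between log-transformed density values."""
    lo = [math.log(o + eps) for o in observed]
    lp = [math.log(p + eps) for p in predicted]
    n = len(lo)
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(lo, lp)) / n)
    mo, mp = sum(lo) / n, sum(lp) / n
    cov = sum((a - mo) * (b - mp) for a, b in zip(lo, lp))
    var_o = sum((a - mo) ** 2 for a in lo)
    var_p = sum((b - mp) ** 2 for b in lp)
    corr = cov / math.sqrt(var_o * var_p)
    return rmse, corr

# hypothetical duck densities (birds per km^2) in four grid cells
obs = [0, 10, 100, 1000]
pred = [1, 12, 90, 1200]
rmse, corr = log_rmse_and_corr(obs, pred)
```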
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications involve heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. To make use of these local resources together to solve larger geospatial information processing problems related to an overall situation, we propose in this paper, with the support of peer-to-peer computing technologies, a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a construct we term the Equivalent Distributed Program of a global geospatial query, to solve geospatial distributed computing problems in heterogeneous GIS environments. First, we present a geospatial query process schema for distributed computing, together with a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network of autonomous geospatial information resources, achieving decentralized and consistent synchronization among global geospatial resource directories and carrying out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries illustrate the procedure of global geospatial information processing.
Evaluation of Assimilated SMOS Soil Moisture Data for US Cropland Soil Moisture Monitoring
NASA Technical Reports Server (NTRS)
Yang, Zhengwei; Sherstha, Ranjay; Crow, Wade; Bolten, John; Mladenova, Iva; Yu, Genong; Di, Liping
2016-01-01
Remotely sensed soil moisture data can provide timely, objective, and quantitative crop soil moisture information with broad geospatial coverage and sufficiently high-resolution observations collected throughout the growing season. This paper evaluates the feasibility of using assimilated ESA Soil Moisture Ocean Salinity (SMOS) Mission L-band passive microwave data for operational US cropland soil surface moisture monitoring. The assimilated SMOS soil moisture data are first categorized to match the United States Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) survey-based weekly soil moisture observation data, which are ordinal. The categorized SMOS data are then compared with NASS's survey-based weekly soil moisture data for consistency and robustness using visual assessment and rank correlation. Preliminary results indicate that the assimilated SMOS soil moisture data co-vary strongly with NASS field observations across a large geographic area. Therefore, SMOS data have great potential for US operational cropland soil moisture monitoring.
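The categorize-then-rank-correlate comparison described above can be sketched in plain Python. The class breaks and sample values are illustrative placeholders, not the operational NASS thresholds; Spearman's rho is computed with average ranks for ties, as ordinal data requires.

```python
def categorize(values, breaks):
    """Map continuous soil moisture to ordinal classes (e.g. a scale like
    very short / short / adequate / surplus). Break points are placeholders."""
    return [sum(v > b for b in breaks) for v in values]

def average_ranks(xs):
    """1-based ranks with ties assigned the average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    ra, rb = average_ranks(a), average_ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra)
    vb = sum((y - mb) ** 2 for y in rb)
    return cov / (va * vb) ** 0.5

smos = [0.08, 0.15, 0.22, 0.31, 0.12, 0.27]   # volumetric soil moisture
survey = [0, 1, 2, 3, 1, 3]                   # ordinal survey observations
cats = categorize(smos, breaks=[0.10, 0.20, 0.30])
rho = spearman(cats, survey)
```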
Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web
NASA Astrophysics Data System (ADS)
Huang, Hong; Gong, Jianya
2008-12-01
GML achieves geospatial interoperation only at the syntactic level. In most cases, however, differences in spatial cognition must be resolved first, so ontologies were introduced to describe geospatial information and services. Yet it is clearly difficult and inappropriate to expect users themselves to find, match, and compose services, especially where complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of geospatial interoperation has shifted from the syntactic level to the semantic and even to the automatic, intelligent level. In this context, the Geospatial Semantic Web (GSM) can be put forward as an augmentation of the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation, and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for modeling and formally representing geospatial knowledge, the two most foundational phases in knowledge engineering (KE). Our approach in this paper is pragmatic: we argue that geospatial context is a formal model of the distinguishing environmental characteristics of geospatial knowledge, and that the derivation, understanding, and use of geospatial knowledge are situated in geospatial context. Therefore, we first put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules, and GML. Second, we propose a metamodel of geospatial context and use the modeling methods and representation languages of formal ontologies to process it. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.
NASA Astrophysics Data System (ADS)
McCarty, J. L.; Pouliot, G. A.; Soja, A. J.; Miller, M. E.; Rao, T.
2013-12-01
Prescribed fires in agricultural landscapes generally produce smaller burned areas than wildland fires but are important contributors to emissions affecting air quality and human health. A variety of satellite-based estimates of crop residue burning are currently available, including the NOAA/NESDIS Hazard Mapping System (HMS), the Satellite Mapping Automated Reanalysis Tool for Fire Incident Reconciliation (SMARTFIRE 2), the Moderate Resolution Imaging Spectroradiometer (MODIS) Official Burned Area Product (MCD45A1), the MODIS Direct Broadcast Burned Area Product (MCD64A1), the MODIS Active Fire Product (MCD14ML), and a regionally tuned 8-day cropland differenced Normalized Burn Ratio product for the contiguous U.S. The purpose of this NASA-funded research was to refine the regionally tuned product, utilizing higher spatial resolution crop type data from the USDA NASS Cropland Data Layer and burned area training data from field work and high-resolution commercial satellite imagery, to improve the U.S. Environmental Protection Agency's (EPA) National Emissions Inventory (NEI). The final product delivered to the EPA included a detailed database of 25 different atmospheric emissions at the county level, emission distributions by crop type and seasonality, and GIS data. The resulting emission databases were shared with the U.S. EPA and its regional offices, the National Wildfire Coordinating Group (NWCG) Smoke Committee, and all 48 states in the contiguous U.S., with detailed error estimations for Wyoming and Indiana and detailed analyses of results for Florida, Minnesota, North Dakota, Oklahoma, and Oregon. This work also provided opportunities to discover the different needs of federal and state partners, including the various geospatial capabilities and platforms across the many users, and to incorporate expert air quality, policy, and land management knowledge into quantitative earth-observation-based estimations of prescribed fire emissions.
Finally, this work created direct communication paths between federal and state partners to the scientists creating the remote sensing-based products, further improving the geospatial products and understanding of air quality impacts of prescribed burning at the state, regional, and national scales.
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel
2015-04-01
Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g., OrfeoToolbox, OpenCV, and GDAL). They include basic processing tools but not vulnerability-oriented workflows, so it is important to provide end-users with tools that return information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented toward developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), whose main focus was earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs to already available models such as the Global Earthquake Model. The inclusion of the proposed algorithms within the RASOR platform guarantees support and enlarges the community of end-users. Keywords: vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number 312972; RASOR: www.rasor-project.eu, grant number 606888)
Quantifying Spatially Integrated Floodplain and Wetland Systems for the Conterminous US
NASA Astrophysics Data System (ADS)
Lane, C.; D'Amico, E.; Wing, O.; Bates, P. D.
2017-12-01
Wetlands interact with other waters across a variable connectivity continuum: from permanent to transient, from fast to slow, and from primarily surface-water to exclusively groundwater flows. Floodplain wetlands typically experience fast and frequent surface and near-surface groundwater interactions with their river networks, leading to increasing efforts to tailor management strategies for these wetlands. Management of floodplain wetlands is contingent on accurate floodplain delineation, and though this has proven challenging, multiple efforts are being made to close this data gap at the conterminous scale using spatial, physical, and hydrological floodplain proxies. In this study, we derived and contrasted floodplain extents using the following nationally available approaches: 1) a geospatial-buffer floodplain proxy (Lane and D'Amico 2016, JAWRA 52(3):705-722), 2) a regionalized flood frequency analysis coupled to a 30 m resolution continental-scale hydraulic model (RFFA; Smith et al. 2015, WRR 51:539-553), and 3) a soils-based floodplain analysis (Sangwan and Merwade 2015, JAWRA 51(5):1286-1304). The geospatial approach uses the National Wetlands Inventory and buffered National Hydrography Datasets. The RFFA estimates extreme flows based on catchment size, regional climatology, and upstream annual rainfall, and routes these flows through a hydraulic model built with data from USGS HydroSHEDS, NOAA, and the National Elevation Dataset. The soils-based analysis defines floodplains based on attributes within the USDA soil survey data (SSURGO). Nearly 30% (by count) of U.S. freshwater wetlands are located within floodplains under the geospatial analysis, contrasted with 37% (soils-based) and 53% (RFFA-based). The dichotomies between approaches are mainly a function of input data-layer resolution, accuracy, coverage, and extent, discussed further in this presentation.
Ultimately, these spatial analyses and findings will improve floodplain and integrated wetland system extent assessment. This will lead to better management of the physically, chemically, and biologically integrated floodplain wetlands affecting the integrity of downstream waterbodies at multiple scales.
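The percent-of-wetlands-in-floodplain comparison described above can be sketched with toy presence masks. The grids and counts below are invented illustrations, not the national data layers used in the study:

```python
# Toy sketch: fraction of wetland cells falling inside a floodplain extent,
# computed for two hypothetical delineation methods. The 0/1 masks below
# are made-up examples, not actual NWI, RFFA, or SSURGO data.

def pct_wetlands_in_floodplain(wetland, floodplain):
    """Percent of wetland cells that also fall inside the floodplain mask."""
    inside = sum(w and f for w, f in zip(wetland, floodplain))
    total = sum(wetland)
    return 100.0 * inside / total

wetland    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # wetland presence per cell
geospatial = [1, 0, 0, 0, 0, 0, 1, 0, 0, 0]   # narrow buffer-proxy extent
rffa       = [1, 1, 0, 0, 0, 1, 1, 0, 0, 1]   # wider modeled extent

print(round(pct_wetlands_in_floodplain(wetland, geospatial), 1))
print(round(pct_wetlands_in_floodplain(wetland, rffa), 1))
```

As in the abstract, a wider modeled floodplain extent captures a larger share of the same wetlands, so the percentages diverge purely as a function of the input extents.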
NASA Astrophysics Data System (ADS)
Arulbalaji, P.; Gurugnanam, B.
2017-11-01
A morphometric analysis of the Sarabanga watershed in Salem district has been chosen for the present study. Geospatial tools, such as remote sensing and GIS, are utilized for the extraction of the river basin and its drainage networks. Shuttle Radar Topographic Mission (SRTM, 30 m resolution) data have been used for the morphometric analysis and for evaluating various morphometric parameters. The morphometric parameters of the Sarabanga watershed have been analyzed and evaluated by pioneering methods, such as those of Horton and Strahler. The watershed exhibits a dendritic drainage pattern, indicating that lithology and a gentle slope category control the study area. The Sarabanga watershed covers an area of 1208 km2. The slope of the watershed varies from 10 to 40% and is controlled by the lithology of the watershed. The bifurcation ratio ranges from 3 to 4.66, indicating the influence of geological structure and that the basin has suffered structural disturbances. The form factor indicates the elongated shape of the study area. The total stream length and area of the watershed indicate that mean annual rainfall runoff is relatively moderate. The basin relief indicates that the watershed has relatively high denudation rates. The drainage density of the watershed is low, indicating that infiltration is dominant. The ruggedness number shows that peak discharges are likely to be relatively high. The present study is very useful for planning watershed management.
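The Horton/Strahler parameters cited above follow simple textbook formulas. A minimal sketch, with all input measurements hypothetical except the reported basin area of 1208 km2:

```python
# Illustrative sketch (not the study's code) of common morphometric
# parameters. Stream counts and lengths below are invented examples.

def bifurcation_ratio(n_lower: int, n_higher: int) -> float:
    """Rb = number of streams of order u / number of streams of order u+1."""
    return n_lower / n_higher

def drainage_density(total_stream_length_km: float, area_km2: float) -> float:
    """Dd = total stream length / basin area (km per km^2)."""
    return total_stream_length_km / area_km2

def form_factor(area_km2: float, basin_length_km: float) -> float:
    """Ff = A / Lb^2; low values indicate an elongated basin."""
    return area_km2 / basin_length_km ** 2

def ruggedness_number(basin_relief_km: float, dd: float) -> float:
    """Rn = basin relief * drainage density (dimensionless)."""
    return basin_relief_km * dd

area = 1208.0  # km2, as reported for the Sarabanga watershed
dd = drainage_density(total_stream_length_km=950.0, area_km2=area)
ff = form_factor(area, basin_length_km=62.0)
rb = bifurcation_ratio(n_lower=70, n_higher=18)
print(round(dd, 2), round(ff, 2), round(rb, 2))
```

A low Ff (well below the ~0.78 of a circular basin) flags an elongated basin, consistent with the interpretation in the abstract.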
An Investigation of Automatic Change Detection for Topographic Map Updating
NASA Astrophysics Data System (ADS)
Duncan, P.; Smit, J.
2012-08-01
Changes to the landscape are constantly occurring, and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis, and is focussed on urban landscapes. The major data inputs into this study are high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, however, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie with a hybrid approach of pixel-based and object-oriented techniques.
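The core of a classification-based change detection step can be sketched as a per-cell comparison between an existing land-use grid (derived from the topographic vector data) and a newly classified grid. This is an assumption-laden toy, not CD: NGI's workflow:

```python
# Minimal change-flagging sketch for map updating: compare an existing
# land-use grid with a newly classified grid and report cells whose class
# label changed. Grids and class codes are hypothetical.

from typing import List, Tuple

def changed_cells(old: List[List[int]], new: List[List[int]]) -> List[Tuple[int, int]]:
    """Return (row, col) positions where the class label differs."""
    return [
        (r, c)
        for r, row in enumerate(old)
        for c, v in enumerate(row)
        if new[r][c] != v
    ]

old_map = [[1, 1, 2],
           [1, 2, 2],
           [3, 3, 3]]
new_map = [[1, 1, 2],
           [1, 1, 2],
           [3, 3, 4]]
print(changed_cells(old_map, new_map))  # cells flagged for operator review
```

In an object-oriented variant, the same comparison would be done per segment rather than per pixel, which is what makes it less sensitive to isolated misclassified cells.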
Estimating Renewable Energy Economic Potential in the United States. Methodology and Initial Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Austin; Beiter, Philipp; Heimiller, Donna
This report describes a geospatial analysis method to estimate the economic potential of several renewable resources available for electricity generation in the United States. Economic potential, one measure of renewable generation potential, may be defined in several ways. For example, one definition might be expected revenues (based on local market prices) minus generation costs, considered over the expected lifetime of the generation asset. Another definition might be generation costs relative to a benchmark (e.g., a natural gas combined cycle plant) using assumptions of fuel prices, capital cost, and plant efficiency. Economic potential in this report is defined as the subset of the available resource technical potential where the cost required to generate the electricity (which determines the minimum revenue requirements for development of the resource) is below the revenue available in terms of displaced energy and displaced capacity. The assessment is conducted at a high geospatial resolution (more than 150,000 technology-specific sites in the continental United States) to capture the significant variation in local resource, costs, and revenue potential. This metric can be a useful screening factor for understanding the economic viability of renewable generation technologies at a specific location. In contrast to many common estimates of renewable energy potential, economic potential does not consider market dynamics, customer demand, or most policy drivers that may incent renewable energy generation.
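The screening rule defined above (keep sites where generation cost falls below available revenue from displaced energy plus displaced capacity) can be sketched as a simple filter. Site records and field names here are hypothetical:

```python
# Hedged sketch of the economic-potential screen: a site passes if its
# generation cost (minimum revenue requirement) is below the available
# revenue. All numbers are invented, in $/MWh.

sites = [
    {"id": "A", "cost_per_mwh": 42.0, "energy_rev": 38.0, "capacity_rev": 6.0},
    {"id": "B", "cost_per_mwh": 55.0, "energy_rev": 40.0, "capacity_rev": 5.0},
    {"id": "C", "cost_per_mwh": 30.0, "energy_rev": 33.0, "capacity_rev": 2.0},
]

def economic_sites(sites):
    """Keep sites where cost < displaced energy + displaced capacity value."""
    return [s["id"] for s in sites
            if s["cost_per_mwh"] < s["energy_rev"] + s["capacity_rev"]]

print(economic_sites(sites))  # → ['A', 'C']
```

In the report's framing, this filter is applied independently at each of the 150,000+ technology-specific sites, which is why local variation in resource and revenue matters so much.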
NASA Astrophysics Data System (ADS)
Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.
2016-06-01
Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR), monitoring tools are being developed in which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4k camera systems, which allow for RT acquisition and pre-processing of high resolution aerial imagery, are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both situational assessment and awareness, focussing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments have been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. by aligning it closer to users' requirements.
EPA Geospatial Quality Council Promoting Quality Assurance in the Geospatial Community
After establishing a foundation for the EPA National Geospatial Program, the EPA Geospatial Quality Council (GQC) is, in part, focusing on improving administrative efficiency in the geospatial community. To realize this goal, the GQC is developing Standard Operating Procedures (S...
High-resolution swath interferometric data collected within Muskeget Channel, Massachusetts
Pendleton, Elizabeth A.; Denny, Jane F.; Danforth, William W.; Baldwin, Wayne E.; Irwin, Barry J.
2014-01-01
Swath interferometric bathymetry data were collected within and around Muskeget Channel and along select nearshore areas south and east of Martha's Vineyard, Massachusetts. Data were collected aboard the U.S. Geological Survey research vessel Rafael in October and November 2010 in a collaborative effort between the U.S. Geological Survey and the Woods Hole Oceanographic Institution. This report describes the data-collection and data-processing methods and releases the data in geospatial format. These data were collected to support an assessment of the effect that a tidal instream energy conversion facility would have on sediment transport within Muskeget Channel. Baseline bathymetry data were obtained for the Muskeget Channel area, and surveys in select areas were repeated after one month to monitor sediment transport and bedform migration.
NASA Astrophysics Data System (ADS)
Trofymow, J. A.; Gougeon, F.; Kelley, J. W.
2017-12-01
Forest carbon (C) models require knowledge of C transfers due to intense disturbances such as fire, harvest, and slash burning. In such events, live trees die and C is transferred to detritus or exported as round wood. With burning, live and detrital C is lost as emissions. Burning can be incomplete, leaving wood charred and scattered, or in unburnt rings and piles. For harvests, all round wood volume is routinely measured, while dispersed and piled residue volumes are typically assessed in field surveys and scaled to a block. Recently, geospatial methods have been used to determine, for an entire block, piled residues using LiDAR or image point clouds (PC) and dispersed residues by analysis of high-resolution imagery. Second-growth Douglas-fir forests on eastern Vancouver Island were examined: 4 blocks at Oyster River (OR) and 2 at Northwest Bay (NB). OR blocks were cut winter 2011 and piled spring 2011; a field survey, aerial RGB imagery, and a LiDAR PC were acquired fall 2011; piles were burned and burn residues surveyed; and post-burn aerial RGB imagery was acquired 2012. NB blocks were cut fall 2014 and piled spring 2015; a field survey, UAV RGB imagery, and an image PC were acquired summer 2015; piles were burned and burn residues surveyed spring 2016; and post-burn UAV RGB imagery and a PC were acquired fall 2016. Volume-to-biomass conversion used survey species proportions and wood density. At OR, round wood was 261.7 SE 13.1, firewood 1.7 SE 0.3, and dispersed residue by survey 13.8 SE 3.6 tonnes dry mass (t dm) ha-1. Piled residues were 8.2 SE 0.9 from pile surveys vs. 25.0 SE 5.9 t dm ha-1 from LiDAR PC bulk pile volumes and packing ratios. Post-burn, piles lost 5.8 SE 0.5 from the survey of burn residues vs. 18.2 SE 4.7 t dm ha-1 from pile volume changes using the 2011 LiDAR PC and 2012 imagery. The percentage of initial merchantable biomass exported as round and fire wood, remaining as dispersed and piled residue, and lost to burning was, respectively, 92.5%, 5.5%, and 2% using only field methods vs. 87%, 7%, and 6% from dispersed residue surveys and LiDAR PC pile volumes. At NB, preliminary analysis shows the post-burn difference in 2015-to-2016 UAV PC pile volumes was similar to that obtained using the 2015 UAV PC pile volume and the 2016 orthophoto pile area burned, suggesting the two geospatial methods are comparable. Comparisons will be made for transfers in all 6 blocks using only field survey or geospatial methods.
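The volume-to-biomass conversion for residue piles described above can be sketched as bulk pile volume reduced by a packing ratio (the solid-wood fraction) and converted with wood density. All numeric values below, including the packing ratio, are hypothetical, not the study's values:

```python
# Rough sketch of a pile volume-to-biomass conversion (hypothetical numbers).

def pile_biomass_t_per_ha(bulk_volume_m3: float, packing_ratio: float,
                          wood_density_t_per_m3: float, area_ha: float) -> float:
    """Biomass (t dm ha-1) = bulk pile volume * packing ratio
    (solid-wood fraction) * wood density, normalized by block area."""
    solid_volume = bulk_volume_m3 * packing_ratio
    return solid_volume * wood_density_t_per_m3 / area_ha

# e.g. a LiDAR-derived bulk pile volume of 2000 m3 on a 20 ha block
print(round(pile_biomass_t_per_ha(2000.0, 0.5, 0.45, 20.0), 1))
```

The sensitivity of the result to the assumed packing ratio is one plausible reason the geospatial pile estimates diverge from the field pile surveys in the abstract.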
Grid Enabled Geospatial Catalogue Web Service
NASA Technical Reports Server (NTRS)
Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush
2004-01-01
Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing, and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC, and NASA EOS Core System, and to service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, especially query on-demand data in the virtual community, and retrieve it through the data-related services, which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing capability, data location, processing, and management. GCWS is also a key component for workflow-based virtual geospatial data production.
Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E
2015-01-01
Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. 
Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.
NASA Astrophysics Data System (ADS)
Neale, A. C.
2016-12-01
EnviroAtlas is a multi-organization effort led by the US Environmental Protection Agency to develop, host and display a large suite of nation-wide geospatial indicators and indices of ecosystem services. This open access tool allows users to view, analyze, and download a wealth of geospatial data and other resources related to ecosystem goods and services. More than 160 national indicators of ecosystem service supply, demand, and drivers of change provide a framework to inform decisions and policies at multiple spatial scales, educate a range of audiences, and supply data for research. A higher resolution component is also available, providing over 100 data layers for finer-scale analyses for selected communities across the US. The ecosystem goods and services data are organized into seven general ecosystem benefit categories: clean and plentiful water; natural hazard mitigation; food, fuel, and materials; climate stabilization; clean air; biodiversity conservation; and recreation, culture, and aesthetics. Each indicator is described in terms of how it is important to human health or well-being. EnviroAtlas includes data describing existing ecosystem markets for water quality and quantity, biodiversity, wetland mitigation, and carbon credits. This presentation will briefly describe the EnviroAtlas data and tools and how they are being developed and used in ongoing research studies and in decision-making contexts.
Mapping irrigated lands at 250-m scale by merging MODIS data and National Agricultural Statistics
Pervez, Md Shahriar; Brown, Jesslyn F.
2010-01-01
Accurate geospatial information on the extent of irrigated land improves our understanding of agricultural water use, local land surface processes, conservation or depletion of water resources, and components of the hydrologic budget. We have developed a method in a geospatial modeling framework that assimilates irrigation statistics with remotely sensed parameters describing vegetation growth conditions in areas with agricultural land cover to spatially identify irrigated lands at 250-m cell size across the conterminous United States for 2002. The geospatial model result, known as the Moderate Resolution Imaging Spectroradiometer (MODIS) Irrigated Agriculture Dataset (MIrAD-US), identified irrigated lands with reasonable accuracy in California and semiarid Great Plains states with overall accuracies of 92% and 75% and kappa statistics of 0.75 and 0.51, respectively. A quantitative accuracy assessment of MIrAD-US for the eastern region has not yet been conducted, and qualitative assessment shows that model improvements are needed for the humid eastern regions where the distinction in annual peak NDVI between irrigated and non-irrigated crops is minimal and county sizes are relatively small. This modeling approach enables consistent mapping of irrigated lands based upon USDA irrigation statistics and should lead to better understanding of spatial trends in irrigated lands across the conterminous United States. An improved version of the model with revised datasets is planned and will employ 2007 USDA irrigation statistics.
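The accuracy figures cited above (overall accuracy and kappa statistic) come from a standard confusion-matrix calculation. A sketch with an invented 2x2 matrix, not the MIrAD-US validation data:

```python
# Overall accuracy and Cohen's kappa from a confusion matrix.
# The matrix values below are made up for illustration.

def accuracy_and_kappa(matrix):
    """matrix[i][j]: count of reference class i mapped as class j."""
    n = sum(sum(row) for row in matrix)
    po = sum(matrix[i][i] for i in range(len(matrix))) / n  # overall accuracy
    pe = sum(  # chance agreement from row/column marginals
        (sum(matrix[i]) / n) * (sum(row[i] for row in matrix) / n)
        for i in range(len(matrix))
    )
    kappa = (po - pe) / (1 - pe)
    return po, kappa

cm = [[80, 20],   # reference irrigated: 80 mapped correctly, 20 missed
      [10, 90]]   # reference non-irrigated
po, k = accuracy_and_kappa(cm)
print(round(po, 2), round(k, 2))
```

Kappa discounts chance agreement, which is why a map can show a high overall accuracy (75%) but a more modest kappa (0.51), as reported for the Great Plains.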
Urban-hazard risk analysis: mapping of heat-related risks in the elderly in major Italian cities.
Morabito, Marco; Crisci, Alfonso; Gioli, Beniamino; Gualtieri, Giovanni; Toscano, Piero; Di Stefano, Valentina; Orlandini, Simone; Gensini, Gian Franco
2015-01-01
Short-term impacts of high temperatures on the elderly are well known. Even though Italy has the highest proportion of elderly citizens in Europe, there is a lack of information on spatial heat-related risks to the elderly. The objective of this study was the development of high-resolution, heat-related urban risk maps regarding the elderly population (≥65). A long time-series (2001-2013) of remote sensing MODIS data, averaged over the summer period for eleven major Italian cities, was downscaled to obtain high spatial resolution (100 m) daytime and night-time land surface temperatures (LST). LST was estimated pixel-wise by applying two statistical model approaches: 1) the Linear Regression Model (LRM), and 2) the Generalized Additive Model (GAM). Total and elderly population density data were extracted from the Joint Research Centre population grid (100 m) from the 2001 census (Eurostat source) and processed together using "Crichton's Risk Triangle" hazard-risk methodology to obtain a Heat-related Elderly Risk Index (HERI). The GAM procedure allowed for improved daytime and night-time LST estimations compared to the LRM approach. High-resolution maps of daytime and night-time HERI levels were developed for inland and coastal cities. Urban areas with the hazardous HERI level (very high risk) were not necessarily characterized by the highest temperatures. The hazardous HERI level was generally localized to encompass the city centre in inland cities and the inner area in coastal cities. The two most dangerous HERI levels were greater in coastal than in inland cities. This study shows the great potential of combining geospatial technologies and spatial demographic characteristics within a simple and flexible framework in order to provide high-resolution urban mapping of daytime and night-time HERI. In this way, potential areas for intervention are immediately identified with up-to-street-level detail.
This information could support public health operators and facilitate coordination for heat-related emergencies.
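Crichton's Risk Triangle combines hazard, exposure, and vulnerability; a pixel's risk is high only when all three are. The sketch below is a hypothetical illustration of that combination: the normalization ranges, thresholds, and class labels are invented, not the HERI definitions:

```python
# Hypothetical sketch of a Risk Triangle combination for one 100-m pixel:
# hazard (LST), exposure (population density), vulnerability (elderly share).
# All ranges and thresholds below are assumptions for illustration only.

def heri_level(lst_c: float, pop_density: float, elderly_fraction: float) -> str:
    """Toy risk class from the product of normalized components."""
    hazard = min(max((lst_c - 20.0) / 20.0, 0.0), 1.0)   # 20-40 C mapped to 0-1
    exposure = min(pop_density / 10000.0, 1.0)           # persons per km2
    vulnerability = min(elderly_fraction / 0.4, 1.0)     # >=40% elderly -> 1
    risk = hazard * exposure * vulnerability
    if risk > 0.5:
        return "very high"
    if risk > 0.25:
        return "high"
    return "moderate or lower"

print(heri_level(lst_c=36.0, pop_density=9000.0, elderly_fraction=0.35))
```

Because risk is a product, a very hot but sparsely populated pixel can score lower than a cooler pixel dense with elderly residents, matching the abstract's finding that the hottest areas were not necessarily the riskiest.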
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data.
The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Digital orthoimagery base specification V1.0
Rufe, Philip P.
2014-01-01
The resolution requirement for orthoimagery in support of The National Map of the U.S. Geological Survey (USGS) is 1 meter. However, as the Office of Management and Budget Circular A-16 designated Federal agency responsible for base orthoimagery, the USGS National Geospatial Program (NGP) has developed this base specification to include higher resolution orthoimagery. Many Federal, State, and local programs use high-resolution orthoimagery for various purposes, including critical infrastructure management, vector data updates, land-use analysis, natural resource inventory, and data extraction. The complex nature of large-area orthoimagery datasets, combined with the broad interest in orthoimagery of consistent quality and spatial accuracy, requires high-resolution orthoimagery to meet or exceed the format and content outlined in this specification. The USGS intends to use this specification primarily to create consistency across all NGP funded and managed orthoimagery collections, in particular collections in support of the National Digital Orthoimagery Program (NDOP). In the absence of other comprehensive specifications or standards, the USGS intends that this specification will, to the highest degree practical, be adopted by other USGS programs and mission areas, and by other Federal agencies. This base specification defines minimum parameters for orthoimagery data collection. Local conditions in any given project area, specialized applications for the data, or the preferences of cooperators may mandate more stringent requirements. The USGS fully supports the acquisition of more detailed, accurate, or value-added data that exceed the base specification outlined herein. A partial list of common “buy-up” options is provided in appendix 1 for those areas and projects that require more stringent or expanded specifications.
NASA Astrophysics Data System (ADS)
Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.
2008-12-01
EarthRef.org is a comprehensive and convenient resource for Earth Science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available, and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data, ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have, therefore, been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written for reliability and speed in Adobe Flash. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map, which can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser and focus on a separate application; and Google Maps, written in JavaScript, does not scale up reliably to large datasets.
FlashMap remedies these problems as a web-based application that seamlessly combines the real-time display power of Google Earth with the flexibility of the web, without losing scalability or control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2 compatible web service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (see SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth to load and view geospatial content within a browser from any computer with an Internet connection.
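Because FlashMap consumes standard KML 2.2, any client can extract Placemark features from the same files with an ordinary XML parser. A minimal sketch using Python's standard library follows; the sample document and its coordinates are illustrative, not taken from the FeMO data:

```python
import xml.etree.ElementTree as ET

# KML 2.2 default namespace, required to address elements in the document.
KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

# Illustrative KML snippet of the kind FlashMap would load as a layer.
sample = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Loihi Seamount</name>
      <Point><coordinates>-155.27,18.92,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

def placemarks(kml_text):
    """Yield (name, lon, lat) for each Point Placemark in a KML document."""
    root = ET.fromstring(kml_text)
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", namespaces=KML_NS)
        coords = pm.findtext(".//kml:coordinates", namespaces=KML_NS)
        lon, lat = (float(v) for v in coords.strip().split(",")[:2])
        yield name, lon, lat

print(list(placemarks(sample)))  # [('Loihi Seamount', -155.27, 18.92)]
```

KML orders coordinates longitude-first, which is the usual stumbling block when handing features to mapping libraries that expect latitude-first pairs.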
Remote Sensing Product Verification and Validation at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Stanley, Thomas M.
2005-01-01
Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.
Xian, George
2008-01-01
Using high-resolution orthoimagery and medium-resolution Landsat satellite imagery together with other geospatial information, several land surface parameters, including impervious surfaces and land surface temperatures, are obtained for three geographically distinct urban areas in the United States: Seattle, Washington; Tampa Bay, Florida; and Las Vegas, Nevada. Percent impervious surface is used to quantitatively define the spatial extent and development density of urban land use. Land surface temperatures were retrieved by using a single-band algorithm that processes both thermal infrared satellite data and total atmospheric water vapor content. Land surface temperatures were analyzed for different land use and land cover categories in the three regions. The heterogeneity of urban land surfaces and the associated spatial extents were shown to influence surface thermal conditions because of the removal of vegetative cover, the introduction of non-transpiring surfaces, and the reduction in evaporation over urban impervious surfaces. Fifty years of in situ climate data were integrated to assess regional climatic conditions. The spatial structure of surface heating influenced by landscape characteristics has a profound influence on regional climate conditions, especially through urban heat island effects.
High resolution global gridded data for use in population studies
Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.
2017-01-01
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely, and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) a country area grid; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. Datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website. PMID:28140386
Lemke, Lawrence D; Lamerato, Lois E; Xu, Xiaohong; Booza, Jason C; Reiners, John J; Raymond, Delbert M, III; Villeneuve, Paul J; Lavigne, Eric; Larkin, Dana; Krouse, Helene J
2014-07-01
The Geospatial Determinants of Health Outcomes Consortium (GeoDHOC) study investigated ambient air quality across the international border between Detroit, Michigan, USA and Windsor, Ontario, Canada and its association with acute asthma events in 5- to 89-year-old residents of these cities. NO2, SO2, and volatile organic compounds (VOCs) were measured at 100 sites, and particulate matter (PM) and polycyclic aromatic hydrocarbons (PAHs) at 50 sites during two 2-week sampling periods in 2008 and 2009. Acute asthma event rates across neighborhoods in each city were calculated using emergency room visits and hospitalizations and standardized to the overall age and gender distribution of the population in the two cities combined. Results demonstrate that intra-urban air quality variations are related to adverse respiratory events in both cities. Annual 2008 asthma rates exhibited statistically significant positive correlations with total VOCs and total benzene, toluene, ethylbenzene and xylene (BTEX) at 5-digit zip code scale spatial resolution in Detroit. In Windsor, NO2, VOCs, and PM10 concentrations correlated positively with 2008 asthma rates at a similar 3-digit postal forward sortation area scale. The study is limited by its coarse temporal resolution (comparing relatively short term air quality measurements to annual asthma health data) and interpretation of findings is complicated by contrasts in population demographics and health-care delivery systems in Detroit and Windsor.
Geospatial considerations for a multiorganizational, landscape-scale program
O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.
2013-01-01
Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.
Considerations on Geospatial Big Data
NASA Astrophysics Data System (ADS)
LIU, Zhen; GUO, Huadong; WANG, Changlin
2016-11-01
Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deeply into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and that it is important to explore the generality of growing geospatial data ‘from bottom to top’. In particular, four research areas that most clearly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Finally, some challenges and thoughts are raised for future discussion.
A program for handling map projections of small-scale geospatial raster data
Finn, Michael P.; Steinwand, Daniel R.; Trent, Jason R.; Buehler, Robert A.; Mattli, David M.; Yamamoto, Kristina H.
2012-01-01
Scientists routinely accomplish small-scale geospatial modeling using raster datasets of global extent. Such use often requires the projection of global raster datasets onto a map or the reprojection from a given map projection associated with a dataset. The distortion characteristics of these projection transformations can have significant effects on modeling results. Distortions associated with the reprojection of global data are generally greater than distortions associated with reprojections of larger-scale, localized areas. The accuracy of areas in projected raster datasets of global extent is dependent on spatial resolution. To address these problems of projection and the associated resampling that accompanies it, methods for framing the transformation space, direct point-to-point transformations rather than gridded transformation spaces, a solution to the wrap-around problem, and an approach to alternative resampling methods are presented. The implementations of these methods are provided in an open-source software package called MapImage (or mapIMG, for short), which is designed to function on a variety of computer architectures.
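The direct point-to-point transformation described above can be illustrated with a simple analytic projection. The sketch below implements the equal-area sinusoidal projection; the radius value and function names are assumptions for illustration, and mapIMG itself supports many more projections and the raster resampling around them:

```python
import math

R = 6371007.2  # authalic Earth radius in metres (an assumption for this sketch)

def sinusoidal_forward(lon_deg, lat_deg, lon0_deg=0.0):
    """Forward point-to-point mapping: geographic degrees -> projected metres."""
    lon, lat, lon0 = (math.radians(v) for v in (lon_deg, lat_deg, lon0_deg))
    x = R * (lon - lon0) * math.cos(lat)  # meridian spacing shrinks with latitude
    y = R * lat
    return x, y

def sinusoidal_inverse(x, y, lon0_deg=0.0):
    """Inverse mapping: projected metres -> geographic degrees."""
    lat = y / R
    lon = math.radians(lon0_deg) + x / (R * math.cos(lat))
    return math.degrees(lon), math.degrees(lat)

# Round-tripping a point demonstrates the exact, grid-free transformation.
x, y = sinusoidal_forward(10.0, 45.0)
lon, lat = sinusoidal_inverse(x, y)
print(round(lon, 9), round(lat, 9))  # 10.0 45.0
```

Transforming each output cell centre back through the inverse mapping, rather than interpolating across a gridded transformation space, is what avoids the gridded-space artifacts the paper discusses; the wrap-around problem appears when the recovered longitude falls outside [-180, 180] and must be normalized.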
Global polar geospatial information service retrieval based on search engine and ontology reasoning
Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang
2007-01-01
In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search finds coarse candidate services on the web, and ontology reasoning refines those coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.
Smith, Dianna; Mathur, Rohini; Robson, John; Greenhalgh, Trisha
2012-01-01
Objective To explore the feasibility of producing small-area geospatial maps of chronic disease risk for use by clinical commissioning groups and public health teams. Study design Cross-sectional geospatial analysis using routinely collected general practitioner electronic record data. Sample and setting Tower Hamlets, an inner-city district of London, UK, characterised by high socioeconomic and ethnic diversity and high prevalence of non-communicable diseases. Methods The authors used type 2 diabetes as an example. The data set was drawn from electronic general practice records on all non-diabetic individuals aged 25–79 years in the district (n=163 275). The authors used a validated instrument, QDScore, to calculate 10-year risk of developing type 2 diabetes. Using specialist mapping software (ArcGIS), the authors produced visualisations of how these data varied by lower and middle super output area across the district. The authors enhanced these maps with information on examples of locality-based social determinants of health (population density, fast food outlets and green spaces). Data were piloted as three types of geospatial map (basic, heat and ring). The authors noted practical, technical and information governance challenges involved in producing the maps. Results Usable data were obtained on 96.2% of all records. One in 11 adults in our cohort was at ‘high risk’ of developing type 2 diabetes with a 20% or more 10-year risk. Small-area geospatial mapping illustrated ‘hot spots’ where up to 17.3% of all adults were at high risk of developing type 2 diabetes. Ring maps allowed visualisation of high risk for type 2 diabetes by locality alongside putative social determinants in the same locality. The task of downloading, cleaning and mapping data from electronic general practice records posed some technical challenges, and judgement was required to group data at an appropriate geographical level. 
Information governance issues were time consuming and required local and national consultation and agreement. Conclusions Producing small-area geospatial maps of diabetes risk calculated from general practice electronic record data across a district-wide population was feasible but not straightforward. Geovisualisation of epidemiological and environmental data, made possible by interdisciplinary links between public health clinicians and human geographers, allows presentation of findings in a way that is both accessible and engaging, hence potentially of value to commissioners and policymakers. Impact studies are needed of how maps of chronic disease risk might be used in public health and urban planning. PMID:22337817
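The small-area aggregation behind such maps reduces to grouping individual risk scores by area code and computing the share above the 20% cut-off reported in the study. A toy sketch follows; the area codes and risk values are invented, and QDScore itself is not reimplemented here:

```python
from collections import defaultdict

# Toy records: (small-area code, 10-year type 2 diabetes risk from a
# QDScore-like instrument). Both codes and values are hypothetical.
records = [("E01", 0.22), ("E01", 0.08), ("E01", 0.25),
           ("E02", 0.05), ("E02", 0.21)]

HIGH_RISK = 0.20  # the 'high risk' threshold used in the study

# Group individual risks by small area.
by_area = defaultdict(list)
for area, risk in records:
    by_area[area].append(risk)

# Percentage of adults at high risk per area; this is the value a
# basic/heat/ring map would shade.
pct_high = {area: 100 * sum(r >= HIGH_RISK for r in risks) / len(risks)
            for area, risks in by_area.items()}
print(pct_high)
```

In practice the grouping key would be the lower or middle super output area derived from each patient's postcode, which is exactly the step the authors note requires judgement about geographical level.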
Challenges in sharing of geospatial data by data custodians in South Africa
NASA Astrophysics Data System (ADS)
Kay, Sissiel E.
2018-05-01
As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is to share geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.
NASA Astrophysics Data System (ADS)
Zalles, D. R.
2011-12-01
The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of challenges that teachers and students encounter navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that by providing a scaffolded technology-supported process for instructional design, starting from fundamental questions about the content domain, teachers will make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners that work off of "starter" lessons covering focal skills and understandings will lead to the most effective utilizations of the technology in the classroom. 
The projects' goals and strategies for student learning proceed from research suggesting that students will be more engaged and able to utilize prior knowledge better when seeing the local and hence personal relevance of climate change and other pressing contemporary science-related issues. In these projects, the students look for climate change trends in geospatial Earth System data layers from weather stations, satellites, and models in relation to global trends. They examine these data to (1) reify what they are learning in science class about meteorology, climate, and ecology, (2) build inquiry skills by posing and seeking answers to research questions, and (3) build data literacy skills through experience generating appropriate data queries and examining data output on different forms of geospatial representations such as maps, elevation profiles, and time series plots. Teachers also are given the opportunity to have their students look at geospatially represented census data from the tool Social Explorer (http://www.socialexplorer.com/pub/maps/home.aspx) in order to better understand demographic trends in relation to climate change-related trends in the Earth system. Early results will be reported about teacher professional development and student learning, gleaned from interviews and observations.
PLANNING QUALITY IN GEOSPATIAL PROJECTS
This presentation will briefly review some legal drivers and present a structure for writing geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.
Automatic geospatial information Web service composition based on ontology interface matching
NASA Astrophysics Data System (ADS)
Xu, Xianbin; Wu, Qunyong; Wang, Qinmin
2008-10-01
With Web services technology, the functions of WebGIS can be presented as a kind of geospatial information service, helping to overcome the information-isolated situation in the geospatial information sharing field. Thus geospatial information web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, plays the key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among interfaces. By matching input/output parameters against the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented; its results show the feasibility of the algorithm and its promise for meeting the emerging demand for geospatial information web service composition.
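The interface-matching step can be sketched as follows. This toy version substitutes a hand-made synonym table for WordNet-derived semantic distances, and all names, parameters, and the threshold are assumptions, not the paper's algorithm:

```python
from difflib import SequenceMatcher

# Hypothetical synonym table standing in for WordNet-derived semantic
# distances between interface parameter names.
SYNONYMS = {
    "imagery": {"image", "raster", "scene"},
    "boundary": {"border", "outline"},
}

def semantic_similarity(a, b):
    """1.0 for identical or known-synonym terms, else a string-level ratio."""
    a, b = a.lower(), b.lower()
    if a == b or b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set()):
        return 1.0
    return SequenceMatcher(None, a, b).ratio()

def chainable(output_params, input_params, threshold=0.8):
    """True if every required input of the downstream service is matched by
    some output of the upstream service, i.e. the two can be chained."""
    return all(
        max(semantic_similarity(o, i) for o in output_params) >= threshold
        for i in input_params
    )

print(chainable(["imagery", "boundary"], ["raster"]))  # True
```

A composition engine would apply such a test pairwise over candidate services and keep only chains where every link passes, which is the value-added chaining the abstract describes.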
Retrieval of Mid-tropospheric CO2 Directly from AIRS Measurements
NASA Technical Reports Server (NTRS)
Olsen, Edward T.; Chahine, Moustafa T.; Chen, Luke L.; Pagano, Thomas S.
2008-01-01
We apply the method of Vanishing Partial Derivatives (VPD) to AIRS spectra to retrieve daily the global distribution of CO2 at a nadir geospatial resolution of 90 km x 90 km without requiring a first-guess input beyond the global average. Our retrievals utilize the 15 (micro)m band radiances, a complex spectral region. This method may be of value in other applications, in which spectral signatures of multiple species are not well isolated spectrally from one another.
NASA Astrophysics Data System (ADS)
Rodríguez-Galiano, Víctor; Garcia-Soldado, Maria José; Chica-Olmo, Mario
The importance of accurate and timely information describing the nature and extent of land and natural resources is increasing, especially in rapidly growing metropolitan areas. While metropolitan-area decision makers are in constant need of current geospatial information on patterns and trends in land cover and land use, relatively little research has investigated the influence of satellite data resolution on the monitoring of geo-environmental information. In this research, a suite of remote sensing and GIS techniques is applied in a land use mapping study. The main task is to assess the influence of spatial and spectral resolution on the separability between classes and on classification accuracy. The study focuses on an area that is highly dynamic with respect to land use, located in the province of Granada (SE Spain). The classification results of the Airborne Hyperspectral Scanner (AHS, Daedalus Enterprise Inc., WA, USA) at different spatial resolutions (2, 4, and 6 m) and of Landsat 5 TM data have been compared.
75 FR 6056 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
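The asynchrony pattern examined here, in which a client submits a request, receives an identifier at once, and polls rather than blocks, can be sketched with Python's asyncio. The job names and in-memory job store are hypothetical stand-ins for a WS-BPEL/OWS-6 service:

```python
import asyncio

# In-memory job store standing in for a geoprocessing service's job registry.
JOBS = {}  # job id -> status

async def run_job(job_id, seconds):
    """The long-running geospatial processing step, simulated with a sleep."""
    JOBS[job_id] = "running"
    await asyncio.sleep(seconds)
    JOBS[job_id] = "done"

async def submit(job_id, seconds):
    """Accept the request and return immediately with a job id; the work
    proceeds in the background, so the client can resume its own processing."""
    asyncio.create_task(run_job(job_id, seconds))
    return job_id

async def main():
    jid = await submit("reproject-42", 0.05)  # returns without waiting
    while JOBS.get(jid) != "done":            # client polls at its own pace
        await asyncio.sleep(0.01)
    print(jid, JOBS[jid])

asyncio.run(main())  # prints: reproject-42 done
```

The alternative to polling is the callback pattern, in which the client registers an endpoint the service notifies on completion; both fit the asynchrony patterns the paper examines for WS-BPEL orchestration.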
Falcone, James A.; Carlisle, Daren M.; Wolock, David M.; Meador, Michael R.
2010-01-01
In addition, watersheds were assessed for their reference quality within nine broad regions for use in studies intended to characterize stream flows under conditions minimally influenced by human activities. Three primary criteria were used to assess reference quality: (1) a quantitative index of anthropogenic modification within the watershed based on GIS-derived variables, (2) visual inspection of every stream gage and drainage basin from recent high-resolution imagery and topographic maps, and (3) information about man-made influences from USGS Annual Water Data Reports. From the set of 6785 sites, we identified 1512 as reference-quality stream gages. All data derived for these watersheds as well as the reference condition evaluation are provided as an online data set termed GAGES (geospatial attributes of gages for evaluating stream flow).
NASA Astrophysics Data System (ADS)
Ogden, F. L.
2017-12-01
High-performance computing and the widespread availability of geospatial physiographic and forcing datasets have enabled consideration of flood impact predictions with longer lead times and more detailed spatial descriptions. We are now considering multi-hour flash flood forecast lead times at the subdivision level in so-called hydroblind regions away from the National Hydrography network. However, the computational demands of such models are high, necessitating a nested simulation approach. Research on hyper-resolution hydrologic modeling over the past three decades has illustrated some fundamental limits on predictability that are simultaneously related to runoff generation mechanism(s), antecedent conditions, rates and total amounts of precipitation, discretization of the model domain, and complexity or completeness of the model formulation. This latter point is an acknowledgement that hydrologic understanding in key areas related to land use, land cover, tillage practices, seasonality, and biological effects has some glaring deficiencies. This presentation reviews what is known about the interacting effects of precipitation amount, model spatial discretization, antecedent conditions, physiographic characteristics, and model formulation completeness for runoff predictions. These interactions define a region in multidimensional forcing, parameter, and process space where there are in some cases clear limits on predictability, and in other cases diminished uncertainty.
Large Scale Analysis of Geospatial Data with Dask and XArray
NASA Astrophysics Data System (ADS)
Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.
2017-12-01
The analysis of geospatial data with high-level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high-level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are familiar to high-level-language researchers while also scaling out to much larger datasets. This broadens researchers' access to larger datasets on high-performance computers and, through interactive development, reduces time-to-insight when compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, as well as recent benefits and challenges of distributed computing on clusters of machines.
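The deferred, chunked evaluation model these libraries provide looks like this in practice; a minimal sketch in which the array size, chunking, and variable names are illustrative:

```python
import dask.array as da
import xarray as xr

# A 1-D array split into four chunks; with real data each chunk could live
# on a different worker, so the whole array never needs to fit in memory.
x = da.ones((4000,), chunks=1000)

# Wrap the lazy Dask array with a labeled dimension, XArray-style.
temps = xr.DataArray(x, dims=["time"], name="t2m")

# Operations build a task graph; nothing executes until .compute().
total = temps.sum().compute()
print(float(total))  # 4000.0
```

The same NumPy-like expressions work unchanged whether the backing store is an in-memory array or a chunked dataset on a cluster, which is what makes the interactive workflow scale.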
Web mapping system for complex processing and visualization of environmental geospatial datasets
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor
2016-04-01
Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. Due to a number of objective reasons, such as the inherent heterogeneity of environmental datasets, big dataset volume, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of unified terminology, the development of environmental geodata access, processing, and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata information. It should be noted that modern web mapping systems as integrated geoportal applications are developed based on the SOA and might be considered as complexes of interconnected software tools for working with geospatial data. In the report a complex web mapping system including a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive is presented. There are three basic tiers of the GIS web client:
1. A tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format.
2. A tier of JavaScript objects implementing methods handling: NetCDF metadata; a Task XML object for configuring user calculations and input and output formats; and OGC WMS/WFS cartographical services.
3. A graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic.
The metadata tier consists of a number of JSON objects containing technical information describing geospatial datasets (such as spatio-temporal resolution, meteorological parameters, valid processing methods, etc.).
The middleware tier of JavaScript objects, implementing methods for handling geospatial metadata, the task XML object, and WMS/WFS cartographical services, interconnects the metadata and GUI tiers. Its methods include procedures such as downloading and updating JSON metadata; launching and tracking calculation tasks running on remote servers; and working with WMS/WFS cartographical services, including obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shapefile), and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt, and ExtJS) and represents a set of software components implementing the web mapping application's business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities to the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of popular desktop GIS applications such as uDig and QuantumGIS. The web mapping system has proven effective in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
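The tier structure described above can be sketched in miniature. In this hypothetical Python example (the endpoint URL, layer name, and metadata fields are invented for illustration), a JSON metadata record standing in for the metadata tier is translated by a middleware-style function into a standard OGC WMS GetMap request:

```python
import json
from urllib.parse import urlencode

# Metadata tier: a JSON description of one geospatial dataset (hypothetical fields).
DATASET_JSON = """
{
  "name": "air_temperature",
  "wms_layer": "tas_mean",
  "bbox": [60.0, 50.0, 120.0, 80.0],
  "time_range": ["1981-01-01", "2010-12-31"]
}
"""

def build_getmap_url(base_url, meta, width=800, height=600):
    """Middleware tier: translate dataset metadata into a WMS 1.1.1 GetMap URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": meta["wms_layer"],
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in meta["bbox"]),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

meta = json.loads(DATASET_JSON)
url = build_getmap_url("http://example.org/wms", meta)
```

A GUI tier would then hand such a URL to a map widget (e.g., an OpenLayers image layer) for display.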
Geospatial Service Platform for Education and Research
NASA Astrophysics Data System (ADS)
Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.
2014-04-01
We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held at the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries and train them in the platform's online tools for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as instances to explain specifically how to build service chains for different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
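The service-chaining idea taught in the course can be illustrated abstractly. The sketch below uses plain Python functions rather than real Web services, and the service names are invented: each geoprocessing "service" transforms a dataset description, and chaining composes them into one workflow:

```python
from functools import reduce

# Hypothetical atomic geoprocessing "services": each takes and returns a dict
# describing a geospatial dataset.
def reproject(ds):
    return {**ds, "crs": "EPSG:3857"}

def clip(ds):
    return {**ds, "extent": (0, 0, 100, 100)}

def compute_ndvi(ds):
    return {**ds, "band": "ndvi"}

def chain(*services):
    """Service chaining: compose services into one workflow, applied left to right."""
    return lambda ds: reduce(lambda acc, svc: svc(acc), services, ds)

workflow = chain(reproject, clip, compute_ndvi)
result = workflow({"name": "scene42", "crs": "EPSG:4326"})
```

Tools such as GeoChaining or GeoJModelBuilder play the role of `chain` here, but wire together remote OGC services instead of local functions.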
Shiika, Yulia; Kruger, Estie; Tennant, Marc
Australia has a significant maldistribution of its limited dental workforce. Outside the major capital cities, the distribution of accessible dental care is at best patchy. This study applied geospatial analysis technology to locate gaps in dental service accessibility for rural and remote dwelling Australians, in order to test the hypothesis that there are a few key locations in Australia where further dental services could make a significant contribution to ameliorating the immediate shortage crisis. A total of 2,086 dental practices were located in country areas, covering a combined catchment area of 1.84 million square kilometers based on 50 km catchment zones around each clinic. Geospatial analysis was used to identify gaps in the accessibility of dental services, drawing on data extracted from an integrated, geographically aligned database. The results indicate that providing dental practices in 74 townships (each with more than 500 residents) across Australia could potentially address access for 104,000 people. An examination of the socio-economic mix found that the majority of dental practices (84%) are located in areas classified as less disadvantaged. The study produced a cohesive national map of potential location sites for dental clinics where targeting dental services could improve health, addressing the current inequity in access to dental services in rural and remote Australia.
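The catchment analysis described here can be approximated with a simple distance test. The following sketch (with invented coordinates, not the study's data) flags townships lying outside the 50 km catchment circle of every clinic, using the haversine great-circle distance:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def underserved(townships, clinics, radius_km=50.0):
    """Townships not covered by any clinic's catchment circle."""
    return [name for name, lat, lon in townships
            if all(haversine_km(lat, lon, clat, clon) > radius_km
                   for clat, clon in clinics)]

# Hypothetical coordinates for illustration only.
clinics = [(-31.95, 115.86)]             # one clinic near Perth
towns = [("NearTown", -31.90, 115.90),   # ~7 km away: covered
         ("FarTown", -25.00, 125.00)]    # far inland: not covered
gaps = underserved(towns, clinics)
```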
EPA GEOSPATIAL QUALITY COUNCIL
The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team - EPA/600/R-00/009 was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...
Geospatial Thinking of Information Professionals
ERIC Educational Resources Information Center
Bishop, Bradley Wade; Johnston, Melissa P.
2013-01-01
Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…
Semantic Segmentation of Forest Stands of Pure Species as a Global Optimization Problem
NASA Astrophysics Data System (ADS)
Dechesne, C.; Mallet, C.; Le Bris, A.; Gouet-Brunet, V.
2017-05-01
Forest stand delineation is a fundamental task for forest management purposes that is still mainly performed manually, through visual inspection of (very) high spatial resolution geospatial images. Stand detection has barely been addressed in the literature, which, in forested environments, has mainly focused on individual tree extraction and tree species classification. From a methodological point of view, stand detection can be considered a semantic segmentation problem. This offers two advantages. First, one can retrieve the dominant tree species per segment. Secondly, one can use existing low-level tree species label maps from the literature as a basis for high-level object extraction. The semantic segmentation issue thus becomes a regularization issue in a weakly structured environment and can be formulated in an energy-based framework. This paper investigates which regularization strategies from the literature are best adapted to delineating and classifying forest stands of pure species. Both airborne lidar point clouds and multispectral very high spatial resolution images are integrated for that purpose. Local methods (such as filtering and probabilistic relaxation) are not well adapted to this problem, since they increase classification accuracy by less than 5%. Global methods, based on an energy model, tend to be more efficient, with accuracy gains of up to 15%. The segmentation results using such models have an accuracy ranging from 96% to 99%.
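As a concrete, much-simplified illustration of the global energy-based regularization the paper favors, the sketch below minimizes a unary cost plus a Potts smoothness penalty using iterated conditional modes on a toy label grid. The costs and grid are invented, and the paper's actual energy model and optimizer may well differ:

```python
# Energy: sum of per-pixel unary costs + lam * (number of unequal neighbor pairs).
# Iterated conditional modes (ICM) greedily updates each pixel to its lowest-energy label.

def icm(labels, unary, n_classes, lam=1.0, iters=10):
    """Regularize a 2-D label grid under a Potts smoothness prior."""
    h, w = len(labels), len(labels[0])
    labels = [row[:] for row in labels]
    for _ in range(iters):
        changed = False
        for i in range(h):
            for j in range(w):
                best, best_e = labels[i][j], float("inf")
                for c in range(n_classes):
                    e = unary[i][j][c]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and labels[ni][nj] != c:
                            e += lam  # smoothness penalty for disagreeing neighbors
                    if e < best_e:
                        best, best_e = c, e
                if best != labels[i][j]:
                    labels[i][j], changed = best, True
        if not changed:
            break
    return labels

# Toy 3x3 map: class 0 preferred everywhere except a noisy center pixel.
unary = [[[0.0, 2.0]] * 3 for _ in range(3)]
unary[1][1] = [1.5, 0.0]                 # weak preference for class 1 in the center
init = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
smoothed = icm(init, unary, n_classes=2, lam=1.0)
```

With the smoothness weight `lam` high enough relative to the unary preference, the isolated center label is flipped to agree with its neighbors, which is the behavior that removes salt-and-pepper noise from per-pixel classifications.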
Establishing Accurate and Sustainable Geospatial Reference Layers in Developing Countries
NASA Astrophysics Data System (ADS)
Seaman, V. Y.
2017-12-01
Accurate geospatial reference layers (settlement names & locations, administrative boundaries, and population) are not readily available for most developing countries. This critical information gap makes it challenging for governments to efficiently plan, allocate resources, and provide basic services. It also hampers international agencies' response to natural disasters, humanitarian crises, and other emergencies. The current work involves a recent successful effort, led by the Bill & Melinda Gates Foundation and the Government of Nigeria, to obtain such data. The data collection began in 2013, with local teams collecting names, coordinates, and administrative attributes for over 100,000 settlements using ODK-enabled smartphones. A settlement feature layer extracted from satellite imagery was used to ensure all settlements were included. Administrative boundaries (Ward, LGA) were created using the settlement attributes. These "new" boundary layers were much more accurate than the existing shapefiles used by the government and international organizations. The resulting data sets helped Nigeria eradicate polio from all areas except the extreme northeast, where security issues limited access and vaccination activities. In addition to the settlement and boundary layers, a GIS-based population model was developed, in partnership with Oak Ridge National Laboratory and Flowminder, that used the extracted settlement areas and characteristics along with targeted microcensus data. This model provides population and demographic estimates independent of census or other administrative data, at a resolution of 90 meters. These robust geospatial data layers found many other uses, including establishing catchment area settlements and populations for health facilities, validating denominators for population-based surveys, and applications across a variety of government sectors.
Based on the success of the Nigeria effort, a partnership between DfID and the Bill & Melinda Gates Foundation was formed in 2017 to help other developing countries collect these geospatial reference layers, and to build capacity within the host governments to manage, use, and sustain them. This work will support, wherever possible, a national geo-referenced census, from which the reference layers can be extracted.
EPA Geospatial Quality Council Strategic and Implementation Plan 2010 to 2015
The EPA Geospatial Quality Council (GQC) was created to promote and provide Quality Assurance guidance for the development, use, and products of geospatial science. The GQC was created when the gap between the EPA Quality Assurance (QA) and Geospatial communities was recognized. ...
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS
This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:
o Guidance for Geospatial Data Quality Assurance Project Plans.
o GPS - Tec...
NASA Astrophysics Data System (ADS)
Thomson, J. A.; Gee, L. J.; George, T.
2002-12-01
This presentation shows results of a visualization method used to display and analyze multiple data types in a geospatially referenced three-dimensional (3-D) space. The integrated data types include sonar and seismic geophysical data, pipeline and geotechnical engineering data, and 3-D facilities models. Visualizing these data collectively in proper 3-D orientation yields insights and synergistic understandings not previously obtainable. Key technological components of the method are: 1) high-resolution geophysical data obtained using a newly developed autonomous underwater vehicle (AUV), 2) 3-D visualization software that delivers correctly positioned display of multiple data types and full 3-D flight navigation within the data space, and 3) a highly immersive visualization environment (HIVE) where multidisciplinary teams can work collaboratively to develop enhanced understandings of geospatially complex data relationships. The initial study focused on an active deepwater development area in the Green Canyon protraction area, Gulf of Mexico, where several planned production facilities required detailed, integrated data analysis for design and installation purposes. To meet the challenges of tight budgets and short timelines, an innovative method was developed by combining these newly developed technologies. Key benefits of the method include enhanced understanding of geologically complex seabed topography and marine soils, yielding safer and more efficient pipeline and facilities siting. Environmental benefits include rapid and precise identification of potential locations of protected deepwater biological communities for avoidance and protection during exploration and production operations. In addition, the method allows data presentation and the transfer of lessons learned to an audience outside the scientific and engineering team, including regulatory personnel, marine archaeologists, industry partners, and others.
Rivers in the Anthropocene: Mapping Human Water Security
NASA Astrophysics Data System (ADS)
Vorosmarty, C. J.; Green, P.
2014-12-01
Fresh water underpins countless benefits to society and is pivotal to the success of the food and energy sectors, industry and commerce, and the expanding urban domain. It provides essential cultural, recreational, and aesthetic values and also plays a critical role in the maintenance of ecosystem services and biodiversity. Recent analyses of water systems across the planet, summarized using high resolution, geospatial indicator maps of rivers, demonstrate that a wide array of stressors combine to produce a pattern of worldwide threat to much of the freshwater resource base that sustains human water supply and aquatic biodiversity. A pervasive, globally significant pattern of management is evident in the contemporary setting, through which impairment accumulates as a function of wealth but is then remedied by costly, after-the-fact technological investments. This strategy of treating symptoms while leaving the underlying causes unabated is practiced widely across rich countries, but it strands poor nations and much of the world's aquatic lifeforms at high levels of vulnerability. The seeds of such an approach to water management are hardly new and are evident throughout human history. This talk will explore the implications of these global realities and will focus on the role of 21st-century engineering in both contributing to the growing water crisis and stimulating innovation for more effective stewardship of our water resource systems. It will also present a first global synthesis of the geography of freshwater provisioning source areas, jointly evaluating the quantity and condition of freshwater produced from these areas and the downstream populations served by these resources. A geospatial indicator is derived, the freshwater provisioning index for humans (FPIh), which constitutes an objective measure of the state of the resource base and its role in supporting human water security.
Towards a geospatial wikipedia
NASA Astrophysics Data System (ADS)
Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.
2009-04-01
Based on the Google Earth (http://earth.google.com) platform, we have developed a geospatial Wikipedia (geo-wiki.org). The tool allows anyone in the world to contribute to spatial validation and is made available to the internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application, we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever-increasing amount of high resolution imagery available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to certain hotspots of land cover uncertainty and then ask them to fill in a small popup menu recording the type of land cover, optionally a picture at that location facing the different cardinal points, as well as the date and the type of validation chosen (Google Earth imagery/Panoramio, or the person's own ground truth data). We have implemented the tool via a land cover validation community on Facebook, based on a snowball system that allows the tracking of individuals and the possibility of ignoring users who misuse the system. In a second application, we illustrate how the tool could be used for mapping malaria occurrence and small water bodies, as well as overall malaria risk. For this application we have implemented polygon and attribute functions using Google Maps along with Virtual Earth via OpenLayers. The third application deals with illegal logging and how an alert system for illegal logging detection within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.
NASA Astrophysics Data System (ADS)
McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.
2005-05-01
The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet, with a thick client, a thin client, and a PDA client. As the GIDB Portal System has grown rapidly over the last two years (adding hundreds of geospatial sources), an obvious requirement has arisen to more effectively mine the interconnected sources in near real time. How the GIDB Search addresses this issue is the prime focus of this paper.
NASA Astrophysics Data System (ADS)
Stuhlmacher, M.; Wang, C.; Georgescu, M.; Tellman, B.; Balling, R.; Clinton, N. E.; Collins, L.; Goldblatt, R.; Hanson, G.
2016-12-01
Global representations of modern-day urban land use and land cover (LULC) extent are becoming increasingly prevalent. Yet considerable uncertainties remain in representations of built-environment extent (e.g., global classifications generated from 250 m resolution MODIS imagery, or the United States' National Land Cover Database) because of the lack of a systematic, globally consistent methodological approach. We aim to increase resolution and accuracy, and improve upon past efforts, by establishing a data-driven definition of the urban landscape based on Landsat 5, 7, and 8 imagery and ancillary data sets. Continuous and discrete machine learning classification algorithms have been developed in Google Earth Engine (GEE), a powerful online cloud-based geospatial storage and parallel-computing platform. Additionally, thousands of ground truth points have been selected from high resolution imagery to fill the previous lack of accurate training and validation data. We will present preliminary classification and accuracy assessments for select cities in the United States and Mexico. Our approach has direct implications for projections of urban growth grounded in realistic identification of urbanizing hot-spots, with consequences for local- to regional-scale climate change, energy demand, water stress, human health, urban-ecological interactions, and efforts to prioritize adaptation and mitigation strategies to offset large-scale climate change. Future work to apply the built-up detection algorithm globally and yearly is underway in a partnership between GEE, the University of California, San Diego, and Arizona State University.
NASA Astrophysics Data System (ADS)
Zheng, Xianwei; Xiong, Hanjiang; Gong, Jianya; Yue, Linwei
2017-07-01
Virtual globes play an important role in representing three-dimensional models of the Earth. To extend the functioning of a virtual globe beyond that of a "geobrowser", the accuracy of the geospatial data in the processing and representation should be of special concern for the scientific analysis and evaluation. In this study, we propose a method for the processing of large-scale terrain data for virtual globe visualization and analysis. The proposed method aims to construct a morphologically preserved multi-resolution triangulated irregular network (TIN) pyramid for virtual globes to accurately represent the landscape surface and simultaneously satisfy the demands of applications at different scales. By introducing cartographic principles, the TIN model in each layer is controlled with a data quality standard to formulize its level of detail generation. A point-additive algorithm is used to iteratively construct the multi-resolution TIN pyramid. The extracted landscape features are also incorporated to constrain the TIN structure, thus preserving the basic morphological shapes of the terrain surface at different levels. During the iterative construction process, the TIN in each layer is seamlessly partitioned based on a virtual node structure, and tiled with a global quadtree structure. Finally, an adaptive tessellation approach is adopted to eliminate terrain cracks in the real-time out-of-core spherical terrain rendering. The experiments undertaken in this study confirmed that the proposed method performs well in multi-resolution terrain representation, and produces high-quality underlying data that satisfy the demands of scientific analysis and evaluation.
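The point-additive construction can be illustrated on a 1-D elevation profile instead of a full TIN. This simplified sketch (toy data; the paper's algorithm operates on triangulations with feature constraints) starts from the endpoints and repeatedly inserts the sample with the largest vertical error until the error meets the layer's quality threshold:

```python
# Point-additive simplification on a 1-D elevation profile: the analog of
# building one TIN pyramid layer to a given vertical-error tolerance.

def interp(xs, zs, x):
    """Linear interpolation on the current simplified profile."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return zs[i] + t * (zs[i + 1] - zs[i])
    raise ValueError("x outside profile")

def point_additive(samples, max_error):
    """Greedily add the worst-fitting sample until max vertical error <= threshold."""
    samples = sorted(samples)
    kept = [samples[0], samples[-1]]
    while True:
        xs = [p[0] for p in kept]
        zs = [p[1] for p in kept]
        worst, worst_err = None, max_error
        for x, z in samples:
            err = abs(z - interp(xs, zs, x))
            if err > worst_err:
                worst, worst_err = (x, z), err
        if worst is None:
            return kept
        kept = sorted(kept + [worst])

profile = [(0, 0.0), (1, 5.0), (2, 1.0), (3, 1.5), (4, 2.0)]
coarse = point_additive(profile, max_error=2.0)  # keeps the sharp peak
```

Running the same procedure with a sequence of decreasing error thresholds yields nested levels of detail, which is the essence of the multi-resolution pyramid.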
The National Geospatial Technical Operations Center
Craun, Kari J.; Constance, Eric W.; Donnelly, Jay; Newell, Mark R.
2009-01-01
The United States Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) provides geospatial technical expertise in support of the National Geospatial Program in its development of The National Map, National Atlas of the United States, and implementation of key components of the National Spatial Data Infrastructure (NSDI).
The Geospatial Web and Local Geographical Education
ERIC Educational Resources Information Center
Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.
2010-01-01
Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…
NASA Technical Reports Server (NTRS)
Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.
2013-01-01
Remote sensing methods used to generate base maps for analyzing the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within each pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size, the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process.
The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification steps. Within this chapter, each of the four approaches is described in terms of its scale and accuracy in classifying urban land use and land cover, and in terms of its range of urban applications. We present an overview of the four main classification groups in Figure 1, while Table 1 details the approaches with respect to classification requirements and procedures (e.g., reflectance conversion, steps before training sample selection, training samples, spatial approaches commonly used, classifiers, primary inputs for classification, output structures, number of output layers, and accuracy assessment). The chapter concludes with a brief summary of the methods reviewed and the challenges that remain in developing new classification methods to improve the efficiency and accuracy of mapping urban areas.
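As a minimal illustration of the per-pixel family of methods described above, the sketch below computes NDVI from red and near-infrared reflectance and thresholds it into coarse land-cover classes. The threshold values are illustrative, not drawn from the chapter:

```python
# Per-pixel classification: each pixel is labeled from its own spectra alone,
# here via the Normalized Difference Vegetation Index (NDVI).

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red) for one pixel's reflectances."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

def classify_pixel(nir, red):
    """Toy per-pixel rule with illustrative thresholds."""
    v = ndvi(nir, red)
    if v > 0.4:
        return "vegetation"
    if v < 0.0:
        return "water"
    return "built-up/bare"

pixels = [(0.6, 0.1), (0.05, 0.2), (0.3, 0.25)]  # (NIR, Red) reflectance pairs
labels = [classify_pixel(n, r) for n, r in pixels]
```

Sub-pixel, object-based, and geospatial methods all go beyond this by modeling surface mixtures within a pixel or by bringing neighboring pixels into the decision.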
NASA Astrophysics Data System (ADS)
Alidoost, F.; Arefi, H.
2017-11-01
Nowadays, Unmanned Aerial System (UAS)-based photogrammetry offers an affordable, fast, and effective approach to real-time acquisition of high resolution geospatial information and automatic 3D modelling of objects for numerous applications, such as topographic mapping, 3D city modelling, orthophoto generation, and cultural heritage preservation. In this paper, the capability of four different state-of-the-art software packages (3DSurvey, Agisoft PhotoScan, Pix4Dmapper Pro, and SURE) to generate a high density point cloud as well as a Digital Surface Model (DSM) over a historical site is examined. The main steps of this study are image acquisition, point cloud generation, and accuracy assessment. The overlapping images are first captured using a quadcopter and are then processed by the different software packages to generate point clouds and DSMs. In order to evaluate the accuracy and quality of the point clouds and DSMs, both visual and geometric assessments are carried out, and the comparison results are reported.
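The geometric assessment step typically reduces to comparing DSM heights sampled at surveyed check points against their ground-truth elevations. A minimal sketch with invented numbers (the paper's exact metrics may differ):

```python
import math

def accuracy_stats(dsm_heights, check_heights):
    """Return (mean error, RMSE) of DSM heights vs. surveyed check points."""
    diffs = [d - c for d, c in zip(dsm_heights, check_heights)]
    mean_err = sum(diffs) / len(diffs)                      # signed bias
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))  # overall error
    return mean_err, rmse

# Hypothetical DSM heights and surveyed check-point elevations (metres).
dsm = [101.2, 98.7, 105.0, 99.9]
gcp = [101.0, 99.0, 104.8, 100.0]
mean_err, rmse = accuracy_stats(dsm, gcp)
```

The mean error reveals a systematic vertical offset, while RMSE summarizes total deviation; comparing both across the four packages is one way to rank their DSM quality.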
Mapping and Visualization of Storm-Surge Dynamics for Hurricane Katrina and Hurricane Rita
Gesch, Dean B.
2009-01-01
The damages caused by the storm surges from Hurricane Katrina and Hurricane Rita were significant and occurred over broad areas. Storm-surge maps are among the most useful geospatial datasets for hurricane recovery, impact assessments, and mitigation planning for future storms. Surveyed high-water marks were used to generate a maximum storm-surge surface for Hurricane Katrina extending from eastern Louisiana to Mobile Bay, Alabama. The interpolated surface was intersected with high-resolution lidar elevation data covering the study area to produce a highly detailed digital storm-surge inundation map. The storm-surge dataset and related data are available for display and query in a Web-based viewer application. A unique water-level dataset from a network of portable pressure sensors deployed in the days just prior to Hurricane Rita's landfall captured the hurricane's storm surge. The recorded sensor data provided water-level measurements with a very high temporal resolution at surveyed point locations. The resulting dataset was used to generate a time series of storm-surge surfaces that documents the surge dynamics in a new, spatially explicit way. The temporal information contained in the multiple storm-surge surfaces can be visualized in a number of ways to portray how the surge interacted with and was affected by land surface features. Spatially explicit storm-surge products can be useful for a variety of hurricane impact assessments, especially studies of wetland and land changes where knowledge of the extent and magnitude of storm-surge flooding is critical.
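The inundation-mapping workflow can be sketched with toy data: interpolate the surveyed high-water marks into a surge surface (inverse-distance weighting here, as a simple stand-in for the interpolation actually used) and flag cells where the surge exceeds the land elevation:

```python
# High-water marks -> interpolated surge surface -> intersect with elevation grid.

def idw(x, y, marks, power=2.0):
    """Inverse-distance-weighted surge height at (x, y) from high-water marks."""
    num = den = 0.0
    for mx, my, mz in marks:
        d2 = (x - mx) ** 2 + (y - my) ** 2
        if d2 == 0:
            return mz
        w = 1.0 / d2 ** (power / 2)
        num += w * mz
        den += w
    return num / den

def inundation(dem, marks):
    """True where the interpolated surge surface lies above the land surface."""
    return [[idw(x, y, marks) > dem[y][x] for x in range(len(dem[0]))]
            for y in range(len(dem))]

marks = [(0, 0, 4.0), (2, 2, 2.0)]   # high-water marks: (x, y, surge height m)
dem = [[1.0, 1.0, 5.0],              # toy elevation grid (m)
       [1.0, 3.0, 5.0],
       [1.0, 3.0, 5.0]]
flooded = inundation(dem, marks)
```

In the study itself, the same intersection was performed with lidar elevation data at far finer resolution, which is what makes the resulting inundation map highly detailed.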
Infrastructure for the Geospatial Web
NASA Astrophysics Data System (ADS)
Lake, Ron; Farley, Jim
Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
ERIC Educational Resources Information Center
Hogrebe, Mark C.; Tate, William F., IV
2012-01-01
In this chapter, "geospatial" refers to geographic space that includes location, distance, and the relative position of things on the earth's surface. Geospatial perspective calls for the addition of a geographic lens that focuses on place and space as important contextual variables. A geospatial view increases one's understanding of…
Geospatial Data Curation at the University of Idaho
ERIC Educational Resources Information Center
Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.
2012-01-01
The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
Grid computing enhances standards-compatible geospatial catalogue service
NASA Astrophysics Data System (ADS)
Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang
2010-04-01
A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example, data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for the effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By reference to the catalogue service profile for the Web, an innovative information model for a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of, and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards: the International Organization for Standardization (ISO) 19115/19119 standards, and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services.
This service makes it easy to share and interoperate geospatial resources by using Grid technology and extends Grid technology into the geoscience communities.
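The catalogue interactions described above follow the OGC CSW pattern. A minimal sketch of composing a CSW 2.0.2 GetRecords request as a key-value-pair URL (the endpoint and search keyword are hypothetical placeholders; the parameter names follow the CSW 2.0.2 HTTP/KVP binding):

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, keyword):
    """Compose a KVP GetRecords request for an OGC CSW 2.0.2 catalogue.
    `endpoint` and `keyword` are placeholders, not the GMU/CSISS service."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "constraintLanguage": "CQL_TEXT",
        # full-text constraint over the record's searchable fields
        "constraint": f"AnyText LIKE '%{keyword}%'",
    }
    return endpoint + "?" + urlencode(params)

url = csw_getrecords_url("https://example.org/csw", "landcover")
```

A client would issue this URL over HTTP and parse the returned `csw:Record` metadata entries.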
SDGs and Geospatial Frameworks: Data Integration in the United States
NASA Astrophysics Data System (ADS)
Trainor, T.
2016-12-01
Responding to the need to monitor a nation's progress towards meeting the Sustainable Development Goals (SDG) outlined in the 2030 U.N. Agenda requires the integration of earth observations with statistical information. The urban agenda proposed in SDG 11 challenges the global community to find a geospatial approach to monitor and measure inclusive, safe, resilient, and sustainable cities and communities. Target 11.7 identifies public safety, accessibility to green and public spaces, and the most vulnerable populations (i.e., women and children, older persons, and persons with disabilities) as the most important priorities of this goal. A challenge for both national statistical organizations and earth observation agencies in addressing SDG 11 is the requirement for detailed statistics at a sufficient spatial resolution to provide the basis for meaningful analysis of the urban population and city environments. Using an example for the city of Pittsburgh, this presentation proposes data and methods to illustrate how earth science and statistical data can be integrated to respond to Target 11.7. Finally, a preliminary series of data initiatives are proposed for extending this method to other global cities.
Using Globe Browsing Systems in Planetariums to Take Audiences to Other Worlds.
NASA Astrophysics Data System (ADS)
Emmart, C. B.
2014-12-01
For the last decade planetariums have been adding the capability of "full dome video" systems for both movie playback and interactive display. True scientific data visualization has now come to planetarium audiences as a means to display the actual three-dimensional layout of the universe, the time-based array of planets, minor bodies, and spacecraft across the solar system, and now globe browsing systems to examine planetary bodies to the limits of acquired resolution. Additionally, such planetarium facilities can be networked for simultaneous display across the world, widening audiences and extending their reach to authoritative description and commentary by scientists. Data repositories such as NASA's Lunar Mapping and Modeling Project (LMMP), NASA GSFC's LANCE-MODIS, and others conforming to the Open Geospatial Consortium (OGC) Web Map Service (WMS) protocol make geospatial data available to a growing number of dome-supporting globe visualization systems. The immersive surround graphics of full dome video engage our visual system, creating authentic virtual scenes that effectively place audiences on location, in some cases on other worlds mapped only robotically.
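The globe-browsing systems mentioned above pull imagery tiles through WMS GetMap requests. A sketch of building such a request (the endpoint and layer name are invented for illustration; the parameter names follow the WMS 1.3.0 specification):

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=1024, height=512):
    """Build a WMS 1.3.0 GetMap request URL.  Endpoint and layer are
    placeholders.  With CRS:84 the BBOX order is
    lon_min,lat_min,lon_max,lat_max."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/jpeg",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "MODIS_TrueColor",
                     (-180, -90, 180, 90))
```

A dome visualization client would fetch this URL repeatedly with progressively smaller bounding boxes as the user zooms in.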
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods, from remote sensing observations to field surveys to model simulations, and is stored in various formats. Geospatial data is therefore diverse and heterogeneous, which poses a major barrier to its sharing and use, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while the OGC Web Coverage Service (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access alone are not enough. The geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams.
The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) that provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard, non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
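One of the practices mentioned above is naming files so they are both human-friendly and semantically interoperable. A hypothetical convention is sketched below; the component order and separators are my illustrative assumptions, not the ORNL DAAC's actual scheme:

```python
def standard_filename(project, variable, start, end, version, ext="nc"):
    """Compose a self-describing data filename from its semantic parts,
    e.g. project, variable, temporal coverage, and version.  The field
    order and separators are illustrative assumptions."""
    for part in (project, variable):
        if "_" in part or " " in part:
            # the separator must not appear inside a component,
            # otherwise the name can no longer be parsed unambiguously
            raise ValueError("name components must not contain '_' or spaces")
    return f"{project}_{variable}_{start}-{end}_v{version}.{ext}".lower()

name = standard_filename("MASTDC", "GPP", "20000101", "20001231", 1)
# name == "mastdc_gpp_20000101-20001231_v1.nc"
```

Because each field occupies a fixed position, both a person and a script can recover the project, variable, time range, and version from the name alone.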
A Framework for an Open Source Geospatial Certification Model
NASA Astrophysics Data System (ADS)
Khan, T. U. R.; Davis, P.; Behr, F.-J.
2016-06-01
The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, the proliferation of open data, and the use of open standards have gained increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways.
An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide. Fifteen interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with in total eleven sub-categories, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be complemented by evidence of professional career paths and achievements, which requires peer qualification evaluation. After a couple of years, recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.
Big Geo Data Management: An Exploration with Social Media and Telecommunications Open Data
NASA Astrophysics Data System (ADS)
Arias Munoz, C.; Brovelli, M. A.; Corti, S.; Zamboni, G.
2016-06-01
The term Big Data has recently been used to define big, highly varied, complex data sets, which are created and updated at high speed and require faster processing, namely a reduced time to filter and analyse relevant data. These data are also increasingly becoming Open Data (data that can be freely distributed), made public by governments, agencies, private enterprises, and others. There are at least two issues that can obstruct the availability and use of Open Big Datasets. Firstly, the gathering and geoprocessing of these datasets are very computationally intensive; hence, it is necessary to integrate high-performance, preferably Internet-based, solutions to achieve the goals. Secondly, the problems of heterogeneity and inconsistency in geospatial data are well known and affect the data integration process, but they are particularly problematic for Big Geo Data. Therefore, Big Geo Data integration will be one of the most challenging issues to solve. With these applications, we demonstrate that it is possible to provide processed Big Geo Data to common users, using open geospatial standards and technologies. NoSQL databases like MongoDB and frameworks like RASDAMAN offer functionalities that facilitate working with larger volumes and more heterogeneous geospatial data sources.
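The MongoDB functionality alluded to above includes queries over GeoJSON fields backed by a 2dsphere index. A sketch of constructing such a filter document (the collection field name and coordinates are hypothetical; `$geoWithin` and `$geometry` are standard MongoDB query operators):

```python
def geo_within_query(field, lon_min, lat_min, lon_max, lat_max):
    """Build a MongoDB $geoWithin filter selecting documents whose GeoJSON
    `field` falls inside the given bounding box.  GeoJSON polygons require
    a closed ring (first vertex repeated last)."""
    ring = [
        [lon_min, lat_min], [lon_max, lat_min],
        [lon_max, lat_max], [lon_min, lat_max],
        [lon_min, lat_min],
    ]
    return {field: {"$geoWithin": {
        "$geometry": {"type": "Polygon", "coordinates": [ring]}}}}

# hypothetical field name; with pymongo this dict would be passed
# to collection.find(q) against a 2dsphere-indexed field
q = geo_within_query("location", 9.0, 45.3, 9.4, 45.6)
```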
Geospatial Data Stream Processing in Python Using FOSS4G Components
NASA Astrophysics Data System (ADS)
McFerren, G.; van Zyl, T.
2016-06-01
One viewpoint of current and future IT systems holds that there is an increase in the scale and velocity at which data are acquired and analysed from heterogeneous, dynamic sources. In the earth observation and geoinformatics domains, this process is driven by the increase in the number and types of devices that report location and the proliferation of assorted sensors, from satellite constellations to oceanic buoy arrays. Many of these data will be encountered as self-contained messages on data streams - continuous, infinite flows of data. Spatial analytics over data streams concerns the search for spatial and spatio-temporal relationships within and amongst data "on the move". In spatial databases, queries can assess a store of data to unpack spatial relationships; this is not the case on streams, where spatial relationships need to be established with the incomplete data available. Methods for spatially-based indexing, filtering, joining and transforming of streaming data need to be established and implemented in software components. This article describes the usage patterns and performance metrics of a number of well-known FOSS4G Python software libraries within the data stream processing paradigm. In particular, we consider the RTree library for spatial indexing, the Shapely library for geometric processing and transformation, and the PyProj library for projection and geodesic calculations over streams of geospatial data. We introduce a message-oriented, Python-based geospatial data streaming framework called Swordfish, which provides data stream processing primitives, functions, transports and a common data model for describing messages, based on the Open Geospatial Consortium Observations and Measurements (O&M) and Unidata Common Data Model (CDM) standards. We illustrate how the geospatial software components are integrated with the Swordfish framework.
Furthermore, we describe the tight temporal constraints under which geospatial functionality can be invoked when processing high velocity, potentially infinite geospatial data streams. The article discusses the performance of these libraries under simulated streaming loads (size, complexity and volume of messages) and how they can be deployed and utilised with Swordfish under real load scenarios, illustrated by a set of Vessel Automatic Identification System (AIS) use cases. We conclude that the described software libraries are able to perform adequately under geospatial data stream processing scenarios - many real application use cases will be handled sufficiently by the software.
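The spatial filtering of in-flight messages described above can be illustrated without the Swordfish/RTree stack itself. A pure-Python sketch of lazily filtering a stream of position reports against a bounding box (the message fields are hypothetical AIS-like attributes, and a real deployment would replace the linear test with an R-tree index):

```python
def bbox_filter(messages, lon_min, lat_min, lon_max, lat_max):
    """Lazily yield only the messages whose position falls inside the box.
    `messages` is any iterable of dicts carrying 'lon' and 'lat' keys,
    so the filter also works on an unbounded generator."""
    for msg in messages:
        if lon_min <= msg["lon"] <= lon_max and lat_min <= msg["lat"] <= lat_max:
            yield msg

# toy stream of two AIS-like reports (field names are assumptions)
stream = [
    {"mmsi": 1, "lon": 18.4, "lat": -33.9},   # inside the box
    {"mmsi": 2, "lon": 31.0, "lat": -29.9},   # outside the box
]
hits = list(bbox_filter(stream, 18.0, -34.5, 19.0, -33.5))
```

Because the filter is a generator, it consumes one message at a time and never needs the complete (potentially infinite) stream, which is the defining constraint of the streaming setting.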
Stevens, Forrest R; Gaughan, Andrea E; Linard, Catherine; Tatem, Andrew J
2015-01-01
High resolution, contemporary data on human population distributions are vital for measuring the impacts of population growth, monitoring human-environment interactions, and for planning and policy development. Many methods are used to disaggregate census data and predict population densities for finer scale, gridded population data sets. We present a new semi-automated dasymetric modeling approach that incorporates detailed census and ancillary data in a flexible "Random Forest" estimation technique. We outline the combination of widely available, remotely sensed and geospatial data that contribute to the modeled dasymetric weights, and then use the Random Forest model to generate a gridded prediction of population density at ~100 m spatial resolution. This prediction layer is then used as the weighting surface to perform dasymetric redistribution of the census counts at the country level. As a case study we compare the new algorithm and its products for three countries (Vietnam, Cambodia, and Kenya) with other common gridded population data production methodologies. We discuss the advantages of the new method and its gains in accuracy and flexibility over those previous approaches. Finally, we outline how this algorithm will be extended to provide freely available gridded population data sets for Africa, Asia and Latin America.
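The final redistribution step, spreading an areal census count over grid cells in proportion to the modeled weights, can be sketched independently of the Random Forest itself (a minimal illustration of the principle, not the authors' code):

```python
def dasymetric_redistribute(census_count, weights):
    """Allocate an areal census total to grid cells proportionally to a
    modeled weighting surface.  The cell values sum back to the census
    total (up to floating point), preserving counts."""
    total = sum(weights)
    if total == 0:
        raise ValueError("weighting surface sums to zero")
    return [census_count * w / total for w in weights]

# three cells with modeled weights 0, 1 and 3: the zero-weight cell
# (e.g. water) receives no population
cells = dasymetric_redistribute(1000, [0.0, 1.0, 3.0])
# cells == [0.0, 250.0, 750.0]
```

In the paper's pipeline the weights come from the ~100 m Random Forest density prediction and the totals from census units; here both are toy values.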
NASA Astrophysics Data System (ADS)
Wang, Rong; Andrews, Elisabeth; Balkanski, Yves; Boucher, Olivier; Myhre, Gunnar; Samset, Bjørn Hallvard; Schulz, Michael; Schuster, Gregory L.; Valari, Myrto; Tao, Shu
2018-02-01
There is high uncertainty in the direct radiative forcing of black carbon (BC), an aerosol that strongly absorbs solar radiation. The observation-constrained estimate, which is several times larger than the bottom-up estimate, is influenced by the spatial representativeness error due to the mesoscale inhomogeneity of the aerosol fields and the relatively low resolution of global chemistry-transport models. Here we evaluated the spatial representativeness error for two widely used observational networks (AErosol RObotic NETwork and Global Atmosphere Watch) by downscaling the geospatial grid in a global model of BC aerosol absorption optical depth to 0.1° × 0.1°. Comparing the models at a spatial resolution of 2° × 2° with BC aerosol absorption at AErosol RObotic NETwork sites (which are commonly located near emission hot spots) tends to cause a global spatial representativeness error of 30%, as a positive bias for the current top-down estimate of global BC direct radiative forcing. By contrast, the global spatial representativeness error will be 7% for the Global Atmosphere Watch network, because the sites are located in such a way that there are almost an equal number of sites with positive or negative representativeness error.
Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.
2013-12-01
Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bound or other criteria.
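A sketch of the kind of SPARQL query such a workflow would issue to discover datasets by geospatial bound. The `geo:` prefix and property names here are hypothetical placeholders, not the authors' actual vocabulary:

```python
def bbox_sparql(lat_min, lat_max, lon_min, lon_max):
    """Compose a SPARQL SELECT retrieving datasets whose point
    coordinates fall inside a geospatial bound.  The prefix and
    property names are illustrative assumptions."""
    return f"""
PREFIX geo: <http://example.org/geo#>
SELECT ?dataset ?lat ?lon WHERE {{
  ?dataset geo:lat ?lat ; geo:lon ?lon .
  FILTER (?lat >= {lat_min} && ?lat <= {lat_max} &&
          ?lon >= {lon_min} && ?lon <= {lon_max})
}}"""

query = bbox_sparql(30, 40, -90, -80)
```

Against a populated semantic repository, this query would be submitted through a SPARQL endpoint (for example with a library such as rdflib) rather than assembled as a string by hand.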
The Updating of Geospatial Base Data
NASA Astrophysics Data System (ADS)
Alrajhi, Muhamad N.; Konecny, Gottfried
2018-04-01
Topographic mapping issues concern area coverage at different scales and the age of that coverage. The age of a map is determined by the system of updating. The United Nations (UNGGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large scale mapping is carried out for all urban, suburban, and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to the rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.
NASA Astrophysics Data System (ADS)
Li, W.
2017-12-01
Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem: providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly to building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
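One small piece of a crawler like the one described, recognizing likely OGC service endpoints among harvested URLs, can be sketched as a pattern check (the classification rule and URLs are illustrative assumptions, not PolarHub's actual logic):

```python
import re

# OGC web services advertise themselves via a `service=` query parameter
OGC_SERVICE = re.compile(r"service=(wms|wfs|wcs|csw)", re.IGNORECASE)

def classify_urls(urls):
    """Group crawled URLs by the OGC service type found in their query
    string, dropping URLs that match no known service pattern."""
    hits = {}
    for url in urls:
        m = OGC_SERVICE.search(url)
        if m:
            hits.setdefault(m.group(1).upper(), []).append(url)
    return hits

found = classify_urls([
    "https://example.org/geo?service=WMS&request=GetCapabilities",
    "https://example.org/page.html",
])
```

A production crawler would of course confirm a candidate by fetching and validating its GetCapabilities response rather than trusting the URL pattern alone.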
A geospatial search engine for discovering multi-format geospatial data across the web
Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney
2014-01-01
The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...
ERIC Educational Resources Information Center
Hedley, Mikell Lynne; Templin, Mark A.; Czaljkowski, Kevin; Czerniak, Charlene
2013-01-01
Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
Economic assessment of the use value of geospatial information
Bernknopf, Richard L.; Shapiro, Carl D.
2015-01-01
Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
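The VOI definition above, the difference in expected net benefits with and without the information, can be made concrete with a toy two-state decision. The probabilities and payoffs below are invented purely for illustration:

```python
def expected_value(probs, payoffs):
    """Probability-weighted average of payoffs."""
    return sum(p * v for p, v in zip(probs, payoffs))

def value_of_information(probs, payoff_matrix):
    """VOI of perfect information: E[best act chosen per revealed state]
    minus max over acts of E[payoff].  `payoff_matrix[a][s]` is the net
    benefit of action a in state s."""
    # without information: commit to the single best action in expectation
    without = max(expected_value(probs, row) for row in payoff_matrix)
    # with perfect information: pick the best action after each state is known
    with_info = sum(p * max(row[s] for row in payoff_matrix)
                    for s, p in enumerate(probs))
    return with_info - without

# two states (e.g. earthquake / no earthquake, p = 0.25 / 0.75) and
# two actions (mitigate at fixed cost 20, or do nothing and risk 100)
voi = value_of_information([0.25, 0.75], [[-20, -20], [-100, 0]])
# voi == 15.0: the information is worth up to 15 units here
```

This mirrors the abstract's definition: the decision maker would rationally pay up to the VOI for the geospatial information.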
EPA National Geospatial Data Policy
The National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA.
Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories
NASA Astrophysics Data System (ADS)
Scharl, Arno
International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.
Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert
1998-01-01
The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.
Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data
NASA Technical Reports Server (NTRS)
Baxes, Gregory; Mixon, Brian; Linger, TIm
2013-01-01
Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent, dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset.
The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and in the associated network bandwidth requirements. The innovation uses a C- or PHP-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
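The quadtree step described above, four child regions per region on zoom-in, can be sketched as plain geometry (a minimal illustration of the subdivision, not the actual server code):

```python
def subdivide(bbox):
    """Split a (lon_min, lat_min, lon_max, lat_max) region into its four
    quadtree children, each covering one quarter of the parent at the
    next level of detail."""
    lon_min, lat_min, lon_max, lat_max = bbox
    lon_mid = (lon_min + lon_max) / 2
    lat_mid = (lat_min + lat_max) / 2
    return [
        (lon_min, lat_min, lon_mid, lat_mid),  # SW child
        (lon_mid, lat_min, lon_max, lat_mid),  # SE child
        (lon_min, lat_mid, lon_mid, lat_max),  # NW child
        (lon_mid, lat_mid, lon_max, lat_max),  # NE child
    ]

children = subdivide((-180.0, -90.0, 180.0, 90.0))
```

In the cascading-KML scheme, the dynamically generated KML would reference exactly these four child regions, each with a `<Region>`/LOD gate so the client requests a child's imagery only when the user zooms into it.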
Bartelt, Paul E.; Gallant, Alisa L.; Klaver, Robert W.; Wright, Christopher K.; Patla, Debra A.; Peterson, Charles R.
2011-01-01
The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from (1) field survey site data only, (2) field data combined with data from geospatial models, and (3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 to 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. 
For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders, 971-3017 ha for western toads, 4732-16 696 ha for boreal chorus frogs, and 4940-19 690 ha for Columbia spotted frogs.
Bartelt, Paul E.; Gallant, Alisa L.; Klaver, Robert W.; Wright, C.K.; Patla, Debra A.; Peterson, Charles R.
2011-01-01
The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from (1) field survey site data only, (2) field data combined with data from geospatial models, and (3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 to 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. 
For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders, 971-3017 ha for western toads, 4732-16 696 ha for boreal chorus frogs, and 4940-19 690 ha for Columbia spotted frogs. © 2011 by the Ecological Society of America.
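The habitat models above were fit with polytomous regression and compared by ROC scores. As an illustrative sketch only, a binary logistic stand-in on synthetic data shows how an ROC value like those reported (0.70-0.88) is computed; the predictors, sample size, and data here are invented and are not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical site predictors: elevation, topographic index, wetland probability
X = rng.normal(size=(200, 3))
# Hypothetical binary breeding outcome loosely tied to two of the predictors
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)
probs = model.predict_proba(X)[:, 1]   # predicted breeding probability per site
auc = roc_auc_score(y, probs)          # the kind of ROC score the paper reports
print(round(auc, 2))
```

An AUC near 0.5 indicates no discrimination; values in the paper's 0.70-0.88 range indicate useful habitat discrimination.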
EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications, but these represent only a fraction of the total.
Use of NASA Satellite Data in Aiding Mississippi Barrier Island Restoration Projects
NASA Technical Reports Server (NTRS)
Giardino, Marco; Spruce, Joseph; Kalcic, Maria; Fletcher, Rose
2009-01-01
This presentation discusses a NASA Stennis Space Center project in which NASA-supported satellite and aerial data is being used to aid state and federal agencies in restoring the Mississippi barrier islands. Led by the Applied Science and Technology Project Office (ASTPO), this project will produce geospatial information products from multiple NASA-supported data sources, including Landsat, ASTER, and MODIS satellite data as well as ATLAS multispectral, CAMS multispectral, AVIRIS hyperspectral, EAARL, and other aerial data. Project objectives include the development and testing of a regional sediment transport model and the monitoring of barrier island restoration efforts through remote sensing. Barrier islands provide invaluable benefits to the State of Mississippi, including buffering the mainland from storm surge impacts, providing habitat for valuable wildlife and fisheries, offering accessible recreational opportunities, and preserving natural environments for educating the public about coastal ecosystems and cultural resources. Unfortunately, these highly valued natural areas are prone to damage from hurricanes. For example, Hurricane Camille in 1969 split Ship Island into East and West Ship Island. Hurricane Georges in 1998 caused additional land loss for the two Ship Islands. More recently, Hurricanes Ivan, Katrina, Rita, Gustav, and Ike impacted the Mississippi barrier islands. In particular, Hurricane Katrina caused major damage to island vegetation and landforms, killing island forest overstories, overwashing entire islands, and causing widespread erosion. In response, multiple state and federal agencies are working to restore damaged components of these barrier islands. Much of this work is being implemented through federally funded Coastal Impact Assessment and Mississippi Coastal Improvement programs. One restoration component involves the reestablishment of the island footprints to their 1969 configuration.
Our project will employ NASA remote sensing data and products to support these federally funded efforts on multiple fronts. Landsat and ASTER data is being analyzed to assess changes in barrier island land cover over the last 35 years. ASTER, SRTM, and EAARL terrain products and other NASA airborne imagery are being applied in assessing changes in barrier island geomorphology and geospatial extent. MODIS data is being examined as a tool for sediment transport modeling by supplying geospatial data that quantifies in-water sediment concentrations. MODIS satellite data is being assessed for monitoring changes in the spatial extent of individual barrier islands. Results thus far indicate that NASA data products are useful in assessing barrier island conditions and changes. This value is enhanced with additional historical geospatial data, commercial high resolution satellite data, other non-NASA aerial imagery, and field survey data. The project's products are relevant to the Gulf of Mexico Alliance priority issues, including coastal habitat conservation and restoration, and coastal community resilience. Such products will be available to state and federal agencies involved with coastal restoration. Potential end-users of these products include the National Park Service, U.S. Geological Survey, U.S. Army Corps of Engineers, Environmental Protection Agency, Mississippi Department of Environmental Quality, and Mississippi Department of Marine Resources.
Geospatial Data Science Analysis | Geospatial Data Science | NREL
The National 3-D Geospatial Information Web-Based Service of Korea
NASA Astrophysics Data System (ADS)
Lee, D. T.; Kim, C. W.; Kang, I. G.
2013-09-01
3D geospatial information systems should provide efficient spatial analysis tools, exploit the full capabilities of the third dimension, and support visualization. Many human activities are moving toward the third dimension, including land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has begun constructing a 3D geospatial data and service platform. Since geospatial information was introduced in Korea, its construction (3D geospatial information, digital maps, aerial photographs, orthophotographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led web-based 3D geospatial information service to those interested in this industry, covering not only the current state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km²) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, six metropolitan cities and Dokdo (an island belonging to Korea) were modeled in 2012 at level of detail (LOD) 4, i.e., photo-realistic textured 3D models with corresponding orthophotographs. In this paper, we present the composition and infrastructure of the web-based 3D map service system and compare the V-World service with Google Earth. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. To prevent invasions of privacy, we applied image blurring, elimination, and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. 
Thus, continued progress in Korea's spatial information industry is expected in the near future.
Hou, Bin; Wang, Yunhong; Liu, Qingjie
2016-01-01
Characterizing up-to-date information on the Earth's surface is an important application, providing insights into urban planning, resource monitoring, and environmental studies. A large number of change detection (CD) methods have been developed for this task using remote sensing (RS) images. The advent of high resolution (HR) remote sensing images poses challenges to traditional CD methods and creates opportunities for object-based CD methods. While several kinds of geospatial objects are recognized, this manuscript mainly focuses on buildings. Specifically, we propose a novel automatic approach combining pixel-based strategies with object-based ones for detecting building changes with HR remote sensing images. A multiresolution contextual morphological transformation called extended morphological attribute profiles (EMAPs) allows the extraction of geometrical features related to the structures within the scene at different scales. Pixel-based post-classification is executed on EMAPs using hierarchical fuzzy clustering. Subsequently, hierarchical fuzzy frequency vector histograms are formed based on the image-objects acquired by simple linear iterative clustering (SLIC) segmentation. Then, saliency and the morphological building index (MBI) extracted on difference images are used to generate a pseudo training set. Ultimately, object-based semi-supervised classification is implemented on this training set by applying random forest (RF). Most of the important changes are detected by the proposed method in our experiments. The effectiveness of the approach was verified by both visual and numerical evaluation. PMID:27618903
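The object-based step of the pipeline above, aggregating per-pixel change evidence over segments before classifying each segment, can be sketched with a toy example. The real method uses SLIC superpixels and a random forest; this sketch substitutes a fixed grid of segments and a simple mean-score threshold, both assumptions for illustration only.

```python
import numpy as np

# Per-pixel "change evidence" for an 8x8 toy scene; one quadrant simulates
# a changed building by carrying systematically higher scores.
rng = np.random.default_rng(1)
scores = rng.random((8, 8))          # background scores in (0, 1)
scores[:4, :4] += 1.0                # simulated changed-building block

# A stand-in segmentation: four quadrant "objects" (SLIC in the real method)
segments = np.zeros((8, 8), dtype=int)
segments[:4, 4:] = 1
segments[4:, :4] = 2
segments[4:, 4:] = 3

# Label each object whose mean evidence clears a hypothetical threshold
changed = []
for seg_id in np.unique(segments):
    if scores[segments == seg_id].mean() > 1.0:
        changed.append(int(seg_id))
print(changed)  # -> [0], the segment covering the simulated change
```

Aggregating to objects before deciding is what suppresses the salt-and-pepper false alarms that pixel-wise CD produces on HR imagery.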
NASA Astrophysics Data System (ADS)
Stepinski, T. F.; Mitasova, H.; Jasiewicz, J.; Neteler, M.; Gebbert, S.
2014-12-01
GRASS GIS is a leading open source GIS for geospatial analysis and modeling. In addition to being utilized as a desktop GIS, it also serves as a processing engine for high performance geospatial computing for applications in diverse disciplines. The newly released GRASS GIS 7 supports big data analysis, including a temporal framework, image segmentation, watershed analysis, synchronized 2D/3D animations, and many others. This presentation will focus on new GRASS GIS 7-powered tools for geoprocessing giga-size earth observation (EO) data using spatial pattern analysis. Pattern-based analysis connects to human visual perception of space and makes geoprocessing of giga-size EO data possible in an efficient and robust manner. GeoPAT is a collection of GRASS GIS 7 modules that fully integrates procedures for pattern representation of EO data and pattern similarity calculations with standard GIS tasks of mapping, map overlay, segmentation, classification (Fig. 1a), change detection, etc. GeoPAT works very well on a desktop, but it also underpins several GeoWeb applications (http://sil.uc.edu/) which allow users to analyze selected EO datasets without the need to download them. The GRASS GIS 7 temporal framework and high resolution visualizations will be illustrated using time series of giga-size, lidar-based digital elevation models representing the dynamics of North Carolina barrier islands over the past 15 years. The temporal framework supports efficient raster and vector data series analysis and simplifies data input for visual analysis of dynamic landscapes (Fig. 1b), allowing users to rapidly identify vulnerable locations, changes in the built environment, and eroding coastlines. Numerous improvements in GRASS GIS 7 were implemented to support terabyte-size data processing for reconstruction of MODIS land surface temperature (LST) at 250 m resolution using multiple regressions and PCA (Fig. 1c).
The new MODIS LST series (http://gis.cri.fmach.it/eurolst/), which includes four maps per day since 2000, provides improved data for epidemiological prediction, viticulture, assessment of urban heat islands, and numerous other applications. The presentation will conclude with an outline of future development of big data interfaces to further enhance web-based GRASS GIS data analysis.
Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...
The geospatial data quality REST API for primary biodiversity data
Otegui, Javier; Guralnick, Robert P.
2016-01-01
Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues, as well as general errors, in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under a GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340
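The kinds of completeness and consistency checks such a quality service performs can be sketched as a small validator. The Darwin Core-style field names (`decimalLatitude`, `decimalLongitude`) and the specific rules below are assumptions for illustration, not the API's actual logic.

```python
def geospatial_issues(record):
    """Flag basic completeness/consistency problems in one occurrence record.

    Illustrative only: the checks are a hypothetical subset of what a
    geospatial data quality service might apply.
    """
    issues = []
    lat = record.get("decimalLatitude")
    lon = record.get("decimalLongitude")
    if lat is None or lon is None:
        issues.append("missing coordinates")   # completeness issue
        return issues
    if not -90 <= lat <= 90:
        issues.append("latitude out of range")  # consistency issue
    if not -180 <= lon <= 180:
        issues.append("longitude out of range")
    if lat == 0 and lon == 0:
        issues.append("zero-zero coordinates")  # common placeholder error
    return issues

print(geospatial_issues({"decimalLatitude": 95.0, "decimalLongitude": 10.0}))
# -> ['latitude out of range']
```

Because each record is checked independently, a real service can parallelize these assessments across large datasets, as the abstract notes.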
System for real-time generation of georeferenced terrain models
NASA Astrophysics Data System (ADS)
Schultz, Howard J.; Hanson, Allen R.; Riseman, Edward M.; Stolle, Frank; Zhu, Zhigang; Hayward, Christopher D.; Slaymaker, Dana
2001-02-01
A growing number of law enforcement applications, especially in the areas of border security, drug enforcement and anti-terrorism, require high-resolution wide area surveillance from unmanned air vehicles. At the University of Massachusetts we are developing an aerial reconnaissance system capable of generating high resolution, geographically registered terrain models (in the form of a seamless mosaic) in real-time from a single down-looking digital video camera. The efficiency of the processing algorithms, as well as the simplicity of the hardware, will provide the user with the ability to produce and roam through stereoscopic geo-referenced mosaic images in real-time, and to automatically generate highly accurate 3D terrain models offline in a fraction of the time currently required by conventional softcopy photogrammetry systems. The system is organized around a set of integrated sensor and software components. The instrumentation package is comprised of several inexpensive commercial-off-the-shelf components, including a digital video camera, a differential GPS, and a 3-axis heading and reference system. At the heart of the system is a set of software tools for image registration, mosaic generation, geo-location and aircraft state vector recovery. Each process is designed to efficiently handle the data collected by the instrument package. Particular attention is given to minimizing geospatial errors at each stage, as well as modeling propagation of errors through the system. Preliminary results for an urban and forested scene are discussed in detail.
NASA Astrophysics Data System (ADS)
Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara
2014-08-01
A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gain in energy content knowledge were significantly explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum or the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.
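The explanatory-variable analysis above (regressing posttest achievement on initial GTR ability and content-knowledge gain) can be illustrated with a toy multiple regression on synthetic data. The coefficients, sample size, and noise level below are invented and bear no relation to the study's actual estimates.

```python
import numpy as np

# Synthetic stand-ins: GTR pretest scores and energy-content knowledge gains
rng = np.random.default_rng(2)
n = 300
pretest = rng.normal(50, 10, n)
gain = rng.normal(5, 2, n)
# Posttest generated from hypothetical "true" coefficients plus noise
posttest = 10 + 0.7 * pretest + 1.5 * gain + rng.normal(0, 3, n)

# Ordinary least squares via the normal equations (intercept, pretest, gain)
X = np.column_stack([np.ones(n), pretest, gain])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
print(np.round(beta[1:], 1))  # recovered coefficients, near [0.7, 1.5]
```

A hierarchical linear model, as used in the study, would additionally nest students within teachers; this sketch shows only the single-level regression step.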
Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L
2010-04-01
In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.
A Decade of Annual National Land Cover Products - the Cropland Data Layer
NASA Astrophysics Data System (ADS)
Mueller, R.; Johnson, D. M.; Sandborn, A.; Willis, P.; Ebinger, L.; Yang, Z.; Seffrin, R.; Boryan, C. G.; Hardin, R.
2017-12-01
The Cropland Data Layer (CDL) is a national land cover product produced by the US Department of Agriculture/National Agricultural Statistics Service (NASS) to assess planted crop acreage on an annual basis. The 2017 CDL product marks the decadal anniversary for the mapping of conterminous US agriculture. The CDL is a supervised land cover classification derived from medium resolution Earth observing satellites that capture crop phenology throughout the growing season, leveraging confidentially held ground reference information from the USDA Farm Service Agency (FSA) as training data. The CDL currently uses ancillary geospatial data from the US Geological Survey's National Land Cover Database (NLCD), its Imperviousness and Forest Canopy layers, and the National Elevation Dataset as training for the non-agricultural domain. Accuracy assessments are documented and released annually with metadata publication. NASS is currently reprocessing the 2008 and 2009 CDL products to 30 m resolution. They were originally processed and released at 56 m based on the Resourcesat-1 AWiFS sensor. Additionally, best practices learned from processing the FSA ground reference data were applied to the historical training set, providing an enhanced classification at 30 m. The release of these reprocessed products in the fall of 2017, along with the 2017 CDL annual product, will be discussed and will complete a decade's worth of annual 30 m products. Change and trend analytics, as well as partnerships with key industry stakeholders, will also be discussed to show the evolution and improvements made to this decadal, crop-specific geospatial land cover product.
EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.
U.S. Geological Survey Geospatial Data To Support STEM Education And Communication
NASA Astrophysics Data System (ADS)
Molnia, B. F.
2017-12-01
The U.S. Geological Survey (USGS) has a long history of contributing to STEM education, outreach, and communication. The USGS EarthExplorer website (https://earthexplorer.usgs.gov) is the USGS gateway to more than 150 geospatial data sets that are freely available to STEM students, educators, and researchers. Two in particular, Global Fiducials data and Declassified Satellite Imagery, provide the highest resolution visual record of the Earth's surface that is available for unlimited, unrestricted download. Global Fiducials Data - Since the mid-1990s, more than 500 locations, each termed a 'Fiducial Site', have been systematically and repeatedly imaged with U.S. National Imagery Systems space-based sensors. Each location was selected for long-term monitoring, based on its history and environmental values. Since 2008, imagery from about a quarter of the sites has been publicly released and is available on EarthExplorer. These 5,000 electro-optical (EO) images, with 1.0 - 1.3 m resolution, comprise more than 140 time-series. Individual time-series focus on wildland fire recovery, Arctic sea ice change, Antarctic habitats, temperate glacier behavior, eroding barrier islands, coastline evolution, resource and ecosystem management, natural disaster response, global change studies, and other topics. Declassified Satellite Imagery - Nearly 1 million declassified photographs, collected between 1960 and 1984 by U.S. intelligence satellites KH-1 through KH-9, have been released to the public. The USGS has copies of most of the released film and provides a digital finding aid that can be accessed from the USGS EarthExplorer website. Individual frames were collected at resolutions that range from 0.61 m - 7.6 m. Imagery exists for locations on all continents. Combined with Landsat imagery, also available from the USGS EarthExplorer website, the STEM Community has access to more than 7.5 million images providing nearly 50 years of visual observations of Earth's dynamic surface.
Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from Ogc
NASA Astrophysics Data System (ADS)
Percivall, George; Simonis, Ingo
2016-06-01
The necessity of open standards for effective sharing and use of remote sensing continues to receive increasing emphasis in policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data / solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research, and industrial organizations, continues to advance the state of geospatial standards for the full use of photogrammetry and remote sensing.
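As a concrete example of the WCS suite mentioned above, a WCS 2.0 GetCoverage request in key-value-pair form can be assembled as follows. The service endpoint and coverage identifier are placeholders, not a real service; the parameter names (`service`, `version`, `request`, `coverageId`, `subset`) follow the WCS 2.0 KVP convention.

```python
from urllib.parse import urlencode

def getcoverage_url(endpoint, coverage_id, subsets):
    """Build a WCS 2.0 KVP GetCoverage URL for a coverage and axis subsets."""
    params = [("service", "WCS"), ("version", "2.0.1"),
              ("request", "GetCoverage"), ("coverageId", coverage_id)]
    # Each subset trims one axis of the coverage, e.g. subset=Lat(35.0,36.0)
    params += [("subset", f"{axis}({lo},{hi})") for axis, (lo, hi) in subsets.items()]
    return endpoint + "?" + urlencode(params)

url = getcoverage_url("https://example.org/wcs", "ndvi_2016",
                      {"Lat": (35.0, 36.0), "Long": (-90.0, -89.0)})
print(url)
```

Standardizing the request syntax this way is what lets one client retrieve coverages from any compliant server, which is the interoperability goal the abstract describes.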
Scale criticality in estimating ecosystem carbon dynamics
Zhao, Shuqing; Liu, Shuguang
2014-01-01
Scaling is central to ecology and Earth system sciences. However, the importance of scale (i.e., resolution and extent) for understanding carbon dynamics across scales is poorly understood and quantified. We simulated carbon dynamics under a wide range of combinations of resolution (nine spatial resolutions of 250 m, 500 m, 1 km, 2 km, 5 km, 10 km, 20 km, 50 km, and 100 km) and extent (57 geospatial extents ranging from 108 to 1,247,034 km²) in the southeastern United States to explore the existence of scale dependence of the simulated regional carbon balance. Results clearly show the existence of a critical threshold resolution for estimating carbon sequestration within a given extent and an error limit. Furthermore, an invariant power law scaling relationship was found between the critical resolution and the spatial extent: the critical resolution is proportional to A^n (where n is a constant and A is the extent). Scale criticality and the power law relationship might be driven by the power law probability distributions of land surface and ecological quantities, including disturbances, at landscape to regional scales. The current overwhelming practice of ignoring scale criticality might have largely contributed to difficulties in balancing carbon budgets at regional and global scales.
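A power law of the reported form (critical resolution proportional to A^n) is a straight line in log-log space, so its exponent can be recovered from extent/critical-resolution pairs by ordinary least squares on the logs. The extents, prefactor, and exponent below are synthetic, chosen only to demonstrate the fitting step.

```python
import numpy as np

# Synthetic (extent, critical resolution) pairs obeying r_c = c * A**n
true_n = 0.25                               # hypothetical exponent
A = np.array([1e2, 1e3, 1e4, 1e5, 1e6])     # extents (km^2)
r_c = 3.0 * A ** true_n                     # critical resolutions

# log r_c = log c + n log A, so the slope of the log-log fit is n
slope, intercept = np.polyfit(np.log(A), np.log(r_c), 1)
print(round(slope, 2))  # -> 0.25, the recovered exponent
```

With real simulation output the points would scatter around the line, and the fitted slope (with its confidence interval) would test the invariance claim across extents.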
GeoSearch: A lightweight broking middleware for geospatial resources discovery
NASA Astrophysics Data System (ADS)
Gui, Z.; Yang, C.; Liu, K.; Xia, J.
2012-12-01
With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from these massive and heterogeneous resources. The past decades' developments witnessed the operation of many service components to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) The entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats, and GUI styles to organize, present, and publish metadata. It is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues. However, integrating semantic technologies with existing services is challenging due to expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore, and analyze search results.
Furthermore, the presentation of value-added additional information (such as service quality and user feedback), which conveys important decision-supporting information, is missing. To address these issues, we prototyped a distributed search engine, GeoSearch, based on a brokering middleware framework to search, integrate, and visualize heterogeneous geospatial resources. Specifically: 1) A lightweight discovery broker is developed to conduct distributed search. The broker retrieves metadata records for geospatial resources and additional information from dispersed services (portals and catalogues) and other systems on the fly. 2) A quality monitoring and evaluation broker (i.e., QoS Checker) is developed and integrated to provide quality information for geospatial web services. 3) Semantic-assisted search and relevance evaluation functions are implemented by loosely interoperating with an ESIP Testbed component. 4) Sophisticated information and data visualization functionalities and tools are assembled to improve user experience and assist resource selection.
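The broker's core federation step, normalizing records from catalogues that use different metadata schemas and de-duplicating them on the fly, might look like this in miniature. The schema names, field names, and record payloads are invented placeholders, not GeoSearch's actual interfaces.

```python
def normalize(record, schema):
    """Map a catalogue-specific record into one common metadata shape."""
    if schema == "csw":          # hypothetical CSW-style fields
        return {"id": record["Identifier"], "title": record["Title"]}
    if schema == "opensearch":   # hypothetical OpenSearch-style fields
        return {"id": record["id"], "title": record["name"]}
    raise ValueError(f"unknown schema: {schema}")

def broker_search(responses):
    """Merge per-catalogue responses, dropping duplicates by identifier."""
    seen, merged = set(), []
    for schema, records in responses:
        for rec in records:
            norm = normalize(rec, schema)
            if norm["id"] not in seen:
                seen.add(norm["id"])
                merged.append(norm)
    return merged

results = broker_search([
    ("csw", [{"Identifier": "a1", "Title": "Landsat scene"}]),
    ("opensearch", [{"id": "a1", "name": "Landsat scene"},
                    {"id": "b2", "name": "MODIS LST"}]),
])
print([r["id"] for r in results])  # -> ['a1', 'b2']
```

Doing this normalization at query time, rather than harvesting catalogues periodically, is what avoids the storage burden and stale-copy consistency problems the abstract lists.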
Geospatial Analysis of Near-Term Technical Potential of BECCS in the U.S.
NASA Astrophysics Data System (ADS)
Baik, E.; Sanchez, D.; Turner, P. A.; Mach, K. J.; Field, C. B.; Benson, S. M.
2017-12-01
Atmospheric carbon dioxide (CO2) removal using bioenergy with carbon capture and storage (BECCS) is crucial for achieving stringent climate change mitigation targets. To date, previous work discussing the feasibility of BECCS has largely focused on land availability and bioenergy potential, while CCS components - including capacity, injectivity, and location of potential storage sites - have not been thoroughly considered in the context of BECCS. A high-resolution geospatial analysis of both biomass production and potential geologic storage sites is conducted to consider the near-term deployment potential of BECCS in the U.S. The analysis quantifies the overlap between the biomass resource and CO2 storage locations within the context of storage capacity and injectivity. This analysis leverages county-level biomass production data from the U.S. Department of Energy's Billion Ton Report alongside potential CO2 geologic storage sites as provided by the USGS Assessment of Geologic Carbon Dioxide Storage Resources. Various types of lignocellulosic biomass (agricultural residues, dedicated energy crops, and woody biomass) result in a potential 370-400 Mt CO2/yr of negative emissions in 2020. However, only 30-31% of that potential (110-120 Mt CO2/yr) comes from biomass co-located with a potential storage site. While large potential exists, more than 250 50-MW biomass power plants fitted with CCS would be needed to capture all the co-located CO2 capacity in 2020. Neither absolute injectivity nor absolute storage capacity is likely to limit BECCS, but the results show regional capacity and injectivity constraints in the U.S. that had not been identified in previous BECCS analyses. The state of Illinois, the Gulf region, and western North Dakota emerge as the best locations for near-term deployment of BECCS, with abundant biomass, sufficient storage capacity and injectivity, and co-location of the two resources.
Future studies assessing BECCS potential should employ higher-resolution spatial datasets to identify near-term deployment opportunities, explicitly including the availability of co-located storage, regional capacity limitations, and integration of electricity produced with BECCS into local electricity grids.
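The co-location tally at the core of this analysis reduces to intersecting county-level biomass CO2 totals with the set of counties overlying a potential storage site. A minimal sketch with invented county names and values (the study's actual datasets and spatial intersection are far richer):

```python
# Hypothetical sketch of the co-location calculation described above:
# county-level negative-emissions potential is summed, and the share coming
# from counties that overlie a potential storage site is reported.
biomass_co2_mt = {      # county -> potential negative emissions (Mt CO2/yr)
    "county_A": 5.0,
    "county_B": 2.5,
    "county_C": 1.5,
}
counties_with_storage = {"county_A", "county_C"}  # assumed overlap set

total = sum(biomass_co2_mt.values())
co_located = sum(v for c, v in biomass_co2_mt.items() if c in counties_with_storage)
fraction = co_located / total

print(f"total: {total} Mt/yr, co-located: {co_located} Mt/yr ({fraction:.0%})")
```

In the paper's results the same ratio comes out to roughly 30% of the 370-400 Mt CO2/yr potential.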
GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing
NASA Astrophysics Data System (ADS)
Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.
2016-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis, and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis, and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as open source and through Amazon AWS Marketplace VM images. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits, and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.
Assessing Embedded Geospatial Student Learning Outcomes
ERIC Educational Resources Information Center
Carr, John David
2012-01-01
Geospatial tools and technologies have become core competencies for natural resource professionals due to the monitoring, modeling, and mapping capabilities they provide. To prepare students with needed background, geospatial instructional activities were integrated across Forest Management; Natural Resources; Fisheries, Wildlife, &…
USDA-ARS?s Scientific Manuscript database
Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e.g., sampling of atmospheric constituents). The term geospatial data r...
Golding, Nick; Burstein, Roy; Longbottom, Joshua; Browne, Annie J; Fullman, Nancy; Osgood-Zimmerman, Aaron; Earl, Lucas; Bhatt, Samir; Cameron, Ewan; Casey, Daniel C; Dwyer-Lindgren, Laura; Farag, Tamer H; Flaxman, Abraham D; Fraser, Maya S; Gething, Peter W; Gibson, Harry S; Graetz, Nicholas; Krause, L Kendall; Kulikoff, Xie Rachel; Lim, Stephen S; Mappin, Bonnie; Morozoff, Chloe; Reiner, Robert C; Sligar, Amber; Smith, David L; Wang, Haidong; Weiss, Daniel J; Murray, Christopher J L; Moyes, Catherine L; Hay, Simon I
2017-11-11
During the Millennium Development Goal (MDG) era, many countries in Africa achieved marked reductions in under-5 and neonatal mortality. Yet the pace of progress toward these goals substantially varied at the national level, demonstrating an essential need for tracking even more local trends in child mortality. With the adoption of the Sustainable Development Goals (SDGs) in 2015, which established ambitious targets for improving child survival by 2030, optimal intervention planning and targeting will require understanding of trends and rates of progress at a higher spatial resolution. In this study, we aimed to generate high-resolution estimates of under-5 and neonatal all-cause mortality across 46 countries in Africa. We assembled 235 geographically resolved household survey and census data sources on child deaths to produce estimates of under-5 and neonatal mortality at a resolution of 5 × 5 km grid cells across 46 African countries for 2000, 2005, 2010, and 2015. We used a Bayesian geostatistical analytical framework to generate these estimates, and implemented predictive validity tests. In addition to reporting 5 × 5 km estimates, we also aggregated results obtained from these estimates into three different levels-national, and subnational administrative levels 1 and 2-to provide the full range of geospatial resolution that local, national, and global decision makers might require. Amid improving child survival in Africa, there was substantial heterogeneity in absolute levels of under-5 and neonatal mortality in 2015, as well as the annualised rates of decline achieved from 2000 to 2015. Subnational areas in countries such as Botswana, Rwanda, and Ethiopia recorded some of the largest decreases in child mortality rates since 2000, positioning them well to achieve SDG targets by 2030 or earlier. 
Yet these places were the exception for Africa, since many areas, particularly in central and western Africa, must reduce under-5 mortality rates by at least 8·8% per year between 2015 and 2030 to achieve the SDG 3.2 target for under-5 mortality by 2030. In the absence of unprecedented political commitment, financial support, and medical advances, the viability of SDG 3.2 achievement in Africa is precarious at best. By producing under-5 and neonatal mortality rates at multiple levels of geospatial resolution over time, this study provides key information for decision makers to target interventions at populations in the greatest need. In an era when precision public health increasingly has the potential to transform the design, implementation, and impact of health programmes, our 5 × 5 km estimates of child mortality in Africa provide a baseline against which local, national, and global stakeholders can map the pathways for ending preventable child deaths by 2030. Bill & Melinda Gates Foundation. Copyright © 2017 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY 4.0 license.
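The multi-level reporting described above (5 × 5 km grid cells rolled up to admin-1, admin-2, and national levels) amounts to a birth-weighted aggregation of cell-level rates. A minimal sketch with invented numbers, not the study's Bayesian geostatistical machinery:

```python
# Hypothetical sketch: aggregate grid-cell under-5 mortality rates to an
# administrative unit as a live-birth-weighted mean, as in the multi-level
# reporting described above. All values are illustrative.
cells = [
    # (under-5 deaths per 1000 live births, live births in cell)
    (80.0, 1200),
    (45.0,  800),
    (60.0, 2000),
]

births = sum(b for _, b in cells)
admin_rate = sum(r * b for r, b in cells) / births

print(f"admin-level under-5 mortality: {admin_rate:.1f} per 1000 live births")
```

The same weighted roll-up applied at successively coarser unit boundaries yields the admin-1, admin-2, and national series the abstract mentions.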
2014-09-01
Approved for public release; distribution is unlimited. Prepared for and monitored by the Geospatial Research Laboratory, U.S. Army Engineer Research and Development Center (ERDC), U.S. Army Corps of Engineers, 7701 Telegraph Road, Alexandria, VA 22135, under the Data Level Enterprise Tools program.
An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight
NASA Astrophysics Data System (ADS)
Petters, J.; Coleman, S.; Andrea, O.
2016-12-01
A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate, and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort, the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.
78 FR 69393 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-19
.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...
77 FR 5820 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY... that the Secretary of the Interior has renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations to the Federal Geographic Data Committee (FGDC), through...
THE NEVADA GEOSPATIAL DATA BROWSER
The Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) has developed the Nevada Geospatial Data Browser, a spatial data archive to centralize and distribute the geospatial data used to create the land cover, vertebrate habitat models, and land o...
Information Fusion for Feature Extraction and the Development of Geospatial Information
2004-07-01
...of automated processing. 2. Requirements for Geospatial Information: accurate, timely geospatial information is critical for many military... this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of...
Geospatial Information Best Practices
2012-01-01
26 Spring - 2012. By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint. The fact that geospatial information can be codified and... Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial
NASA Astrophysics Data System (ADS)
Hardin, Eric Jon
Coastal landscapes can be relentlessly dynamic---owing to wave energy, tidal cycles, extreme weather events, and perpetual coastal winds. In these settings, the ever-changing landscape can threaten assets and infrastructure, necessitating costly measures to mitigate associated risks and to repair or maintain the changing landscape. Mapping and monitoring of terrain change, identification of areas susceptible to dramatic change, and understanding the processes that drive landscape change are critical for the development of responsible coastal management strategies and policies. Over the past two decades, LiDAR mapping has been conducted along the U.S. east coast (including the Outer Banks, North Carolina) on a near annual basis---generating a rich time series of topographic data with unprecedented accuracy, resolution, and extent. This time series has captured the response of the landscape to episodic storms, daily forcing of wind and waves, and anthropogenic activities. This work presents raster-based geospatial techniques developed to gain new insights into coastal geomorphology from the time series of available LiDAR. Per-cell statistical techniques derive information that is typically not obtained through the techniques traditionally employed by coastal scientists and engineers. Application of these techniques to study sites along the Outer Banks, NC, revealed substantial spatial and temporal variations in terrain change. Additionally, they identify the foredunes as being the most geomorphologically dynamic coastal features. In addition to per-cell statistical analysis, an approach is presented for the extraction of the dune ridge and dune toe (two features that are essential to standard vulnerability assessment). The approach employs a novel application of least cost path analysis and a physics-based model of an elastic sheet. 
The spatially distributed nature of the approach achieves a high level of automation and repeatability that semi-automated methods and manual digitization lack. Furthermore, the approach can be fully implemented with standard Geographic Information System (GIS) functionality, resulting in efficiency and ease of implementation. With this approach, a raster-based implementation of the U.S. Geological Survey (USGS) storm impact scale (designed to assess storm vulnerability of barrier islands) was developed. Vulnerability of 4km of the Outer Banks to Hurricane Isabel (2003) was assessed. The demonstrated approach produced vulnerability mapping at the high resolution of the input Digital Elevation Model (DEM)---providing results at the scale needed for local management, in contrast to the USGS approach, which is designed for continental scale vulnerability assessment. However, geospatial techniques cannot fully explain the observed geomorphology. Therefore, we present the Smoothed Particle Hydrodynamics (SPH) implementation of the Sauermann model for wind-driven sand transport. The SPH implementation enables the full nonlinearity of the model to be applied to complex scenarios that are typical of coastal landscapes. Through application of the SPH model and Computational Fluid Dynamics (CFD) modeling of the windborne surface shear stress (which drives sand transport), we present the sediment flux at two study sites along the Outer Banks. Scenarios were tested that involved steady-state surface shear stress as well as scenarios with intermittent variations in the surface shear stress. Results showed that intermittency in the surface shear stress has the potential to greatly influence the resulting flux. However, the degree to which intermittency does alter the flux is highly dependent on wind characteristics and wind direction relative to the orientation of salient topographic features.
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOSPATIAL SOLUTIONS
In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...
Searches over graphs representing geospatial-temporal remote sensing data
Brost, Randolph; Perkins, David Nikolaus
2018-03-06
Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
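A toy illustration of the representation described (objects as nodes, undirected edges for spatial relationships, directed edges for change over time), using plain dictionaries rather than any particular graph library; the names and structure are illustrative, not the patented system's actual implementation:

```python
# Toy sketch of a geospatial-temporal graph: objects in each image become
# nodes, undirected edges carry spatial relationships (distance/adjacency),
# and directed edges link an object to its counterpart at the next time step.
nodes = {
    "building_t0": {"type": "building", "time": 0},
    "road_t0":     {"type": "road",     "time": 0},
    "building_t1": {"type": "building", "time": 1},
}
spatial_edges = {frozenset({"building_t0", "road_t0"}): {"relation": "adjacent"}}
temporal_edges = {("building_t0", "building_t1"): {"change": "none"}}

def neighbors(node):
    """Spatial neighbors of a node (undirected edges)."""
    return [next(iter(e - {node})) for e in spatial_edges if node in e]

print(neighbors("building_t0"))  # ['road_t0']
```

A search for "a building that appears adjacent to a road and persists to the next image" then becomes a pattern match over this node/edge structure, which is the kind of query the graph search techniques above accelerate.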
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing, and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster, and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter Notebook), and machine learning tools (e.g., TensorFlow, Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools applicable to other domains with spatial properties.
We tested the performance of the platform on a taxi-trajectory analysis. Results suggested that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
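Distributed spatial processing of this kind typically begins by partitioning records under a spatial key so that each worker handles one region. A framework-free sketch of that grid keying, with assumed 1-degree cells and invented points (GISpark's actual partitioning scheme may differ):

```python
import math
from collections import defaultdict

# Minimal sketch of spatial partitioning for distributed processing: points
# are keyed to 1-degree grid cells, so each partition (cell) could be handled
# by a separate Spark worker. Points are hypothetical taxi pings.
def cell_key(lon, lat, size=1.0):
    return (math.floor(lon / size), math.floor(lat / size))

points = [(116.3, 39.9), (116.7, 39.2), (121.5, 31.2)]
partitions = defaultdict(list)
for lon, lat in points:
    partitions[cell_key(lon, lat)].append((lon, lat))

print({k: len(v) for k, v in partitions.items()})
```

In a Spark setting the same key function would feed a `partitionBy`/`groupByKey` step; the grid key keeps spatially nearby records on the same worker, which is what makes per-region analyses like the taxi-trajectory test parallelize well.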
McLaughlin, C.J.; Smith, C.A.; Buddemeier, R.W.; Bartley, J.D.; Maxwell, B.A.
2003-01-01
The role of terrigenous sediment in controlling the occurrence of coral reef ecosystems is qualitatively understood and has been studied at local scales, but has not been systematically evaluated on a global-to-regional scale. Current concerns about degradation of reef environments and alteration of the hydrologic and sediment cycles place the issue at a focal point of multiple environmental concerns. We use a geospatial clustering of a coastal zone database of river and local runoff identified with 0.5° grid cells to identify areas of high potential runoff effects, and combine this with a database of reported coral reef locations. Coastal cells with high runoff values are much less likely to contain reefs than low-runoff cells, and GIS buffer analysis demonstrates that this inhibition extends to offshore ocean cells as well. This analysis does not uniquely define the effects of sediment, since salinity, nutrients, and contaminants are potentially confounding variables also associated with runoff. However, sediment effects are likely to be a major factor, and a basis is provided for extending the study to higher resolution with more specific variables. © 2003 Elsevier B.V. All rights reserved.
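The core comparison described (reef occurrence in high-runoff versus low-runoff coastal cells) can be sketched as a simple co-occurrence test over the 0.5° cells. Data and the runoff threshold below are invented for illustration:

```python
# Illustrative sketch of the co-occurrence test described above: 0.5-degree
# coastal cells are split by a runoff threshold, and the fraction of cells
# containing reefs is compared between the two groups. Values are invented.
cells = [
    # (runoff index, has_reef)
    (0.9, False), (0.8, False), (0.7, True),
    (0.2, True),  (0.1, True),  (0.3, False),
]
THRESHOLD = 0.5  # assumed split between "high" and "low" runoff

def reef_fraction(group):
    return sum(reef for _, reef in group) / len(group)

high = [c for c in cells if c[0] >= THRESHOLD]
low = [c for c in cells if c[0] < THRESHOLD]
print(f"reef fraction: high-runoff {reef_fraction(high):.2f}, "
      f"low-runoff {reef_fraction(low):.2f}")
```

A lower reef fraction in the high-runoff group, as in this toy data, is the pattern the study reports; the paper's buffer analysis extends the same comparison to offshore cells.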
NASA Astrophysics Data System (ADS)
Barik, M. G.; Al-Hamdan, M. Z.; Crosson, W. L.; Yang, C. A.; Coffield, S. R.
2017-12-01
Satellite-derived environmental data, available in a range of spatio-temporal scales, are contributing to the growing use of health impact assessments of air pollution in the public health sector. Models developed by correlating Moderate Resolution Imaging Spectroradiometer (MODIS) Aerosol Optical Depth (AOD) with ground measurements of fine particulate matter smaller than 2.5 microns (PM2.5) are widely applied to measure PM2.5 spatial and temporal variability. In the public health sector, associations of PM2.5 with respiratory and cardiovascular diseases are often investigated to quantify air quality impacts on these health concerns. In order to improve the predictability of PM2.5 estimation using correlation models, we have included meteorological variables, higher-resolution AOD products, and instantaneous PM2.5 observations in statistical estimation models. Our results showed that incorporation of high-resolution (1-km) Multi-Angle Implementation of Atmospheric Correction (MAIAC)-generated MODIS AOD, meteorological variables, and instantaneous PM2.5 observations improved model performance in various parts of California (CA), USA, where single-variable AOD-based models showed relatively weak performance. In this study, we further asked whether these improved models would actually be more successful for exploring associations of public health outcomes with estimated PM2.5. To answer this question, we geospatially investigated the relationship of model-estimated PM2.5 with respiratory and cardiovascular diseases such as asthma, high blood pressure, coronary heart disease, heart attack, and stroke in CA, using health data from the Centers for Disease Control and Prevention (CDC)'s Wide-ranging Online Data for Epidemiologic Research (WONDER) and the Behavioral Risk Factor Surveillance System (BRFSS). PM2.5 estimates from these improved models have the potential to improve our understanding of associations between public health concerns and air quality.
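At their simplest, the AOD-based estimation models described reduce to regressing ground PM2.5 on satellite AOD (with meteorological covariates entering the same way). A dependency-free ordinary-least-squares sketch with invented observations, not the study's actual model:

```python
# Minimal sketch of an AOD-based PM2.5 estimation model: ordinary least
# squares of ground PM2.5 on MODIS AOD. The observations are invented;
# the paper's models add meteorological covariates as further regressors.
pairs = [  # (AOD, ground PM2.5 in ug/m3), illustrative values
    (0.10,  8.0), (0.30, 18.0), (0.50, 28.0), (0.20, 13.0),
]
n = len(pairs)
mean_aod = sum(a for a, _ in pairs) / n
mean_pm = sum(p for _, p in pairs) / n
slope = (sum((a - mean_aod) * (p - mean_pm) for a, p in pairs)
         / sum((a - mean_aod) ** 2 for a, _ in pairs))
intercept = mean_pm - slope * mean_aod

print(f"PM2.5 ~= {intercept:.1f} + {slope:.1f} * AOD")
```

The fitted line can then be applied to AOD everywhere the satellite observes, which is how such models produce the spatially continuous PM2.5 surfaces used in the health analyses.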
A comparative analysis of the Global Land Cover 2000 and MODIS land cover data sets
Giri, C.; Zhu, Z.; Reed, B.
2005-01-01
Accurate and up-to-date global land cover data sets are necessary for various global change research studies, including climate change, biodiversity conservation, ecosystem assessment, and environmental modeling. In recent years, substantial advancement has been achieved in generating such data products. Yet we are far from producing geospatially consistent, high-quality data at an operational level. We compared the recently available Global Land Cover 2000 (GLC-2000) and MODerate resolution Imaging Spectroradiometer (MODIS) global land cover data to evaluate the similarities and differences in methodologies and results, and to identify areas of spatial agreement and disagreement. These two global land cover data sets were prepared using different data sources, classification systems, and methodologies, but the same spatial resolution (i.e., 1 km) satellite data. Our analysis shows a general agreement at the class-aggregate level, except for savannas/shrublands and wetlands. The disagreement, however, increases when comparing detailed land cover classes. Similarly, percent agreement between the two data sets was found to be highly variable among biomes. The identified areas of spatial agreement and disagreement will be useful for both data producers and users. Data producers may use the areas of spatial agreement for training-area selection and pay special attention to areas of disagreement for further improvement in future land cover characterization and mapping. Users can conveniently use the findings in the areas of agreement, whereas they might need to verify the information in the areas of disagreement with the help of secondary information.
Learning from past experience and building on the existing infrastructure (e.g., regional networks), further research is necessary to (1) reduce ambiguity in land cover definitions, (2) increase availability of improved spatial, spectral, radiometric, and geometric resolution satellite data, and (3) develop advanced classification algorithms.
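Agreement between two land cover maps of the kind compared here is usually summarized from a pixel-by-pixel cross-tabulation. A small sketch with invented class labels and pixels, showing overall percent agreement (per-class and per-biome figures come from the same table, sliced differently):

```python
from collections import Counter

# Sketch of the per-pixel comparison described above: two land cover maps
# over the same 1-km grid are cross-tabulated, and overall percent agreement
# is computed. Class labels and pixel values are invented.
glc2000 = ["forest", "forest", "shrub", "wetland", "crop", "forest"]
modis   = ["forest", "shrub",  "shrub", "crop",    "crop", "forest"]

cross_tab = Counter(zip(glc2000, modis))        # (glc class, modis class) -> count
agree = sum(n for (a, b), n in cross_tab.items() if a == b)
percent_agreement = 100.0 * agree / len(glc2000)

print(f"overall agreement: {percent_agreement:.1f}%")
```

The off-diagonal entries of `cross_tab` identify which class pairs are being confused, which is exactly the savanna/shrubland and wetland disagreement the abstract highlights.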
NASA Astrophysics Data System (ADS)
Jackson, C.; Sava, E.; Cervone, G.
2017-12-01
Hurricane Harvey has been noted as the wettest cyclone on record for the US, as well as (so far) the most destructive of the 2017 hurricane season. An entire year's worth of rainfall occurred over the course of a few days. The city of Houston was greatly impacted as the storm lingered over the city for five days, causing a record-breaking 50+ inches of rain as well as severe damage from flooding. Flood model simulations were performed to reconstruct the event in order to better understand, assess, and predict flooding dynamics for the future. Additionally, a number of remote sensing platforms and on-the-ground instruments that provide near real-time data have also been used for flood identification, monitoring, and damage assessment. Although both flood models and remote sensing techniques are able to identify inundated areas, rapid and accurate flood prediction at a high spatio-temporal resolution remains a challenge. Thus a methodological approach that fuses the two techniques can help to better validate what is being modeled and observed. Recent advancements in techniques for fusing remote sensing with near real-time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. In this work, the use of multiple sources of contributed data, coupled with remotely sensed and open source geospatial datasets, is demonstrated to generate an understanding of potential damage assessment for the floods after Hurricane Harvey in Harris County, Texas. The feasibility of integrating multiple sources at different temporal and spatial resolutions into hydrodynamic models for flood inundation simulations is assessed. Furthermore, the contributed datasets are compared against a reconstructed flood extent generated with the Flood2D-GPU model.
78 FR 32635 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-31
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to Add a New System of Records. SUMMARY: The National Geospatial-Intelligence Agency is establishing a new system of... information. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency [[Page 32636
78 FR 35606 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System of Records. SUMMARY: The National Geospatial-Intelligence Agency is altering a system of records in.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...
NASA Astrophysics Data System (ADS)
Ibarra, Mercedes; Gherboudj, Imen; Al Rished, Abdulaziz; Ghedira, Hosni
2017-06-01
Given the ambitious plans to increase electricity production from renewable resources and the natural resources of the Kingdom of Saudi Arabia (KSA), solar energy stands out as a technology with great development potential in this country. In this work, the suitability of the territory is assessed through a geospatial analysis, using a parabolic trough collector (PTC) performance model to account for the technical potential. As a result, a land suitability map is presented, in which the north-west of the country is identified as the area with the most highly suitable land.
Open Technology Approaches to Geospatial Interface Design
NASA Astrophysics Data System (ADS)
Crevensten, B.; Simmons, D.; Alaska Satellite Facility
2011-12-01
What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's data portal interface, a project utilizing current open web application standards and technologies including HTML5, jQuery UI, Backbone.js, and the Jasmine unit testing framework.
NASA Astrophysics Data System (ADS)
Kassab, Ala'; Liang, Steve; Gao, Yang
2010-12-01
Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. The nature of this goal requires integrating different, potentially numerous, sources of dynamic geospatial information on one side, and a large number of clients having heterogeneous and specific interests in the data on the other. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, pull-mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations from wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
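The matching step at the core of such a publish/subscribe middleware (deciding which subscriptions an incoming geospatial event satisfies) can be sketched with topic-plus-bounding-box subscriptions. All names, boxes, and the event below are invented; the paper's data models and matching algorithm are richer:

```python
# Toy sketch of geospatial publish/subscribe matching: each subscription
# names a topic and a lon/lat bounding box (west, south, east, north), and
# an incoming event is delivered to every subscription whose topic matches
# and whose box contains the event location.
subscriptions = {
    "unit_7":  {"topic": "fire_incident", "bbox": (-115.0, 50.0, -113.0, 52.0)},
    "unit_9":  {"topic": "fire_incident", "bbox": (-120.0, 48.0, -118.0, 50.0)},
    "weather": {"topic": "sensor_obs",    "bbox": (-116.0, 49.0, -112.0, 53.0)},
}

def match(event):
    lon, lat = event["location"]
    hits = []
    for sub_id, sub in subscriptions.items():
        w, s, e, n = sub["bbox"]
        if sub["topic"] == event["topic"] and w <= lon <= e and s <= lat <= n:
            hits.append(sub_id)
    return hits

event = {"topic": "fire_incident", "location": (-114.1, 51.0)}
print(match(event))  # ['unit_7']
```

Unlike request/reply, the client never polls: the broker evaluates each published event against the stored subscriptions and pushes it only to the interested parties, which is what keeps per-event work bounded as clients scale.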
Andrews, Elisabeth; Balkanski, Yves; Boucher, Olivier; Myhre, Gunnar; Samset, Bjørn Hallvard; Schulz, Michael; Schuster, Gregory L.; Valari, Myrto; Tao, Shu
2018-01-01
Abstract There is high uncertainty in the direct radiative forcing of black carbon (BC), an aerosol that strongly absorbs solar radiation. The observation‐constrained estimate, which is several times larger than the bottom‐up estimate, is influenced by the spatial representativeness error due to the mesoscale inhomogeneity of the aerosol fields and the relatively low resolution of global chemistry‐transport models. Here we evaluated the spatial representativeness error for two widely used observational networks (AErosol RObotic NETwork and Global Atmosphere Watch) by downscaling the geospatial grid in a global model of BC aerosol absorption optical depth to 0.1° × 0.1°. Comparing the models at a spatial resolution of 2° × 2° with BC aerosol absorption at AErosol RObotic NETwork sites (which are commonly located near emission hot spots) tends to cause a global spatial representativeness error of 30%, which appears as a positive bias in the current top‐down estimate of global BC direct radiative forcing. By contrast, the global spatial representativeness error is only 7% for the Global Atmosphere Watch network, because the sites are located in such a way that there are almost equal numbers of sites with positive and negative representativeness errors. PMID:29937603
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, William; Bauer, Greg; Bates, Brian; Williamson, Cathleen
2017-04-01
Surface topography is among the most fundamental data sets for geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation model that covers the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.
Stevens, Forrest R.; Gaughan, Andrea E.; Linard, Catherine; Tatem, Andrew J.
2015-01-01
High resolution, contemporary data on human population distributions are vital for measuring impacts of population growth, monitoring human-environment interactions and for planning and policy development. Many methods are used to disaggregate census data and predict population densities for finer scale, gridded population data sets. We present a new semi-automated dasymetric modeling approach that incorporates detailed census and ancillary data in a flexible “Random Forest” estimation technique. We outline the combination of widely available, remotely-sensed and geospatial data that contribute to the modeled dasymetric weights and then use the Random Forest model to generate a gridded prediction of population density at ~100 m spatial resolution. This prediction layer is then used as the weighting surface to perform dasymetric redistribution of the census counts at a country level. As a case study we compare the new algorithm and its products for three countries (Vietnam, Cambodia, and Kenya) with other common gridded population data production methodologies. We discuss the advantages of the new method and its gains in accuracy and flexibility over those previous approaches. Finally, we outline how this algorithm will be extended to provide freely-available gridded population data sets for Africa, Asia and Latin America. PMID:25689585
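The dasymetric redistribution step described above, using a predicted density surface as the weighting layer, can be sketched as follows. This is a minimal sketch of the proportional-allocation step only; the Random Forest prediction itself is assumed to have been produced separately.

```python
# Sketch of dasymetric redistribution: a census-unit population total is
# spread over its grid cells in proportion to a modeled weighting surface
# (the weights here stand in for the Random Forest density predictions).
from typing import List

def redistribute(census_count: float, weights: List[float]) -> List[float]:
    """Allocate a census-unit total to cells proportionally to their weights."""
    total = sum(weights)
    if total == 0:  # no weighted (habitable) cells: fall back to an even spread
        return [census_count / len(weights)] * len(weights)
    return [census_count * w / total for w in weights]
```

Because each unit's counts are only reallocated internally, summing the gridded output over a census unit reproduces the original census total exactly.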
US EPA GLOBAL POSITIONING SYSTEMS - TECHNICAL IMPLEMENTATION GUIDANCE
The U.S. EPA Geospatial Quality Council (GQC) was formed in 1998 to provide Quality Assurance guidance for the development, use, and products of geospatial activities and research. The long-term goals of the GQC are expressed in a living document, currently the EPA Geospatial Qua...
Integration of Geospatial Science in Teacher Education
ERIC Educational Resources Information Center
Hauselt, Peggy; Helzer, Jennifer
2012-01-01
One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…
75 FR 43497 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
...; System of Records AGENCY: National Geospatial-Intelligence Agency (NGA), DoD. ACTION: Notice to add a system of records. SUMMARY: The National Geospatial-Intelligence Agency (NGA) proposes to add a system of...-3808. SUPPLEMENTARY INFORMATION: The National Geospatial-Intelligence Agency notices for systems of...
Indigenous knowledges driving technological innovation
Lilian Alessa; Carlos Andrade; Phil Cash Cash; Christian P. Giardina; Matt Hamabata; Craig Hammer; Kai Henifin; Lee Joachim; Jay T. Johnson; Kekuhi Kealiikanakaoleohaililani; Deanna Kingston; Andrew Kliskey; Renee Pualani Louis; Amanda Lynch; Daryn McKenny; Chels Marshall; Mere Roberts; Taupouri Tangaro; Jyl Wheaton-Abraham; Everett Wingert
2011-01-01
This policy brief explores the use and expands the conversation on the ability of geospatial technologies to represent Indigenous cultural knowledge. Indigenous peoples' use of geospatial technologies has already proven to be a critical step for protecting tribal self-determination. However, the ontological frameworks and techniques of Western geospatial...
Malpeli, Katherine C.; Chirico, Peter G.
2014-01-01
The Central African Republic (CAR), a country with rich diamond deposits and a tumultuous political history, experienced a government takeover by the Seleka rebel coalition in 2013. It is within this context that we developed and implemented a geospatial approach for assessing the lootability of high value-to-weight resource deposits, using the case of diamonds in CAR as an example. According to current definitions of lootability, or the vulnerability of deposits to exploitation, CAR's two major diamond deposits are similarly lootable. However, using this geospatial approach, we demonstrate that the deposits experience differing political geographic, spatial location, and cultural geographic contexts, rendering the eastern deposits more lootable than the western deposits. The patterns identified through this detailed analysis highlight the geographic complexities surrounding the issue of conflict resources and lootability, and speak to the importance of examining these topics at the sub-national scale, rather than relying on national-scale statistics.
A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data
NASA Astrophysics Data System (ADS)
Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.
2017-12-01
Big Data is becoming the norm in geoscience domains. A platform capable of efficiently managing, accessing, analyzing, mining, and learning from big data for new information and knowledge is desired. This paper introduces our latest effort to develop such a platform based on our past years' experiences with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the 2nd layer is a cloud computing layer based on virtualization to provide on-demand computing services for the upper layers; c) the 3rd layer consists of big data containers customized for dealing with different types of data and functionalities; d) the 4th layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.
Increasing Diversity in Geosciences: Geospatial Initiatives at North Carolina Central University
NASA Astrophysics Data System (ADS)
Vlahovic, G.; Malhotra, R.; Renslow, M.; Harris, J.; Barnett, A.
2006-12-01
Two new initiatives funded by the NSF-GEO and NSF-HRD directorates have the potential to advance the geospatial program at North Carolina Central University (NCCU). As one of only two Historically Black Colleges and Universities (HBCUs) in the southeast offering Geography as a major, NCCU is establishing a GIS Research, Innovative Teaching, and Service (GRITS) Laboratory and has partnered with the American Society for Photogrammetry and Remote Sensing (ASPRS) to offer GIS certification to Geography graduates. This presentation will focus on the role that GRITS and GIS certification will play in attracting students to geoscience majors, the planned curriculum changes, and the emerging partnership with ASPRS to develop and offer "provisional certification" to NCCU students. In addition, the authors describe plans to promote geospatial education in partnership with other educational institutions. NCCU's high minority enrollment (at present approximately 90%) and the quality and tradition of its geoscience program make it an ideal incubator for accreditation and certification activities and a possible role model for other HBCUs.
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
ERIC Educational Resources Information Center
Bodzin, Alec; Peffer, Tamara; Kulo, Violet
2012-01-01
Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…
Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools
ERIC Educational Resources Information Center
Zalles, Daniel R.; Manitakos, James
2016-01-01
Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…
Fostering 21st Century Learning with Geospatial Technologies
ERIC Educational Resources Information Center
Hagevik, Rita A.
2011-01-01
Global positioning systems (GPS) receivers and other geospatial tools can help teachers create engaging, hands-on activities in all content areas. This article provides a rationale for using geospatial technologies in the middle grades and describes classroom-tested activities in English language arts, science, mathematics, and social studies.…
EPA GEOSPATIAL QUALITY COUNCIL STRATEGY PLAN FY-02
The EPA Geospatial Quality Council (GQC), previously known as the EPA GIS-QA Team - EPA/600/R-00/009, was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA...
Mapping and monitoring potato cropping systems in Maine: geospatial methods and land use assessments
USDA-ARS?s Scientific Manuscript database
Geospatial frameworks and GIS-based approaches were used to assess current cropping practices in potato production systems in Maine. Results from the geospatial integration of remotely-sensed cropland layers (2008-2011) and soil datasets for Maine revealed a four-year potato systems footprint estima...
The Virginia Geocoin Adventure: An Experiential Geospatial Learning Activity
ERIC Educational Resources Information Center
Johnson, Laura; McGee, John; Campbell, James; Hays, Amy
2013-01-01
Geospatial technologies have become increasingly prevalent across our society. Educators at all levels have expressed a need for additional resources that can be easily adopted to support geospatial literacy and state standards of learning, while enhancing the overall learning experience. The Virginia Geocoin Adventure supports the needs of 4-H…
ERIC Educational Resources Information Center
Reed, Philip A.; Ritz, John
2004-01-01
Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…
A Geospatial Online Instruction Model
ERIC Educational Resources Information Center
Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi
2012-01-01
The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…
lawn: An R client for the Turf JavaScript Library for Geospatial Analysis
lawn is an R package that provides access to the geospatial analysis capabilities of the Turf JavaScript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON providing an easier method for conducting geospatial analyses on thes...
Ma, Zhenling; Wu, Xiaoliang; Yan, Li; Xu, Zhenliang
2017-01-26
With the development of space technology and the improving performance of remote sensors, high-resolution satellites are continuously launched by countries around the world. Owing to its high efficiency, large coverage, and freedom from territorial access restrictions, satellite imagery has become one of the most important means of acquiring geospatial information. This paper explores geometric processing of satellite imagery without ground control points (GCPs). The outcome of spatial triangulation is introduced into geo-positioning as repeated observation. Results from combining block adjustment with non-oriented new images indicate the feasibility of geometric positioning with repeated observation. GCPs are a must when high accuracy is demanded in conventional block adjustment; the accuracy of direct georeferencing with repeated observation but without GCPs is superior to conventional forward intersection and even approaches that of conventional block adjustment with GCPs. The conclusion is drawn that taking existing oriented imagery as repeated observation enhances the effective utilization of previous spatial triangulation results, demonstrating that repeated observation can improve accuracy by increasing the base-height ratio and the number of redundant observations. Georeferencing tests using data from multiple sensors and platforms with repeated observation will be carried out in follow-up research.
McShane, Ryan R.; Driscoll, Katelyn P.; Sando, Roy
2017-09-27
Many approaches have been developed for measuring or estimating actual evapotranspiration (ETa), and research over many years has led to the development of remote sensing methods that are reliably reproducible and effective in estimating ETa. Several remote sensing methods can be used to estimate ETa at the high spatial resolution of agricultural fields and the large extent of river basins. More complex remote sensing methods apply an analytical approach to ETa estimation using physically based models of varied complexity that require a combination of ground-based and remote sensing data, and are grounded in the theory behind the surface energy balance model. This report, funded through cooperation with the International Joint Commission, provides an overview of selected remote sensing methods used for estimating water consumed through ETa and focuses on Mapping Evapotranspiration at High Resolution with Internalized Calibration (METRIC) and Operational Simplified Surface Energy Balance (SSEBop), two energy balance models for estimating ETa that are currently applied successfully in the United States. The METRIC model can produce maps of ETa at high spatial resolution (30 meters using Landsat data) for specific areas smaller than several hundred square kilometers in extent, an improvement in practice over methods used more generally at larger scales. Many studies validating METRIC estimates of ETa against measurements from lysimeters have shown model accuracies on daily to seasonal time scales ranging from 85 to 95 percent. The METRIC model is accurate, but the greater complexity of METRIC results in greater data requirements, and the internalized calibration of METRIC leads to greater skill required for implementation. In contrast, SSEBop is a simpler model, having reduced data requirements and greater ease of implementation without a substantial loss of accuracy in estimating ETa. 
The SSEBop model has been used to produce maps of ETa over very large extents (the conterminous United States) using lower spatial resolution (1 kilometer) Moderate Resolution Imaging Spectroradiometer (MODIS) data. Model accuracies ranging from 80 to 95 percent on daily to annual time scales have been shown in numerous studies that validated ETa estimates from SSEBop against eddy covariance measurements. The METRIC and SSEBop models can incorporate low and high spatial resolution data from MODIS and Landsat, but the high spatiotemporal resolution of ETa estimates using Landsat data over large extents requires immense computing power. Cloud computing is providing an opportunity for processing an increasing amount of geospatial “big data” in a decreasing period of time. For example, Google Earth Engine™ has been used to implement METRIC with automated calibration for regional-scale estimates of ETa using Landsat data. The U.S. Geological Survey also is using Google Earth Engine™ to implement SSEBop for estimating ETa in the United States at a continental scale using Landsat data.
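The core SSEBop calculation, an evapotranspiration fraction derived from land surface temperature relative to predefined hot and cold boundary conditions and scaled by reference ET, can be sketched per pixel as below. The clipping range and parameterization are simplified assumptions for illustration, not the operational model's full formulation.

```python
# Sketch of the per-pixel SSEBop calculation. Boundary conditions (cold
# temperature tc and hot/cold difference dt) are assumed given; in the
# operational model they are derived from climate and remote sensing data.
def ssebop_eta(ts: float, tc: float, dt: float, eto: float, k: float = 1.0) -> float:
    """Actual ET (mm/day) for one pixel.

    ts  -- observed land surface temperature (K)
    tc  -- cold boundary temperature (K)
    dt  -- predefined hot/cold temperature difference (K)
    eto -- reference ET (mm/day)
    k   -- scaling coefficient
    """
    th = tc + dt                    # hot (dry, bare) boundary temperature
    etf = (th - ts) / dt            # ET fraction: cool pixels transpire more
    etf = max(0.0, min(etf, 1.0))   # clip to a physically plausible range
    return etf * k * eto
```

A pixel at the cold boundary returns the full reference ET, a pixel at or above the hot boundary returns zero, and intermediate temperatures scale linearly between the two, which is the simplification that makes SSEBop easier to implement than METRIC.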
NASA Astrophysics Data System (ADS)
Ferrini, V. L.; Morton, J. J.; Carbotte, S. M.
2016-02-01
The Marine Geoscience Data System (MGDS: www.marine-geo.org) provides a suite of tools and services for free public access to data acquired throughout the global oceans including maps, grids, near-bottom photos, and geologic interpretations that are essential for habitat characterization and marine spatial planning. Users can explore, discover, and download data through a combination of APIs and front-end interfaces that include dynamic service-driven maps, a geospatially enabled search engine, and an easy to navigate user interface for browsing and discovering related data. MGDS offers domain-specific data curation with a team of scientists and data specialists who utilize a suite of back-end tools for introspection of data files and metadata assembly to verify data quality and ensure that data are well-documented for long-term preservation and re-use. Funded by the NSF as part of the multi-disciplinary IEDA Data Facility, MGDS also offers Data DOI registration and links between data and scientific publications. MGDS produces and curates the Global Multi-Resolution Topography Synthesis (GMRT: gmrt.marine-geo.org), a continuously updated Digital Elevation Model that seamlessly integrates multi-resolutional elevation data from a variety of sources including the GEBCO 2014 (~1 km resolution) and International Bathymetric Chart of the Southern Ocean (~500 m) compilations. A significant component of GMRT includes ship-based multibeam sonar data, publicly available through NOAA's National Centers for Environmental Information, that are cleaned and quality controlled by the MGDS Team and gridded at their full spatial resolution (typically ~100 m resolution in the deep sea). Additional components include gridded bathymetry products contributed by individual scientists (up to meter scale resolution in places), publicly accessible regional bathymetry, and high-resolution terrestrial elevation data.
New data are added to GMRT on an ongoing basis, with two scheduled releases per year. GMRT is available as both gridded data and images that can be viewed and downloaded directly through the Java application GeoMapApp (www.geomapapp.org) and the web-based GMRT MapTool. In addition, the GMRT GridServer API provides programmatic access to grids, imagery, profiles, and single point elevation values.
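A programmatic grid request of the kind the GridServer API supports might be constructed as below. The endpoint path and parameter names here are assumptions for illustration and should be checked against the service's own documentation; no network call is made in this sketch.

```python
# Sketch of building a bounding-box grid request URL for the GMRT
# GridServer API. Parameter names are illustrative assumptions.
from urllib.parse import urlencode

def gmrt_grid_url(west: float, east: float, south: float, north: float,
                  fmt: str = "geotiff") -> str:
    """Build a grid-download request URL for a geographic bounding box."""
    base = "https://www.gmrt.org/services/GridServer"
    params = {"west": west, "east": east, "south": south,
              "north": north, "format": fmt}
    return f"{base}?{urlencode(params)}"
```

The resulting URL could then be fetched with any HTTP client to retrieve the grid for the requested region.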
NASA Astrophysics Data System (ADS)
Gidey, Amanuel
2018-06-01
Determining the suitability and vulnerability of groundwater quality for irrigation use is an essential first step toward the careful management of groundwater resources to diminish impacts on irrigation. This study was conducted to determine the overall suitability of groundwater quality for irrigation use and to generate spatial distribution maps for the Elala catchment, Northern Ethiopia. Thirty-nine groundwater samples were collected to analyze and map the water quality variables. Atomic absorption spectrophotometer, ultraviolet spectrophotometer, titration, and calculation methods were used for laboratory groundwater quality analysis. ArcGIS, geospatial analysis tools, semivariogram model types, and interpolation methods were used to generate geospatial distribution maps. Twelve and eight water quality variables were used to produce the weighted overlay and irrigation water quality index models, respectively. Root-mean-square error, mean square error, absolute square error, mean error, root-mean-square standardized error, and measured versus predicted values were used for cross-validation. The overall weighted overlay model result showed that 146 km2 of the area is highly suitable, 135 km2 moderately suitable, and 60 km2 unsuitable for irrigation use. The irrigation water quality index result confirms 10.26% with no restriction, 23.08% with low restriction, 20.51% with moderate restriction, 15.38% with high restriction, and 30.76% with severe restriction for irrigation use. GIS and the irrigation water quality index are effective methods for managing irrigation water resources to achieve full-yield irrigation production, improve food security and sustain it over the long term, and avoid increasing environmental problems for future generations.
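The weighted-overlay step used in such suitability mapping can be sketched as follows. The variable names and weights are illustrative assumptions; the study's twelve actual variables and their weights are not reproduced here.

```python
# Sketch of a GIS weighted overlay for one map cell: each water quality
# variable is first reclassified to a common suitability score, then the
# scores are combined with weights that sum to 1.
from typing import Dict

def weighted_overlay(scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Combine per-variable suitability scores (e.g. 1 = unsuitable,
    3 = highly suitable) into a single suitability index for a cell."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[v] * weights[v] for v in weights)
```

Applied cell by cell across the interpolated surfaces of each variable, this yields the suitability raster that is then classified into the highly suitable, moderately suitable, and unsuitable zones reported above.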
UTZINGER, J.; RASO, G.; BROOKER, S.; DE SAVIGNY, D.; TANNER, M.; ØRNBJERG, N.; SINGER, B. H.; N’GORAN, E. K.
2009-01-01
SUMMARY In May 2001, the World Health Assembly (WHA) passed a resolution which urged member states to attain, by 2010, a minimum target of regularly administering anthelminthic drugs to at least 75% and up to 100% of all school-aged children at risk of morbidity. The refined global strategy for the prevention and control of schistosomiasis and soil-transmitted helminthiasis was issued in the following year and large-scale administration of anthelminthic drugs endorsed as the central feature. This strategy has subsequently been termed ‘preventive chemotherapy’. Clearly, the 2001 WHA resolution led the way for concurrently controlling multiple neglected tropical diseases. In this paper, we recall the schistosomiasis situation in Africa in mid-2003. Adhering to strategic guidelines issued by the World Health Organization, we estimate the projected annual treatment needs with praziquantel among the school-aged population and critically discuss these estimates. The important role of geospatial tools for disease risk mapping, surveillance and predictions for resource allocation is emphasised. We clarify that schistosomiasis is only one of many neglected tropical diseases and that considerable uncertainties remain regarding global burden estimates. We examine new control initiatives targeting schistosomiasis and other tropical diseases that are often neglected. The prospect and challenges of integrated control are discussed, and the need for combining biomedical, educational and engineering strategies and geospatial tools for sustainable disease control is highlighted. We conclude that, for achieving integrated and sustainable control of neglected tropical diseases, a set of interventions must be tailored to a given endemic setting and fine-tuned over time in response to the changing nature and impact of control. Consequently, besides the environment, the prevailing demographic, health and social systems contexts need to be considered. PMID:19906318
NASA Astrophysics Data System (ADS)
Clinton, N.; Stuhlmacher, M.; Miles, A.; Uludere, N.; Wagner, M.; Georgescu, M.; Herwig, C.; Gong, P.
2017-12-01
Despite substantial interest in urban agriculture, little is known about the aggregate benefits conferred by natural capital for growing food in cities. Here we perform a scenario-based analysis to quantify ecosystem services from adoption of urban agriculture at varying intensity. To drive the scenarios, we created global-scale estimates of vacant land, rooftop and building surface area, at one kilometer resolution, from remotely sensed and modeled geospatial data. We used national scale agricultural reports, climate and other geospatial data at global scale to estimate agricultural production and economic returns, storm-water avoidance, energy savings from avoided heating and cooling costs, and ecosystem services provided by nitrogen sequestration, pollination and biocontrol of pests. The results indicate that vacant lands, followed by rooftops, represent the largest opportunities for natural capital put to agricultural use in urban areas. Ecosystem services from putting such spaces to productive use are dominated by agricultural returns, but energy savings conferred by insulative characteristics of growth substrate also provide economic incentives. Storm-water avoidance was estimated to be substantial, but its economic value was not estimated. Relatively low economic returns were estimated from the other ecosystem services examined. In aggregate, approximately $10-100 billion in economic incentives, before costs, were estimated. The results showed that relatively developed, high-income countries stand to gain the most from urban agricultural adoption due to the unique combination of climate, crop mixture and crop prices. While the results indicate that urban agriculture is not a panacea for urban food security issues, there is potential to simultaneously ameliorate multiple issues around food, energy and water in urbanized areas.
NASA Astrophysics Data System (ADS)
Paffett, K.; Crossey, L. J.; Crowley, L.; Karlstrom, K. E.
2010-12-01
In the arid southwestern U.S., springs and their associated wetlands provide an opportunity for diverse ecosystems to flourish. With increasing encroachment, multiple-use requirements, and growing groundwater depletion, a better understanding of how the springs function is needed in order to properly manage them as a resource. Critical data on spring status (discharge patterns across seasons and water quality) are lacking for most springs. New strategies and environmental sensors can be employed to provide baseline information as well as continuous data. We report here on a systematic evaluation of a suite of springs of the Cibola National Forest in central New Mexico, including characteristics of discharge and water quality. The work is prompted by concerns about preserving vital habitat for the Zuni Bluehead Sucker in portions of the Cibola National Forest. Spring occurrence includes a range of elevation (2000-2500 m), vegetation type (arid grasslands to alpine wilderness), impact (livestock use, increased groundwater withdrawal, species of concern, and increased recreational use), and water quality (potable to saline). Many of the springs occur along fault structures, and are fed by groundwater from confined aquifer systems. Two levels of protocols are described: Level One for developing a baseline survey of water quality in managed lands (geospatial data, geologic map, systematic photography, discharge estimate, and field-determined water quality parameters); and Level Two Impact Evaluation Monitoring (which includes high-resolution geologic mapping, major ion chemistry, multiple sampling dates, and real-time autonomous logging of several parameters including temperature, pH, conductance, and dissolved oxygen). Data collected from the surveys are stored in a geospatial repository to serve as background for future monitoring of the water resources in the area.
The National Map hydrography data stewardship: what is it and why is it important?
Arnold, Dave
2014-01-01
The National Hydrography Dataset (NHD) and Watershed Boundary Dataset (WBD) were designed and populated by a large consortium of agencies involved in hydrography across the United States. The effort was led by the U.S. Geological Survey (USGS), the U.S. Environmental Protection Agency (EPA), and the Natural Resources Conservation Service (NRCS). The high-resolution NHD dataset, completed in 2007, is based on the USGS 7.5-minute series topographic maps at a scale of 1:24,000. There are now 26 million features in the NHD representing a 7.5 million mile stream network with over 6.5 million waterbodies. The six-level WBD, completed in 2010, is based on 1:24,000 scale data and contains over 23,000 watershed polygons. The NHD’s flow network, attribution, and linear referencing are used to conduct extensive scientific analyses. The NHD is ideal for cartographic applications such as the US Topo topographic map series, and also is available on the Geospatial Platform, which provides shared and trusted geospatial data, services, and applications for use by government agencies, their partners, and the public. The WBD watersheds are used by scientists and managers to identify discrete drainage areas. The ongoing maintenance of the NHD and WBD is essential for improving these datasets to meet the ever increasing demand for currency, additional detail, and more significant attribution. The best sources of information about changes in local hydrography are the users closest to the data, such as State and local governments, Federal land management agencies, and other data users. The need for local knowledge has led to the creation of a collaborative data stewardship process to revise and maintain the NHD.
OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets
NASA Astrophysics Data System (ADS)
Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa
2017-04-01
The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have now been quality-controlled: the shot gathers have been cross-checked and a comprehensive errata list has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided, together with guidelines for setting up a computing environment and plotting the data. An open access web service, "OpenFIRE", for the visualization and downloading of FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources, such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, including an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe)-compliant discovery metadata have been produced, and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed, and the service could be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire.
A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L
2011-01-01
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions (class-conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement for a large number of accurate training samples (10 to 30 |dimensions|), which are often costly and time-consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
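The semi-supervised idea summarized above, initializing per-class Gaussian parameters from a few labeled samples and refining them with expectation-maximization over abundant unlabeled pixels, can be sketched as follows. This is a minimal illustration under assumed diagonal covariances, not the paper's hybrid multisource algorithm; the function names are the editor's own.

```python
import numpy as np

def fit_semi_supervised(X_l, y_l, X_u, n_iter=20):
    """Refine per-class Gaussian parameters with EM, combining a handful of
    labeled samples (hard assignments) with many unlabeled samples (soft
    responsibilities). Diagonal covariances are assumed for brevity."""
    classes = np.unique(y_l)
    means = np.array([X_l[y_l == c].mean(axis=0) for c in classes])
    vars_ = np.array([X_l[y_l == c].var(axis=0) + 1e-6 for c in classes])
    priors = np.array([(y_l == c).mean() for c in classes])
    X_all = np.vstack([X_l, X_u])
    for _ in range(n_iter):
        # E-step: class responsibilities for the unlabeled samples
        log_p = (-0.5 * ((X_u[:, None, :] - means) ** 2 / vars_).sum(-1)
                 - 0.5 * np.log(vars_).sum(-1) + np.log(priors))
        resp = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: hard counts from labels plus soft counts from responsibilities
        for k, c in enumerate(classes):
            w = np.concatenate([(y_l == c).astype(float), resp[:, k]])
            means[k] = (w[:, None] * X_all).sum(0) / w.sum()
            vars_[k] = (w[:, None] * (X_all - means[k]) ** 2).sum(0) / w.sum() + 1e-6
            priors[k] = w.sum() / len(X_all)
    return classes, means, vars_, priors

def predict(X, classes, means, vars_, priors):
    """Maximum a posteriori class labels under the fitted Gaussians."""
    log_p = (-0.5 * ((X[:, None, :] - means) ** 2 / vars_).sum(-1)
             - 0.5 * np.log(vars_).sum(-1) + np.log(priors))
    return classes[log_p.argmax(axis=1)]
```

With only three labeled samples per class, the unlabeled pool sharpens the mean and variance estimates well beyond what the labeled set alone would allow.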
School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India
NASA Astrophysics Data System (ADS)
Agrawal, S.; Gupta, R. D.
2016-06-01
GIS is a collection of tools and techniques that work on geospatial data and are used in analysis and decision making. Education is an inherent part of any civil society. Proper educational facilities generate high-quality human resources for any nation. Therefore, the government needs an efficient system that can help in analysing the current state of education and its progress, and that can support decision making and policy framing. GIS can serve these requirements not only for the government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on the existing education policy and the state of its implementation. School mapping plays an important role in this respect. School mapping consists of building a geospatial database of schools that supports infrastructure development, policy analysis and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarva Shiksha Abhiyan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This research work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
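The core approach described above, drawing Latin Hypercube samples of input and coefficient errors and pushing each realization through the model to obtain a per-cell output distribution, can be sketched for a toy linear model. This is an illustrative sketch only, not REPTool's actual packages; the model form (output = coefficient x raster), the function names, and the error magnitudes are the editor's assumptions.

```python
import numpy as np
from statistics import NormalDist

def lhs_normal(n, mean, sd, rng):
    """n Latin Hypercube draws from N(mean, sd): one draw per
    equal-probability stratum, shuffled to break the ordering."""
    u = (np.arange(n) + np.clip(rng.random(n), 1e-9, 1 - 1e-9)) / n
    rng.shuffle(u)
    nd = NormalDist(mean, sd)
    return np.array([nd.inv_cdf(p) for p in u])

def propagate(raster, coef, raster_sd, coef_sd, n=400, seed=7):
    """Push n error realizations through a toy model out = coef * raster,
    returning the per-cell mean and standard deviation of the output.
    Raster error here is spatially variable (independent per cell)."""
    rng = np.random.default_rng(seed)
    coef_draws = lhs_normal(n, coef, coef_sd, rng)  # coefficient uncertainty
    outs = np.empty((n,) + raster.shape)
    for i in range(n):
        cell_err = rng.normal(0.0, raster_sd, raster.shape)
        outs[i] = coef_draws[i] * (raster + cell_err)
    return outs.mean(axis=0), outs.std(axis=0)
```

Comparing runs with each error source zeroed in turn recovers a relative variance contribution in the spirit of the method the abstract names.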
An updated geospatial liquefaction model for global application
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.
2017-01-01
We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
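The recommended model has the general form of a logistic regression on shaking intensity and saturation/density proxies. A minimal sketch of that form follows; the coefficient values below are placeholders chosen for illustration and are NOT the published regression coefficients of Zhu et al.

```python
import math

def liquefaction_probability(pgv, vs30, wtd_m, dist_water_km, precip_mm,
                             coef=None):
    """Geospatial liquefaction probability of the general logistic form
    P = 1 / (1 + exp(-X)), where X is a linear combination of proxies.
    Coefficients are illustrative placeholders, not published values."""
    if coef is None:
        coef = {"b0": 8.8, "ln_pgv": 0.33, "ln_vs30": -1.9,
                "wtd": -0.016, "dw": -0.22, "precip": 0.00053}
    x = (coef["b0"]
         + coef["ln_pgv"] * math.log(pgv)     # peak ground velocity, cm/s
         + coef["ln_vs30"] * math.log(vs30)   # slope-derived VS30, m/s
         + coef["wtd"] * wtd_m                # modeled water table depth, m
         + coef["dw"] * dist_water_km         # distance to closest water body
         + coef["precip"] * precip_mm)        # mean annual precipitation, mm
    return 1.0 / (1.0 + math.exp(-x))
```

As expected from the signs of the terms, a soft, wet, strongly shaken site yields a higher probability than a stiff, dry, weakly shaken one.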
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation of a geo-portal, which is designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo-portal, providing online geoinformation services over the internet, the e-government network and a classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data come from these nodes, and the different datasets are heterogeneous. Based on an analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. We then develop the technical procedure and propose methods, based on these principles, for processing different categories of features such as roads, railways, boundaries, rivers, settlements and buildings. A case study in Jiangsu province demonstrates the applicability of the principles, procedure and methods of multi-source geospatial data integration.
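When the same real-world feature arrives from several nodes, fusion principles of the kind listed above can be applied as an ordered tie-break. The sketch below is a hypothetical illustration of that idea, not NGISP's actual procedure; the record fields and priority order are assumptions.

```python
def fuse_duplicates(candidates):
    """Choose the best representation of one real-world feature reported by
    several nodes, applying fusion principles in priority order:
    positional accuracy first, then currency (up-to-date state), then
    attribute completeness. Record fields are hypothetical."""
    return max(candidates,
               key=lambda f: (-f["pos_error_m"], f["year"], len(f["attrs"])))
```

The tuple key means a more accurate position always wins; among equally accurate candidates, the most recent wins; and so on.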
Globe Browsing: Contextualized Spatio-Temporal Planetary Surface Visualization.
Bladin, Karl; Axelsson, Emil; Broberg, Erik; Emmart, Carter; Ljung, Patric; Bock, Alexander; Ynnerman, Anders
2017-08-29
Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty in grasping the context and the intricate acquisition process. We present work on tailoring and integration of multiple data processing and visualization methods to interactively contextualize geospatial surface data of celestial bodies for use in science communication. As our approach handles dynamic data sources, streamed from online repositories, we are significantly shortening the time between discovery and dissemination of data and results. We describe the image acquisition pipeline, the pre-processing steps to derive a 2.5D terrain, and a chunked level-of-detail, out-of-core rendering approach to enable interactive exploration of global maps and high-resolution digital terrain models. The results are demonstrated for three different celestial bodies. The first case addresses high-resolution map data on the surface of Mars. A second case is showing dynamic processes, such as concurrent weather conditions on Earth that require temporal datasets. As a final example we use data from the New Horizons spacecraft which acquired images during a single flyby of Pluto. We visualize the acquisition process as well as the resulting surface data. Our work has been implemented in the OpenSpace software [8], which enables interactive presentations in a range of environments such as immersive dome theaters, interactive touch tables, and virtual reality headsets.
Environmental data analysis and remote sensing for early detection of dengue and malaria
NASA Astrophysics Data System (ADS)
Rahman, Md Z.; Roytman, Leonid; Kadik, Abdelhamid; Rosy, Dilara A.
2014-06-01
Malaria and dengue fever are the two most common mosquito-transmitted diseases, leading to millions of serious illnesses and deaths each year. Because the mosquito vectors are sensitive to environmental conditions such as temperature, precipitation, and humidity, it is possible to map areas currently or imminently at high risk for disease outbreaks using satellite remote sensing. In this paper we propose the development of an operational geospatial system for malaria and dengue fever early warning; this can be done by bringing together geographic information system (GIS) tools, artificial neural networks (ANN) for efficient pattern recognition, the best available ground-based epidemiological and vector ecology data, and current satellite remote sensing capabilities. We use Vegetation Health Indices (VHI) derived from visible and infrared radiances measured by satellite-mounted Advanced Very High Resolution Radiometers (AVHRR) and available weekly at 4-km resolution as one predictor of malaria and dengue fever risk in Bangladesh. As a study area, we focus on Bangladesh where malaria and dengue fever are serious public health threats. The technology developed will, however, be largely portable to other countries in the world and applicable to other disease threats. A malaria and dengue fever early warning system will be a boon to international public health, enabling resources to be focused where they will do the most good for stopping pandemics, and will be an invaluable decision support tool for national security assessment and potential troop deployment in regions susceptible to disease outbreaks.
Remote sensing applied to resource management
Henry M. Lachowski
1998-01-01
Effective management of forest resources requires access to current and consistent geospatial information that can be shared by resource managers and the public. Geospatial information describing our land and natural resources comes from many sources and is most effective when stored in a geospatial database and used in a geographic information system (GIS). The...
ERIC Educational Resources Information Center
Kulo, Violet; Bodzin, Alec
2013-01-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…
Introduction to the Complex Geospatial Web in Geographical Education
ERIC Educational Resources Information Center
Papadimitriou, Fivos
2010-01-01
The Geospatial Web is emerging in the geographical education landscape in all its complexity. How will geographers and educators react? What are the most important facets of this development? After reviewing the possible impacts on geographical education, it can be conjectured that the Geospatial Web will eventually replace the usual geographical…
ERIC Educational Resources Information Center
Bodzin, Alec M.; Fu, Qiong; Bressler, Denise; Vallera, Farah L.
2015-01-01
Geospatially enabled learning technologies may enhance Earth science learning by placing emphasis on geographic space, visualization, scale, representation, and geospatial thinking and reasoning (GTR) skills. This study examined if and how a series of Web geographic information system investigations that the researchers developed improved urban…
Dylan Hettinger, Geospatial Data Scientist, Dylan.Hettinger@nrel.gov | 303-275-3750. Dylan Hettinger is a member of the Geospatial Data Science team within the Systems Modeling & Geospatial Data Science Group in the Strategic Energy Analysis Center. Areas of Expertise
ERIC Educational Resources Information Center
Hanley, Carol D.; Davis, Hilarie B.; Davey, Bradford T.
2012-01-01
As use of geospatial technologies has increased in the workplace, so has interest in using these technologies in the K-12 classroom. Prior research has identified several reasons for using geospatial technologies in the classroom, such as developing spatial thinking, supporting local investigations, analyzing changes in the environment, and…
The Sky's the Limit: Integrating Geospatial Tools with Pre-College Youth Education
ERIC Educational Resources Information Center
McGee, John; Kirwan, Jeff
2010-01-01
Geospatial tools, which include global positioning systems (GPS), geographic information systems (GIS), and remote sensing, are increasingly driving a variety of applications. Local governments and private industry are embracing these tools, and the public is beginning to demand geospatial services. The U.S. Department of Labor (DOL) reported that…
Geospatial Services in Special Libraries: A Needs Assessment Perspective
ERIC Educational Resources Information Center
Barnes, Ilana
2013-01-01
Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…
Using the Geospatial Web to Deliver and Teach Giscience Education Programs
NASA Astrophysics Data System (ADS)
Veenendaal, B.
2015-05-01
Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor driving this change is the role of the geospatial web in GIScience. In addition to enabling and enhancing GIScience education, the web also serves as the infrastructure for sharing geospatial data and for communication and collaboration among users. The web thus becomes both the means and the content of a geospatial education program. However, the web does not replace the traditional face-to-face environment; rather, it is a means to enhance and expand it and to enable an authentic, real-world learning environment. This paper outlines the use of the web in both the delivery and the content of the GIScience program at Curtin University. The teaching of the geospatial web, web- and cloud-based mapping, and geospatial web services is a key component of the program, and the web and online learning are important to its delivery. Some examples of authentic, real-world learning environments are provided, including joint learning activities with partner universities.
A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs
Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav
2015-01-01
With the increasing abundance of technologies and smart devices equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the "right" information for the appropriate purpose. This paper describes an approach to creating semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and an implementation for searching and querying the semantic geospatial metadata repository, enabling a user or third-party system to find the information they are looking for. PMID:26205265
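The first enrichment step, turning a photo's raw lat/lon into higher-level place metadata, can be sketched with a simple nearest-place lookup. The gazetteer record layout below is hypothetical; the published system queries several live geospatial data sources rather than a local list.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dl / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def enrich_geotag(lat, lon, gazetteer):
    """Attach the nearest named place to a photo's raw lat/lon geotag.
    gazetteer is a list of {"name", "lat", "lon"} dicts (hypothetical)."""
    nearest = min(gazetteer,
                  key=lambda p: haversine_km(lat, lon, p["lat"], p["lon"]))
    return {"lat": lat, "lon": lon, "nearestPlace": nearest["name"]}
```

The resulting place name is the kind of semantic handle that makes the photograph discoverable by text query rather than coordinates alone.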
Citing geospatial feature inventories with XML manifests
NASA Astrophysics Data System (ADS)
Bose, R.; McGarva, G.
2006-12-01
Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
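A feature manifest of the kind proposed above, a portable listing of versioned features, can be serialized as a small XML document. The element and attribute names in this sketch are illustrative assumptions, not an Ordnance Survey MasterMap or GML schema.

```python
import xml.etree.ElementTree as ET

def build_manifest(dataset, features):
    """Serialize a citable inventory of versioned geographic features as a
    minimal XML manifest. `features` is an iterable of (id, version) pairs;
    element and attribute names are illustrative, not a published schema."""
    root = ET.Element("featureManifest", dataset=dataset)
    for fid, version in features:
        ET.SubElement(root, "feature", id=str(fid), version=str(version))
    return ET.tostring(root, encoding="unicode")
```

Such a manifest could be embedded in the identification section of a metadata record and later resolved against a curated repository holding the cited feature versions.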
NASA Astrophysics Data System (ADS)
Lv, Zheng; Sui, Haigang; Zhang, Xilin; Huang, Xianfeng
2007-11-01
As one of the most important geospatial objects and military establishments, an airport is always a key target in the fields of transportation and military affairs. Automatic recognition and extraction of airports from remote sensing images is therefore important and urgent for the updating of civil aviation data and for military applications. In this paper, a new multi-source data fusion approach to automatic airport information extraction, updating and 3D modeling is presented. The corresponding key technologies are discussed in detail, including feature extraction of airport information based on a modified Otsu algorithm, automatic change detection based on a new parallel-lines-based buffer detection algorithm, 3D modeling based on a gradual elimination of non-building points algorithm, 3D change detection between an old airport model and LIDAR data, and the import of typical CAD models. Finally, based on these technologies, we develop a prototype system, and the results show that our method achieves good results.
NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska
NASA Astrophysics Data System (ADS)
Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.
2012-12-01
Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving the interoperability of GIS. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux- and Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers.
The system provides online access, querying, visualization, and analysis of the hydrological data from several sources at one place. The study indicates that internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering and suggests that such a system can support advanced hydrological data access and analysis tools to improve utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS server.
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. The prototype will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and to online tools for their processing. A flexible modular computational engine running verified data-processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. 
Additionally, exporting of data processing results through WMS and WFS services will be used to provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
NASA Astrophysics Data System (ADS)
Santillan, J. R.; Amora, A. M.; Makinano-Santillan, M.; Marqueso, J. T.; Cutamora, L. C.; Serviano, J. L.; Makinano, R. M.
2016-06-01
In this paper, we present a combined geospatial and two-dimensional (2D) flood modeling approach to assess the impacts of flooding due to extreme rainfall events. We developed and applied this approach in the Tago River Basin in the province of Surigao del Sur in Mindanao, Philippines, an area which suffered great damage from flooding caused by Tropical Storms Lingling and Jangmi in 2014. The geospatial component of the approach involves extraction of several layers of information, such as detailed topography/terrain and man-made features (buildings, roads, bridges) from 1-m spatial resolution LiDAR Digital Surface and Digital Terrain Models (DSMs/DTMs), and recent land cover from Landsat 7 ETM+ and Landsat 8 OLI images. We then used these layers as inputs in developing a Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS)-based hydrologic model, and a hydraulic model based on the 2D module of the latest version of HEC River Analysis System (HEC-RAS), to dynamically simulate and map the depth and extent of flooding due to extreme rainfall events. The extreme rainfall events used in the simulation represent six hypothetical rainfall events with return periods of 2, 5, 10, 25, 50, and 100 years. For each event, maximum flood depth maps were generated from the simulations, and these maps were further transformed into hazard maps by categorizing the flood depth into low, medium and high hazard levels. Using both the flood hazard maps and the layers of information extracted from remotely sensed datasets in spatial overlay analysis, we were then able to estimate and assess the impacts of these flooding events on buildings, roads, bridges and land cover. The results of the assessments revealed increases in the number of buildings, roads and bridges, and in the area of land cover, exposed to various flood hazards as rainfall events become more extreme. 
The wealth of information generated from the flood impact assessment using this approach can be very useful to the local government units and the concerned communities within the Tago River Basin as an aid in determining in advance which infrastructure (buildings, roads and bridges) and land cover can be affected by different extreme rainfall flood scenarios.
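The two post-processing steps the abstract describes, categorizing simulated flood depths into hazard levels and overlaying feature layers to count exposed assets, can be sketched as follows. The depth cut-offs are illustrative assumptions; the exact thresholds used for the Tago River Basin maps are not stated here.

```python
import numpy as np

def classify_hazard(depth, low_max=0.5, med_max=1.5):
    """Categorize a flood-depth raster (metres) into hazard classes:
    0 = not flooded, 1 = low, 2 = medium, 3 = high.
    The cut-off depths are illustrative, not the study's actual values."""
    return np.digitize(depth, [1e-6, low_max, med_max])

def exposed_count(hazard, feature_mask, min_level):
    """Spatial overlay: count feature cells (e.g. building footprints
    rasterized to the flood grid) at or above a given hazard level."""
    return int(np.sum((hazard >= min_level) & feature_mask))
```

Running this per return-period scenario yields the exposure counts whose growth with event severity the study reports.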
Introduction to geospatial semantics and technology workshop handbook
Varanka, Dalia E.
2012-01-01
The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.
The Value of Information - Accounting for a New Geospatial Paradigm
NASA Astrophysics Data System (ADS)
Pearlman, J.; Coote, A. M.
2014-12-01
A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, it is clearly their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so it is not at the top of the agenda for Government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates: it needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.
Understanding needs and barriers to using geospatial tools for public health policymaking in China.
Kim, Dohyeong; Zhang, Yingyuan; Lee, Chang Kil
2018-05-07
Despite growing popularity of using geographical information systems and geospatial tools in public health fields, these tools are only rarely implemented in health policy management in China. This study examines the barriers that could prevent policy-makers from applying such tools to actual managerial processes related to public health problems that could be assisted by such approaches, e.g. evidence-based policy-making. A questionnaire-based survey of 127 health-related experts and other stakeholders in China revealed that there is a consensus on the needs and demands for the use of geospatial tools, which shows that there is a more unified opinion on the matter than so far reported. Respondents pointed to lack of communication and collaboration among stakeholders as the most significant barrier to the implementation of geospatial tools. Comparison of survey results to those emanating from a similar study in Bangladesh revealed different priorities concerning the use of geospatial tools between the two countries. In addition, the follow-up in-depth interviews highlighted the political culture specific to China as a critical barrier to adopting new tools in policy development. Other barriers included concerns over the limited awareness of the availability of advanced geospatial tools. Taken together, these findings can facilitate a better understanding among policy-makers and practitioners of the challenges and opportunities for widespread adoption and implementation of a geospatial approach to public health policy-making in China.
USDA-ARS?s Scientific Manuscript database
The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...
NASA Astrophysics Data System (ADS)
Marzolff, Irene
2014-05-01
One hundred years after the first publication on aerial photography taken from unmanned aerial platforms (Arthur Batut 1890), small-format aerial photography (SFAP) became a distinct niche within remote sensing during the 1990s. Geographers, plant biologists, archaeologists and other researchers with geospatial interests re-discovered the usefulness of unmanned platforms for taking high-resolution, low-altitude photographs that could then be digitized and analysed with geographical information systems, (softcopy) photogrammetry and image processing techniques originally developed for digital satellite imagery. Even before the ubiquity of digital consumer-grade cameras and 3D analysis software accessible to the photogrammetric layperson, do-it-yourself remote sensing using kites, blimps, drones and micro air vehicles literally enabled the questing researcher to get their own pictures of the world. As a flexible, cost-effective method, SFAP offered images with high spatial and temporal resolutions that could be ideally adapted to the scales of landscapes, forms and distribution patterns to be monitored. During the last five years, this development has been significantly accelerated by the rapid technological advancements of GPS navigation, autopiloting and revolutionary softcopy-photogrammetry techniques. State-of-the-art unmanned aerial systems (UAS) now allow automatic flight planning, autopilot-controlled aerial surveys, ground control-free direct georeferencing and DEM plus orthophoto generation with centimeter accuracy, all within the space of one day. The ease of use of current UAS and processing software for the generation of high-resolution topographic datasets and spectacular visualizations is tempting and has spurred the number of publications on these issues - but which advancements in our knowledge and understanding of geomorphological processes have we seen and can we expect in the future? 
This presentation traces the development of the last two decades by presenting and discussing examples for geomorphological research using UAS, mostly from the field of soil erosion monitoring.
Environmental Controls on Above-Ground Biomass in the Taita Hills, Kenya
NASA Astrophysics Data System (ADS)
Adhikari, H.; Heiskanen, J.; Siljander, M.; Maeda, E. E.; Heikinheimo, V.; Pellikka, P.
2016-12-01
Tropical forests are globally significant ecosystems which maintain high biodiversity and provide valuable ecosystem services, including carbon storage and climate change mitigation and adaptation. These ecosystems have been severely degraded for decades. However, the magnitude and spatial patterns of above-ground biomass (AGB) in tropical forest-agriculture landscapes are highly variable, even under the same climatic conditions and land use. This work aims 1) to generate a wall-to-wall map of AGB density for the Taita Hills in Kenya based on field measurements and airborne laser scanning (ALS) and 2) to examine environmental controls on AGB using geospatial data sets on topography, soils, climate and land use, and statistical modelling. The study area (67,000 ha) is located in the northernmost part of the Eastern Arc Mountains of Kenya and Tanzania, and the highest hilltops reach over 2,200 m in elevation. Most of the forest area has been cleared for croplands and agroforestry, and the hills are surrounded by semi-arid scrublands and dry savannah at an elevation of 600-900 m a.s.l. As a result, the current land cover is a mosaic of various types of land cover and land use. Field measurements were carried out in a total of 216 plots in 2013-2015 for AGB computations, and ALS flights were conducted in 2014-2015. An AGB map at 30 m x 30 m resolution was produced using multiple linear regression based on ALS variables derived from the point cloud, namely canopy cover and the 25th percentile height of ALS returns (R2 = 0.88). Boosted regression trees (BRT) were used to examine the relationship between AGB and explanatory variables derived from an ALS-based high-resolution DEM (2 m resolution), a soil database, downscaled climate data, and land cover/use maps based on satellite image analysis. The results of these analyses will be presented at the conference.
NASA Astrophysics Data System (ADS)
Isaak, D.; Wenger, S. J.; Peterson, E.; Ver Hoef, J.; Luce, C.; Hostetler, S.
2015-12-01
Climate change is warming streams across the western U.S. and threatens billions of dollars of investments made to conserve valuable cold-water species like trout and salmon. Efficient threat response requires prioritization of limited conservation resources and coordinated interagency efforts guided by accurate information about climate at scales relevant to the distributions of species across landscapes. To provide that information, the NorWeST project was initiated in 2011 to aggregate stream temperature data from all available sources and create high-resolution climate scenarios. The database has since grown into the largest of its kind globally, and now consists of >60,000,000 hourly temperature recordings at >20,000 unique stream sites that were contributed by hundreds of professionals working for >95 state, federal, tribal, municipal, county, and private resource agencies. This poster shows a high-resolution (1-kilometer) summer temperature scenario created with these data and mapped to 800,000 kilometers of network across eight western states (ID, WA, OR, MT, WY, UT, NV, CA). The geospatial data associated with this climate scenario and thirty others developed in this project are distributed in user-friendly digital formats through the NorWeST website (http://www.fs.fed.us/rm/boise/AWAE/projects/NorWeST.shtml). The accuracy, utility, and convenience of NorWeST data products have led to their rapid adoption and use by the management and research communities for conservation planning, inter-agency coordination of monitoring networks, and new research on stream temperatures and thermal ecology. A project of this scope and utility was possible only through crowd-sourcing techniques, which have also served to engage data contributors in the process of science creation while strengthening the social networks needed for effective conservation.
NASA Astrophysics Data System (ADS)
Haghighattalab, Atena
Wheat breeders are in a race for genetic gain to secure the future nutritional needs of a growing population. Multiple barriers exist in the acceleration of crop improvement. Emerging technologies are reducing these obstacles. Advances in genotyping technologies have significantly decreased the cost of characterizing the genetic make-up of candidate breeding lines. However, this is just part of the equation. Field-based phenotyping informs a breeder's decision as to which lines move forward in the breeding cycle. This has long been the most expensive and time-consuming, though most critical, aspect of breeding. The grand challenge remains in connecting genetic variants to observed phenotypes, followed by predicting phenotypes based on the genetic composition of lines or cultivars. In this context, the current study was undertaken to investigate the utility of UAS in assessing field trials in wheat breeding programs. The major objective was to integrate remotely sensed data with geospatial analysis for high-throughput phenotyping of large wheat breeding nurseries. The initial step was to develop and validate a semi-automated high-throughput phenotyping pipeline using a low-cost UAS and NIR camera, image processing, and radiometric calibration to build orthomosaic imagery and 3D models. The relationships between plot-level data (vegetation indices and height) extracted from UAS imagery and manual measurements were examined and found to show high correlation. Data derived from UAS imagery performed as well as manual measurements while exponentially increasing the amount of data available. The high-resolution, high-temporal HTP data extracted from this pipeline offered the opportunity to develop a within-season grain yield prediction model. Due to the variety of genotypes and environmental conditions, breeding trials are inherently spatial in nature and vary non-randomly across the field.
This makes geographically weighted regression a good choice as a geospatial prediction model. Finally, with the addition of the georeferenced spatial data integral to HTP and imagery, we were able to reduce the environmental effect in the data and increase the accuracy of UAS plot-level data. The models developed through this research, when combined with genotyping technologies, increase the volume, accuracy, and reliability of phenotypic data to better inform breeder selections. This increased accuracy in evaluating and predicting grain yield will help breeders rapidly identify and advance the most promising candidate wheat varieties.
Geospatial Technology Strategic Plan 1997-2000
D'Erchia, Frank; D'Erchia, Terry D.; Getter, James; McNiff, Marcia; Root, Ralph; Stitt, Susan; White, Barbara
1997-01-01
Executive Summary -- Geospatial technology applications have been identified in many U.S. Geological Survey Biological Resources Division (BRD) proposals for grants awarded through internal and partnership programs. Because geospatial data and tools have become more sophisticated, accessible, and easy to use, BRD scientists frequently are using these tools and capabilities to enhance a broad spectrum of research activities. Bruce Babbitt, Secretary of the Interior, has acknowledged--and lauded--the important role of geospatial technology in natural resources management. In his keynote address to more than 5,500 people representing 87 countries at the Environmental Systems Research Institute Annual Conference (May 21, 1996), Secretary Babbitt stated, '. . .GIS [geographic information systems], if properly used, can provide a lot more than sets of data. Used effectively, it can help stakeholders to bring consensus out of conflict. And it can, by providing information, empower the participants to find new solutions to their problems.' This Geospatial Technology Strategic Plan addresses the use and application of geographic information systems, remote sensing, satellite positioning systems, image processing, and telemetry; describes methods of meeting national plans relating to geospatial data development, management, and serving; and provides guidance for sharing expertise and information. Goals are identified along with guidelines that focus on data sharing, training, and technology transfer. To measure success, critical performance indicators are included. The ability of the BRD to use and apply geospatial technology across all disciplines will greatly depend upon its success in transferring the technology to field biologists and researchers. The Geospatial Technology Strategic Planning Development Team coordinated and produced this document in the spirit of this premise. 
Individual Center and Program managers have the responsibility to implement the Strategic Plan by working within the policy and guidelines stated herein.
Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre
2017-07-01
As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. 
Geospatial cryptography holds substantial promise for accelerating the pace of research with spatially referenced human subjects data.
Carswell, William J.
2011-01-01
increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and public availability of the acquired data. The Emergency Operations Office provides requirements to The National Map and, during emergencies and natural disasters, rapidly disseminates information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O’Connor, Ben L.; Hamada, Yuki; Bowen, Esther E.
2014-08-17
Large areas of public lands administered by the Bureau of Land Management and located in arid regions of the southwestern United States are being considered for the development of utility-scale solar energy facilities. Land-disturbing activities in these desert, alluvium-filled valleys have the potential to adversely affect the hydrologic and ecologic functions of ephemeral streams. Regulation and management of ephemeral streams typically falls under a spectrum of federal, state, and local programs, but scientifically based guidelines for protecting ephemeral streams with respect to land-development activities are largely nonexistent. This study developed an assessment approach for quantifying the sensitivity to land disturbance of ephemeral stream reaches located in proposed solar energy zones (SEZs). The ephemeral stream assessment approach used publicly available geospatial data on hydrology, topography, surficial geology, and soil characteristics, as well as high-resolution aerial imagery. These datasets were used to inform a professional judgment-based score index of potential land disturbance impacts on selected critical functions of ephemeral streams, including flow and sediment conveyance, ecological habitat value, and groundwater recharge. The total sensitivity scores (sum of scores for the critical stream functions of flow and sediment conveyance, ecological habitats, and groundwater recharge) were used to identify highly sensitive stream reaches to inform decisions on developable areas in SEZs. Total sensitivity scores typically reflected the scores of the individual stream functions; some exceptions pertain to groundwater recharge and ecological habitats.
The primary limitations of this assessment approach were the lack of high-resolution identification of ephemeral stream channels in the existing National Hydrography Dataset, and the lack of mechanistic processes describing potential impacts on ephemeral stream functions at the watershed scale. The primary strength of this assessment approach is that it allows watershed-scale planning for low-impact development in arid ecosystems; the qualitative scoring of potential impacts can also be adjusted to accommodate new geospatial data, and to allow for expert and stakeholder input into decisions regarding the identification and potential avoidance of highly sensitive stream reaches.
Visualization and Ontology of Geospatial Intelligence
NASA Astrophysics Data System (ADS)
Chan, Yupo
Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, as afforded by the ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work, travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.
Intelligent services for discovery of complex geospatial features from remote sensing imagery
NASA Astrophysics Data System (ADS)
Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo
2013-09-01
Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.
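The spatial-composition idea described above can be illustrated with a minimal sketch: a candidate complex feature is declared wherever the required elementary feature types co-occur within a distance threshold. The feature list, type names, and 2 km threshold below are illustrative assumptions, not the paper's WFS-based service chains:

```python
import math

# Sketch of complex-feature discovery by spatial composition: a complex
# feature candidate is any "building" whose neighbourhood contains all
# required elementary feature types. Coordinates, types, and the 2 km
# threshold are illustrative assumptions for this sketch.

features = [  # (type, x_km, y_km) from hypothetical extraction services
    ("building",      10.0, 10.0),
    ("cooling_tower", 10.8, 10.4),
    ("road",           9.5, 10.1),
    ("building",      40.0,  5.0),   # isolated building: not a match
]

REQUIRED = {"building", "cooling_tower", "road"}
THRESHOLD_KM = 2.0

def near(a, b):
    """True if two features lie within the composition threshold."""
    return math.dist(a[1:], b[1:]) <= THRESHOLD_KM

candidates = []
for anchor in (f for f in features if f[0] == "building"):
    types_nearby = {f[0] for f in features if near(anchor, f)}
    if REQUIRED <= types_nearby:           # all required types co-occur
        candidates.append(anchor)

print(candidates)
```

In a real service chain, the in-memory feature list would be replaced by queries against distributed Web Feature Services discovered from a catalogue.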
Finding geospatial pattern of unstructured data by clustering routes
NASA Astrophysics Data System (ADS)
Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.
2016-12-01
Today the majority of data generated has a geospatial context, whether as an explicit attribute such as a latitude or longitude, a place name, or a location that can be cross-referenced using other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and elsewhere. We are working together on the DARPA MEMEX project to exploit open source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP, to automatically extract, and make meaning out of, geospatial information. In particular, we are interested in unstructured descriptors, e.g., a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to easily identifiable and sometimes more difficult-to-track patterns. We will present a set of automatic techniques to extract descriptors and then to geospatially infer their paths across unstructured data.
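The descriptor-to-path idea can be sketched minimally: pull phone-number descriptors out of free-text records with a regular expression, then group each record's resolved location by descriptor in time order. The records, regex, and schema below are simplified assumptions, not the actual MEMEX pipeline built on Apache Tika, OpenNLP, and the Lucene Geo Gazetteer:

```python
import re
from collections import defaultdict

# Sketch of descriptor-based path learning: extract a phone-number
# descriptor from free text, then accumulate each record's resolved
# location under that descriptor in time order to recover a path.
# Records and regex are illustrative assumptions for this sketch.

PHONE = re.compile(r"\b\d{3}-\d{3}-\d{4}\b")

records = [  # (timestamp, scraped text, location resolved by a gazetteer)
    (1, "call 555-867-5309 anytime", "Los Angeles"),
    (2, "reach me at 555-867-5309", "Las Vegas"),
    (3, "new number 555-123-4567", "Phoenix"),
    (4, "555-867-5309 back in town", "Los Angeles"),
]

paths = defaultdict(list)
for ts, text, location in sorted(records):
    for phone in PHONE.findall(text):
        paths[phone].append(location)

print(dict(paths))
```

The same grouping generalizes to any extractable descriptor (named entities, email addresses), with the gazetteer step supplying the geospatial context.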
Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data
NASA Astrophysics Data System (ADS)
Sibolla, B. H.; Van Zyl, T.; Coetzee, S.
2016-06-01
Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open-source, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done to inform the design phase of the library development, such that an existing taxonomy could either be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction, in order to help the user understand data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.
NASA Astrophysics Data System (ADS)
Bunds, M. P.
2017-12-01
Point clouds are a powerful data source in the geosciences, and the emergence of structure-from-motion (SfM) photogrammetric techniques has allowed them to be generated quickly and inexpensively. Consequently, applications of them as well as methods to generate, manipulate, and analyze them warrant inclusion in undergraduate curriculum. In a new course called Geospatial Field Methods at Utah Valley University, students in small groups use SfM to generate a point cloud from imagery collected with a small unmanned aerial system (sUAS) and use it as a primary data source for a research project. Before creating their point clouds, students develop needed technical skills in laboratory and class activities. The students then apply the skills to construct the point clouds, and the research projects and point cloud construction serve as a central theme for the class. Intended student outcomes for the class include: technical skills related to acquiring, processing, and analyzing geospatial data; improved ability to carry out a research project; and increased knowledge related to their specific project. To construct the point clouds, students first plan their field work by outlining the field site, identifying locations for ground control points (GCPs), and loading them onto a handheld GPS for use in the field. They also estimate sUAS flight elevation, speed, and the flight path grid spacing required to produce a point cloud with the resolution required for their project goals. In the field, the students place the GCPs using handheld GPS, and survey the GCP locations using post-processed-kinematic (PPK) or real-time-kinematic (RTK) methods. The students pilot the sUAS and operate its camera according to the parameters that they estimated in planning their field work. Data processing includes obtaining accurate locations for the PPK/RTK base station and GCPs, and SfM processing with Agisoft Photoscan. 
The resulting point clouds are rasterized into digital surface models, assessed for accuracy, and analyzed in Geographic Information System software. Student projects have included mapping and analyzing landslide morphology, fault scarps, and earthquake ground surface rupture. Students have praised the geospatial skills they learn, whereas helping them stay on schedule to finish their projects is a challenge.
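The flight-planning step the students perform, choosing altitude and photo spacing for a target point-cloud resolution, follows from the standard pinhole-camera ground-sample-distance relation; the camera parameters below are illustrative assumptions, not values from the course:

```python
# Flight-planning sketch: altitude and photo spacing for a target ground
# sample distance (GSD), from the standard pinhole-camera relation
#   GSD = altitude * pixel_pitch / focal_length
# Camera values below are illustrative assumptions.

def altitude_for_gsd(gsd_m, pixel_pitch_m, focal_length_m):
    """Flight altitude (m above ground) that yields the requested GSD."""
    return gsd_m * focal_length_m / pixel_pitch_m

def photo_spacing(image_width_px, gsd_m, overlap):
    """Distance between exposures for a given forward/side overlap."""
    footprint = image_width_px * gsd_m      # ground footprint of one image
    return footprint * (1.0 - overlap)

# Example: 2.4 um pixels, 8.8 mm lens, 5472 px wide sensor, 80 % overlap
alt = altitude_for_gsd(0.02, 2.4e-6, 8.8e-3)     # 2 cm/px target
spacing = photo_spacing(5472, 0.02, 0.80)
print(round(alt, 1), round(spacing, 1))
```

Flight speed then follows from the camera's minimum shutter interval and the computed spacing.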
Analysis of ArcticDEM orthorectification for polar navigational traverses
NASA Astrophysics Data System (ADS)
Menio, E. C.; Deeb, E. J.; Weale, J.; Courville, Z.; Tracy, B.; Cloutier, M. D.; Cothren, J. D.; Liu, J.
2017-12-01
The availability and accessibility of high-resolution satellite imagery allows operational support teams to visually assess physical risks along traverse routes before and during the field season. In support of operations along the Greenland Inland Traverse (GrIT), DigitalGlobe's WorldView 0.5m resolution panchromatic imagery is analyzed to identify and digitize crevasse features along the route from Thule Air Force Base to Summit Station, Greenland. In the spring of 2016, field teams reported up to 150 meters of offset between the location of crevasse features on the ground and the location of the same features on the imagery provided. Investigation into this issue identified the need to orthorectify imagery—use digital elevation models (DEMs) to correct viewing geometry distortions—to improve navigational accuracy in the field. It was previously thought that orthorectification was not necessary for applications in relatively flat terrain such as ice sheets. However, the surface elevations on the margins of the Greenland Ice Sheet vary enough to cause distortions in imagery, if taken obliquely. As is standard for requests, the Polar Geospatial Center (PGC) provides orthorectified imagery using the MEaSUREs Greenland Ice Mapping Project (GIMP) 30m digital elevation model. Current, higher-resolution elevation datasets, such as the ArcticDEM (2-5m resolution) and WorldView stereopair DEMs (2-3m resolution), are available for use in orthorectification. This study examines three heavily crevassed areas along the GrIT traverse, as identified in 2015 and 2016 imagery. We extracted elevation profiles along the GrIT route from each of the three DEMs: GIMP, ArcticDEM, and WorldView stereopair mosaic. Results show the coarser GIMP data deviating significantly from the ArcticDEM and WorldView data, at points by up to 80m, which is seen as offset of features in plan view.
In-situ Ground Penetrating Radar (GPR) surveys of crevasse crossings allow for evaluation of geopositional accuracy of each resulting orthorectified photo and a quantitative analysis of plan view offset.
An effective approach for gap-filling continental scale remotely sensed time-series
Weiss, Daniel J.; Atkinson, Peter M.; Bhatt, Samir; Mappin, Bonnie; Hay, Simon I.; Gething, Peter W.
2014-01-01
The archives of imagery and modeled data products derived from remote sensing programs with high temporal resolution provide powerful resources for characterizing inter- and intra-annual environmental dynamics. The impressive depth of available time-series from such missions (e.g., MODIS and AVHRR) affords new opportunities for improving data usability by leveraging spatial and temporal information inherent to longitudinal geospatial datasets. In this research we develop an approach for filling gaps in imagery time-series that result primarily from cloud cover, which is particularly problematic in forested equatorial regions. Our approach consists of two complementary gap-filling algorithms and a variety of run-time options that allow users to balance the competing demands of model accuracy and processing time. We applied the gap-filling methodology to MODIS Enhanced Vegetation Index (EVI) and daytime and nighttime Land Surface Temperature (LST) datasets for the African continent for 2000–2012, at a 1 km spatial resolution and an 8-day temporal resolution. We validated the method by introducing and filling artificial gaps, and then comparing the original data with model predictions. Our approach achieved R2 values above 0.87 even for pixels within 500 km wide introduced gaps. Furthermore, the structure of our approach allows estimation of the error associated with each gap-filled pixel based on the distance to the non-gap pixels used to model its fill value, thus providing a mechanism for including uncertainty associated with the gap-filling process in downstream applications of the resulting datasets. PMID:25642100
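The validation strategy described (withhold known values as an artificial gap, fill, and compare against the withheld truth) can be illustrated with a minimal one-dimensional temporal interpolator; the paper's actual algorithms also exploit spatial neighbours, so this sketch is a deliberate simplification with made-up EVI values:

```python
# Sketch of gap-fill validation: withhold known values as an artificial
# gap, fill by linear interpolation in time, then score the fill against
# the withheld truth. The real method also uses spatial neighbours; this
# 1-D temporal version with invented EVI values is a simplification.

def fill_gaps(series):
    """Linearly interpolate interior None runs in a time series."""
    out = list(series)
    known = [i for i, v in enumerate(out) if v is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            out[i] = out[a] + t * (out[b] - out[a])
    return out

def r_squared(truth, pred):
    """Coefficient of determination of predictions vs withheld truth."""
    mean = sum(truth) / len(truth)
    ss_res = sum((t - p) ** 2 for t, p in zip(truth, pred))
    ss_tot = sum((t - mean) ** 2 for t in truth)
    return 1.0 - ss_res / ss_tot

evi = [0.30, 0.34, 0.39, 0.43, 0.46, 0.50, 0.53, 0.56]  # 8-day composites
gap = list(evi)
gap[2:5] = [None, None, None]          # introduce an artificial gap
filled = fill_gaps(gap)
print(round(r_squared(evi[2:5], filled[2:5]), 3))
```

The per-pixel error estimate described in the abstract would then be keyed to how far each filled index sits from the nearest non-gap observation.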
River predisposition to ice jams: a simplified geospatial model
NASA Astrophysics Data System (ADS)
De Munck, Stéphane; Gauthier, Yves; Bernier, Monique; Chokmani, Karem; Légaré, Serge
2017-07-01
Floods resulting from river ice jams pose a great risk to many riverside municipalities in Canada. The location of an ice jam is mainly influenced by channel morphology. The goal of this work was therefore to develop a simplified geospatial model to estimate the predisposition of a river channel to ice jams. Rather than predicting the timing of river ice breakup, the main question here was to predict where broken ice is susceptible to jam based on the river's geomorphological characteristics. Thus, six parameters identified in the literature as potential causes of ice jams were initially selected: presence of an island, narrowing of the channel, high sinuosity, presence of a bridge, confluence of rivers, and slope break. A GIS-based tool was used to generate the aforementioned factors over regularly spaced segments along the entire channel using available geospatial data. An ice jam predisposition index (IJPI) was calculated by combining the weighted optimal factors. Three Canadian rivers (province of Québec) were chosen as test sites. The resulting maps were assessed against historical observations and local knowledge. Results show that 77 % of the observed ice jam sites on record occurred in river sections that the model considered as having high or medium predisposition. This leaves 23 % false negatives (missed occurrences). Between 7 and 11 % of the highly predisposed river sections did not have an ice jam on record (false positives). Results, limitations, and potential improvements are discussed.
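The weighted combination behind an index like the IJPI can be sketched as follows; the factor weights, segment scores, and class thresholds are illustrative assumptions, not the calibrated values from the study:

```python
# Sketch of a weighted-factor predisposition index in the spirit of the
# IJPI: each river segment gets scores in [0, 1] for morphological
# factors, combined with weights into an index and classed high/medium/
# low. Weights and thresholds here are illustrative assumptions.

WEIGHTS = {
    "island": 0.20, "narrowing": 0.25, "sinuosity": 0.15,
    "bridge": 0.10, "confluence": 0.10, "slope_break": 0.20,
}

def ijpi(factors):
    """Weighted sum of factor scores for one river segment."""
    return sum(WEIGHTS[k] * factors.get(k, 0.0) for k in WEIGHTS)

def predisposition_class(score, high=0.5, medium=0.25):
    if score >= high:
        return "high"
    if score >= medium:
        return "medium"
    return "low"

# Segment with an island, a marked narrowing, and a partial slope break:
segment = {"island": 1.0, "narrowing": 1.0, "slope_break": 0.5}
score = ijpi(segment)                      # 0.20 + 0.25 + 0.10 = 0.55
print(predisposition_class(score))         # -> high
```

Validation then reduces to cross-tabulating these classes against recorded jam sites, as the abstract does with its false-negative and false-positive rates.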
NASA Astrophysics Data System (ADS)
Une, Hiroshi; Nakano, Takayuki
2018-05-01
Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can have a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses against disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing an ever more vital role in all stages of disaster risk management and response. Acknowledging this vital role, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly underscores the importance of utilizing geospatial information technology for disaster risk reduction. This presentation aims to report recent practical applications of geospatial information technology for disaster risk management and response.
Bim and Gis: when Parametric Modeling Meets Geospatial Data
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.
2017-12-01
Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software are able to perform advanced geospatial analyses, but they lack several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings have limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear as complementary solutions, notwithstanding research work is currently under development to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper will also demonstrate that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
Wood, Nathan; Jones, Jeanne; Schelling, John; Schmidtlein, Mathew
2014-01-01
Tsunami vertical-evacuation (TVE) refuges can be effective risk-reduction options for coastal communities with local tsunami threats but no accessible high ground for evacuations. Deciding where to locate TVE refuges is a complex risk-management question, given the potential for conflicting stakeholder priorities and multiple, suitable sites. We use the coastal community of Ocean Shores (Washington, USA) and the local tsunami threat posed by Cascadia subduction zone earthquakes as a case study to explore the use of geospatial, multi-criteria decision analysis for framing the locational problem of TVE siting. We demonstrate a mixed-methods approach that uses potential TVE sites identified at community workshops, geospatial analysis to model changes in pedestrian evacuation times for TVE options, and statistical analysis to develop metrics for comparing population tradeoffs and to examine influences in decision making. Results demonstrate that no one TVE site can save all at-risk individuals in the community and each site provides varying benefits to residents, employees, customers at local stores, tourists at public venues, children at schools, and other vulnerable populations. The benefit of some proposed sites varies depending on whether or not nearby bridges will be functioning after the preceding earthquake. Relative rankings of the TVE sites are fairly stable under various criteria-weighting scenarios but do vary considerably when comparing strategies to exclusively protect tourists or residents. The proposed geospatial framework can serve as an analytical foundation for future TVE siting discussions.
Roberts-Ashby, Tina; Brandon N. Ashby,
2016-01-01
This paper demonstrates geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, and was applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and volume of CO2 that can be stored. This study also shows that incorporating spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources when compared to applying uniform values across the study area derived from small datasets, like many assessment methodologies. Geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, which was largely attributed to the fact that detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and areas of high porosity and thick net-porous intervals were incorporated into the model, likely increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also provides the ability to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
Visa: AN Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data
NASA Astrophysics Data System (ADS)
Hong, J. H.; Su, Y. T.
2016-06-01
With the fast growth of internet-based sharing mechanism and OpenGIS technology, users nowadays enjoy the luxury to quickly locate and access a variety of geospatial data for the tasks at hands. While this sharing innovation tremendously expand the possibility of application and reduce the development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue the next generation of GIS-based environment, regardless internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refer to four different types of visual aids that can be respectively used for addressing analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion, where the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences of temporality and quality of the geospatial resources were designed and the transformation of analyzed results into visual aids were automatically executed. It successfully presents a new way for bridging the communication gaps between systems and users. GIS has been long seen as a powerful integration tool, but its achievements would be highly restricted if it fails to provide a friendly and correct working platform.
NASA Technical Reports Server (NTRS)
Lyle, Stacey D.
2009-01-01
A software package that has been designed to allow authentication for determining if the rover(s) is/are within a set of boundaries or a specific area to access critical geospatial information by using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real-time. The advantage lies in that the system only allows those with designated geospatial boundaries or areas into the server.
Maya Quinones; William Gould; Carlos D. Rodriguez-Pedraza
2007-01-01
This report documents the type and source of geospatial data available for Haiti. It was compiled to serve as a resource for geographic information system (GIS)-based land management and planning. It will be useful for conservation planning, reforestation efforts, and agricultural extension projects. Our study indicates that there is a great deal of geospatial...
2014-05-22
attempted to respond to the advances in technology and the growing power of geographical information system (GIS) tools. However, the doctrine...Geospatial intelligence (GEOINT), Geographical information systems (GIS) tools, Humanitarian Assistance/Disaster Relief (HA/DR), 2010 Haiti Earthquake...Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools
2009-06-08
CRS Report for Congress Prepared for Members and Committees of Congress Geospatial Information and Geographic Information Systems (GIS...Geographic Information Systems (GIS): Current Issues and Future Challenges 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6...PAGE unclassified Standard Form 298 (Rev. 8-98) Prescribed by ANSI Std Z39-18 Geospatial Information and Geographic Information Systems (GIS
Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies
NASA Astrophysics Data System (ADS)
Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.
2007-12-01
The value of real-time hydrologic data dissemination including river stage, streamflow, and precipitation for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts on providing unified access to hydrological data have concentrated on creating new SOAP-based web services and common data format (e.g. WaterML and Observation Data Model) for users to access the data (e.g. HIS and HydroSeek). Geospatial Web technology including OGC sensor web enablement (SWE), GeoRSS, Geo tags, Geospatial browsers such as Google Earth and Microsoft Virtual Earth and other location-based service tools provides possibilities for us to interact with a digital watershed in near-real-time. OGC SWE proposes a revolutionary concept towards a web-connected/controllable sensor networks. However, these efforts have not provided the capability to allow dynamic data integration/fusion among heterogeneous sources, data filtering and support for workflows or domain specific applications where both push and pull mode of retrieving data may be needed. We propose a light weight integration framework by extending SWE with open source Enterprise Service Bus (e.g., mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such framework where multi-agencies" sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g. Microsoft Virtual Earth). 
This is a collaborative project among NCSA, USGS Illinois Water Science Center, Computer Science Department at UIUC funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.
NASA Astrophysics Data System (ADS)
Arozarena, A.; Villa, G.; Valcárcel, N.; Pérez, B.
2016-06-01
Remote sensing satellites, together with aerial and terrestrial platforms (mobile and fixed), produce nowadays huge amounts of data coming from a wide variety of sensors. These datasets serve as main data sources for the extraction of Geospatial Reference Information (GRI), constituting the "skeleton" of any Spatial Data Infrastructure (SDI). Since very different situations can be found around the world in terms of geographic information production and management, the generation of global GRI datasets seems extremely challenging. Remotely sensed data, due to its wide availability nowadays, is able to provide fundamental sources for any production or management system present in different countries. After several automatic and semiautomatic processes including ancillary data, the extracted geospatial information is ready to become part of the GRI databases. In order to optimize these data flows for the production of high quality geospatial information and to promote its use to address global challenges several initiatives at national, continental and global levels have been put in place, such as European INSPIRE initiative and Copernicus Programme, and global initiatives such as the Group on Earth Observation/Global Earth Observation System of Systems (GEO/GEOSS) and United Nations Global Geospatial Information Management (UN-GGIM). These workflows are established mainly by public organizations, with the adequate institutional arrangements at national, regional or global levels. Other initiatives, such as Volunteered Geographic Information (VGI), on the other hand may contribute to maintain the GRI databases updated. Remotely sensed data hence becomes one of the main pillars underpinning the establishment of a global SDI, as those datasets will be used by public agencies or institutions as well as by volunteers to extract the required spatial information that in turn will feed the GRI databases. 
This paper intends to provide an example of how institutional arrangements and cooperative production systems can be set up at any territorial level in order to exploit remotely sensed data in the most intensive manner, taking advantage of all its potential.
A novel algorithm for fully automated mapping of geospatial ontologies
NASA Astrophysics Data System (ADS)
Chaabane, Sana; Jaziri, Wassim
2018-01-01
Geospatial information is collected from different sources thus making spatial ontologies, built for the same geographic domain, heterogeneous; therefore, different and heterogeneous conceptualizations may coexist. Ontology integrating helps creating a common repository of the geospatial ontology and allows removing the heterogeneities between the existing ontologies. Ontology mapping is a process used in ontologies integrating and consists in finding correspondences between the source ontologies. This paper deals with the "mapping" process of geospatial ontologies which consist in applying an automated algorithm in finding the correspondences between concepts referring to the definitions of matching relationships. The proposed algorithm called "geographic ontologies mapping algorithm" defines three types of mapping: semantic, topological and spatial.
Data to Decisions: Valuing the Societal Benefit of Geospatial Information
NASA Astrophysics Data System (ADS)
Pearlman, F.; Kain, D.
2016-12-01
The March 10-11, 2016 GEOValue workshop on "Data to Decisions" was aimed at creating a framework for identification and implementation of best practices that capture the societal value of geospatial information for both public and private uses. The end-to-end information flow starts with the earth observation and data acquisition systems, includes the full range of processes from geospatial information to decisions support systems, and concludes with the end user. Case studies, which will be described in this presentation, were identified for a range of applications. The goal was to demonstrate and compare approaches to valuation of geospatial information and forge a path forward for research that leads to standards of practice.
Global Fiducials Program Imagery: New Opportunities for Geospatial Research, Outreach, and Education
NASA Astrophysics Data System (ADS)
Price, S. D.
2012-12-01
MOLNIA, Bruce F., PRICE, Susan D. and, KING, Stephen E., U.S. Geological Survey (USGS), 562 National Center, Reston, VA 20192, sprice@usgs.gov The Civil Applications Committee (CAC), operated by the U.S. Geological Survey (USGS), is the Federal interagency committee that facilitates Federal civil agency access to U.S. National Systems space-based electro-optical (EO) imagery for natural disaster response; global change investigations; ecosystem monitoring; mapping, charting, and geodesy; and related topics. The CAC's Global Fiducials Program (GFP) has overseen the systematic collection of high-resolution imagery to provide geospatial data time series spanning a decade or more at carefully selected sites to study and monitor changes, and to facilitate a comprehensive understanding of dynamic and sensitive areas of our planet. Since 2008, more than 4,500 one-meter resolution EO images which comprise time series from 85 GFP sites have been released for unrestricted public use. Initial site selections were made by Federal and academic scientists based on each site's unique history, susceptibility, or environmental value. For each site, collection strategies were carefully defined to maximize information extraction capabilities. This consistency enhances our ability to understand Earth's dynamic processes and long-term trends. Individual time series focus on Arctic sea ice change; temperate glacier behavior; mid-continent wetland dynamics; barrier island response to hurricanes; coastline evolution; wildland fire recovery; Long-Term Ecological Resource (LTER) site processes; and many other topics. The images are available from a USGS website at no cost, in an orthorectified GeoTIFF format with supporting metadata, making them ideal for use in Earth science education and GIS projects. New on-line tools provide enhanced analysis of these time-series imagery. 
For additional information go to http://gfp.usgs.gov or http://gfl.usgs.gov.Bering Glacier is the largest and longest glacier in continental North America, with a length of 190 km, a width of 40 km, and an area of about 5,000 km2. In the nine years between the 1996 image and the 2005 image, parts of the terminus retreated by more than 5 km and thinned by as much as 100 m. Long-term monitoring of Bering Glacier will enable scientists to better understand the dynamics of surging glaciers as well as how changing Alaska climate is affecting temperate glacier environments.
Bauermeister, José A; Connochie, Daniel; Eaton, Lisa; Demers, Michele; Stephenson, Rob
Young men who have sex with men (YMSM), particularly YMSM who are racial/ethnic minorities, are disproportionately affected by the human immunodeficiency virus (HIV) epidemic in the United States. These HIV disparities have been linked to demographic, social, and physical geospatial characteristics. The objective of this scoping review was to summarize the existing evidence from multilevel studies examining how geospatial characteristics are associated with HIV prevention and care outcomes among YMSM populations. Our literature search uncovered 126 peer-reviewed articles, of which 17 were eligible for inclusion based on our review criteria. Nine studies examined geospatial characteristics as predictors of HIV prevention outcomes. Nine of the 17 studies reported HIV care outcomes. From the synthesis regarding the current state of research around geospatial correlates of behavioral and biological HIV risk, we propose strategies to move the field forward in order to inform the design of future multilevel research and intervention studies for this population.
MapFactory - Towards a mapping design pattern for big geospatial data
NASA Astrophysics Data System (ADS)
Rautenbach, Victoria; Coetzee, Serena
2018-05-01
With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
NASA Astrophysics Data System (ADS)
Johnson, A.
2010-12-01
Maps, spatial and temporal data and their use in analysis and visualization are integral components for studies in the geosciences. With the emergence of geospatial technology (Geographic Information Systems (GIS), remote sensing and imagery, Global Positioning Systems (GPS) and mobile technologies) scientists and the geosciences user community are now able to more easily accessed and share data, analyze their data and present their results. Educators are also incorporating geospatial technology into their geosciences programs by including an awareness of the technology in introductory courses to advanced courses exploring the capabilities to help answer complex questions in the geosciences. This paper will look how the new Geospatial Technology Competency Model from the Department of Labor can help ensure that geosciences programs address the skills and competencies identified by the workforce for geospatial technology as well as look at new tools created by the GeoTech Center to help do self and program assessments.
Geomatics Education: Need Assessment
NASA Astrophysics Data System (ADS)
Vyas, A.
2014-11-01
Education system is divided in to two classes: formal and informal. Formal education establishes the basis of theory and practical learning whereas informal education is largely self-learning, learning from real world projects. Generally science and technology streams require formal method of education. The social and related aspects can be taught through the other methods. Education is a media through which the foundation of the knowledge and skill is built. The statistics reveals the increase in the trend of the literate population. This may be accounted due to the level of urbanization and migration to the cities in search for the "white-collar jobs". As a result, a shift in the employment structure is observed from a primary sector to a secondary and tertiary sector. Thomas Friedman in his book `The World is Flat' quotes the impact of globalization on adaptation of science and technology, the world has become large to tiny. One of the technologies to mention here is geospatial technology. With the advancement in the satellite remote sensing, geographical information system, global positioning system, the database management system has become important subject areas. The countries are accounting hugh budget on the space technology, which includes education, training and research. Today many developing countries do not have base maps, they are lacking in the systemic data and record keeping, which are essential for governance, decision making and other development purpose. There is no trained manpower available. There is no standard hardware and software identified. An imbalance is observed when the government is promoting the use of geospatial technology, there is no trained manpower nor the availability of the experts to review the accurateness of the spatial data developed. There are very few universities which impart the degree level education, there are very few trained faculty members who give standard education, there exists a lack of standard syllabus. 
On the other hand, the industry requires high skilled manpower, high experienced manpower. This is a low equilibrium situation. Since the need is enhancing day by day, the shortage of the skilled manpower is increasing, the need of the geomatics education emerges. This paper researches on the need assessment of the education in geospatial specialization. It emphasises on the challenges and issues prevail in geospatial education and in the specialized fields of remote sensing and GIS. This paper analyse the need assessment through all the three actors: government, geospatial industry and education institutions.
GeoBrain Computational Cyber-laboratory for Earth Science Studies
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2009-12-01
Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students have met two major barriers that prevent them from being effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantages of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. 
Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as, distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons-learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet wide-range needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
Exploitation of commercial remote sensing images: reality ignored?
NASA Astrophysics Data System (ADS)
Allen, Paul C.
1999-12-01
The remote sensing market is on the verge of being awash in commercial high-resolution images. Market estimates are based on the growing numbers of planned commercial remote sensing electro-optical, radar, and hyperspectral satellites and aircraft. EarthWatch, Space Imaging, SPOT, and RDL among others are all working towards launch and service of one to five meter panchromatic or radar-imaging satellites. Additionally, new advances in digital air surveillance and reconnaissance systems, both manned and unmanned, are also expected to expand the geospatial customer base. Regardless of platform, image type, or location, each system promises images with some combination of increased resolution, greater spectral coverage, reduced turn-around time (request-to- delivery), and/or reduced image cost. For the most part, however, market estimates for these new sources focus on the raw digital images (from collection to the ground station) while ignoring the requirements for a processing and exploitation infrastructure comprised of exploitation tools, exploitation training, library systems, and image management systems. From this it would appear the commercial imaging community has failed to learn the hard lessons of national government experience choosing instead to ignore reality and replicate the bias of collection over processing and exploitation. While this trend may be not impact the small quantity users that exist today it will certainly adversely affect the mid- to large-sized users of the future.
Modelling of human exposure to air pollution in the urban environment: a GPS-based approach.
Dias, Daniela; Tchepel, Oxana
2014-03-01
The main objective of this work was the development of a new modelling tool for quantification of human exposure to traffic-related air pollution within distinct microenvironments by using a novel approach for trajectory analysis of the individuals. For this purpose, mobile phones with Global Positioning System technology have been used to collect daily trajectories of the individuals with higher temporal resolution and a trajectory data mining, and geo-spatial analysis algorithm was developed and implemented within a Geographical Information System to obtain time-activity patterns. These data were combined with air pollutant concentrations estimated for several microenvironments. In addition to outdoor, pollutant concentrations in distinct indoor microenvironments are characterised using a probabilistic approach. An example of the application for PM2.5 is presented and discussed. The results obtained for daily average individual exposure correspond to a mean value of 10.6 and 6.0-16.4 μg m(-3) in terms of 5th-95th percentiles. Analysis of the results shows that the use of point air quality measurements for exposure assessment will not explain the intra- and inter-variability of individuals' exposure levels. The methodology developed and implemented in this work provides time-sequence of the exposure events thus making possible association of the exposure with the individual activities and delivers main statistics on individual's air pollution exposure with high spatio-temporal resolution.
Arctic shipping emissions inventories and future scenarios
NASA Astrophysics Data System (ADS)
Corbett, J. J.; Lack, D. A.; Winebrake, J. J.; Harder, S.; Silberman, J. A.; Gold, M.
2010-10-01
This paper presents 5 km×5 km Arctic emissions inventories of important greenhouse gases, black carbon and other pollutants under existing and future (2050) scenarios that account for growth of shipping in the region, potential diversion traffic through emerging routes, and possible emissions control measures. These high-resolution, geospatial emissions inventories for shipping can be used to evaluate Arctic climate sensitivity to black carbon (a short-lived climate forcing pollutant especially effective in accelerating the melting of ice and snow), aerosols, and gaseous emissions including carbon dioxide. We quantify ship emissions scenarios which are expected to increase as declining sea ice coverage due to climate change allows for increased shipping activity in the Arctic. A first-order calculation of global warming potential due to 2030 emissions in the high-growth scenario suggests that short-lived forcing of ~4.5 gigagrams of black carbon from Arctic shipping may increase global warming potential due to Arctic ships' CO2 emissions (~42 000 gigagrams) by some 17% to 78%. The paper also presents maximum feasible reduction scenarios for black carbon in particular. These emissions reduction scenarios will enable scientists and policymakers to evaluate the efficacy and benefits of technological controls for black carbon, and other pollutants from ships.
2009-06-01
AUTOMATED GEOSPATIAL TOOLS : AGILITY IN COMPLEX PLANNING Primary Topic: Track 5 – Experimentation and Analysis Walter A. Powell [STUDENT] - GMU...TITLE AND SUBTITLE Results of an Experimental Exploration of Advanced Automated Geospatial Tools : Agility in Complex Planning 5a. CONTRACT NUMBER...Std Z39-18 Abstract Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet
Strategic Model for Future Geospatial Education.
1998-05-18
There appears to be only one benefit to doing nothing, as option one dictates: there are no up-front costs to the government. ...the government can ensure that US industry and academia benefit from decades of geospatial information expertise. Industry and academia will be... or militarily unique topics. In summary, option two provides more benefits for both the government and the geospatial information community as a
Restful Implementation of Catalogue Service for Geospatial Data Provenance
NASA Astrophysics Data System (ADS)
Jiang, L. C.; Yue, P.; Lu, X. C.
2013-10-01
Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful-style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
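A minimal sketch of such a resource request dispatcher, in the spirit of the REST Converter, might look as follows; the route pattern and handler names are hypothetical, not the paper's actual interface.

```python
import re

class RestDispatcher:
    """Maps RESTful resource paths to handler callables."""
    def __init__(self):
        self.routes = []  # list of (compiled pattern, handler) pairs

    def register(self, pattern, handler):
        self.routes.append((re.compile(pattern + r"$"), handler))

    def dispatch(self, method, path):
        for pattern, handler in self.routes:
            match = pattern.match(path)
            if match:
                return handler(method, **match.groupdict())
        return 404, "no such resource"

def provenance_handler(method, record_id):
    # A real converter would translate this call into a CSW request
    # (e.g. GetRecordById) against the legacy catalogue service.
    if method == "GET":
        return 200, f"provenance record {record_id}"
    return 405, "method not allowed"

dispatcher = RestDispatcher()
dispatcher.register(r"/provenance/(?P<record_id>\w+)", provenance_handler)
status, body = dispatcher.dispatch("GET", "/provenance/abc123")
```

Each of the six resource handlers in the paper would plug into such a dispatcher the same way, keeping the legacy CSW untouched behind the middleware.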
New Geodetic Infrastructure for Australia: The NCRIS / AuScope Geospatial Component
NASA Astrophysics Data System (ADS)
Tregoning, P.; Watson, C. S.; Coleman, R.; Johnston, G.; Lovell, J.; Dickey, J.; Featherstone, W. E.; Rizos, C.; Higgins, M.; Priebbenow, R.
2009-12-01
In November 2006, the Australian Federal Government announced AU$15.8M in funding for geospatial research infrastructure through the National Collaborative Research Infrastructure Strategy (NCRIS). Funded within a broader capability area titled ‘Structure and Evolution of the Australian Continent’, NCRIS has provided a significant investment across Earth imaging, geochemistry, numerical simulation and modelling, the development of a virtual core library, and geospatial infrastructure. Known collectively as AuScope (www.auscope.org.au), this capability area has brought together Australia's leading Earth scientists to decide upon the most pressing scientific issues and infrastructure needs for studying Earth systems and their impact on the Australian continent. Importantly, and at the same time, the investment in geospatial infrastructure offers the opportunity to raise Australian geodetic science capability to the highest international level into the future. The geospatial component of AuScope builds on the AU$15.8M of direct funding through the NCRIS process with significant in-kind and co-investment from universities and State/Territory and Federal government departments. The infrastructure to be acquired includes an FG5 absolute gravimeter, three gPhone relative gravimeters, three 12.1 m radio telescopes for geodetic VLBI, a continent-wide network of continuously operating geodetic-quality GNSS receivers, a trial of a mobile SLR system, and access to updated cluster computing facilities. We present an overview of the AuScope geospatial capability, review the current status of the infrastructure procurement, and discuss some examples of the scientific research that will utilise the new geospatial infrastructure.
NASA Astrophysics Data System (ADS)
Hasyim, Fuad; Subagio, Habib; Darmawan, Mulyanto
2016-06-01
The preparation of spatial planning documents requires basic geospatial information and thematic accuracy. Recently these issues have become important because spatial planning maps are an inseparable attachment of the regional act draft on spatial planning (PERDA). The geospatial information needed in the preparation of spatial planning maps can be divided into two major groups: (i) basic geospatial information (IGD), consisting of Indonesian topographic maps (RBI), coastal and marine environmental maps (LPI), and the geodetic control network, and (ii) thematic geospatial information (IGT). Currently, most local governments in Indonesia have not finished their regulation drafts on spatial planning due to several constraints, including technical ones. Constraints in mapping for spatial planning include the availability of large-scale basic geospatial information, the availability of mapping guidelines, and human resources. The ideal conditions to be achieved for spatial planning maps are: (i) the availability of updated geospatial information at the scale needed for spatial planning maps, (ii) mapping guidelines for spatial planning to support local governments in completing their PERDA, and (iii) capacity building of local government human resources to complete spatial planning maps. The OMP strategies formulated to achieve these conditions are: (i) accelerating IGD at scales of 1:50,000, 1:25,000 and 1:5,000, (ii) accelerating the mapping and integration of thematic geospatial information (IGT) through stocktaking of availability and mapping guidelines, (iii) developing mapping guidelines and disseminating spatial utilization, and (iv) training human resources in mapping technology.
An Automated End-to-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services
NASA Astrophysics Data System (ADS)
Shah, M.; Verma, Y.; Nandakumar, R.
2012-07-01
Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain. Geospatial Web services address this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to a service requester based on QoS.
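As an illustration of quality-driven selection, a minimal sketch might normalize each QoS attribute, weight it by user preference, and rank candidate services. The services, attributes, and weights below are invented for illustration; the paper's multi-agent architecture is considerably more elaborate.

```python
# Hypothetical candidate services with non-functional (QoS) attributes
services = {
    "WPS-A": {"response_s": 2.0, "availability": 0.99, "cost": 5.0},
    "WPS-B": {"response_s": 0.8, "availability": 0.95, "cost": 9.0},
    "WPS-C": {"response_s": 1.2, "availability": 0.97, "cost": 4.0},
}
# Positive weight: higher is better; negative weight: lower is better.
weights = {"response_s": -0.4, "availability": 0.4, "cost": -0.2}

def qos_score(svc):
    score = 0.0
    for attr, w in weights.items():
        vals = [s[attr] for s in services.values()]
        lo, hi = min(vals), max(vals)
        # Min-max normalize the attribute across all candidates
        norm = (svc[attr] - lo) / (hi - lo) if hi > lo else 0.0
        # Reward high values for positive weights, low values otherwise
        score += w * norm if w > 0 else -w * (1.0 - norm)
    return score

best = max(services, key=lambda name: qos_score(services[name]))
```

With these weights the cheap, reasonably fast and available service wins; changing the end user's weights changes the best-fit choice, which is exactly what a QoS broker negotiates.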
Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pabian, Frank V
2012-08-14
This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest, which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad area searches for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge.
They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (i.e., Google's SketchUp 6, newly available in 2007) when used in conjunction with these digital globes can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments including walk-arounds or fly-arounds and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).
NASA Astrophysics Data System (ADS)
Goodrich, D. C.; Clifford, T. J.; Guertin, D. P.; Sheppard, B. S.; Barlow, J. E.; Korgaonkar, Y.; Burns, I. S.; Unkrich, C. C.
2016-12-01
Wildfire disasters are common throughout the western US. While many feel fire suppression is the largest cost of wildfires, case studies note that rehabilitation costs often equal or greatly exceed suppression costs. Using geospatial data sets and post-fire burn severity products, coupled with the Automated Geospatial Watershed Assessment tool (AGWA - www.tucson.ars.ag.gov/agwa), Department of the Interior Burned Area Emergency Response (BAER) teams can rapidly analyze and identify at-risk areas to target rehabilitation efforts. AGWA employs nationally available geospatial elevation, soils, and land cover data to parameterize the KINEROS2 hydrology and erosion model. A pre-fire watershed simulation can be done prior to BAER deployment using design storms. As soon as the satellite-derived Burned Area Reflectance Classification (BARC) map is obtained, a post-fire watershed simulation using the same storm is conducted. The pre- and post-fire simulations can be spatially differenced in the GIS for rapid identification of areas at high risk of erosion or flooding. This difference map is used by BAER teams to prioritize field observations and in turn produce a final burn severity map that is used in AGWA/KINEROS2 simulations to provide report-ready results. The 2013 Elk Wildfire Complex, which burned over 52,600 ha east of Boise, Idaho, provides a tangible example of how BAER experts combined AGWA and geospatial data to achieve substantial rehabilitation cost savings. The BAER team initially identified approximately 6,500 burned ha for rehabilitation. The team then used the AGWA pre- and post-fire watershed simulation results, accessibility constraints, and land slope conditions in an interactive process to locate burned areas that posed the greatest threat to downstream values-at-risk. The group combined the treatable area, field observations, and the spatial results from AGWA to target seed and mulch treatments that most effectively reduced the threats.
Using this process, the BAER team reduced the treatable area from the original 6,500 ha (about 16,000 acres) to between 800 and 1,600 ha, depending on the selected alternative. The final awarded contract amounted to about $1,480/ha; therefore, a total savings of $7.2-$8.4 million was realized for mulch treatment alone.
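The reported savings are consistent with simple arithmetic on the quoted figures, assuming the reduction was from roughly 6,500 treatable hectares:

```python
original_ha = 6_500   # treatable area initially identified by the BAER team
rate_per_ha = 1_480   # final awarded mulch contract rate, $/ha

# Avoided cost for each alternative's treated area
savings = {treated_ha: (original_ha - treated_ha) * rate_per_ha
           for treated_ha in (1_600, 800)}
# savings spans about $7.25M to $8.44M, matching the reported range
```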
NASA Astrophysics Data System (ADS)
Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.
2012-12-01
Multitemporal data acquired by modern mapping technologies provide unique insights into the processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as a series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
Figure 1. Isosurfaces representing the evolution of the shoreline and the z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
Ki, Seo Jin; Ray, Chittaranjan; Hantush, Mohamed M
2015-06-15
A large-scale leaching assessment tool not only illustrates soil (or groundwater) vulnerability in unmonitored areas, but also can identify areas of potential concern for agrochemical contamination. This study describes how the statewide leaching tool in Hawaii, recently modified for use with pesticides and volatile organic compounds, can be extended to a national assessment of soil vulnerability ratings. For this study, the tool was updated by extending the soil and recharge maps to cover the lower 48 states of the United States (US). In addition, digital maps of annual pesticide use (at a national scale) as well as detailed soil properties and monthly recharge rates (at high spatial and temporal resolutions) were used to examine variations in the leaching (loads) of pesticides for the upper soil horizons. Results showed that the extended tool successfully delineated areas of high to low vulnerability to selected pesticides. The leaching potential was high for picloram, medium for simazine, and low to negligible for 2,4-D and glyphosate. The mass loadings of picloram moving below 0.5 m depth increased greatly in the northwestern and central US, where its extensive use on agricultural crops was recorded. However, in addition to the amount of pesticide used, the annual leaching load of atrazine was also affected by other factors that determine intrinsic aquifer vulnerability, such as soil and recharge properties. The spatial and temporal resolutions of the digital maps had a great effect on the leaching potential of pesticides, requiring a trade-off between data availability and accuracy. Potential applications of this tool include rapid, large-scale vulnerability assessments for emerging contaminants that are hard to quantify directly through vadose zone models due to a lack of full environmental data.
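Leaching tools of this kind commonly rank pesticides with a Rao-type attenuation factor: retardation from sorption to soil organic carbon plus first-order decay over the travel time to an assessment depth. A hedged sketch follows, with illustrative parameter values rather than the tool's actual inputs.

```python
import math

def attenuation_factor(depth_m, recharge_m_yr, theta_fc, bulk_density,
                       koc_l_kg, foc, half_life_days):
    """Fraction of applied pesticide expected to leach past depth_m
    (Rao-style index; this parameterization is illustrative)."""
    # Retardation factor from sorption to soil organic carbon
    rf = 1.0 + bulk_density * koc_l_kg * foc / theta_fc
    # Travel time (years) to the assessment depth
    travel_yr = depth_m * theta_fc * rf / recharge_m_yr
    # First-order decay during transit
    return math.exp(-math.log(2) * travel_yr * 365.0 / half_life_days)

# Mobile, persistent chemical (picloram-like) vs a strongly sorbed one
af_mobile = attenuation_factor(0.5, 0.3, 0.3, 1.4, 20, 0.01, 90)
af_sorbed = attenuation_factor(0.5, 0.3, 0.3, 1.4, 20_000, 0.01, 30)
```

Mapping such a factor cell by cell over gridded soil and recharge layers is what turns the point formula into the statewide (or national) vulnerability surface described above.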
Development of a Florida Coastal Mapping Program Through Local and Regional Coordination
NASA Astrophysics Data System (ADS)
Hapke, C. J.; Kramer, P. A.; Fetherston-Resch, E.; Baumstark, R.
2017-12-01
The State of Florida has the longest coastline in the contiguous United States (2,170 km). The coastal zone is heavily populated and contains 1,900 km of sandy beaches that support economically important recreation and tourism. Florida's waters also host important marine mineral resources, unique ecosystems, and the largest number of recreational boats and saltwater fishermen in the country. There is increasing need and demand for high resolution data of the coast and adjacent seafloor for resource and habitat mapping, understanding coastal vulnerability, evaluating performance of restoration projects, and many other coastal and marine spatial planning efforts. The Florida Coastal Mapping Program (FCMP), initiated in 2017 as a regional collaboration between four federal and three state agencies, has goals of establishing the priorities for high resolution seafloor mapping of Florida's coastal environment, and developing a strategy for leveraging funds to support mapping priorities set by stakeholders. We began by creating a comprehensive digital inventory of existing data (collected by government, the private sector, and academia) from 1 kilometer inland to the 200 meter isobath for a statewide geospatial database and gap analysis. Data types include coastal topography, bathymetry, and acoustic data such as sidescan sonar and subbottom profiles. Next, we will develop appropriate proposals and legislative budget requests in response to opportunities to collect priority data in high priority areas. Data collection will be undertaken by a combination of state and federal agencies. The FCMP effort will provide the critical baseline information that is required for characterizing changes to fragile ecosystems, assessing marine resources, and forecasting the impacts on coastal infrastructure and recreational beaches from future storms and sea-level rise.
Quantarctica: A Unique, Open, Standalone GIS Package for Antarctic Research and Education
NASA Astrophysics Data System (ADS)
Roth, G.; Matsuoka, K.; Skoglund, A.; Melvaer, Y.; Tronstad, S.
2016-12-01
The Norwegian Polar Institute has developed Quantarctica, an open GIS package for use by the international Antarctic community. Quantarctica includes a wide range of cartographic basemap layers, geophysical and glaciological datasets, and satellite imagery in standardized file formats with a consistent Antarctic map projection and customized layer and labeling styles for quick, effective cartography. Quantarctica's strengths as an open science platform lie in 1) The complete, ready-to-use data package which includes full-resolution, original-quality vector and raster data, 2) A policy for freely-redistributable and modifiable data including all metadata and citations, and 3) QGIS, a free, full-featured, modular, offline-capable open-source GIS suite with a rapid and active development and support community. The Quantarctica team is actively seeking new contributions of peer-reviewed, freely distributable pan-Antarctic geospatial datasets for the next version release in 2017. As part of this ongoing development, we are investigating the best approaches for quickly and seamlessly distributing new and updated data to users, storing datasets in efficient file formats while maintaining full quality, and coexisting with numerous online data portals in a way that most actively benefits the Antarctic community. A recent survey of Quantarctica users showed broad geographical adoption among Antarctic Treaty countries, including those outside the large US and UK Antarctic programs. Maps and figures produced by Quantarctica have also appeared in open-access journals and outside of the formal scientific community on popular science and GIS blogs. Our experience with the Quantarctica project has shown the tremendous value of education and outreach, not only in promoting open software, data formats, and practices, but in empowering Antarctic science groups to more effectively use GIS and geospatial data. 
Open practices are making a huge impact in Antarctic GIS, where individual countries have historically maintained their own restricted Antarctic geodatabases and where a majority of the next generation of scientists are entering the field with experience in using geospatial thinking for planning, visualization, and problem solving.
Estimating of Soil Texture Using Landsat Imagery: a Case Study in Thatta Tehsil, Sindh
NASA Astrophysics Data System (ADS)
Khalil, Zahid
2016-07-01
Soil texture is considered an important environmental factor for agricultural growth. It is the most essential input for soil classification at large scale. Today, precise soil information at large scale is in great demand from various stakeholders, including soil scientists, environmental managers, land use planners and traditional agricultural users. The increasing demand for soil properties at fine spatial resolution has made traditional laboratory methods inadequate. In addition, the costs of soil analysis with precision agriculture systems are higher than those of traditional methods. In this regard, geospatial techniques can be used as an alternative for soil analysis. This study aims to examine the ability of geospatial techniques to identify the spatial patterns of soil attributes at fine scale. Around 28 soil samples were collected from different areas of Thatta Tehsil, Sindh, Pakistan for analyzing soil texture. An Ordinary Least Squares (OLS) regression analysis was used to relate the reflectance values of Landsat 8 OLI imagery to the soil variables. The analysis showed a significant relationship (p<0.05) of bands 2 and 5 with silt% (R2 = 0.52), and of bands 4 and 6 with clay% (R2 = 0.40). The equations derived from the OLS analysis were then applied across the whole study area to derive soil attributes. The USDA textural classification triangle was implemented for derivation of the soil texture map in a GIS environment. The outcome revealed that 'sandy loam' was most abundant, followed by loam, sandy clay loam and clay loam. The outcome shows that geospatial techniques can be used efficiently for mapping the soil texture of a large area at fine scale. This technology helped decrease cost and time and increase the level of detail by reducing field work considerably.
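The workflow (OLS fit of soil fractions on band reflectance, then a texture lookup) can be sketched as follows. The data, coefficients, and the grossly simplified texture rules are synthetic stand-ins, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 28                                   # sample count, as in the study
b2 = rng.uniform(0.05, 0.30, n)          # synthetic band-2 reflectance
b5 = rng.uniform(0.10, 0.40, n)          # synthetic band-5 reflectance
silt = 40 + 60 * b2 - 50 * b5 + rng.normal(0, 2, n)  # synthetic silt%

# Ordinary least squares: silt% ~ intercept + band2 + band5
X = np.column_stack([np.ones(n), b2, b5])
coef, *_ = np.linalg.lstsq(X, silt, rcond=None)

def texture_class(sand, silt_pct, clay):
    # Grossly simplified stand-in for the USDA texture triangle
    if clay >= 40:
        return "clay"
    if silt_pct >= 50:
        return "silt loam"
    if sand >= 70:
        return "sandy loam"
    return "loam"
```

Applying the fitted equation to every pixel of the imagery, then pushing the predicted sand/silt/clay fractions through the triangle lookup, yields the texture map described in the abstract.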
A bioavailable strontium isoscape for Western Europe: A machine learning approach
von Holstein, Isabella C. C.; Laffoon, Jason E.; Willmes, Malte; Liu, Xiao-Ming; Davies, Gareth R.
2018-01-01
Strontium isotope ratios (87Sr/86Sr) are gaining considerable interest as a geolocation tool and are now widely applied in archaeology, ecology, and forensic research. However, their application for provenance requires the development of baseline models predicting surficial 87Sr/86Sr variations (“isoscapes”). A variety of empirically-based and process-based models have been proposed to build terrestrial 87Sr/86Sr isoscapes but, in their current forms, those models are not mature enough to be integrated with continuous-probability surface models used in geographic assignment. In this study, we aim to overcome those limitations and to predict 87Sr/86Sr variations across Western Europe by combining process-based models and a series of remote-sensing geospatial products into a regression framework. We find that random forest regression significantly outperforms other commonly used regression and interpolation methods, and efficiently predicts the multi-scale patterning of 87Sr/86Sr variations by accounting for geological, geomorphological and atmospheric controls. Random forest regression also provides an easily interpretable and flexible framework to integrate different types of environmental auxiliary variables required to model the multi-scale patterning of 87Sr/86Sr variability. The method is transferable to different scales and resolutions and can be applied to the large collection of geospatial data available at local and global levels. The isoscape generated in this study provides the most accurate 87Sr/86Sr predictions in bioavailable strontium for Western Europe (R2 = 0.58 and RMSE = 0.0023) to date, as well as a conservative estimate of spatial uncertainty by applying quantile regression forest. We anticipate that the method presented in this study combined with the growing numbers of bioavailable 87Sr/86Sr data and satellite geospatial products will extend the applicability of the 87Sr/86Sr geo-profiling tool in provenance applications. PMID:29847595
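The idea of reading both a central estimate and a spatial uncertainty off an ensemble, as quantile regression forest does, can be illustrated with a toy bootstrap ensemble. Here the members are 1-nearest-neighbour predictors on synthetic data; the study itself uses proper random forest and quantile regression forest implementations.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 2))                     # proxy covariates
y = 0.705 + 0.002 * X[:, 0] + rng.normal(0, 0.0005, 200)  # toy 87Sr/86Sr

def predict_with_interval(x_new, n_members=100):
    """Mean and 5-95% spread across a bootstrap ensemble of 1-NN members."""
    preds = np.empty(n_members)
    for m in range(n_members):
        idx = rng.integers(0, len(y), len(y))   # bootstrap resample
        nearest = np.argmin(np.linalg.norm(X[idx] - x_new, axis=1))
        preds[m] = y[idx][nearest]
    return preds.mean(), np.quantile(preds, 0.05), np.quantile(preds, 0.95)

mean, lo, hi = predict_with_interval(np.array([5.0, 5.0]))
```

The per-member spread plays the role of the conservative spatial uncertainty the isoscape reports alongside each predicted ratio.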
Data and Geocomputation: Time Critical Mission Support for the 2017 Hurricane Season
NASA Astrophysics Data System (ADS)
Bhaduri, B. L.; Tuttle, M.; Rose, A.; Sanyal, J.; Thakur, G.; White, D.; Yang, H. H.; Laverdiere, M.; Whitehead, M.; Taylor, H.; Jacob, M.
2017-12-01
A strong spatial data infrastructure and geospatial analysis capabilities are the nucleus of the decision-making process during emergency preparedness, response, and recovery operations. For over a decade, the U.S. Department of Energy's Oak Ridge National Laboratory has been developing critical data and analytical capabilities that help the Federal Emergency Management Agency (FEMA) and the rest of the federal response community assess and evaluate impacts of natural hazards on population and critical infrastructures, including the status of the national electricity and oil and natural gas networks. These capabilities range from identifying structures or buildings from very high-resolution satellite imagery, utilizing machine learning and high-performance computing, to daily assessment of electricity restoration highlighting changes in nighttime lights for the impacted region based on the analysis of NOAA JPSS VIIRS Day/Night Band (DNB) imagery. This presentation will highlight our time critical mission support efforts for the 2017 hurricane season, which witnessed unprecedented devastation from hurricanes Harvey, Irma, and Maria. ORNL provided 90 m resolution LandScan USA population distribution data for identifying vulnerable populations, as well as structure (building) data extracted from 1 m imagery for damage assessment. Spatially accurate data for solid waste facilities were developed and delivered to the response community. Human activity signatures were assessed from large-scale collection of open source social media data around points of interest (POI) to ascertain the level of destruction. The electricity transmission system was monitored in real time through data integration from hundreds of utilities, and electricity outage information was provided back to the response community via standardized web services.
NASA Astrophysics Data System (ADS)
Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.
2017-12-01
In 1999, US Vice-President Al Gore outlined the concept of 'Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time, accessing historical and forecast data to support scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored in distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphic user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebook), desktop GIS tools and command line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurances in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.
Light Detection and Ranging-Based Terrain Navigation: A Concept Exploration
NASA Technical Reports Server (NTRS)
Campbell, Jacob; UijtdeHaag, Maarten; vanGraas, Frank; Young, Steve
2003-01-01
This paper discusses the use of Airborne Light Detection And Ranging (LiDAR) equipment for terrain navigation. Airborne LiDAR is a relatively new technology used primarily by the geospatial mapping community to produce highly accurate and dense terrain elevation maps. In this paper, the term LiDAR refers to a scanning laser ranger rigidly mounted to an aircraft, as opposed to an integrated sensor system that consists of a scanning laser ranger integrated with Global Positioning System (GPS) and Inertial Measurement Unit (IMU) data. Data from the laser range scanner and IMU will be integrated with a terrain database to estimate the aircraft position, and data from the laser range scanner will be integrated with GPS to estimate the aircraft attitude. LiDAR data was collected using NASA Dryden's DC-8 flying laboratory in Reno, NV, and was used to test the proposed terrain navigation system. The results of LiDAR-based terrain navigation shown in this paper indicate that airborne LiDAR is a viable technology enabler for fully autonomous aircraft navigation. The navigation performance is highly dependent on the quality of the terrain databases used for positioning, and therefore high-resolution (2 m post-spacing) data was used as the terrain reference.
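The core of terrain-referenced navigation is correlating measured ground elevations against a terrain database. A 1-D toy sketch, with synthetic terrain; the real system works over a 2-D database and fuses IMU and GPS data:

```python
import numpy as np

rng = np.random.default_rng(2)
dem = np.cumsum(rng.normal(0.0, 1.0, 500))   # synthetic 1-D terrain profile
true_offset = 137                            # aircraft's actual along-track index
window = 60                                  # number of LiDAR ground returns
measured = dem[true_offset:true_offset + window] + rng.normal(0, 0.1, window)

# Slide the measured profile along the DEM and pick the least-error offset
errors = [np.mean((dem[k:k + window] - measured) ** 2)
          for k in range(len(dem) - window)]
best_offset = int(np.argmin(errors))
```

The sharpness of the error minimum depends on terrain roughness and database quality, which is consistent with the paper's observation that navigation performance hinges on the terrain reference.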
NASA Astrophysics Data System (ADS)
Giberson, G. K.; Oswald, C.
2015-12-01
In areas affected by snow, chloride (Cl) salts are widely used as a de-icing agent to improve road conditions. While the improvement in road safety is indisputable, there are environmental consequences for local aquatic ecosystems. In many waterways, Cl concentrations have been increasing since the early 1990s, often exceeding national water quality guidelines. To determine the quantity of Cl that is accumulating in urban and urbanizing watersheds, accurate estimates of road salt usage at the watershed scale are needed. The complex jurisdictional control over road salt application in southern Ontario lends itself to a geospatial approach for calculating Cl inputs to improve the accuracy of watershed-scale Cl mass balance estimates. This study will develop a geospatial protocol for combining information on road salt applications and road network areas to refine watershed-scale Cl inputs, as well as assess spatiotemporal patterns in road salt application across the southern Ontario study region. The overall objective of this project is to use geospatial methods (predominantly ArcGIS) to develop high-accuracy estimates of road salt usage in urbanizing watersheds in southern Ontario. Specifically, the aims are to map and summarize the types and areas ("lane-lengths") of roadways in each watershed that have road salt applied to them; to determine the most appropriate source(s) of road salt usage data for each watershed, taking into consideration multiple levels of jurisdiction (e.g. municipal, regional, provincial); to calculate and summarize sub-watershed and watershed-scale road salt usage estimates for multiple years; and to analyze intra-watershed spatiotemporal patterns of road salt usage, especially focusing on impervious surfaces. These analyses will identify areas of concern exacerbated by high levels of road salt distribution; recommendations for modifying on-the-ground operations will be the next step in helping to correct these issues.
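A hedged sketch of the watershed-scale input calculation: sum lane-length times application rate over jurisdictions, then convert NaCl mass to Cl mass. All inventory figures below are invented for illustration.

```python
CL_FRACTION_NACL = 35.45 / 58.44   # mass fraction of Cl in NaCl, ~0.607

# Lane-kilometres in the watershed by jurisdiction and seasonal NaCl
# application rates in tonnes per lane-km (hypothetical inventory)
road_inventory = {
    "municipal":  {"lane_km": 420.0, "rate_t_per_lane_km": 12.0},
    "regional":   {"lane_km": 150.0, "rate_t_per_lane_km": 18.0},
    "provincial": {"lane_km":  60.0, "rate_t_per_lane_km": 25.0},
}

cl_tonnes = sum(r["lane_km"] * r["rate_t_per_lane_km"] * CL_FRACTION_NACL
                for r in road_inventory.values())
```

In the proposed protocol the lane-km figures would come from GIS intersection of the road network with watershed boundaries, and the rates from each jurisdiction's reported usage.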
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
Using Watershed Boundaries to Map Adverse Health Outcomes: Examples From Nebraska, USA
Corley, Brittany; Bartelt-Hunt, Shannon; Rogan, Eleanor; Coulter, Donald; Sparks, John; Baccaglini, Lorena; Howell, Madeline; Liaquat, Sidra; Commack, Rex; Kolok, Alan S
2018-01-01
In 2009, a paper was published suggesting that watersheds provide a geospatial platform for establishing linkages between aquatic contaminants, the health of the environment, and human health. This article is a follow-up to that original article. From an environmental perspective, watersheds segregate landscapes into geospatial units that may be relevant to human health outcomes. From an epidemiologic perspective, the watershed concept places anthropogenic health data into a geospatial framework that has environmental relevance. Research discussed in this article includes information gathered from the literature, as well as recent data collected and analyzed by this research group. It is our contention that the use of watersheds to stratify geospatial information may be both environmentally and epidemiologically valuable. PMID:29398918
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
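The OGC services named above share a common request pattern: a base endpoint plus standardized key-value parameters. As an illustration, a WMS GetMap request can be composed as below; the endpoint URL and layer name are placeholders, while the parameter names follow the WMS 1.3.0 specification.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap URL for one layer within a bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        # minx,miny,maxx,maxy (axis order depends on the CRS in WMS 1.3.0)
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical flood-extent layer for a disaster-management map client.
url = wms_getmap_url("https://example.org/wms", "flood_extent",
                     (43.0, -80.0, 44.0, -79.0), 512, 512)
```

A client like mapshup issues exactly this kind of request behind the scenes; the interoperability comes from every WMS server answering the same parameters.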
United States Geological Survey (USGS) Natural Hazards Response
Lamb, Rynn M.; Jones, Brenda K.
2012-01-01
The primary goal of U.S. Geological Survey (USGS) Natural Hazards Response is to ensure that the disaster response community has access to timely, accurate, and relevant geospatial products, imagery, and services during and after an emergency event. To accomplish this goal, products and services provided by the National Geospatial Program (NGP) and Land Remote Sensing (LRS) Program serve as a geospatial framework for mapping activities of the emergency response community. Post-event imagery and analysis can provide important and timely information about the extent and severity of an event. USGS Natural Hazards Response will also support the coordination of remotely sensed data acquisitions, image distribution, and authoritative geospatial information production as required for use in disaster preparedness, response, and recovery operations.
Automated Detection of Thermo-Erosion in High Latitude Ecosystems
NASA Astrophysics Data System (ADS)
Lara, M. J.; Chipman, M. L.; Hu, F.
2017-12-01
Detecting permafrost disturbance is of critical importance as the severity of climate change and the associated increase in wildfire frequency and magnitude impact regional to global carbon dynamics. However, it has not been possible to evaluate spatiotemporal patterns of permafrost degradation over large regions of the Arctic, due to limited spatial and temporal coverage of high resolution optical, radar, lidar, or hyperspectral remote sensing products. Here we present the first automated multi-temporal analysis for detecting disturbance in response to permafrost thaw, using meso-scale high-frequency remote sensing products (i.e. the entire Landsat image archive). This approach was developed, tested, and applied in the Noatak National Preserve (26,500 km2) in northwestern Alaska. We identified thermo-erosion (TE) by capturing the indirect spectral signal associated with episodic sediment plumes in adjacent waterbodies following TE disturbance. We isolated this turbidity signal within lakes during summer (mid-summer and late-summer) and annual time-period image composites (1986-2016), using the cloud-based geospatial parallel processing platform, the Google Earth Engine™ API. We validated the TE detection algorithm using seven consecutive years of sub-meter high resolution imagery (2009-2015) covering 798 (~33%) of the 2456 total lakes in the Noatak lowlands. Our approach had "good agreement" with sediment pulses and landscape deformation in response to permafrost thaw (overall accuracy of 85% and kappa coefficient of 0.61). We found active TE to impact 10.4% of all lakes, although it was interannually variable, with the highest and lowest TE years represented by 1986 (~41.1%) and 2002 (~0.7%), respectively. We estimate thaw slumps, lake erosion, lake drainage, and gully formation to account for 23.3, 61.8, 12.5, and 1.3% of all active TE across the Noatak National Preserve.
Preliminary analysis suggests TE may be subject to a hysteresis effect following extreme climatic conditions or wildfire. This work demonstrates the utility of meso-scale high-frequency remote sensing products for advancing high latitude permafrost research.
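The core detection step above hinges on a turbidity signal in lake reflectance. The following is an illustrative sketch of that idea, not the authors' algorithm: a normalized-difference index flags lakes whose surface reflectance is shifted toward the red band by suspended sediment. The band values and threshold are hypothetical.

```python
def turbidity_index(red, green):
    """Normalized-difference turbidity index: sediment-laden water
    reflects relatively more in the red band than clear water."""
    return (red - green) / (red + green)

def flag_disturbed_lakes(lake_reflectance, threshold=-0.1):
    """Return ids of lakes whose mean index exceeds the threshold.

    lake_reflectance: {lake_id: (mean_red, mean_green)} per composite
    """
    return sorted(
        lake_id for lake_id, (r, g) in lake_reflectance.items()
        if turbidity_index(r, g) > threshold
    )

# Hypothetical per-lake mean reflectances from one summer composite.
lakes = {"A": (0.08, 0.10), "B": (0.15, 0.09), "C": (0.05, 0.09)}
disturbed = flag_disturbed_lakes(lakes)
```

In the actual workflow this reduction runs per lake polygon over thirty years of Landsat composites inside Earth Engine; the per-lake arithmetic is as simple as shown.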
The wildland-urban interface raster dataset of Catalonia.
Alcasena, Fermín J; Evers, Cody R; Vega-Garcia, Cristina
2018-04-01
We provide the wildland-urban interface (WUI) map of the autonomous community of Catalonia (Northeastern Spain). The map encompasses an area of some 3.21 million ha and is presented as a 150-m resolution raster dataset. Individual housing location, structure density, and vegetation cover data were used to spatially assess in detail the interface, intermix, and dispersed rural WUI communities with a geographical information system. Most WUI areas concentrate in the coastal belt, where suburban sprawl has occurred nearby or within unmanaged forests. This geospatial dataset approximates the potential for residential housing loss in a wildfire and represents a valuable contribution to landscape and urban planning in the region.
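A minimal per-cell sketch of the interface/intermix distinction used in WUI mapping is shown below. The thresholds (6.17 houses/km², 50% vegetation cover) follow the commonly used Radeloff-style WUI definition and should be treated as assumptions here, not the exact criteria of this dataset.

```python
def classify_wui_cell(housing_density, veg_fraction, near_wildland=False):
    """Classify one 150-m raster cell.

    housing_density: houses per km^2
    veg_fraction:    wildland vegetation cover fraction (0-1)
    near_wildland:   True if the cell lies within a buffer of a large
                     vegetated patch (precomputed with the GIS).
    """
    if housing_density < 6.17:
        return "non-WUI"
    if veg_fraction >= 0.5:
        # Housing mixed into the vegetation itself.
        return "intermix"
    if near_wildland:
        # Housing abutting, but not inside, wildland vegetation.
        return "interface"
    return "non-WUI"
```

Applied cell by cell over the 150-m raster, rules of this shape produce the interface/intermix map described in the abstract.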
NASA Technical Reports Server (NTRS)
Rilee, Michael Lee; Kuo, Kwo-Sen
2017-01-01
The SpatioTemporal Adaptive Resolution Encoding (STARE) is a unifying scheme encoding geospatial and temporal information for organizing data on scalable computing/storage resources, minimizing expensive data transfers. STARE provides a compact representation that turns set-logic functions into integer operations, e.g. conditional sub-setting, taking into account representative spatiotemporal resolutions of the data in the datasets. STARE geo-spatiotemporally aligns data placements of diverse data on massive parallel resources to maximize performance. Automating important scientific functions (e.g. regridding) and computational functions (e.g. data placement) allows scientists to focus on domain-specific questions instead of expending their efforts and expertise on data processing. With STARE-enabled automation, SciDB (Scientific Database) plus STARE provides a database interface, reducing costly data preparation, increasing the volume and variety of interoperable data, and easing result sharing. Using SciDB plus STARE as part of an integrated analysis infrastructure dramatically eases combining diametrically different datasets.
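The claim that STARE "turns set-logic functions into integer operations" can be illustrated with a toy hierarchical encoding. The sketch below uses a simple quadtree over lon/lat rather than STARE's actual trixel scheme, so treat it as an analogy: a location becomes an integer whose high bits name successively finer cells, and spatial containment becomes a bit-prefix test.

```python
def encode(lon, lat, level):
    """Return (cell_id, level): cell_id packs one quadrant choice
    (2 bits) per level, most significant first."""
    x0, x1, y0, y1 = -180.0, 180.0, -90.0, 90.0
    cell = 0
    for _ in range(level):
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        qx, qy = int(lon >= xm), int(lat >= ym)
        cell = (cell << 2) | (qy << 1) | qx
        x0, x1 = (xm, x1) if qx else (x0, xm)
        y0, y1 = (ym, y1) if qy else (y0, ym)
    return cell, level

def contains(coarse, fine):
    """True if the coarse cell spatially contains the fine cell:
    the coarse id must be a bit-prefix of the fine id."""
    (c_id, c_lvl), (f_id, f_lvl) = coarse, fine
    if c_lvl > f_lvl:
        return False
    return (f_id >> (2 * (f_lvl - c_lvl))) == c_id

coarse = encode(10.0, 50.0, 3)   # a level-3 cell over central Europe
fine = encode(10.0, 50.0, 8)     # a finer cell at the same point
```

Because containment is a shift-and-compare, conditional subsetting and data placement can be driven by ordinary integer sorts and range scans, which is the property STARE exploits at scale.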
Low income, community poverty and risk of end stage renal disease.
Crews, Deidra C; Gutiérrez, Orlando M; Fedewa, Stacey A; Luthi, Jean-Christophe; Shoham, David; Judd, Suzanne E; Powe, Neil R; McClellan, William M
2014-12-04
The risk of end stage renal disease (ESRD) is increased among individuals with low income and in low income communities. However, few studies have examined the relation of both individual and community socioeconomic status (SES) with incident ESRD. Among 23,314 U.S. adults in the population-based Reasons for Geographic and Racial Differences in Stroke study, we assessed participant differences across geospatially-linked categories of county poverty [outlier poverty, extremely high poverty, very high poverty, high poverty, neither (reference), high affluence and outlier affluence]. Multivariable Cox proportional hazards models were used to examine associations of annual household income and geospatially-linked county poverty measures with incident ESRD, while accounting for death as a competing event using the Fine and Gray method. There were 158 ESRD cases during follow-up. Incident ESRD rates were 178.8 per 100,000 person-years (10^5 py) in high poverty outlier counties and 76.3/10^5 py in affluent outlier counties, p trend=0.06. In unadjusted competing risk models, persons residing in high poverty outlier counties had a higher (though not statistically significant) incidence of ESRD compared to persons residing in counties with neither high poverty nor affluence [hazard ratio (HR) 1.54, 95% confidence interval (CI) 0.75-3.20]. This association was markedly attenuated following adjustment for socio-demographic factors (age, sex, race, education, and income); HR 0.96, 95% CI 0.46-2.00. However, in the same adjusted model, income was independently associated with risk of ESRD [HR 3.75, 95% CI 1.62-8.64, comparing the <$20,000 income group to the >$75,000 group]. There were no statistically significant associations of county measures of poverty with incident ESRD, and no evidence of effect modification. In contrast to annual family income, geospatially-linked measures of county poverty have little relation with risk of ESRD.
Efforts to mitigate socioeconomic disparities in kidney disease may be best targeted at the individual level.
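The competing-risk accounting used above (death competing with ESRD) rests on the cumulative incidence function, the quantity the Fine and Gray model regresses on. A minimal Aalen-Johansen-style estimator is sketched below in pure Python; the event times and cause codes are made up for illustration.

```python
def cumulative_incidence(times, causes, cause=1):
    """Cumulative incidence of `cause` with competing events.

    times:  event or censoring times
    causes: 0 = censored, 1 = event of interest, 2 = competing event
    Returns [(t, CIF(t))] at each distinct event/censoring time.
    """
    data = sorted(zip(times, causes))
    n = len(data)
    at_risk = n
    surv = 1.0            # overall Kaplan-Meier survival S(t-)
    cif = 0.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        d_cause = d_all = lost = 0
        while i < n and data[i][0] == t:   # group tied times
            c = data[i][1]
            if c == cause:
                d_cause += 1
            if c != 0:
                d_all += 1
            lost += 1
            i += 1
        cif += surv * d_cause / at_risk    # increment by S(t-) * hazard
        surv *= 1 - d_all / at_risk
        at_risk -= lost
        curve.append((t, cif))
    return curve

# Three subjects: ESRD at t=1, death (competing) at t=2, ESRD at t=3.
curve = cumulative_incidence([1, 2, 3], [1, 2, 1])
```

Note the competing death at t=2 removes a subject from the risk set without contributing to the ESRD incidence, which is exactly why naive 1-KM overestimates risk in this setting.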
77 FR 67831 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... from governmental, private sector, non-profit, and academic organizations, has been established to... Dialogue --National Address Database --Geospatial Priorities --NGAC Subcommittee Activities --FGDC Update...
Borderless Geospatial Web (bolegweb)
NASA Astrophysics Data System (ADS)
Cetl, V.; Kliment, T.; Kliment, M.
2016-06-01
The effective access and use of geospatial information (GI) resources is of critical importance in the modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate discovery and use of geospatial data. These data are stored in databases in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, global, user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb in Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.
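The "crosswalk" idea above can be sketched as a field-by-field translation between metadata schemas. The mapping below is a simplified assumption, covering only a handful of ISO 19115-style fields and invented web-discovery keys, not the project's complete crosswalk.

```python
# Hypothetical mapping from ISO-style catalogue fields to generic
# web-discovery keys that a mainstream search engine could index.
ISO_TO_WEB = {
    "title": "name",
    "abstract": "description",
    "keyword": "keywords",
    "westBoundLongitude": "bbox_west",
    "eastBoundLongitude": "bbox_east",
}

def crosswalk(iso_record):
    """Translate known ISO fields; drop fields with no web equivalent."""
    return {ISO_TO_WEB[k]: v for k, v in iso_record.items() if k in ISO_TO_WEB}

record = {"title": "Corine Land Cover",
          "abstract": "Land cover of Europe",
          "lineage": "internal processing notes"}   # no web equivalent
web_record = crosswalk(record)
```

A production crosswalk must also handle nested ISO elements and controlled vocabularies, but the core operation is this rename-and-filter.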
Geospatial Data as a Service: Towards planetary scale real-time analytics
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.
2017-12-01
The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data. There are currently different interfaces that have been defined to provide data services. Unfortunately, there is considerable variation among the standards, protocols, and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observations, climate and weather forecasting are examples of these communities which generate large amounts of geospatial data. The NCI has been carrying out a significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform where new infrastructures can be built. One of the key components these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment.
We show our approach offers the community valuable exploratory analysis capabilities, for dealing with petabyte-scale geospatial data collections.
Making geospatial data in ASF archive readily accessible
NASA Astrophysics Data System (ADS)
Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.
2015-12-01
The way geospatial data is searched, managed, processed, and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to using an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data. Yet it was up to users to develop the tools for more tailored access to the data they needed. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data. Users can subscribe to the data feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach was an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface to include the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). Marquez, A., 2015. PostGIS essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide, Packt Publishing, 350 p.
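A client-side sketch of consuming such a WFS follows. The endpoint, feature type name, and `downloadUrl` property are hypothetical placeholders; the request parameters follow the WFS 2.0 convention, and the parsing step works on any GeoJSON feature collection.

```python
from urllib.parse import urlencode
import json

def wfs_getfeature_url(endpoint, type_name, max_features=100):
    """Build a WFS 2.0 GetFeature URL requesting GeoJSON output."""
    params = {
        "service": "WFS", "version": "2.0.0", "request": "GetFeature",
        "typeNames": type_name, "outputFormat": "application/json",
        "count": max_features,
    }
    return endpoint + "?" + urlencode(params)

def download_links(geojson_text):
    """Extract the per-granule download URL from each feature's properties."""
    fc = json.loads(geojson_text)
    return [f["properties"].get("downloadUrl") for f in fc["features"]]

url = wfs_getfeature_url("https://example.org/wfs", "granules")

# A tiny mock response standing in for a live WFS reply.
mock = json.dumps({"type": "FeatureCollection", "features": [
    {"type": "Feature", "geometry": None,
     "properties": {"granule": "S1A_001",
                    "downloadUrl": "https://example.org/S1A_001.zip"}},
]})
links = download_links(mock)
```

Because the footprints arrive as ordinary features, a desktop GIS can load the same service directly as a vector layer, which is the workflow the abstract describes.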
Geospatial Information Response Team
Witt, Emitt C.
2010-01-01
Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, the Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-science, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data, information, and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis.
In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of geospatial experts and equipment.
2007-01-01
software applications and rely on the installations to supply them with the basic I&E geospatial data - sets for those applications. Such...spatial data in geospatially based tools to help track military supplies and materials all over the world. For instance, SDDCTEA developed IRRIS, a...regional offices or individual installations to supply the data and perform QA/QC in the process. The IVT program office worked with the installations and
Leib, Kenneth J.; Linard, Joshua I.; Williams, Cory A.
2012-01-01
Elevated loads of salt and selenium can impair the quality of water for both anthropogenic and natural uses. Understanding the environmental processes controlling how salt and selenium are introduced to streams is critical to managing and mitigating the effects of elevated loads. Dominant relations between salt and selenium loads and environmental characteristics can be established by using geospatial data. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, investigated statistical relations between seasonal salt or selenium loads emanating from the Upper Colorado River Basin and geospatial data. Salt and selenium loads measured during the irrigation and nonirrigation seasons were related to geospatial variables for 168 subbasins within the Gunnison and Colorado River Basins. These geospatial variables represented subbasin characteristics of the physical environment, precipitation, geology, land use, and the irrigation network. All subbasin variables with units of area had statistically significant relations with load. The few variables that were not in units of area but were statistically significant helped to identify types of geospatial data that might influence salt and selenium loading. Following a stepwise approach, combinations of these statistically significant variables were used to develop multiple linear regression models. The models can be used to help prioritize areas where salt and selenium control projects might be most effective.
Issues on Building Kazakhstan Geospatial Portal to Implement E-Government
NASA Astrophysics Data System (ADS)
Sagadiyev, K.; Kang, H. K.; Li, K. J.
2016-06-01
A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework to integrate and organize them. In particular, it is very useful to integrate the process of land management in e-government with a geospatial information framework, since most land management tasks are related to geospatial properties. In this paper, we present a use-case on the e-government project in Kazakhstan for land management. We develop a geoportal to connect many tasks and different users via a geospatial information framework. This geoportal is based on open source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements as follows. First, we establish a transparent governmental process, which is one of the main goals of e-government. Every stakeholder can monitor what is happening in the land management process. Second, we can significantly reduce the time and effort in the government process. For example, a grant procedure for a building construction has taken more than one year with more than 50 steps. It is expected that this procedure would be reduced to 2 weeks by the geoportal framework. Third, we provide a collaborative environment between different governmental structures via the geoportal, whereas many conflicts and mismatches have been a critical issue of governmental administration processes.
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
Baker, Nancy T.
2011-01-01
This report and the accompanying geospatial data were created to assist in analysis and interpretation of water-quality data provided by the U.S. Geological Survey's National Stream Quality Accounting Network (NASQAN) and by the U.S. Coastal Waters and Tributaries National Monitoring Network (NMN), which is a cooperative monitoring program of Federal, regional, and State agencies. The report describes the methods used to develop the geospatial data, which was primarily derived from the National Watershed Boundary Dataset. The geospatial data contains polygon shapefiles of basin boundaries for 33 NASQAN and 5 NMN streamflow and water-quality monitoring stations. In addition, 30 polygon shapefiles of the closed and noncontributing basins contained within the NASQAN or NMN boundaries are included. Also included is a point shapefile of the NASQAN and NMN monitoring stations and associated basin and station attributes. Geospatial data for basin delineations, associated closed and noncontributing basins, and monitoring station locations are available at http://water.usgs.gov/GIS/metadata/usgswrd/XML/ds641_nasqan_wbd12.xml.
Developing a distributed HTML5-based search engine for geospatial resource discovery
NASA Astrophysics Data System (ADS)
ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.
2013-12-01
With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, such as data discovery and data publishing. However, the efficiency of geospatial resource discovery is still challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users who use different browsers and devices may have very different user experiences because of the diversity of front-end platforms (e.g. Silverlight, Flash or HTML). To address these issues, we developed a distributed, HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various and distributed GCIs; (2) the asynchronous record retrieval mode enhances search performance and user interactivity; (3) the search engine, based on HTML5, is able to provide unified access capabilities for users with different devices (e.g. tablet and smartphone).
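The brokering-plus-asynchronous-retrieval pattern described above can be sketched with coroutines: the broker fans one query out to several catalogues concurrently and merges the records as they arrive. The catalogue names, latencies, and records below are mocks; a real broker would issue CSW or OpenSearch requests instead.

```python
import asyncio

async def query_catalog(name, delay, records):
    """Mock catalogue query: `delay` stands in for network latency."""
    await asyncio.sleep(delay)
    return [{"source": name, "title": r} for r in records]

async def brokered_search(catalogs):
    """Run all catalogue queries concurrently and merge their results.
    Total wall time is roughly the slowest catalogue, not the sum."""
    results = await asyncio.gather(
        *(query_catalog(n, d, recs) for n, d, recs in catalogs))
    return [rec for batch in results for rec in batch]

catalogs = [
    ("GCI-A", 0.02, ["landsat scene"]),
    ("GCI-B", 0.01, ["dem tile", "soil map"]),
]
merged = asyncio.run(brokered_search(catalogs))
```

In the actual engine the merge happens incrementally in the browser, so early-responding catalogues populate the result list while slow ones are still in flight.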
USDA-ARS?s Scientific Manuscript database
Understanding the genetic basis of complex plant traits requires connecting genotype to phenotype information, known as the “G2P question.” In the last three decades, genotyping methods have become highly developed. Much less innovation has occurred for measuring plant traits (phenotyping), particul...
Newspaper archives + text mining = rich sources of historical geo-spatial data
NASA Astrophysics Data System (ADS)
Yzaguirre, A.; Smit, M.; Warren, R.
2016-04-01
Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
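A toy sketch of the extraction step is shown below: pulling candidate flood events (date, place) out of raw article text with patterns. Real pipelines use NLP toolkits and place-name gazetteers; the regexes and the sample sentence here are illustrative only.

```python
import re

DATE = (r"(?P<date>\b(?:January|February|March|April|May|June|July|August|"
        r"September|October|November|December)\s+\d{1,2},\s+\d{4})")
PLACE = r"\bin\s+(?P<place>[A-Z][a-z]+(?:\s[A-Z][a-z]+)*)"

def extract_flood_events(text):
    """Scan sentences mentioning floods; keep those with a date and place."""
    events = []
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if "flood" not in sentence.lower():
            continue
        d = re.search(DATE, sentence)
        p = re.search(PLACE, sentence)
        if d and p:
            events.append({"date": d.group("date"), "place": p.group("place")})
    return events

article = ("Heavy rain caused severe flooding in Truro on "
           "September 12, 1962, closing the rail line.")
events = extract_flood_events(article)
```

Extracted place names would then be geocoded against a gazetteer to yield the geospatial records the flood database needs.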
Automated protocols for spaceborne sub-meter resolution "Big Data" products for Earth Science
NASA Astrophysics Data System (ADS)
Neigh, C. S. R.; Carroll, M.; Montesano, P.; Slayback, D. A.; Wooten, M.; Lyapustin, A.; Shean, D. E.; Alexandrov, O.; Macander, M. J.; Tucker, C. J.
2017-12-01
The volume of available remotely sensed data has grown to exceed petabytes per year, while the costs of data, storage systems, and compute power have all dropped exponentially. This has opened the door for "Big Data" processing systems with high-end computing (HEC) such as the Google Earth Engine, NASA Earth Exchange (NEX), and the NASA Center for Climate Simulation (NCCS). At the same time, commercial very high-resolution (VHR) satellites have grown into a constellation with global repeat coverage that can support existing NASA Earth observing missions with stereo and super-spectral capabilities. Through agreements with the National Geospatial-Intelligence Agency, NASA-Goddard Space Flight Center is acquiring Petabytes of global sub-meter to 4 meter resolution imagery from the WorldView-1/2/3, QuickBird-2, GeoEye-1, and IKONOS-2 satellites. These data are available at no direct cost and are valuable for enhancing Earth observation research that supports US government interests. We are currently developing automated protocols for generating VHR products to support NASA's Earth observing missions. These include two primary foci: 1) on-demand VHR 1/2° ortho mosaics - process VHR to surface reflectance, orthorectify and co-register multi-temporal 2 m multispectral imagery compiled as user-defined regional mosaics. This will provide an easy-access dataset to investigate biodiversity, tree canopy closure, surface water fraction, and cropped area for smallholder agriculture; and 2) on-demand VHR digital elevation models (DEMs) - process stereo VHR to extract VHR DEMs with the NASA Ames stereo pipeline. This will benefit Earth surface studies on the cryosphere (glacier mass balance, flow rates and snow depth), hydrology (lake/water body levels, landslides, subsidence) and biosphere (forest structure, canopy height/cover) among others. Recent examples of products used in NASA Earth Science projects will be provided.
This HEC API could help surmount prior spatiotemporal limitations while providing broad benefits to Earth Science.
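The "1/2° ortho mosaic" bookkeeping above implies snapping a user-defined region to a half-degree grid and enumerating the mosaic cells to produce. A minimal sketch of that tiling step follows; the cell-corner naming convention is an assumption for illustration.

```python
import math

def mosaic_cells(west, south, east, north, step=0.5):
    """Return (lon, lat) lower-left corners of every `step`-degree grid
    cell that intersects the requested region."""
    w = math.floor(west / step) * step    # snap west/south down to the grid
    s = math.floor(south / step) * step
    cells = []
    lat = s
    while lat < north:
        lon = w
        while lon < east:
            cells.append((lon, lat))
            lon += step
        lat += step
    return cells

# A 1.0 x 0.7 degree area of interest spans six half-degree cells.
cells = mosaic_cells(-72.3, 44.1, -71.3, 44.8)
```

Each enumerated cell then becomes one unit of work (surface reflectance, orthorectification, co-registration) that the HEC scheduler can process independently.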
What if we took a global look?
NASA Astrophysics Data System (ADS)
Ouellet Dallaire, C.; Lehner, B.
2014-12-01
Freshwater resources are facing unprecedented pressures. In hopes of coping with these pressures, Environmental Hydrology, Freshwater Biology, and Fluvial Geomorphology have defined conceptual approaches such as "environmental flow requirements", "instream flow requirements" or "normative flow regime" to define the appropriate flow regimes to maintain a given ecological status. These advances in the fields of freshwater resources management are asking scientists to create bridges across disciplines. Holistic and multi-scale approaches are becoming more and more common in water sciences research. The intrinsic nature of river systems demands that these approaches account for the upstream-downstream link of watersheds. Before recent technological developments, large scale analyses were cumbersome and, often, the necessary data were unavailable. However, new technologies, both for information collection and computing capacity, enable a high resolution look at the global scale. For rivers around the world, this new outlook is facilitated by the hydrologically relevant geo-spatial database HydroSHEDS. This database now offers more than 24 million kilometers of rivers, some never mapped before, at one's fingertips. Large-scale and even global-scale assessments can now be used to compare rivers around the world. A river classification framework called GloRiC (Global River Classification) was developed using HydroSHEDS. This framework advocates a holistic approach to river systems by using sub-classifications drawn from six disciplines related to river sciences: Hydrology, Physiography and climate, Geomorphology, Chemistry, Biology, and Human impact. Each of these disciplines brings complementary information on the rivers that is relevant at different scales. A first version of a global river reach classification was produced at the 500 m resolution. Variables used in the classification have influence on processes involved at different scales (e.g. topography index vs. pH).
However, all variables are computed at the same high spatial resolution. This way, we can have a global look at local phenomenon.
Metadata, or data about data, describe the content, quality, condition, and other characteristics of data. Geospatial metadata are critical to data discovery and serve as the fuel for the Geospatial One-Stop data portal.
,
1999-01-01
In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.
Identifying residential neighbourhood types from settlement points in a machine learning approach.
Jochem, Warren C; Bird, Tomas J; Tatem, Andrew J
2018-05-01
Remote sensing techniques are now commonly applied to map and monitor urban land uses to measure growth and to assist with development and planning. Recent work in this area has highlighted the use of textures and other spatial features that can be measured in very high spatial resolution imagery. Far less attention has been given to using geospatial vector data (i.e. points, lines, polygons) to map land uses. This paper presents an approach to distinguish residential settlement types (regular vs. irregular) using an existing database of settlement points locating structures. Nine data features describing the density, distance, angles, and spacing of the settlement points are calculated at multiple spatial scales. These data are analysed alone and with five common remote sensing measures on elevation, slope, vegetation, and nighttime lights in a supervised machine learning approach to classify land use areas. The method was tested in seven provinces of Afghanistan (Balkh, Helmand, Herat, Kabul, Kandahar, Kunduz, Nangarhar). Overall accuracy ranged from 78% in Kandahar to 90% in Nangarhar. This research demonstrates the potential to accurately map land uses from even the simplest representation of structures.
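The nine point-pattern features described above (density, distance, angles, spacing) are not specified in detail in the abstract. As a minimal, purely illustrative sketch of one such feature, the following computes the mean nearest-neighbor distance of a set of settlement points in a projected coordinate system; the brute-force scan and the function name are this sketch's own choices, not the authors' implementation.

```python
import math

def mean_nearest_neighbor_distance(points):
    """Mean distance from each point to its nearest neighbor.

    `points` is a list of (x, y) tuples in a projected (planar)
    coordinate system; a brute-force O(n^2) scan is fine for a sketch.
    """
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        nearest = min(
            math.hypot(xi - xj, yi - yj)
            for j, (xj, yj) in enumerate(points) if j != i
        )
        total += nearest
    return total / len(points)

# A regular 3x3 grid with 1-unit spacing: every nearest neighbor is 1 away.
grid = [(x, y) for x in range(3) for y in range(3)]
print(mean_nearest_neighbor_distance(grid))  # 1.0
```

Regularly laid-out (planned) settlements yield tight, consistent nearest-neighbor distances, while irregular settlements produce more dispersed values, which is the kind of signal such features feed into the classifier.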
Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard
2016-01-01
Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592
Odgers, Candice L; Donley, Sachiko; Caspi, Avshalom; Bates, Christopher J; Moffitt, Terrie E
2015-10-01
The creation of economically mixed communities has been proposed as one way to improve the life outcomes of children growing up in poverty. However, whether low-income children benefit from living alongside more affluent neighbors is unknown. Prospectively gathered data on over 1,600 children from the Environmental Risk (E-Risk) Longitudinal Twin Study living in urban environments is used to test whether living alongside more affluent neighbors (measured via high-resolution geo-spatial indices) predicts low-income children's antisocial behavior (reported by mothers and teachers at the ages of 5, 7, 10, and 12). Results indicated that low-income boys (but not girls) surrounded by more affluent neighbors had higher levels of antisocial behavior than their peers embedded in concentrated poverty. The negative effect of growing up alongside more affluent neighbors on low-income boys' antisocial behavior held across childhood and after controlling for key neighborhood and family-level factors. Findings suggest that efforts to create more economically mixed communities for children, if not properly supported, may have iatrogenic effects on boys' antisocial behavior. © 2015 Association for Child and Adolescent Mental Health.
Irrigation network extraction methodology from LiDAR DTM using Whitebox and ArcGIS
NASA Astrophysics Data System (ADS)
Mahor, M. A. P.; De La Cruz, R. M.; Olfindo, N. T.; Perez, A. M. C.
2016-10-01
Irrigation networks are important in distributing water resources to areas where rainfall is not enough to sustain agriculture. They are also crucial for redirecting vast amounts of water to decrease the risk of flooding in flat areas, especially near sources of water. Given the lack of studies on irrigation feature extraction, covering features that range from wide canals to small ditches, this study presents a method of extracting these features from LiDAR-derived digital terrain models (DTMs) using Geographic Information Systems (GIS) tools such as ArcGIS and Whitebox Geospatial Analysis Tools (Whitebox GAT). High-resolution LiDAR DTMs with 1-meter horizontal and 0.25-meter vertical accuracies were processed to generate a gully depth map. This map was then reclassified, converted to vector, and filtered according to segment length and sinuosity to isolate the irrigation features. Initial results in the test area show that extraction completeness is greater than 80% when compared with data obtained from the National Irrigation Administration (NIA).
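The segment-length and sinuosity filter mentioned above can be sketched in a few lines. Sinuosity is path length divided by straight-line endpoint distance; engineered canals tend to score near 1.0, while natural channels meander. The thresholds and function names below are illustrative assumptions, not values from the paper.

```python
import math

def sinuosity(vertices):
    """Path length divided by straight-line endpoint distance.

    `vertices` is an ordered list of (x, y) tuples for one vectorized
    segment; values near 1.0 indicate straight, canal-like features.
    """
    path = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(vertices, vertices[1:])
    )
    straight = math.hypot(vertices[-1][0] - vertices[0][0],
                          vertices[-1][1] - vertices[0][1])
    return path / straight

def keep_segment(vertices, min_length=50.0, max_sinuosity=1.2):
    """Filter rule in the spirit of the paper: keep long, nearly
    straight segments (both thresholds here are illustrative only)."""
    length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(vertices, vertices[1:])
    )
    return length >= min_length and sinuosity(vertices) <= max_sinuosity
```

In a GIS workflow the same rule would typically be applied as an attribute query on the vectorized gully-depth polylines.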
A resource-oriented architecture for a Geospatial Web
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Nativi, Stefano
2010-05-01
In this presentation we discuss some architectural issues in the design of a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to sharing more specific resources, such as geospatial information: information characterized by a spatial/temporal reference. This requires an investigation into the nature of the Web and the validity of its paradigm for geospatial resources. The Web was born in the early 1990s to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g., URI, HTTP, HTML); however, in the last two decades several other technologies and specifications have been introduced to extend its capabilities. Most of them (e.g., the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were successful in enabling service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g., e-Government and e-Business applications), they do not fit the original concept of the Web. In 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility, and low entry barrier of the original Web.
Conversely, systems that use the same Web technologies and specifications but follow a different architectural style, however useful, should not be considered part of the Web. If the REST style captures the essential characteristics of the Web, then building a Geospatial Web requires an architecture that satisfies all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all geospatial resources must be accessed through the same interface; moreover, according to the REST style, this interface must satisfy four further constraints: (a) identification of resources; (b) manipulation of resources through representations; (c) self-descriptive messages; and (d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations that are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g., filesystem management, SQL for database management systems). Restricting the scope to a subset of resources, it would be possible to identify other generic actions that are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation, and coordinate reference system transformations are candidate functionalities for a uniform interface. However, an investigation is needed to clarify the semantics of those actions for different resources and, consequently, whether they can truly rise to the role of generic interface operations. Concerning point (a), identification of resources, every resource addressable in the Geospatial Web must have its own identifier (e.g., a URI). This makes citation and re-use of resources possible simply by providing the URI.
OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point (b), manipulation of resources through representations, the Geospatial Web poses several issues. While the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured, with several possible data models (e.g., point series, gridded coverages, trajectories) and encodings. A possibility would be to simplify the interchange formats by choosing to support a subset of data models and formats. This is what the Web designers actually did in defining a common format for hypermedia (HTML), while keeping the underlying protocol generic. Concerning point (c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not be a major issue, considering the effort put into geospatial metadata models and specifications in recent years. Point (d), hypermedia as the engine of application state, is where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications would be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks, and the adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources but also its nature. The implementation of a Geospatial Web would produce an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm.
This would lower the barrier to geospatial applications for non-specialists (e.g., the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated or replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g., no service registry or service descriptors are required) but moves the complexity to the data representation. Moreover, since the interface must stay generic, it ends up very simple, so complex interactions would require several transfers. • In the geospatial domain, some of the most valuable resources are processes (e.g., environmental models); how these can be modeled as resources accessed through the common interface is an open issue. Weighing advantages and drawbacks, a Geospatial Web seems useful, but its use would be limited to specific use cases and would not cover all possible applications. The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD dissertation, Dept. of Information and Computer Science, University of California, Irvine.
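The uniform-interface idea discussed in this abstract can be made concrete with a small sketch: every geospatial resource is addressed by a URI and manipulated only through the same handful of generic operations, with one speculative geospatial-generic action (bounding-box subsetting) added to plain CRUD. The in-memory store, the `handle` dispatcher, and the feature layout are all hypothetical illustrations, not part of the presented architecture.

```python
# Hypothetical sketch of a REST-style uniform interface for geospatial
# resources: URIs identify resources, and a single generic interface
# (CRUD plus a candidate geospatial action, subsetting) manipulates
# them through representations.
resources = {}  # URI -> representation (here: a plain dict)

def handle(method, uri, representation=None, bbox=None):
    if method == "PUT":      # create/update a resource via a representation
        resources[uri] = representation
        return representation
    if method == "GET":      # retrieve; optional generic bbox subsetting
        res = resources[uri]
        if bbox and "features" in res:
            xmin, ymin, xmax, ymax = bbox
            return {"features": [f for f in res["features"]
                                 if xmin <= f["x"] <= xmax
                                 and ymin <= f["y"] <= ymax]}
        return res
    if method == "DELETE":   # delete the resource
        return resources.pop(uri)
    raise ValueError("not part of the uniform interface")
```

Note that the client never calls a resource-specific operation: the same four verbs (and the same subsetting parameter) apply to every resource, which is exactly the constraint the abstract argues a Geospatial Web must satisfy.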
A research on the security of wisdom campus based on geospatial big data
NASA Astrophysics Data System (ADS)
Wang, Haiying
2018-05-01
Smart ("wisdom") campus systems face several difficulties, such as sharing geospatial big data, extending functionality, and managing, analyzing, and mining geospatial big data; in particular, data security cannot be guaranteed, a problem that has attracted increasing attention. In this article we put forward a data-oriented software architecture, designed around the ideology of treating data as the kernel, to solve the problems of traditional software architectures, broaden campus spatial-data research, and advance smart-campus applications.
Payne, Meredith C.; Reusser, Deborah A.; Lee, Henry
2012-01-01
Sea surface temperature (SST) is an important environmental characteristic in determining the suitability and sustainability of habitats for marine organisms. In particular, the fate of the Arctic Ocean, which provides critical habitat to commercially important fish, is in question. This poses an intriguing problem for future research of Arctic environments - one that will require examination of long-term SST records. This publication describes and provides access to an easy-to-use Arctic SST dataset for ecologists, biogeographers, oceanographers, and other scientists conducting research on habitats and/or processes in the Arctic Ocean. The data cover the Arctic ecoregions as defined by the "Marine Ecoregions of the World" (MEOW) biogeographic schema developed by The Nature Conservancy as well as the region to the north from approximately 46°N to about 88°N (constrained by the season and data coverage). The data span a 29-year period from September 1981 to December 2009. These SST data were derived from Advanced Very High Resolution Radiometer (AVHRR) instrument measurements that had been compiled into monthly means at 4-kilometer grid cell spatial resolution. The processed data files are available in ArcGIS geospatial datasets (raster and point shapefiles) and also are provided in text (.csv) format. All data except the raster files include attributes identifying latitude/longitude coordinates, and realm, province, and ecoregion as defined by the MEOW classification schema. A seasonal analysis of these Arctic ecoregions reveals a wide range of SSTs experienced throughout the Arctic, both over the course of an annual cycle and within each month of that cycle. Sea ice distribution plays a major role in SST regulation in all Arctic ecoregions.
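The .csv distribution described above is a point-attribute table (coordinates plus MEOW realm/province/ecoregion labels and monthly mean SST). A minimal sketch of the kind of seasonal analysis the abstract mentions is grouping rows by ecoregion and month and averaging; the column names `ecoregion`, `month`, and `sst_c` are assumptions here, so check the real headers before use.

```python
import csv
from collections import defaultdict

def seasonal_means(rows):
    """Mean SST per (ecoregion, month).

    `rows` is any iterable of mappings with 'ecoregion', 'month', and
    'sst_c' keys -- e.g. a csv.DictReader over one of the .csv files.
    (These column names are illustrative, not confirmed by the source.)
    """
    sums = defaultdict(lambda: [0.0, 0])
    for row in rows:
        key = (row["ecoregion"], int(row["month"]))
        sums[key][0] += float(row["sst_c"])
        sums[key][1] += 1
    return {key: total / n for key, (total, n) in sums.items()}

# Usage against a file (path is a placeholder):
# with open("arctic_sst.csv", newline="") as f:
#     means = seasonal_means(csv.DictReader(f))
```

Streaming the rows through `csv.DictReader` keeps memory flat even for a 29-year, 4-km-resolution table.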
Screening Assessment Report and Atlas with Geospatial Data
This Navajo Nation AUM Screening Assessment Report and the accompanying Atlas with Geospatial Data documents NAUM project data collection and screening results for all known AUMs on the Navajo Nation.
NHDPlus is a geospatial, hydrologic framework dataset intended for use by geospatial analysts and modelers to support water resources applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2014 CFR
2014-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
FRS Geospatial Return File Format
The Geospatial Return File Format describes the format that must be used to submit latitude and longitude coordinates for use in Envirofacts mapping applications. These coordinates are stored in the Geospatial Reference Tables.
Recent Advances in Geospatial Visualization with the New Google Earth
NASA Astrophysics Data System (ADS)
Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.
2017-12-01
Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
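Multi-resolution map tile pyramids like those mentioned above are usually indexed with the standard Web Mercator "XYZ" scheme, where each zoom level z is a 2^z by 2^z grid of tiles. Whether the new KML extension uses exactly this scheme is an assumption; the arithmetic below is the widely used slippy-map convention.

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Web-Mercator 'slippy map' (XYZ) tile indices for a point.

    Standard convention: tile (0, 0) is the north-west corner, and
    zoom level z has 2**z tiles along each axis.
    """
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

print(lonlat_to_tile(0.0, 0.0, 1))  # (1, 1)
```

Tile pyramids produced by pipelines such as Google Earth Engine exports typically name files by these (zoom, x, y) indices, which is what makes a generic KML extension for tile pyramids feasible.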
The Geoinformatica free and open source software stack
NASA Astrophysics Data System (ADS)
Jolma, A.
2012-04-01
The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components: GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language with a very large set of FOSS modules for a wide range of purposes, and it can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three components mentioned above. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster data, vector data, and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research targeted at custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and Perl modules.
Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools can be defined, used, and studied as extensions with specific features. Linking with external libraries is possible using Perl's foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
NASA Astrophysics Data System (ADS)
Yang, Z.; Han, W.; di, L.
2010-12-01
The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production and transportation planning, environmental health research, and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery, by online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users), or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization and browsing, no geospatial query capability, and no online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open-standard geospatial information science technology and OGC specifications and standards, and re-uses functions and algorithms from GeoBrain technology (developed at George Mason University). This system provides capabilities for online geospatial crop information access, query, and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing, and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications.
This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization; facilitates the use of crop geospatial information; and enables online exploration of US cropland without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus supports low-carbon Agro-geoinformation dissemination for decision support.
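A client of the kind of standards-based geospatial web service described above typically issues OGC WMS GetMap requests. The following sketch builds such a request URL using the standard WMS 1.3.0 parameters; the endpoint and layer name are placeholders, since the abstract does not give the actual service address.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build a standard OGC WMS 1.3.0 GetMap request URL.

    `bbox` is (min1, min2, max1, max2) in the order required by the
    chosen CRS (WMS 1.3.0 follows the CRS's own axis order).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical usage; "https://example.org/wms" and "cdl_2010" are
# placeholders, not the real CDL service or layer name.
url = wms_getmap_url("https://example.org/wms", "cdl_2010",
                     (35.0, -90.0, 36.0, -89.0))
```

Because the parameters are fixed by the OGC specification, any WMS-capable client (desktop GIS, browser map library, or Google Earth) can consume the same service without custom integration code.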
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, from limited processing and storage capacity to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards, and prototype code. It provides a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform, because multiple users across the globe, with only an internet connection and a browser, can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times and accessible from everywhere; it is scalable and works in a distributed computing environment; it creates a real-time multiuser collaboration platform; its programming-language code and components are interoperable; and it is flexible in incorporating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users.
The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and on testing the scalability and availability of the services.
NASA Astrophysics Data System (ADS)
Khalil, Zahid
2016-07-01
Deciding where to site a project by weighing many different parameters is difficult. Geographic information systems (GIS) combined with Multi-Criteria Analysis (MCA) can ease such decisions, and this technology has proved efficient and adequate for acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. GIS software was used to create all the spatial parameters for the analysis. The derived parameters are slope, drainage density, rainfall, land use/land cover, soil groups, Curve Number (CN), and runoff index, at a spatial resolution of 30 m. The data used to derive these layers include the 30 m resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP), and soil data from the Harmonized World Soil Database (HWSD). The land use/land cover map is derived from Landsat 8 using supervised classification. Slope, the drainage network, and watersheds are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate surface runoff from rainfall; prior to this, an SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, together with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method, also known as the Analytical Hierarchy Process (AHP), is used as the MCA technique for assigning weights to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in a GIS environment to produce suitable sites for the dams. The resultant layer is then classified into four classes: best suitable, suitable, moderate, and less suitable. This study demonstrates a contribution to decision making about suitable-site analysis for small dams using geospatial data with a minimal amount of ground data.
These suitability maps can help water resource management organizations determine feasible rainwater harvesting (RWH) structures.
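Two of the computations named above can be sketched compactly: deriving AHP weights from a pairwise comparison matrix (column-normalize, then average each row) and SCS curve-number runoff. The 3x3 comparison matrix below is an invented, perfectly consistent example, not the study's actual judgements.

```python
# Sketch of AHP weight derivation and SCS-CN runoff, under the standard
# formulations; the pairwise matrix (slope vs rainfall vs runoff index)
# is an invented illustration.

def ahp_weights(matrix):
    """Column-normalize the pairwise comparison matrix, then average rows."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

def scs_runoff_mm(p_mm, cn):
    """SCS curve-number direct runoff (mm) for rainfall depth p_mm."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

pairwise = [[1, 2, 4],
            [0.5, 1, 2],
            [0.25, 0.5, 1]]
print([round(w, 3) for w in ahp_weights(pairwise)])  # → [0.571, 0.286, 0.143]
print(round(scs_runoff_mm(50.0, 85), 1))             # → 19.6
```

In the study these weights would feed the weighted overlay, and the runoff would be computed per cell of the SCS-CN grid.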
VegScape: U.S. Crop Condition Monitoring Service
NASA Astrophysics Data System (ADS)
Mueller, R.; Yang, Z.; Di, L.
2013-12-01
Since 1995, the US Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and for identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early- and late-season crops. With growing emphasis on extreme weather events and with food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new portal, named VegScape, was launched at the start of the 2013 growing season. VegScape delivers interactive vegetation indices through web mapping services: users can explore, query, and disseminate current crop conditions on an interactive map. Vegetation indices such as the Normalized Difference Vegetation Index (NDVI) and the Vegetation Condition Index (VCI), along with mean, median, and ratio comparisons to prior years, can be constructed for analytical purposes and on-demand crop statistics. MODIS data at 250 meter (roughly 15 acre) resolution, with a thirteen-year history, provide improved spatial and temporal resolution and deliver detailed, timely (i.e., daily) crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape also delivers a cultivated cropland mask and the most recent Cropland Data Layer (CDL) product to delimit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be exported directly to Google Earth for web mashups or delivered via web mapping services for use in other applications.
VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. (Figure: VegScape ratio-to-median NDVI.)
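The indices VegScape serves can be sketched as follows; the reflectance values are synthetic stand-ins for MODIS red/NIR bands, and the multi-year minima, maxima, and medians are invented.

```python
import numpy as np

# Standard index formulas as used in vegetation condition monitoring;
# all input values are synthetic illustrations.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    return (nir - red) / (nir + red)

def vci(ndvi_now, ndvi_min, ndvi_max):
    """Vegetation Condition Index: current NDVI scaled 0-100 against the
    multi-year min/max for the same pixel and period."""
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

red = np.array([0.08, 0.12])            # synthetic red reflectance
nir = np.array([0.40, 0.30])            # synthetic NIR reflectance
current = ndvi(nir, red)
print(np.round(current, 3))

# Ratio comparison to prior years, as in VegScape's ratio-to-median map:
median_ndvi = np.array([0.55, 0.50])    # invented multi-year medians
print(np.round(current / median_ndvi, 2))
```

A ratio above 1 flags pixels greener than their historical median, below 1 pixels under stress.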
Geospatial Information System Capability Maturity Models
DOT National Transportation Integrated Search
2017-06-01
To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...
Geospatial Data Sciences | Energy Analysis | NREL
, demographics, and the earth's physical geography to provide the foundation for energy analysis and decision-making. Geospatial Analysis: Our geographic information system
Conference on Geospatial Approaches to Cancer Control and Population Sciences
The purpose of this conference is to bring together a community of researchers across the cancer control continuum using geospatial tools, models and approaches to address cancer prevention and control.
Staff - Michael D. Hendricks | Alaska Division of Geological & Geophysical Surveys
NHDPlus (National Hydrography Dataset Plus)
NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.
Incorporating Geographic Information Science in the BSc Environ-mental Science Program in Botswana
NASA Astrophysics Data System (ADS)
Akinyemi, Felicia O.
2018-05-01
Critical human capacity in Geographic Information Science (GISc) is developed at the Botswana International University of Science and Technology, a specialized research university. Strategies employed include GISc courses offered each semester to students from various programs, the conduct of field-based projects, enrolment in online courses, geospatial initiatives with external partners, and final-year research projects utilizing geospatial technologies. A review is made of the GISc courses embedded in the Bachelor of Science Environmental Science program. GISc courses are incorporated in three Bachelor degree programs as distinct courses, and geospatial technologies are employed in several other courses. Student research projects apply GIS and remote sensing methods to environmental and geological themes. The overarching goals are to equip students in various disciplines to utilize geospatial technologies and to enhance their spatial thinking and reasoning skills.
A flexible geospatial sensor observation service for diverse sensor data based on Web service
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min
Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes the development of a service-oriented, multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services: the Catalogue Service for the Web (CSW), the Transactional Web Feature Service (WFS-T), and the Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the OGC Sensor Web Enablement (SWE) standards is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology, can be easily deployed in any Java servlet container, and is automatically exposed for discovery using the Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
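As a concrete illustration of how a client might address such a service, the sketch below builds an OGC SOS GetObservation request using the SOS 2.0 KVP binding. The endpoint URL, offering name, and observed-property name are hypothetical placeholders (the paper itself follows the earlier "core"/"transaction" specifications, so the exact binding may differ).

```python
from urllib.parse import urlencode

# Placeholder endpoint; a real deployment would expose its own SOS URL.
SOS_ENDPOINT = "https://example.org/sos"

def get_observation_url(offering, observed_property):
    """Build an SOS 2.0 KVP GetObservation request URL."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,               # hypothetical offering id
        "observedProperty": observed_property,
    }
    return SOS_ENDPOINT + "?" + urlencode(params)

url = get_observation_url("water_gauge_1", "water_height")
print(url)
```

A client would issue an HTTP GET on this URL and parse the returned O&M observation document.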
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-06-01
Information has long been a key factor for military organizations. In a military context, the success of joint and combined operations depends on accurate information and knowledge flow concerning the operational theatre: provision of resources, evolution of the environment, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical integration of geospatial information is critical for the decision cycle. Information and knowledge management are fundamental to clarifying an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data, collected by human or electronic sensors, to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatially and temporally referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the fog of war, and provides knowledge supremacy. This paper presents the analysis performed after applying a questionnaire and conducting interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure, as well as the requirements for a future software system development.
Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals
NASA Astrophysics Data System (ADS)
Zamyadi, A.; Pouliot, J.; Bédard, Y.
2013-09-01
Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR, or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differs greatly from one geo-portal to another, and even for similar 3D resources within the same geo-portal. The inventory considered 971 data resources related to elevation. 51% of them were from three geo-portals running at the Canadian federal and municipal levels whose metadata did not consider 3D models under any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on knowledge and current practices in 3D modeling and in 3D data acquisition and management, a set of metadata is proposed to increase suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes, grouped into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format.
The proposed metadata-set is compared with the Canadian Geospatial Data Infrastructure (CGDI) metadata, an implementation of the North American Profile of ISO 19115. The comparison analyzes the two metadata sets against three simulated scenarios about discovering needed 3D geospatial datasets. Considering specific metadata about 3D geospatial models, the proposed metadata-set has six additional classes covering geometric dimension, level of detail, geometric modeling, topology, and appearance information. In addition, classes on data acquisition, preparation, and modeling, and on physical availability, have been specialized for 3D geospatial models.
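To make the idea of a 3D-specific metadata-set concrete, the sketch below models a record whose fields paraphrase the additional classes named above (geometric dimension, level of detail, geometric modelling, topology, appearance). The field names and example values are invented illustrations, not the paper's actual schema.

```python
from dataclasses import dataclass, asdict

# Illustrative record only; the paper's real metadata-set defines 21 classes
# across three packages, of which these fields echo the 3D-specific ones.

@dataclass
class Metadata3D:
    geometric_dimension: str   # e.g. "2.5D" or "3D"
    level_of_detail: str       # e.g. a CityGML-style "LoD2"
    geometric_modelling: str   # e.g. "B-rep", "TIN", "voxel"
    topology: bool             # whether topological relations are stored
    appearance: str            # e.g. "textured", "per-face colour"

record = Metadata3D("3D", "LoD2", "B-rep", True, "textured")
print(asdict(record))
```

A geo-portal exposing such fields would let users filter on, say, level of detail before downloading a model, which is exactly the discovery gap the inventory identified.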
Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU
NASA Astrophysics Data System (ADS)
Vlahovic, G.; Malhotra, R.
2009-12-01
This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that can serve as a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation's first state-supported public liberal arts college for African Americans. In the most recent annual ranking of America's best black colleges by the US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C.V. Dixon, Geography in Historically Black Colleges/Universities in the Southeast, in The Role of the South in Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to positively affect the talent and diversity of the geospatial discipline in the future. Successful creation of research and internship pathways for NCCU students therefore has national implications, because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts are described, including research and internship projects with Fugro EarthData Inc., the Center for Remote Sensing and Mapping Science at the University of Georgia, the Center for Earthquake Research and Information at the University of Memphis, and the City of Durham. The authors also outline the requirements and recent successes of the ASPRS Provisional Certification Program, developed and pioneered as a collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer review and the certification exam.
At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and Service (GRITS) Center housed in the Department of Environmental, Earth and Geospatial Sciences. The GRITS center was established in 2006 with funding from the National Science Foundation to promote the learning and application of geospatial technologies. Since then GRITS has been a hub for Geographical Information Science (GIS) curriculum development, faculty and professional GIS workshops, grant writing and outreach efforts. The Center also serves as a contact point for partnerships with other universities, national organizations and businesses in the geospatial arena - and as a result, opens doors to the professional world for our graduate and undergraduate students.
Geospatial methods provide timely and comprehensive urban forest information
Kathleen T. Ward; Gary R. Johnson
2007-01-01
Urban forests are unique and highly valued resources. However, trees in urban forests are often under greater stress than those in rural or undeveloped areas due to soil compaction, restricted growing spaces, high temperatures, and exposure to air and water pollution. In addition, conditions change more quickly in urban as opposed to rural and undeveloped settings....
High performance geospatial and climate data visualization using GeoJS
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Beezley, J. D.
2015-12-01
GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that lets users harness the fast performance of the WebGL and Canvas 2D APIs, together with sophisticated Scalable Vector Graphics (SVG) features, in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, integrating them into other GIS libraries means the geoinformatics visualizations must be constructed manually and separately, or the code must be mixed in an unintuitive way. We developed GeoJS with the following motivations:
• To create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics
• To develop an extensible library that can combine data from multiple sources and render them using multiple backends
• To build a library that works well with existing scientific visualization tools such as VTK
We have successfully deployed GeoJS-based applications in multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives.
Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking and selecting, as well as clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.
NASA Astrophysics Data System (ADS)
Williams, N. A.; Morris, J. N.; Simms, M. L.; Metoyer, S.
2007-12-01
The Advancing Geospatial Skills in Science and Social Sciences (AGSSS) program, funded by NSF, provides middle and high school teacher-partners with access to graduate student scientists for classroom collaboration and curriculum adaptation to incorporate and advance skills in spatial thinking. AGSSS Fellows aid in the delivery of geospatially-enhanced activities utilizing technology such as geographic information systems, remote sensing, and virtual globes. The partnership also provides advanced professional development for both participating teachers and fellows. The AGSSS program is mutually beneficial to all parties involved. This successful collaboration of scientists, teachers, and students results in greater understanding and enthusiasm for the use of spatial thinking strategies and geospatial technologies. In addition, the partnership produces measurable improvements in student efficacy and attitudes toward processes of spatial thinking. The teacher partner training and classroom resources provided by AGSSS will continue the integration of geospatial activities into the curriculum after the project concludes. Time and resources are the main costs in implementing this partnership. Graduate fellows invest considerable time and energy, outside of academic responsibilities, to develop materials for the classroom. Fellows are required to be available during K-12 school hours, which necessitates forethought in scheduling other graduate duties. However, the benefits far outweigh the costs. Graduate fellows gain experience in working in classrooms. In exchange, students gain exposure to working scientists and their research. This affords graduate fellows the opportunity to hone their communication skills, and specifically allows them to address the issue of translating technical information for a novice audience. Teacher-partners and students benefit by having scientific expertise readily available. 
In sum, these experiences change teacher and student perceptions of science and scientists. Evidence of these changes is provided through external evaluation and results obtained from several assessment tools. The program also utilizes an internal evaluator to monitor participants' thoughts and opinions on the previous years' collaboration. Additionally, graduate fellows maintain a reflective journal to provide insight into experiences occurring both in class and among peers. Finally, student surveys administered at the start and end of the academic year assess changes in student attitudes and self-perception of spatial thinking skills.
Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Viswambharan, V.; Doshi, A.
2017-12-01
Scientific multidimensional data are generated from a variety of sources and platforms, mostly earth observation and modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS for scientific data management, processing, and analysis, and for creating information products from large volumes of data using image server technology, are becoming widely used in earth science and other domains. We will discuss the big data challenges faced by the geospatial science community and how we have addressed them in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications.
We will also share how on-the-fly processing with raster functions can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.
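The on-the-fly idea above can be sketched generically: processing steps are held as a chain of functions and applied only when a tile is requested, rather than being persisted to disk up front. The class and function names below are illustrative and are not the ArcGIS raster function API.

```python
import numpy as np

# Generic sketch of lazy, chained raster processing; the chain is applied
# per requested tile instead of materializing intermediate rasters.

class RasterFunctionChain:
    def __init__(self):
        self.steps = []

    def add(self, fn):
        """Append a per-tile processing step; returns self for chaining."""
        self.steps.append(fn)
        return self

    def render(self, tile):
        """Apply every step on the fly to the requested tile."""
        for fn in self.steps:
            tile = fn(tile)
        return tile

chain = (RasterFunctionChain()
         .add(lambda t: np.clip(t, 0, None))                          # mask negatives
         .add(lambda t: (t - t.min()) / (t.max() - t.min() + 1e-9)))  # min-max stretch

tile = np.array([[-1.0, 0.0], [2.0, 4.0]])
print(chain.render(tile))
```

Persisting a product, in this sketch, would simply mean evaluating the same chain over all tiles and writing the results out, which is the role raster analytics plays at enterprise scale.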
NASA Astrophysics Data System (ADS)
Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.
2017-12-01
Assessing the vulnerability of agricultural systems to climate variability and change is vital to securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). The research drew on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate risk and change. The collection emerged from the development of regional vulnerability assessments for agricultural climate risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies that enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focused on biophysical systems. We found that no single, uniform framework for vulnerability and climate risk could adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through its diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county-level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework.
We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and perspectives. In addition, we found that spatial analysis supports informed adaptation, within and outside the SW United States. The persistence and adaptive capacity of agriculture in the water-limited Southwest serves as an instructive example and may offer solutions to reduce future climate risk.
Environmental Remote Sensing Analysis Using Open Source Virtual Earths and Public Domain Imagery
NASA Astrophysics Data System (ADS)
Pilant, A. N.; Worthy, L. D.
2008-12-01
Human activities increasingly impact natural environments. Globally, many ecosystems are stressed to unhealthy limits, leading to loss of valuable ecosystem services: economic, ecological, and intrinsic. Virtual earths (virtual globes) such as NASA World Wind, ossimPlanet, ArcGIS Explorer, Google Earth, and Microsoft Virtual Earth are geospatial data integration tools that can aid our efforts to understand and protect the environment. Virtual earths provide unprecedented desktop views of our planet, not only to professional scientists but also to citizen scientists, students, environmental stewards, decision makers, urban developers, and planners. Anyone with a broadband internet connection can explore the planet virtually, due in large part to freely available open source software and public domain imagery. This has at least two important potential benefits. First, individuals can study the planet from the visually intuitive perspective of the synoptic aerial view, promoting environmental awareness and stewardship. Second, it opens up the possibility of harnessing the in situ knowledge and observations of citizen scientists familiar with landscape conditions in their locales. Could this collective knowledge be harnessed (crowdsourcing) to validate and quality-assure land cover and other maps? In this presentation we present examples using public domain imagery and two open source virtual earths to highlight some of the functionality currently available. ossimPlanet is used to view aerial data from the USDA Geospatial Data Gateway. NASA World Wind is used to extract georeferenced high-resolution USGS urban area orthoimagery. ArcGIS Explorer is used to demonstrate an example of image analysis using web processing services. The research presented here was conducted under the Environmental Feature Finder project of the Environmental Protection Agency's Advanced Monitoring Initiative.
Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy. Use of trade names does not imply endorsement by the authors or the EPA.
NASA Astrophysics Data System (ADS)
Harris, M. S.; Sautter, L.
2017-12-01
The College of Charleston's BEnthic Acoustic Mapping and Survey (BEAMS) Program has just completed its 10th year of operation and has proven remarkably effective at activating and maintaining undergraduate student interest in conducting research with sophisticated software, state-of-the-art instrumentation, enormous datasets, and significant experiential time. BEAMS students conduct research as part of a minimum three-course sequence of marine geology content, marine geospatial software, and seafloor research courses. Over 140 students have completed the program; 56% of the graduates remain active in the marine geospatial workforce or academic arenas, and 48% of those students are female. As undergraduates, students not only conduct independent research projects but present their work at national conferences each year. Additionally, over 90% of all "BEAMers" have been provided a 2-3 day at-sea experience on a dedicated BEAMS Program multibeam survey research cruise, and many students also volunteer as survey technicians aboard NOAA research vessels. Critical partnerships have developed with private industry to provide numerous collaborative opportunities and an employment/employer pipeline, as well as software and hardware at many fiscal levels. Ongoing collaboration with the Marine Institute of Ireland and the National and Kapodistrian University of Athens has also provided valuable field opportunities and collaborative experiences. This talk will summarize the program while highlighting some of the key areas and topics investigated by students, including detailed geomorphologic studies of continental margins, submarine canyons, tectonic features, and seamounts. Students also work with NOAA investigators to aid in the characterization of fish and deep coral habitats, and with BOEM researchers to study offshore wind-field suitability and submerged cultural landscapes.
Our sister program at the University of Washington will also be discussed, as will developing relationships with our international and private industry partners.
NASA Astrophysics Data System (ADS)
Millard, Keiran
2015-04-01
This paper looks at the current experiences of geospatial users and suppliers, and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance, and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing, it creates a new generation of challenges. This paper considers two examples where these issues have been examined and looks at the challenges and possible solutions from a data user and a data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second is the EU EMODnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR, and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet with the right infrastructure in place they could yield a myriad of disparate, readily usable information products. IQmulus is researching how to deliver this infrastructure technically, but financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; managing it, however, allows users to be delivered highly bespoke products that meet their budget and technical needs. The EMODnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the programme, a series of 'check points' has been initiated to examine how useful these services and other public data services actually are for solving real-world problems.
One key finding is that users have been confused by the fact that data from the same source often appear across multiple platforms, and that current ISO 19115-style metadata catalogues do not help the vast majority of users make data selections. To address this, we have looked at approaches used in the leisure industry, which has established tools, supported by peer-to-peer rating, that help users select the best hotel for their needs from the metadata available. We have investigated how this approach can support users in selecting the best data to meet their needs.
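The hotel-style rating approach can be sketched as a simple aggregation: candidate datasets are ranked by mean peer rating alongside their catalogue metadata. The dataset names and scores below are invented illustrations.

```python
from statistics import mean

# Invented peer ratings (1-5) for two hypothetical candidate datasets.
ratings = {
    "bathymetry_portal_A": [4, 5, 4],
    "bathymetry_portal_B": [3, 3, 4, 2],
}

def rank_datasets(ratings):
    """Return dataset ids ordered by descending mean peer rating."""
    return sorted(ratings, key=lambda d: mean(ratings[d]), reverse=True)

print(rank_datasets(ratings))  # → ['bathymetry_portal_A', 'bathymetry_portal_B']
```

In practice such a score would sit next to, not replace, the formal metadata record, giving non-expert users a quick signal about which of several overlapping copies of a dataset others found most useful.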