The National Geospatial Technical Operations Center
Craun, Kari J.; Constance, Eric W.; Donnelly, Jay; Newell, Mark R.
2009-01-01
The United States Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) provides geospatial technical expertise in support of the National Geospatial Program in its development of The National Map, National Atlas of the United States, and implementation of key components of the National Spatial Data Infrastructure (NSDI).
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a partnership among the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions and technical documents in collaboration with Intergraph, using the GeoMedia WebMap WFS toolkit with software and technical support from Intergraph; the documents are disseminated through the partners. The ESRI ArcIMS WFS connector is used under GMU's campus license of ESRI products. Tested solutions are integrated with operational framework data service systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data.
The project: 1) develops and deploys operational OGC WFS 1.1 interfaces at CEOSR for registering with the FGDC/GOS Portal and responding to Web ``POST'' requests for transportation framework data as listed in Table 1; 2) builds a WFS service that returns data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Web Feature Service is demonstrated to be more efficient for sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
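A WFS GetFeature request of the kind such a service answers can be sketched in a few lines. The feature type name "Roads" below is a hypothetical placeholder, not one of the project's actual framework themes:

```python
# Minimal sketch of an OGC WFS 1.1.0 GetFeature POST body. The type name
# "Roads" is a hypothetical placeholder for one of the framework themes.
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"

def build_getfeature(type_name: str, max_features: int = 100) -> str:
    """Return a WFS 1.1.0 GetFeature request as an XML string."""
    ET.register_namespace("wfs", WFS)
    req = ET.Element(f"{{{WFS}}}GetFeature", {
        "service": "WFS",
        "version": "1.1.0",
        "maxFeatures": str(max_features),
    })
    ET.SubElement(req, f"{{{WFS}}}Query", {"typeName": type_name})
    return ET.tostring(req, encoding="unicode")

print(build_getfeature("Roads"))
```

The resulting XML would be sent as the body of an HTTP POST to the service endpoint, which responds with GML features.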
Carswell, William J.
2011-01-01
increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and public availability of the acquired data. The Emergency Operations Office provides requirements to The National Map and, during emergencies and natural disasters, rapidly disseminates information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.
Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doris, E.; Lopez, A.; Beckley, D.
2013-02-01
This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, with the goal of allowing Tribes to prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.
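Geospatial technical-potential methodologies of this kind generally reduce, per parcel of developable land, to available area × power density × capacity factor. A minimal sketch, with numbers invented for illustration rather than taken from the report:

```python
# Illustrative technical-potential estimate for one parcel of land.
# All numbers are made up; the actual methodology applies land-use
# exclusions and technology-specific assumptions per parcel.
HOURS_PER_YEAR = 8760

def technical_potential_gwh(area_km2: float,
                            power_density_mw_per_km2: float,
                            capacity_factor: float) -> float:
    """Annual generation potential in GWh for a developable area."""
    capacity_mw = area_km2 * power_density_mw_per_km2
    return capacity_mw * capacity_factor * HOURS_PER_YEAR / 1000.0

# e.g. 50 km^2 of developable land, 30 MW/km^2 of PV, 20% capacity factor
print(round(technical_potential_gwh(50, 30, 0.20), 1))  # → 2628.0 GWh/year
```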
Technical Manual for the Geospatial Stream Flow Model (GeoSFM)
Asante, Kwabena O.; Artan, Guleid A.; Pervez, Md Shahriar; Bandaragoda, Christina; Verdin, James P.
2008-01-01
The monitoring of wide-area hydrologic events requires the use of geospatial and time series data available in near-real time. These data sets must be manipulated into information products that speak to the location and magnitude of the event. Scientists at the U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center have implemented a hydrologic modeling system which consists of an operational data processing system and the Geospatial Stream Flow Model (GeoSFM). The data processing system generates daily forcing evapotranspiration and precipitation data from various remotely sensed and ground-based data sources. To allow for rapid implementation in data scarce environments, widely available terrain, soil, and land cover data sets are used for model setup and initial parameter estimation. GeoSFM performs geospatial preprocessing and postprocessing tasks as well as hydrologic modeling tasks within an ArcView GIS environment. The integration of GIS routines and time series processing routines is achieved seamlessly through the use of dynamically linked libraries (DLLs) embedded within Avenue scripts. GeoSFM is run operationally to identify and map wide-area streamflow anomalies. Daily model results including daily streamflow and soil water maps are disseminated through Internet map servers, flood hazard bulletins and other media.
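The wide-area streamflow anomaly mapping described above amounts to comparing each day's simulated flow against its historical statistics. A minimal sketch of one such anomaly measure, a standardized z-score, with hypothetical flow values (this is not GeoSFM's actual anomaly algorithm):

```python
# Sketch of a standardized streamflow anomaly: how far today's flow sits
# from the historical mean, in units of historical standard deviation.
# The flow values are hypothetical.
from statistics import mean, stdev

def flow_anomaly(today: float, historical: list) -> float:
    """Standardized anomaly: (today - mean) / sample std of history."""
    mu = mean(historical)
    sigma = stdev(historical)
    return (today - mu) / sigma

# Hypothetical daily flows (m^3/s) for the same calendar day in past years
history = [120.0, 135.0, 110.0, 150.0, 125.0]
print(round(flow_anomaly(210.0, history), 2))  # strongly above normal
```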
US Topo - A new national map series
Moore, Laurence R.
2011-01-01
In the second half of the 20th century, the foundation of the U.S. Geological Survey's national map series was the handcrafted 7.5-minute topographic map. Times change, budgets get squeezed and currency expectations become ever more challenging. The USGS's Larry Moore, who oversees data production operations at two National Geospatial Technical Operations Centers, provides an introduction to the new US Topo quadrangle maps.
Economic assessment of the use value of geospatial information
Bernknopf, Richard L.; Shapiro, Carl D.
2015-01-01
Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
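The VOI definition above can be made concrete in a few lines: compare the expected net benefit of the best decision with and without the information. The states, probabilities, and payoffs below are invented for illustration, loosely echoing the agrochemical-regulation example:

```python
# Value of information (VOI) as defined above: the difference in expected
# net benefits between deciding with and without the geospatial data.
# States, probabilities, and payoffs ($M, present value) are invented.
payoffs = {
    "regulate":   {"contaminated": 40.0, "clean": -10.0},
    "do_nothing": {"contaminated": -60.0, "clean": 20.0},
}
prior = {"contaminated": 0.3, "clean": 0.7}

def expected(action: str, probs: dict) -> float:
    return sum(probs[s] * payoffs[action][s] for s in probs)

# Without information: choose the action with the best prior expected value
without_info = max(expected(a, prior) for a in payoffs)

# With (perfect) information: in each state, take that state's best action
with_info = sum(prior[s] * max(payoffs[a][s] for a in payoffs)
                for s in prior)

voi = with_info - without_info
print(round(voi, 2))  # → 21.0
```

Imperfect information lowers this value; the retrospective and prospective models in the paper account for that and for discounting over time.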
Geospatial Data Science Publications | Geospatial Data Science | NREL
Featured publications: U.S. Renewable Energy Technical Potentials: A GIS-Based Analysis, NREL Technical Report (2012); 2016 Offshore Wind Energy Resource Assessment for the...; ...Temperature Geothermal Resources of the United States, 40th GRC Annual Meeting (2016).
Research areas: geospatial data analysis using parallel processing; high performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; rapid, web-based renewable resource analysis.
US EPA GLOBAL POSITIONING SYSTEMS - TECHNICAL IMPLEMENTATION GUIDANCE
The U.S. EPA Geospatial Quality Council (GQC) was formed in 1998 to provide Quality Assurance guidance for the development, use, and products of geospatial activities and research. The long-term goals of the GQC are expressed in a living document, currently the EPA Geospatial Qua...
Possibilities of Use of UAVS for Technical Inspection of Buildings and Constructions
NASA Astrophysics Data System (ADS)
Banaszek, Anna; Banaszek, Sebastian; Cellmer, Anna
2017-12-01
In recent years, Unmanned Aerial Vehicles (UAVs) have been used in various sectors of the economy, owing to the development of new technologies for acquiring and processing geospatial data. The paper presents the results of experiments using a UAV, equipped with a high-resolution digital camera, for visual assessment of the technical condition of a building roof and for the inventory of energy infrastructure and its surroundings. The usefulness of digital images obtained from the UAV platform is demonstrated with concrete examples. UAVs offer new opportunities in technical inspection owing to the detail and accuracy of the data, low operating costs, and fast data acquisition.
Geospatial Analysis Using Remote Sensing Images: Case Studies of Zonguldak Test Field
NASA Astrophysics Data System (ADS)
Bayık, Çağlar; Topan, Hüseyin; Özendi, Mustafa; Oruç, Murat; Cam, Ali; Abdikan, Saygın
2016-06-01
Inclined topographies are among the most challenging problems for geospatial analysis of airborne and spaceborne imagery, while flat areas can be misleading about real performance. For this reason, researchers generally require a study area that includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for performance investigation of remote sensing systems because it contains different land use types such as dense forest, river, sea, and urban area; different structures such as open-pit mining operations and a thermal power plant; and mountainous terrain. In this paper, we review more than 120 proceedings papers and journal articles about geospatial analyses performed on the Zonguldak test field and its surroundings. Geospatial analyses performed with imagery include elimination of systematic geometric errors, 2D/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis, and deformation monitoring. These applications use many optical satellite images, i.e. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, RADARSAT-1, WorldView-1-2, as well as radar data, i.e. JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR and SRTM. The studies were performed by the Departments of Geomatics Engineering at Bülent Ecevit University, İstanbul Technical University, and Yıldız Technical University, and by the Institute of Photogrammetry and GeoInformation at Leibniz University Hannover, with financial support from TÜBİTAK (Turkey), the universities, ESA, Airbus DS, ERSDAC (Japan), and Jülich Research Centre (Germany).
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprise, and the public. It is vital to keep geospatial data fresh, accurate, and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation, and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures, and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network, and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial, and municipal; the geospatial data comes from these nodes, and the different datasets are heterogeneous. Based on analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. We then develop the technical procedure and propose a method, based on these principles, for processing different categories of features such as roads, railways, boundaries, rivers, settlements, and buildings. A case study in Jiangsu province demonstrated the applicability of the principles, procedure, and method of multi-source geospatial data integration.
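The fusion principles enumerated above (location precision, up-to-date state, attribute values, ...) can be sketched as a simple per-feature merge rule. The field names and records below are hypothetical, not NGISP's actual schema:

```python
# Sketch of merging two candidate records for the same road feature under
# the fusion principles above: take geometry from the source with better
# location precision, attributes from the more up-to-date source.
# Field names and values are hypothetical.
def merge_features(a: dict, b: dict) -> dict:
    by_precision = min(a, b, key=lambda f: f["location_error_m"])
    by_currency = max(a, b, key=lambda f: f["survey_year"])
    return {
        "geometry": by_precision["geometry"],
        "name": by_currency["name"],
        "survey_year": by_currency["survey_year"],
        "location_error_m": by_precision["location_error_m"],
    }

national = {"geometry": "LINESTRING(0 0, 1 1)", "name": "G42 Expressway",
            "survey_year": 2010, "location_error_m": 2.5}
provincial = {"geometry": "LINESTRING(0 0, 1.01 1.0)", "name": "G42",
              "survey_year": 2013, "location_error_m": 7.0}

merged = merge_features(national, provincial)
print(merged["name"], merged["survey_year"], merged["location_error_m"])
```

A real conflation pipeline would first match candidate features spatially and also reconcile geometric representation and spatial relationships, per principles 2 and 5.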
Renewable Energy Technical Potential | Geospatial Data Science | NREL
Renewable energy potential is assessed at levels ranging from resource potential to technical, economic, and market potential, as shown in the graphic with key assumptions.
Geospatial Data Science Research | Geospatial Data Science | NREL
NREL uses a geographic information system (GIS) to manipulate, manage, and analyze multidisciplinary geographic and energy data, producing maps, tools, applications, and visualizations that determine which energy technologies are viable solutions across the globe.
Plug and Play web-based visualization of mobile air monitoring data (Abstract)
EPA's Real-Time Geospatial (RETIGO) Data Viewer is a web-based tool that reduces the technical barrier to visualizing and understanding geospatial air-data time series collected using wearable, bicycle-mounted, or vehicle-mounted air sensors. The RETIGO tool, with anticipated...
Geospatial Information Best Practices
2012-01-01
26 Spring - 2012 By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint The fact that geospatial information can be codified and...Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial
EPA Geospatial Quality Council Promoting Quality Assurance in the Geospatial Community
After establishing a foundation for the EPA National Geospatial Program, the EPA Geospatial Quality Council (GQC) is, in part, focusing on improving administrative efficiency in the geospatial community. To realize this goal, the GQC is developing Standard Operating Procedures (S...
GeoSearch: A lightweight broking middleware for geospatial resources discovery
NASA Astrophysics Data System (ADS)
Gui, Z.; Yang, C.; Liu, K.; Xia, J.
2012-12-01
With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources among massive and heterogeneous resources. The past decades' developments have put many service components into operation to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery remains a big challenge for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats, and GUI styles to organize, present, and publish metadata, and it is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from federated catalogues. These time-consuming processes impose network and storage burdens and data redundancy, as well as the overhead of maintaining data consistency. 3) Heterogeneous semantics hamper data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution, but integrating them with existing services is challenging owing to expandability limitations of the service frameworks and metadata templates. 4) The capabilities to help users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore, and analyze search results.
Furthermore, the presentation of value-added information (such as service quality and user feedback), which conveys important decision-support information, is missing. To address these issues, we prototyped a distributed search engine, GeoSearch, based on a brokering middleware framework to search, integrate, and visualize heterogeneous geospatial resources. Specifically: 1) A lightweight discovery broker conducts distributed search, retrieving metadata records for geospatial resources and additional information from dispersed services (portals and catalogues) and other systems on the fly. 2) A quality monitoring and evaluation broker (i.e., QoS Checker) is integrated to provide quality information for geospatial web services. 3) Semantic-assisted search and relevance evaluation functions are implemented by loosely interoperating with an ESIP Testbed component. 4) Sophisticated information and data visualization functionalities and tools are assembled to improve the user experience and assist resource selection.
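Going beyond exact keyword matching, relevance evaluation of the kind described above can be approximated with term-vector cosine similarity over metadata text. A minimal sketch with toy metadata records, not GeoSearch's actual scoring:

```python
# Toy relevance ranking of metadata records against a query using
# bag-of-words cosine similarity. Records and query are invented.
from collections import Counter
from math import sqrt

def cosine(a: str, b: str) -> float:
    """Cosine similarity between two texts as bag-of-words vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = (sqrt(sum(c * c for c in va.values()))
            * sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

records = {
    "rec1": "global precipitation coverage service daily",
    "rec2": "land cover classification map service",
}
query = "daily precipitation data"
ranked = sorted(records, key=lambda r: cosine(query, records[r]),
                reverse=True)
print(ranked[0])  # → rec1
```

Semantic reasoning and similarity evaluation extend this idea so that, for example, "rainfall" can match "precipitation" even with zero term overlap.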
Geospatial Analysis | Energy Analysis | NREL
products and tools. [Image: a triangle divided into sections labeled Market, Economic, Technical, and...] Featured study: U.S. Renewable Energy Technical Potentials: A GIS-Based Analysis summarizes the achievable energy generation, or technical potential, of specific renewable energy technologies given system...
Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from Ogc
NASA Astrophysics Data System (ADS)
Percivall, George; Simonis, Ingo
2016-06-01
The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support these policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for collaboration among geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2, and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research, and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.
An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight
NASA Astrophysics Data System (ADS)
Petters, J.; Coleman, S.; Andrea, O.
2016-12-01
A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate, and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges such as developing uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities possess localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.
International boundary experiences by the United Nations
NASA Astrophysics Data System (ADS)
Kagawa, A.
2013-12-01
Over the last few decades, the United Nations (UN) has been approached by the Security Council and Member States on international boundary issues. The United Nations regards the adequate delimitation and demarcation of international boundaries as a very important element for the maintenance of peace and security in fragile post-conflict situations and for the establishment of friendly relationships and cross-border cooperation between States. This paper presents the main principles and framework the United Nations applies to support international boundary delimitation and demarcation activities. The United Nations engages in international boundary issues following the principles of impartiality and neutrality and its role as mediator. Since international boundary issues are multi-faceted, a range of expertise is required, and the United Nations Secretariat is well placed to provide diverse expertise through its multiple departments: legal, political, technical, administrative, and logistical expertise is mobilised in different ways to support Member States depending on their specific needs. This presentation highlights some of the international boundary projects in which the United Nations Cartographic Section has provided technical support, as each international boundary issue requires specific focus and attention, whether in preparation, delimitation, demarcation, or management. Increasingly, the United Nations is leveraging geospatial technology to facilitate the boundary delimitation and demarcation process between Member States. Case studies ranging from Iraq - Kuwait, Israel - Lebanon (Blue Line), Eritrea - Ethiopia, Cyprus (Green Line), and Cameroon - Nigeria to Sudan - South Sudan illustrate how geospatial technology is increasingly used to carry out this support.
In applying a range of geospatial solutions, good practices from preceding projects have been carried forward, but challenges and limitations have also been faced. These challenges should be seen as an opportunity to improve geospatial technology solutions in future international boundary projects. This presentation will also share the aspiration of the United Nations Cartographic Section to become a facilitator in geospatial technical aspects of international boundary issues as it develops its geospatial institutional knowledge base and expertise. The presentation concludes by emphasizing the need for more collaboration between the different actors dealing with geospatial technology on borderland issues in order to meet the main goal of the United Nations - to live and work together as "We the Peoples of the United Nations".
Hydrogen Maps | Geospatial Data Science | NREL
This collection of U.S. hydrogen maps provides examples of hydrogen resource and potential analyses, drawing on NREL technical reports including Hydrogen Potential from Renewable Energy Resources and Hydrogen Potential from Coal, Natural Gas, Nuclear, and Hydro Resources.
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed; they will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project aims to solve several major research and engineering tasks: 1) structural analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis, and the "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world.
Additionally, exporting of data processing results through WMS and WFS services will be used to provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies
NASA Astrophysics Data System (ADS)
Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.
2007-12-01
The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts to provide unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observation Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near real time. OGC SWE proposes a revolutionary concept for web-connected and web-controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration/fusion among heterogeneous sources, data filtering, or support for workflows and domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open-source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) are integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth).
This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.
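The ESB-based integration described above hinges on transforming each agency's native sensor payload into a common observation format before dissemination. A minimal sketch of such a transform step; the agency field names and the target schema are hypothetical, loosely in the spirit of WaterML/ODM:

```python
# Sketch of the transform an ESB route would apply: normalize two
# agencies' native river-stage readings into one common observation
# record. Field names and the target schema are hypothetical.
def from_agency_a(msg: dict) -> dict:
    return {"site": msg["station_id"], "variable": "stage",
            "value_m": msg["stage_ft"] * 0.3048, "time": msg["ts"]}

def from_agency_b(msg: dict) -> dict:
    return {"site": msg["gauge"], "variable": "stage",
            "value_m": msg["level_m"], "time": msg["time_utc"]}

TRANSFORMS = {"A": from_agency_a, "B": from_agency_b}

def normalize(source: str, msg: dict) -> dict:
    """Route a raw message to its per-agency transform."""
    return TRANSFORMS[source](msg)

obs = normalize("A", {"station_id": "IL-042", "stage_ft": 10.0,
                      "ts": "2007-08-01T12:00Z"})
print(obs["site"], round(obs["value_m"], 3))  # → IL-042 3.048
```

In a Mule-style deployment each transform would sit on a bus route, so adding an agency means registering one more transform rather than rewriting consumers.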
NASA Astrophysics Data System (ADS)
Bunds, M. P.
2017-12-01
Point clouds are a powerful data source in the geosciences, and the emergence of structure-from-motion (SfM) photogrammetric techniques has allowed them to be generated quickly and inexpensively. Consequently, applications of them as well as methods to generate, manipulate, and analyze them warrant inclusion in undergraduate curriculum. In a new course called Geospatial Field Methods at Utah Valley University, students in small groups use SfM to generate a point cloud from imagery collected with a small unmanned aerial system (sUAS) and use it as a primary data source for a research project. Before creating their point clouds, students develop needed technical skills in laboratory and class activities. The students then apply the skills to construct the point clouds, and the research projects and point cloud construction serve as a central theme for the class. Intended student outcomes for the class include: technical skills related to acquiring, processing, and analyzing geospatial data; improved ability to carry out a research project; and increased knowledge related to their specific project. To construct the point clouds, students first plan their field work by outlining the field site, identifying locations for ground control points (GCPs), and loading them onto a handheld GPS for use in the field. They also estimate sUAS flight elevation, speed, and the flight path grid spacing required to produce a point cloud with the resolution required for their project goals. In the field, the students place the GCPs using handheld GPS, and survey the GCP locations using post-processed-kinematic (PPK) or real-time-kinematic (RTK) methods. The students pilot the sUAS and operate its camera according to the parameters that they estimated in planning their field work. Data processing includes obtaining accurate locations for the PPK/RTK base station and GCPs, and SfM processing with Agisoft Photoscan. 
The resulting point clouds are rasterized into digital surface models, assessed for accuracy, and analyzed in Geographic Information System software. Student projects have included mapping and analyzing landslide morphology, fault scarps, and earthquake ground surface rupture. Students have praised the geospatial skills they learn, although helping them stay on schedule to finish their projects remains a challenge.
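The flight-planning step described above (estimating sUAS flight elevation for a target point cloud resolution) can be sketched with the standard ground-sample-distance relation. The camera parameters in the example are hypothetical, not taken from the course:

```python
def ground_sample_distance(sensor_width_mm, focal_length_mm, image_width_px, altitude_m):
    """Ground sample distance (m/pixel) of a nadir image at a given flight altitude."""
    return sensor_width_mm * altitude_m / (focal_length_mm * image_width_px)

def altitude_for_gsd(sensor_width_mm, focal_length_mm, image_width_px, target_gsd_m):
    """Flight altitude (m) required to achieve a target ground sample distance."""
    return target_gsd_m * focal_length_mm * image_width_px / sensor_width_mm

# Hypothetical small-format camera: 13.2 mm sensor width, 8.8 mm lens, 5472 px image width.
gsd = ground_sample_distance(13.2, 8.8, 5472, altitude_m=100.0)  # ~0.027 m/pixel
```

In practice the target resolution drives the altitude, and the flight path grid spacing then follows from the image footprint and the desired overlap.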
Geo-spatial Service and Application based on National E-government Network Platform and Cloud
NASA Astrophysics Data System (ADS)
Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.
2014-04-01
With the acceleration of China's informatization, the government has taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government. Meanwhile, as a service model built on pooled resources, cloud computing connects large resource pools to provide a variety of IT services and has matured into a well-established technical pattern backed by extensive studies and practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified but physically dispersed fundamental database, and developed a national integrated information database system supporting major e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable, and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.
NASA Astrophysics Data System (ADS)
Lawhead, Pamela B.; Aten, Michelle L.
2003-04-01
The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by foremost industry experts in the remote sensing and GIS industries. Virtual classrooms equipped with advanced instruction, computation, communication, course evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focuses on recruiting additional industry experts to develop the technical content of the courseware and then using state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips, and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media (Internet, CD-ROM, DVD, and compressed video), which translates into anywhere, anytime delivery of GeoSpatial Information Technology education.
Infrastructure for the Geospatial Web
NASA Astrophysics Data System (ADS)
Lake, Ron; Farley, Jim
Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology based on the ebRIM specification from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures that support asynchronous behavior. A sample geospatial processing workflow, from the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) initiative, is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
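The submit-and-poll pattern described here — the client initiates a request, receives an acknowledgement immediately, and checks status later instead of blocking on the response — can be sketched with a toy in-process job service. The class and status names below are illustrative only, not an OGC or WS-BPEL API:

```python
import threading
import time
import uuid

class AsyncJobService:
    """Toy service illustrating asynchronous request handling:
    submit() returns a job id immediately while the work runs in
    a background thread; the client polls status() at its leisure."""

    def __init__(self):
        self._jobs = {}

    def submit(self, func, *args):
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"status": "Accepted", "result": None}

        def run():
            self._jobs[job_id]["status"] = "Running"
            self._jobs[job_id]["result"] = func(*args)
            self._jobs[job_id]["status"] = "Succeeded"

        threading.Thread(target=run).start()
        return job_id

    def status(self, job_id):
        return self._jobs[job_id]["status"]

    def result(self, job_id):
        return self._jobs[job_id]["result"]

# Client side: submit the long-running request, then poll instead of blocking.
svc = AsyncJobService()
jid = svc.submit(lambda a, b: a + b, 2, 3)
while svc.status(jid) != "Succeeded":
    time.sleep(0.01)  # the client is free to do other work here
```

A real workflow engine would add persistence, failure states, and callbacks, but the control flow — acknowledge, work, poll — is the same asynchrony pattern the paper examines.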
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-06-01
Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on accurate information and knowledge flow concerning the operational theatre: provision of resources, environment evolution, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical geospatial information integration is critical for the decision cycle. Information and knowledge management are fundamental to clarifying an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data collected by human or electronic sensors to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatial, time-referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war, and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure as well as the requirements for a future software system development.
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental Systems Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model: errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
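The Latin Hypercube approach REPTool describes — stratified sampling of input error, propagation through the model, and an uncertainty estimate on the output — can be illustrated with a minimal sketch. The toy one-cell model y = a·x and the uniform error bounds below are assumptions for illustration; REPTool's actual packages and interfaces are not reproduced here:

```python
import numpy as np

def latin_hypercube(n, rng):
    """One stratified draw per equal-probability stratum of U(0,1), shuffled."""
    return rng.permutation((np.arange(n) + rng.random(n)) / n)

def propagate_error(x0, x_err, a0, a_err, n=2000, seed=42):
    """Toy LHS error propagation through y = a * x for a single raster cell.
    The input value x and coefficient a each get an independent uniform
    error of +/- x_err and +/- a_err around their nominal values."""
    rng = np.random.default_rng(seed)
    x = x0 + (2.0 * latin_hypercube(n, rng) - 1.0) * x_err
    a = a0 + (2.0 * latin_hypercube(n, rng) - 1.0) * a_err
    y = a * x
    return y.mean(), y.std()

mean, std = propagate_error(x0=10.0, x_err=1.0, a0=2.0, a_err=0.1)
# mean is close to a0 * x0 = 20; std summarizes the propagated uncertainty
```

The stratification is what distinguishes Latin Hypercube Sampling from plain Monte Carlo: each stratum of the input distribution is sampled exactly once, so far fewer runs are needed for a stable variance estimate.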
Commercial observation satellites: broadening the sources of geospatial data
NASA Astrophysics Data System (ADS)
Baker, John C.; O'Connell, Kevin M.; Venzor, Jose A.
2002-09-01
Commercial observation satellites promise to broaden substantially the sources of imagery data available to potential users of geospatial data and related information products. We examine the new trend toward private firms acquiring and operating high-resolution imagery satellites. These commercial observation satellites build on the substantial experience in Earth observation operations provided by government-owned imaging satellites for civilian and military purposes. However, commercial satellites will require governments and companies to reconcile public and private interests in allowing broad public access to high-resolution satellite imagery data without creating national security risks or placing the private firms at a disadvantage compared with other providers of geospatial data.
National Hydropower Plant Dataset, Version 2 (FY18Q3)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samu, Nicole; Kao, Shih-Chieh; O'Connor, Patrick
The National Hydropower Plant Dataset, Version 2 (FY18Q3) is a geospatially comprehensive point-level dataset containing locations and key characteristics of U.S. hydropower plants that are currently either in the hydropower development pipeline (pre-operational), operational, withdrawn, or retired. These data are provided in GIS and tabular formats with corresponding metadata for each. In addition, we include access to download 2 versions of the National Hydropower Map, which was produced with these data (i.e., Map 1 displays the geospatial distribution and characteristics of all operational hydropower plants; Map 2 displays the geospatial distribution and characteristics of operational hydropower plants with pumped storage and mixed capabilities only). This dataset is a subset of ORNL's Existing Hydropower Assets data series, updated quarterly as part of ORNL's National Hydropower Asset Assessment Program.
NASA Astrophysics Data System (ADS)
Oeldenberger, S.; Khaled, K. B.
2012-07-01
The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals, and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g., the ITC, and international organizations, such as the ISPRS, the ICA, and the OGC. Through close cooperation with African organizations, such as the AARSE, the RCMRD, and RECTAS, the network and exchange of ideas, experiences, technology, and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture.
Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses offered, to build geospatial capacity and ensure that AGSI graduates will have the appropriate skill-sets required for employment in the geospatial industry. Geospatial management courses and high-level seminars will be targeted at decision makers in government and industry to build awareness for geospatial applications and benefits. Online education will be developed together with international partners and internet-based activities will involve the public to familiarize them with geospatial data and its many applications.
NASA Astrophysics Data System (ADS)
Hasyim, Fuad; Subagio, Habib; Darmawan, Mulyanto
2016-06-01
The preparation of spatial planning documents requires accurate basic and thematic geospatial information. Recently these issues have become important because spatial planning maps are an integral attachment of the draft regional regulation on spatial planning (PERDA). The geospatial information needed for the preparation of spatial planning maps can be divided into two major groups: (i) basic geospatial information (IGD), consisting of Indonesian topographic maps (RBI), coastal and marine environmental maps (LPI), and the geodetic control network, and (ii) thematic geospatial information (IGT). Currently, most local governments in Indonesia have not finished their draft regulations on spatial planning due to several constraints, including technical aspects. Constraints in mapping for spatial planning include the availability of large-scale basic geospatial information, the availability of mapping guidelines, and human resources. The ideal conditions to be achieved for spatial planning maps are: (i) the availability of updated geospatial information in accordance with the scale needed for spatial planning maps, (ii) mapping guidelines for spatial planning to support local governments in completing their PERDA, and (iii) capacity building of local government human resources to complete spatial planning maps. The OMP strategies formulated to achieve these conditions are: (i) accelerating IGD at scales of 1:50,000, 1:25,000, and 1:5,000, (ii) accelerating the mapping and integration of thematic geospatial information (IGT) through stocktaking of availability and mapping guidelines, (iii) developing mapping guidelines and disseminating spatial utilization information, and (iv) training human resources in mapping technology.
Bim and Gis: when Parametric Modeling Meets Geospatial Data
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.
2017-12-01
Geospatial data have a crucial role in several projects related to infrastructure and land management. GIS software can perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS can appear to be complementary solutions, although research is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper also demonstrates that some traditional operations carried out with GIS software are available in parametric modelling software for BIM as well, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
Spatial Thinking: Precept for Understanding Operational Environments
2016-06-10
A Computer Movie Simulating Urban Growth in the Detroit Region," 236. 29 U.S. National Research Council, Learning to Think Spatially: GIS as a... children and spatial language, the article focuses on the use of geospatial information systems (GIS) as a support mechanism for learning to think... Keywords: Thinking, Cognition, Learning, Geospatial, Operating Environment, Space Perception
NASA Astrophysics Data System (ADS)
Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of a virtual research environment (VRE) general architecture for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that implements statistical processing functionality, thus providing analysis of large datasets, with results available for visualization and for export to files in standard formats (XML, binary, etc.). Several cartographical web services have been developed in the system prototype to provide capabilities for working with raster and vector geospatial data via OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
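A request to a cartographical web service of the kind these nodes serve can be illustrated by constructing a WMS GetMap URL. The parameter names follow the OGC WMS 1.3.0 standard; the endpoint and layer name are placeholders, not the system described here:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL for a single layer.
    bbox is (minx, miny, maxx, maxy) in the axis order of the CRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", "air_temperature",
                     (50.0, 60.0, 55.0, 70.0), 800, 600)
```

The same key-value-pair style applies to the WFS and WPS requests mentioned in the abstract; only the REQUEST and service-specific parameters change.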
Improved satellite and geospatial tools for pipeline operator decision support systems.
DOT National Transportation Integrated Search
2017-01-06
Under Cooperative Agreement No. OASRTRS-14-H-CAL, California Polytechnic State University San Luis Obispo (Cal Poly), partnered with C-CORE, MDA, PRCI, and Electricore to design and develop improved satellite and geospatial tools for pipeline operato...
Development of a National Digital Geospatial Data Framework
,
1995-01-01
This proposal of a data framework to organize and enhance the activities of the geospatial data community to meet needs for basic themes of data was developed in response to a request in Executive Order 12906, Coordinating Geographic Data Acquisition and Access: The National Spatial Data Infrastructure (U.S. Executive Office of the President, 1994). The request stated: in consultation with State, local, and tribal governments and within 9 months of the date of this order, the FGDC shall submit a plan and schedule to OMB [U.S. Office of Management and Budget] for completing the initial implementation of a national digital geospatial data framework ("framework") by January 2000 and for establishing a process of ongoing data maintenance. The framework shall include geospatial data that are significant, in the determination of the FGDC, to a broad variety of users within any geographic area or nationwide. At a minimum, the plan shall address how the initial transportation, hydrology, and boundary elements of the framework might be completed by January 1998 in order to support the decennial census of 2000. The proposal was developed by representatives of local, regional, State, and Federal agencies under the auspices of the Federal Geographic Data Committee (FGDC). The individuals are listed in the appendix of this report. This Framework Working Group identified the purpose and goals for the framework; identified incentives for participation; defined the information content; developed preliminary technical, operational, and business contexts; specified the institutional roles needed; and developed a strategy for a phased implementation of the framework. Members of the working group presented the concepts of the framework for discussion at several national and regional public meetings. The draft of the report also was provided for public, written review. These discussions and reviews were the source of many improvements to the report. The FGDC approved the report for submission to the Office of Management and Budget on March 31, 1995.
NASA Astrophysics Data System (ADS)
Ibarra, Mercedes; Gherboudj, Imen; Al Rished, Abdulaziz; Ghedira, Hosni
2017-06-01
Given the ambitious plans to increase electricity production from renewable resources and the natural resources of the Kingdom of Saudi Arabia (KSA), solar energy stands as a technology with great development potential in this country. In this work, the suitability of the territory is assessed through a geospatial analysis, using a PTC performance model to account for the technical potential. As a result, a land suitability map is presented in which the northwest area of the country is identified as having the most highly suitable area.
A View from Above Without Leaving the Ground
NASA Technical Reports Server (NTRS)
2004-01-01
In order to deliver accurate geospatial data and imagery to the remote sensing community, NASA is constantly developing new image-processing algorithms while refining existing ones for technical improvement. For 8 years, the NASA Regional Applications Center at Florida International University has served as a test bed for implementing and validating many of these algorithms, helping the Space Program to fulfill its strategic and educational goals in the area of remote sensing. The algorithms in return have helped the NASA Regional Applications Center develop comprehensive semantic database systems for data management, as well as new tools for disseminating geospatial information via the Internet.
77 FR 1454 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-10
..., statistical analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and... following disciplines: Demography, economics, geography, psychology, statistics, survey methodology, social... technical expertise in such areas as demography, economics, geography, psychology, statistics, survey...
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.
1-Meter Digital Elevation Model specification
Arundel, Samantha T.; Archuleta, Christy-Ann M.; Phillips, Lori A.; Roche, Brittany L.; Constance, Eric W.
2015-10-21
In January 2015, the U.S. Geological Survey National Geospatial Technical Operations Center began producing the 1-Meter Digital Elevation Model data product. This new product was developed to provide high resolution bare-earth digital elevation models from light detection and ranging (lidar) elevation data and other elevation data collected over the conterminous United States (lower 48 States), Hawaii, and potentially Alaska and the U.S. territories. The 1-Meter Digital Elevation Model consists of hydroflattened, topographic bare-earth raster digital elevation models, with a 1-meter x 1-meter cell size, and is available in 10,000-meter x 10,000-meter square blocks with a 6-meter overlap. This report details the specifications required for the production of the 1-Meter Digital Elevation Model.
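The block layout in the specification — 10,000-meter square blocks with a 6-meter overlap — can be sketched as a function from block indices to padded extents. The grid origin and indexing scheme below are assumptions for illustration, not part of the USGS specification:

```python
def block_extent(col, row, origin_x=0.0, origin_y=0.0,
                 block_size=10_000.0, overlap=6.0):
    """Extent (xmin, ymin, xmax, ymax), in projected meters, of one
    10,000 m x 10,000 m DEM block padded by the 6 m overlap on every side."""
    xmin = origin_x + col * block_size - overlap
    ymin = origin_y + row * block_size - overlap
    return (xmin, ymin,
            xmin + block_size + 2.0 * overlap,
            ymin + block_size + 2.0 * overlap)
```

With a 1-meter cell size, each delivered block therefore spans 10,012 x 10,012 cells, and the 6-meter overlap lets adjacent blocks be mosaicked without edge gaps.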
77 FR 32978 - Call for Nominations to the National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
... the Department and the FGDC on policy and management issues related to the effective operation of... through the Federal Geographic Data Committee related to management of Federal geospatial programs, development of the National Spatial Data Infrastructure, and the implementation of Office of Management and...
Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)
NASA Astrophysics Data System (ADS)
Nebert, D. D.; Huang, Q.; Yang, C.
2013-12-01
Geoscience in the 21st century faces the challenges of big data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information, and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. To achieve this objective, multiple projects are nominated each year by federal agencies from among existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information.
This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into cloud, and cost estimation for cloud operations are covered. Finally, some lessons learned from the GeoCloud project are discussed as reference for geoscientists to consider in the adoption of cloud computing.
United States Geological Survey (USGS) Natural Hazards Response
Lamb, Rynn M.; Jones, Brenda K.
2012-01-01
The primary goal of U.S. Geological Survey (USGS) Natural Hazards Response is to ensure that the disaster response community has access to timely, accurate, and relevant geospatial products, imagery, and services during and after an emergency event. To accomplish this goal, products and services provided by the National Geospatial Program (NGP) and Land Remote Sensing (LRS) Program serve as a geospatial framework for mapping activities of the emergency response community. Post-event imagery and analysis can provide important and timely information about the extent and severity of an event. USGS Natural Hazards Response will also support the coordination of remotely sensed data acquisitions, image distribution, and authoritative geospatial information production as required for use in disaster preparedness, response, and recovery operations.
Users Manual for the Geospatial Stream Flow Model (GeoSFM)
Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James
2008-01-01
The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.
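The streamflow-anomaly mapping GeoSFM performs can be illustrated, in simplified form, by a standardized anomaly of the current flow against a historical record. This is a generic formulation for illustration, not GeoSFM's internal algorithm:

```python
def standardized_anomaly(historical, current):
    """Standardized anomaly: how many (population) standard deviations
    the current flow sits above or below the historical mean."""
    n = len(historical)
    mean = sum(historical) / n
    std = (sum((q - mean) ** 2 for q in historical) / n) ** 0.5
    return (current - mean) / std

# Hypothetical historical daily flows (m^3/s) for one river reach.
record = [120.0, 95.0, 110.0, 130.0, 105.0, 100.0]
```

Computing this value per river reach and classifying it (e.g., above or below some threshold) yields exactly the kind of wide-area anomaly map the manual describes.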
A geospatial assessment of mini/small hydropower potential in Sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Korkovelos, Alexandros; Mentis, Dimitrios; Hussain Siyal, Shahid; Arderne, Christopher; Beck, Hylke; de Roo, Ad; Howells, Mark
2017-04-01
Sub-Saharan Africa has been the epicenter of ongoing global dialogues around energy poverty and justifiably so. More than half of the world's unserved population lives there. At the same time, a big part of the continent is privileged with plentiful renewable energy resources. Hydropower is one of them and to a large extent it remains untapped. This study focuses on the technical assessment of small-scale hydropower (0.01-10 MW) in Sub-Saharan Africa. The underlying methodology was based on open source geospatial datasets, whose combination allowed a consistent evaluation of 712,615 km of river network spanning over 44 countries. Environmental, topological and social constraints were included in the form of geospatial restrictions to help preserve the natural wealth and promote sustainable development. The results revealed that small-scale hydropower could cover 8.5-12.5% of the estimated electricity demand in 2030, thus making it a viable option to support electrification efforts in the region.
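A technical assessment of this kind rests on the standard hydropower equation, P = ρ·g·Q·H·η. The sketch below applies it to a single site; the 60% efficiency default and the example discharge/head values are illustrative assumptions, not figures from the study, though the 0.01–10 MW band matches the small-scale range it uses.

```python
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydropower_potential_mw(discharge_m3s, head_m, efficiency=0.6):
    """Technical hydropower potential of a site in MW.

    P = rho * g * Q * H * eta, converted from watts to megawatts.
    The 60% overall efficiency is an illustrative assumption.
    """
    return RHO * G * discharge_m3s * head_m * efficiency / 1e6

def is_small_scale(p_mw, low=0.01, high=10.0):
    """Check a site against the 0.01-10 MW small-scale band used in the study."""
    return low <= p_mw <= high

# 5 m^3/s discharge over a 20 m head yields roughly 0.59 MW.
p = hydropower_potential_mw(discharge_m3s=5.0, head_m=20.0)
```

Applied along a river network, a calculation like this is evaluated at each candidate reach, after the geospatial restrictions the study describes have screened out protected or unsuitable locations.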
The Hazards Data Distribution System update
Jones, Brenda K.; Lamb, Rynn M.
2010-01-01
After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/
The use of National Technical Means (NTM) data and advanced geospatial technologies has an important role in supporting the mission of the Environmental Protection Agency (EPA). EPA's responsibilities have grown beyond pollution compliance monitoring and enforcement to include t...
Community Mapping: Putting the Pieces Together
ERIC Educational Resources Information Center
Andersen, Doug
2011-01-01
Many geography and technology educators have been attracted to geospatial technologies because of the potential to help students develop and demonstrate spatial and higher order thinking skills, only to be frustrated with implementation at the school level. Even when teachers have overcome the technical hurdles of hardware, software, and data,…
Situational Awareness Geospatial Application (iSAGA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sher, Benjamin
Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based, or custom data source and display it geospatially; allows user-friendly conduct of spatial analysis using custom-developed tools; searches complex Geographic Information System (GIS) databases and accesses high resolution imagery. iSAGA has application at the federal, state and local levels of emergency response, consequence management, law enforcement, emergency operations and other decision makers as a tool to provide complete, visual, situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.
NASA Astrophysics Data System (ADS)
Millard, Keiran
2015-04-01
This paper looks at current experiences of geospatial users and geospatial suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet they could realise a myriad of disparate, and readily useable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data is composed of multiple overlapping coverages; however, managing it allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' has been initiated to examine how useful these services and other public data services actually are for solving real-world problems.
One key finding is that users have been confused by the fact that data from the same source often appears across multiple platforms and that current ISO 19115-style metadata catalogues do not help the vast majority of users in making data selections. To address this, we have looked at approaches used in the leisure industry. This industry has established tools to support users in selecting the best hotel for their needs from the metadata available, supported by peer-to-peer rating. We have looked into how this approach can support users in selecting the best data to meet their needs.
Smith, Dianna; Mathur, Rohini; Robson, John; Greenhalgh, Trisha
2012-01-01
Objective To explore the feasibility of producing small-area geospatial maps of chronic disease risk for use by clinical commissioning groups and public health teams. Study design Cross-sectional geospatial analysis using routinely collected general practitioner electronic record data. Sample and setting Tower Hamlets, an inner-city district of London, UK, characterised by high socioeconomic and ethnic diversity and high prevalence of non-communicable diseases. Methods The authors used type 2 diabetes as an example. The data set was drawn from electronic general practice records on all non-diabetic individuals aged 25–79 years in the district (n=163 275). The authors used a validated instrument, QDScore, to calculate 10-year risk of developing type 2 diabetes. Using specialist mapping software (ArcGIS), the authors produced visualisations of how these data varied by lower and middle super output area across the district. The authors enhanced these maps with information on examples of locality-based social determinants of health (population density, fast food outlets and green spaces). Data were piloted as three types of geospatial map (basic, heat and ring). The authors noted practical, technical and information governance challenges involved in producing the maps. Results Usable data were obtained on 96.2% of all records. One in 11 adults in our cohort was at ‘high risk’ of developing type 2 diabetes with a 20% or more 10-year risk. Small-area geospatial mapping illustrated ‘hot spots’ where up to 17.3% of all adults were at high risk of developing type 2 diabetes. Ring maps allowed visualisation of high risk for type 2 diabetes by locality alongside putative social determinants in the same locality. The task of downloading, cleaning and mapping data from electronic general practice records posed some technical challenges, and judgement was required to group data at an appropriate geographical level. 
Information governance issues were time consuming and required local and national consultation and agreement. Conclusions Producing small-area geospatial maps of diabetes risk calculated from general practice electronic record data across a district-wide population was feasible but not straightforward. Geovisualisation of epidemiological and environmental data, made possible by interdisciplinary links between public health clinicians and human geographers, allows presentation of findings in a way that is both accessible and engaging, hence potentially of value to commissioners and policymakers. Impact studies are needed of how maps of chronic disease risk might be used in public health and urban planning. PMID:22337817
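The hot-spot mapping described above amounts to aggregating individual risk scores to small areas and computing the share of adults above the high-risk cut-off. A minimal sketch, using the study's ≥20% ten-year-risk threshold; the area codes and records are hypothetical, and the real analysis ran on general practice data in ArcGIS.

```python
from collections import defaultdict

def high_risk_share_by_area(records, threshold=0.20):
    """Share of adults per small area whose 10-year type 2 diabetes
    risk meets the 'high risk' cut-off (>= 20%, as in the study).

    `records` is an iterable of (area_code, risk) pairs; the area codes
    stand in for lower/middle super output areas.
    """
    totals = defaultdict(int)
    high = defaultdict(int)
    for area, risk in records:
        totals[area] += 1
        if risk >= threshold:
            high[area] += 1
    return {area: high[area] / totals[area] for area in totals}

# Hypothetical mini-cohort: two adults in area E01, one in E02.
shares = high_risk_share_by_area([("E01", 0.25), ("E01", 0.05), ("E02", 0.30)])
```

Choropleth, heat, or ring maps then visualise these per-area shares alongside the social determinants layers.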
Sagl, Günther; Resch, Bernd; Blaschke, Thomas
2015-01-01
In this article we critically discuss the challenge of integrating contextual information, in particular spatiotemporal contextual information, with human and technical sensor information, which we approach from a geospatial perspective. We start by highlighting the significance of context in general and spatiotemporal context in particular and introduce a smart city model of interactions between humans, the environment, and technology, with context at the common interface. We then focus on both the intentional and the unintentional sensing capabilities of today’s technologies and discuss current technological trends that we consider have the ability to enrich human and technical geo-sensor information with contextual detail. The different types of sensors used to collect contextual information are analyzed and sorted into three groups on the basis of their names, frequently used related terms, and characteristic contextual parameters. These three groups, namely technical in situ sensors, technical remote sensors, and human sensors are analyzed and linked to three dimensions involved in sensing (data generation, geographic phenomena, and type of sensing). In contrast to other scientific publications, we found a large number of technologies and applications using in situ and mobile technical sensors within the context of smart cities, and surprisingly limited use of remote sensing approaches. In this article we further provide a critical discussion of possible impacts and influences of both technical and human sensing approaches on society, pointing out that a larger number of sensors, increased fusion of information, and the use of standardized data formats and interfaces will not necessarily result in any improvement in the quality of life of the citizens of a smart city. 
This article seeks to improve our understanding of technical and human geo-sensing capabilities, and to demonstrate that the use of such sensors can facilitate the integration of different types of contextual information, thus providing an additional, namely the geo-spatial perspective on the future development of smart cities. PMID:26184221
Intergraph video and images exploitation capabilities
NASA Astrophysics Data System (ADS)
Colla, Simone; Manesis, Charalampos
2013-08-01
The current paper focuses on the capture, fusion and process of aerial imagery in order to leverage full motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. Unmanned aerial vehicles (UAV) have provided critical real-time surveillance and operational support to military organizations, and are a key source of intelligence, particularly when integrated with other geospatial data. In the current workflow, at first, the UAV operators plan the flight by using a flight planning software. During the flight the UAV sends a live video stream directly on the field to be processed by Intergraph software, to generate and disseminate georeferenced images through a service-oriented architecture based on the ERDAS Apollo suite. The raw video-based data sources provide the most recent view of a situation and can augment other forms of geospatial intelligence - such as satellite imagery and aerial photos - to provide a richer, more detailed view of the area of interest. To effectively use video as a source of intelligence, however, the analyst needs to seamlessly fuse the video with these other types of intelligence, such as map features and annotations. Intergraph has developed an application that automatically generates mosaicked, georeferenced images and tags along the video route, which can then be seamlessly integrated with other forms of static data, such as aerial photos, satellite imagery, or geospatial layers and features. Consumers will finally have the ability to use a single, streamlined system to complete the entire geospatial information lifecycle: capturing geospatial data using sensor technology; processing vector, raster, terrain data into actionable information; managing, fusing, and sharing geospatial data and video together; and finally, rapidly and securely delivering integrated information products, ensuring individuals can make timely decisions.
Renewable Energy Economic Potential | Geospatial Data Science | NREL
Economic potential, one measure of renewable electricity potential, identifies where the cost of renewable generation is less than the revenue available. An accompanying illustration shows potential growing smaller at each level of analysis, from resource to technical to economic to market potential.
ERIC Educational Resources Information Center
Palmer, Mark H.
2012-01-01
The centering processes of geographic information system (GIS) development at the United States Bureau of Indian Affairs (BIA) was an extension of past cartographic encounters with American Indians through the central control of geospatial technologies, uneven development of geographic information resources, and extension of technically dependent…
Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing
NASA Astrophysics Data System (ADS)
Tang, Jingyin; Matyas, Corene J.
2018-02-01
Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
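The decoupled client-server pattern described here can be sketched as follows: the client serializes a geoprocessing call instead of executing it, and the server looks it up and runs it. All names and the JSON wire format below are illustrative assumptions, not the actual arc4nix API.

```python
import json

def make_remote_call(func_name, *args, **kwargs):
    """Client side: serialize a geoprocessing call for remote execution."""
    return json.dumps({"call": func_name, "args": list(args), "kwargs": kwargs})

def dispatch(payload, registry):
    """Server side: decode the request, look up the function, execute it."""
    msg = json.loads(payload)
    return registry[msg["call"]](*msg["args"], **msg["kwargs"])

# Stand-in for a server-side geoprocessing tool (e.g. an arcpy function).
registry = {"Buffer": lambda fc, dist: f"{fc}_buffer_{dist}"}

# The client never runs the tool locally; only the call description travels.
result = dispatch(make_remote_call("Buffer", "rivers.shp", 100), registry)
```

In the real library this indirection is what lets an unmodified arcpy-style script drive computation on a remote Windows server from a Linux cluster node.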
The Value of Information - Accounting for a New Geospatial Paradigm
NASA Astrophysics Data System (ADS)
Pearlman, J.; Coote, A. M.
2014-12-01
A new frontier in consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information and yet clearly for many of the largest corporations, such as Google and Facebook, it is their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so it's not at the top of the agenda for Government. However, it is a hugely important issue when valuing Geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of Geospatial data is widespread from companies in the transportation or construction sectors to individual planning for daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates. It needs to be maintained otherwise its functionality and value in use declines. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.
User's guide for mapIMG 3--Map image re-projection software package
Finn, Michael P.; Mattli, David M.
2012-01-01
Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path. Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI interface that was built using the Qt library for cross platform development. Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last used input and output folders, and for TIFF/GeoTIFF input images. Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added Resampling Methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML and was switched to a new naming convention. Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less than global datasets, UTM support, and a port of the whole codebase to Qt4.
An updated geospatial liquefaction model for global application
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.
2017-01-01
We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
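Models of this family take a logistic form: the probability of liquefaction is P = 1 / (1 + e^(-z)), with z a linear combination of the geospatial proxies. The sketch below uses the recommended set of predictors, but every coefficient value is a placeholder assumption, not the published fit.

```python
import math

def liquefaction_probability(features, coefficients, intercept):
    """Logistic geospatial liquefaction model: P = 1 / (1 + exp(-z)),
    where z = intercept + sum(coef_k * feature_k).

    The functional form follows the model class described above; the
    coefficient values used below are illustrative placeholders only.
    """
    z = intercept + sum(coefficients[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Placeholder coefficients for the recommended noncoastal predictor set.
coefs = {"ln_pgv": 0.3, "vs30": -0.005, "water_table_depth": -0.05,
         "dist_to_water": -0.01, "precipitation": 0.0005}

site = {"ln_pgv": 3.0, "vs30": 200.0, "water_table_depth": 2.0,
        "dist_to_water": 1.0, "precipitation": 1000.0}
p = liquefaction_probability(site, coefs, intercept=-1.0)
```

Evaluated over a grid of geospatial proxy layers, probabilities like this are what get scored against observed liquefaction with AUC and the Brier score.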
Public participation in GIS via mobile applications
NASA Astrophysics Data System (ADS)
Brovelli, Maria Antonia; Minghini, Marco; Zamboni, Giorgio
2016-04-01
Driven by the recent trends in the GIS domain including Volunteered Geographic Information, geo-crowdsourcing and citizen science, and fostered by the constant technological advances, collection and dissemination of geospatial information by ordinary people has become commonplace. However, applications involving user-generated geospatial content show dramatically diversified patterns in terms of incentive, type and level of participation, purpose of the activity, data/metadata provided and data quality. This study contributes to this heterogeneous context by investigating public participation in GIS within the field of mobile-based applications. Results not only show examples of how to technically build GIS applications enabling user collection and interaction with geospatial data, but they also draw conclusions about the methods and needs of public participation. We describe three projects with different scales and purposes in the context of urban monitoring and planning, and tourism valorisation. In each case, an open source architecture is used, allowing users to exploit their mobile devices to collect georeferenced information. This data is then made publicly available on specific Web viewers. Analysis of user involvement in these projects provides insights related to participation patterns which suggests some generalized conclusions.
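A citizen contribution in architectures like these is typically packaged as a georeferenced record, for example a GeoJSON Feature posted from the mobile client to the web viewer's backend. A minimal sketch; the field names and endpoint are illustrative assumptions, not the projects' actual schemas.

```python
import json

def make_observation(lon, lat, properties):
    """Package a citizen-contributed report as a GeoJSON Feature (RFC 7946):
    a Point geometry in lon/lat order plus free-form properties."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": properties,
    }

# Hypothetical urban-monitoring report near Milan.
feature = make_observation(9.19, 45.46, {"category": "pothole", "severity": 3})
payload = json.dumps(feature)  # ready to POST to a collection endpoint
```

Because GeoJSON is an open interchange format, the same record can be stored server-side and rendered directly by the public web viewers.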
Data for Renewable Energy Planning, Policy, and Investment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sarah L
Reliable, robust, and validated data are critical for informed planning, policy development, and investment in the clean energy sector. The Renewable Energy (RE) Explorer was developed to support data-driven renewable energy analysis that can inform key renewable energy decisions globally. This document presents the types of geospatial and other data at the core of renewable energy analysis and decision making. Individual data sets used to inform decisions vary in relation to spatial and temporal resolution, quality, and overall usefulness. From Data to Decisions, a complementary geospatial data and analysis decision guide, provides an in-depth view of these and other considerations to enable data-driven planning, policymaking, and investment. Data support a wide variety of renewable energy analyses and decisions, including technical and economic potential assessment, renewable energy zone analysis, grid integration, risk and resiliency identification, electrification, and distributed solar photovoltaic potential. This fact sheet provides information on the types of data that are important for renewable energy decision making using the RE Data Explorer or similar types of geospatial analysis tools.
NASA Technical Reports Server (NTRS)
Lyle, Stacey D.
2009-01-01
A software package has been developed that uses GPS signal structures to authenticate mobile devices into a network wirelessly and in real time, granting access to critical geospatial information only after determining that the rover(s) is/are within a set of boundaries or a specific area. The advantage lies in that the system only admits to the server those within designated geospatial boundaries or areas. The Geospatial Authentication software has two parts: Server and Client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is GUI Windows CE software (Mobile Graphical Software) that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.
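The geospatial gate at the heart of such a system reduces to a point-in-polygon test on the reported position. A minimal sketch using the standard ray-casting algorithm; the real system also validates the GPS signal structure itself, which this deliberately ignores.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is the reported GPS position inside the
    authorized boundary polygon? `polygon` is a list of (lon, lat)
    vertices; edges wrap around from the last vertex to the first."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from the point crosses.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical authorized area: a simple square boundary.
boundary = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
inside = point_in_polygon(5.0, 5.0, boundary)    # grant access
outside = point_in_polygon(15.0, 5.0, boundary)  # deny access
```

On the server side, a check like this would run before the VPN admits the client into the network.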
ERIC Educational Resources Information Center
National Council for Geographic Education (NJ1), 2006
2006-01-01
This report examines the outcomes of a workshop held at the National Science Foundation on August 15-16, 2005. Forty-six participants, representing academia, industry, government agencies, professional associations, and special projects met to: (1) discuss how geospatial technology training at two-year colleges can address workforce needs; and…
78 FR 39163 - Navigation and Navigable Waters; Technical, Organizational, and Conforming Amendments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-01
... publication in the Federal Register. III. Background and Purpose Each year, the printed edition of Title 33 of... Mapping Agency changed its name to the National Geospatial- Intelligence Agency. This rule removes the references to printed versions of the LNM in Sec. 72.01-10(c). In place of paragraph (c) is an updated link...
NASA Astrophysics Data System (ADS)
Une, Hiroshi; Nakano, Takayuki
2018-05-01
Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become more essential for disaster response activities. Advancements in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can have a more realistic understanding of the relationship between disaster and topography. Geospatial information technology can support proper preparation and emergency responses against disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing a more vital role at all stages of disaster risk management and responses. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly underscores the importance of utilizing geospatial information technology for disaster risk reduction. This presentation aims to report the recent practical applications of geospatial information technology for disaster risk management and responses.
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
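A WPS-compliant client invokes a shared model through a standard Execute request; in the key-value-pair form of WPS 1.0.0 this is just a parameterized URL. The sketch below builds such a request; the endpoint, process identifier, and input names are hypothetical stand-ins, not the paper's actual wetland model service.

```python
from urllib.parse import urlencode

def wps_execute_url(base_url, process_id, inputs):
    """Build a WPS 1.0.0 Execute request in key-value-pair (GET) form.

    `inputs` is a dict of process input names to values; they are joined
    into the DataInputs parameter as the WPS KVP encoding requires.
    """
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "DataInputs": data_inputs,
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical call to a shared wetland hydrology process.
url = wps_execute_url("http://example.org/wps", "WetlandHydroModel",
                      {"basin_id": "PPR-042", "year": 2008})
```

Because the request shape is fixed by the standard, any WPS-aware GIS tool can issue the same call without knowing how the model is implemented server-side.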
NASA Astrophysics Data System (ADS)
Santoro, E.
2017-05-01
The crisis management of a disaster, whether caused naturally or by human action, requires a thorough knowledge of the territory involved, with regard to both its terrain and its developed areas. Therefore, it is essential that the National Mapping and Cadastral Agencies (NMCAs) and all other public and scientific institutions responsible for the production of geospatial information closely co-operate in making their data in that field available. This crucial sharing of geographic information is a top-level priority, not only in a disaster emergency situation, but also for effective urban and environmental planning and Cultural Heritage protection and preservation. Geospatial data-sharing, responding to the needs of all institutions involved in disaster surveying operations, is fundamental, as a priority, to the task of avoiding loss of human lives. However, no less important is the acquisition, dissemination and use of this data, in addition to direct, "in-the-field" operations of specialists in geomatics, in order to preserve the Cultural Heritage located in the crisis area. It is in this context that an NMCA such as the Italian Military Geographic Institute (IGMI) plays a key role.
How NASA is Building a Petabyte Scale Geospatial Archive in the Cloud
NASA Technical Reports Server (NTRS)
Pilone, Dan; Quinn, Patrick; Jazayeri, Alireza; Baynes, Kathleen; Murphy, Kevin J.
2018-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) is working towards a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This free and open source system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk outlines the motivation for this work, presents the achievements and hurdles of the past 18 months, and charts a course for the future expansion of Cumulus. We explore not just the technical, but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud. The NASA EOSDIS archive is currently at nearly 30 PB and will grow to over 300 PB in the coming years. We presented progress on this effort at AWS re:Invent and the American Geophysical Union (AGU) Fall Meeting in 2017, and hope to have the opportunity to share with FOSS4G attendees information on the availability of the open-sourced software and how NASA intends to make its Earth observing geospatial data available for free to the public in the cloud.
New Geodetic Infrastructure for Australia: The NCRIS / AuScope Geospatial Component
NASA Astrophysics Data System (ADS)
Tregoning, P.; Watson, C. S.; Coleman, R.; Johnston, G.; Lovell, J.; Dickey, J.; Featherstone, W. E.; Rizos, C.; Higgins, M.; Priebbenow, R.
2009-12-01
In November 2006, the Australian Federal Government announced AU$15.8M in funding for geospatial research infrastructure through the National Collaborative Research Infrastructure Strategy (NCRIS). Funded within a broader capability area titled ‘Structure and Evolution of the Australian Continent’, NCRIS has provided a significant investment across Earth imaging, geochemistry, numerical simulation and modelling, the development of a virtual core library, and geospatial infrastructure. Known collectively as AuScope (www.auscope.org.au), this capability area has brought together Australia's leading Earth scientists to decide upon the most pressing scientific issues and infrastructure needs for studying Earth systems and their impact on the Australian continent. Importantly, the investment in geospatial infrastructure offers the opportunity to raise Australian geodetic science capability to the highest international level into the future. The geospatial component of AuScope builds on the AU$15.8M of direct funding through the NCRIS process with significant in-kind and co-investment from universities and State/Territory and Federal government departments. The infrastructure to be acquired includes an FG5 absolute gravimeter, three gPhone relative gravimeters, three 12.1 m radio telescopes for geodetic VLBI, a continent-wide network of continuously operating geodetic-quality GNSS receivers, a trial of a mobile SLR system, and access to updated cluster computing facilities. We present an overview of the AuScope geospatial capability, review the current status of the infrastructure procurement, and discuss some examples of the scientific research that will utilise the new geospatial infrastructure.
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
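The adaptation the guide alludes to — why raster data cannot simply reuse point transformations — can be illustrated with inverse mapping: each destination cell is mapped back to a source cell, avoiding the holes and overlaps that a naive forward point-by-point transform produces. This is a generic sketch of the technique, not MapIMG's actual implementation; any real projection (e.g. via GCTP) would supply the `inverse` function.

```python
def reproject_raster(src, inverse, dst_shape, nodata=0):
    """Nearest-neighbour raster reprojection by inverse mapping.

    `inverse` maps a destination (row, col) to fractional source
    (row, col) coordinates; cells falling outside the source extent
    receive the `nodata` value.
    """
    src_rows, src_cols = len(src), len(src[0])
    out = []
    for r in range(dst_shape[0]):
        row = []
        for c in range(dst_shape[1]):
            sr, sc = inverse(r, c)        # fractional source coordinates
            sr, sc = int(sr), int(sc)     # nearest cell (floor; non-negative case)
            inside = 0 <= sr < src_rows and 0 <= sc < src_cols
            row.append(src[sr][sc] if inside else nodata)
        out.append(row)
    return out

# a 2x upsample stands in for a real projection's inverse transform
src = [[1, 2], [3, 4]]
out = reproject_raster(src, lambda r, c: (r / 2, c / 2), (4, 4))
# each source cell now covers a 2x2 block of the output
```

Categorical data would use this nearest-neighbour sampling as shown; continuous data would typically substitute bilinear or cubic interpolation at the fractional source coordinates.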
Renewable Energy Data Explorer User Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cox, Sarah L; Grue, Nicholas W; Tran, July
This publication provides a user guide for the Renewable Energy Data Explorer and technical potential tool within the Explorer. The Renewable Energy Data Explorer is a dynamic, web-based geospatial analysis tool that facilitates renewable energy decision-making, investment, and deployment. It brings together renewable energy resource data and other modeled or measured geographic information system (GIS) layers, including land use, weather, environmental, population density, administrative, and grid data.
Geospatial Information Response Team
Witt, Emitt C.
2010-01-01
Extreme emergency events of national significance that include manmade and natural disasters seem to have become more frequent during the past two decades. The Nation is becoming more resilient to these emergencies through better preparedness, reduced duplication, and establishing better communications so every response and recovery effort saves lives and mitigates the long-term social and economic impacts on the Nation. The National Response Framework (NRF) (http://www.fema.gov/NRF) was developed to provide the guiding principles that enable all response partners to prepare for and provide a unified national response to disasters and emergencies. The NRF provides five key principles for better preparation, coordination, and response: 1) engaged partnerships, 2) a tiered response, 3) scalable, flexible, and adaptable operations, 4) unity of effort, and 5) readiness to act. The NRF also describes how communities, tribes, States, the Federal Government, private-sector, and non-governmental partners apply these principles for a coordinated, effective national response. The U.S. Geological Survey (USGS) has adopted the NRF doctrine by establishing several earth-science, discipline-level teams to ensure that USGS science, data, and individual expertise are readily available during emergencies. The Geospatial Information Response Team (GIRT) is one of these teams. The USGS established the GIRT to facilitate the effective collection, storage, and dissemination of geospatial data, information, and products during an emergency. The GIRT ensures that timely geospatial data are available for use by emergency responders, land and resource managers, and for scientific analysis.
In an emergency and response capacity, the GIRT is responsible for establishing procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing coordinated products and services utilizing the USGS' exceptional pool of geospatial experts and equipment.
Intelligence, mapping, and geospatial exploitation system (IMAGES)
NASA Astrophysics Data System (ADS)
Moellman, Dennis E.; Cain, Joel M.
1998-08-01
This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology, called Open Agent Architecture™ (OAA™), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA™ achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions.
The reasoning component will provide for the best information to be developed in the timeline available and it will also provide statistical pedigree data. This pedigree data provides both uncertainties associated with the information and an audit trail cataloging the raw data sources and the processing/exploitation applied to derive the final product. Collaboration provides for a close union between the information producer(s)/exploiter(s) and the information user(s) as well as between local and remote producer(s)/exploiter(s). From a military operational perspective, IMAGES is a step toward further uniting NIMA with its customers and further blurring the dividing line between operational command and control (C2) and its supporting intelligence activities. IMAGES also provides a foundation for reachback to remote data sources, data stores, application software, and computational resources for achieving 'just-in-time' information delivery -- all of which is transparent to the analyst or operator employing the system.
A Python Geospatial Language Toolkit
NASA Astrophysics Data System (ADS)
Fillmore, D.; Pletzer, A.; Galloy, M.
2012-12-01
The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP) for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
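The phrase-to-operation mapping the poster describes can be sketched with a toy grammar: a couple of English directive patterns are matched and translated into structured geospatial operations. The patterns and operation names below are illustrative assumptions, not the actual API of the prototype module.

```python
import re

def parse_directive(phrase):
    """Map a small family of English phrases onto a geospatial
    operation description (a toy illustration of natural-language
    directives over geospatial datasets).
    """
    text = phrase.lower()
    # "select <layer> within <N> km of <place>" -> buffered selection
    m = re.match(r"select (\w+) within (\d+) km of (\w+)", text)
    if m:
        layer, radius, place = m.groups()
        return {"op": "buffer_select", "layer": layer,
                "radius_km": int(radius), "center": place}
    # "clip <layer> to <region>" -> clip operation
    m = re.match(r"clip (\w+) to (\w+)", text)
    if m:
        layer, region = m.groups()
        return {"op": "clip", "layer": layer, "region": region}
    raise ValueError("unrecognized directive: " + phrase)

q = parse_directive("Select stations within 50 km of Boulder")
```

A production system would replace the regular expressions with a grammar grounded in a knowledge domain of geospatial concepts, as the abstract suggests, but the output side — unambiguous objects and actions — keeps the same shape.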
Leveraging Geospatial Intelligence (GEOINT) in Mission Command
2009-03-21
Operational artists at all levels need new conceptual tools commensurate with today's demands. Conceptual aids derived from old, industrial-age analogies...are not up to the mental gymnastics demanded by 21st-century missions. Because operational environments evince increasingly dynamic complexity
Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng
2016-01-01
Sensor inquirers cannot obtain comprehensive or accurate observation capability information because current observation capability modeling considers neither the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors. PMID:27999247
The Challenges to Coupling Dynamic Geospatial Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, N
2006-06-23
Many applications of modeling spatial dynamic systems focus on a single system and a single process, ignoring the geographic and systemic context of the processes being modeled. A solution to this problem is the coupled modeling of spatial dynamic systems. Coupled modeling is challenging for both technical and conceptual reasons. This paper explores the benefits and challenges of coupling or linking spatial dynamic models, from loose coupling, where information transfer between models is done by hand, to tight coupling, where two (or more) models are merged as one. To illustrate the challenges, a coupled model of Urbanization and Wildfire Risk is presented. This model, called Vesta, was applied to the Santa Barbara, California region (using real geospatial data), where Urbanization and Wildfires occur and recur, respectively. The preliminary results of the model coupling illustrate that coupled modeling can lead to insight into the consequences of processes acting on their own.
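The loose-coupling pattern the paper contrasts with tight coupling can be sketched as a driver loop in which independent models exchange only a shared state, rather than merging internals. The two toy models below are stand-ins for urbanization and wildfire risk, not the Vesta model itself.

```python
def couple(models, state, steps):
    """Loose-coupling driver: each time step, run each model in turn,
    feeding the shared state forward.  Tight coupling would merge the
    models' internals; here they exchange only a state dict, which is
    what makes the coupling 'loose' (and automatable rather than
    hand-operated).
    """
    for _ in range(steps):
        for model in models:
            state = model(state)
    return state

def urbanize(state):
    # toy stand-in for an urban growth model: fixed growth per step
    return {**state, "urban_frac": min(1.0, state["urban_frac"] + 0.1)}

def fire_risk(state):
    # toy stand-in: risk peaks where urban and wildland areas interleave
    interface = state["urban_frac"] * (1 - state["urban_frac"])
    return {**state, "risk": interface}

final = couple([urbanize, fire_risk], {"urban_frac": 0.2, "risk": 0.0}, 3)
```

The ordering of models inside each step is itself one of the conceptual challenges the paper raises: running `fire_risk` before `urbanize` would yield risk estimates lagging one step behind the urban extent.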
NASA Astrophysics Data System (ADS)
Jordan, T. R.; Madden, M.; Sharma, J. B.; Panda, S. S.
2012-07-01
In an innovative collaboration between government, university and private industry, researchers at the University of Georgia and Gainesville State College are collaborating with Photo Science, Inc. to acquire, process and quality-control check lidar and orthoimages of forest areas in the Southern Appalachian Mountains of the United States. Funded by the U.S. Geological Survey, this project meets the objectives of the ARRA initiative by creating jobs, preserving jobs and training students for high-skill positions in geospatial technology. Leaf-off lidar data were acquired at 1-m resolution of the Tennessee portion of the Great Smoky Mountain National Park (GRSM) and adjacent Foothills Parkway. This 1,400 sq. km area is of high priority for national/global interests due to biodiversity, rare and endangered species and protection of some of the last remaining virgin forest in the U.S. High spatial resolution (30 cm) leaf-off 4-band multispectral orthoimages also were acquired for both the Chattahoochee National Forest in north Georgia and the entire GRSM. The data are intended to augment the National Elevation Dataset and orthoimage database of The National Map with information that can be used by many researchers in applications of LiDAR point clouds, high resolution DEMs and orthoimage mosaics. Graduate and undergraduate students were involved at every stage of the workflow in order to provide them with high-level technical educational and professional experience in preparation for entering the geospatial workforce. This paper will present geospatial workflow strategies, multi-team coordination, distance-learning training and industry-academia partnership.
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data.
The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards
NASA Astrophysics Data System (ADS)
Thomas, R.; Buck, J. J. H.
2015-12-01
As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data in both the traditional operational modes and in innovative "Big Data" applications the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve the data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based, ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of on-going implementation of this strategy. References: 1. World Wide Web Consortium (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data (last accessed 7 April 2015). 2. Open Geospatial Consortium (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe (last accessed 8 October 2014).
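The point that data "must be readily understood by software agents" can be illustrated with a minimal observation record. The sketch below loosely follows the W3C SOSA/SSN vocabulary for a Linked-Data-style observation; the URIs, property names, and values are illustrative assumptions, not BODC identifiers.

```python
import json

def observation(sensor_uri, property_uri, time, value, unit):
    """Build a minimal JSON-LD-flavoured observation record, loosely
    modelled on the SOSA/SSN vocabulary.  Because every field is a URI
    or a typed literal, a software agent can interpret the record
    without human intervention -- the interoperability goal of the
    Linked Data / SWE approach.
    """
    return {
        "@type": "sosa:Observation",
        "sosa:madeBySensor": {"@id": sensor_uri},
        "sosa:observedProperty": {"@id": property_uri},
        "sosa:resultTime": time,
        "sosa:hasResult": {"value": value, "unit": unit},
    }

rec = observation(
    "https://example.org/sensor/ctd-1",              # hypothetical sensor URI
    "https://example.org/property/sea-temperature",  # hypothetical property URI
    "2015-06-01T12:00:00Z", 14.2, "degC",
)
doc = json.dumps(rec)  # serialized form an SOS or Linked Data API might return
```

In a real deployment the property URI would resolve to a controlled vocabulary term, which is what lets independently developed clients agree on what was measured.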
NASA's Geospatial Interoperability Office (GIO) Program
NASA Technical Reports Server (NTRS)
Weir, Patricia
2004-01-01
NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth, and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group and chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI) and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. 
The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency-level requirements, the GIO leads, brokers, and facilitates efforts to develop, implement, influence, and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the Geographic and Geomatics Information technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, the GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.
Core Science Systems--Mission overview
Gallagher, Kevin T.
2012-01-01
CSS provides a foundation for all USGS Mission Areas, as well as for the mission of the Department of the Interior (DOI), in the following ways: 1) conducts basic and applied science research and development; 2) fosters broad understanding and application of analyses and information; 3) provides a framework for data and information sharing; 4) creates new geospatially enabled data and information; 5) provides technical expertise in standards and methods; and 6) builds and facilitates partnerships and innovation.
Importance of the spatial data and the sensor web in the ubiquitous computing area
NASA Astrophysics Data System (ADS)
Akçit, Nuhcan; Tomur, Emrah; Karslıoǧlu, Mahmut O.
2014-08-01
Spatial data has become a critical issue in recent years. In the past years, more than three quarters of databases were related directly or indirectly to locations referring to physical features, which constitute the relevant aspects. Spatial data is necessary to identify or calculate the relationships between spatial objects when using spatial operators in programs or portals. Originally, calculations were conducted using Geographic Information System (GIS) programs on local computers. Subsequently, through the Internet, they formed a geospatial web, which is integrated into a discoverable collection of geographically related web standards and key features, and constitutes a global network of geospatial data that employs the World Wide Web to process textual data. In addition, the geospatial web is used to gather spatial data producers, resources, and users. Standards also constitute a critical dimension in further globalizing the idea of the geospatial web. The sensor web is an example of the real-time service that the geospatial web can provide. Sensors around the world collect numerous types of data. The sensor web is a type of sensor network that is used for visualizing, calculating, and analyzing collected sensor data. Today, people use smart devices and systems more frequently because of the evolution of technology and often have more than one mobile device. The considerable number of sensors and different types of data that are positioned around the world have driven the production of interoperable and platform-independent sensor web portals. The focus of such production has been on further developing the idea of an interoperable and interdependent sensor web of all devices that share and collect information. The other pivotal idea consists of encouraging people to use and send data voluntarily for numerous purposes with some level of credibility.
The principal goal is to connect mobile and non-mobile devices together in the sensor web platform to serve and collect information from people.
Stauffer, Andrew J.; Webinger, Seth; Roche, Brittany
2016-01-01
The US Geological Survey’s (USGS) National Geospatial Technical Operations Center is prototyping and evaluating the ability to filter data through a range of scales using 1:24,000-scale The National Map (TNM) datasets as the source. A “VisibilityFilter” attribute is under evaluation that can be added to all TNM vector data themes and will permit filtering of data to eight target scales between 1:24,000 and 1:5,000,000, thus defining each feature’s smallest applicable scale-of-use. For a prototype implementation, map specifications for 1:100,000- and 1:250,000-scale USGS Topographic Map Series are being utilized to define feature content appropriate at fixed mapping scales to guide generalization decisions that are documented in a ScaleMaster diagram. This paper defines the VisibilityFilter attribute, the generalization decisions made for each TNM data theme, and how these decisions are embedded into the data to support efficient data filtering.
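Filtering on such an attribute reduces to a comparison of scale denominators: a feature remains visible while the requested map scale is at or above (i.e., its denominator is at or below) the feature's smallest applicable scale-of-use. The abstract names only the endpoints of the eight target scales, so the intermediate denominators and the attribute semantics below are assumptions for illustration.

```python
def visible(features, map_scale_denominator):
    """Keep features whose VisibilityFilter (the denominator of the
    feature's smallest applicable scale-of-use) is at least the
    requested map scale denominator.  E.g. a feature tagged 24_000 is
    shown only at 1:24,000, while one tagged 5_000_000 persists across
    the whole 1:24,000 to 1:5,000,000 range.
    """
    return [f for f in features if f["VisibilityFilter"] >= map_scale_denominator]

features = [
    {"name": "footpath",   "VisibilityFilter": 24_000},     # largest scale only
    {"name": "river",      "VisibilityFilter": 1_000_000},
    {"name": "interstate", "VisibilityFilter": 5_000_000},  # survives generalization
]
small_scale = visible(features, 250_000)  # features drawn on a 1:250,000 map
```

Embedding the threshold in the data, as the prototype does, lets one dataset serve every target scale with a single attribute query instead of maintaining eight generalized copies.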
Olugasa, B O
2014-12-01
The World-Wide-Web as a contemporary means of information sharing offers a platform for geo-spatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely, spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; the management of spatial data-profile; and web-based map sharing operation within an organization. These criteria were used to compute weighted exposure during training at the institutions. The institution with the highest computed Cumulative Exposure Point Average (CEPA) was described using an illustration based on retrospective records of rabies cases, with data from humans, animals and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques were systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ.
The paper presents a graded capability for geospatial data capture and analysis, and an emerging sustainable map pavilion dedicated to zoonoses surveillance training among collaborating institutions in West Africa.
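The abstract does not state how the Cumulative Exposure Point Average is computed from the seven criterion scores. A minimal sketch, assuming CEPA is a weighted mean of per-criterion scores (the function name, weights, and example scores are illustrative assumptions, not the study's actual formula):

```python
# Hypothetical CEPA computation: the seven criteria come from the study,
# but the weighted-mean formula and the sample scores are assumptions.

CRITERIA = [
    "spatial data capture",
    "thematic map design and interpretation",
    "spatio-temporal analysis",
    "remote sensing of data",
    "statistical modelling",
    "management of spatial data profiles",
    "web-based map sharing",
]

def cepa(scores, weights=None):
    """Weighted mean of per-criterion exposure scores (assumed formula)."""
    if weights is None:
        weights = [1.0] * len(scores)
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Example: an institution scoring highly on all seven criteria.
print(round(cepa([4, 4, 4, 3, 4, 4, 3.6]), 1))  # 3.8
```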
NASA Astrophysics Data System (ADS)
Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.
2016-12-01
Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets, and this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment, and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
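The server-side "data subsetting" mentioned above can be pictured as a bounding-box filter over feature records. A minimal stdlib sketch (the feature layout follows GeoJSON conventions; the function name is illustrative, not part of the GeoNotebook API):

```python
# Minimal sketch of server-side data subsetting, as a map selection
# feeding back into Python might trigger it. Feature layout follows
# GeoJSON conventions; names here are illustrative, not GeoNotebook's.

def subset_by_bbox(features, bbox):
    """Return features whose point geometry falls inside bbox
    (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    out = []
    for f in features:
        lon, lat = f["geometry"]["coordinates"]
        if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
            out.append(f)
    return out

stations = [
    {"geometry": {"type": "Point", "coordinates": [-77.0, 38.9]},
     "properties": {"name": "DC"}},
    {"geometry": {"type": "Point", "coordinates": [2.35, 48.85]},
     "properties": {"name": "Paris"}},
]

# Subset to a box around the eastern United States.
east_us = subset_by_bbox(stations, (-90.0, 25.0, -60.0, 50.0))
print([f["properties"]["name"] for f in east_us])  # ['DC']
```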
Hearn, Paul P.
2009-01-01
Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.
A Public Platform for Geospatial Data Sharing for Disaster Risk Management
NASA Astrophysics Data System (ADS)
Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.
2013-01-01
Several studies have been conducted in Africa to assist local governments in addressing the risks posed by natural hazards. Geospatial data containing information on vulnerability, impacts, climate change, and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations seeking to reduce risks and mitigate the impacts of disasters. Nevertheless, this data is not widely or efficiently distributed and often resides in remote storage that is hard to reach. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making it widely available through the internet. Among these solutions, GeoNode, an open source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface, allowing users with little training to quickly and easily share data and create interactive maps. GeoNode's data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to specific users. Social features like user profiles and commenting and rating systems allow communities to develop around each platform, facilitating the use, management, and quality control of the data a GeoNode instance contains (http://geonode.org/). This paper presents a case study of setting up a Web platform based on GeoNode: a public platform called MASDAP, promoted by the Government of Malawi to support development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information.
Moreover, this platform will help ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and data from future disaster risk management projects will be added as well.
Raster Data Partitioning for Supporting Distributed GIS Processing
NASA Astrophysics Data System (ADS)
Nguyen Thai, B.; Olasz, A.
2015-08-01
The big data concept has already made an impact in the geospatial sector. Several studies apply techniques that originated in computer science to the GIS processing of huge amounts of geospatial data, while other studies treat geospatial data as if it had always been big data (Lee and Kang, 2015). Nevertheless, data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). Of the increasing volume of raw data produced in different formats and representations and for different purposes, only the wealth of information derived from these data sets represents valuable results. However, computing capability and processing speed run up against limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). In recent years, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing, all the more, requires appropriate processing algorithms that can be distributed to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capabilities for processing non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms to the Map-Reduce model, and GIS data cannot be partitioned like text-based data by line or by bytes. Hence, we seek an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of GIS (raster) data partitioning, distribution and distributed processing of GIS algorithms.
A proof-of-concept implementation has been made for raster data partitioning, distribution and processing. The first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, so data partitioning may be considered a preprocessing step before applying processing services to the data. As a proof of concept, we implemented a simple tile-based partitioning method that splits an image into a grid of smaller tiles (N x M) and compared the processing time of an NDVI calculation against existing methods. The concept is demonstrated using our own open-source processing framework.
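The tile-based partitioning plus per-tile NDVI described above can be sketched in a few lines. A minimal pure-Python illustration (the paper's own framework and its ERDAS comparison are not reproduced; function names are illustrative):

```python
# Proof-of-concept tile-based partitioning: split a raster into an
# N x M grid of tiles and process each tile independently (here: NDVI).
# Each tile could be dispatched to a different worker in a cluster.

def partition(raster, n, m):
    """Split a 2-D raster (list of rows) into n x m tiles, keyed by index."""
    rows, cols = len(raster), len(raster[0])
    th, tw = rows // n, cols // m
    tiles = []
    for i in range(n):
        for j in range(m):
            tile = [row[j * tw:(j + 1) * tw]
                    for row in raster[i * th:(i + 1) * th]]
            tiles.append(((i, j), tile))
    return tiles

def ndvi(red_tile, nir_tile):
    """Per-pixel NDVI = (NIR - red) / (NIR + red)."""
    return [[(nv - rv) / (nv + rv) if (nv + rv) else 0.0
             for rv, nv in zip(rrow, nrow)]
            for rrow, nrow in zip(red_tile, nir_tile)]

red = [[10, 10], [10, 10]]
nir = [[30, 10], [10, 30]]
# Partition both bands into 2 x 2 single-pixel tiles, compute NDVI per tile.
results = {idx: ndvi(rt, nt)
           for (idx, rt), (_, nt) in zip(partition(red, 2, 2),
                                         partition(nir, 2, 2))}
print(results[(0, 0)])  # [[0.5]]
```

Because each band is partitioned with the same grid, tiles pair up by index and the NDVI of each tile can be computed with no communication between workers.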
Geospatial Standards and the Knowledge Generation Lifecycle
NASA Technical Reports Server (NTRS)
Khalsa, Siri Jodha S.; Ramachandran, Rahul
2014-01-01
Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations and analysis. This paper provides an introduction to the field of informatics and the knowledge generation lifecycle in the context of the geosciences. In addition we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.
Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing
NASA Astrophysics Data System (ADS)
Meng, X.
2012-07-01
Field Ground Truthing Data Collector is one of the four key components of the NASA-funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit provides a comprehensive set of functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and the health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in the field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., holding video-conference calls in the field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging in discussions with other learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained in the presentation. A pilot case study will also be demonstrated.
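The "measuring the flight distance" field function above amounts to a great-circle distance between GPS fixes. A minimal sketch using the haversine formula (the function name is illustrative; the ICCaRS toolkit's actual implementation is not described in the abstract):

```python
import math

# Great-circle distance between two GPS points via the haversine formula.
# This is a standard technique, sketched here to illustrate the toolkit's
# "flight distance" field function; it is not the ICCaRS code itself.

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Two points roughly 1 km apart in Southeast Michigan.
d = haversine_km(42.3314, -83.0458, 42.3404, -83.0458)
print(round(d, 2))
```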
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Bauman; S. Burian; M. Deo
The Utah Heavy Oil Program (UHOP) was established in June 2006 to provide multidisciplinary research support to federal and state constituents for addressing the wide-ranging issues surrounding the creation of an industry for unconventional oil production in the United States. Additionally, UHOP was to serve as an ongoing source of unbiased information to the nation on the technical, economic, legal and environmental aspects of developing heavy oil, oil sands, and oil shale resources. UHOP fulfilled its role by completing three tasks. First, in response to the Energy Policy Act of 2005 Section 369(p), UHOP published an update to the 1987 technical and economic assessment of domestic heavy oil resources that was prepared by the Interstate Oil and Gas Compact Commission. The UHOP report, entitled 'A Technical, Economic, and Legal Assessment of North American Heavy Oil, Oil Sands, and Oil Shale Resources', was published in electronic and hard copy form in October 2007. Second, UHOP developed a comprehensive, publicly accessible online repository of unconventional oil resources in North America based on the DSpace software platform. An interactive map was also developed as a source of geospatial information and as a means to interact with the repository from a geospatial setting. All documents uploaded to the repository are fully searchable by author, title, and keywords. Third, UHOP sponsored five research projects related to unconventional fuels development. Two projects looked at issues associated with oil shale production, including oil shale pyrolysis kinetics, resource heterogeneity, and reservoir simulation. One project evaluated in situ production from Utah oil sands. Another project focused on water availability and produced-water treatments. The last project considered commercial oil shale leasing from a policy, environmental, and economic perspective.
NASA Astrophysics Data System (ADS)
Li, W.
2017-12-01
Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem: providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling, a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meaning of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly to building the theoretical and methodological foundation for data-driven geography and the emerging field of spatial data science.
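The space-time filtering step mentioned above can be pictured as screening crawled metadata records against a spatial bounding box and a time window. A minimal stdlib sketch (the record fields and function name are illustrative assumptions, not the actual PolarHub schema):

```python
from datetime import date

# Sketch of space-time filtering over crawled dataset records: keep
# records whose bounding box intersects the query box and whose time
# range overlaps the query window. Field names are illustrative.

def spacetime_filter(records, bbox, start, end):
    """Keep records intersecting bbox (w, s, e, n) and [start, end]."""
    w, s, e, n = bbox
    kept = []
    for rec in records:
        rw, rs, re_, rn = rec["bbox"]
        spatial = rw <= e and re_ >= w and rs <= n and rn >= s
        temporal = rec["start"] <= end and rec["end"] >= start
        if spatial and temporal:
            kept.append(rec)
    return kept

records = [
    {"id": "sea-ice", "bbox": (-180, 60, 180, 90),
     "start": date(2016, 1, 1), "end": date(2016, 12, 31)},
    {"id": "tropics", "bbox": (-180, -23, 180, 23),
     "start": date(2016, 1, 1), "end": date(2016, 12, 31)},
]

arctic_2016 = spacetime_filter(records, (-180, 66, 180, 90),
                               date(2016, 6, 1), date(2016, 8, 31))
print([r["id"] for r in arctic_2016])  # ['sea-ice']
```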
Developing Geospatial Intelligence Stewardship for Multinational Operations
2010-06-11
and chaired by the NGA's...It is important to note that the GEOINT data stream requires the largest bandwidth for full motion video, hyper-spectral...platforms, as in the phrase "Predator Porn." Yet, what should be used could be called an ISR-operational design, in an end-ways-means approach...
NASA Astrophysics Data System (ADS)
Williams, N. A.; Morris, J. N.; Simms, M. L.; Metoyer, S.
2007-12-01
The Advancing Geospatial Skills in Science and Social Sciences (AGSSS) program, funded by NSF, provides middle and high school teacher-partners with access to graduate student scientists for classroom collaboration and curriculum adaptation to incorporate and advance skills in spatial thinking. AGSSS Fellows aid in the delivery of geospatially-enhanced activities utilizing technology such as geographic information systems, remote sensing, and virtual globes. The partnership also provides advanced professional development for both participating teachers and fellows. The AGSSS program is mutually beneficial to all parties involved. This successful collaboration of scientists, teachers, and students results in greater understanding and enthusiasm for the use of spatial thinking strategies and geospatial technologies. In addition, the partnership produces measurable improvements in student efficacy and attitudes toward processes of spatial thinking. The teacher partner training and classroom resources provided by AGSSS will continue the integration of geospatial activities into the curriculum after the project concludes. Time and resources are the main costs in implementing this partnership. Graduate fellows invest considerable time and energy, outside of academic responsibilities, to develop materials for the classroom. Fellows are required to be available during K-12 school hours, which necessitates forethought in scheduling other graduate duties. However, the benefits far outweigh the costs. Graduate fellows gain experience in working in classrooms. In exchange, students gain exposure to working scientists and their research. This affords graduate fellows the opportunity to hone their communication skills, and specifically allows them to address the issue of translating technical information for a novice audience. Teacher-partners and students benefit by having scientific expertise readily available. 
In summation, these experiences result in changes in teacher and student perceptions of science and scientists. Evidence of the aforementioned changes is provided through external evaluation and results obtained from several assessment tools. The program also utilizes an internal evaluator to monitor participants' thoughts and opinions on the previous years' collaboration. Additionally, graduate fellows maintain a reflective journal to provide insight into experiences occurring both in class and among peers. Finally, student surveys administered prior to and at the conclusion of the academic year assess changes in student attitudes and self-perception of spatial thinking skills.
Development of Analytical Plug-ins for ENSITE: Version 1.0
2017-11-01
ENSITE’s core-software platform builds upon leading geospatial platforms already in use by the Army and is designed to offer an easy-to-use, customized set of workflows for CB planners. Within this platform are added software compo...public good. Find out more at www.erdc.usace.army.mil. To search for other technical reports published by ERDC, visit the ERDC online library at
Estimating Renewable Energy Economic Potential in the United States: Methodology and Initial Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Austin; Beiter, Philipp; Heimiller, Donna
The report describes a geospatial analysis method to estimate the economic potential of several renewable resources available for electricity generation in the United States. Economic potential, one measure of renewable generation potential, is defined in this report as the subset of the available resource technical potential where the cost required to generate the electricity (which determines the minimum revenue requirements for development of the resource) is below the revenue available in terms of displaced energy and displaced capacity.
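The definition above reduces to a filter-and-sum over candidate sites: economic potential is the technical potential at sites where levelized cost falls below available revenue. A minimal sketch (numbers, field names, and the function are illustrative, not NREL's data or method):

```python
# Sketch of the report's economic-potential definition: the subset of
# technical-potential capacity where levelized generation cost is below
# the revenue available from displaced energy plus displaced capacity.
# All figures ($/MWh) and field names here are illustrative assumptions.

def economic_potential(sites):
    """Sum capacity (MW) over sites where cost < available revenue."""
    return sum(s["capacity_mw"] for s in sites
               if s["lcoe"] < s["energy_value"] + s["capacity_value"])

sites = [
    {"capacity_mw": 120, "lcoe": 45.0, "energy_value": 38.0, "capacity_value": 10.0},
    {"capacity_mw": 300, "lcoe": 60.0, "energy_value": 35.0, "capacity_value": 8.0},
]
print(economic_potential(sites))  # 120 (only the first site is economic)
```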
Creating Actionable Data from an Optical Depth Measurement Network using RDF
NASA Astrophysics Data System (ADS)
Freemantle, J. R.; O'Neill, N. T.; Lumb, L. I.; Abboud, I.; McArthur, B.
2010-12-01
The AEROCAN sunphotometry network has, for more than a decade, generated optical indicators of aerosol concentration and size on a regional and national scale. We believe this optical information can be rendered more “actionable” to the health care community by developing a technical and interpretative information-sharing geospatial strategy with that community. By actionable data we mean information that is presented in a manner that can be understood and then used in the decision-making process. The decision may be that of a technical professional, a policy maker or a machine. The information leading up to a decision may come from many sources; this makes it particularly important that data are well defined across knowledge fields, in our case atmospheric science and respiratory health science. As part of the AEROCAN operational quality assurance (QA) methodology we have written automatic procedures to make some of the AEROCAN data more accessible or “actionable”. Tim Berners-Lee has advocated making datasets available on the web as “Linked Data”, with a proper structural description (metadata). We have been using RDF (Resource Description Framework) to enhance the utility of our sunphotometer data; the resulting self-describing representation is structured so that it is machine readable. This allows semantically based queries (e.g., via SPARQL) on our dataset, which in the past was only viewable as passive Web tables of data.
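The RDF approach described above can be illustrated by emitting a Turtle description for a single observation. A stdlib-only sketch (the vocabulary URIs, station name, and property names are invented placeholders, not AEROCAN's actual schema):

```python
# Stdlib sketch of emitting an RDF (Turtle) description for one
# sunphotometer observation, in the "Linked Data" spirit described
# above. All URIs and property names are illustrative placeholders.

def to_turtle(station, day, aod):
    """Serialize one observation as a small Turtle document."""
    base = "http://example.org/aerocan"
    lines = [
        "@prefix obs: <{}/vocab#> .".format(base),
        "",
        "<{}/obs/{}/{}>".format(base, station, day),
        '    obs:station "{}" ;'.format(station),
        '    obs:date "{}" ;'.format(day),
        "    obs:aerosolOpticalDepth {} .".format(aod),
    ]
    return "\n".join(lines)

doc = to_turtle("Egbert", "2010-07-15", 0.12)
print(doc)
```

A SPARQL endpoint loaded with such triples could then answer queries like "all observations above a given optical depth in a given month", which is exactly the kind of machine-readable access the passive Web tables lacked.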
Joseph St. Peter; John Hogland; Nathaniel Anderson; Jason Drake; Paul Medley
2018-01-01
Land cover classification provides valuable information for prioritizing management and conservation operations across large landscapes. Current regional scale land cover geospatial products within the United States have a spatial resolution that is too coarse to provide the necessary information for operations at the local and project scales. This paper describes a...
46 CFR 131.910 - Notices to mariners and aids to navigation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 4 2014-10-01 2014-10-01 false Notices to mariners and aids to navigation. 131.910... OPERATIONS Miscellaneous § 131.910 Notices to mariners and aids to navigation. Each master and mate shall... Geospatial-Intelligence Agency regarding aids to navigation in the area in which the vessel operates. [CGD 82...
78 FR 57455 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...
NASA Astrophysics Data System (ADS)
Kagawa, Ayako; Le Sourd, Guillaume
2018-05-01
Within the United Nations Secretariat, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for planning of operations, decision-making and monitoring of crises. However, the need for maps has remained intact. This presentation aims to highlight some of the cartographic representation styles over the decades by reviewing the evolution of selected maps by the office, noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization in information management can be traced, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.
Geospatial Data Science Modeling
NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers. Geospatial modeling at NREL often produces the
A resource-oriented architecture for a Geospatial Web
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Nativi, Stefano
2010-05-01
In this presentation we discuss some issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information: information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web.
On the contrary, systems using the same Web technologies and specifications but following a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, its architecture must satisfy all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proven to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). By restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For geospatial resources, for example, subsetting, resampling, interpolation and coordinate reference system transformation are candidate functionalities for a uniform interface. However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really ascend to the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This makes it possible to implement citation and re-use of resources simply by providing the URI.
OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. While the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured according to several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML) while keeping the underlying protocol generic. Concerning point c) (self-descriptive messages), the exchanged messages should describe themselves and their content. This would not be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point d) (hypermedia as the engine of application state) is actually where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would make it possible to build an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm.
This would lower the barrier to accessing geospatial applications for non-specialists (consider the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications (search engines, feeds, social networks) could be integrated or replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it is necessarily very simple, and complex interactions would therefore require several transfers. • In the geospatial domain, one of the most valuable kinds of resource is processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account these advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use cases rather than covering all possible applications. The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine.
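The uniform-interface idea discussed above, where generic operations such as retrieval and subsetting apply identically to every URI-identified geospatial resource, can be sketched as follows (the URIs, resource layout, and handler names are illustrative, not an OGC or REST standard):

```python
# Sketch of a uniform interface over geospatial resources: every
# resource is a URI, and the same generic operations (retrieve, subset)
# apply to all of them. Layout and names are illustrative assumptions.

RESOURCES = {
    "/coverages/temperature": {
        "type": "grid",
        "data": {(0, 0): 280.0, (0, 1): 281.5, (1, 0): 279.0, (1, 1): 282.0},
    },
}

def retrieve(uri):
    """GET a representation of the resource (REST constraint b)."""
    return RESOURCES[uri]

def subset(uri, cells):
    """Generic subsetting: one candidate uniform operation for any grid."""
    grid = RESOURCES[uri]["data"]
    return {c: grid[c] for c in cells if c in grid}

print(subset("/coverages/temperature", [(0, 0), (1, 1)]))
```

The point of the sketch is that a client needs no per-service description: knowing the URI and the small set of generic operations is enough, which is exactly the low entry barrier the REST argument claims.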
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Peng; Gong, Jianya; Di, Liping
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered by extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and made discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
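The semantics-enhanced discovery described above can be pictured, in miniature, as expanding a query term with ontology-declared synonyms before matching catalogue metadata. A stdlib sketch (the tiny "ontology" and record fields are illustrative placeholders, not the paper's ebRIM extension):

```python
# Sketch of semantics-enhanced catalogue search: a plain keyword query
# is expanded with ontology synonyms before matching metadata records.
# The ontology and record fields here are illustrative placeholders.

ONTOLOGY = {
    "precipitation": {"rainfall", "rain"},
    "elevation": {"dem", "terrain height"},
}

RECORDS = [
    {"id": "ds1", "keywords": {"rainfall", "monthly"}},
    {"id": "ds2", "keywords": {"dem", "srtm"}},
]

def semantic_search(term, records):
    """Match records on the term itself or any ontology synonym."""
    expanded = {term} | ONTOLOGY.get(term, set())
    return [r["id"] for r in records if r["keywords"] & expanded]

print(semantic_search("precipitation", RECORDS))  # ['ds1']
```

A purely lexical search for "precipitation" would miss both records; the semantic expansion recovers the rainfall dataset, which is the kind of gain the paper attributes to exploiting the underlying semantics.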
GIS applications for military operations in coastal zones
Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.
2009-01-01
In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st-century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed is discussed: selection of data sources (including high-resolution commercial images and lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models, and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels). Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina.
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
Ontology Based Quality Evaluation for Spatial Data
NASA Astrophysics Data System (ADS)
Yılmaz, C.; Cömert, Ç.
2015-08-01
Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that these will be replaced by semantic web services. The quality of the data provided matters both for the decision-making process and for the accuracy of transactions, so the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions, and a methodology is required for data quality evaluation. Existing work on spatial data quality includes ISO standards, academic studies, and commercial software: the ISO 19157 standard defines the data quality elements, and proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on its own classification of rules. Rule-based approaches are commonly used for geospatial data quality checks. In this study, we identify the technical components needed to devise and implement a rule-based approach with ontologies, using free and open-source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end users and processes. We created an ontology conforming to the geospatial data and defined sample rules showing how to test data against quality elements, including attribute, topo-semantic, and geometrical consistency, using free and open-source software. To test data against the rules, sample GeoSPARQL queries are created and associated with the specifications.
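The rule-based checking the abstract describes can be illustrated without the ontology machinery: each ISO 19157 quality element becomes a named, testable predicate over a feature. The sketch below uses plain Python rather than GeoSPARQL, and the feature fields and rule names are assumptions made for the example.

```python
# An illustrative rule-based quality check (the paper encodes such rules as
# GeoSPARQL queries over an ontology; this sketch only mirrors the idea that
# quality elements -- attribute and geometric consistency -- are expressed
# as individually testable rules).

def ring_is_closed(coords):
    """A polygon ring needs at least 4 points and identical endpoints."""
    return len(coords) >= 4 and coords[0] == coords[-1]

RULES = [
    ("attribute consistency: lanes must be positive",
     lambda f: f["type"] != "road" or f["lanes"] > 0),
    ("geometric consistency: polygon ring must be closed",
     lambda f: f["type"] != "parcel" or ring_is_closed(f["coords"])),
]

def evaluate(feature):
    """Return the names of all rules the feature violates."""
    return [name for name, ok in RULES if not ok(feature)]

road = {"type": "road", "lanes": 0}
parcel = {"type": "parcel", "coords": [(0, 0), (1, 0), (1, 1), (0, 0)]}
print(evaluate(road))    # violates the lane rule
print(evaluate(parcel))  # closed ring, no violations
```

In the paper's setting, the rule list would be derived from the ontology and each predicate would be a GeoSPARQL query run by an accreditation institution against submitted NSDI data.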
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. To use these local resources together to solve larger geospatial information processing problems that concern an overall situation, we propose, with the support of peer-to-peer computing technologies, a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries, to solve geospatial distributed computing problems in heterogeneous GIS environments. First, a geospatial query process schema for distributed computing is presented, together with a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment of autonomous geospatial information resources, achieving decentralized and consistent synchronization among global geospatial resource directories and carrying out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries illustrate the procedure of global geospatial information processing.
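The "equivalent transformation" step, one global query rewritten as local SQL queries whose merged results equal the global answer, can be shown in miniature with two autonomous SQLite databases standing in for peers. The schema, the data, and the coordinator function below are invented for the sketch; the paper's mechanism additionally handles directory synchronization and transactions, which are omitted here.

```python
# A toy illustration of equivalent transformation: one global query over a
# virtual "roads" table is executed as identical local SQL queries against
# autonomous databases, with results merged by the coordinator.
import sqlite3

def make_local(rows):
    """Create one autonomous in-memory peer with its own roads table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE roads (name TEXT, length_km REAL)")
    db.executemany("INSERT INTO roads VALUES (?, ?)", rows)
    return db

peers = [
    make_local([("A1", 12.0), ("B2", 3.5)]),
    make_local([("C3", 8.2)]),
]

def global_query(sql, params=()):
    """Run the same local query on every peer and merge the results."""
    out = []
    for db in peers:
        out.extend(db.execute(sql, params).fetchall())
    return out

rows = global_query("SELECT name FROM roads WHERE length_km > ?", (5,))
print(sorted(r[0] for r in rows))  # ['A1', 'C3']
```

Selection and projection distribute trivially over a horizontal partition as here; joins and aggregates require the more careful query decomposition the paper's schema addresses.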
Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web
NASA Astrophysics Data System (ADS)
Huang, Hong; Gong, Jianya
2008-12-01
GML can achieve geospatial interoperation only at the syntactic level. In most cases, however, differences in spatial cognition must be resolved first, so ontologies were introduced to describe geospatial information and services. Yet it is clearly difficult and inappropriate to expect users themselves to find, match, and compose services, especially where complicated business logic is involved. With the gradual introduction of Semantic Web technologies (e.g., OWL, SWRL), the focus of geospatial interoperation has shifted from the syntactic level to the semantic and even to the automatic, intelligent level. The Geospatial Semantic Web (GSM) can thus be put forward as an augmentation of the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation, and query mechanisms. To advance the implementation of the GSM, we first construct mechanisms for the modeling and formal representation of geospatial knowledge, two of the most foundational phases in knowledge engineering (KE). Our attitude in this paper is pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding, and use of geospatial knowledge are situated in geospatial context. Therefore, we first put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules, and GML. Second, we propose a metamodel of geospatial context and use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
Preliminary Image Map of the 2007 Ranch Fire Perimeter, Piru Quadrangle, Ventura County, California
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
In the fall of 2007, wildfires burned out of control in southern California. The extent of these fires encompassed large geographic areas that included a variety of landscapes from urban to wilderness. The U.S. Geological Survey National Geospatial Technical Operations Center (NGTOC) is currently (2008) developing a quadrangle-based 1:24,000-scale image map product. One of the concepts behind the image map product is to provide an updated map in electronic format to assist with emergency response. This image map is one of 55 preliminary image map quadrangles covering the areas burned by the southern California wildfires. Each map is a layered, geo-registered Portable Document Format (.pdf) file. For more information about the layered geo-registered .pdf, see the readme file (http://pubs.usgs.gov/of/2008/1029/downloads/CA_Agua_Dulce_of2008-1029_README.txt). To view the areas affected and the quadrangles mapped in this preliminary project, see the map index (http://pubs.usgs.gov/of/2008/1029/downloads/CA_of2008_1029-1083_index.pdf) provided with this report.
Clark, Perry S.; Scratch, Wendy S.; Bias, Gaylord W.; Stander, Gregory B.; Sexton, Jenne L.; Krawczak, Bridgette J.
2008-01-01
OpenStreetMap Collaborative Prototype, Phase 1
Wolf, Eric B.; Matthews, Greg D.; McNinch, Kevin; Poore, Barbara S.
2011-01-01
Phase One of the OpenStreetMap Collaborative Prototype (OSMCP) attempts to determine whether the open source software developed for OpenStreetMap (OSM, http://www.openstreetmap.org) can be used for data contributions and improvements that meet or exceed the requirements for integration into The National Map (http://www.nationalmap.gov). OSMCP Phase One focused on road data aggregated at the state level by the Kansas Data Access and Support Center (DASC). Road data from the DASC were loaded into a system hosted by the U.S. Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) in Rolla, Missouri. U.S. Geological Survey editing specifications were developed by NGTOC personnel (J. Walters and G. Matthews, USGS, unpub. report, 2010). Interstate and U.S. Highways in the dataset were edited to the specifications by NGTOC personnel, while State roads were edited by DASC personnel. Once the system and specifications were in place, the resulting data were successfully improved to meet standards for The National Map. The OSM software proved effective in providing a usable platform for collaborative data editing.
Hazards Data Distribution System (HDDS)
Jones, Brenda; Lamb, Rynn M.
2015-07-09
When emergencies occur, first responders and disaster response teams often need rapid access to aerial photography and satellite imagery that is acquired before and after the event. The U.S. Geological Survey (USGS) Hazards Data Distribution System (HDDS) provides quick and easy access to pre- and post-event imagery and geospatial datasets that support emergency response and recovery operations. The HDDS provides a single, consolidated point-of-entry and distribution system for USGS-hosted remotely sensed imagery and other geospatial datasets related to an event response. The data delivery services are provided through an interactive map-based interface that allows emergency response personnel to rapidly select and download pre-event ("baseline") and post-event emergency response imagery.
NASA Astrophysics Data System (ADS)
Rosinski, A.; Beilin, P.; Colwell, J.; Hornick, M.; Glasscoe, M. T.; Morentz, J.; Smorodinsky, S.; Millington, A.; Hudnut, K. W.; Penn, P.; Ortiz, M.; Kennedy, M.; Long, K.; Miller, K.; Stromberg, M.
2015-12-01
The Clearinghouse provides emergency management and response professionals and the scientific and engineering communities with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes or tsunamis. Clearinghouse activations include participation from Federal, State, and local government, law enforcement, fire, EMS, emergency management, public health, environmental protection, the military, public and non-governmental organizations, and the private sector. For the August 24, 2014 South Napa earthquake, over 100 people from 40 different organizations participated during the 3-day Clearinghouse activation. Every organization has its own role and responsibility in disaster response; however, all require authoritative data about the disaster for rapid hazard assessment and situational awareness. The Clearinghouse has been proactive in fostering collaboration and in sharing Essential Elements of Information across disciplines. The Clearinghouse-led collaborative promotes the use of standard formats and protocols to allow existing technology to transform data into meaningful incident-related content and to enable data to be used by the largest number of participating Clearinghouse partners, thus providing responding personnel with enhanced real-time situational awareness, rapid hazard assessment, and more informed decision-making in support of response and recovery. The Clearinghouse efforts address national priorities outlined in USGS Circular 1242, Plan to Coordinate NEHRP Post-Earthquake Investigations, and S. 740, the Geospatial Data Act of 2015 (Sen. Orrin Hatch, R-UT), to streamline and coordinate geospatial data infrastructure, maximizing geospatial data in support of the Robert T. Stafford Act. Finally, the U.S. Department of Homeland Security, Geospatial Management Office, recognized the Clearinghouse's data sharing efforts as a Best Practice to be included in the forthcoming 2015 HLS Geospatial Concept of Operations.
Grid Enabled Geospatial Catalogue Web Service
NASA Technical Reports Server (NTRS)
Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush
2004-01-01
Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for the Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing, and querying of geospatial data and services, and transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on geospatial data metadata standards from ISO 19115, FGDC, and the NASA EOS Core System, and service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services providing functions such as subsetting, reformatting, and reprojection. This work facilitates geospatial resource sharing and interoperation in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also lets researchers focus on science rather than on issues of computing capacity, data location, processing, and management. GCWS is also a key component for workflow-based virtual geospatial data production.
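The catalogue querying described above follows the OGC CSW information model. As a hedged illustration only (the GCWS-specific Grid extensions are not shown, and element names come from the public CSW 2.0.2 schema rather than from this paper), a keyword query against such a catalogue might be assembled like this:

```python
import xml.etree.ElementTree as ET

CSW_NS = "http://www.opengis.net/cat/csw/2.0.2"
OGC_NS = "http://www.opengis.net/ogc"

def build_getrecords_request(keyword: str, max_records: int = 10) -> str:
    """Build a CSW 2.0.2 GetRecords request body that searches
    catalogue records whose AnyText field matches a keyword."""
    ET.register_namespace("csw", CSW_NS)
    ET.register_namespace("ogc", OGC_NS)
    root = ET.Element(f"{{{CSW_NS}}}GetRecords", {
        "service": "CSW",
        "version": "2.0.2",
        "resultType": "results",
        "maxRecords": str(max_records),
    })
    query = ET.SubElement(root, f"{{{CSW_NS}}}Query", {"typeNames": "csw:Record"})
    ET.SubElement(query, f"{{{CSW_NS}}}ElementSetName").text = "summary"
    constraint = ET.SubElement(query, f"{{{CSW_NS}}}Constraint", {"version": "1.1.0"})
    flt = ET.SubElement(constraint, f"{{{OGC_NS}}}Filter")
    like = ET.SubElement(flt, f"{{{OGC_NS}}}PropertyIsLike",
                         {"wildCard": "%", "singleChar": "_", "escapeChar": "\\"})
    # AnyText matches across all queryable text fields of a record.
    ET.SubElement(like, f"{{{OGC_NS}}}PropertyName").text = "csw:AnyText"
    ET.SubElement(like, f"{{{OGC_NS}}}Literal").text = f"%{keyword}%"
    return ET.tostring(root, encoding="unicode")
```

The resulting XML body would be POSTed to the catalogue endpoint; GCWS adds Grid security and replica resolution on top of this exchange.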
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Colombo, Massimo; Antonovic, Milan; Cardoso, Mirko; Delucchi, Andrea; Gianocca, Giancarlo; Brovelli, Maria Antonia
2015-04-01
"I CAMMINI DELLA REGINA" (The Via Regina Paths) is an Interreg project funded within the Italy-Switzerland transnational cooperation program 2007-2013. The aim of the project is the preservation and valorization of the cultural heritage linked to the historical walking paths crossing, connecting, and serving the local territories. Building on existing tools, which generally consist of technical descriptions of the paths, the project uses open source geospatial technologies to deploy innovative solutions that can fill some of the gaps in historical-cultural tourism offerings. The Swiss partner, and particularly the IST-SUPSI team, has focused its activities on two innovative solutions: a mobile application for surveying historical paths and a storytelling system for immersive cultural exploration of them. The former, based on Android, makes it possible to apply, in a revised manner, a consolidated and already successfully used survey methodology focused on the conservation of historical paths (the Inventory of Historical Traffic Routes in Switzerland). Until now, operators could rely only on manual work combining notes, pictures, and GPS devices, synthesized in hand-drawn maps; this procedure is error prone and poses many problems for both updating the data and extracting it for further elaboration. An easy-to-use interface was therefore created that allows paths, morphological elements, and multimedia notes to be mapped according to a newly developed spatially enabled data model. When connected to the internet, the application sends the data to a web service which, after applying linear referencing and further processing, makes them available using open standards. The storytelling system has been designed to provide users with cultural insights embedded in a multimedia, immersive geospatial portal.
Whether the tourist is exploring the desired historical path physically or virtually, the system provides notifications and immersive multimedia information that foster a new view of the territory: an awareness of the culture and history of the place, thanks to attractive descriptions of its geological, land use, historical, and ethnographic contexts. The technologies used for these developments are MongoDB, Tornado, the Android SDK, GeoServer, Bootstrap, OpenLayers, HTML5, CSS3, and jQuery. The approach, methodologies, and technical implementations will be discussed and presented.
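The linear referencing applied by the web service amounts to locating features by their measure along a surveyed polyline. A minimal sketch of that core step follows; the function name and coordinate handling are illustrative, not taken from the project:

```python
import math

def locate_along_path(vertices, measure):
    """Linear referencing: return the (x, y) point lying `measure`
    distance units from the start of a polyline given as [(x, y), ...]."""
    if measure < 0:
        raise ValueError("measure must be non-negative")
    travelled = 0.0
    for (x0, y0), (x1, y1) in zip(vertices, vertices[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg == 0:
            continue  # skip duplicate vertices
        if travelled + seg >= measure:
            # Interpolate within this segment.
            t = (measure - travelled) / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        travelled += seg
    raise ValueError("measure exceeds path length")
```

For example, 15 units along an L-shaped path of two 10-unit segments falls halfway up the second segment. Real systems work on projected coordinates and store the measures with the features rather than recomputing them per request.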
Web mapping system for complex processing and visualization of environmental geospatial datasets
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny; Okladnikov, Igor
2016-04-01
Environmental geospatial datasets (meteorological observations, modeling and reanalysis results, etc.) are used in numerous research applications. For a number of objective reasons, such as the inherent heterogeneity of environmental datasets, large dataset volumes, the complexity of the data models used, and syntactic and semantic differences that complicate the creation and use of a unified terminology, the development of environmental geodata access, processing, and visualization services, as well as client applications, turns out to be quite a sophisticated task. According to general INSPIRE requirements for data visualization, geoportal web applications have to provide such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata. It should be noted that modern web mapping systems, as integrated geoportal applications, are developed based on a service-oriented architecture (SOA) and might be considered complexes of interconnected software tools for working with geospatial data. This report presents a complex web mapping system comprising a GIS web client and corresponding OGC services for working with a geospatial (NetCDF, PostGIS) dataset archive. The GIS web client has three basic tiers: 1) a tier of geospatial metadata retrieved from a central MySQL repository and represented in JSON format; 2) a tier of JavaScript objects implementing methods that handle NetCDF metadata, a task XML object for configuring user calculations and input and output formats, and OGC WMS/WFS cartographical services; 3) a graphical user interface (GUI) tier of JavaScript objects realizing the web application business logic. The metadata tier consists of a number of JSON objects containing technical information describing the geospatial datasets (spatio-temporal resolution, meteorological parameters, valid processing methods, etc.).
The middleware tier of JavaScript objects, which implements methods for handling geospatial metadata, the task XML object, and the WMS/WFS cartographical services, interconnects the metadata and GUI tiers. Its methods include procedures such as downloading and updating JSON metadata, launching and tracking calculation tasks running on remote servers, and working with WMS/WFS cartographical services: obtaining the list of available layers, visualizing layers on the map, and exporting layers in graphical (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and digital (NetCDF) formats. The graphical user interface tier is based on a bundle of JavaScript libraries (OpenLayers, GeoExt, and ExtJS) and represents a set of software components implementing the web mapping application business logic (complex menus, toolbars, wizards, event handlers, etc.). The GUI provides two basic capabilities for the end user: configuring the task XML object and visualizing cartographical information. The web interface developed is similar to the interfaces of such popular desktop GIS applications as uDig and QuantumGIS. The web mapping system has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical form. The work is supported by SB RAS Basic Program Projects VIII.80.2.1 and IV.38.1.7.
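The map-visualization calls that such a middleware tier makes against a WMS endpoint can be illustrated with a small sketch. The endpoint and layer name below are hypothetical, but the parameters are those required by the WMS 1.1.1 GetMap operation:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600),
                   fmt="image/png", srs="EPSG:4326"):
    """Assemble a WMS 1.1.1 GetMap URL for one layer.
    bbox is (minx, miny, maxx, maxy) in the given SRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",          # default styling
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return f"{base_url}?{urlencode(params)}"
```

A client library such as OpenLayers builds equivalent URLs internally; fetching the URL returns a rendered map tile that the GUI overlays on its base map.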
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification supports standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured to find OGC Web services. The research goals are (i) to provide a reality check on the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant to a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The results show the feasibility of the discovery approach and provide data about implementation of the Web Processing Service specification. They also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
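A crawler that checks whether a discovered endpoint really speaks WPS typically requests GetCapabilities and inspects the advertised process offerings. The sketch below shows only that inspection step, run against a canned WPS 1.0.0 response rather than a live server; the process names are invented:

```python
import xml.etree.ElementTree as ET

WPS_NS = "{http://www.opengis.net/wps/1.0.0}"
OWS_NS = "{http://www.opengis.net/ows/1.1}"

def list_processes(capabilities_xml: str):
    """Extract process identifiers from a WPS 1.0.0
    GetCapabilities response document."""
    root = ET.fromstring(capabilities_xml)
    return [ident.text
            for process in root.iter(f"{WPS_NS}Process")
            for ident in process.findall(f"{OWS_NS}Identifier")]

# A trimmed-down capabilities document for illustration.
SAMPLE = """<wps:Capabilities xmlns:wps="http://www.opengis.net/wps/1.0.0"
                xmlns:ows="http://www.opengis.net/ows/1.1">
  <wps:ProcessOfferings>
    <wps:Process wps:processVersion="1">
      <ows:Identifier>buffer</ows:Identifier>
    </wps:Process>
    <wps:Process wps:processVersion="1">
      <ows:Identifier>reproject</ows:Identifier>
    </wps:Process>
  </wps:ProcessOfferings>
</wps:Capabilities>"""
```

An empty result, a parse error, or a non-XML response would mark the endpoint as not WPS-conformant, which is essentially the availability test the paper's crawler applies at scale.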
NASA Astrophysics Data System (ADS)
Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.
2014-11-01
National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. The project generated spatial information at three levels: satellite-based primary information (vegetation type maps; spatial locations of roads and villages; fire occurrences), geospatially derived or modelled information (disturbance index, fragmentation, biological richness), and geospatially referenced field sample plots. The study provides information on areas of high disturbance and high biological richness, suggesting future management strategies and informing action plans. It generated, for the first time, a baseline database for India that will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC-standard WMS and WFS services for the development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query, and map output generation based on user request and response. This is a typical mashup-architecture geoinformation system that allows access to remote web services such as ISRO Bhuvan, OpenStreetMap, and Google Maps, overlaid on the biodiversity data for effective study of bio-resources. Spatial queries and analysis of vector data are achieved through SQL queries on PostGIS and WFS-T operations. The most important challenge, however, is to develop a system for online raster-based geospatial analysis and processing over a user-defined area of interest (AOI) for large raster datasets. The map data of this study comprise five layers of approximately 20 GB each.
An attempt has been made to develop a system using Python, PostGIS, and PHP for raster data analysis over the web for biodiversity conservation and prioritization. The developed system takes user input as WKT, OpenLayers-drawn polygon geometry, or an uploaded shapefile defining the AOI, and performs raster operations using Python and GDAL/OGR. Intermediate products are stored in temporary files and tables, from which XML outputs are generated for web presentation. Raster operations such as clip-zip-ship, class-wise area statistics, single- to multi-layer operations, diagrammatic representation, and other geo-statistical analyses are performed. This is an indigenous geospatial data processing engine developed on an open system architecture for spatial analysis of biodiversity datasets in an Internet GIS environment. The performance of the application in a multi-user environment such as the Internet domain is another challenge, addressed by fine-tuning the source code, server hardening, spatial indexing, and running the process in load-balanced mode. The developed system is hosted on the Internet (http://bis.iirs.gov.in) for user access.
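One of the raster operations mentioned, class-wise area statistics, is simple to sketch. The production system uses GDAL/OGR on large rasters; the pure-Python toy below shows only the counting logic, with an invented grid, class codes, and pixel size:

```python
from collections import Counter

def classwise_area(grid, pixel_size_m, nodata=None):
    """Compute per-class area (in hectares) for a classified raster
    given as a list of rows; pixel_size_m is the cell edge in metres."""
    cell_ha = (pixel_size_m * pixel_size_m) / 10_000.0  # m^2 -> ha
    counts = Counter(v for row in grid for v in row if v != nodata)
    return {cls: n * cell_ha for cls, n in counts.items()}
```

With an AOI clip already applied (masked cells set to the nodata value), this is the whole computation: count cells per class, multiply by cell area. For example, a 100 m pixel is exactly 1 ha, so counts translate directly to hectares.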
GSKY: A scalable distributed geospatial data server on the cloud
NASA Astrophysics Data System (ADS)
Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben
2017-04-01
Earth systems, environmental, and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Combining information from different geospatial collections is in increasing demand in the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojection, resampling, and other transformations. Because of the large data volumes inherent in these collections, storing multiple copies of them is unfeasible, so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done through a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door to such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products, such as satellite Earth observation data and numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY.
This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections, either locally or remotely, extracting, storing, and indexing all spatio-temporal metadata associated with each individual record. GSKY gives the user the ability to specify how ingested data should be aggregated, transformed, and presented. It exposes an OGC standards-compliant interface, giving users ready access to the data via Web Map Services (WMS), Web Processing Services (WPS), or raw data arrays using Web Coverage Services (WCS). The presentation will show cases where we have used this new capability to provide a significant improvement over previous approaches.
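The decoupled indexing service can be pictured as a store of per-file spatio-temporal records that the server queries at request time to find the inputs for a WMS/WCS response. The sketch below is a deliberately simplified toy; the record fields and query signature are assumptions for illustration, not GSKY's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple

BBox = Tuple[float, float, float, float]  # (minx, miny, maxx, maxy)

@dataclass
class Record:
    """One indexed file: its path, spatial footprint, and timestamp."""
    path: str
    bbox: BBox
    timestamp: datetime

def overlaps(a: BBox, b: BBox) -> bool:
    """Axis-aligned bounding-box intersection test."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def query(index: List[Record], bbox: BBox,
          start: datetime, end: datetime) -> List[str]:
    """Return paths of records whose footprint intersects bbox
    and whose timestamp falls within [start, end]."""
    return [r.path for r in index
            if overlaps(r.bbox, bbox) and start <= r.timestamp <= end]
```

A production index would live in a database with spatial indexing rather than a linear scan, but the contract is the same: a request's bounding box and time range map to the set of files the workers must open and process.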
NASA Astrophysics Data System (ADS)
Tisdale, M.
2017-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility and interoperability to meet diversifying user requirements from government, private, public and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Map Services (WMS), and OGC Web Coverage Services (WCS), while leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams at ASDC are utilizing these services through the development of applications using Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript. These services give ASDC data holdings greater exposure to the GIS community and allow broader sharing and distribution to various end users. They provide interactive visualization and improved geospatial analytical tools for mission-critical understanding of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry. The presentation will cover how the ASDC is developing geospatial web services and applications to improve data discoverability, accessibility, and interoperability.
NASA Astrophysics Data System (ADS)
Crosthwaite Eyre, Charles
2010-12-01
Payments for Ecosystem Services (PES) is an exciting and expanding opportunity for sustainably managed forests. PES are derived from a range of ecosystem benefits of forests, including climate change mitigation through afforestation and avoided deforestation, green power generation, wetland and watershed rehabilitation, water quality improvement, marine flood defence and the reduction of desertification and soil erosion. Forests are also the ancestral home of many vulnerable communities that need protection. Sustainable forest management plays a key role in many of these services, which generates a potentially critical source of finance. However, for forests to realise revenues from these PES, they must meet demanding standards of project validation and service verification. They also need geospatial data to manage and monitor operational risk. In many cases the data is difficult to collect on the ground - in some cases impossible. This will create a new demand for data that must be impartial, timely, area-wide, accurate and cost effective. This presentation will highlight the unique capacity of EO to provide the geospatial inputs required in the generation of PES from forestry and demonstrate products with practical examples.
Web catalog of oceanographic data using GeoNetwork
NASA Astrophysics Data System (ADS)
Marinova, Veselka; Stefanov, Asen
2017-04-01
Most of the data collected, analyzed and used by the Bulgarian Oceanographic Data Center (BgODC), from scientific cruises, Argo floats, FerryBoxes and real-time operating systems, are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. To meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data, as well as data from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. Many open-source software solutions can create such a spatial data infrastructure (SDI). GeoNetwork opensource was ultimately chosen, as it is already widespread. This software is a free, effective and "cheap" solution for implementing an SDI at the organizational level; it is platform independent and runs under many operating systems. Populating the catalog involves these practical steps: • managing and storing data reliably in an MS SQL spatial database; • registering maps and data of various formats and sources in GeoServer (the most popular open-source geospatial server, bundled with GeoNetwork); • adding metadata and publishing the geospatial data through the GeoNetwork interface. GeoServer and GeoNetwork are Java-based, so they require a servlet engine such as Tomcat. The experience gained from using GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users, and it is a big step towards implementation of the INSPIRE directive because it provides many of the features necessary for producing "INSPIRE-compliant" metadata records.
The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type and free text search.
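GeoNetwork exposes its catalog through the OGC Catalogue Service for the Web (CSW); the sketch below builds a CSW 2.0.2 GetRecords request (KVP binding) that combines the two search modes the catalog offers, free text and geographic extent. The endpoint path follows GeoNetwork's usual layout but is an assumption and may differ per installation.

```python
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, text=None, bbox=None, max_records=10):
    """Build a CSW 2.0.2 GetRecords URL with a CQL_TEXT constraint
    combining free-text and bounding-box filters."""
    clauses = []
    if text:
        clauses.append("AnyText LIKE '%{}%'".format(text))
    if bbox:  # minx, miny, maxx, maxy
        clauses.append("BBOX(ows:BoundingBox, {}, {}, {}, {})".format(*bbox))
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "resultType": "results",
        "elementSetName": "summary",
        "constraintLanguage": "CQL_TEXT",
        "constraint_language_version": "1.1.0",
        "constraint": " AND ".join(clauses),
        "maxRecords": str(max_records),
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical BgODC endpoint; the bbox roughly covers the western Black Sea.
url = csw_getrecords_url("http://catalog.example.bg/srv/eng/csw",
                         text="temperature", bbox=(27.5, 42.0, 31.5, 44.0))
```

Issuing this GET request against a live GeoNetwork instance would return ISO metadata summaries for matching records.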
Geospatial considerations for a multiorganizational, landscape-scale program
O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.
2013-01-01
Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.
Considerations on Geospatial Big Data
NASA Astrophysics Data System (ADS)
LIU, Zhen; GUO, Huadong; WANG, Changlin
2016-11-01
Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.
NASA Technical Reports Server (NTRS)
Brower, Robert
2004-01-01
This report summarizes the activity conducted under NASA Grant NAG13-02059, entitled "Preserving the Finger Lakes for the Future": A Prototype Decision Support System for Water Resources Management, Open Space and Agricultural Protection, for the period September 26, 2003 to September 25, 2004. The RACNE continues to utilize the services of its affiliate, the Institute for the Application of Geospatial Technology at Cayuga Community College, Inc. (IAGT), for the purposes of this project under its permanent operating agreement with IAGT. IAGT is a 501(c)(3) not-for-profit corporation created by the RACNE for the purpose of carrying out its programmatic and administrative mission. The "Preserving the Finger Lakes for the Future" project has progressed and evolved as planned, with the continuation or initiation of a number of program facets at programmatic, technical, and inter-agency levels. The project has grown from the well-received core concept of the Virtual Management Operations Center (VMOC), to the functional Watershed Virtual Management Operations Center (W-VMOC) prototype, to the more advanced Finger Lakes Decision Support System (FLDSS) prototype, deployed for evaluation and assessment to a wide variety of agencies and organizations in the Finger Lakes region and beyond. This suite of tools offers the advanced, compelling functionality of interactive 3D visualization interfaced with 2D mapping, all accessed via the Internet or virtually any kind of distributed computer network.
Global polar geospatial information service retrieval based on search engine and ontology reasoning
Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang
2007-01-01
In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services is proposed, based on geospatial service search and ontology reasoning. The geospatial service search finds coarse candidate services on the web; the ontology reasoning then refines these coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.
NASA Astrophysics Data System (ADS)
Connor, C. L.; Prakash, A.
2007-12-01
Alaska's secondary school teachers are increasingly required to provide Earth systems science (ESS) education that integrates student observations of local natural processes related to rapid climate change with geospatial datasets and satellite imagery using geographic information systems (GIS) technology. Such skills are also valued in various employment sectors of the state, where job opportunities requiring Earth science and GIS training are increasing. The University of Alaska's EDGE (Experiential Discoveries in Geoscience Education) program has provided training and classroom resources for three cohorts of in-service Alaska science and math teachers in GIS and Earth systems science (2005-2007). Summer workshops include geologic field experiences, GIS instruction, computer equipment and technical support for groups of Alaska high school (HS) and middle school (MS) science teachers each June and their students in August. Since 2005, EDGE has increased Alaska science and math teachers' Earth science content knowledge and developed their GIS and computer skills. In addition, EDGE has guided teachers through a follow-up fall online course that provided more extensive ESS knowledge linked with classroom standards, with course content directly transferable into their MS and HS science classrooms. EDGE teachers were mentored by university faculty and technical staff as they guided their own students through semester-scale, science-fair-style projects using student-collected geospatial data. EDGE program assessment indicates that all teachers have improved their ESS knowledge, GIS knowledge, and use of technology in their classrooms. More than 230 middle school students have learned GIS from EDGE teachers, and 50 EDGE secondary students have conducted original research related to landscape change and its impacts on their own communities.
Longer-term EDGE goals include improving student performance on the newly implemented (spring 2008) standards-based 10th-grade High School Qualifying Exam, recruiting first-generation college students, and increasing the number of Earth science majors in the University of Alaska system.
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.
2016-12-01
Multi-criteria decision making (MCDM) is an advanced analytical method for deriving an appropriate result or decision from multiple, often conflicting, criteria. Geospatial approaches (e.g., remote sensing and GIS) likewise provide advanced technical means to collect, process and analyze spatial data. Combined with MCDM techniques, GIS and remote sensing form an effective platform for complex decision-making processes, and the combination has been used effectively for site selection in urban solid waste management policy. The most popular MCDM techniques are the weighted linear combination (WLC) and the analytic hierarchy process (AHP), the latter widely used as a dependable and consistent decision-making method. The main objective of this study is therefore to develop an AHP model, as the MCDM technique, integrated with a geographic information system (GIS) to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste needs an appropriate landfill site chosen with the environmental, geological, social and technical aspects of the region in mind. An MCDM model was generated from five criterion classes covering these environmental, geological, social and technical factors using the AHP method, and the resulting weights were input into the GIS to produce the final suitability map. The final result shows that 12.2% of the study area, corresponding to 22.89 km2, is suitable.
The study area is Keraniganj sub-district of Dhaka district in Bangladesh, a densely populated area that currently has an unmanaged waste management system and, in particular, lacks suitable landfill sites for waste dumping.
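The core of the AHP step is deriving criterion weights from a pairwise-comparison matrix and checking its consistency before the weights go into the GIS overlay. A minimal sketch, with hypothetical criteria and judgments (not the study's actual matrix):

```python
def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; normalized, it gives the criterion
    weights used in the GIS overlay."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

def consistency_ratio(matrix, w):
    """CR = CI / RI; Saaty's rule of thumb accepts the judgments
    when CR < 0.1."""
    n = len(matrix)
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    return ci / ri

# Hypothetical judgments: distance-to-water > slope > land use > roads.
m = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
w = ahp_weights(m)
cr = consistency_ratio(m, w)
```

Each raster criterion layer is then scored per cell and the weighted sum of the layers yields the suitability surface from which the final landfill candidates are extracted.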
Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India
NASA Astrophysics Data System (ADS)
Mohan, M.
2016-06-01
In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This information is processed with geospatial technologies that play an important role in developing smart cities, particularly in developing countries of the world such as India. The study is based on the latest geospatial satellite imagery available (multi-date, multi-stage, multi-sensor, and multi-resolution). In addition, the latest geospatial technologies were used for digital image processing of the remote sensing satellite imagery, together with the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping and geospatial analysis in support of developing smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information supports 3-dimensional digital modelling of smart cities, which involves integrating spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides a platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.
Browsing and Visualization of Linked Environmental Data
NASA Astrophysics Data System (ADS)
Nikolaou, Charalampos; Kyzirakos, Kostis; Bereta, Konstantina; Dogani, Kallirroi; Koubarakis, Manolis
2014-05-01
Linked environmental data has started to appear on the Web as environmental researchers make use of technologies such as ontologies, RDF, and SPARQL. Many of these datasets have an important geospatial and temporal dimension. The same is true of the Web of data, which is being rapidly populated not only with geospatial information but also with temporal information. As the real-world entities represented in linked geospatial datasets evolve over time, the datasets themselves get updated, and both the spatial and the temporal dimension of the data become significant for users. For example, in the Earth observation and environment domains, data is constantly produced by satellite sensors and is associated with metadata containing, among others, temporal attributes such as the time an image was acquired. In addition, acquisitions are considered valid for specific periods of time, for example until they are updated by new acquisitions. Satellite acquisitions might be utilized in applications such as the CORINE Land Cover programme operated by the European Environment Agency, which makes the land cover of European areas available as a cartographic product. Periodically, CORINE publishes the changes in the land cover of these areas in the form of changesets. Tools for exploiting this abundance of geospatial information have also started to emerge. However, these tools are designed for browsing a single data source, and in addition they cannot represent the temporal dimension. This is for two reasons: a) the lack of an implementation of a data model and query language with temporal features covering the various semantics associated with the representation of time (e.g., valid and user-defined time), and b) the lack of a standard temporal extension of RDF that practitioners could use when publishing RDF data.
Recently, we presented the temporal features of the data model stRDF, the query language stSPARQL, and their implementation in the geospatial RDF store Strabon (http://www.strabon.di.uoa.gr/) which, apart from querying geospatial information, can also be used to query both the valid time of a triple and user-defined time. With the aim of filling the aforementioned gaps and going beyond data exploration to map creation and sharing, we have designed and developed SexTant (http://sextant.di.uoa.gr/). SexTant can be used to produce thematic maps by layering spatiotemporal information which exists in a number of data sources ranging from standard SPARQL endpoints, to SPARQL endpoints following the standard GeoSPARQL defined by the Open Geospatial Consortium (OGC) for the modelling and querying of geospatial information, and other well-adopted geospatial file formats, such as KML and GeoJSON. In this work, we pick some real use cases from the environment domain to showcase the usefulness of SexTant to the environmental studies of a domain expert by presenting its browsing and visualization capabilities using a number of environmental datasets that we have published as linked data and also other geospatial data sources publicly available on the Web, such as KML files.
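To make the GeoSPARQL side concrete, here is a hedged sketch of the kind of query a client like SexTant could send to a GeoSPARQL-compliant endpoint: selecting land-cover areas whose geometry intersects a bounding polygon. The `geo:` and `geof:` prefixes are the IRIs defined by the OGC GeoSPARQL standard; the `clc:` data vocabulary and the polygon are invented for illustration.

```python
# The query is held as a plain string; a real client would POST it to a
# SPARQL endpoint such as a Strabon instance.
QUERY = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX clc:  <http://example.org/corine#>

SELECT ?area ?label ?geom
WHERE {
  ?area a clc:LandCoverArea ;
        clc:label ?label ;
        geo:hasGeometry/geo:asWKT ?geom .
  FILTER(geof:sfIntersects(?geom,
    "POLYGON((23.4 37.8, 24.2 37.8, 24.2 38.3, 23.4 38.3, 23.4 37.8))"^^geo:wktLiteral))
}
"""
```

The result bindings (WKT geometries plus attributes) are exactly what a map client needs to render a thematic layer, which is how such endpoints slot into an OWS-Context-style shared map.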
Strategy for the Identification of an INL Comprehensive Utility Corridor
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Reisenauer
2011-05-01
This report documents the strategy developed to identify a comprehensive utility corridor (CUC) on the Idaho National Laboratory (INL) Site. The strategy established the process by which the Campus Development Office will evaluate land management issues. It is a process that uses geographical information system (GIS) geospatial technology to layer critical INL mission information so that thorough evaluations can be conducted and strategies developed. The objective of the CUC Project was to develop a process that could be implemented to identify potential utility corridor options for consideration. The process had to take into account all the missions occurring on the INL and other land-related issues. The process for developing a CUC strategy consists of the following four basic elements using GIS capabilities: 1. Development of an INL base-layer map; this base-layer map geospatially references all stationary geographical features on the INL and sitewide information. 2. Development of current and future mission land-use-need maps; this involved working with each directorate to identify current mission land-use needs and future land-use needs projected 30 years into the future. 3. Development of restricted-area and potential-constraint maps; this included geospatially mapping areas such as wells, contaminated areas, firing ranges, cultural areas, ecological areas, hunting areas, easements, and grazing areas. 4. Development of a state highway and power line rights-of-way map; this included geospatially mapping rights-of-way along existing state highways and power lines running through the INL that support INL operations. It was determined after completing and evaluating the geospatial information that the area with the least impact on INL missions was around the perimeter of the INL Site. Option 1, in this document, identifies this perimeter; however, it does not mean the entire perimeter is viable.
Many places along the perimeter corridor cannot be used or are not economically viable. Specific detailed studies will need to be conducted on a case-by-case basis to clearly identify which sections along the perimeter can and cannot be used. Option 2, in this document, identifies areas along existing highways that could be a viable option. However, discussions would have to take place with the State of Idaho to use their easement as part of the corridor, and mission impact would need to be evaluated if a specific request was made to the Department of Energy, Idaho Operations Office. Option 3, in this document, is a combination of Options 1 and 2. This option provides the most flexibility to minimize impacts to INL missions. As with the other two options, discussions and agreements with the State of Idaho would be needed, and any specific route would need to be thoroughly evaluated for impact, implementation, and operability beyond just a strategy.
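The layering step described above (elements 1-3) amounts to a boolean overlay: a corridor cell is eligible only if no restriction or constraint layer flags it. A minimal sketch with toy 2x2 grids (the layer names and values are hypothetical, not INL data):

```python
# Each layer is a grid of 0/1 cells: 1 marks an excluded cell
# (well buffer, contaminated area, firing range, ...).

def eligible_cells(layers):
    """Return a grid of booleans: True where no layer excludes the cell."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[all(layer[r][c] == 0 for layer in layers) for c in range(cols)]
            for r in range(rows)]

wells        = [[0, 1],
                [0, 0]]
contaminated = [[0, 0],
                [1, 0]]
ok = eligible_cells([wells, contaminated])
# ok == [[True, False], [False, True]]
```

In a real GIS the same operation is a raster overlay (or vector erase) across the full constraint stack, and the surviving cells are then inspected for contiguous corridor routes.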
Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics
NASA Astrophysics Data System (ADS)
Singh, R.; Bermudez, L. E.
2013-12-01
The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobile devices and context sharing. People now have more and more mobile devices to support their work and personal life. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data. It builds upon a database container, SQLite, that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout are described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is sharing client 'contexts'.
When a user is looking into an article or a product on the web, they can easily share this information with colleagues or friends via an email that includes URLs (links to web resources) and attachments (inline data). In the case of geospatial information, a user would like to share a map created from different OGC sources, which may include for example, WMS and WFS links, and GML and KML annotations. The emerging OGC file format is called the OGC Web Services Context Document (OWS Context), which allows clients to reproduce a map previously created by someone else. Context sharing is important in a variety of domains, from emergency response, where fire, police and emergency medical personnel need to work off a common map, to multi-national military operations, where coalition forces need to share common data sources, but have cartographic displays in different languages and symbology sets. OWS Contexts can be written in XML (building upon the Atom Syndication Format) or JSON. This presentation will provide an introduction of GeoPackage and OWS Context and how they can be used to advance sharing of Earth and Space Science information.
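Because a GeoPackage is just SQLite tables, the "set of tables" idea can be sketched with the standard library alone. The snippet below creates an abridged `gpkg_contents` table (a subset of the columns the GeoPackage specification requires, not the full required schema) and registers one vector layer in it; the layer name and extent are invented.

```python
import sqlite3

# A real GeoPackage would be a .gpkg file on disk; :memory: keeps the
# sketch self-contained.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE gpkg_contents (
    table_name  TEXT NOT NULL PRIMARY KEY,
    data_type   TEXT NOT NULL,        -- e.g. 'features' or 'tiles'
    identifier  TEXT UNIQUE,
    min_x REAL, min_y REAL, max_x REAL, max_y REAL,
    srs_id INTEGER)""")
db.execute("INSERT INTO gpkg_contents VALUES (?,?,?,?,?,?,?,?)",
           ("roads", "features", "Road centrelines",
            -77.6, 42.4, -76.2, 43.3, 4326))
rows = db.execute("SELECT table_name, data_type FROM gpkg_contents").fetchall()
```

This is exactly what makes the format attractive on mobile devices: any platform with an SQLite driver can enumerate the contents table and then read the feature or tile tables it points to, with no server and no sidecar files.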
Challenges in sharing of geospatial data by data custodians in South Africa
NASA Astrophysics Data System (ADS)
Kay, Sissiel E.
2018-05-01
As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is the sharing of geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous for the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.
Donato, David I.; Shapiro, Jason L.
2016-12-13
An effort to build a unified collection of geospatial data for use in land-change modeling (LCM) led to new insights into the requirements and challenges of building an LCM data infrastructure. A case study of data compilation and unification for the Richmond, Va., Metropolitan Statistical Area (MSA) delineated the problems of combining and unifying heterogeneous data from many independent localities such as counties and cities. The study also produced conclusions and recommendations for use by the national LCM community, emphasizing the critical need for simple, practical data standards and conventions for use by localities. This report contributes an uncopyrighted core glossary and a much needed operational definition of data unification.
Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; ...
2015-09-26
Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. As a result, we present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.
People, Places and Pixels: Remote Sensing in the Service of Society
NASA Technical Reports Server (NTRS)
Lulla, Kamlesh
2003-01-01
What is the role of Earth remote sensing and other geospatial technologies in our society? Recent global events have brought into focus the role of geospatial science and technology such as remote sensing, GIS, GPS in assisting the professionals who are responsible for operations such as rescue and recovery of sites after a disaster or a terrorist act. This paper reviews the use of recent remote sensing products from satellites such as IKONOS in these efforts. Aerial and satellite imagery used in land mine detection has been evaluated and the results of this evaluation will be discussed. Synopsis of current and future ISS Earth Remote Sensing capabilities will be provided. The role of future missions in humanitarian use of remote sensing will be explored.
Lin, Yun-Bin; Lin, Yu-Pin; Deng, Dong-Po; Chen, Kuan-Wei
2008-02-19
In Taiwan, earthquakes have long been recognized as a major cause oflandslides that are wide spread by floods brought by typhoons followed. Distinguishingbetween landslide spatial patterns in different disturbance regimes is fundamental fordisaster monitoring, management, and land-cover restoration. To circumscribe landslides,this study adopts the normalized difference vegetation index (NDVI), which can bedetermined by simply applying mathematical operations of near-infrared and visible-redspectral data immediately after remotely sensed data is acquired. In real-time disastermonitoring, the NDVI is more effective than using land-cover classifications generatedfrom remotely sensed data as land-cover classification tasks are extremely time consuming.Directional two-dimensional (2D) wavelet analysis has an advantage over traditionalspectrum analysis in that it determines localized variations along a specific direction whenidentifying dominant modes of change, and where those modes are located in multi-temporal remotely sensed images. Open geospatial techniques comprise a series ofsolutions developed based on Open Geospatial Consortium specifications that can beapplied to encode data for interoperability and develop an open geospatial service for sharing data. This study presents a novel approach and framework that uses directional 2Dwavelet analysis of real-time NDVI images to effectively identify landslide patterns andshare resulting patterns via open geospatial techniques. As a case study, this study analyzedNDVI images derived from SPOT HRV images before and after the ChiChi earthquake(7.3 on the Richter scale) that hit the Chenyulan basin in Taiwan, as well as images aftertwo large typhoons (Xangsane and Toraji) to delineate the spatial patterns of landslidescaused by major disturbances. 
Disturbed spatial patterns of landslides that followed these events were successfully delineated using 2D wavelet analysis, and the results of landslide pattern recognition were distributed simultaneously to other agents using Geography Markup Language. Real-time information allows successive platforms (agents) to work with local geospatial data for disaster management. Furthermore, the proposed approach is suitable for detecting landslides on continental, regional, and local scales using remotely sensed data at various resolutions derived from SPOT HRV, IKONOS, and QuickBird multispectral images.
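The index used in the study above is the standard band ratio NDVI = (NIR − Red) / (NIR + Red), which is why it can be computed immediately after acquisition. A minimal sketch of that computation (the reflectance values below are purely illustrative):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index for two co-registered bands.

    nir, red: arrays of reflectance (or digital-number) values from the
    near-infrared and visible-red channels. eps guards against division
    by zero over water or shadow pixels where both bands are near 0.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Dense vegetation reflects strongly in NIR, so NDVI approaches +1;
# a freshly denuded landslide scar drops toward 0.
veg = ndvi([0.50], [0.08])   # healthy canopy (illustrative values)
bare = ndvi([0.25], [0.20])  # bare slope (illustrative values)
```

A sharp drop in NDVI between two acquisition dates is what the wavelet analysis then localizes spatially.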
NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska
NASA Astrophysics Data System (ADS)
Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.
2012-12-01
Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers, and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for the presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers, and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux/Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps, and layers.
The system provides online access, querying, visualization, and analysis of hydrological data from several sources in one place. The study indicates that Internet GIS, developed using advanced technologies, provides valuable educational potential to users in hydrology and irrigation engineering, and suggests that such a system can support advanced hydrological data access and analysis tools to improve the utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS Server.
NASA Astrophysics Data System (ADS)
Čepický, Jáchym; Moreira de Sousa, Luís
2016-06-01
The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing the inputs and outputs (requests and responses) of geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, client discovery of processes, and the binding of those processes into workflows. Data required by a WPS can be delivered across a network or can already be available at the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
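As a concrete illustration of the request rules the standard defines, a WPS 1.0.0 Execute request can be issued in key-value-pair (GET) form, with DataInputs encoded as semicolon-separated id=value pairs. The sketch below builds such a URL; the endpoint and process identifier are hypothetical, and this shows the generic OGC encoding rather than anything PyWPS-specific:

```python
from urllib.parse import urlencode

def wps_execute_url(endpoint, identifier, inputs):
    """Build a WPS 1.0.0 Execute request in KVP (GET) form.

    Per the OGC WPS 1.0.0 KVP encoding, DataInputs is a single parameter
    whose value is a semicolon-separated list of id=value pairs; urlencode
    then percent-escapes the '=' and ';' characters inside that value.
    """
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": data_inputs,
    })
    return f"{endpoint}?{query}"

# Hypothetical buffer process on a hypothetical endpoint:
url = wps_execute_url("https://example.org/wps", "buffer",
                      {"distance": "10", "crs": "EPSG:4326"})
```

The same Execute operation can also be sent as an XML POST body, which is the form typically used for complex (embedded or referenced) inputs.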
PLANNING QUALITY IN GEOSPATIAL PROJECTS
This presentation will briefly review some legal drivers and present a structure for the writing of geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.
Automatic geospatial information Web service composition based on ontology interface matching
NASA Astrophysics Data System (ADS)
Xu, Xianbin; Wu, Qunyong; Wang, Qinmin
2008-10-01
With Web services technology, the functions of WebGIS can be presented as geospatial information services, helping to overcome the isolation of information in the geospatial information sharing field. Thus geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer value-added services, plays the key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of the algorithm and its great promise for the emerging demand for geospatial information web service composition.
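The core idea, chaining services whose output interface is semantically close to the next service's input interface, can be sketched without the actual WordNet machinery. Below, a toy similarity table stands in for WordNet path distances, and the service names and interface terms are invented for illustration:

```python
# Greedy interface-matching sketch: a chain is grown by repeatedly picking
# the candidate service whose input term is semantically closest to the
# current output term, until the goal output is reached.

SIMILARITY = {  # symmetric semantic similarity in [0, 1]; toy stand-in
    ("coordinates", "location"): 0.9,  # for WordNet-derived distances
    ("location", "map"): 0.2,
    ("elevation", "height"): 0.95,
}

def sim(a, b):
    if a == b:
        return 1.0
    return SIMILARITY.get((a, b), SIMILARITY.get((b, a), 0.0))

def compose(goal_input, goal_output, services, threshold=0.8):
    """services: list of (name, input_term, output_term) candidates.
    Returns the chain of service names, or None if no admissible chain."""
    chain, current = [], goal_input
    remaining = list(services)
    while sim(current, goal_output) < threshold:
        best = max(remaining, key=lambda s: sim(current, s[1]), default=None)
        if best is None or sim(current, best[1]) < threshold:
            return None  # no service's input matches the current output
        chain.append(best[0])
        current = best[2]
        remaining.remove(best)
    return chain

services = [("Geocode", "location", "coordinates"),
            ("DEMQuery", "coordinates", "elevation")]
chain = compose("location", "height", services)
```

In the paper's setting, `sim` would be derived from WordNet distances between the ontology terms annotating each service interface; the greedy step here is only one possible search strategy.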
75 FR 6056 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...
The Adversarial Route Analysis Tool: A Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casson, William H. Jr.
2012-08-02
The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries: a web-based geospatial application, similar to Google Maps, that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
The United States Environmental Protection Agency Geospatial Quality Council developed this document to harmonize the process of collecting, editing, and exporting spatial data of known quality using the Global Positioning System (GPS). Each organizational entity may adopt this d...
The Human Touch: Geospatial Engineering Meets Local Afghans
2012-04-01
flat-bread, sweet cream, water, and tea to serve our guests when visiting elders came to the gate of our forward operating base. This flexibility was...meeting room, where we made introductions, engaged in small talk, and exchanged cell phone numbers. Refreshments were served as we began talking about
Geospatial Service Platform for Education and Research
NASA Astrophysics Data System (ADS)
Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.
2014-04-01
We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held at the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as instances to explain specifically the method of building service chains in view of different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
EPA GEOSPATIAL QUALITY COUNCIL
The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team, EPA/600/R-00/009) was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...
Geospatial Thinking of Information Professionals
ERIC Educational Resources Information Center
Bishop, Bradley Wade; Johnston, Melissa P.
2013-01-01
Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…
2011-03-16
as a Geospatial Technician (Geotech). ...such as asbestos. An RAE Captain is included in the JTF Headquarters, as well as a Geotech. It is also more than likely that a technically qualified
EPA Geospatial Quality Council Strategic and Implementation Plan 2010 to 2015
The EPA Geospatial Quality Council (GQC) was created to promote and provide Quality Assurance guidance for the development, use, and products of geospatial science. The GQC was created when the gap between the EPA Quality Assurance (QA) and Geospatial communities was recognized. ...
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS
This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:
o Guidance for Geospatial Data Quality Assurance Project Plans.
o GPS - Tec...
NASA Astrophysics Data System (ADS)
McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.
2005-05-01
The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to include extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, a thin client, and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), an obvious requirement has arisen to more effectively mine the interconnected sources in near real time. How the GIDB Search addresses this issue is the prime focus of this paper.
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains technical infrastructure information that is important for many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. The methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML), and is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases and written in the universal modeling language UML. A combined model that defines a common data structure was also built; this model was transformed into the GML standard, developed for the exchange of geographic information. The structure of the documents describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server.
Data access was provided by geospatial network services: data search via the Catalog Service for the Web (CSW) and data collection via the Web Feature Service (WFS). WFS also provides operations for data modification, for example allowing the utility administrator to update the data. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
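The update path mentioned above, a utility administrator modifying GESUT features in place, maps to the WFS Transaction operation (transactional WFS). A sketch of building a minimal WFS 1.1.0 Update body; the feature type, feature id, and property names are illustrative, not taken from the GESUT schema:

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
OGC = "http://www.opengis.net/ogc"

def wfs_update(type_name, fid, prop, value):
    """Build a minimal WFS 1.1.0 Transaction/Update request body that sets
    one property on one feature, selected by its feature id."""
    ET.register_namespace("wfs", WFS)
    ET.register_namespace("ogc", OGC)
    tx = ET.Element(f"{{{WFS}}}Transaction",
                    {"service": "WFS", "version": "1.1.0"})
    upd = ET.SubElement(tx, f"{{{WFS}}}Update", {"typeName": type_name})
    prop_el = ET.SubElement(upd, f"{{{WFS}}}Property")
    ET.SubElement(prop_el, f"{{{WFS}}}Name").text = prop
    ET.SubElement(prop_el, f"{{{WFS}}}Value").text = value
    flt = ET.SubElement(upd, f"{{{OGC}}}Filter")
    ET.SubElement(flt, f"{{{OGC}}}FeatureId", {"fid": fid})
    return ET.tostring(tx, encoding="unicode")

# Hypothetical utility feature and property:
body = wfs_update("gesut:PowerLine", "PowerLine.42", "status", "verified")
```

The body would be POSTed to the WFS endpoint; the server replies with a TransactionResponse summarizing the number of features updated.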
Helterbrand, Wm. Steve; Sieverling, Jennifer B.
2008-01-01
The U.S. Geological Survey (USGS) Seventh Biennial Geographic Information Science (GIS) Workshop (USGS-GIS 2008) on May 12 through 16, 2008, at the Denver Federal Center in Denver, Colorado, is unique in that it brings together GIS professionals from all of the USGS disciplines across all regions, and focuses primarily on the needs and accomplishments of the USGS. The theme for the 2008 workshop, "GIS for Tomorrow's Challenges," provides an opportunity for USGS GIS professionals to demonstrate how they have responded to the challenges set forth in the USGS Science Strategy. During this workshop, attendees will have an opportunity to present or demonstrate their work; develop their knowledge by attending hands-on workshops and presentations given by professionals from the USGS and other Federal agencies, GIS-related companies, and academia; and network with other professionals to develop collaborative opportunities. In addition to participation in numerous workshops and presentations, attendees will have opportunities to listen to top-level managers from the USGS present updates and goals concerning the future of several USGS programs. Monday evening's Star Guest presentation by Thomas Wagner, NSF Office of Polar Programs, and Paul Morin, Antarctic Geospatial Information Center, entitled "Mapping all that is White: Antarctic Science and Operations Viewed Through Geospatial Data," will be one of many valuable presentations. This Proceedings volume will serve as an activity reference for workshop attendees, as well as an archive of technical abstracts presented at the workshop. Author, co-author, and presenter names, affiliations, and contact information are listed with presentation titles with the abstracts. Some hands-on sessions are offered twice; in these instances, abstracts submitted for publication are presented in the proceedings on both days on which they are offered. All acronyms used in these proceedings are explained in the text of each abstract.
International Virtual Observatory System for Water Resources Information
NASA Astrophysics Data System (ADS)
Leinenweber, Lewis; Bermudez, Luis
2013-04-01
Sharing, accessing, and integrating hydrologic and climatic data have been identified as a critical need for some time. The current state of data portals, standards, technologies, activities, and expertise can be leveraged to develop an initial operational capability for a virtual observatory system. This system will make it possible to link observation data with stream networks and models, and to resolve semantic inconsistencies among communities. Prototyping a virtual observatory system is an inter-disciplinary, inter-agency, and international endeavor. The Open Geospatial Consortium (OGC), through the OGC Interoperability Program, provides the process and expertise to run such a collaborative effort. The OGC serves as a global forum for collaboration among developers and users of spatial data products and services, and advances the development of international standards for geospatial interoperability. The OGC-coordinated project advancing an international virtual observatory system for water resources information is the Climatology-Hydrology Information Sharing Pilot, Phase 1 (CHISP-1). It includes observations and forecasts in the U.S. and Canada, leveraging current networks and capabilities, and is designed to support the following use cases: 1) Hydrologic modeling for historical and near-future streamflow and groundwater conditions. This requires the integration of trans-boundary streamflow and groundwater well data, as well as national river networks (the US NHD and Canadian NHN), from multiple agencies. Emphasis will be on time series data and real-time flood monitoring. 2) Modeling and assessment of nutrient load into the lakes. This requires accessing water-quality data from multiple agencies and integrating it with streamflow information to calculate loads. Emphasis is on discrete sampled water-quality observations, linking those to specific NHD stream reaches and catchments, and additional metadata for sampled data.
The key objectives of these use cases are: 1) To link observation data to the stream network, enabling queries of conditions upstream of a given location to return all relevant gauges and well locations; this is currently not practical with the available data sources. 2) To bridge differences in semantics across the information models and processes used by the various data producers, improving hydrologic and water-quality modeling capabilities. Other expected benefits of this project include: - Leveraging a large body of existing data holdings and related activities of multiple agencies in the US and Canada. - Influencing the data and metadata standards used internationally for web-based information sharing, through multi-agency cooperation and the OGC standards-setting process. - Reducing procurement risk through partnership-based development of an initial operating capability, versus the cost of building a fully operational system using a traditional "waterfall" approach. - Identifying and clarifying what is possible, and the key technical and non-technical barriers to continued progress in sharing and integrating hydrologic and climatic information. - Promoting understanding and strengthening ties within the hydro-climatic community. This is anticipated to be the first phase of a multi-phase project, with future work on forecasting the hydrologic consequences of extreme weather events and enabling more sophisticated water-quality modeling.
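The first objective, returning all gauges and wells upstream of a query point, is at heart a reverse traversal of the reach network once observations are linked to reaches. A small sketch of that idea; the reach ids and site names below are invented:

```python
from collections import defaultdict, deque

def upstream_sites(reaches, sites, start):
    """Return all monitoring sites on reaches at or upstream of `start`.

    reaches: list of (from_reach, to_reach) flow connections, as in a
    river network such as NHD/NHN. sites: dict mapping reach id -> list
    of gauge/well names linked to that reach.
    """
    inflows = defaultdict(list)          # invert the flow direction
    for frm, to in reaches:
        inflows[to].append(frm)
    found, queue, seen = [], deque([start]), {start}
    while queue:                         # breadth-first walk upstream
        reach = queue.popleft()
        found.extend(sites.get(reach, []))
        for up in inflows[reach]:
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return sorted(found)

# Toy network: reaches A and B both flow into C, which flows into D.
result = upstream_sites([("A", "C"), ("B", "C"), ("C", "D")],
                        {"A": ["g1"], "B": ["g2"], "D": ["g3"]}, "D")
```

In CHISP-style deployments this linkage would be expressed through the river network's own reach identifiers rather than an in-memory graph, but the query semantics are the same.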
Marine vessels as substitutes for heavy-duty trucks in Great Lakes freight transportation.
Comer, Bryan; Corbett, James J; Hawker, J Scott; Korfmacher, Karl; Lee, Earl E; Prokop, Chris; Winebrake, James J
2010-07-01
This paper applies a geospatial network optimization model to explore environmental, economic, and time-of-delivery tradeoffs associated with the application of marine vessels as substitutes for heavy-duty trucks operating in the Great Lakes region. The geospatial model integrates U.S. and Canadian highway, rail, and waterway networks to create an intermodal network and characterizes this network using temporal, economic, and environmental attributes (including emissions of carbon dioxide, particulate matter, carbon monoxide, sulfur oxides, volatile organic compounds, and nitrogen oxides). A case study evaluates tradeoffs associated with containerized traffic flow in the Great Lakes region, demonstrating how choice of freight mode affects the environmental performance of movement of goods. These results suggest opportunities to improve the environmental performance of freight transport through infrastructure development, technology implementation, and economic incentives.
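A geospatial network optimization of this kind reduces to least-cost routing over an intermodal graph whose edges carry temporal, economic, and environmental attributes; changing the objective attribute changes the preferred mode. A minimal sketch using Dijkstra's algorithm; the network, cities, and attribute values below are a toy illustration, not the paper's data:

```python
import heapq

def least_cost_route(edges, origin, dest, attr):
    """Dijkstra over an intermodal network.

    edges: list of (from, to, mode, attrs) where attrs maps attribute
    names (e.g. 'hours', 'co2') to per-leg costs. Returns (total, modes).
    """
    graph = {}
    for frm, to, mode, attrs in edges:
        graph.setdefault(frm, []).append((to, mode, attrs))
    pq, settled = [(0.0, origin, [])], set()
    while pq:
        cost, node, modes = heapq.heappop(pq)
        if node in settled:
            continue
        settled.add(node)
        if node == dest:
            return cost, modes
        for to, mode, attrs in graph.get(node, []):
            if to not in settled:
                heapq.heappush(pq, (cost + attrs[attr], to, modes + [mode]))
    return None

# Toy intermodal choice: truck is faster, vessel emits less (invented numbers).
edges = [("Duluth", "Detroit", "truck", {"hours": 14, "co2": 1.9}),
         ("Duluth", "Detroit", "vessel", {"hours": 40, "co2": 0.6}),
         ("Detroit", "Toronto", "truck", {"hours": 4, "co2": 0.5})]
co2_route = least_cost_route(edges, "Duluth", "Toronto", "co2")
time_route = least_cost_route(edges, "Duluth", "Toronto", "hours")
```

Optimizing emissions selects the vessel leg while optimizing time selects truck throughout, which is the mode-substitution tradeoff the case study quantifies.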
The Geospatial Web and Local Geographical Education
ERIC Educational Resources Information Center
Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.
2010-01-01
Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…
E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.
2014-12-01
Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. The essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view, delivered through a service-oriented architecture for improved decision-making, and then pushed directly to the mobile devices of responders. By adopting a service-oriented architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they support the emergency response community by providing information about the damaging effects of earthquakes.
It is critical for decision makers to maintain a situational awareness that is knowledgeable of potential and current conditions, possible impacts on populations and infrastructure, and other key information. E-DECIDER and the Clearinghouse have worked together to address many of these issues and challenges to deliver interoperable, authoritative decision support products.
NASA Astrophysics Data System (ADS)
Zhong, Yanfei; Han, Xiaobing; Zhang, Liangpei
2018-04-01
Multi-class geospatial object detection from high spatial resolution (HSR) remote sensing imagery is attracting increasing attention in a wide range of object-related civil and engineering applications. However, the distribution of objects in HSR remote sensing imagery is location-variable and complicated, and accurately detecting these objects is a critical problem. Owing to the powerful feature extraction and representation capability of deep learning, integrated frameworks combining deep-learning-based region proposal generation and object detection have greatly improved the performance of multi-class geospatial object detection for HSR remote sensing imagery. However, because of the translation invariance introduced by the convolution operations in a convolutional neural network (CNN), the classification stage is seldom affected, but the localization accuracy of the predicted bounding boxes in the detection stage is easily degraded. This dilemma between translation invariance in the classification stage and translation variance in the object detection stage has not been addressed for HSR remote sensing imagery, and it causes positional accuracy problems for multi-class geospatial object detection with region proposal generation and object detection. To further improve the performance of this integrated framework for HSR remote sensing imagery, a position-sensitive balancing (PSB) framework is proposed in this paper for multi-class geospatial object detection. The proposed PSB framework takes full advantage of the fully convolutional network (FCN), built on a residual network, to resolve the dilemma between translation invariance in the classification stage and translation variance in the object detection stage.
In addition, a pre-training mechanism is utilized to accelerate the training procedure and increase the robustness of the proposed algorithm. The proposed algorithm is validated with a publicly available 10-class object detection dataset.
NASA Technical Reports Server (NTRS)
Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan
2011-01-01
Background: SERVIR -- the Regional Visualization and Monitoring System -- helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions address eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms.
Given that the SERVIR regions are experiencing increased stress under greater climate variability than historical observations indicate, SERVIR provides information to support the development of adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.
NASA Astrophysics Data System (ADS)
Huang, L.; Zhu, X.; Guo, W.; Xiang, L.; Chen, X.; Mei, Y.
2012-07-01
Existing implementations of collaborative image interpretation have many limitations for very large satellite imagery, such as inefficient browsing and slow transmission. This article presents a KML-based approach to support distributed, real-time, synchronous collaborative interpretation of remote sensing images in a geo-browser. As an OGC standard, KML (Keyhole Markup Language) has the advantage of organizing various types of geospatial data (including imagery, annotations, geometry, etc.) in the geo-browser. Existing KML elements can be used to describe simple interpretation results indicated by vector symbols. To broaden its application, this article extends KML elements to describe some complex image processing operations, including band combination, grey-level transformation, and geometric correction. The improved KML is employed to describe and share interpretation operations and results among interpreters. Further, this article develops several collaboration-related services: a collaboration launch service, a perceiving service, and a communication service. The launch service creates a collaborative interpretation task and provides a unified interface for all participants. The perceiving service enables interpreters to share collaboration awareness, and the communication service provides interpreters with written-word communication. Finally, the GeoGlobe geo-browser (an extensible and flexible geospatial platform developed at LIESMARS) is selected to perform experiments in collaborative image interpretation. The geo-browser, which manages and visualizes massive geospatial information, can provide distributed users with quick browsing and transmission. Meanwhile, in the geo-browser, GIS data (for example, DEM, DTM, and thematic maps) can be integrated to help improve the accuracy of interpretation. Results show that the proposed method can support distributed collaborative interpretation of remote sensing images.
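One hedged way to sketch the idea of carrying an image-processing operation inside KML, without reproducing the authors' actual element extensions, is KML 2.2's standard ExtendedData/Data extension point. The field names, operation description, and coordinates below are illustrative:

```python
import xml.etree.ElementTree as ET

KML = "http://www.opengis.net/kml/2.2"

def interpretation_placemark(name, coords, operation):
    """Build a KML Placemark that carries an interpretation result plus
    a description of the image operation that produced it, encoded in
    ExtendedData so any KML 2.2 geo-browser can still display it."""
    ET.register_namespace("", KML)
    kml = ET.Element(f"{{{KML}}}kml")
    doc_el = ET.SubElement(kml, f"{{{KML}}}Document")
    pm = ET.SubElement(doc_el, f"{{{KML}}}Placemark")
    ET.SubElement(pm, f"{{{KML}}}name").text = name
    ext = ET.SubElement(pm, f"{{{KML}}}ExtendedData")
    for key, value in operation.items():
        data = ET.SubElement(ext, f"{{{KML}}}Data", {"name": key})
        ET.SubElement(data, f"{{{KML}}}value").text = value
    pt = ET.SubElement(pm, f"{{{KML}}}Point")
    ET.SubElement(pt, f"{{{KML}}}coordinates").text = coords
    return ET.tostring(kml, encoding="unicode")

# Hypothetical annotation of a landslide scar with the band combination used:
doc = interpretation_placemark(
    "landslide-scar", "114.35,30.52,0",
    {"operation": "band_combination", "bands": "4,3,2"})
```

The article's approach goes further by defining new elements for operations such as geometric correction; the sketch above only shows the general pattern of pairing a vector result with processing metadata in one shareable document.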
Lights Out Operations of a Space, Ground, Sensorweb
NASA Technical Reports Server (NTRS)
Chien, Steve; Tran, Daniel; Johnston, Mark; Davies, Ashley Gerard; Castano, Rebecca; Rabideau, Gregg; Cichy, Benjamin; Doubleday, Joshua; Pieri, David; Scharenbroich, Lucas;
2008-01-01
We have been operating an autonomous, integrated sensorweb linking numerous space and ground sensors in 24/7 operations since 2004. This sensorweb includes elements of space data acquisition (MODIS, GOES, and EO-1), space asset retasking (EO-1), and integration of data acquired from ground sensor networks with on-demand ground processing of data into science products. These assets are being integrated using web service standards from the Open Geospatial Consortium. Future plans include extension to fixed and mobile surface and subsurface sea assets as part of the NSF's ORION Program.
NASA Astrophysics Data System (ADS)
Gordov, E.; Shiklomanov, A.; Okladnikov, I.; Prusevich, A.; Titov, A.
2016-11-01
We present an approach and first results of a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, UNH, USA. Its main objective is the development of a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes in Northern extratropical areas. The DRC should provide specialists working in climate-related sciences and decision-makers with accurate and detailed climatic characteristics for a selected area, together with reliable and affordable tools for in-depth statistical analysis and studies of the effects of climate change. Within the framework of the project, new approaches to cloud-based processing and analysis of large geospatial datasets (big geospatial data) inherent to climate change studies are developed and deployed on the technical platforms of both institutions. We discuss here the state of the art in this domain, describe web-based information-computational systems developed by the partners, justify the methods chosen to reach the project goal, and briefly list the results obtained so far.
A Software Engineering Paradigm for Quick-turnaround Earth Science Data Projects
NASA Astrophysics Data System (ADS)
Moore, K.
2016-12-01
As is generally the case with applied-science professional and educational programs, participants can come from a variety of technical backgrounds. In the NASA DEVELOP National Program, the participants constitute an interdisciplinary set of backgrounds, with varying levels of experience with computer programming. DEVELOP makes use of geographically explicit data sets, and it is necessary to use geographic information systems and geospatial image processing environments. As data sets cover longer time spans and include more complex sets of parameters, automation is becoming an increasingly prevalent feature. Though platforms such as ArcGIS, ERDAS Imagine, and ENVI facilitate the batch processing of geospatial imagery, these environments inherently constrain users to the tools that are available. Users must then turn to "homemade" scripting in more traditional programming languages, such as Python, JavaScript, or R, to automate workflows. However, in the context of quick-turnaround projects like those in DEVELOP, the programming learning curve may be prohibitively steep. In this work, we consider how best to design a software development paradigm that addresses two major constants: an arbitrarily experienced programmer and quick-turnaround project timelines.
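A minimal sketch of the kind of "homemade" batch automation the abstract refers to, assuming a directory of scene files; the processing step is a placeholder that a real DEVELOP workflow would replace with a call into arcpy, GDAL, or an ENVI/ERDAS batch interface.

```python
# Minimal batch-automation skeleton: loop over a directory of scenes and
# apply one processing step to each. process_scene is a stand-in; its body
# (a copy-with-rename) is only there to make the sketch runnable.
from pathlib import Path

def process_scene(src: Path, dst_dir: Path) -> Path:
    """Placeholder processing step; a real one would transform the raster."""
    dst = dst_dir / (src.stem + "_processed" + src.suffix)
    dst.write_bytes(src.read_bytes())
    return dst

def batch(src_dir: Path, dst_dir: Path, pattern: str = "*.tif") -> list:
    """Process every scene matching pattern, creating dst_dir if needed."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    return [process_scene(p, dst_dir) for p in sorted(src_dir.glob(pattern))]
```

Even a skeleton this small captures the two constants named above: it is short enough for a novice to modify, and the single `process_scene` seam is where an experienced programmer plugs in real geospatial tooling.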
Analysis of Radar and Optical Space Borne Data for Large Scale Topographical Mapping
NASA Astrophysics Data System (ADS)
Tampubolon, W.; Reinhardt, W.
2015-03-01
Normally, in order to provide high-resolution three-dimensional (3D) geospatial data, large scale topographical mapping needs input from conventional airborne campaigns, which in Indonesia are bureaucratically complicated, especially with respect to legal administration procedures, i.e. security clearance from the military/defense ministry. This often causes additional time delays besides technical constraints such as weather and limited aircraft availability for airborne campaigns. Geospatial data quality is, of course, an important issue for many applications. The increasing demand for geospatial data nowadays consequently requires high-resolution datasets as well as a sufficient level of accuracy. Therefore an integration of different technologies is required in many cases to achieve the expected result, especially in the context of disaster preparedness and emergency response. Another important issue in this context is the fast delivery of relevant data, expressed by the term "Rapid Mapping". In this paper we present first results of ongoing research to integrate different data sources such as space-borne radar and optical platforms. Initially, the orthorectification of Very High Resolution Satellite (VHRS) imagery, i.e. SPOT-6, has been performed as a continuous process with the DEM generation using TerraSAR-X/TanDEM-X data. Ground Control Points (GCPs) from GNSS surveys are mandatory in order to fulfil the geometric accuracy requirements. In addition, this research aims at providing a suitable processing algorithm for space-borne data for large scale topographical mapping, as described in section 3.2. Recently, radar space-borne data has been used for medium scale topographical mapping, e.g. for the 1:50.000 map scale in Indonesian territories. The goal of this ongoing research is to increase the accuracy of remote sensing data through different activities, e.g. the integration of different data sources (optical and radar) or the use of the GCPs in both the optical and the radar satellite data processing. Finally, these results will serve in the future as a reference for further geospatial data acquisitions to support topographical mapping at even larger scales, up to the 1:10.000 map scale.
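The geometric-accuracy role of the GCPs can be illustrated with the standard check-point computation: the planimetric root-mean-square error of orthoimage coordinates against GNSS-surveyed control points. This is the generic textbook formula, not code from the paper, and the coordinate values in the test are invented.

```python
# Planimetric RMSE of measured (orthoimage) coordinates against reference
# (GNSS-surveyed GCP) coordinates, in the same units as the input.
import math

def rmse(measured, reference):
    """measured, reference: sequences of (x, y) pairs, matched by index."""
    sq = [(mx - rx) ** 2 + (my - ry) ** 2
          for (mx, my), (rx, ry) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))
```

Mapping-accuracy standards for a target scale are typically expressed as a ceiling on exactly this quantity, which is why independent GNSS check points are mandatory in the workflow above.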
Generation of Multiple Metadata Formats from a Geospatial Data Repository
NASA Astrophysics Data System (ADS)
Hudspeth, W. B.; Benedict, K. K.; Scott, S.
2012-12-01
The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. 
title, abstract, publication data, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance, or trace of data sources and analytical methods used in a scientific analysis, for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery format independent data structure, and delivery of PML, ISO, and FGDC documents to clients requesting those products.
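The "one store, many metadata formats" pattern can be sketched in a drastically simplified form: a single internal record rendered as a minimal FGDC CSDGM fragment and a minimal ISO 19139 fragment. Real Gstore output carries far more elements and would normally be produced by templating or XSLT; only the title mapping is shown, and the record content is invented.

```python
# One internal record, two serializations. Element paths follow the FGDC
# CSDGM and ISO 19139 (gmd/gco) conventions for a dataset title.
record = {"title": "MODIS NDVI, New Mexico, 2011"}

def to_fgdc(rec):
    return ("<metadata><idinfo><citation><citeinfo>"
            f"<title>{rec['title']}</title>"
            "</citeinfo></citation></idinfo></metadata>")

def to_iso19139(rec):
    gmd = "http://www.isotc211.org/2005/gmd"
    gco = "http://www.isotc211.org/2005/gco"
    return (f'<gmd:MD_Metadata xmlns:gmd="{gmd}" xmlns:gco="{gco}">'
            "<gmd:identificationInfo><gmd:MD_DataIdentification>"
            "<gmd:citation><gmd:CI_Citation><gmd:title>"
            f"<gco:CharacterString>{rec['title']}</gco:CharacterString>"
            "</gmd:title></gmd:CI_Citation></gmd:citation>"
            "</gmd:MD_DataIdentification></gmd:identificationInfo>"
            "</gmd:MD_Metadata>")
```

Because both renderers read from the same store, adding a third format (such as PML for provenance) means adding one more renderer, not re-cataloging the data.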
ERIC Educational Resources Information Center
Hogrebe, Mark C.; Tate, William F., IV
2012-01-01
In this chapter, "geospatial" refers to geographic space that includes location, distance, and the relative position of things on the earth's surface. Geospatial perspective calls for the addition of a geographic lens that focuses on place and space as important contextual variables. A geospatial view increases one's understanding of…
Geospatial Data Curation at the University of Idaho
ERIC Educational Resources Information Center
Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.
2012-01-01
The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…
Mark Kimsey; Deborah Page-Dumroese; Mark Coleman
2011-01-01
Biomass harvesting for energy production and forest health can impact the soil resource by altering inherent chemical, physical and biological properties. These impacts raise concern about damaging sensitive forest soils, even with the prospect of maintaining vigorous forest growth through biomass harvesting operations. Current forest biomass harvesting research...
NASA Astrophysics Data System (ADS)
Tisdale, M.
2016-12-01
NASA's Atmospheric Science Data Center (ASDC) is operationally using the Esri ArcGIS Platform to improve data discoverability, accessibility, and interoperability in order to meet requirements driven by diversifying government, private, public, and academic communities. The ASDC is actively working to provide its mission-essential datasets as ArcGIS Image Services, Open Geospatial Consortium (OGC) Web Mapping Services (WMS), and OGC Web Coverage Services (WCS), and is leveraging the ArcGIS multidimensional mosaic dataset structure. Science teams and the ASDC are utilizing these services, developing applications using the Web AppBuilder for ArcGIS and the ArcGIS API for JavaScript, and evaluating restructuring their data production and access scripts within the ArcGIS Python Toolbox framework and geoprocessing service environment. These capabilities yield greater usage and exposure of ASDC data holdings and provide improved geospatial analytical tools for mission-critical understanding in the areas of the Earth's radiation budget, clouds, aerosols, and tropospheric chemistry.
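As a concrete illustration of consuming such services, a standard OGC WMS 1.3.0 GetMap request can be assembled as a URL. The endpoint and layer name below are placeholders, not real ASDC identifiers; the parameter set follows the WMS specification.

```python
# Build a WMS 1.3.0 GetMap URL from its standard query parameters.
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512), crs="EPSG:4326"):
    """bbox is (miny, minx, maxy, maxx) axis order for EPSG:4326 in WMS 1.3.0."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(map(str, bbox)),
        "WIDTH": size[0], "HEIGHT": size[1], "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Placeholder endpoint and layer for illustration only.
url = wms_getmap_url("https://example.gov/wms", "ceres_toa_flux",
                     (-90, -180, 90, 180))
```

Any WMS-capable client (a browser, QGIS, or a script) can fetch the rendered map from such a URL, which is what makes the OGC service interfaces interoperable across the communities named above.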
NASA Technical Reports Server (NTRS)
Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt
2002-01-01
Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.
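The similarity test at the heart of the monitor can be sketched as follows: compare a sensed terrain profile against the stored database profile with a normalized correlation, and flag a potential database fault when the score drops below a threshold. The threshold value here is purely illustrative; in the paper it would come from a detection-theoretic false-alarm analysis, and the real similarity measures operate on extracted features rather than raw profiles.

```python
# Normalized cross-correlation between a sensed profile (e.g. from forward-
# looking radar) and the corresponding stored terrain-database profile.
import math

def ncc(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def database_consistent(sensed, stored, threshold=0.9):
    """True when the sensed and stored profiles agree above the threshold."""
    return ncc(sensed, stored) >= threshold
```

Framing the decision as a threshold on a similarity statistic is exactly what lets classical detection theory trade missed-fault probability against false alarms.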
Geospatial Modelling Approach for 3d Urban Densification Developments
NASA Astrophysics Data System (ADS)
Koziatek, O.; Dragićević, S.; Li, S.
2016-06-01
With growing populations, economic pressures, and the need for sustainable practices, many urban regions are rapidly densifying developments in the vertical built dimension with mid- and high-rise buildings. The location of these buildings can be projected based on key factors that are attractive to urban planners, developers, and potential buyers. Current research in this area includes various modelling approaches, such as cellular automata and agent-based modelling, but the results are mostly linked to raster grids as the smallest spatial units that operate in two spatial dimensions. Therefore, the objective of this research is to develop a geospatial model that operates on irregular spatial tessellations to model mid- and high-rise buildings in three spatial dimensions (3D). The proposed model is based on the integration of GIS, fuzzy multi-criteria evaluation (MCE), and 3D GIS-based procedural modelling. Part of the City of Surrey, within the Metro Vancouver Region, Canada, has been used to present the simulations of the generated 3D building objects. The proposed 3D modelling approach was developed using ESRI's CityEngine software and the Computer Generated Architecture (CGA) language.
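The fuzzy MCE step of the model can be sketched as follows: each raw criterion is rescaled to a [0, 1] fuzzy membership and the memberships are combined with analyst weights into a densification-suitability score per parcel. The criteria names, control values, and weights below are illustrative assumptions, not the paper's calibration.

```python
# Linear fuzzy membership plus weighted combination, per parcel.
def linear_membership(x, lo, hi):
    """0 at/below lo, 1 at/above hi, linear in between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def suitability(parcel, weights):
    # Hypothetical criteria: proximity to transit (closer is better),
    # zoning permission (binary), and land value (higher is better).
    memberships = {
        "transit": 1.0 - linear_membership(parcel["dist_transit_m"], 100, 1200),
        "zoning":  parcel["zoning_allows_highrise"],  # already 0 or 1
        "landval": linear_membership(parcel["land_value"], 0.2e6, 2e6),
    }
    return sum(weights[k] * memberships[k] for k in weights)
```

In the full model the high-scoring parcels are the ones handed to the CGA procedural rules for 3D building generation.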
Real-time access of large volume imagery through low-bandwidth links
NASA Astrophysics Data System (ADS)
Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew
2010-04-01
Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded through rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access as an innovative approach based on standard off-the-shelf techniques to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution system is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.
Grid computing enhances standards-compatible geospatial catalogue service
NASA Astrophysics Data System (ADS)
Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang
2010-04-01
A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the Catalogue Service for the Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of, and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards: the International Organization for Standardization (ISO) 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management and information services from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. 
This service makes it easy to share and interoperate geospatial resources by using Grid technology and extends Grid technology into the geoscience communities.
NASA Astrophysics Data System (ADS)
Runyan, T. E.; Wood, W. T.; Palmsten, M. L.; Zhang, R.
2016-12-01
Gas hydrates, specifically methane hydrates, are sparsely sampled on a global scale, and their accumulation is difficult to predict geospatially. Several attempts have been made at estimating global inventories, and to some extent geospatial distribution, using geospatial extrapolations guided by geophysical and geochemical methods. Our objective is to quantitatively predict the geospatial likelihood of encountering methane hydrates, with uncertainty. Predictions could be incorporated into analyses of drilling hazards as well as climate change. We use global data sets (including water depth, temperature, pressure, TOC, sediment thickness, and heat flow) as parameters to train a k-nearest neighbor (KNN) machine learning technique. The KNN is unsupervised and non-parametric; we do not impose any interpretive influence on the prior probability distribution, so our results are strictly data driven. We have selected as test sites several locations where gas hydrates have been well studied, each with a significantly different geologic setting. These include the Blake Ridge (U.S. East Coast), Hydrate Ridge (U.S. West Coast), and the Gulf of Mexico. We then use the KNN to quantify similarities between these sites and determine, via the distance in parameter space, the likelihood and uncertainty of encountering gas hydrate anywhere in the world. Here we operate under the assumption that similarity (small distance) in parameter space corresponds to a higher probability of the occurrence of gas hydrate. We then compare the global similarity maps made from our several test sites to identify the geologic (geophysical, bio-geochemical) parameters best suited for predicting gas hydrate occurrence.
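The parameter-space similarity idea can be sketched in miniature: standardize each global predictor (depth, temperature, TOC, and so on), then treat the distance from any grid cell to the nearest well-studied hydrate site as inversely related to the likelihood of hydrate occurrence. The feature values in the test are invented; the real workflow operates on the gridded global data sets named in the abstract.

```python
# Standardize feature columns, then measure nearest-site distance in the
# standardized parameter space.
import math

def standardise(rows):
    """Z-score each column of a list of equal-length feature rows."""
    cols = list(zip(*rows))
    mus = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((v - m) ** 2 for v in c) / len(c)) or 1.0
           for c, m in zip(cols, mus)]
    return [[(v - m) / s for v, m, s in zip(r, mus, sds)] for r in rows]

def nearest_site_distance(cell, sites):
    """Distance from one grid cell's feature vector to its closest test site."""
    return min(math.dist(cell, s) for s in sites)
```

Standardizing first matters because the raw predictors have wildly different units (meters of water depth versus percent TOC); without it one parameter would dominate the distance.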
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, etc. Geospatial data is created by different organizations using different methods (remote sensing observations, field surveys, model simulations, etc.) and stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a major barrier to its sharing and use, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while OGC Web Coverage Services (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that standard mechanisms for data discovery and access are not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. 
The ORNL DAAC and MAST-DC address this geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) that provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
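The naming discipline behind "semantically interoperable" variables can be sketched with the minimal CF-convention attributes a converted NetCDF variable would carry. The `standard_name` "surface_temperature" and units "K" are real CF entries; the short name "tsurf" and fill value are arbitrary choices for illustration, and a real conversion would write these with a NetCDF library rather than hold them in a dict.

```python
# Minimal CF-style variable metadata: a short name chosen by the archive,
# plus the attributes that make the variable machine-interpretable.
cf_variable = {
    "name": "tsurf",  # arbitrary short name
    "attributes": {
        "standard_name": "surface_temperature",  # from the CF standard-name table
        "units": "K",
        "coordinates": "lat lon time",
        "_FillValue": -9999.0,
    },
}

def is_cf_minimal(var):
    """Check for the minimal attributes a CF-aware tool expects."""
    need = {"standard_name", "units"}
    return need <= set(var["attributes"])
```

A modeling tool never needs to guess what "tsurf" means: the `standard_name` and `units` attributes carry the semantics, which is exactly the interoperability the presentation argues for.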
OOI's Cyberinfrastructure: An Opening
NASA Astrophysics Data System (ADS)
Graybeal, J.; Ampe, T.; Arrott, M.; Chave, A. D.; Cressey, R.; Jul, S.; McPhail, T.; Meisinger, M.; Orcutt, J. A.; Peach, C. L.; Schofield, O.; Stocks, K.; Thomas, J.; Vernon, F.
2012-12-01
The Ocean Observatories Initiative is a long-term, NSF-funded program to provide 25-30 years of sustained ocean measurements to study climate variability, ocean circulation and ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI will enable powerful new scientific approaches for exploring the complexities of Earth-ocean-atmosphere interactions, thereby accelerating progress toward the goal of understanding, predicting, and managing our ocean environment. The OOI can foster new discoveries that, in turn, move research in unforeseen directions. The OOI Cyberinfrastructure will connect and coordinate the operations of OOI marine components and data processes to meet the objectives of the oceanographic research and education communities. The CI will let all users easily interact with deployed resources, access collected data, and apply those data to their specific research and educational needs. The CI is a free and open product that adopts innovative and flexible strategies to bring the oceans to users, any time, any place, on any suitable device. The OOI CI is dedicated to "using the latest computing technologies to solve the interoperability problem among vast amounts of heterogeneous geospatial data from various sources." OOI CI's charge is to be transformative, and its technologies and goals are just that (see URL). The Cyberinfrastructure integrates state-of-the-art and best-practice approaches to provide fully interoperable access to the widest possible collection of geospatial data. From the system-of-systems model of the planned observatories and the ingestion of data, models, and services, to the configurable, automated workflows producing real-time products, data curation and quality-management strategies are supported to the fullest possible extent. 
How do we build a system to efficiently support 750 core instruments across numerous platform types, add as-yet unknown instruments during the operations phase, and support any number of processes and external data in the system throughout its 25+ years of operation? What key strategies must be adopted, architectural approaches applied, and technologies integrated to provide complete discovery, access, and use of the system and its data? What defines the critical characteristics expected of the core system, the complete system, and the transformative system? And how can this system be leveraged by multiple science users, programs, and organizations beyond its initial target functionality? We will present the CI team's best responses to these questions. The project is completing Release 2, two-thirds of the way to a fully public release, and halfway to the final system. The engagement of OOI marine operations and marine science teams prepares us to support marine operations, and the software will be applied to "real operations" very soon. Most of the fundamental marine and operational scenarios are in place at a basic level, and the capabilities have been laid out for a full suite of mature operations and science activities. From these beginnings, we offer technical, social, and strategic perspectives on the challenges and solutions in geoinformatics data systems, and ask "Where to from here?" Funding for OOI is provided by the National Science Foundation through a Cooperative Agreement with the Consortium for Ocean Leadership, which in turn funds the CI project.
A Framework for an Open Source Geospatial Certification Model
NASA Astrophysics Data System (ADS)
Khan, T. U. R.; Davis, P.; Behr, F.-J.
2016-06-01
The geospatial industry is forecast to experience enormous growth in the forthcoming years, with an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arenas as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and modes of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies-of-knowledge concepts, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry, and it essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. 
An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide. 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with in total eleven sub-categories, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be flanked by proofs of professional career paths and achievements which need a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks the accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions to achieve wide and international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model is examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.
NASA Astrophysics Data System (ADS)
Wakil, K.; Hussnain, M. Q.; Tahir, A.; Naeem, M. A.
2016-06-01
Unmanaged placement, size, location, structure, and content of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, responsible authorities are hampered in effective implementation by the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for the regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to determine whether a new board will affect the existing linear density on a particular road and whether it violates any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built on open geospatial tools, including the OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets (the road network, locations of existing billboards, and building parcels with land-use information) to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS that can assist in solving space-related iterative decision challenges concerning outdoor advertisements. Employing such a system will result in effective implementation of regulations, leading to visual harmony and aesthetic improvement in urban communities.
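The AHP/WLC step can be sketched as follows: derive criteria weights from a pairwise comparison matrix by extracting its principal eigenvector (here via power iteration), then combine standardized criterion scores linearly. The 3x3 matrix compares three hypothetical criteria (say, road density, land-use buffer compliance, and visibility); the judgments in it are invented.

```python
# AHP weights via power iteration on the pairwise comparison matrix,
# followed by a weighted linear combination of criterion scores.
def ahp_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

def wlc(scores, weights):
    """Weighted linear combination of criterion scores in [0, 1]."""
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical pairwise judgments: criterion 1 moderately-to-strongly
# preferred over criteria 2 and 3 (reciprocals fill the lower triangle).
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
weights = ahp_weights(pairwise)
```

The reciprocal structure of the matrix is what AHP's consistency checks rely on; in a production SDSS one would also compute the consistency ratio before trusting the weights.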
NASA Astrophysics Data System (ADS)
Deo, Ram K.
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on, and threats to, forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly, and as a consequence spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest based k Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. 
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
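The imputation idea behind RF-kNN can be sketched as follows. This is a simplified stand-in: the actual method measures plot similarity with random-forest proximities, whereas the sketch uses plain Euclidean distance in predictor space, and all plot values are invented for illustration.

```python
# A simplified sketch of kNN imputation in the spirit of RF-kNN: reference plots
# with known attributes (e.g., biomass) are matched to a target location by
# predictor similarity, and the k nearest references' attributes are averaged.
# Plain Euclidean distance stands in for random-forest proximity here.

def knn_impute(target_predictors, references, k=2):
    """references: list of (predictor_vector, attribute). Returns the imputed attribute."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(references, key=lambda ref: dist(target_predictors, ref[0]))[:k]
    return sum(attr for _, attr in nearest) / k

# Hypothetical reference plots: (spectral predictors, biomass in Mg/ha).
plots = [
    ([0.1, 0.8], 120.0),
    ([0.2, 0.7], 110.0),
    ([0.6, 0.3], 40.0),
    ([0.7, 0.2], 35.0),
]
biomass = knn_impute([0.15, 0.75], plots, k=2)  # target resembles the first two plots
```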
NASA Astrophysics Data System (ADS)
XIA, J.; Yang, C.; Liu, K.; Huang, Q.; Li, Z.
2013-12-01
Big Data is becoming increasingly important in almost all scientific domains, especially in the geosciences, where hundreds to millions of sensors are collecting data about the Earth continuously (Whitehouse 2012). With the explosive growth of data, various Geospatial Cyberinfrastructure (GCI) components (Yang et al. 2010) have been developed to manage geospatial resources and provide data access for the public. These GCIs are accessed intensively by different users on a daily basis. However, little research has been done to analyze the spatiotemporal patterns of user behavior, which could be critical to the management of Big Data and the operation of GCIs (Yang et al. 2011). For example, the spatiotemporal distribution of end users helps us better arrange and locate GCI computing facilities, and a better indexing and caching mechanism could be developed based on the spatiotemporal pattern of user queries. In this paper, we use the GEOSS Clearinghouse as an example to investigate spatiotemporal patterns of user behavior in GCIs. The results show that user behaviors are heterogeneous but exhibit patterns across space and time. Identified patterns include (1) high-access-frequency regions; (2) local interests; (3) periodic accesses and rush hours; and (4) spiking accesses. Based on the identified patterns, this presentation reports several solutions to better support the operation of the GEOSS Clearinghouse and other GCIs.
Keywords: Big Data, EarthCube, CyberGIS, Spatiotemporal Thinking and Computing, Data Mining, User Behavior
References:
Fayyad, U. M., Piatetsky-Shapiro, G., Smyth, P., & Uthurusamy, R. 1996. Advances in Knowledge Discovery and Data Mining.
Whitehouse. 2012. Obama administration unveils 'BIG DATA' initiative: announces $200 million in new R&D investments. Retrieved from http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf [Accessed 14 June 2013]
Yang, C., Wu, H., Huang, Q., Li, Z., & Li, J. 2011. Using spatial principles to optimize distributed computing for enabling the physical science discoveries. Proceedings of the National Academy of Sciences, 108(14), 5498-5503. doi:10.1073/pnas.0909315108
Yang, C., Raskin, R., Goodchild, M., & Gahegan, M. 2010. Geospatial Cyberinfrastructure: Past, present and future. Computers, Environment and Urban Systems, 34(4), 264-277. doi:10.1016/j.compenvurbsys.2010.04.001
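The pattern-mining step described above (high-access-frequency regions, rush hours) can be sketched with simple frequency counts over an access log. The log entries and region labels below are invented; a real analysis would work on geolocated Clearinghouse request logs.

```python
# A toy sketch of mining spatiotemporal access patterns from a GCI request log.
# Entries are hypothetical (region, hour-of-day) pairs, one per request.

from collections import Counter

log = [
    ("US-East", 14), ("US-East", 14), ("Europe", 9),
    ("US-East", 15), ("Europe", 9), ("Asia", 2),
]

by_region = Counter(region for region, _ in log)   # spatial access frequency
by_hour = Counter(hour for _, hour in log)         # temporal access frequency

top_region, _ = by_region.most_common(1)[0]   # high-access-frequency region
rush_hour, _ = by_hour.most_common(1)[0]      # candidate "rush hour"
```

Such counts could then drive cache sizing or replica placement per region and hour.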
Cool Apps: Building Cryospheric Data Applications with Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Oldenburg, J.; Truslove, I.; Collins, J. A.; Liu, M.; Lewis, S.; Brodzik, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-defined service endpoints which follow a RESTful architecture. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal), which depends on many of the aforementioned services, retrieving data in several ways. The maps it displays are obtained through the WMS and WFS protocols from a MapServer instance hosted at NSIDC. Links to the scientific data collected on Operation IceBridge campaigns are obtained through ESIP OpenSearch requests to service providers that encapsulate our metadata databases. These standards-based web services are also developed at NSIDC and are designed to be used independently of the Portal. 
This poster provides a visual representation of the relationships described above, with additional details and examples, and more generally outlines the benefits and challenges of this SOA approach.
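The WMS consumption pattern described above amounts to composing a standard GetMap request. The sketch below builds such a URL with the standard WMS 1.1.1 parameters; the base URL and layer name are placeholders, not actual NSIDC endpoints.

```python
# A minimal sketch of composing a WMS 1.1.1 GetMap request of the kind a portal
# client might issue against a MapServer endpoint. Base URL and layer name are
# hypothetical placeholders.

from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "sea_ice_extent",
                     (-180, 60, 180, 90))
```

Because the interface is standardized, the same client code works against any compliant WMS server, which is the interoperability benefit the SOA approach is after.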
Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.
2013-12-01
Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bounds or other criteria.
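A query for "discovery by geospatial bounds" might look like the sketch below. The vocabulary prefix and property names are hypothetical placeholders for the proposed geo-spatial vocabulary, which the abstract does not spell out.

```python
# A sketch of the kind of SPARQL query the proposed semantic repository might
# answer: select datasets whose bounding box falls inside a query box. The
# geo: prefix and property names are invented placeholders.

def bbox_query(min_lat, max_lat, min_lon, max_lon):
    return f"""
PREFIX geo: <http://example.org/geo-vocab#>
SELECT ?dataset WHERE {{
  ?dataset geo:minLat ?minLat ; geo:maxLat ?maxLat ;
           geo:minLon ?minLon ; geo:maxLon ?maxLon .
  FILTER (?minLat >= {min_lat} && ?maxLat <= {max_lat} &&
          ?minLon >= {min_lon} && ?maxLon <= {max_lon})
}}
"""

query = bbox_query(30, 50, -130, -60)  # roughly the contiguous United States
```

The query string would be posted to the repository's SPARQL endpoint; results can then be joined with non-spatial criteria in the same query.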
NASA Astrophysics Data System (ADS)
Karnatak, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.; Srivastav, S. K.; Gupta, P. K.
2014-11-01
IIRS initiated its interactive distance-education-based capacity building under the IIRS outreach programme in 2007; since then, more than 15,000 students have been trained in the field of geospatial technology using satellite-based interactive terminals and internet-based learning with the A-View software. During the last decade, the use of Internet technology by different user groups in society has emerged as a technological revolution that directly affects people's lives. The Internet is used extensively in India for various purposes, from entertainment to critical decision making in government machinery. The role of Internet technology is very important for capacity building in any discipline, as it can satisfy the needs of the maximum number of users in minimum time. To further enhance the outreach of geospatial science and technology, IIRS has initiated e-learning-based certificate courses of different durations. The content for the e-learning-based capacity building programme is developed for various target user groups, including mid-career professionals, researchers, academia, fresh graduates, and user-department professionals from different State and Central Government ministries. The official website of IIRS e-learning is hosted at http://elearning.iirs.gov.in. The content of the IIRS e-learning programme is flexible for anytime, anywhere learning, keeping in mind the demands of a geographically dispersed audience and their requirements. The programme is comprehensive, with a variety of online delivery modes that are interactive and easy to learn, and with a proper blend of concepts and practicals to elicit students' full potential. The course content of this programme includes Image Statistics, Basics of Remote Sensing, Photogrammetry and Cartography, Digital Image Processing, Geographical Information System, Global Positioning System, Customization of Geospatial Tools and Applications of Geospatial Technologies. 
The syllabus of the courses reflects the latest developments and trends in geo-spatial science and technologies, with a specific focus on Indian case studies for geo-spatial applications. The learning material is made available through interactive 2D and 3D animations, audio, and video for practical demonstrations and software operations with free data applications. The learning methods are implemented to make the programme a more interactive and learner-centric application, with practical examples of real-world problems.
NASA Astrophysics Data System (ADS)
Pearlman, Francoise; Bernknopf, Richard; Pearlman, Jay; Rigby, Michael
2017-04-01
Assessment of the impact and societal benefit of Earth Observation (EO) is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. The value of information (VOI) of EO is based on case studies that document the value in use of the information in a specific decision. A case study is an empirical inquiry investigating a phenomenon; it emphasizes detailed contextual analysis of a limited number of events or conditions and their relationships. Quantitative estimates of the benefits and costs of the geospatial information derived from EO data document and demonstrate its economic value. A case study was completed to examine some of the technical perspectives of adapting and coupling satellite imagery and in situ water quality measurements to forecast changes in groundwater quality in the agricultural sector in Iowa. The analysis was conducted to identify the ability of EO to assist in improving agricultural land management and in regulating the balance between production and groundwater contamination. The Iowa case study described the application of Landsat data in a land adaptation strategy to maintain agricultural production and groundwater quality. Results demonstrated that Landsat information facilitates spatiotemporal analysis of the impact of nitrates (fertilizer application) on groundwater resources and that crop production could be retained while groundwater quality is maintained. To transition to the operational use of the geospatial information, the Landsat data should be applied in a use case where the interaction of various stakeholders within a decision process is addressed. The objective is to design implementation experiments of a system from the user's and contributor's perspective, and to communicate system behavior in their terms. 
A use case requires communication of system requirements, how the system operates and may be used, the roles that all participants play, and what value the user will receive from the system. The use case must be broader than simply a technical demonstration of capability and involves scientific experts, farmers and their representatives, and the government. Decisions will ultimately need to take into account some level of uncertainty in the scientific "measurement". The data also have statistical variability, which affects confidence in the value of information. These issues are concerns when implementing remote sensing technology and must be examined from an end-user perspective, and their impact discussed and understood. The study team held meetings with subject experts from Iowa State University and the Iowa Farm Bureau to explore the next steps in developing the use case. A meeting between the study team and the Iowa Farm Bureau centered on the need for efficient regulation of land use and of agrochemical application in the Midwest. This presentation will describe the results of the case study and the ongoing investigation into the broader application of the use case and of economic indicators that have applications across fields of interest.
A geospatial search engine for discovering multi-format geospatial data across the web
Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney
2014-01-01
The volume of publicly available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease with which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with the servers and websites where such data exist. The objective of this paper is to present a publicly...
ERIC Educational Resources Information Center
Hedley, Mikell Lynne; Templin, Mark A.; Czaljkowski, Kevin; Czerniak, Charlene
2013-01-01
Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…
Bridging the Gap Between Surveyors and the Geo-Spatial Society
NASA Astrophysics Data System (ADS)
Müller, H.
2016-06-01
For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally, the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on an in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way, modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis on a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues, and the automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, represent more precisely the situations of nonconformity with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, improving process reliability when working with a composition of chained geospatial Web Services.
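The idea of composing services under quality preconditions can be sketched with a toy forward-chaining composer. The service names, data products, and the quality rule below are purely illustrative; the paper's actual method uses semantic descriptions and conditional planning, which this sketch only gestures at.

```python
# A toy forward-chaining composer in the spirit of AI-planning-based service
# composition: each service declares required inputs, a produced output, and a
# geodata-quality rule that must hold before it can run. All names are invented.

services = [
    {"name": "Reproject", "needs": {"raw"}, "gives": "projected",
     "rule": lambda q: True},
    {"name": "Interpolate", "needs": {"projected"}, "gives": "surface",
     "rule": lambda q: q["completeness"] >= 0.8},  # quality precondition
    {"name": "Classify", "needs": {"surface"}, "gives": "landcover",
     "rule": lambda q: True},
]

def compose(available, goal, quality):
    """Chain services until the goal data product is derivable, or give up."""
    plan, have = [], set(available)
    changed = True
    while goal not in have and changed:
        changed = False
        for svc in services:
            if svc["gives"] not in have and svc["needs"] <= have and svc["rule"](quality):
                plan.append(svc["name"])
                have.add(svc["gives"])
                changed = True
    return plan if goal in have else None

plan = compose({"raw"}, "landcover", {"completeness": 0.9})
```

With sufficient data completeness the composer emits the full chain; with poor-quality input no valid composition exists, which is exactly the kind of nonconformity the rule-based annotations are meant to surface before execution.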
EPA National Geospatial Data Policy
National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA
Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories
NASA Astrophysics Data System (ADS)
Scharl, Arno
International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.
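The core step discussed above, extracting geospatial context from unstructured text, can be illustrated with a minimal gazetteer lookup. The tiny gazetteer and its coordinates are illustrative only; real geoparsers must also disambiguate among candidate places sharing a name.

```python
# A minimal gazetteer-lookup sketch of geotagging unstructured text.
# The gazetteer entries and coordinates are illustrative.

GAZETTEER = {
    "new orleans": (29.95, -90.07),
    "miami": (25.76, -80.19),
    "houston": (29.76, -95.37),
}

def geotag(text):
    """Return (place, (lat, lon)) pairs for gazetteer names found in the text."""
    lowered = text.lower()
    return [(name, coords) for name, coords in GAZETTEER.items() if name in lowered]

hits = geotag("Hurricane Katrina made landfall near New Orleans in August 2005.")
```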
Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert
1998-01-01
The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.
EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.
Geospatial Data Science Analysis | Geospatial Data Science | NREL
Geospatial Information is the Cornerstone of Effective Hazards Response
Newell, Mark
2008-01-01
Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunamis, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. 
The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.
The National 3-D Geospatial Information Web-Based Service of Korea
NASA Astrophysics Data System (ADS)
Lee, D. T.; Kim, C. W.; Kang, I. G.
2013-09-01
3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and support visualization. Currently, many human activities are moving toward the third dimension, including land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people who are interested in this industry; we introduce not only the present condition of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, for six metropolitan cities and Dokdo (an island belonging to Korea), level of detail (LOD) 4 data, consisting of photo-realistically textured 3D models with corresponding ortho photographs, were constructed in 2012. In this paper, we present the composition and infrastructure of the web-based 3D map service system and a comparison of V-World with the Google Earth service. We also present Open API based service cases and discuss the protection of location privacy in the construction of 3D indoor building models. In order to prevent an invasion of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. Thus, further progress in Korea's spatial information industry is expected in the near future.
Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...
The geospatial data quality REST API for primary biodiversity data
Otegui, Javier; Guralnick, Robert P.
2016-01-01
Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340
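The basic completeness and consistency checks such an API performs can be sketched locally. The flag names below are illustrative, not the API's actual response fields; a network call to the VertNet endpoint is deliberately avoided here.

```python
# A sketch of basic geospatial quality checks of the kind applied to primary
# biodiversity occurrence records: missing coordinates, out-of-range values,
# and the common (0, 0) placeholder. Flag names are hypothetical.

def check_record(lat, lon):
    flags = []
    if lat is None or lon is None:
        flags.append("missing_coordinates")
        return flags
    if not (-90 <= lat <= 90) or not (-180 <= lon <= 180):
        flags.append("coordinates_out_of_range")
    if lat == 0 and lon == 0:
        flags.append("zero_zero_coordinates")  # common data-entry placeholder
    return flags
```

In the real service, such per-record flags are computed in parallel over large datasets and returned as JSON.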
NASA Astrophysics Data System (ADS)
Prandi, F.; Magliocchetti, D.; Poveda, A.; De Amicis, R.; Andreolli, M.; Devigili, F.
2016-06-01
Forests represent an important economic resource for mountainous areas, being the main form of income for some regions and mountain communities. However, wood chain management in these contexts differs from traditional schemes due to the limits imposed by terrain morphology, both in operation planning and in hardware requirements. In fact, forest organizational and technical problems require a wider strategic and more detailed level of planning to reach the productivity of forest operation techniques applied on flatlands. In particular, thorough knowledge of forest inventories improves long-term management sustainability and efficiency, allowing a better understanding of forest ecosystems. However, this knowledge is usually based on historical parcel information, with only a few cases of remote sensing information from satellite imagery. This is not enough to fully exploit the benefit of mountain-area forest stocks, where the economic and ecological value of each single parcel depends on single-tree characteristics. The work presented in this paper, based on the results of the SLOPE (Integrated proceSsing and controL systems fOr sustainable forest Production in mountain arEas) project, investigates the capability to generate, manage and visualize detailed virtual forest models using geospatial information, combining data acquired from traditional in-the-field laser scanning survey technologies with new aerial surveys using UAV systems. These models are then combined with interactive 3D virtual globes for continuous assessment of resource characteristics, harvesting planning and real-time monitoring of the whole production.
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
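The kind of server-side operation a Reanalysis Ensemble Service exposes can be illustrated with an ensemble mean across products. The sketch below is a toy: the product names and monthly values are invented, and real data are multidimensional NetCDF arrays rather than flat lists.

```python
# A toy sketch of a "commonly used operation" over reanalysis collections:
# an ensemble mean of the same monthly-mean variable across several products.
# Product names and values are hypothetical.

def ensemble_mean(products):
    """Average aligned monthly-mean series from several reanalyses."""
    series = list(products.values())
    n_months = len(series[0])
    return [sum(s[m] for s in series) / len(series) for m in range(n_months)]

monthly_t2m = {  # hypothetical Jan-Mar 2 m temperature (K) from three reanalyses
    "reanalysis_a": [250.0, 252.0, 258.0],
    "reanalysis_b": [251.0, 253.0, 259.0],
    "reanalysis_c": [249.0, 251.0, 257.0],
}
mean_series = ensemble_mean(monthly_t2m)
```

In the service itself, such an operation would run server-side (e.g., as a MapReduce job over native NetCDF files) and be invoked through the CDS API or a WPS Execute request, so only the small result travels to the client.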
NASA Astrophysics Data System (ADS)
Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara
2014-08-01
A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gains in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum and the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide support that learning with geospatially enabled learning technologies can support GTR with urban middle-level learners.
Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L
2010-04-01
In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.
Geospatial science is increasingly becoming an important tool in making Agency decisions. Quality control and quality assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Saber, M.; Zahmatkesh, H.; Joze Ghazi Khanlou, H.
2013-09-01
Rapidly discovering, sharing, integrating and applying geospatial information are key issues in the domain of emergency response and disaster management. Because data and processing resources in disaster management are distributed, utilizing a Service Oriented Architecture (SOA) to take advantage of workflows of services provides efficient, flexible and reliable implementations for responding to different hazardous situations. The implementation specification of the Web Processing Service (WPS) has guided geospatial data processing on SOA platforms to become a widely accepted solution for processing remotely sensed data on the web. This paper presents an architecture based on OGC web services for an automated workflow that acquires and processes remotely sensed data, detects fires, and sends notifications to the authorities. A basic architecture and its building blocks for an automated fire detection early warning system are presented, using web-based processing of remote sensing imagery from MODIS. A composition of WPS processes is proposed as a WPS service to extract fire events from MODIS data. Subsequently, the paper highlights the role of WPS as a middleware interface in the domain of geospatial web service technology that can be used to invoke a large variety of geoprocessing operations and to chain other web services as a composition engine. The applicability of the proposed architecture is evaluated with a real-world fire event detection and notification use case. A GeoPortal client was developed with open-source software to manage data, metadata, processes, and authorities. Investigating the feasibility and benefits of the proposed framework shows that it can serve a wide range of geospatial applications, especially disaster management and environmental monitoring.
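The kind of call such a workflow chains together can be sketched by composing an OGC WPS 1.0.0 key-value-pair Execute request. The endpoint URL and process identifier below are hypothetical, not the paper's actual deployment:

```python
# Building an OGC WPS 1.0.0 KVP Execute request, the style of call used to
# chain geoprocessing steps in an SOA workflow. The endpoint and process
# name are hypothetical; a real service publishes its own identifiers.
from urllib.parse import urlencode

def wps_execute_url(endpoint, process_id, inputs):
    """Return an Execute request URL for a WPS 1.0.0 service (KVP encoding)."""
    # DataInputs are semicolon-separated key=value pairs per the spec.
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "datainputs": data_inputs,
    })
    return f"{endpoint}?{query}"

url = wps_execute_url(
    "https://example.org/wps",              # hypothetical endpoint
    "ExtractFireEvents",                    # hypothetical MODIS fire process
    {"scene": "MOD021KM.A2013001", "threshold": "0.8"},
)
```

In a chained composition, the output reference of one Execute response would become a `datainputs` value of the next request.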
Enhancing Public Participation to Improve Natural Resources Science and its Use in Decision Making
NASA Astrophysics Data System (ADS)
Glynn, P. D.; Shapiro, C. D.; Liu, S. B.
2015-12-01
The need for broader understanding and involvement in science, coupled with social technology advances enabling crowdsourcing and citizen science, has created greater opportunities for public participation in the gathering, interpretation, and use of geospatial information. The U.S. Geological Survey (USGS) is developing guidance for USGS scientists, partners, and interested members of the public on when and how public participation can most effectively be used in the conduct of scientific activities. Public participation can provide important perspectives and knowledge that cannot be obtained through traditional scientific methods alone. Citizen engagement can also provide increased efficiencies to USGS science and additional benefits to society, including enhanced understanding, appreciation, and interest in geospatial information and its use in decision making. The USGS guidance addresses several fundamental issues by: (1) developing an operational definition of citizen or participatory science; (2) identifying the circumstances under which citizen science is appropriate for use and when its use is not recommended; (3) describing structured processes for effective use of citizen science; and (4) defining the successful application of citizen science and identifying useful success metrics. The guidance is coordinated by the USGS Science and Decisions Center and developed by a multidisciplinary team of USGS scientists and managers. External perspectives will also be incorporated, as appropriate, to align with other efforts such as the White House Office of Science and Technology Policy (OSTP) Citizen Science and Crowdsourcing Toolkit for the Federal government. The guidance will include the development of an economic framework to assess the benefits and costs of geospatial information developed through participatory processes.
This economic framework weighs the benefits of obtaining additional perspectives through enhanced participation against the costs of obtaining geospatial information from multiple sources.
Exploring Methodologies and Indicators for Cross-disciplinary Applications
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Pearlman, J.
2015-12-01
Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple geospatial information with social sciences such as economics, psychology and sociology. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making. It is a traditional basis for a use case and has been applied to geospatial information among other areas. A newer use case that applies indicators is meta-regression analysis, which combines estimates of socioeconomic benefits from different geographic regions in a unifying statistical approach. In this technique, qualitative and quantitative variables serve as indicators, which provide a weighted average of value for the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and for analysis of hazard policies. However, new methods are needed for integrating these disciplines into use cases that can guide the development of operational applications of geospatial information. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed, especially applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation looks at the results of an investigation into directions in the broader application of use cases to teach these methodologies and the use of indicators across fields of interest.
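The weighted-average benefit transfer described above can be sketched as an inverse-variance weighted mean of regional willingness-to-pay estimates. This is a deliberately simplified stand-in for a full meta-regression, and all numbers are illustrative:

```python
# Benefit transfer as a precision-weighted average of willingness-to-pay
# (WTP) estimates from prior regional studies: each estimate is weighted
# by the inverse of its sampling variance, so more precise studies count
# more. Values are illustrative, not from any actual study.

def pooled_wtp(estimates):
    """Inverse-variance weighted mean of (wtp, std_error) study estimates."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    total = sum(weights)
    return sum(w * wtp for w, (wtp, _) in zip(weights, estimates)) / total

# (WTP in $/household/year, standard error) from three hypothetical regions.
studies = [(42.0, 5.0), (55.0, 8.0), (38.0, 4.0)]
transfer_value = pooled_wtp(studies)
```

A full meta-regression would additionally regress the estimates on study and site characteristics so the transferred value could be adjusted to the target region.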
GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing
NASA Astrophysics Data System (ADS)
Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.
2016-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data (for example, in modeling, data analysis, and visualization) must still overcome barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis, and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as Amazon AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise, or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits, and libraries, and showcase scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.
Bartelt, Paul E.; Gallant, Alisa L.; Klaver, Robert W.; Wright, Christopher K.; Patla, Debra A.; Peterson, Charles R.
2011-01-01
The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from (1) field survey site data only, (2) field data combined with data from geospatial models, and (3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 to 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. 
For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders, 971-3017 ha for western toads, 4732-16 696 ha for boreal chorus frogs, and 4940-19 690 ha for Columbia spotted frogs.
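The ROC scores used above to compare habitat models can be computed directly from the Mann-Whitney rank statistic: the area under the ROC curve equals the probability that a randomly chosen positive site outscores a randomly chosen negative site. A minimal sketch with illustrative labels and scores:

```python
# ROC area via the Mann-Whitney rank statistic:
# AUC = P(score at a breeding site > score at a non-breeding site),
# counting ties as half a win. Labels and scores are illustrative.

def roc_auc(labels, scores):
    """AUC for binary labels (1 = breeding detected) and model scores."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2, 0.1]
auc = roc_auc(labels, scores)
```

An AUC of 0.5 means the model ranks sites no better than chance; the 0.70-0.88 range reported above indicates good to strong discrimination.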
Soranno, Patricia A; Bissell, Edward G; Cheruvelil, Kendra S; Christel, Samuel T; Collins, Sarah M; Fergus, C Emi; Filstrup, Christopher T; Lapierre, Jean-Francois; Lottig, Noah R; Oliver, Samantha K; Scott, Caren E; Smith, Nicole J; Stopyak, Scott; Yuan, Shuai; Bremigan, Mary Tate; Downing, John A; Gries, Corinna; Henry, Emily N; Skaff, Nick K; Stanley, Emily H; Stow, Craig A; Tan, Pang-Ning; Wagner, Tyler; Webster, Katherine E
2015-01-01
Although there are considerable site-based data for individual or groups of ecosystems, these datasets are widely scattered, have different data formats and conventions, and often have limited accessibility. At the broader scale, national datasets exist for a large number of geospatial features of land, water, and air that are needed to fully understand variation among these ecosystems. However, such datasets originate from different sources and have different spatial and temporal resolutions. By taking an open-science perspective and by combining site-based ecosystem datasets and national geospatial datasets, science gains the ability to ask important research questions related to grand environmental challenges that operate at broad scales. Documentation of such complicated database integration efforts, through peer-reviewed papers, is recommended to foster reproducibility and future use of the integrated database. Here, we describe the major steps, challenges, and considerations in building an integrated database of lake ecosystems, called LAGOS (LAke multi-scaled GeOSpatial and temporal database), that was developed at the sub-continental study extent of 17 US states (1,800,000 km(2)). LAGOS includes two modules: LAGOSGEO, with geospatial data on every lake with surface area larger than 4 ha in the study extent (~50,000 lakes), including climate, atmospheric deposition, land use/cover, hydrology, geology, and topography measured across a range of spatial and temporal extents; and LAGOSLIMNO, with lake water quality data compiled from ~100 individual datasets for a subset of lakes in the study extent (~10,000 lakes). Procedures for the integration of datasets included: creating a flexible database design; authoring and integrating metadata; documenting data provenance; quantifying spatial measures of geographic data; quality-controlling integrated and derived data; and extensively documenting the database. 
Our procedures make a large, complex, and integrated database reproducible and extensible, allowing users to ask new research questions with the existing database or through the addition of new data. The largest challenge of this task was the heterogeneity of the data, formats, and metadata. Many steps of data integration need manual input from experts in diverse fields, requiring close collaboration.
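A toy version of the integration procedures described above (a flexible schema, unit harmonization during loading, and provenance tracking) can be sketched with an in-memory SQLite table. The schema and values are illustrative, not the actual LAGOS design:

```python
# Integrating two heterogeneous lake water-quality sources into one table,
# keeping a provenance column so every value can be traced back to its
# originating dataset. Schema and values are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE limno (
    lake_id TEXT, sample_date TEXT, tp_ugl REAL, source_dataset TEXT)""")

# Source A reports total phosphorus in ug/L; source B reports mg/L and
# must be converted to the common unit during integration.
source_a = [("WI-001", "2005-07-12", 24.0), ("WI-002", "2005-07-19", 31.5)]
source_b = [("MI-117", "2006-08-02", 0.018)]  # mg/L

con.executemany("INSERT INTO limno VALUES (?, ?, ?, 'dataset_A')", source_a)
con.executemany("INSERT INTO limno VALUES (?, ?, ? * 1000.0, 'dataset_B')",
                source_b)  # mg/L -> ug/L at load time

rows = con.execute(
    "SELECT lake_id, tp_ugl, source_dataset FROM limno ORDER BY lake_id"
).fetchall()
```

The real database adds metadata authoring, quality control, and derived spatial measures on top of this loading step, but the provenance-column pattern is the same idea at small scale.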
Assessing Embedded Geospatial Student Learning Outcomes
ERIC Educational Resources Information Center
Carr, John David
2012-01-01
Geospatial tools and technologies have become core competencies for natural resource professionals due to the monitoring, modeling, and mapping capabilities they provide. To prepare students with needed background, geospatial instructional activities were integrated across Forest Management; Natural Resources; Fisheries, Wildlife, &…
USDA-ARS?s Scientific Manuscript database
Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e.g., sampling of atmospheric constituents). The term geospatial data r...
Tools for open geospatial science
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Mitasova, H.
2017-12-01
Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, scientists face the challenge of learning new, unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and lab notebook. We will see how Git helps us not get lost in revisions and how Docker is used to wrap all the parts together using a single text file, so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.
Onshore industrial wind turbine locations for the United States up to March 2014
Diffendorfer, James E.; Kramer, Louisa; Ancona, Zachary H.; Garrity, Christopher P.
2015-01-01
Wind energy is a rapidly growing form of renewable energy in the United States. While summary information on the total amount of installed capacity is available by state, a free, centralized, national, turbine-level, geospatial dataset useful for scientific research, land and resource management, and other uses did not exist. Available in multiple formats and in a web application, these public domain data provide industrial-scale onshore wind turbine locations in the United States up to March 2014, corresponding facility information, and turbine technical specifications. Wind turbine records have been collected and compiled from various public sources, digitized or position-verified from aerial imagery, and quality assured and quality controlled. Technical specifications for turbines were assigned based on the wind turbine make and model as described in public literature. In some cases, turbines were not visible in imagery, or turbine information did not exist or was difficult to obtain. The uncertainty associated with these records is captured in a confidence rating.
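A short sketch of using a per-record confidence rating as a quality filter when analyzing such a dataset. The column names and the rating scale below are assumptions for illustration, not the published schema:

```python
# Filtering a turbine-level dataset by its confidence rating, the kind of
# quality flag the dataset described above attaches to each record.
# Column names and the 1-3 rating coding are illustrative assumptions.
import csv, io

raw = """case_id,state,mw_capacity,conf_rating
1001,TX,1.5,3
1002,TX,2.0,1
1003,IA,1.6,3
"""

reader = csv.DictReader(io.StringIO(raw))
# Keep only records with the highest assumed confidence level (3) and
# sum their installed capacity.
verified = [row for row in reader if row["conf_rating"] == "3"]
total_mw = sum(float(row["mw_capacity"]) for row in verified)
```

Carrying the confidence flag through an analysis like this lets downstream users decide how much positional or attribute uncertainty they can tolerate.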
Development of Rural Emergency Medical System (REMS) with Geospatial Technology in Malaysia
NASA Astrophysics Data System (ADS)
Ooi, W. H.; Shahrizal, I. M.; Noordin, A.; Nurulain, M. I.; Norhan, M. Y.
2014-02-01
Emergency medical services are dedicated services providing out-of-hospital transport to definitive care for patients with illnesses and injuries. In this service, the response time and the preparedness of medical teams are of prime importance. The application of space and geospatial technology, such as satellite navigation systems and Geographical Information Systems (GIS), has been proven to improve emergency operations in many developed countries. In collaboration with a medical service NGO, the National Space Agency (ANGKASA) has developed a prototype Rural Emergency Medical System (REMS), focusing on providing medical services to rural areas and incorporating a satellite-based tracking module integrated with GIS and a patient database to improve the response time of the paramedic team during emergencies. With the aim of benefiting the grassroots community through space technology, the project demonstrated the system concept, which is described in this paper.
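One dispatch step of a REMS-style system, selecting the tracked unit nearest an incident by great-circle distance, might be sketched as follows (the unit identifiers and coordinates are hypothetical):

```python
# Picking the tracked ambulance nearest an incident by great-circle
# (haversine) distance, a basic building block of a GIS-backed dispatch
# step. Unit identifiers and positions are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = p2 - p1, radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Last-known GPS positions of two units (hypothetical, near Kuala Lumpur).
units = {"AMB-1": (3.139, 101.687), "AMB-2": (3.211, 101.575)}
incident = (3.140, 101.690)

nearest = min(units, key=lambda u: haversine_km(*units[u], *incident))
```

A production dispatcher would use road-network travel time rather than straight-line distance, but the nearest-unit query is the same shape.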
NASA Technical Reports Server (NTRS)
Kurihara, Shinobu; Nozawa, Kentaro
2013-01-01
The K5/VSSP software correlator (Figure 1), located in Tsukuba, Japan, is operated by the Geospatial Information Authority of Japan (GSI). It is fully dedicated to processing the geodetic VLBI sessions of the International VLBI Service for Geodesy and Astrometry. All of the weekend IVS Intensives (INT2) and the Japanese domestic VLBI observations organized by GSI were processed at the Tsukuba VLBI Correlator.
Mapping the Future: Optimizing Joint Geospatial Engineering Support
2006-05-16
Assessing Intelligence Operation/Fusion/Coordination Centers for Efficiency Opportunities
2013-02-28
intelligence], HUMINT [human intelligence], GEOINT [geospatial intelligence], or even open source information into the NIC-C. There is no...centers have been, and continue to be, stood up to improve collaboration across intelligence organizations addressing national security threats. Open ... source review of journals and books describing changes in the intelligence community organizational structure since September 2001 were reviewed to
A linear geospatial streamflow modeling system for data sparse environments
Asante, Kwabena O.; Arlan, Guleid A.; Pervez, Md Shahriar; Rowland, James
2008-01-01
In many river basins around the world, inaccessibility of flow data is a major obstacle to water resource studies and operational monitoring. This paper describes a geospatial streamflow modeling system which is parameterized with global terrain, soils and land cover data and run operationally with satellite‐derived precipitation and evapotranspiration datasets. Simple linear methods transfer water through the subsurface, overland and river flow phases, and the resulting flows are expressed in terms of standard deviations from mean annual flow. In sample applications, the modeling system was used to simulate flow variations in the Congo, Niger, Nile, Zambezi, Orange and Lake Chad basins between 1998 and 2005, and the resulting flows were compared with mean monthly values from the open‐access Global River Discharge Database. While the uncalibrated model cannot predict the absolute magnitude of flow, it can quantify flow anomalies in terms of relative departures from mean flow. Most of the severe flood events identified in the flow anomalies were independently verified by the Dartmouth Flood Observatory (DFO) and the Emergency Disaster Database (EM‐DAT). Despite its limitations, the modeling system is valuable for rapid characterization of the relative magnitude of flood hazards and seasonal flow changes in data sparse settings.
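Expressing flows as standard deviations from mean flow, as the uncalibrated model does in place of absolute discharge, is a z-score transform of the simulated record. A minimal sketch with illustrative monthly flows:

```python
# Expressing simulated flows as standard deviations from the mean of the
# record (z-scores), the anomaly convention described above. Severe flood
# months stand out as large positive anomalies. Values are illustrative.
from statistics import mean, pstdev

def flow_anomalies(flows):
    """Return each flow as (flow - mean) / std of the whole record."""
    mu, sigma = mean(flows), pstdev(flows)
    return [(q - mu) / sigma for q in flows]

monthly_flow = [120.0, 95.0, 80.0, 300.0, 450.0, 210.0]  # arbitrary units
z = flow_anomalies(monthly_flow)
# Anomalies well above the mean (large positive z) would flag candidate
# flood months for comparison against flood archives such as DFO/EM-DAT.
```

Because the transform is relative, it sidesteps the model's inability to predict absolute flow magnitude while still ranking months by severity.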
Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim
2018-05-23
Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine this distribution, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and the relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes.
In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial data, these data resources are a potential resource for examining healthcare utilisation that requires further evaluation. In countries such as Sierra Leone where there is high mortality but minimal commercial geospatial data, alternative approaches such as open data use are needed in quantifying patient travel times, geocoding patient addresses, and characterising patients' neighbourhoods.
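The relative concentration index mentioned in the abstract can be sketched with a minimal implementation. This is our own illustration of the standard covariance formulation, not the authors' code; ties in the ranking variable are ignored for simplicity.

```python
# Hedged sketch of a relative concentration index: units are ranked by
# one variable (e.g. mortality) and the index measures how unequally a
# second variable (e.g. data availability) is distributed across that
# ranking. Zero means equality; the sign shows which end of the
# ranking concentrates the resource.
def concentration_index(values, rank_var):
    n = len(values)
    order = sorted(range(n), key=lambda i: rank_var[i])
    y = [values[i] for i in order]
    r = [(2 * i + 1) / (2 * n) for i in range(n)]  # fractional ranks
    mu = sum(y) / n
    # 2 * cov(value, fractional rank) / mean value
    cov = sum((yi - mu) * (ri - 0.5) for yi, ri in zip(y, r)) / n
    return 2 * cov / mu

# A resource concentrated among the highest-ranked units yields a
# positive index.
ci = concentration_index([1, 2, 3, 4], [10, 20, 30, 40])
```

An inverse relationship like the one reported (fewer data resources where mortality is higher) corresponds to a negative index when units are ranked by mortality.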
78 FR 69393 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-19
.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...
77 FR 5820 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY... that the Secretary of the Interior has renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations to the Federal Geographic Data Committee (FGDC), through...
THE NEVADA GEOSPATIAL DATA BROWSER
The Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) has developed the Nevada Geospatial Data Browser, a spatial data archive to centralize and distribute the geospatial data used to create the land cover, vertebrate habitat models, and land o...
Information Fusion for Feature Extraction and the Development of Geospatial Information
2004-07-01
of automated processing. 2. Requirements for Geospatial Information: Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOSPATIAL SOLUTIONS
In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...
Searches over graphs representing geospatial-temporal remote sensing data
Brost, Randolph; Perkins, David Nikolaus
2018-03-06
Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
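The graph structure the patent abstract describes can be sketched with plain data structures. The class and node names below are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of a geospatial-temporal graph, following the
# abstract: nodes are image objects; undirected edges carry distance
# or adjacency relations; directed edges link a node to its state in
# a later image.
class GeoTemporalGraph:
    def __init__(self):
        self.nodes = {}     # node id -> attributes
        self.spatial = []   # undirected: (a, b, relation)
        self.temporal = []  # directed:   (earlier, later)

    def add_node(self, nid, **attrs):
        self.nodes[nid] = attrs

    def add_spatial_edge(self, a, b, relation):
        self.spatial.append((a, b, relation))

    def add_temporal_edge(self, earlier, later):
        self.temporal.append((earlier, later))

    def changed_nodes(self):
        """Nodes that have a successor state in a later image."""
        return {a for a, _ in self.temporal}

g = GeoTemporalGraph()
g.add_node("building_t0", kind="building")
g.add_node("building_t1", kind="building")
g.add_node("road", kind="road")
g.add_spatial_edge("building_t0", "road", "adjacent")
g.add_temporal_edge("building_t0", "building_t1")
```

A search for "objects of interest" then becomes subgraph matching over these node attributes and edge types, which is where the efficiency techniques the abstract mentions come in.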
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools applicable to other domains with a spatial component.
We tested the performance of the platform with a taxi trajectory analysis. Results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
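The taxi-trajectory benchmark boils down to spatiotemporal aggregation: binning GPS points into grid cells and counting. The sketch below is our own pure-Python illustration of that kernel, not GISpark's API; in Spark the same logic would be a map over points followed by a reduceByKey.

```python
# Illustrative spatial-binning kernel (assumed names, not GISpark's).
from collections import Counter

def cell(lon, lat, size=0.01):
    """Snap a coordinate to a grid cell of `size` degrees."""
    return (int(lon / size), int(lat / size))

def pickup_density(points, size=0.01):
    """Count pickups per grid cell; `points` is (lon, lat) pairs."""
    return Counter(cell(lon, lat, size) for lon, lat in points)

# Two nearby pickups share a cell; the third lands elsewhere.
density = pickup_density([(116.404, 39.906), (116.406, 39.904),
                          (116.45, 39.95)])
```

Distributing this per-cell count is embarrassingly parallel, which is why grid-based trajectory aggregation is a natural benchmark for a Spark-based platform.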
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Wei; Minnick, Matthew D; Mattson, Earl D
Oil shale deposits of the Green River Formation (GRF) in northwestern Colorado, southwestern Wyoming, and northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich, fine-grained sedimentary rock that contains significant amounts of kerogen from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Because of these potential environmental impacts of oil shale development, water usage issues need to be further studied. A basin-wide baseline of oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models including, but not limited to, three-dimensional (3D) geologic, energy resource development system, and surface water models. Such a centralized geospatial infrastructure made it possible to generate model inputs directly from the same database and to indirectly couple the different models through inputs/outputs. This ensures consistency of analyses conducted by researchers from different institutions, and helps decision makers balance the water budget based on the spatial distribution of the oil shale and water resources and the spatial variations of geologic, topographic, and hydrogeological characteristics of the basin.
This endeavor encountered many technical challenges and had not been attempted previously for any oil shale basin. The database built during this study remains valuable for future studies involving oil shale and water resource management in the Piceance Basin. The methodology applied in the development of the GIS-based geospatial infrastructure can be readily adapted by other professionals to develop database structures for similar basins.
Cool Apps: Building Cryospheric Data Applications With Standards-Based Service Oriented Architecture
NASA Astrophysics Data System (ADS)
Collins, J. A.; Truslove, I.; Billingsley, B. W.; Oldenburg, J.; Brodzik, M.; Lewis, S.; Liu, M.
2012-12-01
The National Snow and Ice Data Center (NSIDC) holds a large collection of cryospheric data, and is involved in a number of informatics research and development projects aimed at improving the discoverability and accessibility of these data. To develop high-quality software in a timely manner, we have adopted a Service-Oriented Architecture (SOA) approach for our core technical infrastructure development. Data services at NSIDC are internally exposed to other tools and applications through standards-based service interfaces. These standards include OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), various OGC (Open Geospatial Consortium) standards including WMS (Web Map Service) and WFS (Web Feature Service), ESIP (Federation of Earth Sciences Information Partners) OpenSearch, and NSIDC-specific RESTful services. By taking a standards-based approach, we are able to use off-the-shelf tools and libraries to consume, translate and broker these data services, and thus develop applications faster. Additionally, by exposing public interfaces to these services we provide valuable data services to technical collaborators; for example, NASA Reverb (http://reverb.echo.nasa.gov) uses NSIDC's WMS services. Our latest generation of web applications consume these data services directly. The most complete example of this is the Operation IceBridge Data Portal (http://nsidc.org/icebridge/portal) which depends on many of the aforementioned services, and clearly exhibits many of the advantages of building applications atop a service-oriented architecture. This presentation outlines the architectural approach and components and open standards and protocols adopted at NSIDC, demonstrates the interactions and uses of public and internal service interfaces currently powering applications including the IceBridge Data Portal, and outlines the benefits and challenges of this approach.
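Consuming a standards-based service like the WMS mentioned above is largely a matter of constructing a conformant request. The sketch below builds a WMS 1.3.0 GetMap URL; the endpoint and layer name are placeholders, not NSIDC's actual service addresses.

```python
# Sketch of constructing an OGC WMS 1.3.0 GetMap request.
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512)):
    """Build a WMS GetMap request URL.

    bbox: (min, min, max, max) in the order required by the CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Placeholder endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.org/wms", "sea_ice_extent",
                     (-90, -180, 90, 180))
```

Because the request is just key-value parameters defined by the OGC specification, any off-the-shelf client or browser can consume the service, which is the interoperability payoff the abstract describes.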
78 FR 32635 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-31
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to Add a New System of Records. SUMMARY: The National Geospatial-Intelligence Agency is establishing a new system of... information. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency [[Page 32636
78 FR 35606 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System of Records. SUMMARY: The National Geospatial-Intelligence Agency is altering a system of records in.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...
NASA Astrophysics Data System (ADS)
Dastageeri, H.; Storz, M.; Koukofikis, A.; Knauth, S.; Coors, V.
2016-09-01
Providing mobile location-based information for pedestrians faces many challenges. On the one hand, the accuracy of localisation indoors and outdoors is restricted by the technical limitations of GPS and beacons. On the other hand, only a small display is available for presenting information and for the user interface. In addition, the software solution has to consider the hardware characteristics of mobile devices during implementation to achieve minimum-latency performance. This paper describes our approach, which combines image tracking with GPS or beacons to ensure orientation and precise localisation. To communicate the information on Points of Interest (POIs), we chose Augmented Reality (AR). In this concept of operations, we used the acceleration and position sensors as a user interface in addition to the display. This paper goes into detail on the optimization of the image tracking algorithms, the development of the video-based AR player for the Android platform, and the evaluation of videos as AR elements with a view to providing a good user experience. For setting up content for the POIs, or even generating a tour, we used and extended the Open Geospatial Consortium (OGC) standard Augmented Reality Markup Language (ARML).
NASA Astrophysics Data System (ADS)
Signell, R. P.; Camossi, E.
2015-11-01
Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
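The "virtual aggregation" idea behind the NetCDF Markup Language can be sketched concretely: several per-run output files are exposed as one logical dataset joined along the time dimension. The file names below are illustrative; `joinExisting` and `dimName` are standard NcML aggregation attributes.

```python
# Sketch: generating an NcML document that virtually aggregates
# existing model output files along the time dimension.
import xml.etree.ElementTree as ET

NS = "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"

def ncml_time_aggregation(files):
    """Build an NcML joinExisting aggregation over `files`."""
    root = ET.Element(f"{{{NS}}}netcdf")
    agg = ET.SubElement(root, f"{{{NS}}}aggregation",
                        dimName="time", type="joinExisting")
    for f in files:
        ET.SubElement(agg, f"{{{NS}}}netcdf", location=f)
    return ET.tostring(root, encoding="unicode")

# Illustrative file names; the original files remain untouched.
ncml = ncml_time_aggregation(["run_2015_01.nc", "run_2015_02.nc"])
```

Because the aggregation lives in a small XML wrapper served by THREDDS, modelers keep writing their custom files while clients see one standardized dataset, which is exactly the low-effort brokering the abstract argues for.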
SWOT analysis on National Common Geospatial Information Service Platform of China
NASA Astrophysics Data System (ADS)
Zheng, Xinyan; He, Biao
2010-11-01
Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product service to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, owing to the emerging requirements of e-government construction, the remarkable development of IT technology, and growing online geospatial service demands from various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services and APIs for further development to government, business, and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.
NASA Astrophysics Data System (ADS)
Kassab, Ala'; Liang, Steve; Gao, Yang
2010-12-01
Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. This goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and a large number of clients with heterogeneous and specific interests in the data on the other. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, pull-mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations from wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated to demonstrate its efficiency.
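The core of geospatial publish/subscribe is the matching step: a subscription pairs a topic with a region, and an event is delivered only where both match. The sketch below is our own minimal illustration; the names and data shapes are assumptions, not the paper's models.

```python
# Illustrative geospatial publish/subscribe matcher: a subscription is
# (topic, bounding box); an event is delivered to subscriptions whose
# topic and region both match its location.
def in_bbox(point, bbox):
    """point = (lon, lat); bbox = (min_lon, min_lat, max_lon, max_lat)."""
    lon, lat = point
    return bbox[0] <= lon <= bbox[2] and bbox[1] <= lat <= bbox[3]

def match(event, subscriptions):
    """Return ids of subscriptions the geospatial event satisfies."""
    return [sid for sid, (topic, bbox) in subscriptions.items()
            if topic == event["topic"] and in_bbox(event["location"], bbox)]

subs = {
    "truck_dispatch": ("fire_report", (-115.0, 50.0, -113.0, 52.0)),
    "weather_watch": ("sensor_obs", (-115.0, 50.0, -113.0, 52.0)),
}
hits = match({"topic": "fire_report", "location": (-114.1, 51.0)}, subs)
```

A production matcher would replace the linear scan with a spatial index so that each published event touches only candidate subscriptions, which is the "efficient matching approach" the abstract alludes to.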
Integration of Geospatial Science in Teacher Education
ERIC Educational Resources Information Center
Hauselt, Peggy; Helzer, Jennifer
2012-01-01
One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…
75 FR 43497 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
...; System of Records AGENCY: National Geospatial-Intelligence Agency (NGA), DoD. ACTION: Notice to add a system of records. SUMMARY: The National Geospatial-Intelligence Agency (NGA) proposes to add a system of...-3808. SUPPLEMENTARY INFORMATION: The National Geospatial-Intelligence Agency notices for systems of...
Indigenous knowledges driving technological innovation
Lilian Alessa; Carlos Andrade; Phil Cash Cash; Christian P. Giardina; Matt Hamabata; Craig Hammer; Kai Henifin; Lee Joachim; Jay T. Johnson; Kekuhi Kealiikanakaoleohaililani; Deanna Kingston; Andrew Kliskey; Renee Pualani Louis; Amanda Lynch; Daryn McKenny; Chels Marshall; Mere Roberts; Taupouri Tangaro; Jyl Wheaton-Abraham; Everett Wingert
2011-01-01
This policy brief explores the use and expands the conversation on the ability of geospatial technologies to represent Indigenous cultural knowledge. Indigenous peoples' use of geospatial technologies has already proven to be a critical step for protecting tribal self-determination. However, the ontological frameworks and techniques of Western geospatial...
Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating the Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is research into harnessing the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between the Sensor Web and SDI, and conduct case studies such as hazard and urban applications. We take up cooperative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS, and Metadata on one side and, on the other, the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services. In conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O
Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to it the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underline the necessity of our platform in modeling ambient population and building occupancy at scale.
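The platform's core idea, enriching an archived baseline with a real-time stream, can be pictured as a per-region blend. This is purely our illustration: the function name, the weighting scheme, and the region keys are assumptions, not the PlanetSense design.

```python
# Hypothetical sketch: blending an archived per-region baseline with a
# live activity signal, in the spirit of combining archived data with
# real-time streams.
def ambient_population(baseline, live, weight=0.3):
    """Blend archived baseline counts with real-time counts per region.

    weight: fraction of the estimate drawn from the live signal.
    """
    regions = set(baseline) | set(live)
    return {r: (1 - weight) * baseline.get(r, 0) + weight * live.get(r, 0)
            for r in regions}

# Regions and counts are illustrative placeholders.
est = ambient_population({"downtown": 100, "campus": 40},
                         {"downtown": 200})
```

A real ambient-population model would be far richer, but the sketch shows why both data paths (GeoData Cloud for the archive, a harvesting mechanism for the stream) must feed the same analytics framework.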
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
ERIC Educational Resources Information Center
Bodzin, Alec; Peffer, Tamara; Kulo, Violet
2012-01-01
Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…
Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools
ERIC Educational Resources Information Center
Zalles, Daniel R.; Manitakos, James
2016-01-01
Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…
Fostering 21st Century Learning with Geospatial Technologies
ERIC Educational Resources Information Center
Hagevik, Rita A.
2011-01-01
Global positioning systems (GPS) receivers and other geospatial tools can help teachers create engaging, hands-on activities in all content areas. This article provides a rationale for using geospatial technologies in the middle grades and describes classroom-tested activities in English language arts, science, mathematics, and social studies.…
EPA GEOSPATIAL QUALITY COUNCIL STRATEGY PLAN FY-02
The EPA Geospatial Quality Council (GQC), previously known as the EPA GIS-QA Team - EPA/600/R-00/009, was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA...
Mapping and monitoring potato cropping systems in Maine: geospatial methods and land use assessments
USDA-ARS?s Scientific Manuscript database
Geospatial frameworks and GIS-based approaches were used to assess current cropping practices in potato production systems in Maine. Results from the geospatial integration of remotely-sensed cropland layers (2008-2011) and soil datasets for Maine revealed a four-year potato systems footprint estima...
The Virginia Geocoin Adventure: An Experiential Geospatial Learning Activity
ERIC Educational Resources Information Center
Johnson, Laura; McGee, John; Campbell, James; Hays, Amy
2013-01-01
Geospatial technologies have become increasingly prevalent across our society. Educators at all levels have expressed a need for additional resources that can be easily adopted to support geospatial literacy and state standards of learning, while enhancing the overall learning experience. The Virginia Geocoin Adventure supports the needs of 4-H…
ERIC Educational Resources Information Center
Reed, Philip A.; Ritz, John
2004-01-01
Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…
A Geospatial Online Instruction Model
ERIC Educational Resources Information Center
Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi
2012-01-01
The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…
lawn: An R client for the Turf JavaScript Library for Geospatial Analysis
lawn is an R package providing access to the geospatial analysis capabilities of the Turf JavaScript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON, providing an easier method for conducting geospatial analyses on thes...
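Turf (and hence lawn) operates directly on GeoJSON geometry. As a language-neutral illustration of the kind of analysis involved, here is a ray-casting point-in-polygon test over a GeoJSON-style coordinate ring; this is our own sketch (exterior ring only, holes ignored), not lawn's or Turf's implementation.

```python
# Ray-casting point-in-polygon over a closed GeoJSON-style ring:
# count how many polygon edges a horizontal ray from the point
# crosses; an odd count means the point is inside.
def point_in_ring(point, ring):
    """point = [lon, lat]; ring = closed list of [lon, lat] vertices."""
    x, y = point
    inside = False
    for (x1, y1), (x2, y2) in zip(ring, ring[1:]):
        if (y1 > y) != (y2 > y):  # edge spans the ray's latitude
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A 10x10 square ring (first vertex repeated to close it, per GeoJSON).
square = [[0, 0], [10, 0], [10, 10], [0, 10], [0, 0]]
```

In lawn the equivalent call wraps Turf's JavaScript implementation, so R users get the analysis without reimplementing geometry predicates like this one.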
Spatial Information Technology Center at Fulton-Montgomery Community College
NASA Technical Reports Server (NTRS)
2004-01-01
The Spatial Information Technology Center (SITC) at Fulton-Montgomery Community College (FMCC) continued to fulfill its mission and charter by successfully completing its fourth year of operations under Congressional funding and NASA sponsorship. Fourth year operations (01 Oct 03 - 30 Sep 04) have been funded and conducted under an authorized Research Grant, NAG 13-02053 (via a one-year no-cost extension expiring Sep 04). Drawdown and reporting of fiscal activities for SITC operations passes through the Institute for the Application of Geo-spatial Technology (IAGT) at Cayuga Community College in Auburn, New York. Fiscal activity of the Center is reported quarterly via SF 272 to IAGT; this report contains an overview of, and expenditures for, the remaining funds of NAG 13-02053. NAG 13-02053, slated for operating costs for fiscal year FY02-03, received a one-year no-cost extension. SITC also received permission to use remaining funds for salaries and benefits through December 31, 2004. The IAGT receives no compensation for administrative costs. This report includes addendums for the NAG award as required by federal guidelines. Attached are the signed Report of New Technology/Inventions and a Final Property Report. As an academic, economic, and workforce development program, the Center has made significant strides in bringing the technology, knowledge, and applications of the spatial information technology field to the region it serves. Through the mission of the Center, the region's communities have become increasingly aware of the benefits of geospatial technology, particularly in the region's K-12 arena. SITC continues to positively affect the region's education, employment, and economic development, while expanding its services and operations.
Pathway to 2022: The Ongoing Modernization of the United States National Spatial Reference System
NASA Astrophysics Data System (ADS)
Stone, W. A.; Caccamise, D.
2017-12-01
The National Oceanic and Atmospheric Administration's National Geodetic Survey (NGS) mission is "to define, maintain and provide access to the National Spatial Reference System (NSRS) to meet our nation's economic, social, and environmental needs." The NSRS is an assemblage of geophysical and geodetic models, tools, and data, with the most-visible components being the North American Datum of 1983 (NAD83) and the North American Vertical Datum of 1988 (NAVD88), which together provide a consistent spatial reference framework for myriad geospatial applications and positioning requirements throughout the United States. The NGS is engaged in an ongoing and comprehensive multi-year project of modernizing the NSRS, a makeover necessitated by technological developments and user accuracy requirements, all with a goal of providing a modern, accurate, accessible, and globally aligned national positioning framework exploiting the substantial power and utility of the Global Navigation Satellite System - of both today and tomorrow. The modernized NSRS will include four new-generation geometric terrestrial reference frames (replacing NAD83) and a technically unprecedented geopotential datum (replacing NAVD88), all to be released in 2022 (anticipated). This poster/presentation will describe the justification for this modernization effort and will update the status and planned evolution of the NSRS as 2022 draws ever closer. Also discussed will be recent developments, including the publication of "blueprint" documents addressing technical details of various facets of the modernized NSRS and a continued series of public Geospatial Summits. 
Supporting/ancillary projects such as Gravity for the Redefinition of the American Vertical Datum (GRAV-D), which will result in the generation of a highly accurate gravimetric geoid - or definitional reference surface (zero elevation) - for the future geopotential datum, and Geoid Slope Validation Surveys (GSVS), which are exploring the achievable accuracy of the new geopotential datum, will be summarized. Also included will be suggestions of user preparation for transition to the NSRS of tomorrow.
Remote sensing applied to resource management
Henry M. Lachowski
1998-01-01
Effective management of forest resources requires access to current and consistent geospatial information that can be shared by resource managers and the public. Geospatial information describing our land and natural resources comes from many sources and is most effective when stored in a geospatial database and used in a geographic information system (GIS). The...
ERIC Educational Resources Information Center
Kulo, Violet; Bodzin, Alec
2013-01-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…
Introduction to the Complex Geospatial Web in Geographical Education
ERIC Educational Resources Information Center
Papadimitriou, Fivos
2010-01-01
The Geospatial Web is emerging in the geographical education landscape in all its complexity. How will geographers and educators react? What are the most important facets of this development? After reviewing the possible impacts on geographical education, it can be conjectured that the Geospatial Web will eventually replace the usual geographical…
ERIC Educational Resources Information Center
Bodzin, Alec M.; Fu, Qiong; Bressler, Denise; Vallera, Farah L.
2015-01-01
Geospatially enabled learning technologies may enhance Earth science learning by placing emphasis on geographic space, visualization, scale, representation, and geospatial thinking and reasoning (GTR) skills. This study examined if and how a series of Web geographic information system investigations that the researchers developed improved urban…
Dylan Hettinger, Geospatial Data Scientist (Dylan.Hettinger@nrel.gov | 303-275-3750), is a member of the Geospatial Data Science team within the Systems Modeling & Geospatial Data Science Group in the Strategic Energy Analysis Center.
ERIC Educational Resources Information Center
Hanley, Carol D.; Davis, Hilarie B.; Davey, Bradford T.
2012-01-01
As use of geospatial technologies has increased in the workplace, so has interest in using these technologies in the K-12 classroom. Prior research has identified several reasons for using geospatial technologies in the classroom, such as developing spatial thinking, supporting local investigations, analyzing changes in the environment, and…
The Sky's the Limit: Integrating Geospatial Tools with Pre-College Youth Education
ERIC Educational Resources Information Center
McGee, John; Kirwan, Jeff
2010-01-01
Geospatial tools, which include global positioning systems (GPS), geographic information systems (GIS), and remote sensing, are increasingly driving a variety of applications. Local governments and private industry are embracing these tools, and the public is beginning to demand geospatial services. The U.S. Department of Labor (DOL) reported that…
Geospatial Services in Special Libraries: A Needs Assessment Perspective
ERIC Educational Resources Information Center
Barnes, Ilana
2013-01-01
Once limited to geographers and mapmakers, Geographic Information Systems (GIS) have taken an increasingly central role in information management and visualization. Geospatial services run the gamut of products and services, from Google Maps to ArcGIS servers to mobile development. Geospatial services are not new. Libraries have been writing about…
Using the Geospatial Web to Deliver and Teach Giscience Education Programs
NASA Astrophysics Data System (ADS)
Veenendaal, B.
2015-05-01
Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to its use for enabling and enhancing GIScience education, the web is also used as the infrastructure for communication and collaboration among geospatial data providers and users. The web becomes both the means and the content of a geospatial education program. However, the web does not replace the traditional face-to-face environment; rather, it is a means to enhance and expand it and to enable an authentic, real-world learning environment. This paper outlines the use of the web in both the delivery and the content of the GIScience program at Curtin University. The teaching of the geospatial web, web- and cloud-based mapping, and geospatial web services are key components of the program, and the use of the web and online learning is important to delivering it. Some examples of authentic, real-world learning environments are provided, including joint learning activities with partner universities.
A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs
Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav
2015-01-01
With the increasing abundance of technologies and smart devices equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has become effortless. This is particularly the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it remains a challenge to discover the “right” information for the appropriate purpose. This paper describes an approach to creating semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265
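The enrichment step described above can be sketched as follows. Everything here is hypothetical: the mini-gazetteer stands in for the external geospatial data sources the paper queries, and the place names, coordinates, and field names are invented for illustration.

```python
import math

# Hypothetical mini-gazetteer; a real system would query external
# geospatial data sources (e.g. a place-name lookup service).
GAZETTEER = [
    {"name": "City Hall",       "lat": 54.5973, "lon": -5.9301},
    {"name": "Botanic Gardens", "lat": 54.5820, "lon": -5.9330},
]

def enrich(photo):
    """Attach the nearest gazetteer place to a photo's raw geotag."""
    def dist(place):
        # Equirectangular approximation: fine at city scale.
        dx = (place["lon"] - photo["lon"]) * math.cos(math.radians(photo["lat"]))
        dy = place["lat"] - photo["lat"]
        return math.hypot(dx, dy)
    nearest = min(GAZETTEER, key=dist)
    return {**photo, "place": nearest["name"]}

photo = {"file": "IMG_001.jpg", "lat": 54.5970, "lon": -5.9300}
print(enrich(photo)["place"])  # City Hall
```

The raw low-level geotag (lat/lon) stays untouched; the semantic layer is added alongside it, which is what makes the enriched record queryable by place rather than by coordinates.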
Citing geospatial feature inventories with XML manifests
NASA Astrophysics Data System (ADS)
Bose, R.; McGarva, G.
2006-12-01
Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
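A feature manifest of the kind proposed can be sketched with Python's standard XML tooling. The element and attribute names below are illustrative assumptions, not the authors' actual schema; the feature IDs only mimic the OS MasterMap TOID style.

```python
import xml.etree.ElementTree as ET

# Versioned features to cite; IDs and versions are made up.
features = [
    {"fid": "osgb1000000347959147", "version": "2"},
    {"fid": "osgb1000000347959148", "version": "5"},
]

# Hypothetical manifest schema: a portable listing of versioned
# features that a citation could point at.
manifest = ET.Element("featureManifest", source="OS MasterMap")
for f in features:
    ET.SubElement(manifest, "feature", fid=f["fid"], version=f["version"])

xml_text = ET.tostring(manifest, encoding="unicode")
print(xml_text)
```

The key property the paper argues for is that each `feature` entry pins a specific version, so the citation stays resolvable against a curated repository even after the live database has moved on.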
Introduction to geospatial semantics and technology workshop handbook
Varanka, Dalia E.
2012-01-01
The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.
Understanding needs and barriers to using geospatial tools for public health policymaking in China.
Kim, Dohyeong; Zhang, Yingyuan; Lee, Chang Kil
2018-05-07
Despite growing popularity of using geographical information systems and geospatial tools in public health fields, these tools are only rarely implemented in health policy management in China. This study examines the barriers that could prevent policy-makers from applying such tools to actual managerial processes related to public health problems that could be assisted by such approaches, e.g. evidence-based policy-making. A questionnaire-based survey of 127 health-related experts and other stakeholders in China revealed that there is a consensus on the needs and demands for the use of geospatial tools, which shows that there is a more unified opinion on the matter than so far reported. Respondents pointed to lack of communication and collaboration among stakeholders as the most significant barrier to the implementation of geospatial tools. Comparison of survey results to those emanating from a similar study in Bangladesh revealed different priorities concerning the use of geospatial tools between the two countries. In addition, the follow-up in-depth interviews highlighted the political culture specific to China as a critical barrier to adopting new tools in policy development. Other barriers included concerns over the limited awareness of the availability of advanced geospatial tools. Taken together, these findings can facilitate a better understanding among policy-makers and practitioners of the challenges and opportunities for widespread adoption and implementation of a geospatial approach to public health policy-making in China.
ERIC Educational Resources Information Center
Gaudet, Cyndi; Annulis, Heather; Kmiec, John
2010-01-01
The Geospatial Technology Apprenticeship Program (GTAP) pilot was designed as a replicable and sustainable program to enhance workforce skills in geospatial technologies to best leverage a $30 billion market potential. The purpose of evaluating GTAP was to ensure that investment in this high-growth industry was adding value. Findings from this…
USDA-ARS?s Scientific Manuscript database
The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...
The AFIT of Today is the Air Force of Tomorrow
2012-05-11
Graduate programs: Engineering • Operations Research • Space Systems • Systems Engineering • Air Mobility • Combating Weapons of Mass Destruction • Cost Analysis • Cyber... Graduate certificate programs: Systems Engineering • Space Systems • Advanced Geospatial Intelligence • Combating Weapons of Mass ... Current works: converting the satellite catalog to KAM Tori over five years • a critical enabler for SSA: extending the satellite catalog to small objects
NASA Technical Reports Server (NTRS)
Kawabata, Ryoji; Kurihara, Shinobu; Fukuzaki, Yoshihiro; Kuroda, Jiro; Tanabe, Tadashi; Mukai, Yasuko; Nishikawa, Takashi
2013-01-01
The Tsukuba 32-m VLBI station is operated by the Geospatial Information Authority of Japan. This report summarizes activities of the Tsukuba 32-m VLBI station in 2012. More than 200 sessions were observed with the Tsukuba 32-m and other GSI antennas in accordance with the IVS Master Schedule of 2012. We have started installing the observing facilities that will be fully compliant with VLBI2010 for the first time in Japan.
Region 9 NPDES Facilities - Waste Water Treatment Plants
Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.
Geoscience Australia Continuous Global Positioning System (CGPS) Station Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruddick, R.; Twilley, B.
2016-03-01
This station formed part of the Australian Regional GPS Network (ARGN) and South Pacific Regional GPS Network (SPRGN), which is a network of continuous GPS stations operating within Australia and its Territories (including Antarctica) and the Pacific. These networks support a number of different science applications including maintenance of the Geospatial Reference Frame, both national and international, continental and tectonic plate motions, sea level rise, and global warming.
Geospatial Technology Strategic Plan 1997-2000
D'Erchia, Frank; D'Erchia, Terry D.; Getter, James; McNiff, Marcia; Root, Ralph; Stitt, Susan; White, Barbara
1997-01-01
Executive Summary -- Geospatial technology applications have been identified in many U.S. Geological Survey Biological Resources Division (BRD) proposals for grants awarded through internal and partnership programs. Because geospatial data and tools have become more sophisticated, accessible, and easy to use, BRD scientists frequently are using these tools and capabilities to enhance a broad spectrum of research activities. Bruce Babbitt, Secretary of the Interior, has acknowledged--and lauded--the important role of geospatial technology in natural resources management. In his keynote address to more than 5,500 people representing 87 countries at the Environmental Systems Research Institute Annual Conference (May 21, 1996), Secretary Babbitt stated, '. . .GIS [geographic information systems], if properly used, can provide a lot more than sets of data. Used effectively, it can help stakeholders to bring consensus out of conflict. And it can, by providing information, empower the participants to find new solutions to their problems.' This Geospatial Technology Strategic Plan addresses the use and application of geographic information systems, remote sensing, satellite positioning systems, image processing, and telemetry; describes methods of meeting national plans relating to geospatial data development, management, and serving; and provides guidance for sharing expertise and information. Goals are identified along with guidelines that focus on data sharing, training, and technology transfer. To measure success, critical performance indicators are included. The ability of the BRD to use and apply geospatial technology across all disciplines will greatly depend upon its success in transferring the technology to field biologists and researchers. The Geospatial Technology Strategic Planning Development Team coordinated and produced this document in the spirit of this premise. 
Individual Center and Program managers have the responsibility to implement the Strategic Plan by working within the policy and guidelines stated herein.
Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre
2017-07-01
As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. 
Geospatial cryptography holds substantial promise for accelerating the pace of research with spatially referenced human subjects data.
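One of the use cases above, de-duplication across registries without exchanging identifiers, can be approximated with keyed hashing. This is a deliberately simplified stand-in: the article's secure multi-party protocols are far stronger, and the shared key, field choices, and records here are all invented.

```python
import hashlib
import hmac

# Hypothetical pre-shared key, agreed out of band by the registries.
SHARED_KEY = b"agreed-out-of-band"

def token(record):
    """Keyed hash of identifying fields, so registries can compare
    records for duplicates without exchanging names or birthdates."""
    msg = "|".join([record["name"].lower(), record["dob"]]).encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

registry_a = [{"name": "Jane Doe", "dob": "1960-04-12"}]
registry_b = [{"name": "jane doe", "dob": "1960-04-12"},
              {"name": "John Roe", "dob": "1955-09-30"}]

tokens_a = {token(r) for r in registry_a}
dupes = [r for r in registry_b if token(r) in tokens_a]
print(len(dupes))  # 1
```

Note the normalization (lowercasing) before hashing: without a canonical form, trivially different spellings of the same person would hash to different tokens and the duplicate would be missed.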
Estimating Renewable Energy Economic Potential in the United States. Methodology and Initial Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Austin; Beiter, Philipp; Heimiller, Donna
This report describes a geospatial analysis method to estimate the economic potential of several renewable resources available for electricity generation in the United States. Economic potential, one measure of renewable generation potential, may be defined in several ways. For example, one definition might be expected revenues (based on local market prices) minus generation costs, considered over the expected lifetime of the generation asset. Another definition might be generation costs relative to a benchmark (e.g., a natural gas combined cycle plant) using assumptions of fuel prices, capital cost, and plant efficiency. Economic potential in this report is defined as the subset of the available resource technical potential where the cost required to generate the electricity (which determines the minimum revenue requirements for development of the resource) is below the revenue available in terms of displaced energy and displaced capacity. The assessment is conducted at a high geospatial resolution (more than 150,000 technology-specific sites in the continental United States) to capture the significant variation in local resource, costs, and revenue potential. This metric can be a useful screening factor for understanding the economic viability of renewable generation technologies at a specific location. In contrast to many common estimates of renewable energy potential, economic potential does not consider market dynamics, customer demand, or most policy drivers that may incent renewable energy generation.
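The report's definition of economic potential amounts to a site-level screen: keep a site when its minimum revenue requirement is below the revenue available from displaced energy and capacity. A sketch of that screen, with invented site records and dollar figures:

```python
# Hypothetical site records in $/MWh; values are illustrative only.
# "lcoe" stands in for the cost that sets the minimum revenue
# requirement; the two value fields are displaced-energy and
# displaced-capacity revenue at that location.
sites = [
    {"id": "w1", "lcoe": 42.0, "energy_value": 38.0, "capacity_value": 6.0},
    {"id": "w2", "lcoe": 55.0, "energy_value": 40.0, "capacity_value": 5.0},
    {"id": "s1", "lcoe": 48.0, "energy_value": 45.0, "capacity_value": 7.0},
]

def economic(site):
    """A site has economic potential when its cost is below the
    revenue available in displaced energy plus displaced capacity."""
    return site["lcoe"] < site["energy_value"] + site["capacity_value"]

viable = [s["id"] for s in sites if economic(s)]
print(viable)  # ['w1', 's1']
```

At the report's actual resolution this comparison runs over 150,000+ technology-specific sites, which is why the per-site revenue terms must come from local (geospatial) data rather than a single national average.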
NASA Astrophysics Data System (ADS)
Kumar, Pawan; Katiyar, Swati; Rani, Meenu
2016-07-01
We are living in an age of rapidly growing population and changing environmental conditions, with advancing technical capacity. This has resulted in widespread land-cover change. One of the main causes of increasing urban heat is that more than half of the world's population lives in rapidly growing urbanized environments. Satellite data can be highly useful for mapping change in land cover and other environmental phenomena over time. Several human-induced environmental and urban thermal problems are reported to negatively affect urban residents in many ways. The built-up structures in urbanized areas considerably alter land cover, affecting thermal energy flow and leading to elevated surface and air temperatures. The urban heat island (UHI) phenomenon implies an 'island' of high temperature in cities, surrounded by relatively lower temperatures in rural areas. The UHI over the study period is estimated using geospatial techniques, which are then utilized to assess its impact on the climate of the surrounding regions and how it reduces the sustainability of natural resources such as air and vegetation. The present paper describes the methodology for resolving dynamic urban heat island change and its climatic effects using a geospatial approach. NDVI maps were generated using daytime Landsat ETM+ images from 1990, 2000, and 2013, and the temperature of various land use and land cover categories was estimated. Keywords: NDVI, Surface temperature, Dynamic changes.
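The NDVI used in such studies is a per-pixel band ratio, NDVI = (NIR - Red) / (NIR + Red). A minimal sketch, with illustrative reflectance values rather than actual ETM+ data:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.

    Ranges from -1 to 1; dense vegetation reflects strongly in
    near-infrared and absorbs red, so it scores high."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Illustrative (NIR, Red) reflectance pairs, not real ETM+ values:
vegetated = ndvi(0.50, 0.08)   # dense vegetation -> high NDVI
built_up  = ndvi(0.25, 0.20)   # impervious surface -> low NDVI
print(round(vegetated, 2), round(built_up, 2))  # 0.72 0.11
```

Low NDVI over built-up pixels is what links the vegetation index to the UHI analysis: the same impervious surfaces that score low on NDVI tend to show elevated surface temperature in the thermal band.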
Revelation of `Hidden' Balinese Geospatial Heritage on A Map
NASA Astrophysics Data System (ADS)
Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.
2018-05-01
Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including 'hidden' geospatial heritage. Tri Hita Karana is a Hindu concept of life consisting of the human relation to God, to other humans, and to nature (Parahiyangan, Pawongan and Palemahan). Based on it, in terms of geospatial aspects, the Balinese derived their spatial orientation, spatial planning and layout, and measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises on how to reveal this unique and highly valuable geospatial heritage on a map that can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout, and toponymy are well known as elements of a map. There is a chance to apply Balinese geospatial heritage in representing these map elements.
The Future of Geospatial Standards
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Simonis, I.
2016-12-01
The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. 
This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds, where we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.
NASA Astrophysics Data System (ADS)
Giberson, G. K.; Oswald, C.
2015-12-01
In areas affected by snow, chloride (Cl) salts are widely used as a de-icing agent to improve road conditions. While the improvement in road safety is indisputable, there are environmental consequences for local aquatic ecosystems. In many waterways, Cl concentrations have been increasing since the early 1990s, often exceeding national water quality guidelines. To determine the quantity of Cl accumulating in urban and urbanizing watersheds, accurate estimates of road salt usage at the watershed scale are needed. The complex jurisdictional control over road salt application in southern Ontario lends itself to a geospatial approach for calculating Cl inputs to improve the accuracy of watershed-scale Cl mass balance estimates. This study will develop a geospatial protocol for combining information on road salt applications and road network areas to refine watershed-scale Cl inputs, as well as assess spatiotemporal patterns in road salt application across the southern Ontario study region. The overall objective of this project is to use geospatial methods (predominantly ArcGIS) to develop high-accuracy estimates of road salt usage in urbanizing watersheds in southern Ontario. Specifically, the aims are to map and summarize the types and areas ("lane-lengths") of roadways in each watershed that have road salt applied to them; to determine the most appropriate source(s) of road salt usage data for each watershed, taking into consideration multiple levels of jurisdiction (e.g. municipal, regional, provincial); to calculate and summarize sub-watershed and watershed-scale road salt usage estimates for multiple years; and to analyze intra-watershed spatiotemporal patterns of road salt usage, focusing especially on impervious surfaces. These analyses will identify areas of concern exacerbated by high levels of road salt distribution; recommendations for modifying on-the-ground operations will be the next step in helping to correct these issues.
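The watershed-scale input estimate described above reduces to lane-length times application rate, aggregated per watershed. A sketch with invented segment data and rates (the chloride mass fraction of NaCl, about 0.607, is a real constant):

```python
# Hypothetical road segments: lane-kilometres, per-event application
# rate (kg NaCl per lane-km), and de-icing events per season.
# Watershed names and all numbers are illustrative only.
segments = [
    {"watershed": "Etobicoke", "lane_km": 120.0, "rate": 20.0, "events": 30},
    {"watershed": "Etobicoke", "lane_km":  45.0, "rate": 15.0, "events": 30},
    {"watershed": "Mimico",    "lane_km":  80.0, "rate": 20.0, "events": 30},
]

CL_FRACTION = 0.607  # chloride mass fraction of NaCl (35.45 / 58.44)

def cl_inputs(segs):
    """Seasonal chloride input, in tonnes, summed per watershed."""
    totals = {}
    for s in segs:
        kg_salt = s["lane_km"] * s["rate"] * s["events"]
        totals[s["watershed"]] = (totals.get(s["watershed"], 0.0)
                                  + kg_salt * CL_FRACTION / 1000.0)
    return totals

print(cl_inputs(segments))
```

The geospatial work in the study sits in building the `segments` table itself: intersecting road networks with watershed boundaries and attaching the correct jurisdiction's application rate to each segment.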
MappERS-C and MappERS-V. The crowd source for prevention and crisis support
NASA Astrophysics Data System (ADS)
Frigerio, Simone; Schenato, Luca; Bianchizza, Chiara; Del Bianco, Daniele
2015-04-01
Responsibility for natural hazards at the local/regional level involves citizens and volunteers as the first actors of civil protection and territorial management. Prevention draws on the capacities of professional operators and technical volunteers, but the priority now lies in the involvement and awareness of citizens in the territory they inhabit. Involving the population creates context-specific strategies of territorial surveillance and management, moving beyond facing risks only once impacts on people's lives must already be borne. MAppERS (Mobile Application for Emergency Response and Support) is an EU project (funded under the 2013-2015 Humanitarian Aid and Civil Protection programme, ECHO A5) which empowers "crowd-sourced mappers" through smartphone applications and sensors, with geo-tagged information, detailed gathered parameters, and field-check surveys in a context of geospatial response. The development process includes feedback from citizens, involving them in training courses on monitoring as a long-term objective (raising public awareness and participation). The project deals with the development and testing of the smartphone applications (module MAppERS-V for volunteers, module MAppERS-C for citizens) in the Android SDK environment. A first piece of research comprised a desk-based investigation of the consequences of disaster impacts and the costs of prevention strategies in the pilot countries. Furthermore, a review of the state of the art of database management systems (DBMS) in the pilot countries and of the involvement of volunteers/citizens in data collection and monitoring gathered basic information on data structures for the development. A desk-based study proposed communication methods and graphic solutions within mobile technologies for disaster management in the pilot countries, along with available smartphone applications linked to a centralized web/server database.
A technical review is needed to establish a useful design baseline for MAppERS development, and it is linked with on-site feedback on volunteer and citizen needs gathered through pilot-group activities. The app modules will later be re-designed according to the methodological and technical feedback gained during the pilot study. Training curricula for citizens are planned to increase awareness, skills with smartphone utilities, and an efficient shared vocabulary for the hazard context. The expected results are: a) an easy-to-use interface for "human data" in crisis support; b) maximised utility of peer-produced data gathering; c) the development of human resources as technical tools; and d) a self-driven improvement in awareness.
Providing R-Tree Support for Mongodb
NASA Astrophysics Data System (ADS)
Xiang, Longgang; Shao, Xiaotian; Wang, Dehao
2016-06-01
Supporting large amounts of spatial data is a significant characteristic of modern databases. However, unlike mature relational databases such as Oracle and PostgreSQL, most of the current burgeoning NoSQL databases are not well designed for storing geospatial data, which is becoming increasingly important in various fields. In this paper, we propose a novel method to provide an R-tree index, as well as corresponding spatial range query and nearest neighbour query functions, for MongoDB, one of the most prevalent NoSQL databases. First, after in-depth analysis of MongoDB's features, we devise an efficient tabular document structure which flattens the R-tree index into MongoDB collections. Next, the relevant mechanisms of R-tree operations are presented, and we discuss in detail how to integrate the R-tree into MongoDB. Finally, we present experimental results showing that our proposed method outperforms the built-in spatial index of MongoDB. Our research will greatly facilitate big data management with MongoDB in a variety of geospatial information applications.
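Flattening an R-tree into self-contained documents can be sketched as below. The dict stands in for a MongoDB collection and the field names are assumptions, not the paper's schema; a range query descends by bounding-box intersection, fetching one document per visited node.

```python
def intersects(a, b):
    """Axis-aligned bounding boxes as (xmin, ymin, xmax, ymax)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

# Each R-tree node is one document, linked by child _ids, as it
# would be when the tree is flattened into a MongoDB collection.
collection = {
    "root": {"_id": "root", "leaf": False,
             "mbr": (0, 0, 10, 10), "children": ["n1", "n2"]},
    "n1": {"_id": "n1", "leaf": True, "mbr": (0, 0, 5, 5),
           "entries": [{"fid": "a", "mbr": (1, 1, 2, 2)},
                       {"fid": "b", "mbr": (4, 4, 5, 5)}]},
    "n2": {"_id": "n2", "leaf": True, "mbr": (5, 5, 10, 10),
           "entries": [{"fid": "c", "mbr": (8, 8, 9, 9)}]},
}

def range_query(window, node_id="root"):
    """Descend the flattened tree; each lookup stands in for a
    findOne() against the node collection."""
    node = collection[node_id]
    if node["leaf"]:
        return [e["fid"] for e in node["entries"]
                if intersects(e["mbr"], window)]
    hits = []
    for child in node["children"]:
        if intersects(collection[child]["mbr"], window):
            hits.extend(range_query(window, child))
    return hits

print(range_query((0, 0, 3, 3)))  # ['a']
```

The point of the flattened layout is that pruning happens before fetching: subtrees whose minimum bounding rectangle misses the query window are never read, so query cost tracks the number of intersecting nodes rather than the collection size.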
Planning and Management of Real-Time Geospatial UAS Missions within a Virtual Globe Environment
NASA Astrophysics Data System (ADS)
Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.
2011-09-01
This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the Virtual Globe Technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.
Visualization and Ontology of Geospatial Intelligence
NASA Astrophysics Data System (ADS)
Chan, Yupo
Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced by the prevalence of location-based services afforded by ubiquitous cell phone usage. It is also manifested in the popularity of internet engines such as Google Earth. As we commute to work or travel for business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.
Intelligent services for discovery of complex geospatial features from remote sensing imagery
NASA Astrophysics Data System (ADS)
Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo
2013-09-01
Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.
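The spatial-semantics idea can be illustrated with a toy rule: declare a complex-feature candidate wherever the required elementary features co-occur within some distance. The feature types, coordinates, distance threshold, and the rule itself below are illustrative assumptions, not the paper's workflow, which chains distributed WFS and catalogue services.

```python
import math

# Toy rule-based discovery of a complex feature from elementary features:
# a "candidate site" is a building with both a cooling tower and a road nearby.
features = [
    ("building", (0, 0)),
    ("cooling_tower", (0, 40)),
    ("road", (30, 10)),
    ("building", (500, 500)),  # isolated building: not a candidate
]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def find_candidate_sites(feats, max_sep=100.0):
    """Return building locations whose neighbourhood satisfies the spatial rule."""
    sites = []
    for kind, p in feats:
        if kind != "building":
            continue
        near = {k for k, q in feats if k != "building" and dist(p, q) <= max_sep}
        if {"cooling_tower", "road"} <= near:
            sites.append(p)
    return sites
```

In the paper's setting, each predicate evaluation would instead be a service call in a workflow chain, but the composition logic is the same.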
Finding geospatial pattern of unstructured data by clustering routes
NASA Astrophysics Data System (ADS)
Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.
2016-12-01
Today the majority of data generated has a geospatial context: either in attribute form, as a latitude or longitude or the name of a location, or cross-referenceable using other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and elsewhere. We are working together on the DARPA MEMEX project to exploit open source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP, to automatically extract, and make meaning out of, geospatial information. In particular, we are interested in unstructured descriptors, e.g., a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to patterns that range from easily identifiable to quite difficult to track. We will present a set of automatic techniques to extract descriptors and then geospatially infer their paths across unstructured data.
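The descriptor-to-path idea can be sketched in a few lines. The gazetteer, the phone-number regex, and the sample documents below are invented for illustration; the MEMEX pipeline uses tools such as Apache Tika and the Lucene Geo Gazetteer for the real extraction.

```python
import re

# Tiny stand-in gazetteer mapping place names to (lat, lon).
GAZETTEER = {"Lagos": (6.45, 3.39), "Accra": (5.56, -0.20)}

# Naive phone-number pattern; real systems use far more robust extractors.
PHONE = re.compile(r"\+?\d[\d\-\s]{7,}\d")

def extract(doc):
    """Pull phone-number descriptors and gazetteer place names from one document."""
    phones = PHONE.findall(doc["text"])
    places = [p for p in GAZETTEER if p in doc["text"]]
    return {"phones": phones, "places": places, "date": doc["date"]}

docs = [
    {"date": "2016-01", "text": "Call +234 801 234 5678 from Lagos office"},
    {"date": "2016-02", "text": "Same number +234 801 234 5678 seen in Accra"},
]

# Order the sightings by date to infer a geospatial "path" for the descriptor.
path = [extract(d)["places"][0] for d in sorted(docs, key=lambda d: d["date"])]
```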
Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data
NASA Astrophysics Data System (ADS)
Sibolla, B. H.; Van Zyl, T.; Coetzee, S.
2016-06-01
Geospatial data has very specific characteristics that need to be carefully captured in its visualisation in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open-source-based, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that an existing taxonomy could either be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction in order to assist the user to understand data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broad Web paradigms.
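The kind of service chain such an engine orchestrates can be sketched with plain function stubs. The WFS/WPS stand-ins, the data, and the threshold parameter below are invented for illustration; BPELPower executes real OGC services described declaratively in BPEL rather than inlined code.

```python
def wfs_get_features(typename):
    # Stand-in for a WFS GetFeature request returning GML-encoded features.
    return [{"id": 1, "area_km2": 12.0}, {"id": 2, "area_km2": 3.5}]

def wps_filter_large(features, threshold):
    # Stand-in for a WPS Execute request applying a processing step.
    return [f for f in features if f["area_km2"] >= threshold]

def workflow(typename, threshold):
    # The orchestration a BPEL document would express: WFS output feeds WPS input.
    return wps_filter_large(wfs_get_features(typename), threshold)
```

The engine's job is to perform exactly this wiring, plus the GML parsing, fault handling, and coordination that the stubs elide.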
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a four-year FP7 project aiming to address the problems of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure (SDI) for the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Given the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all of these problems becomes important. The Grid promotes and facilitates the secure interoperation of heterogeneous, distributed geospatial data, supports the creation and management of large distributed computational jobs, and assures a certificate-based security level for communication and message transfer. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment, and focuses on the description and implementation of the most promising one.
In these use cases we pay special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by Grid technology, such as secure interoperability between distributed geospatial resources; and the issues introduced by integrating distributed geospatial data into a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach to integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data, management and computation levels. The analysis is carried out for OGC Web service interoperability in general, but specific details are given for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments. The presentation mainly highlights how the Grid environment and the capabilities of Grid applications can be extended and utilized for geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality together with the high computational power and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of geographical models.
In accordance with Service-Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is exposed through the enviroGRIDS portal and, consequently, to end-user applications such as decision-maker- and citizen-oriented applications. The enviroGRIDS portal is the single entry point into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
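On the OGC side of such a bridge, the requests that must be carried into Grid jobs are plain, standardized HTTP calls. As a sketch, the following composes a WMS 1.3.0 GetMap request; the endpoint and layer name are invented, and only the parameter set shown is standard.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, size=(256, 256)):
    """Build a WMS 1.3.0 GetMap request URL for one layer and bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("http://example.org/wms", "catchment", (40, 27, 48, 42))
```

Because the request is self-describing, a Grid middleware wrapper can forward or replay it without understanding the geospatial payload.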
Mapping a Difference: The Power of Geospatial Visualization
NASA Astrophysics Data System (ADS)
Kolvoord, B.
2015-12-01
Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.
NASA Astrophysics Data System (ADS)
Řezník, T.; Lukas, V.; Charvát, K.; Charvát, K., Jr.; Horáková, Š.; Křivánek, Z.; Herman, L.
2016-06-01
The agricultural sector is in a unique position due to its strategic importance around the world. It is crucial for both citizens (consumers) and the economy (both regional and global), which, ideally, should ensure that the whole sector is a network of interacting organisations. It is important to develop new tools, management methods, and applications to improve the management and logistic operations of agricultural producers (farms) and agricultural service providers. From a geospatial perspective, this involves identifying cost optimization pathways, reducing transport, reducing environmental loads, and improving the energy balance, while maintaining production levels. This paper describes the benefits of, and open issues arising from, the development of the Open Farm Management Information System. Emphasis is placed on descriptions of available remote sensing and other geospatial data, and their harmonization, processing, and presentation to users. The FOODIE platform also offers a novel approach to yield potential estimation. Validation for one farm demonstrated a 70% success rate when comparing measured yields on a 1,284-hectare farm with the results of a theoretical yield-potential model. The presented Open Farm Management Information System has already been successfully registered under Phase 8 of the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot, in order to support the wide variety of demands primarily aimed at agriculture and water pollution monitoring by means of remote sensing.
ACHP | Historic Preservation in Technical or Scientific Facilities
Balancing Historic Preservation Needs with the Operation of Highly Technical or Scientific Facilities; 1991; 79 pages; excerpt available. Considers the appropriate role of
76 FR 72885 - FM Asymmetric Sideband Operation and Associated Technical Studies
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-28
... Sideband Operation and Associated Technical Studies AGENCY: Federal Communications Commission. ACTION... Asymmetric Sideband Operation and Associated Technical Studies, MM Docket No. 99-325, Public Notice, DA 11-1832 (MB rel. Nov. 1, 2011). The iBiquity and NPR request and the iBiquity and NPR technical studies...
NASA Technical Reports Server (NTRS)
Lyle, Stacey D.
2009-01-01
A software package has been designed to authenticate mobile devices into a network wirelessly and in real time by using GPS signal structures, determining whether a rover is within a set of boundaries or a specific area before it may access critical geospatial information. The advantage lies in that the system admits only those within designated geospatial boundaries or areas into the server.
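The geospatial gate described above can be sketched as a point-in-polygon check on the device's GPS fix. The ray-casting test and the example boundary below are illustrative assumptions, not the package's actual (and presumably hardened) implementation.

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point (x, y) inside the polygon's vertex list?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def authenticate(gps_fix, boundary):
    """Grant network access only when the reported fix lies inside the boundary."""
    return point_in_polygon(gps_fix, boundary)
```

A production system would also have to verify that the GPS fix itself is genuine, which is where the patent's use of GPS signal structures comes in.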
Maya Quinones; William Gould; Carlos D. Rodriguez-Pedraza
2007-01-01
This report documents the type and source of geospatial data available for Haiti. It was compiled to serve as a resource for geographic information system (GIS)-based land management and planning. It will be useful for conservation planning, reforestation efforts, and agricultural extension projects. Our study indicates that there is a great deal of geospatial...
2014-05-22
attempted to respond to the advances in technology and the growing power of geographical information system (GIS) tools. However, the doctrine...Geospatial intelligence (GEOINT), Geographical information systems (GIS) tools, Humanitarian Assistance/Disaster Relief (HA/DR), 2010 Haiti Earthquake...Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools
2009-06-08
CRS Report for Congress, prepared for Members and Committees of Congress: Geospatial Information and Geographic Information Systems (GIS): Current Issues and Future Challenges.
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling.
During the 6th Session of the UN-GGIM in August 2016 the role of DGGS in the context of the GSGF was formally acknowledged. This paper proposes to highlight the synergies and role of DGGS in the Global Statistical Geospatial Framework and to show examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
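The hierarchical indexing property can be illustrated with a toy quadtree-style cell index over a lon/lat extent. This is a sketch only: real OGC DGGS implementations tessellate polyhedra into equal-area cells rather than splitting a rectangular extent, and the digit encoding here is invented.

```python
def cell_index(lon, lat, resolution, extent=(-180.0, -90.0, 180.0, 90.0)):
    """Return a digit-string cell index; each digit halves the extent in x and y."""
    x0, y0, x1, y1 = extent
    digits = []
    for _ in range(resolution):
        xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
        # Quadrant code: bit 1 for east of midline, bit 2 for north of midline.
        q = (2 if lat >= ym else 0) + (1 if lon >= xm else 0)
        digits.append(str(q))
        x0, x1 = (xm, x1) if lon >= xm else (x0, xm)
        y0, y1 = (ym, y1) if lat >= ym else (y0, ym)
    return "".join(digits)
```

The index exhibits the property the abstract relies on: a coarser cell's index is a prefix of every finer cell it contains, so aggregation is just string truncation.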
Jewett, Lauren; Harroud, Adil; Hill, Michael D.; Côté, Robert; Wein, Theodore; Smith, Eric E.; Gubitz, Gord; Demchuk, Andrew M.; Sahlas, Demetrios J.; Gladstone, David J.; Lindsay, M. Patrice
2018-01-01
Background: Rapid assessment and management of transient ischemic attacks and nondisabling strokes by specialized stroke prevention services reduces the risk of recurrent stroke and improves outcomes. In Canada, with its vast geography and with 16.8% of the population living in rural areas, access to these services is challenging, and considerable variation in access to care exists. The purpose of this multiphase study was to identify sites across Canada providing stroke prevention services, evaluate resource capacity and determine geographic access for Canadians. Methods: We developed a Stroke Prevention Services Resource Inventory that contained 22 questions on the organization and delivery of stroke prevention services and quality monitoring. The inventory ran from November 2015 to January 2016 and was administered online. We conducted a geospatial analysis to estimate access by drive times. Considerations were made for hours of operation and access within and across provincial borders. Results: A total of 123 stroke prevention sites were identified, of which 119 (96.7%) completed the inventory. Most (95) are designated stroke prevention or rapid assessment clinics. Of the 119 sites, 68 operate full time, and 39 operate less than 2.5 days per week. A total of 87.3% of the Canadian population has access to a stroke prevention service within a 1-hour drive; however, only 69.2% has access to a service that operates 5-7 days a week. Allowing provincial border crossing improves access (< 6-h drive) for those who are beyond a 6-hour drive within their home province (3.4%). Interpretation: Most Canadians have reasonable geographic access to stroke prevention services. Allowing patients to cross borders improves the existing access for many, particularly some remote communities along the Ontario-Quebec and British Columbia-Alberta borders. PMID:29472251
2015-06-09
anomaly detection, which is generally considered part of high-level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual...manner compared with lawful vessels operating in the area - an applied case of target detection among distractors. Anomaly detection is a complex problem
2015-06-01
sociocultural understanding of the operating environment. Mark Herbert provides a snapshot of the current frustration with this DOD shortcoming, stating, "we...41 Mark Herbert, "The Human Domain: The Army's Necessary Push toward Squishiness," Military Review (Sept-Oct...Theory," 20. 77 Derived from the following works: Department of the Army, Insurgencies and Countering Insurgencies (FM 3-24), 3-4; Helen Spencer
Region 9 NPDES Facilities 2012- Waste Water Treatment Plants
Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates direct discharges from facilities that discharge treated waste water into waters of the US. Facilities are issued NPDES permits regulating their discharge as required by the Clean Water Act. A facility may have one or more outfalls (dischargers). The location represents the facility or operating plant.
A novel algorithm for fully automated mapping of geospatial ontologies
NASA Astrophysics Data System (ADS)
Chaabane, Sana; Jaziri, Wassim
2018-01-01
Geospatial information is collected from different sources, making spatial ontologies built for the same geographic domain heterogeneous; different and heterogeneous conceptualizations may therefore coexist. Ontology integration helps to create a common repository of the geospatial ontology and to remove the heterogeneities between the existing ontologies. Ontology mapping is a process used in ontology integration that consists of finding correspondences between the source ontologies. This paper deals with the mapping process for geospatial ontologies, applying an automated algorithm to find correspondences between concepts according to the definitions of the matching relationships. The proposed algorithm, called the "geographic ontologies mapping algorithm", defines three types of mapping: semantic, topological and spatial.
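One of the three mapping kinds can be sketched concretely: a semantic match based on label similarity. The concepts, the similarity threshold, and the matching rule below are illustrative assumptions, not the paper's algorithm; topological and spatial mappings would analogously compare topological relations and geometric extents.

```python
import difflib

def semantic_match(a, b, threshold=0.8):
    """Label-similarity stand-in for semantic correspondence between concepts."""
    ratio = difflib.SequenceMatcher(None, a["label"].lower(), b["label"].lower()).ratio()
    return ratio >= threshold

def map_ontologies(src, dst):
    """Return (source label, target label, mapping type) correspondences."""
    mappings = []
    for a in src:
        for b in dst:
            if semantic_match(a, b):
                mappings.append((a["label"], b["label"], "semantic"))
    return mappings

src = [{"label": "River"}, {"label": "Building"}]
dst = [{"label": "river"}, {"label": "Road"}]
```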
Data to Decisions: Valuing the Societal Benefit of Geospatial Information
NASA Astrophysics Data System (ADS)
Pearlman, F.; Kain, D.
2016-12-01
The March 10-11, 2016 GEOValue workshop on "Data to Decisions" was aimed at creating a framework for identification and implementation of best practices that capture the societal value of geospatial information for both public and private uses. The end-to-end information flow starts with the earth observation and data acquisition systems, includes the full range of processes from geospatial information to decisions support systems, and concludes with the end user. Case studies, which will be described in this presentation, were identified for a range of applications. The goal was to demonstrate and compare approaches to valuation of geospatial information and forge a path forward for research that leads to standards of practice.
Bauermeister, José A; Connochie, Daniel; Eaton, Lisa; Demers, Michele; Stephenson, Rob
Young men who have sex with men (YMSM), particularly YMSM who are racial/ethnic minorities, are disproportionately affected by the human immunodeficiency virus (HIV) epidemic in the United States. These HIV disparities have been linked to demographic, social, and physical geospatial characteristics. The objective of this scoping review was to summarize the existing evidence from multilevel studies examining how geospatial characteristics are associated with HIV prevention and care outcomes among YMSM populations. Our literature search uncovered 126 peer-reviewed articles, of which 17 were eligible for inclusion based on our review criteria. Nine studies examined geospatial characteristics as predictors of HIV prevention outcomes. Nine of the 17 studies reported HIV care outcomes. From the synthesis regarding the current state of research around geospatial correlates of behavioral and biological HIV risk, we propose strategies to move the field forward in order to inform the design of future multilevel research and intervention studies for this population.
MapFactory - Towards a mapping design pattern for big geospatial data
NASA Astrophysics Data System (ADS)
Rautenbach, Victoria; Coetzee, Serena
2018-05-01
With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
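The core of a map-making factory can be sketched as a registry keyed by the design specification. The class names and the "representation" field standing in for ISO 19115-1 metadata elements are assumptions for illustration, not the MapFactory pattern's actual interface.

```python
class ChoroplethMap:
    kind = "choropleth"

class HeatMap:
    kind = "heatmap"

class MapFactory:
    """Factory pattern: the (input) design specification selects the map product."""
    _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

    @classmethod
    def create(cls, spec):
        try:
            return cls._registry[spec["representation"]]()
        except KeyError:
            raise ValueError("no map type registered for spec: %r" % (spec,))
```

New map products are added by extending the registry, which is what gives the pattern its reuse benefit: the pipeline that feeds big geospatial data into `create` never changes.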
NASA Astrophysics Data System (ADS)
Johnson, A.
2010-12-01
Maps, spatial and temporal data, and their use in analysis and visualization are integral components of studies in the geosciences. With the emergence of geospatial technology (Geographic Information Systems (GIS), remote sensing and imagery, Global Positioning Systems (GPS) and mobile technologies), scientists and the geosciences user community are now able to access and share data, analyze their data, and present their results more easily. Educators are also incorporating geospatial technology into their geosciences programs, from building awareness of the technology in introductory courses to exploring its capabilities in advanced courses to help answer complex questions in the geosciences. This paper will look at how the new Geospatial Technology Competency Model from the Department of Labor can help ensure that geosciences programs address the skills and competencies identified by the workforce for geospatial technology, and at new tools created by the GeoTech Center to help conduct self- and program assessments.
2009-06-01
Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning. Primary Topic: Track 5 - Experimentation and Analysis. Walter A. Powell [STUDENT] - GMU. Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet
Strategic Model for Future Geospatial Education.
1998-05-18
There appears to be only one benefit to doing nothing, as option one dictates - there are no up-front costs to the government for doing nothing. The costs...the government can ensure that US industry and academia benefit from decades of geospatial information expertise. Industry and academia will be...or militarily unique topics. In summary, option two provides more benefits for both the government and the geospatial information community as a
RESTful Implementation of a Catalogue Service for Geospatial Data Provenance
NASA Astrophysics Data System (ADS)
Jiang, L. C.; Yue, P.; Lu, X. C.
2013-10-01
Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architecture style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful-style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
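The dispatcher-plus-handlers shape of such a converter can be sketched as follows. The URL scheme, the single handler, and the return values are invented for illustration; the paper's REST Converter has six resource handlers translating to the legacy CSW interface.

```python
def provenance_handler(resource_id):
    # In the real converter this would translate to a CSW request
    # (e.g. a GetRecordById-style call) against the legacy catalogue service.
    return {"resource": "provenance", "id": resource_id}

# Resource request dispatcher: maps the first path segment to a handler.
ROUTES = {"provenance": provenance_handler}

def dispatch(path):
    """Route a RESTful resource URL like '/provenance/123' to its handler."""
    _, resource, resource_id = path.split("/", 2)
    handler = ROUTES.get(resource)
    if handler is None:
        return {"status": 404}
    return handler(resource_id)
```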
An Automated End-To-End Multi-Agent QoS Based Architecture for Selection of Geospatial Web Services
NASA Astrophysics Data System (ADS)
Shah, M.; Verma, Y.; Nandakumar, R.
2012-07-01
Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated, end-to-end, multi-agent based solution that provides the best-fit web service to a service requester based on QoS.
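A QoS-driven selection of this kind can be sketched as a weighted score over normalised non-functional attributes. The attribute set, weights, and candidate values below are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of QoS-based ranking: candidate services offering the
# same functionality are scored on normalised non-functional attributes.

def rank_services(candidates, weights):
    """Return candidates sorted best-first by weighted, normalised QoS.

    'response_time' is a cost (lower is better); 'availability' and
    'reliability' are benefits (higher is better).
    """
    def norm(attr, value, benefit):
        vals = [c[attr] for c in candidates]
        lo, hi = min(vals), max(vals)
        if hi == lo:
            return 1.0
        score = (value - lo) / (hi - lo)
        return score if benefit else 1.0 - score

    def score(c):
        return (weights["response_time"] * norm("response_time", c["response_time"], False)
                + weights["availability"] * norm("availability", c["availability"], True)
                + weights["reliability"] * norm("reliability", c["reliability"], True))

    return sorted(candidates, key=score, reverse=True)

wms_candidates = [
    {"name": "WMS-A", "response_time": 120, "availability": 0.99, "reliability": 0.95},
    {"name": "WMS-B", "response_time": 300, "availability": 0.97, "reliability": 0.99},
]
best = rank_services(wms_candidates,
                     {"response_time": 0.5, "availability": 0.3, "reliability": 0.2})[0]
print(best["name"])
```

In a multi-agent setting, a broker agent would run a scoring function like this over the candidates discovered for a request and return only the best-fit service to the requester.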
User-driven generation of standard data services
NASA Astrophysics Data System (ADS)
Díaz, Laura; Granell, Carlos; Gould, Michael; Huerta, Joaquín.
2010-05-01
Geospatial information systems are experiencing a shift from monolithic to distributed environments (Bernard, 2003). Current research on the discovery and access of geospatial resources in these distributed environments is addressed by deploying interconnected Spatial Data Infrastructure (SDI) nodes at different scales to build a global spatial information infrastructure (Masser et al., 2008; Rajabifard et al., 2002). One of the challenges in implementing these global and multiscale SDIs is agreeing on common standards given the heterogeneity of the various stakeholders (Masser, 2005). In Europe, the European Commission took the INSPIRE initiative to monitor the development of European SDIs. The INSPIRE Directive addresses the need for web services to discover, view, transform, invoke, and download geospatial resources, which enable various stakeholders to share resources in an interoperable manner (INSPIRE, 2007). Such web services require technical specifications for the interoperability and harmonization of their SDIs (INSPIRE, 2007). Moreover, interoperability is ensured by a number of specification efforts, most prominently in the geo domain by ISO/TC 211 and the Open Geospatial Consortium (OGC) (Bernard, 2003). Other research challenges regarding SDIs are, on the one hand, how users in charge of maintaining SDIs can handle complexity as the SDIs grow and, on the other hand, the fact that SDI maintenance and evolution should be guided (Béjar et al., 2009). There is thus a motivation to improve the complex deployment mechanisms in SDIs, since expertise and time are needed to deploy resources and integrate them by means of standard services. In this context we present an architecture that follows the INSPIRE technical guidelines and is therefore based on SDI principles. This architecture supports distributed applications and provides components that assist users in deploying and updating SDI resources.
Therefore, mechanisms and components for the automatic generation and publication of standard geospatial services are proposed. These mechanisms hide the underlying technology and let stakeholders wrap resources as standard services, so that the resources can be shared in a transparent manner. These components are integrated in our architecture within the Service Framework node (module). Figure 1. Architecture components diagram. Figure 1 shows the components of the architecture. The Application Node provides the entry point for users to run distributed applications; this software component holds the user interface and the application logic. The Service Connector component provides the ability to connect to the services available in the middleware layer of the SDI and acts as a socket to OGC Web Services. For instance, the WMS component implements the OGC WMS specification, the standard recommended by the INSPIRE implementing rules as the View Service type. The Service Framework node contains several components; its main functionality is to assist users in wrapping and sharing geospatial resources, and it implements the proposed mechanisms to improve the availability and visibility of geospatial resources. The main components of this framework are the Data Wrapper, the Process Wrapper and the Service Publisher. The Data Wrapper and Process Wrapper components guide users in wrapping data and tools as standard services in accordance with the INSPIRE implementing rules (availability). The Service Publisher component creates service metadata and publishes them in catalogues (visibility). Roughly speaking, all of these components act as a service generator and publisher: they take a resource (data or process) and return an INSPIRE service that is then published in catalogue services.
References
Béjar, R., Latre, M. Á., Nogueras-Iso, J., Muro-Medrano, P. R., Zarazaga-Soria, F. J., 2009. International Journal of Geographical Information Science, 23(3), 271-294.
Bernard, L., Einspanier, U., Lutz, M., Portele, C., 2003. Interoperability in GI Service Chains - The Way Forward. In: Gould, M., Laurini, R., Coulondre, S. (Eds.), 6th AGILE Conference on Geographic Information Science, Lyon, 179-188.
INSPIRE, 2007. Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community.
Masser, I., 2005. GIS Worlds: Creating Spatial Data Infrastructures. ESRI Press, Redlands, California.
Masser, I., Rajabifard, A., Williamson, I., 2008. Spatially enabling governments through SDI implementation. International Journal of Geographical Information Science, 22(1), 5-20.
Rajabifard, A., Feeney, M.-E. F., Williamson, I. P., 2002. Future directions for SDI development. International Journal of Applied Earth Observation and Geoinformation, 4, 11-22.
Signell, Richard; Camossi, E.
2016-01-01
Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.
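The brokering approach described above keeps model output files as-is and serves standardised virtual datasets via NcML and the THREDDS Data Server. A minimal NcML aggregation along these lines (the directory, file suffix, and attribute values below are illustrative, not from the paper) might look like:

```xml
<!-- Hedged sketch of an NcML aggregation: a "virtual dataset" joining
     daily model output files along the time dimension, served unchanged
     by a THREDDS Data Server. Paths and values are hypothetical. -->
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <!-- Standardise metadata without rewriting the original files -->
  <attribute name="Conventions" value="CF-1.6"/>
  <aggregation dimName="time" type="joinExisting">
    <scan location="/data/model/output/" suffix=".nc"/>
  </aggregation>
</netcdf>
```

Pointing a THREDDS catalog at a file like this exposes the aggregated, CF-annotated dataset through the server's standard access services while the modeller's original files stay untouched.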
Web-client based distributed generalization and geoprocessing
Wolf, E.B.; Howe, K.
2009-01-01
Generalization and geoprocessing operations on geospatial information were once the domain of complex software running on high-performance workstations. Currently, these computationally intensive processes are the domain of desktop applications. Recent efforts have been made to move geoprocessing operations server-side in a distributed, web-accessible environment. This paper initiates research into portable client-side generalization and geoprocessing operations as part of a larger effort in user-centered design for the US Geological Survey's The National Map. An implementation of the Ramer-Douglas-Peucker (RDP) line simplification algorithm was created in the open source OpenLayers geoweb client. This algorithm implementation was benchmarked using differing data structures and browser platforms. The implementation and results of the benchmarks are discussed in the general context of client-side geoprocessing.
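The benchmarked algorithm itself is compact. The paper's implementation is JavaScript inside OpenLayers; the Python sketch below illustrates only the RDP recursion, not the paper's code.

```python
# Sketch of Ramer-Douglas-Peucker line simplification: recursively drop
# points closer than epsilon to the chord between the segment endpoints.
import math

def rdp(points, epsilon):
    """Simplify a polyline given as a list of (x, y) tuples."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy) or 1.0
    # Find the interior point farthest from the chord
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / length
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Keep the farthest point and recurse on both halves
        left = rdp(points[: index + 1], epsilon)
        right = rdp(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

print(rdp([(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6),
           (5, 7), (6, 8.1), (7, 9), (8, 9), (9, 9)], 1.0))
```

The recursion keeps the endpoints of every segment, so a client-side implementation like the one benchmarked can simplify geometry for display without shifting feature boundaries.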
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
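The kind of combined spatial-temporal-relational query such a graph supports can be illustrated with a toy structure. The labels, relations, footprints, and intervals below are invented for illustration; this is not the report's implementation.

```python
# Illustrative sketch of a semantic graph whose nodes carry both a
# spatial footprint (bounding box) and a validity interval, and a
# combined temporal-relational query over it.

nodes = {
    "bldg1": {"label": "building", "bbox": (10, 10, 20, 20), "time": (0, 100)},
    "road1": {"label": "road",     "bbox": (0, 12, 40, 14),  "time": (0, 100)},
    "veh1":  {"label": "vehicle",  "bbox": (11, 13, 12, 14), "time": (40, 50)},
}
edges = {("veh1", "road1"): "on", ("bldg1", "road1"): "adjacent_to"}

def overlaps(a, b):
    """Closed-interval overlap test, used here for time windows."""
    return a[0] <= b[1] and b[0] <= a[1]

def query(label, relation, target_label, window):
    """Nodes with `label`, active in `window`, linked by `relation`
    to a node with `target_label`."""
    hits = []
    for (src, dst), rel in edges.items():
        if (rel == relation and nodes[src]["label"] == label
                and nodes[dst]["label"] == target_label
                and overlaps(nodes[src]["time"], window)):
            hits.append(src)
    return hits

print(query("vehicle", "on", "road", (45, 60)))
```

Because the temporal filter is applied while edges are traversed, a single pass answers "which vehicles were on a road during this window" without a separate spatial or temporal index, which is the flavour of simultaneous search the report describes.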
Using Watershed Boundaries to Map Adverse Health Outcomes: Examples From Nebraska, USA
Corley, Brittany; Bartelt-Hunt, Shannon; Rogan, Eleanor; Coulter, Donald; Sparks, John; Baccaglini, Lorena; Howell, Madeline; Liaquat, Sidra; Commack, Rex; Kolok, Alan S
2018-01-01
In 2009, a paper was published suggesting that watersheds provide a geospatial platform for establishing linkages between aquatic contaminants, the health of the environment, and human health. This article is a follow-up to that original article. From an environmental perspective, watersheds segregate landscapes into geospatial units that may be relevant to human health outcomes. From an epidemiologic perspective, the watershed concept places anthropogenic health data into a geospatial framework that has environmental relevance. Research discussed in this article includes information gathered from the literature, as well as recent data collected and analyzed by this research group. It is our contention that the use of watersheds to stratify geospatial information may be both environmentally and epidemiologically valuable. PMID:29398918
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
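As an example of the interoperability these services provide, a standard WMS 1.3.0 GetMap request is just a parameterised URL that any compliant client or server understands. The endpoint and layer name below are hypothetical.

```python
# Minimal sketch of composing an OGC WMS 1.3.0 GetMap request URL.
from urllib.parse import urlencode

def getmap_url(endpoint, layer, bbox, size, crs="EPSG:4326", fmt="image/png"):
    """Build a GetMap URL; bbox is (min, min, max, max) in CRS axis order."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1], "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = getmap_url("http://example.org/wms", "flood_extent",
                 (43.0, 1.0, 44.0, 2.0), (512, 512))
print(url)
```

A client such as the one described can issue the same request to any WMS endpoint, which is what lets a single user interface overlay maps from many providers during disaster response.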
The French Space Operation Act: Technical Regulations
NASA Astrophysics Data System (ADS)
Trinchero, J. P.; Lazare, B.
2010-09-01
The French Space Operation Act (FSOA) stipulates that a prime objective of the national technical regulations is to protect people, property, public health and the environment. Compliance with these technical regulations is mandatory as of 10 December 2010 for space operations by French space operators and for space operations from French territory. The space safety requirements and regulations governing procedures are based on national and international best practices and experience. A critical design review of the space system and procedures shall be carried out by the applicant, in order to verify compliance with the technical regulations. An independent technical assessment of the operation is delegated to CNES. The principles applied when drafting the technical regulations are as follows: requirements must as far as possible establish the rules according to the objective to be obtained, rather than how it is to be achieved; requirements must give preference to international standards recognised as being the state of the art; requirements must take previous experience into account. The technical regulations are divided into three sections covering common requirements for the launch, control and return of a space object. A dedicated section will cover specific rules to be applied at the Guiana Space Centre. The main topics addressed by the technical regulations are: the operator safety management system; the study of risks to people, property, public health and the Earth's environment; the impact study on the outer space environment, i.e. space debris generated by the operation; and planetary protection.
Fecso, A B; Kuzulugil, S S; Babaoglu, C; Bener, A B; Grantcharov, T P
2018-03-30
The operating theatre is a unique environment with complex team interactions, where technical and non-technical performance affect patient outcomes. The correlation between technical and non-technical performance, however, remains underinvestigated. The purpose of this study was to explore these interactions in the operating theatre. A prospective single-centre observational study was conducted at a tertiary academic medical centre. One surgeon and three fellows participated as main operators. All patients who underwent a laparoscopic Roux-en-Y gastric bypass and had the procedures captured using the Operating Room Black Box® platform were included. Technical assessment was performed using the Objective Structured Assessment of Technical Skills and Generic Error Rating Tool instruments. For non-technical assessment, the Non-Technical Skills for Surgeons (NOTSS) and Scrub Practitioners' List of Intraoperative Non-Technical Skills (SPLINTS) tools were used. Spearman rank-order correlation and N-gram statistics were conducted. Fifty-six patients were included in the study and 90 procedural steps (gastrojejunostomy and jejunojejunostomy) were analysed. There was a moderate to strong correlation between technical adverse events (rs = 0.417-0.687), rectifications (rs = 0.380-0.768) and non-technical performance of the surgical and nursing teams (NOTSS and SPLINTS). N-gram statistics showed that after technical errors, events and prior rectifications, the staff surgeon and the scrub nurse exhibited the most positive non-technical behaviours, irrespective of operator (staff surgeon or fellow). This study demonstrated that technical and non-technical performances are related, on both an individual and a team level. Valuable data can be obtained around intraoperative errors, events and rectifications. © 2018 BJS Society Ltd Published by John Wiley & Sons Ltd.
Improving situation awareness with the Android Team Awareness Kit (ATAK)
NASA Astrophysics Data System (ADS)
Usbeck, Kyle; Gillen, Matthew; Loyall, Joseph; Gronosky, Andrew; Sterling, Joshua; Kohler, Ralph; Hanlon, Kelly; Scally, Andrew; Newkirk, Richard; Canestrare, David
2015-05-01
To make appropriate, timely decisions in the field, Situational Awareness (SA) needs to be conveyed in a decentralized manner to the users at the edge of the network as well as at operations centers. Sharing real-time SA efficiently between command centers and operational troops poses many challenges, including handling heterogeneous and dynamic networks, resource constraints, and varying needs for the collection, dissemination, display, and recording of information. A mapping application that allows teams to share relevant geospatial information efficiently and to communicate effectively with one another and with command centers has wide applicability to many vertical markets across the Department of Defense, as well as to a wide variety of federal, state, local, and non-profit agencies that need to share locations, text, photos, and video. This paper describes the Android Team Awareness Kit (ATAK), an advanced, distributed tool for commercial-off-the-shelf (COTS) mobile devices such as smartphones and tablets. ATAK provides a variety of useful SA functions for soldiers, law enforcement, homeland defense, and civilian collaborative use, including mapping and navigation, range and bearing, text chat, force tracking, geospatial markup tools, image and file sharing, video playback, site surveys, and many others. This paper describes ATAK, the SA tools that ATAK has built in, and the ways it is being used by a variety of military, homeland security, and law enforcement users.
The ISPRS Student Consortium: From launch to tenth anniversary
NASA Astrophysics Data System (ADS)
Kanjir, U.; Detchev, I.; Reyes, S. R.; Akkartal Aktas, A.; Lo, C. Y.; Miyazaki, H.
2014-04-01
The ISPRS Student Consortium is an international organization for students and young professionals in the fields of photogrammetry, remote sensing, and the geospatial information sciences. Since its start ten years ago, the number of members of the Student Consortium has been steadily growing, now reaching close to 1000. Its increased popularity, especially in recent years, is mainly due to the organization's worldwide involvement in student matters. The Student Consortium has helped organize numerous summer schools, youth forums, and student technical sessions at ISPRS-sponsored conferences. In addition, the organization publishes a newsletter and hosts several social media outlets in order to keep its global membership up to date on a regular basis. This paper describes the structure of the organization and gives some examples of its past student-related activities.
4-station ultra-rapid EOP experiment with e-VLBI technique and automated correlation/analysis
NASA Astrophysics Data System (ADS)
Kurihara, S.; Nozawa, K.; Haas, R.; Lovell, J.; McCallum, J.; Quick, J.; Hobiger, T.
2013-08-01
Since 2007, the Geospatial Information Authority of Japan (GSI) and the Onsala Space Observatory (OSO) have performed ultra-rapid dUT1 experiments, which provide near real-time dUT1 values. The technical knowledge gained has already been adopted for the regular series of the Tsukuba-Wettzell intensive session. We have now carried out 4-station ultra-rapid EOP experiments together with Hobart and HartRAO, so that we can estimate not only dUT1 but also the two polar motion parameters. In these experiments a new analysis software, c5++, developed by the National Institute of Information and Communications Technology (NICT) was used. We describe past developments and give an overview of the experiments, and conclude with their results in this report.
NASA Astrophysics Data System (ADS)
Vuorinen, Tommi; Korja, Annakaisa
2017-04-01
The FIN-EPOS consortium is a joint community of Finnish national research institutes tasked with operating and maintaining solid-Earth geophysical and geological observatories and laboratories in Finland. These national research infrastructures (NRIs) seek to join the EPOS research infrastructure (EPOS RI) and further pursue Finland's participation as a founding member in EPOS ERIC (European Research Infrastructure Consortium). Current partners of FIN-EPOS are the University of Helsinki (UH), the University of Oulu (UO), the Finnish Geospatial Research Institute (FGI) of the National Land Survey (NLS), the Finnish Meteorological Institute (FMI), the Geological Survey of Finland (GTK), CSC - IT Center for Science, and MIKES Metrology at VTT Technical Research Centre of Finland Ltd. The consortium is hosted by the Institute of Seismology, UH (ISUH). The primary purpose of the consortium is to act as a coordinating body between the various NRIs and the EPOS RI. FIN-EPOS engages in planning and development of the national EPOS RI and will provide support in the EPOS implementation phase (IP) for the partner NRIs. FIN-EPOS also promotes the awareness of EPOS in Finland and is open to new partner NRIs that would benefit from participating in EPOS. The consortium additionally seeks to advance solid-Earth science education, technologies and innovations in Finland and is actively engaging in Nordic co-operation and collaboration of solid-Earth RIs. The main short-term objective of FIN-EPOS is to make Finnish geoscientific data provided by the NRIs interoperable with the Thematic Core Services (TCS) in the EPOS IP. Consortium partners commit to applying and following the metadata and data format standards provided by EPOS. FIN-EPOS will also provide a national Finnish-language web portal where users are identified and their user rights for EPOS resources are defined.
77 FR 67831 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... from governmental, private sector, non-profit, and academic organizations, has been established to... Dialogue --National Address Database --Geospatial Priorities --NGAC Subcommittee Activities --FGDC Update...
Geospatial Data Management Platform for Urban Groundwater
NASA Astrophysics Data System (ADS)
Gaitanaru, D.; Priceputu, A.; Gogu, C. R.
2012-04-01
Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are usually redundant and spread across different institutions or private companies. Time-consuming operations such as data processing and information harmonisation are the main reasons the re-use of data is systematically avoided. Urban groundwater data show the same complex situation. The underground structures (subway lines, deep foundations, underground parking, and others), the urban utility networks (sewer systems, water supply networks, heating conduits, etc.), the drainage systems, the surface water works and many others change continuously; as a consequence, their influence on groundwater changes systematically. Because these activities provide a large quantity of data, aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, and CityGML). They allow large geospatial databases to be easily updated and shared through the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed in the framework of a national research project, "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA), financed by the National Authority for Scientific Research of Romania.
The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis tools) and a front-end geoportal service. The SIMPA platform makes use of mark-up transfer standards to provide a user-friendly application that can be accessed through the internet to query, analyse, and visualise geospatial data related to urban groundwater. The platform holds the information within the local groundwater geospatial databases, and the user is able to access these data through a geoportal service. The database architecture allows storing accurate and very detailed geological, hydrogeological, and infrastructure information that can be straightforwardly generalized and further upscaled. The geoportal service offers the possibility of querying a dataset from the spatial database. The query is coded in a standard mark-up language and sent to the server through the standard Hypertext Transfer Protocol (HTTP) to be processed by the local application. After validation of the query, the results are sent back to the user to be displayed by the geoportal application. The main advantage of the SIMPA platform is that it offers the user the ability to make a primary multi-criteria query, which results in a smaller set of data to be analysed afterwards. This improves both the transfer process parameters and the user's means of creating the desired query.
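The query flow described above (criteria encoded in a standard mark-up language and sent over HTTP) can be sketched as follows. The element and property names are simplified stand-ins in the style of an OGC filter, not SIMPA's actual schema.

```python
# Hedged sketch of encoding a multi-criteria query as XML before POSTing
# it to a geoportal. Layer and property names are hypothetical.
import xml.etree.ElementTree as ET

def build_query(layer, criteria):
    """Encode {property: value} criteria as an OGC-Filter-style XML query."""
    query = ET.Element("Query", typeName=layer)
    flt = ET.SubElement(query, "Filter")
    and_ = ET.SubElement(flt, "And") if len(criteria) > 1 else flt
    for prop, value in criteria.items():
        eq = ET.SubElement(and_, "PropertyIsEqualTo")
        ET.SubElement(eq, "PropertyName").text = prop
        ET.SubElement(eq, "Literal").text = str(value)
    return ET.tostring(query, encoding="unicode")

xml_query = build_query("groundwater:boreholes",
                        {"aquifer": "Mostistea", "depth_class": "shallow"})
print(xml_query)
# The payload would then be sent with an HTTP POST, e.g. via
# urllib.request.Request(geoportal_url, data=xml_query.encode(), method="POST")
```

Issuing the filter server-side is what gives the platform its main advantage: only the matching subset of records travels back over the network for further analysis.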
Borderless Geospatial Web (bolegweb)
NASA Astrophysics Data System (ADS)
Cetl, V.; Kliment, T.; Kliment, M.
2016-06-01
The effective access and use of geospatial information (GI) resources is of critical importance in a modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally diverse from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, to enrich its information extent. A public, worldwide and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to finish by the end of 2016. This paper provides an overview of the project, its research questions and methodology, the results achieved so far, and future steps.
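The crosswalk idea can be illustrated as a field mapping from harvested OGC capabilities entries onto ISO 19115-style metadata paths. The field names and paths below are simplified stand-ins for the real schemas, not the project's actual mapping tables.

```python
# Illustrative sketch of a metadata crosswalk: fields harvested from an
# OGC service capabilities document are mapped onto ISO-style catalogue
# metadata paths so the resource becomes discoverable in an SDI catalogue.

OGC_TO_ISO = {
    "Title": "identificationInfo/citation/title",
    "Abstract": "identificationInfo/abstract",
    "OnlineResource": "distributionInfo/transferOptions/onLine/linkage",
    "Keyword": "identificationInfo/descriptiveKeywords/keyword",
}

def crosswalk(capabilities: dict) -> dict:
    """Produce an ISO-keyed record from harvested OGC capability fields."""
    record = {}
    for ogc_field, value in capabilities.items():
        iso_path = OGC_TO_ISO.get(ogc_field)
        if iso_path is not None:  # drop fields with no ISO counterpart
            record[iso_path] = value
    return record

harvested = {"Title": "Flood extent WMS", "Abstract": "Post-event flood map",
             "OnlineResource": "http://example.org/wms", "Fees": "none"}
print(crosswalk(harvested))
```

A harvester built on a mapping like this can walk OGC endpoints found on the mainstream web and register them in a catalogue, which is the bridging role the project describes.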
Geospatial Data as a Service: Towards planetary scale real-time analytics
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.
2017-12-01
The rapid growth of Earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and different interfaces have been defined to provide data services. Unfortunately, there is considerable difference in the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of communities which generate large amounts of geospatial data. The NCI has been carrying out a significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment, and show that our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.
Making geospatial data in ASF archive readily accessible
NASA Astrophysics Data System (ADS)
Gens, R.; Hogenson, K.; Wolf, V. G.; Drew, L.; Stern, T.; Stoner, M.; Shapran, M.
2015-12-01
The way geospatial data are searched, managed, processed and used has changed significantly in recent years. A data archive such as the one at the Alaska Satellite Facility (ASF), one of NASA's twelve interlinked Distributed Active Archive Centers (DAACs), used to be searched solely via user interfaces that were specifically developed for its particular archive and data sets. ASF then moved to an application programming interface (API) that defined a set of routines, protocols, and tools for distributing the geospatial information stored in the database in real time. This provided more flexible access to the geospatial data, yet it was up to the user to develop the tools needed for more tailored access. We present two new approaches for serving data to users. In response to the recent Nepal earthquake we developed a data feed for distributing ESA's Sentinel data: users can subscribe to the feed and are provided with the relevant metadata the moment a new data set is available for download. The second approach is an Open Geospatial Consortium (OGC) web feature service (WFS). The WFS hosts the metadata along with a direct link from which the data can be downloaded. It uses the open-source GeoServer software (Youngblood and Iacovella, 2013) and provides an interface for including the geospatial information in the archive directly in the user's geographic information system (GIS) as an additional data layer. Both services run on top of a geospatial PostGIS database, an open-source geographic extension for the PostgreSQL object-relational database (Marquez, 2015). References: Marquez, A., 2015. PostGIS Essentials. Packt Publishing, 198 p. Youngblood, B. and Iacovella, S., 2013. GeoServer Beginner's Guide. Packt Publishing, 350 p.
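Consuming a WFS of the kind described above is straightforward from a script when a JSON output format is requested. The property names and sample granule below are hypothetical, not ASF's actual schema.

```python
# Hedged sketch of parsing a WFS GetFeature response (JSON output) whose
# features carry metadata plus a direct download link.
import json

sample_response = json.dumps({
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [85.3, 27.7]},
         "properties": {"granule": "S1A_20150425", "platform": "Sentinel-1A",
                        "downloadUrl": "https://example.org/S1A_20150425.zip"}},
    ],
})

def download_links(wfs_json, platform=None):
    """Extract (granule, url) pairs, optionally filtered by platform."""
    pairs = []
    for feat in json.loads(wfs_json)["features"]:
        props = feat["properties"]
        if platform is None or props["platform"] == platform:
            pairs.append((props["granule"], props["downloadUrl"]))
    return pairs

print(download_links(sample_response, platform="Sentinel-1A"))
```

Because the features also carry geometry, the same response can be loaded into a GIS as a metadata layer, which is the direct-integration benefit the abstract describes.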
Geospatial resources for the geologic community: The USGS National Map
Witt, Emitt C.
2015-01-01
Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study, because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30-m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option: any prestaged data that intersect the requested bounding box will be included, in their entirety, in the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist's first stop for geospatial information and data.
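To make the tiling scheme concrete, the sketch below computes a 1° × 1° tile name for a point, using the common convention of naming a tile by its northwest corner; the exact naming used by a given National Map product should be verified before relying on this.

```python
import math

def tile_id(lat, lon):
    """Name the 1-degree x 1-degree tile containing (lat, lon) by its
    northwest corner -- one common convention for 1-degree elevation
    tiles (an assumption here, not a documented National Map rule)."""
    north = math.floor(lat) + 1       # top edge of the containing tile
    west = math.floor(lon)            # left edge of the containing tile
    ns = "n" if north >= 0 else "s"
    ew = "e" if west >= 0 else "w"
    return f"{ns}{abs(north):02d}{ew}{abs(west):03d}"

print(tile_id(39.74, -104.99))  # a point in Denver, CO
```

A download script could map a study area's bounding box to the set of such tile names and fetch each tile once, since partial tiles are not offered.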
2007-01-01
software applications and rely on the installations to supply them with the basic I&E geospatial data - sets for those applications. Such...spatial data in geospatially based tools to help track military supplies and materials all over the world. For instance, SDDCTEA developed IRRIS, a...regional offices or individual installations to supply the data and perform QA/QC in the process. The IVT program office worked with the installations and
Leib, Kenneth J.; Linard, Joshua I.; Williams, Cory A.
2012-01-01
Elevated loads of salt and selenium can impair the quality of water for both anthropogenic and natural uses. Understanding the environmental processes controlling how salt and selenium are introduced to streams is critical to managing and mitigating the effects of elevated loads. Dominant relations between salt and selenium loads and environmental characteristics can be established by using geospatial data. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, investigated statistical relations between seasonal salt or selenium loads emanating from the Upper Colorado River Basin and geospatial data. Salt and selenium loads measured during the irrigation and nonirrigation seasons were related to geospatial variables for 168 subbasins within the Gunnison and Colorado River Basins. These geospatial variables represented subbasin characteristics of the physical environment, precipitation, geology, land use, and the irrigation network. All subbasin variables with units of area had statistically significant relations with load. The few variables that were not in units of area but were statistically significant helped to identify types of geospatial data that might influence salt and selenium loading. Following a stepwise approach, combinations of these statistically significant variables were used to develop multiple linear regression models. The models can be used to help prioritize areas where salt and selenium control projects might be most effective.
Issues on Building Kazakhstan Geospatial Portal to Implement E-Government
NASA Astrophysics Data System (ADS)
Sagadiyev, K.; Kang, H. K.; Li, K. J.
2016-06-01
A main issue in developing e-government is how to integrate and organize many complicated processes and different stakeholders. Interestingly, geospatial information provides an efficient framework for integrating and organizing them. In particular, it is very useful to integrate the land-management process in e-government with a geospatial information framework, since most land-management tasks are related to geospatial properties. In this paper, we present a use case from the e-government project in Kazakhstan for land management. We developed a geoportal to connect many tasks and different users via a geospatial information framework. The geoportal is based on open-source geospatial software including GeoServer, PostGIS, and OpenLayers. With this geoportal, we expect three achievements. First, we establish a transparent governmental process, one of the main goals of e-government: every stakeholder can monitor what is happening in the land-management process. Second, we can significantly reduce the time and effort spent in the governmental process. For example, the grant procedure for a building construction has taken more than one year and more than 50 steps; with the geoportal framework it is expected to take about two weeks. Third, we provide a collaborative environment between different governmental structures via the geoportal, in an area where conflicts and mismatches have been a critical issue of administrative processes.
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes), or any data that can be tied to the surface of a planetary body (including moons, comets, or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under research within the planetary geospatial community.
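Of the services listed, a Web Map Tile Service addresses cached tiles by (zoom, column, row). The sketch below shows the standard Web-Mercator tile arithmetic most terrestrial WMTS/XYZ services use; planetary bodies typically define their own tile matrix sets, so treat this as the Earth-based default rather than a planetary convention.

```python
import math

def wmts_tile(lon, lat, zoom):
    """Tile (zoom, column, row) for the common Web-Mercator tile
    matrix: 2**zoom columns and rows, row 0 at the north edge."""
    n = 2 ** zoom
    col = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    row = int((1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n)
    return zoom, col, row

# The tile containing (0 deg E, 0 deg N) at zoom level 1
print(wmts_tile(0.0, 0.0, 1))
```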
Geospatial data for coal beds in the Powder River Basin, Wyoming and Montana
Kinney, Scott A.; Scott, David C.; Osmonson, Lee M.; Luppens, James A.
2015-01-01
The purpose of this report is to provide geospatial data for various layers and themes in a Geographic Information System (GIS) format for the Powder River Basin, Wyoming and Montana. In 2015, as part of the U.S. Coal Resources and Reserves Assessment Project, the U.S. Geological Survey (USGS) completed an assessment of coal resources and reserves within the Powder River Basin, Wyoming and Montana. This report is supplemental to USGS Professional Paper 1809 and contains GIS data that can be used to view digital layers or themes, including the Tertiary limit of the Powder River Basin boundary, locations of drill holes, clinker, mined coal, land use and technical restrictions, geology, mineral estate ownership, coal thickness, depth to the top of the coal bed (overburden), and coal reliability categories. Larger-scale maps may be viewed using the GIS data provided in this report, in addition to the page-size maps provided in USGS Professional Paper 1809. These GIS data can also be exported to other digital applications as needed by the user. The database used for this report contains a total of 29,928 drill holes, of which 21,393 are in the public domain. The public-domain database is linked to the geodatabase in this report so that the user can access the drill-hole data through GIS applications. Results of this report are available at the USGS Energy Resources Program Web site, http://energy.usgs.gov/RegionalStudies/PowderRiverBasin.aspx.
Baker, Nancy T.
2011-01-01
This report and the accompanying geospatial data were created to assist in the analysis and interpretation of water-quality data provided by the U.S. Geological Survey's National Stream Quality Accounting Network (NASQAN) and by the U.S. Coastal Waters and Tributaries National Monitoring Network (NMN), a cooperative monitoring program of Federal, regional, and State agencies. The report describes the methods used to develop the geospatial data, which were primarily derived from the National Watershed Boundary Dataset. The geospatial data contain polygon shapefiles of basin boundaries for 33 NASQAN and 5 NMN streamflow and water-quality monitoring stations. In addition, 30 polygon shapefiles of the closed and noncontributing basins contained within the NASQAN or NMN boundaries are included, along with a point shapefile of the NASQAN and NMN monitoring stations and associated basin and station attributes. Geospatial data for basin delineations, associated closed and noncontributing basins, and monitoring station locations are available at http://water.usgs.gov/GIS/metadata/usgswrd/XML/ds641_nasqan_wbd12.xml.
Developing a distributed HTML5-based search engine for geospatial resource discovery
NASA Astrophysics Data System (ADS)
ZHOU, N.; XIA, J.; Nebert, D.; Yang, C.; Gui, Z.; Liu, K.
2013-12-01
With the explosive growth of data, Geospatial Cyberinfrastructure (GCI) components are developed to manage geospatial resources, including data discovery and data publishing. However, efficient discovery of geospatial resources remains challenging in that: (1) existing GCIs are usually developed for users of specific domains, so users may have to visit a number of GCIs to find appropriate resources; (2) the complexity of the decentralized network environment usually results in slow response and poor user experience; (3) users with different browsers and devices may have very different experiences because of the diversity of front-end platforms (e.g., Silverlight, Flash, or HTML). To address these issues, we developed a distributed, HTML5-based search engine. Specifically, (1) the search engine adopts a brokering approach to retrieve geospatial metadata from various distributed GCIs; (2) an asynchronous record-retrieval mode enhances search performance and user interactivity; and (3) being based on HTML5, the search engine provides unified access for users on different devices (e.g., tablets and smartphones).
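The brokering and asynchronous-retrieval ideas in (1) and (2) can be sketched as follows. The catalog names, latencies, and records are invented for illustration, and a real broker would issue HTTP catalog queries rather than sleep; the point is that answers are collected as they arrive, so a slow node never blocks the others.

```python
import asyncio

# Hypothetical catalog nodes: name -> (simulated latency in s, records)
CATALOGS = {
    "usgs_node": (0.02, ["elevation/dem_10m", "hydrography/nhd"]),
    "noaa_node": (0.01, ["climate/precip_monthly"]),
    "fgdc_node": (0.03, ["boundaries/counties", "elevation/lidar_1m"]),
}

async def query_catalog(name, keyword):
    """Stand-in for one remote metadata query."""
    latency, records = CATALOGS[name]
    await asyncio.sleep(latency)  # simulated network round trip
    return name, [r for r in records if keyword in r]

async def broker(keyword):
    """Fan the query out to every catalog; collect each answer as it
    completes (the asynchronous record-retrieval mode)."""
    tasks = [asyncio.create_task(query_catalog(n, keyword)) for n in CATALOGS]
    results = {}
    for fut in asyncio.as_completed(tasks):
        name, records = await fut
        results[name] = records   # a UI would render this batch now
    return results

results = asyncio.run(broker("elevation"))
```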
Center of Excellence for Geospatial Information Science research plan 2013-18
Usery, E. Lynn
2013-01-01
The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.
Abu Dhabi Basemap Update Using the LiDAR Mobile Mapping Technology
NASA Astrophysics Data System (ADS)
Alshaiba, Omar; Amparo Núñez-Andrés, M.; Lantada, Nieves
2016-04-01
Mobile LiDAR provides a new technology for updating geospatial information through direct, rapid data collection; it is faster and less costly than traditional survey methods. The Abu Dhabi Municipal System aims to update its geospatial system frequently, as government entities have invested heavily in GIS technology and geospatial data to meet the rapid growth in infrastructure and construction projects that the Emirate of Abu Dhabi has witnessed in recent years. It is therefore necessary to develop and update the basemap system frequently to meet organizational needs. Currently, traditional methods are used to update the basemap system, with human surveyors using GPS receivers and controllers (GPS-assigned computers); the surveyed data are then downloaded, edited, and reviewed manually before being merged into the basemap system. Traditional surveying may not be applicable in some conditions, such as bad weather, difficult topography, and boundary areas. This paper presents a proposed methodology that uses a mobile LiDAR system to update the Abu Dhabi basemap through daily transaction services. It aims to integrate mobile LiDAR technology into the municipality's daily workflow so that it becomes the new standard, cost-efficient operating procedure for updating the basemap in the Abu Dhabi Municipal System. The paper also demonstrates the results of the new workflow for the basemap update using the mobile LiDAR point cloud and a few processing algorithms.
Geospatial Analysis of Low-frequency Radio Signals Collected During the 2017 Solar Eclipse
NASA Astrophysics Data System (ADS)
Liles, W. C.; Nelson, J.; Kerby, K. C.; Lukes, L.; Henry, J.; Oputa, J.; Lemaster, G.
2017-12-01
The total solar eclipse of 2017, with a path that crosses the continental United States, offers a unique opportunity to gather geospatially diverse data. The EclipseMob project has been designed to crowdsource this data by building a network of citizen scientists across the country. The project focuses on gathering low-frequency radio wave data before, during, and after the eclipse. WWVB, a 60 kHz transmitter in Ft. Collins, CO operated by the National Institute of Standards and Technology, will provide the transmit signal that will be observed by project participants. Participating citizen scientists are building simple antennas and receivers designed by the EclipseMob team and provided to participants in the form of "receiver kits." The EclipseMob receiver downsamples the 60 kHz signal to 18 kHz and supplies the downsampled signal to the audio jack of a smartphone. A dedicated app is used to collect data and upload it to the EclipseMob server. By studying the variations in WWVB amplitude observed during the eclipse at over 150 locations across the country, we aim to understand how the ionization of the D layer of the ionosphere is impacted by the eclipse as a function of both time and space (location). The diverse locations of the EclipseMob participants will provide data from a wide variety of propagation paths - some crossing the path of the total eclipse, and some remaining on the same side of the eclipse path as the transmitter. Our initial data analysis will involve identifying characteristics that define geospatial relationships in the behavior of observed WWVB signal amplitudes.
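Although the abstract does not describe EclipseMob's processing chain, one standard way to track the amplitude of a narrowband tone in audio samples is the Goertzel algorithm, sketched here on a synthetic 18 kHz tone; the sample rate and block size are illustrative assumptions, not project parameters.

```python
import math

def goertzel_amplitude(samples, fs, f_target):
    """Estimate the amplitude of a single tone at f_target (Hz) in a
    block of samples taken at rate fs, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * f_target / fs)          # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    power = s1 * s1 + s2 * s2 - coeff * s1 * s2   # |X[k]|^2
    return 2.0 * math.sqrt(power) / n             # back to tone amplitude

# Synthetic test tone: 18 kHz at unit amplitude, sampled at 48 kHz
fs, f = 48000, 18000
tone = [math.cos(2 * math.pi * f * i / fs) for i in range(480)]
amp = goertzel_amplitude(tone, fs, f)
```

Run over successive blocks, such an estimator yields the amplitude-versus-time series whose geospatial variation the project analyzes.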
NASA Astrophysics Data System (ADS)
Tsai, F.; Chen, L.-C.
2014-04-01
During the past decade, Taiwan has experienced unusually fast growth in the mapping, remote sensing, and spatial information industry and related markets. A successful space program and dozens of advanced airborne and ground-based remote sensing instruments, as well as mobile mapping systems, have been implemented and put into operation to support the vast demand for geospatial data acquisition. Moreover, in addition to government agencies and research institutes, there are tens of companies in the private sector providing geospatial data and services. However, the fast-developing industry also poses a great challenge to the education sector in Taiwan, especially higher education for geospatial information. The demand for skilled professionals and for new technologies that address diversified needs is indubitably high. Consequently, while benefiting from the prosperity of the fast-growing industry, the remote sensing and spatial information disciplines in Taiwan's higher education institutes face the challenge of fulfilling these demands. This paper provides a brief insight into the status of the remote sensing and spatial information industry in Taiwan, as well as the challenges of education and technology transfer in supporting the increasing demands and ensuring the continuous development of the industry. In addition to reporting the current status of remote sensing and spatial information courses and programs in colleges and universities, current and potential threats and possible resolutions are discussed from different points of view.
GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases
NASA Astrophysics Data System (ADS)
Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz
2015-07-01
Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from
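A toy version of GeoPAT's signature-and-distance machinery: here a scene's signature is just its normalized class histogram and the distance is Jensen-Shannon divergence. GeoPAT's actual signatures and distance functions are richer, so this only illustrates the pattern of computing compact descriptors and comparing scenes through them.

```python
import math
from collections import Counter

def signature(scene):
    """Normalized class-frequency histogram of a scene (a 2-D grid of
    categorical pixel labels) -- a simplified stand-in for GeoPAT's
    richer pattern signatures."""
    counts = Counter(label for row in scene for label in row)
    total = sum(counts.values())
    return {c: n / total for c, n in counts.items()}

def jensen_shannon(p, q):
    """Dissimilarity between two signatures: 0 for identical scenes,
    up to 1 (in bits) for disjoint class sets."""
    def kl(a, b):
        return sum(a[c] * math.log2(a[c] / b[c]) for c in a if a[c] > 0)
    keys = set(p) | set(q)
    m = {c: 0.5 * (p.get(c, 0.0) + q.get(c, 0.0)) for c in keys}
    pa = {c: p.get(c, 0.0) for c in keys}
    qa = {c: q.get(c, 0.0) for c in keys}
    return 0.5 * kl(pa, m) + 0.5 * kl(qa, m)

urban = [["b", "b"], ["b", "r"]]   # mostly built-up, some road
forest = [["f", "f"], ["f", "r"]]  # mostly forest, some road
d = jensen_shannon(signature(urban), signature(forest))
```

Query-by-example then amounts to ranking every scene in the grid by its distance to a reference scene's signature.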
NASA Astrophysics Data System (ADS)
Dong, S.; Yan, Q.; Xu, Y.; Bai, J.
2018-04-01
To promote the construction of a digital geospatial framework in China and accelerate the construction of an informatized mapping system, the three-dimensional geographic information model has emerged. Models based on oblique photogrammetry have higher accuracy, shorter production periods, and lower cost than traditional methods, and can more directly reflect the elevation, position, and appearance of features. The technology for producing such models is developing rapidly: market demand and model deliverables have grown substantially, and so has the need for related quality inspection. A review of the literature shows extensive research on the basic principles and technical characteristics of oblique photogrammetry, but relatively little on quality inspection and analysis. After summarizing the basic principles and technical characteristics of oblique photogrammetry, this paper introduces the inspection contents and inspection methods for three-dimensional geographic information models based on the technology. Drawing on actual inspection work, it summarizes the quality problems of these models, analyzes the causes of the problems, and puts forward quality control measures. The paper provides technical guidance for the quality inspection of three-dimensional geographic information model data products based on oblique photogrammetry in China, and technical support for the technology's continued development.
Metadata, or data about data, describe the content, quality, condition, and other characteristics of data. Geospatial metadata are critical to data discovery and serve as the fuel for the Geospatial One-Stop data portal.
,
1999-01-01
In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.
Capacity Building on the Use of Earth Observation for Bridging the Gaps between Science and Policy
NASA Astrophysics Data System (ADS)
Thapa, R. B.; Bajracharya, B.
2017-12-01
Although geospatial technologies and Earth observation (EO) data are becoming more accessible, the lack of skilled human resources and institutional capacity is a major hurdle to effective application in the Hindu Kush Himalayan (HKH) region. Designing efficient and cost-effective capacity building (CB) programs, fitted to the needs of different users, on the use of EO information for decision making will provide options for bridging the gaps in the region. This paper presents the strategies adopted by SERVIR-HKH to strengthen the capacity of governments and development stakeholders in the region. The SERVIR-HKH hub plays a vital role in CB on EO applications by bringing together leading scientists from around the globe with the key national institutions and stakeholders in the region. We conducted country consultation workshops in Afghanistan, Bangladesh, Pakistan, and Nepal to identify national priorities, requirements, and the capacity of institutions to utilize EO information in decision making. The needs assessments focused on the four thematic areas of SERVIR, and capacity gaps in the use of EO data for policy decisions were identified in thirteen key service areas. Geospatial capacities in GIT infrastructure, data, and human resources varied. Linking EO information to policy decisions is mostly lacking, and geospatial data sharing among institutions in the region is poor. We developed a capacity building strategy for the HKH region that bridges the gaps in a coordinated manner through customized training programs, institutional strengthening, coordination, and regional cooperation. Using the strategy, we conducted training on FEWS NET remote sensing products for agro-climatological analysis, focused on technical interpretation and analysis of remote sensing and modeled products, e.g., CHIRPS, RFE2, CHIRTS, GFS, NDVI, GeoCLIM, and GeoGLAM.
Scientists from the USGS FEWS NET program delivered the training to mid-level managers and decision makers. We also carried out on-the-job training on wheat mapping using multi-sensor EO data, for the co-development of methodologies and their implementation on a sustainable basis. In this presentation, we will also present the lessons learned from capacity building efforts at SERVIR-HKH and how we envision best practices for other SERVIR hubs.
A research on the security of wisdom campus based on geospatial big data
NASA Astrophysics Data System (ADS)
Wang, Haiying
2018-05-01
A wisdom campus faces several difficulties, such as sharing geospatial big data, expanding functionality, and managing, analyzing, and mining geospatial big data; in particular, the problem that data security cannot be guaranteed has drawn increasingly prominent attention. In this article we put forward a data-oriented software architecture, designed around the ideology of orienting to data and taking data as the kernel, to solve the problems of traditional software architectures, broaden campus spatial data research, and develop wisdom campus applications.
Data management for geospatial vulnerability assessment of interdependencies in US power generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shih, C.Y.; Scown, C.D.; Soibelman, L.
2009-09-15
Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.
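The interdependency model can be caricatured as a supply graph in which a plant is impacted only when every mine-rail route feeding it is disrupted; all facility names below are invented for illustration, and the real analysis joins these relations out of a geospatial data warehouse rather than a hard-coded dictionary.

```python
# Toy dependency model: each power plant draws coal from mines over
# named rail links (all identifiers here are hypothetical).
SUPPLY = {
    "plant_A": [("mine_1", "rail_x"), ("mine_2", "rail_y")],
    "plant_B": [("mine_2", "rail_y")],
    "plant_C": [("mine_3", "rail_z")],
}

def impacted_plants(disrupted):
    """Plants that lose every supply route when the given set of mines
    and/or rail links is disrupted (e.g., inside an earthquake zone)."""
    out = []
    for plant, routes in SUPPLY.items():
        if all(mine in disrupted or rail in disrupted for mine, rail in routes):
            out.append(plant)
    return sorted(out)

# A quake severing rail_y cuts off plant_B but not plant_A, which
# still has the mine_1 / rail_x route.
print(impacted_plants({"rail_y"}))
```

In the full system, the "disrupted" set would come from a spatial query intersecting facilities with the impacted area polygon.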
Geodecision system for traceability and sustainable production of beef cattle in Brazil
NASA Astrophysics Data System (ADS)
Victoria, D. D.; Andrade, R. G.; Bolfe, L.; Batistella, M.; Pires, P. P.; Vicente, L. E.; Visoli, M. C.
2011-12-01
The sustainability of beef cattle production depends on incorporating innovative tools and technologies that are easy to comprehend, economically viable, and spatially explicit into the registration of precise, reliable data about production practices. This research developed from the needs and demands of food safety and food quality in extensive beef cattle production within the scope of the policies of the Southern Cone and European Union countries. Initially, the OTAG project (Operational Management and Geodecisional Prototype to Track and Trace Agricultural Production) focused on the development of a prototype for cattle traceability. The aim for the project's next phase is to enhance the electronic devices used in the identification and positioning of the animals, and to incorporate more management and sanitary information. In addition, we intend to structure a database that enables the inclusion of a greater amount of geospatial information linked to environmental aspects, such as water deficit, vegetation vigour, and degradation indices of pasture areas, among others. For the extraction of knowledge and the presentation of the results, we propose the development of a friendly interface to facilitate the exploration of the textual, tabular, and geospatial information useful for the user.
Screening Assessment Report and Atlas with Geospatial Data
This Navajo Nation AUM Screening Assessment Report and the accompanying Atlas with Geospatial Data documents NAUM project data collection and screening results for all known AUMs on the Navajo Nation.
NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2014 CFR
2014-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
32 CFR 320.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-07-01
... NATIONAL GEOSPATIAL-INTELLIGENCE AGENCY (NGA) PRIVACY § 320.1 Purpose and scope. (a) This part is published... whether the National Geospatial-Intelligence Agency (NGA) maintains or has disclosed a record pertaining...
FRS Geospatial Return File Format
The Geospatial Return File Format describes the format that must be used to submit latitude and longitude coordinates for use in Envirofacts mapping applications. These coordinates are stored in the Geospatial Reference Tables.
The Geoinformatica free and open source software stack
NASA Astrophysics Data System (ADS)
Jolma, A.
2012-04-01
The Geoinformatica free and open source software (FOSS) stack is built mainly on three established FOSS components: GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language with a very large set of FOSS modules for a wide range of purposes, and it can be used as an integrative tool for building applications. In the Geoinformatica stack, data stores such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used beneath the three components mentioned above. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster data, vector data, and graphs. The hydrological terrain functions are rather old and suffer, for example, from the requirement that rasters fit in memory. Newer research conducted on the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research that is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and the Perl modules.
Prototyping is supported by the GTK+ library, the GUI tools, and Perl's support for object-oriented programming. New feature types, geospatial layer classes, and tools can be defined, used, and studied as extensions with specific features. Linking with external libraries is possible using Perl's foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
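The raster algebra component described above centers on local (cell-by-cell) operations over grids. A language-agnostic sketch of such a local operation, written in Python purely for illustration (the Geoinformatica stack itself is C and Perl):

```python
def local_op(raster_a, raster_b, op, nodata=None):
    """Apply a cell-by-cell operation to two equally sized rasters.

    Rasters are lists of rows; cells equal to `nodata` propagate unchanged,
    mirroring the usual map-algebra convention.
    """
    result = []
    for row_a, row_b in zip(raster_a, raster_b):
        row = []
        for a, b in zip(row_a, row_b):
            if nodata is not None and (a == nodata or b == nodata):
                row.append(nodata)
            else:
                row.append(op(a, b))
        result.append(row)
    return result

dem = [[100, 102], [101, -9999]]
depth = [[2, 3], [1, -9999]]
# Water surface = terrain elevation + water depth, NODATA preserved.
surface = local_op(dem, depth, lambda a, b: a + b, nodata=-9999)
# surface == [[102, 105], [102, -9999]]
```

The in-memory limitation mentioned for the hydrological terrain functions is visible here too: both full grids must fit in memory, which tiled or streamed processing would avoid.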
NASA Astrophysics Data System (ADS)
Yang, Z.; Han, W.; di, L.
2010-12-01
The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, a raster-formatted, georeferenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production and transportation planning, environmental health research, and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL is available only by CD/DVD delivery, by online bulk file download via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users), or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization and browsing, no geospatial query capability, and no online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a service-oriented architecture built on web services, adopts open-standard geospatial information technology and OGC specifications and standards, and reuses functions and algorithms from GeoBrain technology (developed at George Mason University). The system provides online geospatial crop information access, query, and analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing, and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup, or downloaded for use with other desktop applications.
This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization; facilitates the use of crop geospatial information; and enables online exploration of US cropland without any client-side software installation. It also greatly reduces the need for printed paper maps, analysis reports, and physical media, and thus enhances low-carbon agro-geoinformation dissemination for decision support.
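Standards-based dissemination of this kind typically means serving the layer through OGC web services such as WMS. As an illustration of what a client request looks like (the endpoint and layer name below are placeholders, not the actual NASS service), an OGC WMS 1.3.0 GetMap URL can be assembled like this:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    `bbox` is (min_x, min_y, max_x, max_y) in the given CRS; note that for
    EPSG:4326 in WMS 1.3.0 the axis order is latitude, longitude.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", "cdl_2009",
                     (36.9, -84.3, 39.1, -80.5))
```

Because the request is a plain parameterized URL, any browser or mapping client can consume the service without installed software, which is the interoperability point the abstract makes.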
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based and designed to run on a single computer, which is a major limitation in many ways, starting from limited processing power, storage, accessibility, and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards, and prototype code, and it demonstrates a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, and scalable; it works in a distributed computing environment; it creates a real-time multiuser collaboration platform; its programming-language code and components are interoperable; and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users.
The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and on testing the scalability and availability of the services.
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan; Hargitai, Hendrik; Hare, Trent; Manaud, Nicolas; Karachevtseva, Irina; Kersten, Elke; Roatsch, Thomas; Wählisch, Marita; Kereszturi, Akos
2016-04-01
Cartography is one of the most important communication channels between users of spatial information and laymen as well as the open public alike. This applies to all known real-world objects located either here on Earth or on any other object in our Solar System. In planetary sciences, however, the main use of cartography resides in a concept called planetary mapping with all its various attached meanings: it can be (1) systematic spacecraft observation from orbit, i.e. the retrieval of physical information, (2) the interpretation of discrete planetary surface units and their abstraction, or it can be (3) planetary cartography sensu stricto, i.e., the technical and artistic creation of map products. As the concept of planetary mapping covers a wide range of different information and knowledge levels, the aims associated with it consequently range from a technical and engineering focus to a scientific distillation process. Among others, scientific centers focusing on planetary cartography are the United States Geological Survey (USGS, Flagstaff), the Moscow State University of Geodesy and Cartography (MIIGAiK, Moscow), Eötvös Loránd University (ELTE, Hungary), and the German Aerospace Center (DLR, Berlin). The International Astronomical Union (IAU), the Commission on Planetary Cartography within the International Cartographic Association (ICA), the Open Geospatial Consortium (OGC), the WG IV/8 Planetary Mapping and Spatial Databases within the International Society for Photogrammetry and Remote Sensing (ISPRS), and a range of other institutions contribute to definition frameworks in planetary cartography. Classical cartography is nowadays often (mis)understood mainly as a tool, rather than as a scientific discipline and an art of communication.
Consequently, concepts of information systems, mapping tools, and cartographic frameworks are used interchangeably, and cartographic workflows and the visualization of spatial information in thematic maps have often been neglected or left for software systems to decide by arbitrary default values. This contribution highlights the diversity of cartography as a research discipline and its different contributions to the geospatial sciences and to the communication of information and knowledge. We invite colleagues from this and other disciplines to discuss concepts and topics for joint future collaboration and research.
2011-12-01
organized and equipped along the same lines as the French gendarmerie mobile, while its counter terrorism component is a replica of the French Groupe...first responders involved in disaster relief and homeland defense operations by providing geospatial intelligence data, products, and analyses.125 4...the impact of manmade and natural disasters .126 5. Service Intelligence Units The service intelligence units of the Army, Navy, Air Force, and
Improving Situation Awareness with the Android Team Awareness Kit (ATAK)
2015-04-01
fluid user experience and enhanced data sharing. 19 6.2.2 Esri Esri is a US-based company that sells geospatial information systems and data services...field, Situational Awareness (SA) needs to be conveyed in a decentralized manner to the users at the edge of the network as well as at operations...that ATAK has built-in, and the ways it is being used by a variety of military, homeland security, and law enforcement users. Keywords: situational
NASA's Agricultural Program: A USDA/Grower Partnership
NASA Technical Reports Server (NTRS)
McKellip, Rodney; Thomas, Michael
2002-01-01
Ag20/20 is a partnership between USDA, NASA, and four national commodity associations. It is driven by the information needs of U.S. farmers. Ag20/20 is focused on utilization of earth science and remote sensing for decision-making and oriented toward economically viable operational solutions. Its purpose is to accelerate the use of remote sensing and other geospatial technologies on the farm to: 1) Increase the production efficiency of the American farmer; 2) Reduce crop production risks; 3) Improve environmental stewardship tools for agricultural production.
Geospatial Information System Capability Maturity Models
DOT National Transportation Integrated Search
2017-06-01
To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...
Geospatial Data Sciences | Energy Analysis | NREL
, demographics, and the earth's physical geography to provide the foundation for energy analysis and decision-making. Geospatial Analysis Our geographic information system
NASA Astrophysics Data System (ADS)
Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo
2017-04-01
This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that handle temporal and spatial variability with high efficiency. To this aim we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange using standard protocols [3]. Standard SDI tools provide access to static datasets and handle only spatial variability. In this paper we use the open source project GeoNode as a framework to extend the SDI functionalities so as to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to realize a comprehensive and distributed infrastructure for remote sensing data management, analysis and integration, following the standards of the Open Geospatial Consortium (OGC) [4,5]. We explain the methodology used to manage the data complexity and data integration with GeoNode. The solution presented in this work for the ingestion of DInSAR products is a very promising starting point for the future development of an OGC-compliant, semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A New Algorithm for Surface Deformation Monitoring Based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40(11), pp. 2375-2383. [2] Lanari, R., Casu, F., Manzo, M., Zeni, G., Berardino, P., Manunta, M., & Pepe, A. (2007). An Overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis, Pure Appl.
Geophys., 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D.D. (ed.) (2000). Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (www.geonode.org). [5] Kolodziej, K. (ed.) (2004). OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, 1.0.2 edition.
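A DInSAR deformation time series reduces, per pixel, to displacement samples over time, and the mean deformation velocity is commonly summarized by a least-squares line fit. A minimal pure-Python sketch of that step (independent of the GeoNode implementation described above):

```python
def mean_velocity(times, displacements):
    """Least-squares slope of displacement vs. time.

    `times` in years, `displacements` in mm; returns velocity in mm/yr.
    """
    n = len(times)
    mt = sum(times) / n
    md = sum(displacements) / n
    num = sum((t - mt) * (d - md) for t, d in zip(times, displacements))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Four acquisitions over 1.5 years showing steady subsidence.
t = [0.0, 0.5, 1.0, 1.5]
d = [0.0, -5.0, -10.0, -15.0]
# mean_velocity(t, d) == -10.0 (mm/yr)
```

Serving such per-pixel series through an SDI is what makes the temporal dimension queryable rather than baked into static map layers.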
Conference on Geospatial Approaches to Cancer Control and Population Sciences
The purpose of this conference is to bring together a community of researchers across the cancer control continuum using geospatial tools, models and approaches to address cancer prevention and control.
Staff - Michael D. Hendricks | Alaska Division of Geological & Geophysical
, HI 1999-2000, Geospatial Intelligence Officer, U.S. Army Pacific (USARPAC), Fort Shafter, HI 1995 Geospatial Intelligence Foundation (USGIF), 2011 Selected Publications Wolken, G.J., Wikstrom Jones, Katreen
NHDPlus (National Hydrography Dataset Plus)
NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water-resources-related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.
Incorporating Geographic Information Science in the BSc Environ-mental Science Program in Botswana
NASA Astrophysics Data System (ADS)
Akinyemi, Felicia O.
2018-05-01
Critical human capacity in Geographic Information Science (GISc) is developed at the Botswana International University of Science and Technology, a specialized research university. Strategies employed include GISc courses offered each semester to students from various programs, field-based projects, enrolment in online courses, geospatial initiatives with external partners, and final-year research projects utilizing geospatial technologies. A review is made of the GISc courses embedded in the Bachelor of Science Environmental Science program. GISc courses are incorporated in three Bachelor degree programs as distinct courses, and geospatial technologies are employed in several other courses. Student research projects apply GIS and remote sensing methods to environmental and geological themes. The overarching goals are to equip students in various disciplines to utilize geospatial technologies and to enhance their spatial thinking and reasoning skills.
Rea, Alan; Skinner, Kenneth D.
2012-01-01
The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. This report describes the geospatial datasets used to delineate and characterize watersheds on the StreamStats website and the methods used to develop them. The datasets for Hawaii were derived primarily from 10-meter-resolution National Elevation Dataset (NED) elevation models and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.
NHDPlusHR: A national geospatial framework for surface-water information
Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.
2016-01-01
The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.
A flexible geospatial sensor observation service for diverse sensor data based on Web service
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Yu, Genong; Min, Min
Achieving a flexible and efficient geospatial Sensor Observation Service (SOS) is difficult, given the diversity of sensor networks, the heterogeneity of sensor data storage, and the differing requirements of users. This paper describes the development of a service-oriented, multi-purpose SOS framework. The goal is to create a single method of access to the data by integrating the sensor observation service with other Open Geospatial Consortium (OGC) services: Catalogue Service for the Web (CSW), Transactional Web Feature Service (WFS-T), and Transactional Web Coverage Service (WCS-T). The framework includes an extensible sensor data adapter, an OGC-compliant geospatial SOS, a geospatial catalogue service, a WFS-T and a WCS-T for the SOS, and a geospatial sensor client. The extensible sensor data adapter finds, stores, and manages sensor data from live sensors, sensor models, and simulation systems. Abstract factory design patterns are used during design and implementation. A sensor observation service compatible with the OGC Sensor Web Enablement (SWE) framework is designed, following the OGC "core" and "transaction" specifications. It is implemented using Java servlet technology, can be easily deployed in any Java servlet container, and is automatically exposed for discovery using the Web Service Description Language (WSDL). Interaction sequences between a Sensor Web data consumer and an SOS, between a producer and an SOS, and between an SOS and a CSW are described in detail. The framework has been successfully demonstrated in application scenarios for EO-1 observations, weather observations, and water height gauge observations.
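The "extensible sensor data adapter" hinges on the factory pattern the authors mention: each family of sensor sources (live sensors, models, simulations) provides its own adapter behind one common interface, so new source types can be registered without touching the SOS core. A minimal Python sketch of that idea (class names are illustrative, not from the paper, which is implemented in Java):

```python
from abc import ABC, abstractmethod

class SensorAdapter(ABC):
    """Common interface for reading observations from any sensor source."""
    @abstractmethod
    def read(self):
        ...

class LiveSensorAdapter(SensorAdapter):
    def read(self):
        return {"source": "live", "value": 21.5}

class SimulationAdapter(SensorAdapter):
    def read(self):
        return {"source": "simulation", "value": 20.0}

class AdapterFactory:
    """Factory-style registry: maps a source kind to its adapter class."""
    _registry = {"live": LiveSensorAdapter, "simulation": SimulationAdapter}

    @classmethod
    def create(cls, kind):
        return cls._registry[kind]()

adapter = AdapterFactory.create("simulation")
obs = adapter.read()  # {"source": "simulation", "value": 20.0}
```

The SOS layer then talks only to `SensorAdapter`, which is what makes the data-storage heterogeneity invisible to clients.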
Modeling photovoltaic diffusion: an analysis of geospatial datasets
NASA Astrophysics Data System (ADS)
Davidson, Carolyn; Drury, Easan; Lopez, Anthony; Elmore, Ryan; Margolis, Robert
2014-07-01
This study combines address-level residential photovoltaic (PV) adoption trends in California with several types of geospatial information—population demographics, housing characteristics, foreclosure rates, solar irradiance, vehicle ownership preferences, and others—to identify which subsets of geospatial information are the best predictors of historical PV adoption. Number of rooms, heating source, and house age were key variables that had not been previously explored in the literature but are consistent with the expected profile of a PV adopter. The strong relationship provided by foreclosure indicators and mortgage status has less of an intuitive connection to PV adoption, but these variables may be highly correlated with characteristics inherent in PV adopters. Next, we explore how these predictive factors and model performance vary between different Investor Owned Utility (IOU) regions in California and at different spatial scales. Results suggest that models trained with small subsets of geospatial information (five to eight variables) may provide similar explanatory power to models using hundreds of geospatial variables. Further, the predictive performance of models generally decreases at higher resolution, i.e., below the ZIP code level, since several geospatial variables with coarse native resolution become less useful for representing high-resolution variations in PV adoption trends. However, for California we find that model performance improves if parameters are trained at the regional IOU level rather than the state-wide level. We also find that models trained within one IOU region are generally representative of other IOU regions in CA, suggesting that a model trained with data from one state may be applicable in another state.
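"Explanatory power" in this setting is typically measured by the coefficient of determination (R²) of a regression on the chosen predictor subset. A toy single-predictor sketch of that measurement (the data values below are invented for illustration and are not from the study):

```python
def r_squared(xs, ys):
    """Coefficient of determination for a least-squares line fit of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy)

# Invented illustration: PV adoptions per 1000 households vs. house age (years).
house_age = [5, 10, 20, 30, 40]
adoptions = [9.0, 8.2, 6.1, 4.0, 2.2]
# Here a single well-chosen variable already explains most of the variation,
# mirroring the paper's finding that small predictor subsets can suffice.
```

Comparing R² between a five-variable model and a hundreds-of-variables model is exactly the kind of comparison the abstract reports.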
Digital data collection in paleoanthropology.
Reed, Denné; Barr, W Andrew; Mcpherron, Shannon P; Bobe, René; Geraads, Denis; Wynn, Jonathan G; Alemseged, Zeresenay
2015-01-01
Understanding patterns of human evolution across space and time requires synthesizing data collected by independent research teams, and this effort is part of a larger trend to develop cyber infrastructure and e-science initiatives. At present, paleoanthropology cannot easily answer basic questions about the total number of fossils and artifacts that have been discovered, or exactly how those items were collected. In this paper, we examine the methodological challenges to data integration, with the hope that mitigating the technical obstacles will further promote data sharing. At a minimum, data integration efforts must document what data exist and how the data were collected (discovery), after which we can begin standardizing data collection practices with the aim of achieving combined analyses (synthesis). This paper outlines a digital data collection system for paleoanthropology. We review the relevant data management principles for a general audience and supplement this with technical details drawn from over 15 years of paleontological and archeological field experience in Africa and Europe. The system outlined here emphasizes free open-source software (FOSS) solutions that work on multiple computer platforms; it builds on recent advances in open-source geospatial software and mobile computing. © 2015 Wiley Periodicals, Inc.
Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access
NASA Astrophysics Data System (ADS)
Yang, C.; Huang, T.; Armstrong, E. M.; Moroni, D. F.; Liu, K.; Gui, Z.
2013-12-01
We present a Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access across the NASA data systems, the Global Earth Observation System of Systems (GEOSS) Clearinghouse, and Data.gov, to help scientists select Earth observation data that better fit their needs, in four aspects: 1. Integrating and interfacing the SMART system to include the functionality of a) semantic reasoning based on Jena, an open source semantic reasoning engine, b) semantic similarity calculation, c) recommendation based on spatiotemporal, semantic, and user workflow patterns, and d) ranking results based on similarity between search terms and the data ontology. 2. Collaborating with data user communities to a) capture science data ontology and record relevant ontology triple stores, b) analyze and mine user search and download patterns, c) integrate SMART into a metadata-centric discovery system for community-wide usage and feedback, and d) customize the data discovery, search, and access user interface to include ranked results, recommendation components, and semantics-based navigation. 3. Laying the groundwork to interface the SMART system with other data search and discovery systems as an open source data search and discovery solution. The SMART system leverages NASA, GEO, and FGDC data discovery, search, and access for the Earth science community by enabling scientists to readily discover and access data appropriate to their endeavors, increasing the efficiency of data exploration and decreasing the time that scientists must spend on searching, downloading, and processing the datasets most applicable to their research. By incorporating the SMART system, the time devoted to discovering the most applicable dataset is likely to be substantially reduced, thereby reducing the number of user inquiries and likewise the time and resources expended by a data center in addressing them.
Keywords: EarthCube; ECHO; DAACs; GeoPlatform; Geospatial Cyberinfrastructure
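The ranking step described above, scoring results by similarity between search terms and dataset terms, is commonly implemented as cosine similarity over term-frequency vectors. A minimal sketch of that approach (illustrative only, not the SMART system's actual code; the dataset names are hypothetical):

```python
import math

def cosine_similarity(query_terms, doc_terms):
    """Cosine similarity between two bags of terms (term-frequency vectors)."""
    vocab = set(query_terms) | set(doc_terms)
    q = [query_terms.count(t) for t in vocab]
    d = [doc_terms.count(t) for t in vocab]
    dot = sum(a * b for a, b in zip(q, d))
    norm = math.sqrt(sum(a * a for a in q)) * math.sqrt(sum(b * b for b in d))
    return dot / norm if norm else 0.0

def rank(query, datasets):
    """Return dataset names sorted by similarity to the query, best first."""
    scored = [(cosine_similarity(query, terms), name)
              for name, terms in datasets.items()]
    return [name for score, name in sorted(scored, reverse=True)]

datasets = {
    "sst_l4": ["sea", "surface", "temperature", "daily"],
    "aerosol": ["aerosol", "optical", "depth"],
}
order = rank(["sea", "surface", "temperature"], datasets)  # sst_l4 first
```

A semantics-aware ranker like SMART would additionally expand the query with ontology relations (synonyms, broader/narrower terms) before scoring.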
Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals
NASA Astrophysics Data System (ADS)
Zamyadi, A.; Pouliot, J.; Bédard, Y.
2013-09-01
Accessing 3D geospatial models, ideally at no cost and for unrestricted use, is certainly an important issue as such models become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR, or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differ greatly from one geo-portal to another, and even for similar 3D resources within the same geo-portal. The inventory considered 971 data resources related to elevation. 51% of them came from three geo-portals running at the Canadian federal and municipal levels whose metadata did not consider 3D models under any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve the discovery of 3D geospatial models in geo-portals by adding a specific metadata-set. Based on knowledge and current practices in 3D modeling and in 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes, organized in three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format.
The proposed metadata-set is compared with the Canadian Geospatial Data Infrastructure (CGDI) metadata, an implementation of the North American Profile of ISO 19115. The comparison analyzes the two metadata sets against three simulated scenarios for discovering needed 3D geospatial datasets. With respect to specific metadata about 3D geospatial models, the proposed metadata-set has six additional classes covering geometric dimension, level of detail, geometric modeling, topology, and appearance information. In addition, classes on data acquisition, preparation, and modeling, and on physical availability, have been specialized for 3D geospatial models.
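Structurally, the proposed profile is a record with 3D-specific fields grouped into the packages named above. A minimal sketch of what such a record could look like as a data structure (the field names paraphrase the classes listed in the abstract; they are not the authors' exact schema):

```python
from dataclasses import dataclass, field

@dataclass
class Metadata3D:
    """Illustrative 3D-model metadata record; fields paraphrase the
    General / Complementary / Availability packages described above."""
    # General: contextual information
    title: str
    geometric_dimension: int          # e.g. 3 for a true 3D model
    level_of_detail: str              # e.g. "LoD2"
    # Complementary: structural information
    geometric_modeling: str           # e.g. "B-rep", "CSG"
    has_topology: bool = False
    appearance: list = field(default_factory=list)  # e.g. ["texture"]
    # Availability: transition from storage to delivery format
    delivery_format: str = "CityGML"

record = Metadata3D(
    title="Quebec City campus model",   # hypothetical example resource
    geometric_dimension=3,
    level_of_detail="LoD2",
    geometric_modeling="B-rep",
    has_topology=True,
    appearance=["texture"],
)
```

It is precisely these fields (dimension, level of detail, modeling approach, topology, appearance) that 2D-oriented profiles such as the ISO 19115 North American Profile lack, which is the gap the paper addresses.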
Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU
NASA Astrophysics Data System (ADS)
Vlahovic, G.; Malhotra, R.
2009-12-01
This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that can serve as a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation's first state-supported public liberal arts college for African Americans. In the most recent annual ranking of America's best black colleges by US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C.V. Dixon, Geography in Historically Black Colleges/Universities in the Southeast, in The Role of the South in the Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to strengthen the talent pool and diversity of the geospatial discipline. The successful creation of research and internship pathways for NCCU students therefore has national implications, because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts are described, including research and internship projects with Fugro EarthData Inc., the Center for Remote Sensing and Mapping Science at the University of Georgia, the Center for Earthquake Research and Information at the University of Memphis, and the City of Durham. The authors also outline the requirements and recent successes of the ASPRS Provisional Certification Program, developed and pioneered as a collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer review and the certification exam.
At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and Service (GRITS) Center housed in the Department of Environmental, Earth and Geospatial Sciences. The GRITS center was established in 2006 with funding from the National Science Foundation to promote the learning and application of geospatial technologies. Since then GRITS has been a hub for Geographical Information Science (GIS) curriculum development, faculty and professional GIS workshops, grant writing and outreach efforts. The Center also serves as a contact point for partnerships with other universities, national organizations and businesses in the geospatial arena - and as a result, opens doors to the professional world for our graduate and undergraduate students.
The National Map product and services directory
Newell, Mark R.
2008-01-01
As one of the cornerstones of the U.S. Geological Survey's (USGS) National Geospatial Program (NGP), The National Map is a collaborative effort among the USGS and other Federal, state, and local partners to improve and deliver topographic information for the Nation. It has many uses ranging from recreation to scientific analysis to emergency response. The National Map is easily accessible for display on the Web, as products, and as downloadable data. The geographic information available from The National Map includes orthoimagery (aerial photographs), elevation, geographic names, hydrography, boundaries, transportation, structures, and land cover. Other types of geographic information can be added to create specific types of maps. Of major importance, The National Map currently is being transformed to better serve the geospatial community. The USGS National Geospatial Program Office (NGPO) was established to provide leadership for placing geographic knowledge at the fingertips of the Nation. The office supports The National Map, Geospatial One-Stop (GOS), National Atlas of the United States®, and the Federal Geographic Data Committee (FGDC). This integrated portfolio of geospatial information and data supports the essential components of delivering the National Spatial Data Infrastructure (NSDI) and capitalizing on the power of place.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba
2016-01-07
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructures, reconnaissance-level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify hydropower opportunities that were previously overlooked. To enhance the characterization of higher-energy-density stream-reaches, this paper explored the sensitivity of geospatial resolution on the identification of hydropower stream-reaches using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulation was conducted with eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher-energy-density stream-reaches can be identified with increasing spatial resolution. Both Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be applied when selecting the discretization resolution for hydropower resource assessments in future studies.
Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.
Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen
2015-01-01
Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data from their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
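As a concrete illustration, the two steps in the abstract (mining an access correlation matrix from logs, then a placement heuristic) might look like the following stdlib-only sketch. The session format, co-access weighting, and greedy tie-breaking are assumptions of this example, not the authors' algorithm:

```python
from collections import defaultdict
from itertools import combinations

def access_correlation(sessions):
    """Count how often each pair of image files is requested together in
    the same access-log session (a simple co-access measure)."""
    corr = defaultdict(int)
    for files in sessions:
        for a, b in combinations(sorted(set(files)), 2):
            corr[(a, b)] += 1
    return corr

def distribute(files, corr, n_nodes):
    """Greedy heuristic: place each file on the node where it adds the
    least co-access weight, so files that tend to be requested together
    land on different nodes and can be fetched in parallel."""
    nodes = [set() for _ in range(n_nodes)]
    weight = defaultdict(int)           # place the most co-accessed files first
    for (a, b), w in corr.items():
        weight[a] += w
        weight[b] += w
    for f in sorted(files, key=lambda f: -weight[f]):
        def penalty(node):
            return sum(corr.get(tuple(sorted((f, g))), 0) for g in node)
        best = min(range(n_nodes),
                   key=lambda i: (penalty(nodes[i]), len(nodes[i])))
        nodes[best].add(f)
    return nodes
```

With sessions `[["a","b"], ["a","b"], ["a","c"]]` and two nodes, the frequently co-accessed pair "a" and "b" ends up on different nodes, which is the parallel-access effect the paper measures.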
A Geospatial Data Recommender System based on Metadata and User Behaviour
NASA Astrophysics Data System (ADS)
Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; Finch, C. J.; McGibbney, L. J.
2017-12-01
Earth observations are produced at high velocity by real-time sensors, reaching tera- to peta-bytes of geospatial data daily. Discovering and accessing the right data within this mass of geospatial data is like finding a needle in a haystack. To help researchers find the right data for study and decision support, much research has focused on improving search performance, including recommendation algorithms. However, few papers have discussed how to implement a recommendation algorithm in a geospatial data retrieval system. To address this problem, we propose a recommendation engine that improves the discovery of relevant geospatial data by mining and utilizing metadata and user behaviour data: 1) metadata-based recommendation considers the correlation of each attribute (i.e., spatiotemporal, categorical, and ordinal) to the data to be found; in particular, a phrase extraction method is used to improve the accuracy of the description similarity; 2) user behaviour data are utilized to predict the interest of a user through collaborative filtering; 3) an integration method is designed to combine the results of the above two methods to achieve better recommendation. Experiments show that in the hybrid recommendation list, all precisions are larger than 0.8 from position 1 to 10.
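A minimal sketch of such a hybrid engine, assuming toy sparse term vectors for metadata and simple user-based collaborative filtering; the paper's actual similarity measures, phrase extraction, and integration weights are not reproduced here:

```python
import math
from collections import defaultdict

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def metadata_scores(query, metadata):
    """Content-based part: similarity of each dataset's metadata terms
    to the query terms."""
    return {d: cosine(query, terms) for d, terms in metadata.items()}

def cf_scores(user, behaviour):
    """Collaborative part: predict interest from users whose access
    histories resemble this user's."""
    scores = defaultdict(float)
    for other, seen in behaviour.items():
        if other == user:
            continue
        sim = cosine(behaviour[user], seen)
        for d, w in seen.items():
            if d not in behaviour[user]:
                scores[d] += sim * w
    return scores

def hybrid(query, metadata, user, behaviour, alpha=0.5):
    """Integration step: a weighted combination of the two rankings."""
    meta = metadata_scores(query, metadata)
    cf = cf_scores(user, behaviour)
    keys = set(meta) | set(cf)
    return sorted(keys, key=lambda d: -(alpha * meta.get(d, 0.0)
                                        + (1 - alpha) * cf.get(d, 0.0)))
```

The `alpha` weight is the simplest possible integration method; a learned combination would play the same role.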
Siu, Joey; Maran, Nikki; Paterson-Brown, Simon
2016-06-01
The importance of non-technical skills in improving surgical safety and performance is now well recognised. A better understanding is needed of the impact that the non-technical skills of the multi-disciplinary theatre team have on intra-operative incidents in the operating room (OR), using structured theatre-based assessment. The interaction of the non-technical skills that influence the surgical safety of the OR team is explored and made more transparent. Between May and August 2013, a range of procedures in general and vascular surgery at the Royal Infirmary of Edinburgh were observed. Non-technical skills behavioural markers and associated intra-operative incidents were recorded using established behavioural marking systems (NOTSS, ANTS and SPLINTS). Adherence to the surgical safety checklist was also observed. A total of 51 procedures were observed, with 90 recorded incidents, 57 of which were considered avoidable. Poor situational awareness was a common problem for surgeons and anaesthetists, leading to most intra-operative incidents. Poor communication and teamwork across the whole OR team had a generally large impact on intra-operative incidents. Leadership was shown to be an essential set of skills for surgeons, as demonstrated by the high correlation of poor leadership with intra-operative incidents. Team-working and management skills appeared to be especially important for anaesthetists in the recovery from an intra-operative incident. A significant number of avoidable incidents occur during operative procedures. These can all be linked to failures in non-technical skills. Better training of both individuals and teams in non-technical skills is needed in order to improve patient safety in the operating room.
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. 
In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
Sensor Webs and Virtual Globes: Enabling Understanding of Changes in a partially Glaciated Watershed
NASA Astrophysics Data System (ADS)
Heavner, M.; Fatland, D. R.; Habermann, M.; Berner, L.; Hood, E.; Connor, C.; Galbraith, J.; Knuth, E.; O'Brien, W.
2008-12-01
The University of Alaska Southeast is currently implementing a sensor web identified as the SouthEast Alaska MOnitoring Network for Science, Telecommunications, Education, and Research (SEAMONSTER). SEAMONSTER is operating in the partially glaciated Mendenhall and Lemon Creek watersheds, in the Juneau area, on the margins of the Juneau Icefield. These watersheds are studied for both (1) long-term monitoring of changes and (2) detection and analysis of transient events (such as glacier lake outburst floods). The heterogeneous sensors (meteorologic, dual-frequency GPS, water quality, lake level, etc.), power and bandwidth constraints, and competing time scales of interest require autonomous reactivity of the sensor web. They also present challenges for operational management of the sensor web, and the harsh conditions on the glaciers impose additional operating constraints. The tight integration of the sensor web and virtual globe technology enhances the project in multiple ways. We are utilizing virtual globe infrastructures to enhance both sensor web management and data access. SEAMONSTER utilizes virtual globes for education and public outreach, sensor web management, data dissemination, and enabling collaboration. Using a PostgreSQL database with GIS extensions coupled to the Open Geospatial Consortium (OGC) GeoServer, we generate near-real-time, auto-updating geobrowser files of the data in multiple OGC standard formats (e.g., KML, WCS). Additionally, embedding wiki pages in this database allows the development of a geospatially aware wiki describing the projects for better public outreach and education. In this presentation we will describe how we have implemented these technologies to date, the lessons learned, and our efforts towards greater OGC standard implementation. A major focus will be on demonstrating how geobrowsers and virtual globes have made this project possible.
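For reference, an auto-updating geobrowser feed of the kind described is typically delivered as a KML NetworkLink with an `onInterval` refresh, per the OGC KML standard; a minimal generator might look like the sketch below (the name and URL are placeholders, not SEAMONSTER's real endpoint):

```python
def network_link_kml(name, href, refresh_s=60):
    """Build a minimal auto-refreshing KML NetworkLink document.

    The client geobrowser re-fetches `href` every `refresh_s` seconds,
    which is how a server-side feed becomes "near-real-time" in the
    virtual globe without any client-side code."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>{name}</name>
    <Link>
      <href>{href}</href>
      <refreshMode>onInterval</refreshMode>
      <refreshInterval>{refresh_s}</refreshInterval>
    </Link>
  </NetworkLink>
</kml>"""
```

A server such as GeoServer would serve the actual placemark data at `href`; this wrapper only tells the client how often to come back for it.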
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
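The factor of safety itself is often screened with the standard infinite-slope model; the sketch below uses that textbook simplification as an illustration, not the paper's numerical model or its fitted regression function:

```python
import math

def factor_of_safety(c, gamma, z, beta_deg, u, phi_deg):
    """Infinite-slope factor of safety, a standard simplification used in
    regional landslide hazard screening.

    c: effective cohesion (kPa), gamma: soil unit weight (kN/m^3),
    z: failure-plane depth (m), beta_deg: slope angle (degrees),
    u: pore water pressure (kPa), phi_deg: effective friction angle."""
    b = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma * z * math.cos(b) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(b) * math.cos(b)
    return resisting / driving
```

For a dry, cohesionless slope this reduces to tan(phi)/tan(beta), so FS = 1 exactly when the slope angle equals the friction angle; rising pore pressure (the simulated climate input above) lowers FS.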
GIO-EMS and International Collaboration in Satellite based Emergency Mapping
NASA Astrophysics Data System (ADS)
Kucera, Jan; Lemoine, Guido; Broglia, Marco
2013-04-01
During the last decade, satellite based emergency mapping has developed into a mature operational stage. The European Union's GMES Initial Operations - Emergency Management Service (GIO-EMS) has been operational since April 2012. Its setup differs from other mechanisms (for example, the International Charter "Space and Major Disasters") in that it extends fast satellite tasking and delivery with value-adding map production as a single service, available free of charge to the authorized users of the service. Maps and vector datasets with standard characteristics and formats, ranging from post-disaster damage assessment to recovery and disaster prevention, are covered by this initiative. The main users of the service are European civil protection authorities and international organizations active in humanitarian aid. All non-sensitive outputs of the service are accessible to the public. The European Commission's in-house science service, the Joint Research Centre (JRC), is the technical and administrative supervisor of the GIO-EMS. The EC's DG ECHO Monitoring and Information Centre acts as the service's focal point, and DG ENTR is responsible for overall service governance. GIO-EMS also aims to contribute to synergy with similar existing mechanisms at national and international level. The usage of satellite data for emergency mapping has increased in recent years, and this trend is expected to continue as suitable satellite and other relevant data become more easily accessible. Furthermore, the data and analyses coming from volunteer emergency mapping communities are expected to further enrich the content of such cartographic products. In the case of major disasters, the parallel activity of multiple providers is likely to generate non-optimal use of resources, e.g. unnecessary duplication, whereas coordination may reduce the time needed to cover the disaster area.
Furthermore, the abundance of geospatial products of differing characteristics and quality can become confusing for users. The urgent need for better coordination has led to the establishment of the International Working Group on Satellite Based Emergency Mapping (IWG-SEM). Members of the IWG-SEM, which include JRC, USGS, DLR-ZKI, SERVIR, Sentinel Asia, UNOSAT, UN-SPIDER, GEO, ITHACA and SERTIT, have recognized the need to establish best practice between operational satellite-based emergency mapping programs. The group intends to:
• work with the appropriate organizations on the definition of professional standards for emergency mapping and guidelines for product generation, and review relevant technical standards and protocols;
• facilitate communication and collaboration during major emergencies;
• stimulate coordination of expertise and capacities.
The existence of the group and the cooperation among its members already brought benefits during the disasters in Africa and Europe in 2012, in terms of faster and more effective satellite data provision and better product generation.
Representation of activity in images using geospatial temporal graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph; McLendon, III, William C.; Parekh, Ojas D.
Various technologies pertaining to modeling patterns of activity observed in remote sensing images using geospatial-temporal graphs are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Activity patterns may be discerned from the graphs by coding nodes representing persistent objects like buildings differently from nodes representing ephemeral objects like vehicles, and examining the geospatial-temporal relationships of ephemeral nodes within the graph.
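A minimal sketch of such a geospatial-temporal graph, assuming a plain-Python structure; the actual encoding and activity-pattern queries in the described technologies may differ:

```python
from dataclasses import dataclass, field

@dataclass
class GeoTemporalGraph:
    """Objects are nodes, undirected edges carry spatial relationships
    within one image, and directed edges link observations across time."""
    nodes: dict = field(default_factory=dict)     # id -> {"kind": ..., "t": ...}
    spatial: set = field(default_factory=set)     # frozenset({a, b}) pairs
    temporal: list = field(default_factory=list)  # (earlier_id, later_id)

    def add_node(self, nid, kind, t):
        # kind distinguishes persistent objects (buildings) from
        # ephemeral ones (vehicles), which is how activity is coded.
        self.nodes[nid] = {"kind": kind, "t": t}

    def add_spatial(self, a, b):
        self.spatial.add(frozenset((a, b)))

    def add_temporal(self, earlier, later):
        assert self.nodes[earlier]["t"] < self.nodes[later]["t"]
        self.temporal.append((earlier, later))

    def ephemeral_near(self, nid):
        # Ephemeral objects adjacent to a given node: a crude activity
        # cue ("vehicles observed next to this building").
        return [b for e in self.spatial if nid in e
                for b in e - {nid}
                if self.nodes[b]["kind"] == "ephemeral"]
```

Pattern mining would then examine the geospatial-temporal relationships of the ephemeral nodes, e.g. repeated visits to the same persistent node across image times.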
Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support
NASA Astrophysics Data System (ADS)
Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.
2017-12-01
The approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some already available practical developments are described, including city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection, and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting using the Forrester-Graham system dynamics model is provided for the Kiev urban area.
Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data
NASA Technical Reports Server (NTRS)
Baxes, Gregory; Mixon, Brian; Linger, TIm
2013-01-01
Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent, dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset.
The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
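The quadtree traversal described above can be sketched in a few lines; the (west, south, east, north) region layout and the boundary tie-breaking are assumptions of this illustration, not the method's actual data structure:

```python
def subdivide(region):
    """Split a (west, south, east, north) region into its four quadtree
    children -- the zoom-in step, where one region becomes four."""
    w, s, e, n = region
    mx, my = (w + e) / 2.0, (s + n) / 2.0
    return [(w, s, mx, my), (mx, s, e, my),
            (w, my, mx, n), (mx, my, e, n)]

def region_at(lon, lat, root, lod):
    """Descend from the root region to the child covering (lon, lat) at
    the requested level of detail -- the tile a dynamically generated
    KML response would direct the client to fetch next."""
    region = root
    for _ in range(lod):
        region = next(r for r in subdivide(region)
                      if r[0] <= lon <= r[2] and r[1] <= lat <= r[3])
    return region
```

Zooming out is the inverse: the four children collapse back into their single lower-LOD parent, so only one region's imagery is live at each step.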
Benefits of using Open Geo-spatial Data for valorization of Cultural Heritage: GeoPan app
NASA Astrophysics Data System (ADS)
Cuca, Branka; Previtali, Mattia; Barazzetti, Luigi; Brumana, Raffaella
2017-04-01
Experts consider spatial data to be one of the categories of Public Sector Information (PSI) whose exchange is particularly important. On the other side, an initiative with a great vision such as the Digital Agenda for Europe emphasizes intelligent processing of information as an essential factor in tackling the challenges of contemporary society. In this context, Open Data are considered crucial in addressing environmental pressures, energy efficiency issues, land use and climate change, pollution and traffic management. Furthermore, Open Data are thought to have an important impact on more informed decision making and policy creation across multiple domains, which could be addressed even through the "apps" of our smart devices. Activities performed in the ENERGIC OD project ("European NEtwork for Redistributing Geospatial Information to user Communities - Open Data") have led to some first conclusions on the use and re-use of geo-spatial Open Data by means of Virtual Hubs, an innovative method for the brokering of geo-spatial information. This paper illustrates some main benefits of using Open Geo-spatial Data for the valorisation of Cultural Heritage through the case of an innovative app called "GeoPan Atl@s". GeoPan, set in the dynamic policy context described, aims to provide all information valuable for sustainable territorial development in a common platform, in particular material that regards the history and changes of the cultural landscapes in the Lombardy region. Furthermore, this innovative app is used as a test-bed to facilitate and encourage a more active exchange and exploitation of open geo-spatial information for purposes of valorisation of cultural heritage and landscapes. The aim of this practice is also to achieve more active participation of experts, VGI communities and citizens, and a higher awareness of the multiple use-possibilities of historic and contemporary geo-spatial information for smarter decision making.
Cuomo, Raphael E; Mackey, Tim K
2014-12-02
To explore healthcare policy and system improvements that would more proactively respond to future penetration of counterfeit cancer medications in the US drug supply chain using geospatial analysis. A statistical and geospatial analysis of areas that received notices from the Food and Drug Administration (FDA) about the possibility of counterfeit Avastin penetrating the US drug supply chain. Data from FDA warning notices were compared to data from 44 demographic variables available from the US Census Bureau via correlation, means testing and geospatial visualisation. Results were interpreted in light of existing literature in order to recommend improvements to the surveillance of counterfeit medicines. This study analysed 791 distinct healthcare provider addresses that received FDA warning notices across 30,431 zip codes in the USA. Statistical outputs were Pearson's correlation coefficients and t values. Geospatial outputs were cartographic visualisations. These data were used to generate the overarching study outcome, which was a recommendation for a drug safety surveillance strategy congruent with the existing literature on counterfeit medication. Zip codes with greater numbers of individuals aged 65+ and greater numbers of ethnic white individuals were most correlated with receipt of a counterfeit Avastin notice. Geospatial visualisations designed in conjunction with statistical analysis of demographic variables appeared more capable of suggesting areas and populations that may be at risk for undetected counterfeit Avastin penetration. This study suggests that the dual incorporation of statistical and geospatial analysis in the surveillance of counterfeit medicine may be helpful in guiding efforts to prevent, detect and visualise counterfeit medicine penetration in the US drug supply chain and other settings. Importantly, the information generated by these analyses could be utilised to identify at-risk populations associated with demographic characteristics.
Stakeholders should explore these results as another tool to improve on counterfeit medicine surveillance.
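The statistical half of such an analysis can be sketched with a plain Pearson correlation over zip-code aggregates; the variable names and toy data below are illustrative, not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson's r between a demographic variable and per-zip-code
    notice counts (the correlation step of the surveillance screen)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sxy / (sx * sy) if sx and sy else 0.0

def rank_variables(notice_counts, demographics):
    """Rank census variables by |r| with notice receipt, mirroring a
    screen across many demographic variables."""
    return sorted(demographics,
                  key=lambda v: -abs(pearson(demographics[v], notice_counts)))
```

The geospatial half would then map the top-ranked variables per zip code, which is where the cartographic visualisations add value beyond the correlation table.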
The report discusses technical and non-technical considerations associated with the development and operation of landfill gas to energy projects. Much of the report is based on interviews and site visits with the major developers and operators of the more than 110 projects in the...
Study on analysis from sources of error for Airborne LIDAR
NASA Astrophysics Data System (ADS)
Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.
2016-11-01
With the advancement of aerial photogrammetry, airborne LIDAR offers a new technical means of obtaining geospatial information of high spatial and temporal resolution, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of earth observation technology, in which a laser scanner mounted on an aviation platform emits and receives laser pulses to obtain high-precision, high-density three-dimensional point cloud coordinates and intensity information. In this paper, we briefly describe airborne LIDAR systems and analyse in detail the error sources of airborne LIDAR data, putting forward corresponding methods to avoid or eliminate them. Taking into account practical engineering applications, recommendations are developed for these designs, which have crucial theoretical and practical significance in the field of airborne LIDAR data processing.
The NatCarb geoportal: Linking distributed data from the Carbon Sequestration Regional Partnerships
Carr, T.R.; Rich, P.M.; Bartley, J.D.
2007-01-01
The Department of Energy (DOE) Carbon Sequestration Regional Partnerships are generating the data for a "carbon atlas" of key geospatial data (carbon sources, potential sinks, etc.) required for rapid implementation of carbon sequestration on a broad scale. The NATional CARBon Sequestration Database and Geographic Information System (NatCarb) provides Web-based, nation-wide data access. Distributed computing solutions link partnerships and other publicly accessible repositories of geological, geophysical, natural resource, infrastructure, and environmental data. Data are maintained and enhanced locally, but assembled and accessed through a single geoportal. NatCarb, as a first attempt at a national carbon cyberinfrastructure (NCCI), assembles the data required to address technical and policy challenges of carbon capture and storage. We present a path forward to design and implement a comprehensive and successful NCCI. © 2007 The Haworth Press, Inc. All rights reserved.
Business models for implementing geospatial technologies in transportation decision-making
DOT National Transportation Integrated Search
2007-03-31
This report describes six State DOTs' business models for implementing geospatial technologies. It provides a comparison of the organizational factors influencing how Arizona DOT, Delaware DOT, Georgia DOT, Montana DOT, North Carolina DOT, and Okla...
Automated Geospatial Watershed Assessment
The Automated Geospatial Watershed Assessment (AGWA) tool is a Geographic Information Systems (GIS) interface jointly developed by the U.S. Environmental Protection Agency, the U.S. Department of Agriculture (USDA) Agricultural Research Service, and the University of Arizona to a...
Geospatial Data Science Applications and Visualizations | Geospatial Data
Since before the time of Google Maps, NREL has used the internet to allow stakeholders to view and explore geospatial data. Around the world, these maps drive understanding. See our collection of key maps for examples.
Geospatial data for 303(d) Impaired Waters are available as prepackaged national downloads or as GIS web and data services. EPA provides geospatial data in the following formats: GIS-compatible shapefiles and geodatabases, and ESRI and OGC web mapping services.
DOT National Transportation Integrated Search
2009-11-01
This report examines two linked phenomena in transportation planning: the geospatial analysis capabilities of local planning agencies and the increasing demands on such capabilities imposed by comprehensive planning mandates.
GEOSPATIAL DATA ACCURACY ASSESSMENT
The development of robust accuracy assessment methods for the validation of spatial data represents a difficult scientific challenge for the geospatial science community. The importance and timeliness of this issue is related directly to the dramatic escalation in the developmen...
Publications - RI 2001-1C | Alaska Division of Geological & Geophysical
Map of the Chulitna region, southcentral Alaska, scale 1:63,360 (7.5 M). Digital Geospatial Data: Chulitna region surficial geology.
Publications - RDF 2015-17 | Alaska Division of Geological & Geophysical
DOI: 10.14509/29519. Report: rdf2015_017.pdf (347.0 K). Digital Geospatial Data: Tonsina geochemistry, DGGS samples.
Demopoulos, Amanda W.J.; Foster, Ann M.; Jones, Michal L.; Gualtieri, Daniel J.
2011-01-01
The Geospatial Characteristics GeoPDF of Florida's Coastal and Offshore Environments is a comprehensive collection of geospatial data describing the political and natural resources of Florida. This interactive map provides spatial information on bathymetry, sand resources, military areas, marine protected areas, cultural resources, locations of submerged cables, and shipping routes. The map should be useful to coastal resource managers and others interested in the administrative and political boundaries of Florida's coastal and offshore region. In particular, as oil and gas explorations continue to expand, the map may be used to explore information regarding sensitive areas and resources in the State of Florida. Users of this geospatial database will find that they have access to synthesized information in a variety of scientific disciplines concerning Florida's coastal zone. This powerful tool provides a one-stop assembly of data that can be tailored to fit the needs of many natural resource managers.
Jones, Benjamin M.; Arp, Christopher D.; Whitman, Matthew S.; Nigro, Debora A.; Nitze, Ingmar; Beaver, John; Gadeke, Anne; Zuck, Callie; Liljedahl, Anna K.; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido
2017-01-01
Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analyzing the geospatial database relative to fish and bird survey data shows relations to lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.
The role of visualization in learning from computer-based images
NASA Astrophysics Data System (ADS)
Piburn, Michael D.; Reynolds, Stephen J.; McAuliffe, Carla; Leedy, Debra E.; Birk, James P.; Johnson, Julia K.
2005-05-01
Among the sciences, the practice of geology is especially visual. To assess the role of spatial ability in learning geology, we designed an experiment using: (1) web-based versions of spatial visualization tests, (2) a geospatial test, and (3) multimedia instructional modules built around QuickTime Virtual Reality movies. Students in control and experimental sections were administered measures of spatial orientation and visualization, as well as a content-based geospatial examination. All subjects improved significantly in their scores on spatial visualization and the geospatial examination. There was no change in their scores on spatial orientation. A three-way analysis of variance, with the geospatial examination as the dependent variable, revealed significant main effects favoring the experimental group and a significant interaction between treatment and gender. These results demonstrate that spatial ability can be improved through instruction, that learning of geological content will improve as a result, and that differences in performance between the genders can be eliminated.
National hydrography dataset--linear referencing
Simley, Jeffrey; Doumbouya, Ariel
2012-01-01
Geospatial data normally have a certain set of standard attributes, such as an identification number, the type of feature, and the name of the feature. These standard attributes are typically embedded in the default attribute table, which is directly linked to the geospatial features. However, it is impractical to embed too much information, because doing so can create a complex, inflexible, and hard-to-maintain geospatial dataset. Many scientists prefer a modular, or relational, data design in which the information about the features is stored and maintained separately, then linked to the geospatial data. For example, information about the water chemistry of a lake can be maintained in a separate file and linked to the lake. A Geographic Information System (GIS) can then relate the water chemistry to the lake and analyze it as one piece of information. For example, the GIS can select all lakes larger than 50 acres with turbidity greater than 1.5 milligrams per liter.
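The lake-chemistry query described above can be sketched as a relational join followed by a filter; the table contents, field names, and values below are invented for illustration.

```python
# Minimal sketch of the relational design: lake feature attributes live in
# one table, water chemistry in another, linked by a shared identifier.

lakes = [
    {"lake_id": 101, "name": "Clear Lake", "acres": 120.0},
    {"lake_id": 102, "name": "Mud Pond", "acres": 35.5},
    {"lake_id": 103, "name": "Long Lake", "acres": 78.2},
]

chemistry = {
    101: {"turbidity_mg_per_l": 0.8},
    102: {"turbidity_mg_per_l": 2.4},
    103: {"turbidity_mg_per_l": 1.9},
}

def select_lakes(min_acres, min_turbidity):
    """Join the two tables on lake_id, then filter -- the kind of query a
    GIS performs when relating chemistry records to lake features."""
    result = []
    for lake in lakes:
        chem = chemistry.get(lake["lake_id"])
        if chem is None:
            continue  # no chemistry record linked to this lake
        if lake["acres"] > min_acres and chem["turbidity_mg_per_l"] > min_turbidity:
            result.append(lake["name"])
    return result

print(select_lakes(50, 1.5))  # lakes over 50 acres with turbidity > 1.5 mg/L
```

The design choice mirrors the abstract's point: the chemistry table can be maintained and updated independently of the geometry, and the join happens only at query time.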
U.S. EPA's Geospatial Data Access Project
To improve public health and the environment, the United States Environmental Protection Agency (EPA) collects information about facilities, sites, or places subject to environmental regulation or of environmental interest. Through the Geospatial Data Download Service, the public is now able to download the EPA Geodata Shapefile, Feature Class or extensible markup language (XML) file containing facility and site information from EPA's national program systems. The files are Internet accessible from the Envirofacts Web site (https://www3.epa.gov/enviro/). The data may be used with geospatial mapping applications. (Note: The files omit facilities without latitude/longitude coordinates.) The EPA Geospatial Data contains the name, location (latitude/longitude), and EPA program information about specific facilities and sites. In addition, the files contain a Uniform Resource Locator (URL), which allows mapping applications to present an option to users to access additional EPA data resources on a specific facility or site.
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, the Ministry of Science and Technology, and the Ministry of Construction of P.R. China in 2002. After five years of building, the informatization level of Dongying has reached an advanced degree. To advance the building of Digital Dongying and realize geospatial data sharing, geographic information sharing standards were drawn up and put into practice. Second, the Digital Dongying Geographic Information Sharing Platform was constructed and developed; it is a highly integrated platform of WebGIS, 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, etc. It provides an indispensable platform for sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of Digital Dongying geospatial data have come into practice and good results have been obtained. However, a strong leadership group remains necessary for data sharing and interoperation.
NASA Astrophysics Data System (ADS)
Kulo, Violet; Bodzin, Alec
2013-02-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.
Regenbogen, Scott E; Greenberg, Caprice C; Studdert, David M; Lipsitz, Stuart R; Zinner, Michael J; Gawande, Atul A
2007-11-01
To identify the most prevalent patterns of technical errors in surgery, and evaluate commonly recommended interventions in light of these patterns. The majority of surgical adverse events involve technical errors, but little is known about the nature and causes of these events. We examined characteristics of technical errors and common contributing factors among closed surgical malpractice claims. Surgeon reviewers analyzed 444 randomly sampled surgical malpractice claims from four liability insurers. Among 258 claims in which injuries due to error were detected, 52% (n = 133) involved technical errors. These technical errors were further analyzed with a structured review instrument designed by qualitative content analysis. Forty-nine percent of the technical errors caused permanent disability; an additional 16% resulted in death. Two-thirds (65%) of the technical errors were linked to manual error, 9% to errors in judgment, and 26% to both manual and judgment error. A minority of technical errors involved advanced procedures requiring special training ("index operations"; 16%), surgeons inexperienced with the task (14%), or poorly supervised residents (9%). The majority involved experienced surgeons (73%), and occurred in routine, rather than index, operations (84%). Patient-related complexities (including emergencies, difficult or unexpected anatomy, and previous surgery) contributed to 61% of technical errors, and technology or systems failures contributed to 21%. Most technical errors occur in routine operations with experienced surgeons, under conditions of increased patient complexity or systems failure. Commonly recommended interventions, including restricting high-complexity operations to experienced surgeons, additional training for inexperienced surgeons, and stricter supervision of trainees, are likely to address only a minority of technical errors.
Surgical safety research should instead focus on improving decision-making and performance in routine operations for complex patients and circumstances.
3D geospatial visualizations: Animation and motion effects on spatial objects
NASA Astrophysics Data System (ADS)
Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos
2018-02-01
Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial markup languages (e.g. KML) and open programming libraries (JavaScript) makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities combined with the above visualization capabilities, for example animation effects on selected areas of the terrain texture (e.g. sea waves) and motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Toward this end, we developed, and made available to the research community, an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models on user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.
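The motion effect described above (a model moving along a dynamically defined georeferenced path) ultimately reduces to sampling the path at each animation frame. A minimal, framework-neutral sketch in Python, with invented waypoint coordinates (lon, lat, height):

```python
# Piecewise-linear sampling of a georeferenced path: given waypoints and a
# normalized parameter t in [0, 1], return the interpolated position. A
# renderer (e.g. a WebGL scene) would call this once per frame to
# reposition the 3D model.

def interpolate_path(waypoints, t):
    if t <= 0:
        return waypoints[0]
    if t >= 1:
        return waypoints[-1]
    n = len(waypoints) - 1          # number of segments
    scaled = t * n
    i = min(int(scaled), n - 1)     # segment index
    f = scaled - i                  # fraction within the segment
    a, b = waypoints[i], waypoints[i + 1]
    return tuple(a[k] + f * (b[k] - a[k]) for k in range(3))

# hypothetical flight path over a terrain: (lon, lat, height in meters)
path = [(23.55, 41.10, 0.0), (23.60, 41.12, 50.0), (23.65, 41.15, 10.0)]
print(interpolate_path(path, 0.5))  # position halfway along the path
```

A production implementation would interpolate along geodesic arcs and clamp height to the DEM, but the per-frame sampling structure is the same.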
Assessing and Valuing Historical Geospatial Data for Decisions
NASA Astrophysics Data System (ADS)
Sylak-Glassman, E.; Gallo, J.
2016-12-01
We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data is widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data is used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys, to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data is collected in conjunction with all other EO data within a weighted framework, its contribution to meeting key Federal objectives can be specifically identified and evaluated in relation to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.
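The value-tree calculation can be illustrated with a toy example: each objective carries a weight, experts report each data source's reliance share per objective, and a source's overall contribution is the weighted sum of its shares. The objective names, weights, and shares below are hypothetical, not taken from the study.

```python
# Hypothetical value-tree sketch: weighted Federal objectives and
# expert-elicited reliance shares per EO data source.

objective_weights = {"disaster_response": 0.5, "land_use": 0.3, "climate": 0.2}

# reliance[objective][source]: shares summing to 1 within each objective
reliance = {
    "disaster_response": {"historical_archive": 0.4, "current_sensors": 0.6},
    "land_use":          {"historical_archive": 0.7, "current_sensors": 0.3},
    "climate":           {"historical_archive": 0.5, "current_sensors": 0.5},
}

def contribution(source):
    """Weighted sum of a source's reliance shares across all objectives."""
    return sum(objective_weights[obj] * shares.get(source, 0.0)
               for obj, shares in reliance.items())

print(round(contribution("historical_archive"), 3))
```

Because the historical archive appears alongside all other sources in the same weighted framework, its contribution is directly comparable to that of current sensors, which is the point the abstract makes.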
Prediction of fish and sediment mercury in streams using landscape variables and historical mining.
Alpers, Charles N; Yee, Julie L; Ackerman, Joshua T; Orlando, James L; Slotton, Darrel G; Marvin-DiPasquale, Mark C
2016-11-15
Widespread mercury (Hg) contamination of aquatic systems in the Sierra Nevada of California, U.S., is associated with historical use to enhance gold (Au) recovery by amalgamation. In areas affected by historical Au mining operations, including the western slope of the Sierra Nevada and downstream areas in northern California, such as San Francisco Bay and the Sacramento River-San Joaquin River Delta, microbial conversion of Hg to methylmercury (MeHg) leads to bioaccumulation of MeHg in food webs, and increased risks to humans and wildlife. This study focused on developing a predictive model for THg in stream fish tissue based on geospatial data, including land use/land cover data, and the distribution of legacy Au mines. Data on total mercury (THg) and MeHg concentrations in fish tissue and streambed sediment collected during 1980-2012 from stream sites in the Sierra Nevada, California were combined with geospatial data to estimate fish THg concentrations across the landscape. THg concentrations of five fish species (Brown Trout, Rainbow Trout, Sacramento Pikeminnow, Sacramento Sucker, and Smallmouth Bass) within stream sections were predicted using multi-model inference based on Akaike Information Criteria, using geospatial data for mining history and landscape characteristics as well as fish species and length (r² = 0.61, p < 0.001). Including THg concentrations in streambed sediment did not improve the model's fit; however, including MeHg concentrations in streambed sediment, organic content (loss on ignition), and sediment grain size resulted in an improved fit (r² = 0.63, p < 0.001). These models can be used to estimate THg concentrations in stream fish based on landscape variables in the Sierra Nevada in areas where direct measurements of THg concentration in fish are unavailable. Published by Elsevier B.V.
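The multi-model inference step ranks candidate models by AIC; a standard follow-on is to convert AIC values into Akaike weights, which express the relative support for each candidate. A minimal sketch with invented AIC values (the formula is the standard one; the numbers are not from the study):

```python
import math

# Akaike weights: w_i = exp(-Delta_i / 2) / sum_j exp(-Delta_j / 2),
# where Delta_i = AIC_i - min(AIC).

def akaike_weights(aics):
    best = min(aics)
    deltas = [a - best for a in aics]
    rel = [math.exp(-d / 2.0) for d in deltas]
    total = sum(rel)
    return [r / total for r in rel]

aics = [210.4, 212.1, 218.9]  # hypothetical candidate-model AIC values
weights = akaike_weights(aics)
print([round(w, 3) for w in weights])
```

In a multi-model-inference workflow, these weights are then used to average predictions (here, fish THg estimates) across the candidate models rather than committing to a single best model.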
Strategic planning of INA-CORS development for public service and tectonic deformation study
NASA Astrophysics Data System (ADS)
Syetiawan, Agung; Gaol, Yustisi Ardhitasari Lumban; Safi'i, Ayu Nur
2017-07-01
GPS technology can be applied for surveying, mapping and research purposes. The simplicity of GPS positioning makes it the first choice for surveys compared with other positioning methods. GPS can measure a position at various accuracy levels depending on the measurement method. To facilitate GPS positioning, many organizations are establishing permanent GPS stations, which the National Geodetic Survey (NGS) calls Continuously Operating Reference Stations (CORS). These devices continuously collect and record GPS data for use by users. CORS stations have been built by several government agencies for particular purposes and are scattered throughout Indonesia. The Geospatial Information Agency (BIG), as a geospatial information provider, has begun to compile a grand design of Indonesian CORS (INA-CORS) that can be used for public services such as Real Time Kinematic (RTK) positioning, RINEX data requests, and post-processing services, and for tectonic deformation studies to determine deformation models of Indonesia and to evaluate the national geospatial reference system. This study aims to review the ideal locations for developing the CORS network distribution. The method used was to perform spatial analysis on the distribution of BIG and BPN CORS stations overlaid with the Seismotectonic Map of Indonesia and land cover. The ideal condition to be achieved is that a CORS station will be available within every 50 km radius. The results showed that the CORS distribution in Java and Nusa Tenggara is already dense, while in Sumatra, Celebes and the Moluccas it still needs to be densified. Meanwhile, the development of CORS in Papua will encounter obstacles in road access and networking. This analysis can be used as a consideration in determining the priorities of CORS development in Indonesia.
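The 50 km design radius suggests a simple coverage test: compute the great-circle distance from a candidate site to its nearest CORS station. A sketch using the standard haversine formula; the station coordinates are invented for illustration, not actual INA-CORS sites.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# hypothetical existing CORS sites as (lat, lon)
stations = [(-6.9, 107.6), (-7.3, 112.7)]

def covered(lat, lon, radius_km=50.0):
    """True if some station lies within the design radius of the site."""
    return any(haversine_km(lat, lon, s[0], s[1]) <= radius_km for s in stations)

print(covered(-6.95, 107.7))  # near the first hypothetical station
print(covered(-2.5, 118.0))   # far from any hypothetical station
```

Running this test over a grid of candidate sites, masked by land cover and seismotectonic zones, is one way to locate the coverage gaps the study describes.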
Prediction of fish and sediment mercury in streams using landscape variables and historical mining
Alpers, Charles N.; Yee, Julie L.; Ackerman, Joshua T.; Orlando, James L.; Slotton, Darrell G.; Marvin-DiPasquale, Mark C.
2016-01-01
Widespread mercury (Hg) contamination of aquatic systems in the Sierra Nevada of California, U.S., is associated with historical use to enhance gold (Au) recovery by amalgamation. In areas affected by historical Au mining operations, including the western slope of the Sierra Nevada and downstream areas in northern California, such as San Francisco Bay and the Sacramento River–San Joaquin River Delta, microbial conversion of Hg to methylmercury (MeHg) leads to bioaccumulation of MeHg in food webs, and increased risks to humans and wildlife. This study focused on developing a predictive model for THg in stream fish tissue based on geospatial data, including land use/land cover data, and the distribution of legacy Au mines. Data on total mercury (THg) and MeHg concentrations in fish tissue and streambed sediment collected during 1980–2012 from stream sites in the Sierra Nevada, California were combined with geospatial data to estimate fish THg concentrations across the landscape. THg concentrations of five fish species (Brown Trout, Rainbow Trout, Sacramento Pikeminnow, Sacramento Sucker, and Smallmouth Bass) within stream sections were predicted using multi-model inference based on Akaike Information Criteria, using geospatial data for mining history and landscape characteristics as well as fish species and length (r² = 0.61, p < 0.001). Including THg concentrations in streambed sediment did not improve the model's fit; however, including MeHg concentrations in streambed sediment, organic content (loss on ignition), and sediment grain size resulted in an improved fit (r² = 0.63, p < 0.001). These models can be used to estimate THg concentrations in stream fish based on landscape variables in the Sierra Nevada in areas where direct measurements of THg concentration in fish are unavailable.
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis, big data and otherwise, of subsurface natural and engineered systems are based on variable resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom "Big Data" geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap via the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's Geostatistical Analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
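Setting the Hadoop machinery aside, the data-reduction idea behind the VGM (replace dense point data with multi-resolution, non-overlapping attributed cells) can be sketched as a recursive quadtree aggregation. The split threshold, point values, and per-cell summaries below are invented for illustration and are not the authors' implementation.

```python
# Recursively split a region into quadrants until each cell holds few
# enough points, then keep only per-cell summaries: the mean value plus
# the point count as a crude uncertainty proxy. Dense areas end up with
# small cells, sparse areas with large ones.

def reduce_grid(points, x0, y0, x1, y1, max_pts=2, depth=0, max_depth=6):
    inside = [p for p in points if x0 <= p[0] < x1 and y0 <= p[1] < y1]
    if not inside:
        return []
    if len(inside) <= max_pts or depth == max_depth:
        mean = sum(p[2] for p in inside) / len(inside)
        return [((x0, y0, x1, y1), mean, len(inside))]
    xm, ym = (x0 + x1) / 2, (y0 + y1) / 2
    cells = []
    for (ax, ay, bx, by) in [(x0, y0, xm, ym), (xm, y0, x1, ym),
                             (x0, ym, xm, y1), (xm, ym, x1, y1)]:
        cells += reduce_grid(inside, ax, ay, bx, by, max_pts, depth + 1, max_depth)
    return cells

# hypothetical observations as (x, y, value) in a unit square
pts = [(0.1, 0.1, 5.0), (0.2, 0.15, 7.0), (0.8, 0.9, 3.0),
       (0.82, 0.88, 4.0), (0.15, 0.2, 6.0)]
cells = reduce_grid(pts, 0.0, 0.0, 1.0, 1.0)
print(len(cells), "cells")
```

In a MapReduce setting the same decomposition parallelizes naturally: mappers bin points by cell key and reducers emit the per-cell summaries.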
Integrated Sustainable Planning for Industrial Region Using Geospatial Technology
NASA Astrophysics Data System (ADS)
Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek
2012-07-01
Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as important and modern tools for mapping and monitoring natural resources as well as amenities and infrastructure. The huge, voluminous spatial databases generated by various remote sensing platforms need proper management, including storage, retrieval, manipulation and analysis, to extract the desired information, which is beyond the capability of the human brain; this is where computer-aided GIS technology came into existence. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information-processing tasks. A GIS with major input from remote sensing satellites, applied to natural resource management, must be able to handle spatiotemporal data and support spatiotemporal queries and other spatial operations. Natural resources are a common heritage that we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation in which we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners since the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore), which gave fruitful results, but no serious efforts have been made to involve the entire region, and no use has been made of the latest geospatial techniques (Remote Sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare data for monitoring and for planning developmental activities in the future.
NASA Astrophysics Data System (ADS)
Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.
2016-12-01
The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by the consent of Pakistan and China in 1979 as a friendship highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region of China. Due to its manifold geology/geomorphology, soil formation, steep slopes, and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e. land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. The damaging effects of these geohazards frequently jeopardize life in the region. To ascertain the nature and frequency of disasters and to zone vulnerability, a rating and management (logistic) analysis was made to investigate the spatiotemporal distribution of natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully studied, while slope, aspect, elevation, profile curvature and rock hardness were calculated by different techniques. Geospatial analyses were conducted to assess the nature and intensity of the hazards, and the magnitude of each factor was gauged using logistic regression, with every relevant variable integrated into the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of hazards documented in recent years, and a precision of more than 88.1% was realized. The study demonstrated the model's validity by highlighting the close agreement between the vulnerability mapping and past documented hazards. Based on the receiver operating characteristic curve, the logistic regression model gave satisfactory results. The outcomes will be useful in sustainable land-use and infrastructure planning, mainly in high-risk zones, for reducing economic damage and improving community welfare.
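The logistic-regression step can be illustrated on toy data: fit a hazard probability from terrain predictors, then threshold it to assign vulnerability zones. The predictors, labels, and the tiny gradient-descent fit below are invented for illustration and are not the study's model.

```python
import math

# Each training row is ((slope_deg, rainfall_index), hazard_documented).
data = [((10, 0.2), 0), ((15, 0.3), 0), ((35, 0.8), 1),
        ((40, 0.9), 1), ((20, 0.4), 0), ((30, 0.7), 1)]

def sigmoid(z):
    if z < -60: return 0.0   # clamp to avoid overflow for extreme inputs
    if z > 60: return 1.0
    return 1.0 / (1.0 + math.exp(-z))

# plain stochastic gradient descent on the log-loss
w = [0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(3000):
    for (x, y) in data:
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = p - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

def hazard_prob(slope, rain):
    """Fitted probability that a location with these predictors is hazardous."""
    return sigmoid(w[0] * slope + w[1] * rain + b)

print(round(hazard_prob(45, 0.95), 2))  # steep, wet: high vulnerability
print(round(hazard_prob(5, 0.10), 2))   # gentle, dry: low vulnerability
```

In the actual workflow each raster cell along the corridor would be scored this way, and zone boundaries (e.g. low/medium/high) would come from thresholding the probability surface, validated against the documented-hazard inventory via the ROC curve.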
NASA Astrophysics Data System (ADS)
Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.
2012-04-01
The rapid development in the last few years of advanced smart communication tools with good-quality, high-resolution video cameras, audio and GPS devices will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnection of these "Future Internet Things" forms a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will be of great importance to the study of fauna and flora in the near future, particularly the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is achieved with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for managing environmental data intelligently, with tagged contextual geospatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground-truth corrections and learning by the specific enablers.
Spatial Information Technology Center at Fulton-Montgomery Community College
NASA Technical Reports Server (NTRS)
Flinton, Michael E.
2004-01-01
The Spatial Information Technology Center (SITC) at Fulton-Montgomery Community College (FMCC) continued to fulfill its mission and charter by successfully completing its third year of operations under Congressional funding and NASA sponsorship. Third year operations (01 Oct 02 - 30 Sep 03) have been funded and conducted utilizing two authorized Research Grants NAG 13-00043 (via a one-year no-cost extension expiring Sep 03) and NAG 13-02053 (one-year no-cost extension expiring Sep 04). Drawdowns and reporting of fiscal activities for SITC operations continue to pass through the Institute for the Application of Geo-spatial Technology (IAGT) at Cayuga Community College in Auburn, New York. Fiscal activity of the Center is reported quarterly via SF 272 to IAGT, thus this report contains only a budgetary overview and forecast of future expenditures for the remaining funds of NAG 13-02053. Funds from NAG 13-00043 were exhausted during the fourth quarter of fiscal year FY02-03, which necessitated the initial drawdown of NAG 13-02053. The IAGT receives no compensation for administrative costs as authorized and approved by NASA in each award budget. This report also includes the necessary addendums for each NAG award, as required by federal guidelines, though no reportable activities took place within this report period. Attached are the signed Report of New Technology/Inventions and a Final Property Report identifying qualifying equipment purchased by the Center. As an academic, economic and workforce development oriented program, the Center has made significant strides in bringing the technology, knowledge and applications of the spatial information technology field to the region it serves. Through the mission of the Center, the region's educational, economic development and workforce communities have become increasingly educated to the benefits of spatial (geospatial) technology, particularly in the region's K-12 arena.
SITC continues to positively affect the region's education, employment and economic development, while expanding its services and operations designed to be customer driven, growing infrastructure and effecting systemic change.
Geospatial Data Science Data and Tools | Geospatial Data Science | NREL
Need help sizing a residential photovoltaic system? Want to know what renewable energy resources are available? NREL's geospatial data science tools help users apply NREL's geographic information system expertise to their own projects.
THE NEVADA GEOSPATIAL DATA BROWSER
The Nevada Geospatial Data Browser was developed by the Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) with the assistance and collaboration of the University of Idaho (Moscow, ID) and Lockheed-Martin Environmental Services (Las Vegas, NV).