To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure
NASA Astrophysics Data System (ADS)
Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne
2012-08-01
A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.
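A minimal sketch of how ontology-backed resource discovery of the kind described above might look in practice, using rdflib; the ex: namespace, class names and the flood-mapping example are purely illustrative assumptions, not the paper's actual information model.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/gki#")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# A dataset annotated with the domain concept and analysis method that produced it
g.add((EX.floodMap2011, RDF.type, EX.Dataset))
g.add((EX.floodMap2011, RDFS.label, Literal("Flood extent map 2011")))
g.add((EX.floodMap2011, EX.aboutConcept, EX.Flooding))
g.add((EX.floodMap2011, EX.usesMethod, EX.HydrodynamicModelling))
g.add((EX.HydrodynamicModelling, RDFS.subClassOf, EX.AnalysisMethod))

# Discovery query: find resources about flooding together with the method used
results = g.query("""
    PREFIX ex: <http://example.org/gki#>
    SELECT ?res ?m WHERE {
        ?res ex:aboutConcept ex:Flooding ;
             ex:usesMethod ?m .
    }
""")
for row in results:
    print(row.res, row.m)
```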
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-06-01
Information has long been a key factor for military organizations. In a military context, the success of joint and combined operations depends on accurate information and knowledge flow concerning the operational theatre: provision of resources, evolution of the environment, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical integration of geospatial information is critical for the decision cycle. Information and knowledge management are fundamental to clarifying an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data, collected by human or electronic sensors, to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatially and temporally referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify stakeholders' requirements for a military spatial data infrastructure as well as the requirements for future software system development.

Infrastructure for the Geospatial Web
NASA Astrophysics Data System (ADS)
Lake, Ron; Farley, Jim
Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
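As a toy illustration of the GML side of such infrastructure, the fragment below encodes a single, hypothetical road feature and reads it back with lxml; the application schema and feature are invented for illustration and are not drawn from the chapter.

```python
from lxml import etree

# A minimal, hypothetical GML fragment describing one road feature
gml_doc = b"""
<RoadCollection xmlns:gml="http://www.opengis.net/gml">
  <Road>
    <gml:name>Highway 1</gml:name>
    <centerline>
      <gml:LineString srsName="EPSG:4326">
        <gml:posList>-123.1 49.2 -123.0 49.3</gml:posList>
      </gml:LineString>
    </centerline>
  </Road>
</RoadCollection>
"""

root = etree.fromstring(gml_doc)
ns = {"gml": "http://www.opengis.net/gml"}
for road in root.findall("Road"):
    name = road.findtext("gml:name", namespaces=ns)
    coords = road.findtext(".//gml:posList", namespaces=ns)
    print(name, "->", coords)
```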
New Geodetic Infrastructure for Australia: The NCRIS / AuScope Geospatial Component
NASA Astrophysics Data System (ADS)
Tregoning, P.; Watson, C. S.; Coleman, R.; Johnston, G.; Lovell, J.; Dickey, J.; Featherstone, W. E.; Rizos, C.; Higgins, M.; Priebbenow, R.
2009-12-01
In November 2006, the Australian Federal Government announced AU$15.8M in funding for geospatial research infrastructure through the National Collaborative Research Infrastructure Strategy (NCRIS). Funded within a broader capability area titled ‘Structure and Evolution of the Australian Continent’, NCRIS has provided a significant investment across Earth imaging, geochemistry, numerical simulation and modelling, the development of a virtual core library, and geospatial infrastructure. Known collectively as AuScope (www.auscope.org.au), this capability area has brought together Australia’s leading Earth scientists to decide upon the most pressing scientific issues and infrastructure needs for studying Earth systems and their impact on the Australian continent. Importantly, and at the same time, the investment in geospatial infrastructure offers the opportunity to raise Australian geodetic science capability to the highest international level into the future. The geospatial component of AuScope builds on the AU$15.8M of direct funding through the NCRIS process with significant in-kind and co-investment from universities and State/Territory and Federal government departments. The infrastructure to be acquired includes an FG5 absolute gravimeter, three gPhone relative gravimeters, three 12.1 m radio telescopes for geodetic VLBI, a continent-wide network of continuously operating geodetic-quality GNSS receivers, a trial of a mobile SLR system and access to updated cluster computing facilities. We present an overview of the AuScope geospatial capability, review the current status of the infrastructure procurement and discuss some examples of the scientific research that will utilise the new geospatial infrastructure.
Facilitating Data-Intensive Education and Research in Earth Science through Geospatial Web Services
ERIC Educational Resources Information Center
Deng, Meixia
2009-01-01
The realm of Earth science (ES) is increasingly data-intensive. Geoinformatics research attempts to robustly smooth and accelerate the flow of data to information, information to knowledge, and knowledge to decisions and to supply necessary infrastructure and tools for advancing ES. Enabling easy access to and use of large volumes of ES data and…
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the capabilities of the end-to-end application service and virtualized computing framework in HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.
NASA Astrophysics Data System (ADS)
Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.
2017-12-01
The ability to test hypotheses about hydrology, geomorphology and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, there remain significant educational, logistical and time-investment barriers to their use. Knowledge infrastructure is an emerging intellectual framework to understand how people are creating, sharing and distributing knowledge - which has been dramatically transformed by Internet technologies. In addition to the technical and social components in a cyberinfrastructure system, knowledge infrastructure considers the educational, institutional, and open source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for achieving data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook, based on ROGER - the first cyberGIS supercomputer - so that models can be elastically reproduced through cloud computing approaches. Our team of geomorphologists, hydrologists, and computer geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore their shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modelling applications to prototyping new science tools, hands-on research demonstrations for training workshops, and classroom applications. Computational geospatial models based on big data and high performance computing can now be more efficiently developed, improved, scaled, and seamlessly reproduced among multidisciplinary users, thereby expanding the active-learning curriculum and research opportunities for students in earth surface modeling and informatics.
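A minimal sketch of the kind of Landlab experiment that could be packaged as a HydroShare resource and executed in a CyberGIS-Jupyter notebook; the grid size, diffusivity and run length are arbitrary illustration values, not the authors' configuration.

```python
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((25, 25), xy_spacing=10.0)          # 25 x 25 nodes, 10 m spacing
z = grid.add_zeros("topographic__elevation", at="node")    # flat initial surface
z += np.random.rand(z.size)                                # small random roughness

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)   # hillslope diffusivity, m^2/yr
for _ in range(200):                                       # 200 steps of 100 yr each
    diffuser.run_one_step(100.0)

print(z.reshape(grid.shape).round(2))                      # smoothed topography
```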
Transportation of Large Wind Components: A Review of Existing Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mooney, Meghan; Maclaurin, Galen
2016-09-01
This report features the geospatial data component of a larger project evaluating logistical and infrastructure requirements for transporting oversized and overweight (OSOW) wind components. The goal of the larger project was to assess the status and opportunities for improving the infrastructure and regulatory practices necessary to transport wind turbine towers, blades, and nacelles from current and potential manufacturing facilities to end-use markets. The purpose of this report is to summarize existing geospatial data on wind component transportation infrastructure and to provide a data gap analysis, identifying areas for further analysis and data collection.
Geospatial decision support framework for critical infrastructure interdependency assessment
NASA Astrophysics Data System (ADS)
Shih, Chung Yan
Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems and emergency services, are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruption in the supply chain may cause profound cascading effects on other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and that the availability of electric power could be reduced within one to two months. Therefore, this research aimed at representing and assessing the interdependencies between coal supply, transportation and energy production. A geospatial decision support framework was proposed and applied to analyze interdependency-related disruption impacts. By utilizing a data warehousing approach, geospatial and non-geospatial data were retrieved, integrated and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state, etc.) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usage; finding alternative coal suppliers, etc.).
Challenges in sharing of geospatial data by data custodians in South Africa
NASA Astrophysics Data System (ADS)
Kay, Sissiel E.
2018-05-01
As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, a key objective of which is to share geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.
Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert
1998-01-01
The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.
A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating
NASA Astrophysics Data System (ADS)
Tian, W.; Zhu, X.; Liu, Y.
2012-08-01
Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is limited by the professional department's projects and is usually too long for the end-user; taking the data from collection to publication costs the professional department too much time and energy; and the geospatial information does not provide sufficiently detailed attributes. Finding an effective way to deal with these problems has therefore become necessary. Emerging Internet technologies, 3S techniques and the geographic information knowledge now widespread among the public have promoted the booming development of volunteered geospatial information in the geosciences. Volunteered geospatial information (VGI) is a current "hotspot" that attracts many researchers to study its data quality, credibility, accuracy, sustainability, social benefit, applications and so on. In addition, a few scholars have also paid attention to the value of VGI in supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating and publication of new data products to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
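One possible, simplified way to implement the matching and change-detection steps of such a bottom-up cycle with geopandas/shapely is sketched below; the file names, tolerance value and Hausdorff-distance rule are assumptions for illustration, not the paper's actual method.

```python
import geopandas as gpd

sdi = gpd.read_file("sdi_roads.gpkg")   # authoritative SDI vector data (hypothetical file)
vgi = gpd.read_file("vgi_roads.gpkg")   # volunteered data, same projected CRS assumed

TOLERANCE = 20.0  # metres: search distance for candidate homonymous features

changed = []
for idx, ref in sdi.iterrows():
    # candidate VGI features whose geometry lies within the tolerance buffer
    candidates = vgi[vgi.intersects(ref.geometry.buffer(TOLERANCE))]
    if candidates.empty:
        changed.append((idx, "possibly deleted"))
        continue
    # crude geometric-change test: Hausdorff distance to the nearest candidate
    d = candidates.geometry.apply(ref.geometry.hausdorff_distance).min()
    if d > TOLERANCE:
        changed.append((idx, "geometry changed"))

print(changed)
```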
The National Geospatial Technical Operations Center
Craun, Kari J.; Constance, Eric W.; Donnelly, Jay; Newell, Mark R.
2009-01-01
The United States Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) provides geospatial technical expertise in support of the National Geospatial Program in its development of The National Map, National Atlas of the United States, and implementation of key components of the National Spatial Data Infrastructure (NSDI).
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.
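For readers unfamiliar with WPS, the sketch below shows how a client might invoke a geospatial operation through OWSLib; the endpoint URL, process identifier and inputs are placeholders, not the actual IDEE services.

```python
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("https://example.org/wps")        # hypothetical endpoint
print([p.identifier for p in wps.processes])                 # list advertised processes

execution = wps.execute(
    "territorial:landCoverStats",                            # hypothetical process id
    inputs=[("region", "ES.Aragon"), ("year", "2006")],      # hypothetical literal inputs
)
monitorExecution(execution)                                  # poll until the job finishes
print(execution.status)
```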
The National Map product and services directory
Newell, Mark R.
2008-01-01
As one of the cornerstones of the U.S. Geological Survey's (USGS) National Geospatial Program (NGP), The National Map is a collaborative effort among the USGS and other Federal, state, and local partners to improve and deliver topographic information for the Nation. It has many uses ranging from recreation to scientific analysis to emergency response. The National Map is easily accessible for display on the Web, as products, and as downloadable data. The geographic information available from The National Map includes orthoimagery (aerial photographs), elevation, geographic names, hydrography, boundaries, transportation, structures, and land cover. Other types of geographic information can be added to create specific types of maps. Of major importance, The National Map currently is being transformed to better serve the geospatial community. The USGS National Geospatial Program Office (NGPO) was established to provide leadership for placing geographic knowledge at the fingertips of the Nation. The office supports The National Map, Geospatial One-Stop (GOS), National Atlas of the United States®, and the Federal Geographic Data Committee (FGDC). This integrated portfolio of geospatial information and data supports the essential components of delivering the National Spatial Data Infrastructure (NSDI) and capitalizing on the power of place.
The Value of Information - Accounting for a New Geospatial Paradigm
NASA Astrophysics Data System (ADS)
Pearlman, J.; Coote, A. M.
2014-12-01
A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as Infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet clearly for many of the largest corporations, such as Google and Facebook, it is their principal asset. This is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few - statistics, archiving and mapping agencies are perhaps the only examples - so it is not at the top of the agenda for Government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate, and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates: it needs to be maintained, otherwise its functionality and value in use declines. We have coined the term geo-infonomics to encapsulate this concept. This presentation will develop the arguments around its importance and current avenues of research.
Geospatial Data as a Service: Towards planetary scale real-time analytics
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.
2017-12-01
The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data. There are currently different interfaces that have been defined to provide data services. Unfortunately, there are considerable differences in the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of these activities, which generate large amounts of geospatial data. The NCI has been making a significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use cases that drove GSKY's collaborative design, development and production deployment. We show that our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.
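The snippet below sketches how a client could request a rendered map from an OGC-standard WMS endpoint of this kind using OWSLib; the URL, layer name and time value are placeholders and do not describe the real GSKY service.

```python
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/ows", version="1.1.1")  # hypothetical endpoint
print(list(wms.contents))                                        # advertised layers

img = wms.getmap(
    layers=["ndvi_monthly"],                  # hypothetical layer name
    srs="EPSG:4326",
    bbox=(110.0, -45.0, 155.0, -10.0),        # roughly Australia (lon/lat)
    size=(512, 512),
    format="image/png",
    time="2017-01-01",                        # hypothetical time slice
)
with open("ndvi.png", "wb") as f:
    f.write(img.read())
```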
Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development
NASA Astrophysics Data System (ADS)
Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.
2014-11-01
The dependence of mountain communities on natural resources, rapid social and developmental changes, disaster proneness and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its distinctive geographic setting and great physical and cultural diversity, presents a formidable challenge to collecting and managing data and information and to understanding its varied socio-ecological settings. Recent advances in earth observation, near real-time data and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process, and generate information and how we use such information for societal benefit. Glacier dynamics, land cover change, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision-making systems over the region. The emergence and adoption of near real-time systems, unmanned aerial vehicles (UAVs), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems and community-based adaptation models tailored to the mountain-specific environment. There are differentiated capacities among the ICIMOD regional member countries with regard to the utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, geospatial platforms supporting a Himalayan GEO, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way in evolving sustainable Himalayan livelihoods.
A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data
NASA Astrophysics Data System (ADS)
Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.
2017-12-01
Big Data is becoming a norm in geoscience domains. A platform that is capable of efficiently managing, accessing, analyzing, mining, and learning from big data for new information and knowledge is desired. This paper introduces our latest effort to develop such a platform based on our past years' experience with cloud and high-performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the second layer is a cloud computing layer based on virtualization to provide on-demand computing services for the upper layers; c) the third layer consists of big data containers that are customized for dealing with different types of data and functionalities; d) the fourth layer is a big data presentation layer that supports the efficient management, access, analysis, mining and learning of big geospatial data.
NASA Astrophysics Data System (ADS)
Ross, A.; Little, M. M.
2013-12-01
NASA's Atmospheric Science Data Center (ASDC) is piloting the use of Geographic Information System (GIS) technology that can be leveraged for crisis planning, emergency response, and disaster management/awareness. Many different organizations currently use GIS tools and geospatial data during a disaster event. ASDC datasets have not been fully utilized by this community in the past due to the incompatible data formats in which ASDC holdings are archived. Through the successful implementation of this pilot effort and continued collaboration with the larger Homeland Defense and Department of Defense emergency management community through the Homeland Infrastructure Foundation-Level Data Working Group (HIFLD WG), our data will be easily accessible to those using GIS and will increase the ability to plan, respond, manage, and provide awareness during disasters. The HIFLD WG Partnership has expanded to include more than 5,900 mission partners representing the 14 executive departments, 98 agencies, 50 states (and 3 territories), and more than 700 private sector organizations to directly enhance the federal, state, and local governments' ability to support domestic infrastructure data gathering, sharing and protection, visualization, and spatial knowledge management. The HIFLD WG Executive Membership is led by representatives from the Department of Defense (DoD) Office of the Assistant Secretary of Defense for Homeland Defense and Americas' Security Affairs - OASD (HD&ASA); the Department of Homeland Security (DHS), National Protection and Programs Directorate's Office of Infrastructure Protection (NPPD IP); the National Geospatial-Intelligence Agency (NGA) Integrated Working Group - Readiness, Response and Recovery (IWG-R3); the Department of Interior (DOI) United States Geological Survey (USGS) National Geospatial Program (NGP); and the DHS Federal Emergency Management Agency (FEMA).
Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L
2010-04-01
In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.
E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.
2014-12-01
Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain a situational awareness that is knowledgeable of potential and current conditions, possible impacts on populations and infrastructure, and other key information. E-DECIDER and the Clearinghouse have worked together to address many of these issues and challenges to deliver interoperable, authoritative decision support products.
NASA Astrophysics Data System (ADS)
Wiggins, H. V.; Warnick, W. K.; Hempel, L. C.; Henk, J.; Sorensen, M.; Tweedie, C. E.; Gaylord, A. G.
2007-12-01
As the creation and use of geospatial data in research, management, logistics, and education applications has proliferated, there is now a tremendous potential for advancing science through a variety of cyber-infrastructure applications, including Spatial Data Infrastructure (SDI) and related technologies. SDIs provide a necessary and common framework of standards, securities, policies, procedures, and technology to support the effective acquisition, coordination, dissemination and use of geospatial data by multiple and distributed stakeholder and user groups. Despite the numerous research activities in the Arctic, there is no established SDI and, because of this lack of a coordinated infrastructure, there is inefficiency, duplication of effort, and reduced data quality and searchability of arctic geospatial data. The urgency for establishing this framework is significant considering the myriad of data being collected in celebration of the International Polar Year (IPY) in 2007-2008 and the current international momentum for an improved and integrated circum-arctic terrestrial-marine-atmospheric environmental observatories network. The key objective of this project is to lay the foundation for full implementation of an Arctic Spatial Data Infrastructure (ASDI) through an assessment of community needs, readiness, and resources and through the development of a prototype web-mapping portal.
Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web
NASA Astrophysics Data System (ADS)
Huang, Hong; Gong, Jianya
2008-12-01
GML can only achieve geospatial interoperation at the syntactic level. However, in most situations it is necessary to resolve differences in spatial cognition first, so ontologies were introduced to describe geospatial information and services. But it is obviously difficult and inappropriate to leave users to find, match and compose services, especially when complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even the automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of GSM, we first attempt to construct mechanisms for the modeling and formal representation of geospatial knowledge, which are also the two most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the discriminating environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Thirdly, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.
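A small illustrative sketch (using rdflib) of declaring part of such a geospatial-context ontology in OWL; the namespace, classes and property below are invented for illustration and are not the authors' actual hierarchy.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

GEO = Namespace("http://example.org/geocontext#")   # hypothetical namespace
g = Graph()
g.bind("geo", GEO)
g.bind("owl", OWL)

# Classes: a geospatial concept and the context it is interpreted in
for cls in (GEO.GeospatialConcept, GEO.GeospatialContext, GEO.River):
    g.add((cls, RDF.type, OWL.Class))
g.add((GEO.River, RDFS.subClassOf, GEO.GeospatialConcept))

# An object property linking concepts to their interpretation context
g.add((GEO.interpretedIn, RDF.type, OWL.ObjectProperty))
g.add((GEO.interpretedIn, RDFS.domain, GEO.GeospatialConcept))
g.add((GEO.interpretedIn, RDFS.range, GEO.GeospatialContext))

print(g.serialize(format="turtle"))
```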
NASA Astrophysics Data System (ADS)
Khalid, A.; Haddad, J.; Lawler, S.; Ferreira, C.
2014-12-01
Areas along the Chesapeake Bay and its tributaries are extremely vulnerable to hurricane flooding, as evidenced by the costly effects and severe impacts of recent storms along the Virginia coast, such as Hurricane Isabel in 2003 and Hurricane Sandy in 2012. Coastal wetlands, in addition to their ecological importance, are expected to mitigate the impact of storm surge by acting as a natural protection against hurricane flooding. Quantifying such interactions helps to provide a sound scientific basis to support planning and decision making. Using storm surge flooding from various historical hurricanes, simulated using a coupled hydrodynamic wave model (ADCIRC-SWAN), we propose an integrated framework yielding a geospatial identification of the capacity of Chesapeake Bay wetlands to protect critical infrastructure. Spatial identification of Chesapeake Bay wetlands is derived from the National Wetlands Inventory (NWI), National Land Cover Database (NLCD), and the Coastal Change Analysis Program (C-CAP). Inventories of population and critical infrastructure are extracted from US Census block data and FEMA's HAZUS-Multi Hazard geodatabase. Geospatial and statistical analyses are carried out to develop a relationship between wetland land cover, hurricane flooding, population and infrastructure vulnerability. These analyses result in the identification and quantification of populations and infrastructure in flooded areas that lie within a reasonable buffer surrounding the identified wetlands. Our analysis thus produces a spatial perspective on the potential for wetlands to attenuate hurricane flood impacts in critical areas. Statistical analysis will support hypothesis testing to evaluate the benefits of wetlands from a flooding and storm-surge attenuation perspective. Results from geospatial analysis are used to identify where interactions with critical infrastructure are relevant in the Chesapeake Bay.
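A simplified sketch of the buffer-and-overlay step described above, using geopandas; the file names, CRS choice and 1 km buffer distance are assumptions for illustration, not the study's actual parameters.

```python
import geopandas as gpd

wetlands = gpd.read_file("wetlands.shp").to_crs(epsg=26918)         # NWI/C-CAP wetlands
flood = gpd.read_file("adcirc_flood_extent.shp").to_crs(epsg=26918)  # simulated surge extent
assets = gpd.read_file("critical_infrastructure.shp").to_crs(epsg=26918)

buffer_m = 1000.0                                    # "reasonable buffer" around wetlands
wetland_zone = gpd.GeoDataFrame(
    geometry=[wetlands.buffer(buffer_m).unary_union], crs=wetlands.crs
)

# assets inside the flooded area, then the subset near wetlands
flooded_assets = gpd.sjoin(assets, flood, predicate="intersects", how="inner")
near_wetlands = gpd.sjoin(
    flooded_assets.drop(columns="index_right"), wetland_zone,
    predicate="within", how="inner",
)
print(len(flooded_assets), "flooded assets,",
      len(near_wetlands), "of them within", buffer_m, "m of wetlands")
```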
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the problems of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures while providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the interoperability of OGC Web services with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols; the advantages offered by Grid technology, such as providing secure interoperability between distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized for geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality together with the high-power computation and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of geographical models. In accordance with Service Oriented Architecture concepts and the requirements for interoperability between geospatial and Grid infrastructures, each of the main functionalities is visible from the enviroGRIDS Portal and consequently from end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the single way for the user to get into the system, and the portal presents a uniform style of graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
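As an illustration of the OGC discovery step discussed above, the snippet below queries a catalogue service (CSW) with OWSLib; the endpoint URL and search term are placeholders, not the project's actual services.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/csw")          # hypothetical catalogue
query = PropertyIsLike("csw:AnyText", "%Black Sea%")          # free-text constraint
csw.getrecords2(constraints=[query], maxrecords=10, esn="full")

for rec in csw.records.values():
    print(rec.title)
    for ref in (rec.references or []):                        # linked service endpoints, if any
        print("  ", ref.get("scheme"), ref.get("url"))
```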
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; Painho, M.
2017-09-01
The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system making use of geospatial data on a widely used platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of the spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart cities platforms with sensor web and spatial information by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind together most of the functionalities of the Internet, the sensor web and nowadays the Internet of Things, superseding the Internet of Sensors as well. In a nutshell, the project delivers a generalized, real-time accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.
Borderless Geospatial Web (BOLEGWEB)
NASA Astrophysics Data System (ADS)
Cetl, V.; Kliment, T.; Kliment, M.
2016-06-01
Effective access to and use of geospatial information (GI) resources is of critical importance in a modern knowledge-based society. Standard web services defined by the Open Geospatial Consortium (OGC) are frequently used within implementations of spatial data infrastructures (SDIs) to facilitate the discovery and use of geospatial data. These data are stored in databases located in a layer called the invisible web and are thus ignored by search engines. An SDI uses a catalogue (discovery) service for the web as a gateway to the GI world through metadata defined by ISO standards, which are structurally different from OGC metadata. Therefore, a crosswalk needs to be implemented to bridge the OGC resources discovered on the mainstream web with those documented by metadata in an SDI, in order to enrich its information extent. A public, global and user-friendly portal of OGC resources available on the web ensures and enhances the use of GI within a multidisciplinary context and bridges the geospatial web from the end-user perspective, thus opening its borders to everybody. The project "Crosswalking the layers of geospatial information resources to enable a borderless geospatial web", with the acronym BOLEGWEB, is ongoing as a postdoctoral research project at the Faculty of Geodesy, University of Zagreb, Croatia (http://bolegweb.geof.unizg.hr/). The research leading to the results of the project has received funding from the European Union Seventh Framework Programme (FP7 2007-2013) under Marie Curie FP7-PEOPLE-2011-COFUND. The project started in November 2014 and is planned to be finished by the end of 2016. This paper provides an overview of the project, the research questions and methodology, the results achieved so far and future steps.
DOT National Transportation Integrated Search
2015-05-01
infrastructure networks are essential to sustain our economy, society and quality of life. Natural disasters cost lives, infrastructure destruction, and economic losses. In 2013 over 28 million people were displaced worldwide by natural disasters wit...
Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim
2018-05-23
Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine the distribution of commercial geospatial data resources relative to health needs, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial data, these data resources are a potential resource for examining healthcare utilisation that requires further evaluation. In countries such as Sierra Leone where there is high mortality but minimal commercial geospatial data, alternative approaches such as open data use are needed in quantifying patient travel times, geocoding patient addresses, and characterising patients' neighbourhoods.
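For readers unfamiliar with the inequality metrics mentioned above, the snippet below computes the relative concentration index in one common formulation, C = 2·cov(h, r)/mean(h), where r is the fractional rank of countries ordered by data availability; the numbers are synthetic and illustrative only, not the study's data.

```python
import numpy as np

availability = np.array([0.2, 0.4, 0.5, 0.7, 0.9])    # composite geospatial-data index
mortality = np.array([900., 700., 650., 400., 300.])  # age-standardised mortality

order = np.argsort(availability)                      # rank countries by data availability
h = mortality[order]
n = len(h)
r = (np.arange(1, n + 1) - 0.5) / n                   # fractional ranks

C = 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()
print(round(C, 3))   # negative C: mortality concentrated where data availability is low
```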
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains related to spatial properties. We tested the performance of the platform on taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
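The sketch below illustrates, in plain PySpark rather than GISpark's own APIs, the style of spatiotemporal aggregation used in the taxi-trajectory test: binning pick-up points into a lat/lon grid by hour. The input file and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("taxi-grid-demo").getOrCreate()

# hypothetical CSV with columns: pickup_time, lon, lat
trips = spark.read.csv("taxi_trips.csv", header=True, inferSchema=True)

cell = 0.01  # grid cell size in degrees
counts = (
    trips
    .withColumn("cell_x", F.floor(F.col("lon") / cell))
    .withColumn("cell_y", F.floor(F.col("lat") / cell))
    .withColumn("hour", F.hour(F.to_timestamp("pickup_time")))
    .groupBy("cell_x", "cell_y", "hour")
    .count()
    .orderBy(F.desc("count"))
)
counts.show(10)   # busiest grid cells by hour
```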
Brokered virtual hubs for facilitating access and use of geospatial Open Data
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Latre, Miguel; Kamali, Nargess; Brumana, Raffaella; Braumann, Stefan; Nativi, Stefano
2016-04-01
Open Data is a major trend in the current information technology scenario and is often publicised as one of the pillars of the information society in the near future. In particular, geospatial Open Data have a huge potential also for Earth Sciences, through the enablement of innovative applications and services integrating heterogeneous information. However, open does not mean usable. As was recognized at the very beginning of the Web revolution, many different degrees of openness exist: from simple sharing in a proprietary format to advanced sharing in standard formats including semantic information. Therefore, to fully unleash the potential of geospatial Open Data, advanced infrastructures are needed to increase the degree of data openness, enhancing their usability. In October 2014, the ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". The ENERGIC OD Virtual Hubs aim to facilitate the use of geospatial Open Data by lowering and possibly removing the main barriers which hamper geo-information (GI) usage by end-users and application developers. Data and services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus, beyond a certain extent, heterogeneity is irreducible, especially in interdisciplinary contexts. ENERGIC OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonizing service interfaces, metadata and data models, enabling seamless discovery of and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC OD integrates several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. A first version of the ENERGIC OD brokers has been implemented based on the GI-Suite Brokering Framework developed by CNR-IIA, and complemented with other tools under integration and development. It already enables mediated discovery and harmonized access to different geospatial Open Data sources. It is accessible by users as Software-as-a-Service through a browser. Moreover, open APIs and a Javascript library are available for application developers. Six ENERGIC OD Virtual Hubs have currently been deployed: one at regional level (Berlin metropolitan area) and five at national level (in France, Germany, Italy, Poland and Spain). Each Virtual Hub manager decided the deployment strategy (local infrastructure or commercial Infrastructure-as-a-Service cloud) and the list of connected Open Data sources.
The ENERGIC OD Virtual Hubs are under test and validation through the development of ten different mobile and Web applications.
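As a rough client-side illustration of the kind of brokered, harmonized discovery described above, the sketch below queries a catalogue endpoint with the OWSLib library. The endpoint URL and search term are placeholders, not the actual ENERGIC OD interfaces; the project's open APIs and JavaScript library are not reproduced here.

```python
# Illustrative sketch: discovering Open Data through a brokered catalogue endpoint.
# The URL is hypothetical; a real ENERGIC OD Virtual Hub may expose different
# interfaces (open APIs, a JavaScript library) not shown here.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Connect to a (hypothetical) harmonized discovery endpoint exposed by a broker.
csw = CatalogueServiceWeb("https://example-virtualhub.eu/csw")  # hypothetical URL

# Search for datasets whose metadata mention "land cover".
query = PropertyIsLike("csw:AnyText", "%land cover%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```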
BIM and GIS: When Parametric Modeling Meets Geospatial Data
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.
2017-12-01
Geospatial data have a crucial role in several projects related to infrastructure and land management. GIS software is able to perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling that are typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear to be complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales which are not dominated by "pure" GIS or BIM. The paper also demonstrates that some traditional operations carried out with GIS software are also available in parametric modelling software for BIM, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
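A minimal sketch of one of the GIS-style operations mentioned above, transformation between reference systems, using the pyproj library. The EPSG codes and coordinates are arbitrary examples, not taken from the case study.

```python
# Minimal sketch of a reference-system transformation. EPSG codes are arbitrary
# examples: WGS84 geographic coordinates to UTM zone 32N, a common choice for
# northern Italy.
from pyproj import Transformer

transformer = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)

lon, lat = 9.19, 45.46          # approximately Milan (lon, lat)
easting, northing = transformer.transform(lon, lat)
print(f"E={easting:.1f} m, N={northing:.1f} m")
```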
77 FR 32978 - Call for Nominations to the National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
... the Department and the FGDC on policy and management issues related to the effective operation of... through the Federal Geographic Data Committee related to management of Federal geospatial programs, development of the National Spatial Data Infrastructure, and the implementation of Office of Management and...
Advanced space-based InSAR risk analysis of planned and existing transportation infrastructure.
DOT National Transportation Integrated Search
2017-03-21
The purpose of this document is to summarize activities by Stanford University and MDA Geospatial Services Inc. (MDA) to estimate surface deformation and associated risk to transportation infrastructure using SAR Interferometric methods for the ...
NASA Astrophysics Data System (ADS)
Titov, A. G.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The use of large geospatial datasets in climate change studies requires the development of a set of Spatial Data Infrastructure (SDI) elements, including geoprocessing and cartographical visualization web services. This paper presents the architecture of a geospatial OGC web service system as an integral part of the general architecture of a virtual research environment (VRE) for statistical processing and visualization of meteorological and climatic data. The architecture is a set of interconnected standalone SDI nodes with corresponding data storage systems. Each node runs specialized software, such as a geoportal, cartographical web services (WMS/WFS), a metadata catalog, and a MySQL database of technical metadata describing the geospatial datasets available at the node. It also contains geospatial data processing services (WPS) based on a modular computing backend that realizes the statistical processing functionality, thus providing analysis of large datasets with visualization of results and export to files in standard formats (XML, binary, etc.). Some cartographical web services have been developed in a prototype of the system to provide capabilities for working with raster and vector geospatial data based on OGC web services. The distributed architecture presented allows easy addition of new nodes, computing and data storage systems, and provides a solid computational infrastructure for regional climate change studies based on modern Web and GIS technologies.
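For illustration, the sketch below requests a rendered map from a node's WMS with OWSLib, one client-side view of the cartographical services in this architecture. The service URL, layer name and extent are invented placeholders, not the system's actual endpoints.

```python
# Sketch of requesting a rendered map from an SDI node's WMS.
# The URL and layer name are placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example-sdi-node.org/wms", version="1.1.1")  # hypothetical
print(list(wms.contents))  # layers advertised by the node

img = wms.getmap(
    layers=["air_temperature_anomaly"],   # hypothetical layer name
    styles=[""],
    srs="EPSG:4326",
    bbox=(60.0, 50.0, 120.0, 80.0),       # lon/lat extent over Siberia
    size=(800, 400),
    format="image/png",
)
with open("anomaly.png", "wb") as f:
    f.write(img.read())
```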
NASA Astrophysics Data System (ADS)
Wiggins, H. V.; Warnick, W. K.; Hempel, L. C.; Henk, J.; Sorensen, M.; Tweedie, C. E.; Gaylord, A.; Behr, S.
2006-12-01
As the creation and use of geospatial data in research, management, logistics, and education applications has proliferated, there is now tremendous potential for advancing the IPY initiative through a variety of cyberinfrastructure applications, including Spatial Data Infrastructure (SDI) and related technologies. SDIs provide a necessary and common framework of standards, security, policies, procedures, and technology to support the effective acquisition, coordination, dissemination and use of geospatial data by multiple and distributed stakeholder and user groups. Despite the numerous research activities in the Arctic, there is no established SDI and, because of this lack of a coordinated infrastructure, there is inefficiency, duplication of effort, and reduced data quality and searchability of arctic geospatial data. The urgency for establishing this framework is significant considering the myriad data likely to be collected in celebration of the International Polar Year (IPY) in 2007-2008 and the current international momentum for an improved and integrated circumarctic terrestrial-marine-atmospheric environmental observatories network. The key objective of this project is to lay the foundation for full implementation of an Arctic Spatial Data Infrastructure (ASDI) through two related activities: (1) an assessment - via interviews, questionnaires, a workshop, and other means - of community needs, readiness, and resources, and (2) the development of a prototype web mapping portal to demonstrate the purpose and function of an arctic geospatial one-stop portal and to solicit community input on design and function. The results of this project will be compiled into a comprehensive report guiding the research community and funding agencies in the design and implementation of an ASDI to contribute to a robust IPY data cyberinfrastructure.
Green Infrastructure Design Evaluation Using the Automated Geospatial Watershed Assessment Tool
In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...
Evaluation of Green Infrastructure Designs Using the Automated Geospatial Watershed Assessment Tool
In arid and semi-arid regions, green infrastructure (GI) can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwat...
Onboard Radar Processor Development for Disaster Response
NASA Technical Reports Server (NTRS)
Lou, Yunling; Clark, Duane; Hensley, Scott; Jones, Cathleen; Marks, Phillip; Muellerschoen, Ron; Wang, Charles C.
2013-01-01
Natural hazards often result in significant loss of human lives, economic assets and productivity as well as significant damage to the ecosystem. Scientists have reported more frequent and intense natural disasters in recent years, which may well be attributed to climate change. Many of the disaster response efforts were hampered by lack of up-to-date knowledge of the state of the affected areas because damaged infrastructure rendered the areas inaccessible. Radar remote sensing is playing an increasingly critical role in providing timely information to disaster response agencies due to the increasing fidelity and availability of geospatial information products.
Towards a semantics-based approach in the development of geographic portals
NASA Astrophysics Data System (ADS)
Athanasis, Nikolaos; Kalabokidis, Kostas; Vaitis, Michail; Soulakellis, Nikolaos
2009-02-01
As the demand for geospatial data increases, the lack of efficient ways to find suitable information becomes critical. In this paper, a new methodology for knowledge discovery in geographic portals is presented. Based on the Semantic Web, our approach exploits the Resource Description Framework (RDF) in order to describe the geoportal's information with ontology-based metadata. When users traverse from page to page in the portal, they take advantage of the metadata infrastructure to navigate easily through data of interest. New metadata descriptions are published in the geoportal according to the RDF schemas.
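As an illustrative sketch of ontology-based metadata of the kind described, the snippet below builds a small RDF description with rdflib. The namespace, properties and resource URIs are made up for the example and do not reproduce the geoportal's actual RDF schemas.

```python
# Illustrative RDF metadata for a geoportal resource.
# The vocabulary and URIs are invented for the example.
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import DCTERMS, RDF

GEO = Namespace("http://example.org/geoportal#")   # hypothetical vocabulary
g = Graph()

dataset = URIRef("http://example.org/data/fire-risk-lesvos")
g.add((dataset, RDF.type, GEO.Dataset))
g.add((dataset, DCTERMS.title, Literal("Fire risk map of Lesvos")))
g.add((dataset, DCTERMS.subject, Literal("wildfire")))
g.add((dataset, GEO.coversRegion, URIRef("http://example.org/region/lesvos")))

print(g.serialize(format="turtle"))
```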
Evaluation of green infrastructure designs using the Automated Geospatial Watershed Assessment Tool
USDA-ARS?s Scientific Manuscript database
In arid and semi-arid regions, green infrastructure (GI) designs can address several issues facing urban environments, including augmenting water supply, mitigating flooding, decreasing pollutant loads, and promoting greenness in the built environment. An optimum design captures stormwater, addressi...
Design for Connecting Spatial Data Infrastructures with Sensor Web (SENSDI)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The main challenges are semantic enablement of Spatial Data Infrastructures and connecting the interfaces of the SDI with those of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between Sensor Web and SDI, and carry out case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and metadata services on the one hand and, on the other, the 'Sensor Observation Service' (SOS), the 'Sensor Planning Service' (SPS), the 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services and between two OGC-SWE services. In conclusion, integrating SDI with the Sensor Web is of importance to geospatial studies; the integration can be achieved by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration remain to be undertaken as future research.
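As a hedged sketch of the Sensor Web side of this correspondence, the snippet below connects to an OGC Sensor Observation Service with OWSLib and lists its observation offerings. The endpoint URL is a placeholder, and attribute names should be checked against the OWSLib version in use.

```python
# Sketch of inspecting an OGC Sensor Observation Service, the SWE-side
# counterpart of the SDI discovery/access services discussed above.
# The endpoint URL is a placeholder.
from owslib.sos import SensorObservationService

sos = SensorObservationService("https://example.org/sos", version="1.0.0")  # hypothetical
print(sos.identification.title)

# Observation offerings advertised by the service (roughly analogous to WMS/WFS layers).
for offering_id, offering in sos.contents.items():
    print(offering_id, offering.observed_properties)
```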
Indigenous knowledges driving technological innovation
Lilian Alessa; Carlos Andrade; Phil Cash Cash; Christian P. Giardina; Matt Hamabata; Craig Hammer; Kai Henifin; Lee Joachim; Jay T. Johnson; Kekuhi Kealiikanakaoleohaililani; Deanna Kingston; Andrew Kliskey; Renee Pualani Louis; Amanda Lynch; Daryn McKenny; Chels Marshall; Mere Roberts; Taupouri Tangaro; Jyl Wheaton-Abraham; Everett Wingert
2011-01-01
This policy brief explores the use and expands the conversation on the ability of geospatial technologies to represent Indigenous cultural knowledge. Indigenous peoples' use of geospatial technologies has already proven to be a critical step for protecting tribal self-determination. However, the ontological frameworks and techniques of Western geospatial...
Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pabian, Frank V
2012-08-14
This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'digital virtual globes' (e.g., Google Earth, Virtual Earth), which are far better than the previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest. Such visualization can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of States' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad-area searches for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (e.g., Google's SketchUp 6, newly available in 2007), when used in conjunction with these digital globes, can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments, including walk-arounds or fly-arounds, and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).
NASA Astrophysics Data System (ADS)
Dabolt, T. O.
2016-12-01
The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice on topics ranging from the 16 National Geospatial Data Asset Themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others' or check it out on your own at https://www.geoplatform.gov/
Rafferty, Sharon A.; Arnold, L.R.; Char, Stephen J.
2002-01-01
The U.S. Geological Survey developed this dataset as part of the Colorado Front Range Infrastructure Resources Project (FRIRP). One goal of the FRIRP was to provide information on the availability of those hydrogeologic resources that are either critical to maintaining infrastructure along the northern Front Range or that may become less available because of urban expansion in the northern Front Range. This dataset extends from the Boulder-Jefferson County line on the south to the middle of Larimer and Weld Counties on the north. On the west, the dataset is bounded by the approximate mountain front of the Front Range of the Rocky Mountains; on the east, by an arbitrary north-south line extending through a point about 6.5 kilometers east of Greeley. This digital geospatial dataset consists of digitized contours of unconsolidated-sediment thickness (depth to bedrock).
DOT National Transportation Integrated Search
2015-08-01
Small and medium-sized cities need publicly acceptable criteria for bicycle infrastructure improvements. This report explores the effectiveness of one proposed system of bicycle infrastructure criteria using data from a state-of-the-art travel surv...
NASA Astrophysics Data System (ADS)
Coote, A. M.; Whiteman, B.; Carver, J.; Balakrishnan, A.
2013-12-01
The disastrous earthquake in the Christchurch city centre and surrounding parts of the Canterbury region of New Zealand in February 2011, which resulted in over 120 fatalities, highlighted a number of deficiencies in the information systems available to those involved in first response and in the subsequent rebuild. The lack of interoperability of geospatial information systems in particular was highlighted within the Royal Commission report on the disaster. As a result of this high-level 'something must be done' call to action, Land Information New Zealand (LINZ), the lead public agency in national geospatial data management, was asked to scope a programme of work to accelerate the creation of a Spatial Data Infrastructure (SDI) for the area. This paper will outline the work undertaken to scope and prioritise a programme addressing the most pressing information infrastructure issues and then to prepare the business case setting out the benefit-cost justification for the investment required. The resulting programme encompasses many of the emerging opportunities in the geospatial field, including 3D GIS, crowdsourcing and open data, leading to challenges in how to evaluate the benefits of innovative and 'ground-breaking' solutions. It also considers how to track benefits realisation in a rapidly changing environment requiring an agile approach to programme management.
76 FR 78944 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-20
...The National Geospatial Advisory Committee (NGAC) will meet on January 12, 2012, from 1 p.m. to 4 p.m. EST. The meeting will be held via Web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
75 FR 54385 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-07
...The National Geospatial Advisory Committee (NGAC) will meet on September 22-23, 2010 at the American Institute of Architects Building, 1735 New York Avenue, NW., Washington, DC 20006. The meeting will be held in the Gallery Room. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, was established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
76 FR 55939 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
...The National Geospatial Advisory Committee (NGAC) will meet on October 4-5, 2011 at the National Conservation Training Center, 698 Conservation Way, Shepherdstown, WV 25443. The meeting will be held in Room 201 Instructional East. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
78 FR 16527 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-15
...The National Geospatial Advisory Committee (NGAC) will meet on April 3, 2013, from 1:00 p.m. to 5:00 p.m. EST. The meeting will be held via Web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
78 FR 71638 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
...The National Geospatial Advisory Committee (NGAC) will meet on December 11, 2013, from 1:00 p.m. to 5:00 p.m. EST. The meeting will be held via web conference and teleconference. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
76 FR 10914 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-28
...The National Geospatial Advisory Committee (NGAC) will meet on March 17-18, 2011 at the American Institute of Architects Building, 1735 New York Avenue, NW., Washington, DC 20006. The meeting will be held in the Gallery Room. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, was established to advise the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
76 FR 28449 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
...The National Geospatial Advisory Committee (NGAC) will meet on June 8-9, 2011 at the American Institute of Architects Building, 1735 New York Avenue, NW., Washington, DC 20006. The meeting will be held in the Gallery Room. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, was established to advise the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
75 FR 30855 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
...The National Geospatial Advisory Committee (NGAC) will meet on June 22-23, 2010 at the National Conservation Training Center, 698 Conservation Way, Shepherdstown, WV 25443. The meeting will be held in Room 201 Instructional East. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, has been established to advise the Chair of the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
75 FR 71141 - Announcement of National Geospatial Advisory Committee Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
...The National Geospatial Advisory Committee (NGAC) will meet on December 7-8, 2010 at the American Institute of Architects Building, 1735 New York Avenue, NW., Washington, DC 20006. The meeting will be held in the Gallery Room. The NGAC, which is composed of representatives from governmental, private sector, non-profit, and academic organizations, was established to advise the Federal Geographic Data Committee on management of Federal geospatial programs, the development of the National Spatial Data Infrastructure, and the implementation of Office of Management and Budget (OMB) Circular A-16. Topics to be addressed at the meeting include:
Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Rodila, D.; Bacu, V.; Gorgan, D.
2012-04-01
The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of Geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud and multicore systems offer the functionalities necessary to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of Geospatial data, execution of complex processing through task and data parallelism, etc. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, but also the development of standardized and specialized tools for storing, analyzing, processing and visualizing the Geospatial data concerning this area. To achieve these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, Geospatial Web services standardized by the Open Geospatial Consortium (OGC) and others, on parallel and distributed architectures to maximize the performance obtained. This presentation analyses the integration and execution of Geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project in different use cases, such as the execution of Geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing is still a paradigm with critical problems to be solved despite the great efforts and investments. Cloud computing comes as a new way of delivering resources while using a large set of old as well as new technologies and tools to provide the necessary functionalities. The main challenges in Cloud computing, most of them also identified in the Open Cloud Manifesto (2009), concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different Geospatial applications on different parallel and distributed architectures such as Grid, Cloud and multicore systems, with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost constraints, etc. Redirecting execution to a selected architecture is realized through a specialized component, with the purpose of offering a flexible way to achieve the best performance given the existing restrictions.
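The following toy sketch illustrates the idea of a specialized component that redirects execution to a backend based on application characteristics. It is not the enviroGRIDS component; the profile fields, thresholds and backend names are invented purely for illustration.

```python
# Toy sketch of a component that selects an execution backend from a coarse
# job profile. Names, thresholds and backends are invented for illustration only.
from dataclasses import dataclass

@dataclass
class JobProfile:
    data_volume_gb: float      # size of the Geospatial input data
    parallel_tasks: int        # degree of task/data parallelism
    interactive: bool          # does the user need results quickly?

def select_backend(job: JobProfile) -> str:
    """Pick an execution backend from a coarse job profile."""
    if job.interactive and job.data_volume_gb < 1:
        return "standalone GIS server"
    if job.parallel_tasks <= 8 and job.data_volume_gb < 50:
        return "multicore node"
    if job.data_volume_gb >= 500:
        return "grid infrastructure"
    return "cloud (IaaS) cluster"

print(select_backend(JobProfile(data_volume_gb=120, parallel_tasks=64, interactive=False)))
```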
The Automated Geospatial Watershed Assessment (AGWA) Urban tool provides a step-by-step process to model subdivisions using the KINEROS2 model, with and without Green Infrastructure (GI) practices. AGWA utilizes the Kinematic Runoff and Erosion (KINEROS2) model, an event driven, ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Peng; Gong, Jianya; Di, Liping
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered by extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and made discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
Application of Ontologies for Big Earth Data
NASA Astrophysics Data System (ADS)
Huang, T.; Chang, G.; Armstrong, E. M.; Boening, C.
2014-12-01
Connected data is smarter data! Earth Science research infrastructure must do more than just support temporal and geospatial discovery of satellite data. As the Earth Science data archives continue to expand across NASA data centers, the research communities are demanding smarter data services. A successful research infrastructure must be able to present researchers with the complete picture, that is, datasets with linked citations, related interdisciplinary data, imagery, current events, social media discussions, and scientific data tools that are relevant to the particular dataset. The popular Semantic Web for Earth and Environmental Terminology (SWEET) ontologies are a collection of ontologies and concepts designed to improve discovery and application of Earth Science data. The SWEET ontologies collection was initially developed to capture the relationships between keywords in the NASA Global Change Master Directory (GCMD). Over the years this popular ontologies collection has expanded to cover over 200 ontologies and 6000 concepts to enable scalable classification of Earth system science and space science concepts. This presentation discusses semantic web technologies as the enabling technology for data-intensive science. We will discuss the application of the SWEET ontologies as a critical component in knowledge-driven research infrastructure for some recent projects, which include the DARPA Ontological System for Context Artifact and Resources (OSCAR), the 2013 NASA ACCESS Virtual Quality Screening Service (VQSS), and the 2013 NASA Sea Level Change Portal (SLCP) projects. The presentation will also discuss the benefits of using semantic web technologies in developing research infrastructure for Big Earth Science Data in an attempt to "accommodate all domains and provide the necessary glue for information to be cross-linked, correlated, and discovered in a semantically rich manner." [1] [1] Savas Parastatidis: A platform for all that we know: creating a knowledge-driven research infrastructure. The Fourth Paradigm 2009: 165-172
Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals
NASA Astrophysics Data System (ADS)
Zamyadi, A.; Pouliot, J.; Bédard, Y.
2013-09-01
Accessing 3D geospatial models, ideally at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differs considerably from one geo-portal to another, as well as for similar 3D resources within the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Regarding the remaining 49%, which refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve the discovery of 3D geospatial models in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices in 3D modeling, and in 3D data acquisition and management, a set of metadata is proposed to increase suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are grouped into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format. The proposed metadata-set is compared with the Canadian Geospatial Data Infrastructure (CGDI) metadata, which is an implementation of the North American Profile of ISO 19115. The comparison analyzes the two metadata sets against three simulated scenarios for discovering needed 3D geospatial datasets. Considering specific metadata about 3D geospatial models, the proposed metadata-set has six additional classes covering geometric dimension, level of detail, geometric modeling, topology, and appearance information. In addition, classes on data acquisition, preparation, and modeling, and on physical availability, have been specialized for 3D geospatial models.
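To make the additions concrete, the toy record below encodes the 3D-specific elements named above (geometric dimension, level of detail, geometric modelling, topology and appearance). The field names, types and example values are illustrative only and are not the proposed profile itself.

```python
# Toy encoding of the 3D-specific metadata elements named above.
# Field names and value sets are illustrative, not the proposed profile.
from dataclasses import dataclass

@dataclass
class ThreeDMetadata:
    geometric_dimension: str   # e.g. "2.5D" or "3D"
    level_of_detail: str       # e.g. "LoD2" in CityGML terms
    geometric_modeling: str    # e.g. "boundary representation"
    topology: bool             # whether explicit topology is stored
    appearance: str            # e.g. "textured", "untextured"

record = ThreeDMetadata("3D", "LoD2", "boundary representation", True, "textured")
print(record)
```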
Using the Geospatial Web to Deliver and Teach GIScience Education Programs
NASA Astrophysics Data System (ADS)
Veenendaal, B.
2015-05-01
Geographic information science (GIScience) education has undergone enormous changes over the past years. One major factor influencing this change is the role of the geospatial web in GIScience. In addition to being used to enable and enhance GIScience education, the web is also used as the infrastructure for communication and collaboration around geospatial data and among users. The web becomes both the means and the content of a geospatial education program. However, the web does not replace the traditional face-to-face environment, but rather is a means to enhance it, expand it and enable an authentic, real-world learning environment. This paper outlines the use of the web in both the delivery and the content of the GIScience program at Curtin University. The teaching of the geospatial web, web- and cloud-based mapping, and geospatial web services is a key component of the program, and the use of the web and online learning is important in delivering this program. Some examples of authentic, real-world learning environments are provided, including joint learning activities with partner universities.
SimWIND: A Geospatial Infrastructure Model for Wind Energy Production and Transmission
NASA Astrophysics Data System (ADS)
Middleton, R. S.; Phillips, B. R.; Bielicki, J. M.
2009-12-01
Wind is a clean, enduring energy resource with a capacity to satisfy 20% or more of the electricity needs in the United States. A chief obstacle to realizing this potential is the general paucity of electrical transmission lines between promising wind resources and primary load centers. Successful exploitation of this resource will therefore require carefully planned enhancements to the electric grid. To this end, we present the model SimWIND for self-consistent optimization of the geospatial arrangement and cost of wind energy production and transmission infrastructure. Given a set of wind farm sites that satisfy meteorological viability and stakeholder interest, our model simultaneously determines where and how much electricity to produce, where to build new transmission infrastructure and with what capacity, and where to use existing infrastructure in order to minimize the cost for delivering a given amount of electricity to key markets. Costs and routing of transmission line construction take into account geographic and social factors, as well as connection and delivery expenses (transformers, substations, etc.). We apply our model to Texas and consider how findings complement the 2008 Electric Reliability Council of Texas (ERCOT) Competitive Renewable Energy Zones (CREZ) Transmission Optimization Study. Results suggest that integrated optimization of wind energy infrastructure and cost using SimWIND could play a critical role in wind energy planning efforts.
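As a toy illustration of the routing side of such an optimization, the sketch below finds a least-cost path between a wind farm and a load center on a small weighted graph using networkx. The nodes, edges and costs are invented; SimWIND's actual formulation (capacities, existing lines, siting and social costs) is far richer.

```python
# Toy least-cost routing on a weighted graph, in the spirit of choosing where
# to build transmission lines. Nodes, edges and costs are invented.
import networkx as nx

G = nx.Graph()
# (from, to, cost) where cost folds in distance, terrain and social factors
G.add_weighted_edges_from([
    ("wind_farm_A", "substation_1", 40),
    ("wind_farm_A", "substation_2", 65),
    ("substation_1", "load_center", 30),
    ("substation_2", "load_center", 10),
])

path = nx.shortest_path(G, "wind_farm_A", "load_center", weight="weight")
cost = nx.shortest_path_length(G, "wind_farm_A", "load_center", weight="weight")
print(path, cost)
```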
School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India
NASA Astrophysics Data System (ADS)
Agrawal, S.; Gupta, R. D.
2016-06-01
GIS is a collection of tools and techniques that work on geospatial data and are used in analysis and decision making. Education is an inherent part of any civil society. Proper educational facilities generate high-quality human resources for any nation. Therefore, government needs an efficient system that can help in analysing the current state of education and its progress. Government also needs a system that can support decision making and policy framing. GIS can serve these requirements not only for government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on existing education policy and the state of its implementation. School mapping plays an important role in this respect. School mapping consists of building a geospatial database of schools that supports infrastructure development, policy analysis and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarv Sikha Abhiyaan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This research work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.
Developing Energy Literacy in US Middle-Level Students Using the Geospatial Curriculum Approach
NASA Astrophysics Data System (ADS)
Bodzin, Alec M.; Fu, Qiong; Peffer, Tamara E.; Kulo, Violet
2013-06-01
This quantitative study examined the effectiveness of a geospatial curriculum approach to promoting energy literacy in an urban school district and examined factors that may account for energy content knowledge achievement. An energy literacy measure was administered to 1,044 eighth-grade students (ages 13-15) in an urban school district in Pennsylvania, USA. One group of students received instruction with a geospatial curriculum approach (geospatial technologies (GT)) and another group received 'business as usual' (BAU) curriculum instruction. For the GT students, findings revealed statistically significant gains from pretest to posttest (p < 0.001) on knowledge of energy resource acquisition, energy generation, storage and transport, and energy consumption and conservation. The GT students had year-end energy content knowledge scores significantly higher than those of students who learned with the BAU curriculum (p < 0.001, with a large effect size). A multiple regression found that prior energy content knowledge was the only significant predictor of year-end energy content knowledge achievement for the GT students (p < 0.001). The findings support the view that implementing a geospatial curriculum approach employing learning activities that focus on the spatial nature of energy resources can improve the energy literacy of urban middle-level students.
Geospatial data infrastructure: The development of metadata for geo-information in China
NASA Astrophysics Data System (ADS)
Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong
2014-03-01
Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is constrained by human thought and by the technology available for handling information. Conventional methods strive, with limited success, to maintain geoscience records that are readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and to their changes through time. In China, more than 400,000 types of important geological data have been collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical and remote sensing data, and important local geological survey and research reports. Numerous geospatial databases have been formed and are stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcInfo, metafile, raster, SQL Server, Access and JPEG. But there is no effective way to guarantee that the quality of the information is adequate, in theory and practice, for decision making. The need to provide the Geographic Information System (GIS) communities with fast, reliable, accurate and up-to-date information is becoming pressing for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects have been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) integration, update and maintenance of geoinformation databases; (2) standards research on clusterization and industrialization of information services; (3) platform construction for geological data sharing; (4) construction of key borehole databases; and (5) product development for information services. A 'Nine-System' basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster service, convergence, database, product, policy, technology, standard and infrastructure systems. The development of geoinformation stores and services puts forward a need for a Geospatial Data Infrastructure (GDI) in China. In this paper, some of the ideas envisaged for the development of metadata in China are discussed.
A Framework for an Open Source Geospatial Certification Model
NASA Astrophysics Data System (ADS)
Khan, T. U. R.; Davis, P.; Behr, F.-J.
2016-06-01
The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, the proliferation of open data, and the use of open standards have increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative, and the growth and maturity of geospatial Open Source software, initiated the idea to develop a framework for a worldwide-applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy and ASPRS, and by software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies-of-knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105 respondents worldwide. Fifteen interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to the development of a certification framework of three main categories with eleven sub-categories in total, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", and "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be flanked by proofs of professional career paths and achievements, which require a peer qualification evaluation. After a couple of years a recertification is required. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and international support by a group of geospatial scientific institutions, in order to achieve wide and international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.
78 FR 47003 - Draft National Spatial Data Infrastructure Strategic Plan; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NSDI.'' Executive Order 12906 describes the NSDI as ``the technology, policies, standards, and human resources necessary to acquire, process, store, distribute, and improve utilization of geospatial data...
ERIC Educational Resources Information Center
Hedley, Mikell Lynne; Templin, Mark A.; Czaljkowski, Kevin; Czerniak, Charlene
2013-01-01
Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…
An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight
NASA Astrophysics Data System (ADS)
Petters, J.; Coleman, S.; Andrea, O.
2016-12-01
A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve and curate this geospatial data and make it discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and the local community going forward. This includes technical challenges such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort, the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.
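For a sense of what ends up being indexed, the sketch below shows a minimal discovery record of the kind GeoBlacklight stores in Solr. The field names follow our reading of the public GeoBlacklight schema and should be checked against the locally deployed version; the values are fabricated.

```python
# Minimal sketch of a GeoBlacklight-style discovery record. Field names follow
# our reading of the public schema and should be verified locally; values are made up.
import json

record = {
    "dc_identifier_s": "urn:example:vt:blacksburg-parcels-2016",
    "dc_title_s": "Blacksburg Parcels 2016",
    "dc_rights_s": "Public",
    "dct_provenance_s": "Virginia Tech",
    "layer_slug_s": "vt-blacksburg-parcels-2016",
    "layer_geom_type_s": "Polygon",
    "dc_format_s": "Shapefile",
    "solr_geom": "ENVELOPE(-80.46, -80.39, 37.26, 37.21)",  # W, E, N, S
}

print(json.dumps(record, indent=2))
```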
WPS mediation: An approach to process geospatial data on different computing backends
NASA Astrophysics Data System (ADS)
Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas
2012-10-01
The OGC Web Processing Service (WPS) specification allows information to be generated by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as the environmental sciences. These problems are usually not solvable, or only partially solvable, using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enabling WPS. No implementation allows one to seamlessly execute a geoprocessing calculation, following user requirements, on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept obtained by mediating different geospatial and Grid software packages, and by proposing an extension of the WPS specification through two optional parameters. The applicability of this approach will be demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
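For reference, the band arithmetic behind an NDVI product is shown standalone below with numpy; the paper's contribution, dispatching such a computation to different backends through a mediated WPS, is not reproduced here.

```python
# The band arithmetic behind an NDVI product, shown standalone.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - RED) / (NIR + RED), with division-by-zero guarded."""
    red = red.astype("float32")
    nir = nir.astype("float32")
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Tiny synthetic example: vegetated pixels have high NIR relative to red.
red = np.array([[0.10, 0.30], [0.25, 0.05]])
nir = np.array([[0.60, 0.35], [0.30, 0.55]])
print(ndvi(red, nir))
```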
Grid computing enhances standards-compatible geospatial catalogue service
NASA Astrophysics Data System (ADS)
Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang
2010-04-01
A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for the Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards: the International Organization for Standardization (ISO) 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkit are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University Center for Spatial Information Science and Systems (GMU/CSISS), where it manages more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share geospatial resources and make them interoperable by using Grid technology, and it extends Grid technology into the geoscience communities.
Unmanned aircraft systems for transportation decision support.
DOT National Transportation Integrated Search
2016-11-30
Our nation relies on accurate geospatial information to map, measure, and monitor transportation infrastructure and the surrounding landscapes. This project focused on the application of Unmanned Aircraft systems (UAS) as a novel tool for improving e...
Grid infrastructure for automatic processing of SAR data for flood applications
NASA Astrophysics Data System (ADS)
Kussul, Natalia; Skakun, Serhiy; Shelestov, Andrii
2010-05-01
More and more geosciences applications are being moved onto Grids. Geosciences applications are complex: they involve complex workflows, the use of computationally intensive environmental models, and the management and integration of heterogeneous data sets, and Grid computing offers solutions to tackle these problems. Many geosciences applications, especially those related to disaster management and mitigation, require geospatial services to be delivered in a timely manner. For example, information on flooded areas should be provided to the corresponding organizations (local authorities, civil protection agencies, UN agencies, etc.) within 24 h so that the resources required to mitigate the disaster can be allocated effectively. Therefore, providing infrastructure and services that enable the automatic generation of products based on the integration of heterogeneous data is a task of great importance. In this paper we present a Grid infrastructure for automatic processing of synthetic-aperture radar (SAR) satellite images to derive flood products. In particular, we use SAR data acquired by ESA's ENVISAT satellite, and neural networks to derive flood extent. The data are provided in operational mode from the ESA rolling archive (within an ESA Category-1 grant). We developed a portal that is based on the OpenLayers framework and provides an access point to the developed services. Through the portal the user can define a geographical region and search for the required data. Upon selection of data sets, a workflow is automatically generated and executed on the resources of the Grid infrastructure. For workflow execution and management we use the Karajan language. The workflow of SAR data processing consists of the following steps: image calibration, image orthorectification, image processing with neural networks, topographic effects removal, geocoding and transformation to lat/long projection, and visualisation. These steps are executed by different software packages and can be executed on different resources of the Grid system. The resulting geospatial services are available in various OGC standards such as KML and WMS. Currently, the Grid infrastructure integrates the resources of several geographically distributed organizations, in particular: the Space Research Institute NASU-NSAU (Ukraine), with deployed computational and storage nodes based on the Globus Toolkit 4 (http://www.globus.org) and gLite 3 (http://glite.web.cern.ch) middleware, access to geospatial data and a Grid portal; the Institute of Cybernetics of NASU (Ukraine), with deployed computational and storage nodes (SCIT-1/2/3 clusters) based on the Globus Toolkit 4 middleware and access to computational resources (approximately 500 processors); and the Center for Earth Observation and Digital Earth, Chinese Academy of Sciences (CEODE-CAS, China), with deployed computational nodes based on the Globus Toolkit 4 middleware and access to geospatial data (approximately 16 processors). We are currently adding new geospatial services based on optical satellite data, namely MODIS. This work is carried out jointly with CEODE-CAS. Using the workflow patterns that were developed for SAR data processing, we are building new workflows for optical data processing.
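The processing chain listed above can be read as a simple pipeline, sketched below with placeholder functions. The function bodies are stand-ins only; in the described system each step is performed by different software on different Grid resources.

```python
# Skeleton of the SAR flood-mapping chain listed above, as a plain sequence of
# steps. Each function body is a placeholder for the real processing.
def calibrate(scene): return scene                 # radiometric calibration
def orthorectify(scene): return scene              # terrain-corrected geometry
def classify_flood(scene): return scene            # neural-network water mask
def remove_topographic_effects(mask): return mask  # drop shadow/layover artefacts
def geocode(mask): return mask                     # reproject to lat/long
def visualise(mask): return {"kml": "flood.kml", "wms_layer": "flood_extent"}

def flood_workflow(raw_scene):
    """Run the chain end-to-end on one SAR acquisition."""
    product = classify_flood(orthorectify(calibrate(raw_scene)))
    product = geocode(remove_topographic_effects(product))
    return visualise(product)

print(flood_workflow("ENVISAT_ASAR_scene"))
```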
NASA Astrophysics Data System (ADS)
Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara
2014-08-01
A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and investigated which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gains in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of the curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum and the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide evidence that geospatially enabled learning technologies can support GTR with urban middle-level learners.
ERIC Educational Resources Information Center
Bodzin, Alec; Peffer, Tamara; Kulo, Violet
2012-01-01
Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…
GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing
NASA Astrophysics Data System (ADS)
Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.
2016-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome barriers such as specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaboration. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as Amazon AWS Marketplace VM images and as open source, and hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Wei; Minnick, Matthew D; Mattson, Earl D
Oil shale deposits of the Green River Formation (GRF) in northwestern Colorado, southwestern Wyoming, and northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich, fine-grained sedimentary rock that contains significant amounts of kerogen from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Because of these potential environmental impacts of oil shale development, water usage issues need to be studied further. A basin-wide baseline for oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models, including but not limited to three-dimensional (3D) geologic, energy resource development systems, and surface water models. Such a centralized geospatial infrastructure made it possible to generate model inputs directly from the same database and to couple the different models indirectly through their inputs and outputs. This ensures consistency of the analyses conducted by researchers from different institutions, and helps decision makers balance the water budget based on the spatial distribution of the oil shale and water resources and the spatial variations of the geologic, topographic, and hydrogeological characterization of the basin. This endeavor encountered many technical challenges and had not previously been attempted for any oil shale basin. The database built during this study remains valuable for any future studies involving oil shale and water resource management in the Piceance Basin. The methodology applied in the development of the GIS-based geospatial infrastructure can be readily adapted by other professionals to develop database structures for similar basins.
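As an illustration only of the "one database, many models" coupling idea (the table schema, column names, and output format below are hypothetical, not the project's geodatabase), a minimal sketch of generating a model input file directly from a centralized database:

```python
import sqlite3

# Stand-in for the centralized geodatabase: a single table of monitoring wells.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE wells (well_id TEXT, x REAL, y REAL, head_m REAL)")
con.executemany("INSERT INTO wells VALUES (?, ?, ?, ?)", [
    ("W-01", 245300.0, 4410200.0, 1905.2),
    ("W-02", 246150.0, 4411900.0, 1898.7),
])

# Because every model draws its inputs from the same source, the analyses
# stay consistent by construction.
with open("gw_model_heads.csv", "w") as f:
    f.write("well_id,x,y,head_m\n")
    for row in con.execute("SELECT well_id, x, y, head_m FROM wells ORDER BY well_id"):
        f.write(",".join(str(v) for v in row) + "\n")
```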
GSKY: A scalable distributed geospatial data server on the cloud
NASA Astrophysics Data System (ADS)
Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben
2017-04-01
Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. The ability to combine information from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data in different formats and performing operations such as map reprojection, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is infeasible, and so such data manipulation must be performed on the fly using efficient, high-performance techniques. Ideally this should be done using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), holds over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve large numbers of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections either locally or remotely, extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready accessibility for users of the data via the Web Map Service (WMS), the Web Processing Service (WPS), or raw data arrays using the Web Coverage Service (WCS). The presentation will show cases where we have used this new capability to provide a significant improvement over previous approaches.
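Because GSKY exposes standard OGC interfaces, a client does not need GSKY-specific tooling; for example, a hedged sketch using OWSLib against a WMS endpoint (the URL, layer name and bounding box below are placeholders, not a published GSKY service):

```python
from owslib.wms import WebMapService

# Placeholder endpoint and layer name; any OGC-compliant WMS works the same way.
wms = WebMapService("https://example.org/ows", version="1.1.1")

response = wms.getmap(
    layers=["landsat_ndvi"],            # hypothetical layer identifier
    styles=[""],
    srs="EPSG:4326",
    bbox=(140.0, -38.0, 150.0, -30.0),  # lon/lat bounding box
    size=(512, 512),
    format="image/png",
    transparent=True,
)

with open("ndvi_map.png", "wb") as f:
    f.write(response.read())
```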
Geospatial Technology Applications and Infrastructure in the Biological Resources Division.
1998-09-01
Subject terms: forestry/forest ecology, geography, geology, GIS/mapping technologies, GPS technology, HTML/World Wide Web, information management/transfer, Java, land... These technologies are being used to understand diet selection, habitat use, hibernation behavior, and social interactions of desert tortoises.
The National 3-D Geospatial Information Web-Based Service of Korea
NASA Astrophysics Data System (ADS)
Lee, D. T.; Kim, C. W.; Kang, I. G.
2013-09-01
3D geospatial information systems should provide efficient spatial analysis tools, be able to use all capabilities of the third dimension, and support visualization. Currently, many human activities are taking steps toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service to people interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, i.e., photo-realistic textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we present the composition and infrastructure of the web-based 3D map service system and a comparison of V-World with the Google Earth service. We also present Open API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and advanced geospatial information policy is emphasized in Korea; thus, further progress of the spatial information industry of Korea is expected in the near future.
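The image blurring mentioned as a privacy measure can be illustrated with a minimal Pillow sketch (the file name, region coordinates and blur radius are hypothetical, not the NGII processing chain):

```python
from PIL import Image, ImageFilter

def blur_region(texture_path, box, radius=8, out_path="blurred_texture.png"):
    """Blur a rectangular region (left, upper, right, lower) of a facade texture."""
    img = Image.open(texture_path)
    region = img.crop(box).filter(ImageFilter.GaussianBlur(radius))
    img.paste(region, box[:2])
    img.save(out_path)
    return out_path

# Hypothetical usage: obscure a window area in a building texture image.
# blur_region("building_facade.png", box=(120, 40, 260, 180))
```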
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructure, reconnaissance-level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify hydropower opportunities that were previously overlooked. To enhance the characterization of higher-energy-density stream-reaches, this paper explored the sensitivity of geospatial resolution on the identification of hydropower stream-reaches using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulations were conducted at eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher-energy-density stream-reaches can be identified with increasing spatial resolution. Both the Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be applied when selecting the discretization resolution for hydropower resource assessments in future studies.
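For context, hydropower potential assessments of this kind build on the standard relation P = ρ g Q H; a minimal sketch (not the GMM-HRA model itself, and with purely illustrative discharge and head values):

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
GRAVITY = 9.81      # gravitational acceleration, m/s^2

def hydropower_potential_kw(discharge_m3s, head_m, efficiency=1.0):
    """Theoretical potential P = rho * g * Q * H, returned in kilowatts."""
    return RHO_WATER * GRAVITY * discharge_m3s * head_m * efficiency / 1000.0

# Illustrative stream-reach: 12 m^3/s of discharge over 6 m of head.
print(f"{hydropower_potential_kw(12.0, 6.0):.1f} kW")  # about 706 kW
```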
NCI's Distributed Geospatial Data Server
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach of batch data processing and storing all the output for later analysis rapidly becomes infeasible, and often requires additional work to publish results for others to use. Recent developments in distributed computing using interactive access to significant cloud infrastructure open the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. The system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems, such as handling different file formats and data types, or harmonising coordinate projections and temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards, such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), OpenStreetMap tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
Architecture of the local spatial data infrastructure for regional climate change research
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny
2013-04-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in the modeling and analysis of climate change at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. The geoportal is a key element of any SDI, allowing the search of geoinformation resources (datasets and services) using metadata catalogs, the production of geospatial data selections by their parameters (data access functionality), and the management of services and applications for cartographic visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. Those circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented therefore includes: (1) a model for storing large sets of regional georeferenced data that is efficient in terms of search, access, retrieval and subsequent statistical processing, allowing in particular the storage of frequently used values (such as monthly and annual climate change indices), thus providing different temporal views of the datasets; (2) a general architecture of the corresponding software components handling geospatial datasets within the storage model; (3) a metadata catalog, a basic element of the spatial data infrastructure, describing the datasets used in climate research in detail using the ISO 19115 and CF-convention standards, and publishing them according to the OGC CSW (Catalogue Service for the Web) specification; (4) computational and mapping web services for working with geospatial datasets, based on the OGC Web Services (OWS) standards WMS, WFS and WPS; (5) a geoportal as the key element of the thematic regional spatial data infrastructure, also providing a software framework for the development of dedicated web applications. GeoServer is used to realize the web mapping services, since it provides a native WPS implementation as a separate software module. GeoNetwork opensource (http://geonetwork-opensource.org) is planned to be used to provide geospatial metadata services, since it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the local SDI geoportal, the following open source software has been selected: (1) the OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser; (2) the GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will be similar to the interfaces of popular desktop GIS applications such as uDig and Quantum GIS. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
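Since the metadata catalog is published through OGC CSW, a client could discover climate datasets with OWSLib; a minimal sketch (the endpoint URL and search term are placeholders, not the geoportal's actual address):

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Placeholder CSW endpoint; a GeoNetwork instance exposes CSW at a URL like this.
csw = CatalogueServiceWeb("https://example.org/geonetwork/srv/eng/csw")

# Full-text search for records mentioning "climate index".
query = PropertyIsLike("csw:AnyText", "%climate index%")
csw.getrecords2(constraints=[query], maxrecords=10)

for record_id, record in csw.records.items():
    print(record_id, "-", record.title)
```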
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering the development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be powered by diverse computational devices and technologies, such as portable devices and Grid computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems in model use. Geospatial computational models are shared as model services, where the computational processes provided by the models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
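A model published behind a WPS interface can be invoked from any compliant client; as an illustration, a minimal OWSLib sketch (the service URL, process identifier and input names are hypothetical, not the authors' wetland model):

```python
from owslib.wps import WebProcessingService, monitorExecution

# Placeholder WPS endpoint exposing a hypothetical wetland hydrology process.
wps = WebProcessingService("https://example.org/wps")

execution = wps.execute(
    "wetland_water_balance",                 # hypothetical process identifier
    inputs=[("precipitation_mm", "550"),     # hypothetical literal inputs
            ("evapotranspiration_mm", "480")],
)

# Poll the service until the asynchronous run completes, then report the status.
monitorExecution(execution)
print(execution.status)
```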
IsoMAP (Isoscape Modeling, Analysis, and Prediction)
NASA Astrophysics Data System (ADS)
Miller, C. C.; Bowen, G. J.; Zhang, T.; Zhao, L.; West, J. B.; Liu, Z.; Rapolu, N.
2009-12-01
IsoMAP is a TeraGrid-based web portal aimed at building the infrastructure that brings together distributed multi-scale and multi-format geospatial datasets to enable statistical analysis and modeling of environmental isotopes. A typical workflow enabled by the portal includes (1) data source exploration and selection; (2) statistical analysis and model development; (3) predictive simulation of isotope distributions using models developed in (1) and (2); and (4) analysis and interpretation of simulated spatial isotope distributions (e.g., comparison with independent observations, pattern analysis). The gridded models and data products created by one user can be shared and reused among users within the portal, enabling collaboration and knowledge transfer. This infrastructure and the research it fosters can lead to fundamental changes in our knowledge of the water cycle and of ecological and biogeochemical processes through analysis of network-based isotope data, but it will be important (a) that those with whom the data and models are shared can be sure of the origin, quality, inputs, and processing history of these products, and (b) that the system is agile and intuitive enough to facilitate this sharing (rather than just allow it). IsoMAP researchers are therefore building into the portal's architecture several components meant to increase the amount of metadata about users' products and to repurpose those metadata to make sharing and discovery more intuitive and robust for both expected professional users and unforeseen populations from other sectors.
Considerations on Geospatial Big Data
NASA Astrophysics Data System (ADS)
LIU, Zhen; GUO, Huadong; WANG, Changlin
2016-11-01
Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.
NASA Astrophysics Data System (ADS)
Kulo, Violet; Bodzin, Alec
2013-02-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.
Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories
NASA Astrophysics Data System (ADS)
Scharl, Arno
International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.
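The geospatial-semantic integration discussed here rests on extracting place references from unstructured text; a deliberately minimal gazetteer-lookup sketch (the place list, coordinates, and sample sentence are illustrative only; real geoparsers also handle name ambiguity and context, which this toy lookup does not):

```python
import re

# Tiny illustrative gazetteer: place name -> (latitude, longitude).
GAZETTEER = {
    "New Orleans": (29.95, -90.07),
    "Louisiana": (30.98, -91.96),
    "Gulf of Mexico": (25.0, -90.0),
}

def geotag(text):
    """Return (place, lat, lon, offset) for each gazetteer entry found in the text."""
    hits = []
    for place, (lat, lon) in GAZETTEER.items():
        for match in re.finditer(re.escape(place), text):
            hits.append((place, lat, lon, match.start()))
    return hits

sample = ("Hurricane Katrina made landfall near New Orleans, Louisiana, "
          "after crossing the Gulf of Mexico.")
for place, lat, lon, offset in geotag(sample):
    print(f"{place} @ ({lat}, {lon}), offset {offset}")
```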
NASA Astrophysics Data System (ADS)
Othman, Raha binti; Bakar, Muhamad Shahbani Abu; Mahamud, Ku Ruhana Ku
2017-10-01
While a Spatial Data Infrastructure (SDI) has been established in Malaysia, its full potential can be further realized. To a large degree, geospatial industry users hope that they can easily get access to the system and start utilizing the data. Some users expect the SDI to provide them with readily available data without having to request the data from the data providers or to process and prepare the data for their own use. Some further argue that the usability of the system could be improved if an appropriate combination of data sharing and focused applications were found within the services. In order to address the current challenges and to enhance the effectiveness of the SDI in Malaysia, there is the possibility of establishing a collaborative business venture between public and private entities, which can help address these issues and expectations. In this paper, we discuss the possibility of collaboration between these two types of entities. Interviews with seven entities were held to collect information on exposure, acceptance, and platform sharing. The outcomes indicate that although the growth of GIS technology and the high level of technology acceptance provide a solid basis for utilizing geospatial data, the absence of a concrete policy on data sharing, of quality geospatial data, and of an authoritative coordinating agency leaves a vacuum in the successful implementation of the SDI initiative.
GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Wei; Minnick, Matthew; Geza, Mengistu
2012-09-30
The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as baseline data for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activities, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as various other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activities include the quantitative spatial and temporal distribution of the water resources throughout the Piceance Basin, water consumption with respect to oil shale production, and the data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (MATLAB) linking the database with the surface water and geological models, an ArcGIS model for hydrogeologic data processing to generate groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides baseline data for further study of oil shale development and for identification of further data collection needs. The 3D geological model provides, through data interpolation and visualization techniques, a better understanding of the Piceance Basin structure and of the spatial distribution of the oil shale resources. The surface water/groundwater models quantify the water shortage and improve understanding of the spatial distribution of the available water resources. The energy resource development systems model reveals the phase shift between water usage and oil shale production, which will facilitate better planning for oil shale development. Detailed descriptions of the key findings from the project activities, major accomplishments, and expected impacts of the research are given in the "ACCOMPLISHMENTS, RESULTS, AND DISCUSSION" section of this report.
Geospatial Resource Access Analysis In Hedaru, Tanzania
NASA Astrophysics Data System (ADS)
Clark, Dylan G.; Premkumar, Deepak; Mazur, Robert; Kisimbo, Elibariki
2013-12-01
Populations around the world are facing increased impacts of anthropogenic environmental change and rapid population movements. These environmental and social shifts are having an elevated impact on the livelihoods of agriculturalists and pastoralists in developing countries. This appraisal integrates various tools, usually used independently, to gain a comprehensive understanding of the regional livelihood constraints in the rural Hedaru Valley of northeastern Tanzania. The study was conducted in three villages with different natural resources, using three primary methods: (1) participatory mapping of infrastructure; (2) administration of quantitative, spatially tied surveys (n=80) and focus groups (n=14) that examined land use, household health, education, and demographics; and (3) quantitative time series analysis of Landsat-based Normalized Difference Vegetation Index (NDVI) images. Through various geospatial and multivariate linear regression analyses, significant geospatial trends emerged. This research added to the academic understanding of the region while establishing pathways for climate change adaptation strategies.
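The NDVI used in the time series analysis is a simple band ratio; a minimal NumPy sketch (the 2x2 reflectance arrays are synthetic stand-ins for Landsat near-infrared and red bands):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Illustrative 2x2 reflectance arrays standing in for Landsat NIR and red bands.
nir_band = np.array([[0.45, 0.50], [0.30, 0.55]])
red_band = np.array([[0.10, 0.12], [0.20, 0.08]])
print(ndvi(nir_band, red_band))
```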
NativeView: Our Land, Our People, Our Future
NASA Astrophysics Data System (ADS)
Bennett, T.
2006-05-01
The objective of this discussion is to (1) discuss the chasm between the breadth of Tribal land and resources to be sustained and the finite number of Tribal people trained in the sciences; (2) illustrate the need for integrating scientific knowledge with cultural knowledge; and (3) discuss the emergence of NativeView as a Tribal College (TCU) initiative leading the integration of geoscience and geospatial technology (GIS, remote sensing) with cultural knowledge to meet the growing needs of indigenous communities. It is about our land, our people, and the need for highly trained individuals to sustain and manage our resources for the future. There is a tremendous gap between the total acreage of land owned or managed and the level of education obtained by indigenous people. In the United States today, American Indians and Alaskan Natives account for less than one percent of the total population, yet are responsible for more than five percent of the total land area. In North Dakota, there are over 54 thousand American Indians responsible for more than 3.8 million acres of Tribal land. In contrast, less than 15 percent of indigenous people finish a Bachelor's degree of any kind, and far fewer finish a science degree that would help them become more effective and responsible land managers. This poses an important dilemma. How will the Tribes meet (1) the resource needs of a growing population, (2) the demand for a skilled workforce, and (3) resource management goals in ways that contribute to Tribal infrastructure and equate to sustainable resource management? The integration of geoscience and geospatial technologies into the curriculum of Tribal Colleges (TCUs) has quietly emerged as one of the leading initiatives across Indian Country. These skills are widely recognized as a vehicle to empower our constituents in the sciences, in the cultural values, and in the traditional land ethic that defines us as a people. NativeView has taken the lead in working with the Tribes, TCUs and other partners to create cadres of indigenous professionals who possess skills in geoscience and geospatial technologies and who will manage Tribal resources in scientifically sound, culturally relevant ways. Preliminary results suggest that developing strength-based collaborations that create an environment of investment and ownership by all Indian and non-Indian participants is an effective model for meeting long-term goals. A number of these projects and the mechanisms that define the successful collaborations will be illustrated.
NASA Astrophysics Data System (ADS)
Usländer, Thomas
2012-10-01
The demand for the rapid provision of EO products with well-defined characteristics in terms of temporal, spatial, image-specific and thematic criteria is increasing. Examples are products to support near real-time damage assessment after a natural disaster event, e.g. an earthquake. However, beyond the organizational and economic questions, there are technological and systemic barriers to a convenient search, ordering, delivery or even combination of EO products. Most portals of space agencies and EO product providers require sophisticated satellite and product knowledge and, even worse, are all different and not interoperable. This paper gives an overview of the use cases and the architectural solutions that aim at an open and flexible EO mission infrastructure with application-oriented user interfaces and well-defined service interfaces based upon open standards. It presents corresponding international initiatives such as INSPIRE (Infrastructure for Spatial Information in the European Community), GMES (Global Monitoring for Environment and Security), GEOSS (Global Earth Observation System of Systems) and HMA (Heterogeneous Missions Accessibility) and their associated infrastructure approaches. The paper presents a corresponding analysis and design methodology and two examples of how such architectures are already successfully used in early warning systems for geo-hazards and in toolsets for environmentally induced health risks. Finally, the paper concludes with an outlook on how these ideas relate to the vision of the Future Internet.
Online Resources to Support Professional Development for Managing and Preserving Geospatial Data
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2013-12-01
Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. Available education and training materials include tutorials, primers, guides, and online learning modules. The site enables users to find and access standards, real-world examples, and websites of other resources about geospatial data management. Quick links to lists of resources are available for data managers, system developers, and researchers. New resources are featured regularly to highlight current developments in practice and research. A user-centered approach was taken to design and develop the site iteratively, based on a survey of the expectations and needs of community members who have an interest in the management and preservation of geospatial data. Formative and summative evaluation activities have informed design, content, and feature enhancements to enable users to use the website efficiently and effectively. Continuing management and evaluation of the website keeps the content and the infrastructure current with evolving research, practices, and technology. The design, development, evaluation, and use of the website are described along with selected resources and activities that support education and professional development for the management, preservation, and stewardship of geospatial data.
Get a Grip on Demographics with Geospatial Technology
ERIC Educational Resources Information Center
Raymond, Randall E.
2009-01-01
Aging school infrastructure, changing population dynamics, decreased funding, and increased accountability for reporting school success all require today's school business officials to combine a variety of disparate data sets into a coherent system that enables effective and efficient decision making. School business officials are required to: (1)…
Planning Quality for Successful International Environmental Monitoring
George M. Brilis; John G. Lyon; Jeffery C. Worthington
2006-01-01
Federal, State, and municipal government entities are increasingly depending on geospatial data for a myriad of purposes. This trend is expected to continue. Information sharing and interoperability are in line with the Federal Executive Order 12906 (Clinton, 1994) which calls for the establishment of the National Spatial Data Infrastructure (NSDI). If other...
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Angelini, V.
2009-04-01
In the context of the EU co-funded project CYCLOPS (http://www.cyclops-project.eu), the problem of designing an advanced e-Infrastructure for Civil Protection (CP) applications has been addressed. As a preliminary step, studies of European CP systems and operational applications were performed in order to define their specific system requirements. At a higher level it was verified that CP applications are usually conceived to map CP business processes involving different levels of processing, including data access, data processing, and output visualization. At their core they usually run one or more Earth Science models for information extraction. The traditional approach based on the development of monolithic applications presents some limitations related to flexibility (e.g. the possibility of running the same models with different input data sources, or different models with the same data sources) and scalability (e.g. launching several runs for different scenarios, or implementing more accurate and computing-demanding models). Flexibility can be addressed by adopting a modular design based on an SOA and standard services and models, such as OWS and ISO standards for geospatial services. Distributed computing and storage solutions can improve scalability. Based on such considerations, an architectural framework has been defined. It is made of a Web Service layer providing advanced services for CP applications (e.g. standard geospatial data sharing and processing services) working on the underlying Grid platform. This framework has been tested through the development of prototypes as proof-of-concept. These theoretical studies and proofs-of-concept demonstrated that although Grid and geospatial technologies would be able to provide significant benefits to CP applications in terms of scalability and flexibility, current platforms are designed taking into account requirements different from those of CP. In particular, CP applications have strict requirements in terms of: a) real-time capabilities, privileging time-of-response over accuracy; b) security services to support complex data policies and trust relationships; c) interoperability with existing or planned infrastructures (e.g. e-Government, INSPIRE-compliant, etc.). These requirements are in fact the main reason why CP applications differ from Earth Science applications. Therefore, further research is required to design and implement an advanced e-Infrastructure satisfying those specific requirements. In particular, five themes where further research is required were identified: Grid Infrastructure Enhancement, Advanced Middleware for CP Applications, Security and Data Policies, CP Applications Enablement, and Interoperability. For each theme several research topics were proposed and detailed. They are targeted at solving specific problems for the implementation of an effective operational European e-Infrastructure for CP applications.
NASA Astrophysics Data System (ADS)
Tao, W.; Tucker, K.; DeFlorio, J.
2012-12-01
The reality of a changing climate means that transportation and planning agencies need to understand the potential effects of changes in storm activity, sea levels, temperature, and precipitation patterns, and develop strategies to ensure the continuing robustness and resilience of transportation infrastructure and services. This is a relatively new challenge for California's regional planning agencies, adding yet one more consideration to an already complex and multifaceted planning process. In that light, the California Department of Transportation (Caltrans) is developing a strategy framework using a module-based process that planning agencies can undertake to incorporate the risks of climate change impacts into their decision-making and long-range transportation plans. The module-based approach was developed using a best-practices survey of existing work nationally, along with a set of structured interviews with metropolitan planning organizations (MPOs) and regional transportation planning agencies (RTPAs) within California. The findings led to the development of a process, as well as a package of foundational geospatial layers (i.e. the Statewide Transportation Asset Geodatabase, STAG), primarily comprising state and Federal transportation assets. These assets are intersected with a set of geospatial layers for the climate stressors of relevance in the state, which are placed in the same reference layers as the STAG, thus providing a full set of GIS layers that can be a starting point for MPOs/RTPAs that want to follow the step-by-step module-based approach in its entirety. The fast-paced changes in science and climate change knowledge require a flexible platform to display continuously evolving information. To this end, the development of the modules is accompanied by a set of geospatial analyses disseminated through an online web portal. In this way, the information can be relayed to MPOs/RTPAs in an easy-to-use fashion that can help them follow the modules of the strategy framework. The strategy framework for MPOs and RTPAs is used to: 1) assess the relative risks to their transportation system infrastructure and services from different climate stressors (sea level rise, temperature changes, snow melt, precipitation changes, flooding, extreme weather events); 2) conduct an asset inventory and vulnerability assessment of existing infrastructure; 3) prioritize segments and facilities for adaptation action; 4) identify appropriate and cost-effective adaptation strategies; and 5) incorporate climate impact considerations into future long-range transportation planning and investment decisions. This framework complements the broader planning and investment processes that MPOs and RTPAs already manage. It recognizes the varying capacities and resources among MPOs and RTPAs and provides methods that can be used by organizations seeking to conduct an in-depth analysis or a more sketch-level assessment.
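The intersection of asset layers with climate-stressor layers described above could be sketched with GeoPandas as follows (the file names, CRS choice and column names are hypothetical illustrations, not the STAG schema):

```python
import geopandas as gpd

# Hypothetical inputs: transportation assets (line features) and a sea level rise
# inundation polygon layer; both reprojected to a common projected CRS first.
assets = gpd.read_file("transportation_assets.shp").to_crs(epsg=3310)
slr_zone = gpd.read_file("sea_level_rise_1m.shp").to_crs(epsg=3310)

# Keep the portions of each asset that fall inside the inundation zone.
exposed = gpd.overlay(assets, slr_zone, how="intersection")
exposed["exposed_km"] = exposed.geometry.length / 1000.0

# Summarize exposure by asset type to help prioritize adaptation actions.
print(exposed.groupby("asset_type")["exposed_km"].sum().sort_values(ascending=False))
```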
Best Practices for Preparing Interoperable Geospatial Data
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.
2010-12-01
Geospatial data is critically important for a wide range of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, and more. Geospatial data is created by different organizations using different methods, from remote sensing observations to field surveys and model simulations, and is stored in various formats. Geospatial data is therefore diverse and heterogeneous, which creates a major barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while the OGC Web Coverage Service (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that having standard mechanisms for data discovery and access is not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model intercomparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address the geospatial data interoperability issue by standardizing the data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) which provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived in the ORNL DAAC were converted into standard and non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and CF convention can promote data usage in the ecosystem modeling user community; how the standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
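As an illustration of the NetCDF/CF packaging the presentation advocates, a minimal xarray sketch of writing a CF-style gridded file (the variable, grid, and attribute values are synthetic examples, not an ORNL DAAC or MAST-DC product):

```python
import numpy as np
import xarray as xr

# Illustrative 1-degree global grid of a single variable; values are synthetic.
lat = np.arange(-89.5, 90.0, 1.0)
lon = np.arange(-179.5, 180.0, 1.0)
data = np.random.rand(lat.size, lon.size).astype("float32")

ds = xr.Dataset(
    {"gpp": (("lat", "lon"), data,
             {"long_name": "gross primary productivity",
              "units": "kg m-2 s-1"})},
    coords={
        "lat": ("lat", lat, {"standard_name": "latitude", "units": "degrees_north"}),
        "lon": ("lon", lon, {"standard_name": "longitude", "units": "degrees_east"}),
    },
    attrs={"Conventions": "CF-1.8", "title": "Example CF-style gridded product"},
)
ds.to_netcdf("example_gpp.nc")
```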
NASA Astrophysics Data System (ADS)
Buonanno, Sabatino; Fusco, Adele; Zeni, Giovanni; Manunta, Michele; Lanari, Riccardo
2017-04-01
This work describes the implementation of an efficient system for managing, viewing, analyzing and updating remotely sensed data, with special reference to Differential Interferometric Synthetic Aperture Radar (DInSAR) data. DInSAR products measure Earth surface deformation in both space and time, producing deformation maps and time series [1,2]. The use of these data in research or operational contexts requires tools that handle temporal and spatial variability with high efficiency. For this aim we present an implementation based on a Spatial Data Infrastructure (SDI) for data integration, management and interchange, using standard protocols [3]. SDI tools typically provide access to static datasets and operate only on spatial variability. In this paper we use the open source project GeoNode as a framework to extend SDI functionality to ingest DInSAR deformation maps and deformation time series very efficiently. GeoNode makes it possible to realize a comprehensive and distributed infrastructure, following the standards of the Open Geospatial Consortium (OGC), for remote sensing data management, analysis and integration [4,5]. In the current paper we explain the methodology used to manage the data complexity and data integration using GeoNode. The solution presented in this work for the ingestion of DInSAR products is a very promising starting point for the future development of an OGC-compliant implementation of a semi-automatic remote sensing data processing chain. [1] Berardino, P., Fornaro, G., Lanari, R., & Sansosti, E. (2002). A New Algorithm for Surface Deformation Monitoring Based on Small Baseline Differential SAR Interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40(11), pp. 2375-2383. [2] Lanari, R., Casu, F., Manzo, M., Zeni, G., Berardino, P., Manunta, M., & Pepe, A. (2007). An Overview of the Small Baseline Subset Algorithm: a DInSAR Technique for Surface Deformation Analysis. Pure and Applied Geophysics, 164, doi: 10.1007/s00024-007-0192-9. [3] Nebert, D. D. (ed.). 2000. Developing Spatial Data Infrastructures: The SDI Cookbook. [4] GeoNode (http://www.geonode.org). [5] Kolodziej, K. (ed.). 2004. OGC OpenGIS Web Map Server Cookbook. Open Geospatial Consortium, edition 1.0.2.
Crowdsourced Contributions to the Nation's Geodetic Elevation Infrastructure
NASA Astrophysics Data System (ADS)
Stone, W. A.
2014-12-01
NOAA's National Geodetic Survey (NGS), a United States Department of Commerce agency, is engaged in providing the nation's fundamental positioning infrastructure, the National Spatial Reference System (NSRS), which includes the framework for latitude, longitude, and elevation determination as well as various geodetic models, tools, and data. Capitalizing on Global Navigation Satellite System (GNSS) technology for improved access to the nation's precise geodetic elevation infrastructure requires use of a geoid model, which relates GNSS-derived heights (ellipsoid heights) to traditional elevations (orthometric heights). NGS is facilitating the use of crowdsourced GNSS observations collected at published elevation control stations by the professional surveying, geospatial, and scientific communities to help improve NGS' geoid modeling capability. This collocation of published elevation data and newly collected GNSS data integrates the two height systems. This effort in turn supports enhanced access to accurate elevation information across the nation, thereby benefiting all users of geospatial data. By partnering with the public in this collaborative effort, NGS is not only helping facilitate improvements to the elevation infrastructure for all users but also empowering users of the NSRS with the capability to do their own high-accuracy positioning. The educational outreach facet of this effort helps inform the public, including the scientific community, about the utility of various NGS tools, including the widely used Online Positioning User Service (OPUS). OPUS plays a key role in providing user-friendly and high-accuracy access to the NSRS, with optional sharing of results with NGS and the public. All who are interested in helping evolve and improve the nationwide elevation determination capability are invited to participate in this nationwide partnership and to learn more about the geodetic infrastructure, which is a vital component of viable spatial data for many disciplines, including the geosciences.
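The relation a geoid model encodes between the two height systems is, to first approximation, H = h - N (orthometric height equals ellipsoid height minus geoid undulation); a minimal sketch with illustrative numbers only:

```python
def orthometric_height(ellipsoid_height_m, geoid_undulation_m):
    """Approximate orthometric height H = h - N, where h is the GNSS-derived
    ellipsoid height and N is the geoid undulation from a geoid model."""
    return ellipsoid_height_m - geoid_undulation_m

# Illustrative values: h = 250.00 m from GNSS, N = -28.50 m from a geoid model.
print(f"H = {orthometric_height(250.00, -28.50):.2f} m")  # 278.50 m
```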
A Security Architecture for Grid-enabling OGC Web Services
NASA Astrophysics Data System (ADS)
Angelini, Valerio; Petronzio, Luca
2010-05-01
In the proposed presentation we describe an architectural solution for enabling a secure access to Grids and possibly other large scale on-demand processing infrastructures through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS , GENESI-DR, and DORII Project Consortia in order to collect/coordinate experiences in the enablement of OWS's on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on the Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact OWS's are part of a Web based architecture that demands security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web based) with the gLite GSI. This is made possible mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling the access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user's security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This allows to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS Security system, access restrictions are applied making use of the enhanced Geospatial capabilities specified by the OGC GeoXACML. If the required action needs to make use of the Grid environment the system checks if the user is entitled to access a Grid infrastructure. In that case his/her identity is translated to a temporary Grid security token using the Short Lived Credential Services (IGTF Standard). 
In our case, for the specific gLite Grid infrastructure, some information (VOMS attributes) is plugged into the Grid security token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and also by the various gLite middleware elements to verify the user's grants. Building on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other Web authentication services such as OpenID, Kerberos and WS-Federation.
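The three-tier delegation described above can be sketched in code. The following is a minimal conceptual sketch only: all class and function names are invented for illustration, and it stands in for what the real G-OWS framework does with WS-Trust, GeoXACML policies and IGTF short-lived credential services.

```python
# Conceptual sketch of the three-tier credential delegation described above.
# All class and function names are hypothetical stand-ins, not the G-OWS API.
from dataclasses import dataclass, field

@dataclass
class HomeIdentity:          # identity asserted by the user's organization (tier 1)
    subject: str
    organization: str

@dataclass
class GOWSIdentity:          # federated identity used inside the G-OWS tier (tier 2)
    subject: str
    attributes: dict = field(default_factory=dict)

@dataclass
class GridToken:             # short-lived credential for the Grid tier (tier 3)
    subject: str
    vo: str                  # Virtual Organization (VOMS-style attribute)
    lifetime_s: int = 3600

def authenticate_home_org(username: str, password: str) -> HomeIdentity:
    """Stand-in for the user's organization security system (e.g. a Shibboleth IdP)."""
    # A real deployment would verify the credentials against the home IdP.
    return HomeIdentity(subject=username, organization="example-university")

def to_gows_identity(home: HomeIdentity) -> GOWSIdentity:
    """Translate the home identity into a G-OWS identity carrying access attributes."""
    return GOWSIdentity(subject=home.subject,
                        attributes={"role": "scientist",
                                    "allowed_layers": ["flood", "dem"]})

def geospatial_policy_permits(identity: GOWSIdentity, layer: str) -> bool:
    """Toy stand-in for a GeoXACML-style policy decision point."""
    return layer in identity.attributes.get("allowed_layers", [])

def issue_grid_token(identity: GOWSIdentity) -> GridToken:
    """Stand-in for a short-lived credential service mapping the user to a VO."""
    return GridToken(subject=identity.subject, vo="esr.vo.example.org")

# Typical use case from the abstract: a scientist requests a Grid-enabled OWS.
home = authenticate_home_org("alice", "secret")
gows = to_gows_identity(home)
if geospatial_policy_permits(gows, layer="flood"):
    token = issue_grid_token(gows)
    print(f"Submitting Grid job for {token.subject} under VO {token.vo}")
```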
A Research Agenda for Geospatial Technologies and Learning
ERIC Educational Resources Information Center
Baker, Tom R.; Battersby, Sarah; Bednarz, Sarah W.; Bodzin, Alec M.; Kolvoord, Bob; Moore, Steven; Sinton, Diana; Uttal, David
2015-01-01
Knowledge around geospatial technologies and learning remains sparse, inconsistent, and overly anecdotal. Studies are needed that are better structured; more systematic and replicable; attentive to progress and findings in the cognate fields of science, technology, engineering, and math education; and coordinated for multidisciplinary approaches.…
71 FR 66315 - Notice of Availability of Invention for Licensing; Government-Owned Invention
Federal Register 2010, 2011, 2012, 2013, 2014
2006-11-14
... Coating and Method of Formulator.//Navy Case No. 97,486: Processing Semantic Markups in Web Ontology... Rotating Clip.//Navy Case No. 97,886: Adding Semantic Support to Existing UDDI Infrastructure.//Navy Case..., Binding, and Integration of Non-Registered Geospatial Web Services.//Navy Case No. 98,094: Novel, Single...
Data management for geospatial vulnerability assessment of interdependencies in US power generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shih, C.Y.; Scown, C.D.; Soibelman, L.
2009-09-15
Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.
Baker, Michael S.; Buteyn, Spencer D.; Freeman, Philip A.; Trippi, Michael H.; Trimmer III, Loyd M.
2017-07-31
This report describes the U.S. Geological Survey’s (USGS) ongoing commitment to its mission of understanding the nature and distribution of global mineral commodity supply chains by updating and publishing the georeferenced locations of mineral commodity production and processing facilities, mineral exploration and development sites, and mineral commodity exporting ports in Latin America and the Caribbean. The report includes an overview of data sources and an explanation of the geospatial PDF map format. The geodatabase and geospatial data layers described in this report create a new geographic information product in the form of a geospatial portable document format (PDF) map. The geodatabase contains additional data layers from USGS, foreign government, and open sources as follows: (1) coal occurrence areas, (2) electric power generating facilities, (3) electric power transmission lines, (4) hydrocarbon resource cumulative production data, (5) liquefied natural gas terminals, (6) oil and gas concession leasing areas, (7) oil and gas field center points, (8) oil and gas pipelines, (9) USGS petroleum provinces, (10) railroads, (11) recoverable proven plus probable hydrocarbon resources, (12) major cities, (13) major rivers, and (14) undiscovered porphyry copper tracts.
NASA Astrophysics Data System (ADS)
Smart, A. C.
2014-12-01
Governments are increasingly asking for more evidence of the benefits of investing in geospatial data and infrastructure before committing funds. They are looking for a clearer articulation of the economic, environmental and social benefits than has been possible in the past. Development of techniques has accelerated in the past five years as governments and industry become more involved in the capture and use of geospatial data. However, evaluation practitioners have struggled to answer these emerging questions. The paper explores the types of questions that decision makers are asking and discusses the different approaches and methods that have been used recently to answer them. It explores the need for better business case models. The emerging approaches are then discussed and their attributes reviewed. These include methods of analysing tangible economic benefits, intangible benefits and societal benefits. The paper explores the use of value chain analysis and real options analysis to better articulate the impacts on international competitiveness and how to value the potential benefits of innovations enabled by the geospatial data that is produced. The paper concludes by illustrating the potential for these techniques in current and future decision making.
A Spatial Data Infrastructure to Share Earth and Space Science Data
NASA Astrophysics Data System (ADS)
Nativi, S.; Mazzetti, P.; Bigagli, L.; Cuomo, V.
2006-05-01
A Spatial Data Infrastructure (SDI), also known as a Geospatial Data Infrastructure, is fundamentally a mechanism to facilitate the sharing and exchange of geospatial data. An SDI is a scheme necessary for the effective collection, management, access, delivery and utilization of geospatial data; it is important for objective decision making and sound land-based policy, for supporting economic development, and for encouraging socially and environmentally sustainable development. As far as data models and semantics are concerned, a valuable and effective SDI should be able to cross the boundaries between the Geographic Information System/Science (GIS) and Earth and Space Science (ESS) communities. Hence, an SDI should be able to discover, access and share information and data produced and managed by both the GIS and ESS communities, in an integrated way. In other words, an SDI must be built on a conceptual and technological framework which abstracts the nature and structure of the shared datasets: feature-based data or Imagery, Gridded and Coverage Data (IGCD). ISO TC211 and the Open Geospatial Consortium have provided important artifacts for building this framework. In particular, the OGC Web Services (OWS) initiatives and several Interoperability Experiments (e.g. the GALEON IE) are extremely useful for this purpose. We present an SDI solution which is able to manage both GIS and ESS datasets. It is based on OWS and other well-accepted or promising technologies, such as UNIDATA netCDF and CDM, ncML and ncML-GML. Moreover, it uses a specific technology to implement a distributed and federated system of catalogues: the GI-Cat. This technology performs data model mediation and protocol adaptation tasks. It is used to implement a metadata clearinghouse service, realizing a common (federal) catalogue model based on the ISO 19115 core metadata for geo-datasets. Nevertheless, other well-accepted or standard catalogue data models can be easily implemented as the common view (e.g. OGC CS-W, the forthcoming INSPIRE discovery metadata model, etc.). The proposed solution has been conceived and developed to build the "Lucan SDI", the SDI of the Italian Basilicata Region. It aims to connect the following data providers and users: the National River Basin Authority of Basilicata, the Regional Environmental Agency, the Land Management & Cadastre Regional Authorities, the Prefecture, the Regional Civil Protection Centers, the National Research Council Institutes in Basilicata, the Academia, and several SMEs.
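As a client-side illustration of the kind of clearinghouse discovery described above, the sketch below queries an ISO 19115-based catalogue through the standard OGC CSW interface. OWSLib and the endpoint URL are assumptions introduced for the example; they are not part of the GI-Cat implementation itself.

```python
# Minimal sketch of querying an ISO 19115-based catalogue through OGC CSW.
# OWSLib and the placeholder endpoint are assumptions, not the original system.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/csw")  # placeholder endpoint

# Search for datasets whose metadata mentions "precipitation"
query = PropertyIsLike("csw:AnyText", "%precipitation%")
csw.getrecords2(constraints=[query], maxrecords=10, esn="full")

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```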
Citizen science, GIS, and the global hunt for landslides
NASA Astrophysics Data System (ADS)
Juang, C.; Stanley, T.; Kirschbaum, D.
2017-12-01
Landslides occur across the United States and around the world, causing much suffering and infrastructure damage. Many of these events have been recorded in the Global Landslide Catalog (GLC), a worldwide record of recent rainfall-triggered landslides. The extent and composition of this database have been affected by the limits of media search tools and available staffing. Citizen scientists could expand the effort exponentially, as well as diversify the knowledge base of the research team. In order to enable this collaboration, the NASA Center for Climate Simulation has created a GIS portal for viewing, editing, and managing the GLC. The data are also exposed through a REST API for easy incorporation into geospatial websites by third parties. Future developments may include the ability to store polygons delineating large landslides, digitization from recent satellite imagery, and the establishment of a community for international landslide research that is open to both lay and academic users.
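A third-party site could consume such a REST API roughly as sketched below. The URL, query parameters and field names are placeholders invented for illustration; the actual GLC endpoint and schema should be taken from the portal's own documentation.

```python
# Hedged sketch of pulling landslide records from a REST endpoint as GeoJSON.
# The URL, parameters and attribute names are placeholders, not the real API.
import requests

GLC_URL = "https://example.org/arcgis/rest/services/GLC/FeatureServer/0/query"  # placeholder

params = {
    "where": "event_year >= 2015",   # hypothetical attribute filter
    "outFields": "*",
    "f": "geojson",
}
resp = requests.get(GLC_URL, params=params, timeout=30)
resp.raise_for_status()

for feature in resp.json().get("features", [])[:5]:
    props = feature["properties"]
    print(props.get("event_title"), props.get("event_date"))
```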
Public health, GIS, and the internet.
Croner, Charles M
2003-01-01
Internet access and use of georeferenced public health information for GIS application will be an important and exciting development for the nation's Department of Health and Human Services and other health agencies in this new millennium. Technological progress toward public health geospatial data integration, analysis, and visualization of space-time events using the Web portends eventual robust use of GIS by public health and other sectors of the economy. Increasing Web resources from distributed spatial data portals and global geospatial libraries, and a growing suite of Web integration tools, will provide new opportunities to advance disease surveillance, control, and prevention, and ensure public access and community empowerment in public health decision making. Emerging supercomputing, data mining, compression, and transmission technologies will play increasingly critical roles in national emergency and catastrophic planning and response, and risk management. Web-enabled public health GIS will be guided by Federal Geographic Data Committee spatial metadata, OpenGIS Web interoperability, and GML/XML geospatial Web content standards. Public health will become a responsive and integral part of the National Spatial Data Infrastructure.
A web service for service composition to aid geospatial modelers
NASA Astrophysics Data System (ADS)
Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.
2012-04-01
The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over recent years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved into community-wide modeling frameworks and Component-Based Architecture solutions and, more recently, have started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. In line with the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, users can be freed from the need for a composition infrastructure and relieved of the technicalities of workflow definition (type matching, identification of external service endpoints, binding issues, etc.) and can focus on their intended application. Moreover, a user may submit an incomplete workflow definition and leverage CaaS recommendations (that may derive from an aggregated knowledge base of user feedback, underpinned by Web 2.0 technologies) to execute it. This is of particular interest for multidisciplinary scientific contexts, where different communities may benefit from each other's knowledge through model chaining. Indeed, the CaaS approach is presented as an attempt to combine the recent advances in service-oriented computing with collaborative research principles, and social network information in general. Arguably, it may be considered a fundamental capability of the Model Web. The CaaS concept is being investigated in several application scenarios identified in the FP7 UncertWeb and EuroGEOSS projects. Key aspects of the described CaaS solution are: it provides a standard WPS interface for invoking Business Processes and allows on-the-fly recursive composition of Business Processes into other Composite Processes; and it is designed according to the extended SOA (broker-based) and System-of-Systems approach, to support the reuse and integration of existing resources, in compliance with the GEOSS Model Web architecture. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
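Because composed workflows are exposed through a standard WPS interface, a user can invoke them like any other process. The sketch below shows such a client-side call; OWSLib, the endpoint URL, the process identifier and the input names are assumptions for the example, not the actual CaaS deployment.

```python
# Illustrative client-side sketch of invoking a composed process through a
# standard WPS interface, as the CaaS approach envisages. The endpoint, process
# identifier and input names are placeholders.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("https://example.org/wps")  # placeholder endpoint
wps.getcapabilities()

# Composed workflows would appear here as ordinary processes
for p in wps.processes:
    print(p.identifier, "-", p.title)

# Execute a hypothetical composite process with literal inputs
execution = wps.execute("example:composite_runoff_model",
                        inputs=[("precipitation_dataset", "gauge_2011"),
                                ("return_period", "100")])
monitorExecution(execution)   # poll until the asynchronous job completes
print(execution.status)
```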
Bridging the Gap Between Surveyors and the Geo-Spatial Society
NASA Astrophysics Data System (ADS)
Müller, H.
2016-06-01
For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and with the geospatial industries in particular. Traditionally the surveying profession has contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors today are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy, and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capability of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.
A Python Geospatial Language Toolkit
NASA Astrophysics Data System (ADS)
Fillmore, D.; Pletzer, A.; Galloy, M.
2012-12-01
The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership (NPP) satellite, for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
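To make the idea concrete, the toy sketch below resolves an English phrase to a geometric region. The gazetteer, the grammar and the function names are invented for illustration and are far simpler than the prototype module described; only shapely is assumed as a dependency.

```python
# Toy sketch of mapping an English phrase onto a geospatial object.
# The gazetteer, grammar and API below are hypothetical and much simpler than
# the prototype described in the abstract.
import re
from shapely.geometry import Point

# Tiny hypothetical gazetteer (lon, lat in degrees)
GAZETTEER = {"denver": Point(-104.99, 39.74), "boulder": Point(-105.27, 40.01)}

def resolve_phrase(phrase: str):
    """Map 'within <N> km of <place>' onto a buffered region around the place."""
    m = re.match(r"within (\d+) km of (\w+)", phrase.lower())
    if not m:
        raise ValueError("unsupported phrase")
    radius_km, place = int(m.group(1)), m.group(2)
    centre = GAZETTEER[place]
    # Crude degree approximation (1 degree ~ 111 km); adequate for a toy example
    return centre.buffer(radius_km / 111.0)

region = resolve_phrase("within 50 km of Denver")
print(round(region.area, 4), "square degrees")
```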
Geospatial Service Platform for Education and Research
NASA Astrophysics Data System (ADS)
Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.
2014-04-01
We propose to advance scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held at the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as instances to explain specifically the method for building service chains in view of different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
NASA Astrophysics Data System (ADS)
Zalles, D. R.
2011-12-01
The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of challenges that teachers and students encounter navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that by providing a scaffolded technology-supported process for instructional design, starting from fundamental questions about the content domain, teachers will make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners that work off of "starter" lessons covering focal skills and understandings will lead to the most effective utilizations of the technology in the classroom. The projects' goals and strategies for student learning proceed from research suggesting that students will be more engaged and able to utilize prior knowledge better when seeing the local and hence personal relevance of climate change and other pressing contemporary science-related issues. In these projects, the students look for climate change trends in geospatial Earth System data layers from weather stations, satellites, and models in relation to global trends. They examine these data to (1) reify what they are learning in science class about meteorology, climate, and ecology, (2) build inquiry skills by posing and seeking answers to research questions, and (3) build data literacy skills through experience generating appropriate data queries and examining data output on different forms of geospatial representations such as maps, elevation profiles, and time series plots. Teachers also are given the opportunity to have their students look at geospatially represented census data from the tool Social Explorer (http://www.socialexplorer.com/pub/maps/home.aspx) in order to better understand demographic trends in relation to climate change-related trends in the Earth system. Early results will be reported about teacher professional development and student learning, gleaned from interviews and observations.
NASA Astrophysics Data System (ADS)
Millard, Keiran
2015-04-01
This paper looks at the current experiences of geospatial users and geospatial suppliers and how they have been limited by the absence of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing, it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad-hoc analysis, yet they could realise a myriad of disparate, and readily usable, information products with the right infrastructure in place. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages; however, managing this allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' has been initiated to examine how useful these services and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that data from the same source often appear across multiple platforms, and that current ISO 19115-style metadata catalogues do not help the vast majority of users in making data selections. To address this, we have looked at approaches used in the leisure industry. This industry has established tools to support users in selecting the best hotel for their needs from the metadata available, supported by peer-to-peer rating. We have looked into how this approach can support users in selecting the best data to meet their needs.
DIY Geospatial Web Service Chains: GeoChaining Makes It Easy
NASA Astrophysics Data System (ADS)
Wu, H.; You, L.; Gui, Z.
2011-08-01
It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge. Geospatial practitioners may lack knowledge about web services and service chains. End users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards may help to lower the learning curve and improve modelling skills, so beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed based on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc. into a unified platform. By employing a Virtual Globe, users can intuitively visualize raw data and results produced by GeoChaining. All of these features allow users to easily start using GWSC, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSC.
Donato, David I.; Shapiro, Jason L.
2016-12-13
An effort to build a unified collection of geospatial data for use in land-change modeling (LCM) led to new insights into the requirements and challenges of building an LCM data infrastructure. A case study of data compilation and unification for the Richmond, Va., Metropolitan Statistical Area (MSA) delineated the problems of combining and unifying heterogeneous data from many independent localities such as counties and cities. The study also produced conclusions and recommendations for use by the national LCM community, emphasizing the critical need for simple, practical data standards and conventions for use by localities. This report contributes an uncopyrighted core glossary and a much-needed operational definition of data unification.
NASA Astrophysics Data System (ADS)
Ratnasari, Nila; Dwi Candra, Erika; Herdianta Saputra, Defa; Putra Perdana, Aji
2016-11-01
Urban development in Indonesia is increasing significantly, in line with the rapid development of infrastructure, utilities, and transportation networks. Today, people's lives depend on lights at night and on social media, and these two aspects can depict urban spatial pattern and interaction. This research used nighttime remote sensing data from the VIIRS (Visible Infrared Imaging Radiometer Suite) day-night band, which detects lights, gas flares, auroras, and wildfires. Geo-social media information derived from Twitter data gives a big picture of spatial interaction from the geospatial footprint. Combining both data sources produced a comprehensive view of urban spatial pattern and interaction for Indonesian territory in general. The result is presented as a preliminary study of integrating nighttime remote sensing data and geospatial footprints from Twitter data.
Abu Dhabi Basemap Update Using the LiDAR Mobile Mapping Technology
NASA Astrophysics Data System (ADS)
Alshaiba, Omar; Amparo Núñez-Andrés, M.; Lantada, Nieves
2016-04-01
Mobile LiDAR systems provide a new technology which can be used to update geospatial information by direct and rapid data collection. This technology is faster than traditional survey methods and has a lower cost. The Abu Dhabi Municipal System aims to update its geospatial system frequently, as government entities have invested heavily in GIS technology and geospatial data to keep pace with the rapid growth in infrastructure and construction projects in recent years. Because the Emirate of Abu Dhabi has witnessed such huge growth, it is necessary to develop and update its basemap system frequently to meet organizational needs. Currently, the basemap system is updated using traditional methods such as human surveyors with GPS receivers and controllers (GPS-assigned computers). The surveyed data are then downloaded, edited and reviewed manually before they are merged into the basemap system. Traditional surveying methods may not be applicable in some conditions, such as bad weather, difficult topographic areas and boundary areas. This paper presents a proposed methodology which uses a mobile LiDAR system to update the basemap in Abu Dhabi through daily transaction services. It aims to integrate mobile LiDAR technology into the municipality's daily workflow such that it becomes the new standard, cost-efficient operating procedure for updating the basemap in the Abu Dhabi Municipal System. The paper also demonstrates the results of the new workflow for updating the basemap using the mobile LiDAR point cloud and a few processing algorithms.
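A basemap-update workflow of this kind typically starts by reading the mobile LiDAR point cloud and separating candidate asset points from the ground. The sketch below shows such a pre-processing step; laspy, the file name and the 2 m threshold are assumptions for illustration, not the municipality's actual procedure.

```python
# Minimal sketch of reading a mobile LiDAR point cloud and keeping only points
# well above a crude ground-level estimate. File name and threshold are
# placeholders; this is not the workflow used in Abu Dhabi.
import laspy
import numpy as np

las = laspy.read("survey_tile.las")          # placeholder file name
xyz = np.vstack((las.x, las.y, las.z)).T     # N x 3 array of coordinates

ground_z = np.percentile(xyz[:, 2], 5)                 # crude ground-level estimate
above_ground = xyz[xyz[:, 2] > ground_z + 2.0]         # candidate building/asset points

print(f"{len(above_ground)} of {len(xyz)} points lie more than 2 m above ground")
```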
NASA Astrophysics Data System (ADS)
Peltz-Lewis, L. A.; Blake-Coleman, W.; Johnston, J.; DeLoatch, I. B.
2014-12-01
The Federal Geographic Data Committee (FGDC) is designing a portfolio management process for 193 geospatial datasets contained within the 16 topical National Spatial Data Infrastructure themes managed under OMB Circular A-16, "Coordination of Geographic Information and Related Spatial Data Activities." The 193 datasets are designated as National Geospatial Data Assets (NGDA) because of their significance to the missions of multiple levels of government, partners and stakeholders. As a starting point, the data managers of these NGDAs will conduct a baseline maturity assessment of the dataset(s) for which they are responsible. Maturity is measured against benchmarks related to each of the seven stages of the data lifecycle management framework promulgated within the OMB Circular A-16 Supplemental Guidance issued by OMB in November 2010. This framework was developed by the interagency Lifecycle Management Work Group (LMWG), consisting of 16 Federal agencies, under the 2004 Presidential Initiative the Geospatial Line of Business, using OMB Circular A-130, "Management of Federal Information Resources," as guidance. The seven lifecycle stages are: Define, Inventory/Evaluate, Obtain, Access, Maintain, Use/Evaluate, and Archive. This paper will focus on the Lifecycle Baseline Maturity Assessment and on efforts to integrate the FGDC approach with other data maturity assessments.
ERIC Educational Resources Information Center
Metoyer, Sandra; Bednarz, Robert
2017-01-01
This article provides a description and discussion of an exploratory research study that examined the effects of using geospatial technology (GST) on high school students' spatial skills and spatial-relations content knowledge. It presents results that support the use of GST to teach spatially dependent content. It also provides indication of an…
NASA Astrophysics Data System (ADS)
Wodajo, Bikila Teklu
Every year, coastal disasters such as hurricanes and floods claim hundreds of lives and severely damage homes, businesses, and lifeline infrastructure. This research was motivated by the 2005 Hurricane Katrina disaster, which devastated the Mississippi and Louisiana Gulf Coast. The primary objective was to develop a geospatial decision-support system for extracting built-up surfaces and estimating disaster impacts using spaceborne remote sensing satellite imagery. Pre-Katrina 1-m Ikonos imagery of a 5 km x 10 km area of Gulfport, Mississippi, was used as source data to develop the built-up area and natural surfaces (BANS) classification methodology. Autocorrelation values of 0.6 or higher for the spectral reflectance of groundtruth pixels were used to select spectral bands and establish the BANS decision criteria of unique ranges of reflectance values. Surface classification results using GeoMedia Pro geospatial analysis for Gulfport sample areas, based on the BANS criteria and manually drawn polygons, were within +/-7% of the groundtruth. The difference between the BANS results and the groundtruth was not statistically significant. BANS is a significant improvement over other supervised classification methods, which yielded only 50% correctly classified pixels. The storm debris and erosion estimation (SDE) methodology was developed from analysis of pre- and post-Katrina surface classification results of the Gulfport samples. The SDE severity-level criteria considered hurricane and flood damages and the vulnerability of the inhabited built-environment. A linear regression model, with a Pearson R-value of +0.93, was developed for predicting SDE as a function of pre-disaster percent built-up area. SDE predictions for the Gulfport sample areas, used for validation, were within +/-4% of the calculated values. The damage cost model considered maintenance, rehabilitation and reconstruction costs related to infrastructure damage and community impacts of Hurricane Katrina. The developed models were implemented for a study area along I-10, considering the predominantly flood-induced damages in New Orleans. The BANS methodology was calibrated for 0.6-m QuickBird2 multispectral imagery of the Karachi Port area in Pakistan. The results were accurate within +/-6% of the groundtruth. Due to its computational simplicity, the unit hydrograph method is recommended for geospatial visualization of surface runoff in the built-environment using BANS surface classification maps and elevation data. Keywords: geospatial analysis, satellite imagery, built-environment, hurricane, disaster impacts, runoff.
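The form of the SDE model (a linear fit of severity against pre-disaster percent built-up area) can be illustrated with a short fitting procedure. The values below are invented, not the study's data; only the fitting mechanics with numpy are shown.

```python
# Illustrative fit of the kind of linear model described: SDE as a function of
# pre-disaster percent built-up area. The sample values are invented and only
# demonstrate the fitting procedure.
import numpy as np

percent_built_up = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 85.0])  # hypothetical
sde_index        = np.array([ 4.0, 11.0, 17.0, 22.0, 30.0, 36.0])  # hypothetical

slope, intercept = np.polyfit(percent_built_up, sde_index, deg=1)
r = np.corrcoef(percent_built_up, sde_index)[0, 1]

print(f"SDE = {slope:.2f} * built_up% + {intercept:.2f}  (Pearson R = {r:.2f})")
predicted = slope * 50.0 + intercept   # prediction for a 50% built-up sample area
print(f"Predicted SDE at 50% built-up: {predicted:.1f}")
```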
Olugasa, B O
2014-12-01
The World-Wide-Web as a contemporary means of information sharing offers a platform for geospatial information dissemination to improve education about spatio-temporal patterns of disease spread at the human-animal-environment interface in developing countries of West Africa. In assessing the quality of exposure to geospatial information applications among students in five purposively selected institutions in West Africa, this study reviewed course contents and postgraduate programmes in zoonoses surveillance. Geospatial information content and associated practical exercises in zoonoses surveillance were scored. Seven criteria were used to categorize and score capability, namely: spatial data capture; thematic map design and interpretation; spatio-temporal analysis; remote sensing of data; statistical modelling; the management of spatial data profiles; and web-based map sharing operation within an organization. These criteria were used to compute weighted exposure during training at the institutions. A categorical description of the institution with the highest computed Cumulative Exposure Point Average (CEPA) was based on an illustration using retrospective records of rabies cases, drawing on data from humans, animals and the environment sourced from Grand Bassa County, Liberia, to create and share maps and information with faculty, staff, students and the neighbourhood about animal bite injury surveillance and the spatial distribution of rabies-like illness. Uniformly low CEPA values (0-1.3) were observed across academic departments. The highest (3.8) was observed at the Centre for Control and Prevention of Zoonoses (CCPZ), University of Ibadan, Nigeria, where geospatial techniques were systematically taught, and thematic and predictive maps were produced and shared online with other institutions in West Africa. In addition, a short course in zoonosis surveillance, which offers inclusive learning in geospatial applications, is taught at CCPZ. The paper presents a graded capability for geospatial data capture and analysis, and an emerging sustainable map pavilion dedicated to zoonoses surveillance training among collaborating institutions in West Africa.
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based and designed to work on a single computer, which imposes major limitations in many ways, starting from limited processing power, storage, accessibility, and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources, based on free and open source software, open standards and prototype code, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application provides a framework for developing specialized cloud geospatial applications that require only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of the services.
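One of the three web services (here the data infrastructure, DI, service) could be exposed as a small HTTP API roughly as sketched below. Flask, the routes, and the in-memory store are assumptions made for illustration; the actual application defines its own service interfaces.

```python
# Minimal sketch of a browser-accessible geospatial web service of the kind the
# hybrid-cloud application exposes. Routes and storage are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)
features = {}        # in-memory stand-in for the geospatial data store

@app.route("/features", methods=["POST"])
def add_feature():
    """Store a geospatial object posted by a collaborating user."""
    feature = request.get_json()
    fid = str(len(features) + 1)
    features[fid] = feature
    return jsonify({"id": fid}), 201

@app.route("/features/<fid>", methods=["GET"])
def get_feature(fid):
    """Return a stored geospatial object so other users can visualize it."""
    return jsonify(features.get(fid, {}))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)   # e.g. one service per VM in the hybrid setup
```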
Marine vessels as substitutes for heavy-duty trucks in Great Lakes freight transportation.
Comer, Bryan; Corbett, James J; Hawker, J Scott; Korfmacher, Karl; Lee, Earl E; Prokop, Chris; Winebrake, James J
2010-07-01
This paper applies a geospatial network optimization model to explore environmental, economic, and time-of-delivery tradeoffs associated with the application of marine vessels as substitutes for heavy-duty trucks operating in the Great Lakes region. The geospatial model integrates U.S. and Canadian highway, rail, and waterway networks to create an intermodal network and characterizes this network using temporal, economic, and environmental attributes (including emissions of carbon dioxide, particulate matter, carbon monoxide, sulfur oxides, volatile organic compounds, and nitrogen oxides). A case study evaluates tradeoffs associated with containerized traffic flow in the Great Lakes region, demonstrating how choice of freight mode affects the environmental performance of movement of goods. These results suggest opportunities to improve the environmental performance of freight transport through infrastructure development, technology implementation, and economic incentives.
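The mode-choice tradeoff explored by the geospatial network optimization model can be illustrated with a toy intermodal graph in which each edge carries cost, time and emissions attributes and the routing objective decides which mode wins. The nodes, attribute values and library choice (networkx) below are invented for illustration; they are not the authors' model or data.

```python
# Toy intermodal network: truck leg vs. marine leg with drayage at each end.
# Edge attributes and values are invented; only the routing idea is shown.
import networkx as nx

G = nx.DiGraph()
G.add_edge("Chicago", "Detroit",  mode="truck", cost=900, hours=5,  co2=1.2)
G.add_edge("Chicago", "Port_CHI", mode="dray",  cost=150, hours=1,  co2=0.2)
G.add_edge("Port_CHI", "Port_DET", mode="ship", cost=400, hours=30, co2=0.4)
G.add_edge("Port_DET", "Detroit",  mode="dray", cost=150, hours=1,  co2=0.2)

# The optimal route changes with the objective: cost, delivery time, or emissions
for objective in ("cost", "hours", "co2"):
    path = nx.shortest_path(G, "Chicago", "Detroit", weight=objective)
    total = nx.shortest_path_length(G, "Chicago", "Detroit", weight=objective)
    print(f"minimize {objective}: {' -> '.join(path)} (total {total})")
```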
The Diverse Data, User Driven Services and the Power of Giovanni at NASA GES DISC
NASA Technical Reports Server (NTRS)
Shen, Suhung
2017-01-01
This presentation provides an overview of remote sensing and model data at the GES (Goddard Earth Sciences) DISC (Data and Information Services Center); an overview of data services at GES DISC (registration with the NASA data system; searching and downloading data); Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), an online data exploration tool; and the NASA Earth Data and Information System.
Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri
2017-06-01
Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation.
Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS
NASA Astrophysics Data System (ADS)
Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.
In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.
Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data
NASA Astrophysics Data System (ADS)
Sibolla, B. H.; Van Zyl, T.; Coetzee, S.
2016-06-01
Geospatial data has very specific characteristics that need to be carefully captured in its visualisation in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade in response to various visualisation challenges. During the development of an open source based, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that either an existing taxonomy could be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction in order to assist the user in understanding data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.
Estes, John; Belward, Alan; Loveland, Thomas; Scepan, Joseph; Strahler, Alan H.; Townshend, John B.; Justice, Chris
1999-01-01
This paper focuses on the lessons learned in the conduct of the International Geosphere-Biosphere Programme's Data and Information System (IGBP-DIS) global 1-km Land-Cover Mapping Project (DISCover). There is still considerable fundamental research to be conducted dealing with the development and validation of thematic geospatial products derived from a combination of remotely sensed and ancillary data. Issues include database and data product development, classification legend definitions, processing and analysis techniques, and sampling strategies. A significant infrastructure is required to support an effort such as DISCover. The infrastructure put in place under the auspices of the IGBP-DIS serves as a model, and must be put in place to enable replication and development of projects such as DISCover.
Incorporating Resilience into Transportation Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connelly, Elizabeth; Melaina, Marc
To aid decision making for developing transportation infrastructure, the National Renewable Energy Laboratory has developed the Scenario Evaluation, Regionalization and Analysis (SERA) model. The SERA model is a geospatially and temporally oriented model that has been applied to determine optimal production and delivery scenarios for hydrogen, given resource availability and technology cost and performance, for use in fuel cell vehicles. In addition, the SERA model has been applied to plug-in electric vehicles.
Spatial Information in local society's cultural conservation and research
NASA Astrophysics Data System (ADS)
Jang, J.-J.; Liao, H.-M.; Fan, I.-C.
2015-09-01
The Center for Geographic Information Science at the Research Center for Humanities and Social Sciences, Academia Sinica (GIS Center), coordinates short-, medium-, and long-term operations of multidisciplinary research focusing on related topics in the sciences and humanities. Based on the requirements of multidisciplinary research applications, it sustains the collection and construction of unified spatial base data and knowledge and the building of a spatial data infrastructure. Since the 1990s, the GIS Center has built two geographic information platforms: "Chinese Civilization in Time and Space" (CCTS) and "Taiwan History and Culture in Time and Space" (THCTS). The goal of both systems is to construct an integrated GIS-based application infrastructure covering the spatial extent of China and Taiwan, the timeframe of Chinese and Taiwanese history, and the contents of Chinese and Taiwanese civilization. Based on THCTS, we began to build the Cultural Resources GIS (CRGIS, http://crgis.rchss.sinica.edu.tw) in 2006 to collect temples, historic monuments, historic buildings, old trees, wind lion gods, and other cultural resources in Taiwan, and to provide a platform for volunteers to add, edit, organize, and query data on all types of tangible and intangible cultural resources via a Content Management System (CMS). CRGIS has aggregated 13,000 temples and 4,900 churches. On this basis, a variety of religious belief maps have been drawn: temple distributions over multiple periods, distributions of different main gods, and church distributions, such as Mazu maps and temple distribution maps for multiple periods (before 1823, 1823-1895, 1895-1949, 1949-2015) in the Taijiang inner sea area of Tainan. In Taiwan, there is a religious ritual consisting of folk activities lasting from one day to several days, covering a specific geospatial range and passing through particular temples or houses. This important folk activity, somewhat similar to a Western parade, is called "raojing"; its main spirit is to pass through these ranges so that the blessing reaches the people within them, and many scholars' and academic experts' folk research depends on such spatial information. In 2012, the GIS Center applied WebGIS and GPS to gather spatial information on raojing activities in cooperation with multiple units, aggregating seven sessions over 22 days during which 442 temples were passed. An atlas, "Atlas of the 2012 Religious Processions in the Tainan Region," was also published in 2014. We also used national cultural resource data from the relevant government authorities and, through metadata design and data processing (geocoding), established cultural geospatial and thematic information, including data on 800 monuments, 1,100 historic buildings, and 4,300 old trees. In recent years, based on CRGIS technology and operational concepts, different domain experts and local culture-history research workers/teams have cooperated with us to establish local or thematic material and cultural resources.
For example, in collaboration with local culture-history research workers in Kinmen County in 2012, we built a dataset of Kinmen intangible cultural assets, the Wind Lion Gods, setting up metadata and building attribute data and maps for 122 wind lion gods through field survey; it is worth mentioning that the completeness of this fieldwork data exceeds the official registration data from Kinmen National Park by more than 40 wind lion gods. In 2013, we cooperated with academic experts to establish attribute data and a map of theatres in Taiwan during the Japanese colonial era, a total of 170 theatres. We also cooperated with a Japanese scholar, using his 44 detailed field survey records of sugar refineries in Taiwan during the Japanese colonial era, to produce a sugar refinery distribution map and extend it to a thematic website (http://map.net.tw/), "The Cultural Heritage Maps of Taiwan Sugar Factories in a Century," following the CRGIS independent cultural concept. The deployment and operation of CRGIS is meaningful not only for building thematic GIS systems but also for embodying the concepts of Open Data, Wikipedia-style collaboration, and public participation; we provide an interactive platform combining cultural resource data with geospatial technology. In addition to providing reference materials for local cultural education and local cultural recognition, we further cooperate with scholars, academic experts, and local culture-history research workers/teams who have accumulated rich records of the past and research results, using spatial database planning, data processing (e.g. geocoding), field surveys, and the overlay of geospatial materials to compile cultural information and build value-added applications.
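The geocoding step mentioned above, which turns cultural-resource records into mappable point data, can be sketched as follows. geopy with the Nominatim geocoder and the sample record are assumptions chosen for illustration; they are not necessarily the tools or data used by CRGIS.

```python
# Simple sketch of geocoding a cultural-resource record into a map point.
# geopy/Nominatim is one possible geocoder, used here only for illustration.
from geopy.geocoders import Nominatim

geolocator = Nominatim(user_agent="crgis-example")   # identify your client
record = {"name": "Example Temple", "address": "Tainan, Taiwan"}  # hypothetical record

location = geolocator.geocode(record["address"])
if location is not None:
    record["lon"], record["lat"] = location.longitude, location.latitude
    print(record["name"], record["lat"], record["lon"])
```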
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data among the federal level (with data from BTS/DOT), the state level (through VDOT), and industry (through Intergraph). CEOSR develops WFS solutions using Intergraph software; relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project presents: 1) the development and deployment of an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by the OGC Geography Markup Language (GML) Version 3.0 or higher; 3) the integration of the OGC WFS with CEOSR's clearinghouse nodes; 4) a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) WFS-based solutions and technical documents developed using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient for sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhanced the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
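From the data user's side, retrieving framework features from such a WFS 1.1 endpoint looks roughly like the sketch below. OWSLib, the endpoint URL and the layer name are placeholders for illustration; the operational service publishes its own feature type names.

```python
# Client-side sketch of retrieving transportation framework features as GML
# from a WFS 1.1 endpoint. URL and type name are hypothetical placeholders.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService("https://example.org/wfs", version="1.1.0")  # placeholder
print(list(wfs.contents))        # advertised feature types (framework themes)

# Request road features inside a bounding box; the response is a GML stream
gml = wfs.getfeature(typename=["framework:roads"],        # hypothetical layer name
                     bbox=(-77.6, 38.6, -77.0, 39.0),
                     maxfeatures=50)
print(gml.read()[:200])          # first part of the GML 3 response
```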
NASA Astrophysics Data System (ADS)
Rosinski, A.; Beilin, P.; Colwell, J.; Hornick, M.; Glasscoe, M. T.; Morentz, J.; Smorodinsky, S.; Millington, A.; Hudnut, K. W.; Penn, P.; Ortiz, M.; Kennedy, M.; Long, K.; Miller, K.; Stromberg, M.
2015-12-01
The Clearinghouse provides emergency management and response professionals and the scientific and engineering communities with prompt information on ground failure, structural damage, and other consequences from significant seismic events such as earthquakes or tsunamis. Clearinghouse activations include participation from Federal, State and local government, law enforcement, fire, EMS, emergency management, public health, environmental protection, the military, public and non-governmental organizations, and the private sector. For the August 24, 2014 South Napa earthquake, over 100 people from 40 different organizations participated during the 3-day Clearinghouse activation. Every organization has its own role and responsibility in disaster response; however, all require authoritative data about the disaster for rapid hazard assessment and situational awareness. The Clearinghouse has been proactive in fostering collaboration and sharing Essential Elements of Information across disciplines. The Clearinghouse-led collaborative promotes the use of standard formats and protocols to allow existing technology to transform data into meaningful incident-related content and to enable data to be used by the largest number of participating Clearinghouse partners, thus providing responding personnel with enhanced real-time situational awareness, rapid hazard assessment, and more informed decision-making in support of response and recovery. The Clearinghouse efforts address national priorities outlined in USGS Circular 1242, the Plan to Coordinate NEHRP Post-Earthquake Investigations, and in S. 740, the Geospatial Data Act of 2015 (Sen. Orrin Hatch, R-UT), which aims to streamline and coordinate geospatial data infrastructure, maximizing geospatial data in support of the Robert T. Stafford Act. Finally, the US Department of Homeland Security, Geospatial Management Office, recognized the Clearinghouse's data-sharing efforts as a Best Practice to be included in the forthcoming 2015 HLS Geospatial Concept of Operations.
NASA Astrophysics Data System (ADS)
Xie, Jibo; Li, Guoqing
2015-04-01
Earth observation (EO) data obtained by air-borne or space-borne sensors are heterogeneous and geographically distributed in storage. These data sources belong to different organizations or agencies whose data management and storage methods differ considerably, and each source provides its own publishing platform or portal. With more remote sensing sensors used for EO missions, different space agencies now maintain massive, distributed EO data archives. This distribution of EO data archives and the heterogeneity of the systems make it difficult to use geospatial data efficiently for many EO applications, such as hazard mitigation. To solve the interoperability problems of different EO data systems, this paper introduces an advanced architecture for a distributed geospatial data infrastructure that addresses the complexity of distributed, heterogeneous EO data integration and on-demand processing. The concept and architecture of a geospatial data service gateway (GDSG) is proposed to connect heterogeneous EO data sources so that EO data can be retrieved and accessed through unified interfaces. The GDSG consists of a set of tools and services that encapsulate heterogeneous geospatial data sources into homogeneous service modules. The GDSG modules include EO metadata harvesters and translators, adaptors for different types of data systems, unified data query and access interfaces, EO data cache management, and a gateway GUI. The GDSG framework is used to implement interoperability and synchronization between distributed EO data sources with heterogeneous architectures. An on-demand distributed EO data platform was developed to validate the GDSG architecture and implementation techniques, using several distributed EO data archives for testing. Flood and earthquake response serve as two scenarios for the use cases of distributed EO data integration and interoperability.
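The gateway idea can be sketched as an adaptor pattern: each heterogeneous archive is wrapped behind a common query interface that the gateway fans out to. The sketch below is illustrative only (not the authors' code), and all class, method and identifier names are hypothetical.

```python
# Illustrative adaptor-pattern sketch of a geospatial data service gateway (GDSG):
# heterogeneous EO archives exposed through one unified search interface.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class EOProduct:
    identifier: str
    sensor: str
    bbox: Tuple[float, float, float, float]  # (min_lon, min_lat, max_lon, max_lat)
    acquisition_date: str


class ArchiveAdaptor(ABC):
    """Encapsulates one heterogeneous EO data source behind a common interface."""

    @abstractmethod
    def search(self, bbox, start, end) -> List[EOProduct]:
        ...


class AgencyAAdaptor(ArchiveAdaptor):
    def search(self, bbox, start, end):
        # A real adaptor would translate the query into the agency's own
        # protocol (OpenSearch, CSW, FTP listing, ...) and harvest metadata.
        return [EOProduct("A-001", "optical", bbox, start)]


class AgencyBAdaptor(ArchiveAdaptor):
    def search(self, bbox, start, end):
        return [EOProduct("B-042", "SAR", bbox, end)]


class GeospatialDataServiceGateway:
    def __init__(self, adaptors: List[ArchiveAdaptor]):
        self.adaptors = adaptors

    def unified_search(self, bbox, start, end) -> List[EOProduct]:
        results = []
        for adaptor in self.adaptors:
            results.extend(adaptor.search(bbox, start, end))
        return results


gateway = GeospatialDataServiceGateway([AgencyAAdaptor(), AgencyBAdaptor()])
for product in gateway.unified_search((100.0, 20.0, 110.0, 30.0), "2014-01-01", "2014-01-31"):
    print(product)
```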
Baloye, David O.
2016-01-01
The understanding and institutionalisation of the seamless link between urban critical infrastructure and disaster management has greatly helped the developed world to establish effective disaster management processes. However, this link is conspicuously missing in developing countries, where disaster management has been more reactive than proactive. The consequence of this is typified in poor response times and the uncoordinated ways in which disasters and emergency situations are handled. As is the case with many Nigerian cities, the challenges of urban development in the city of Abeokuta have limited the effectiveness of disaster and emergency first responders and managers. Using geospatial techniques, the study attempted to design and deploy a spatial database running a web-based information system to track the characteristics and distribution of critical infrastructure for effective use during disasters and emergencies, with the purpose of proactively improving disaster and emergency management processes in Abeokuta.
World Wind: NASA's Virtual Globe
NASA Astrophysics Data System (ADS)
Hogan, P.
2007-12-01
Virtual globes have set the standard for information exchange. Once you've experienced the visually rich and highly compelling nature of data delivered via virtual globes with their highly engaging context of 3D, it's hard to go back to a flat 2D world. Just as the sawbones of not-too-long-ago have given way to sophisticated surgical operating theater, today's medium for information exchange is just beginning to leap from the staid chalkboards and remote libraries to fingertip navigable 3D worlds. How we harness this technology to serve a world inundated with information will describe the quality of our future. Our instincts for discovery and entertainment urge us on. There's so much we could know if the world's knowledge was presented to us in its natural context. Virtual globes are almost magical in their ability to reveal natural wonders. Anyone flying along a chain of volcanoes, a mid-ocean ridge or deep ocean trench, while simultaneously seeing the different depths to the history of earthquakes in those areas, will be delighted to sense Earth's dynamic nature in a way that would otherwise take several paragraphs of "boring" text. The sophisticated concepts related to global climate change would be far more comprehensible when experienced via a virtual globe. There is a large universe of public and private geospatial data sets that virtual globes can bring to light. The benefit derived from access to this data within virtual globes represents a significant return on investment for government, industry, the general public, and especially in the realm of education. Data access remains a key issue. Just as the highway infrastructure allows unimpeded access from point A to point B, an open standards-based infrastructure for data access allows virtual globes to exchange data in the most efficient manner possible. This data can be either free or proprietary. The Open Geospatial Consortium is providing the leadership necessary for this open standards-based data access infrastructure. The open-source community plays a crucial role in advancing virtual globe technology. This world community identifies, tracks and resolves technical problems, suggests new features and source code modifications, and often provides high-resolution data sets and other types of user-generated content, all while extending the functionality of virtual globe technology. NASA World Wind is one example of open source virtual globe technology that provides the world with the ability to build any desired functionality and make any desired data accessible.
Soil Monitor: an open source web application for real-time soil sealing monitoring and assessment
NASA Astrophysics Data System (ADS)
Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio
2016-04-01
Soil sealing is one of the most important causes of land degradation and desertification. In Europe, the area of soil covered by impermeable materials has increased by about 80% since the Second World War, while the population has grown by only one third. There is increasing concern at high political levels about the need to attenuate imperviousness and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) by which net land take should reach zero by 2050, and in 2011 also published a report providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed an open-source-based Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) named "Soil Monitor". This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion in order to provide real-time assessments of soil sealing at high spatial resolution (20 meters and below) over the whole of Italy. Common open source webGIS packages, such as GeoServer and MapStore, are used to implement both the data management and visualization infrastructures. High-speed geospatial computation is ensured by GPU parallelism using the CUDA (Compute Unified Device Architecture) framework by NVIDIA®. This kind of parallelism required writing from scratch all the code needed for the geospatial computation behind the soil sealing toolbox. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention to the Java-CUDA programming interface. As a result, Soil Monitor can perform very time-consuming calculations (for instance, querying an entire Italian administrative region as the area of interest) in less than one minute. The web application runs in a web browser and nothing must be installed before using it. Potentially anybody can use it, but the main targets are stakeholders dealing with sealing, such as policy makers, land owners and asphalt/cement companies. Soil Monitor can be used to improve spatial planning and thereby limit the progression of disordered soil sealing, which causes both the direct loss of soils due to imperviousness and the indirect loss caused by fragmentation of soils (which has various negative effects on the durability of soil functions, such as habitat corridors). Further, in a future version, Soil Monitor could estimate the best location for a new building or help compensate for soil losses through actions in other areas so that the net drawback is offset to zero. The presented SS-GCI dealing with soil sealing, if opportunely scaled, would aid the implementation of best practices for limiting soil sealing or mitigating its effects on soil functions.
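The abstract refers to Java-CUDA kernels behind the soil sealing toolbox. As an illustrative analogue only (assumed, not the project's code, and using Python/Numba rather than the Java-CUDA interface named above), the sketch below shows how a GPU kernel could reduce a rasterised sealing map to a sealed fraction over an area of interest.

```python
# Illustrative GPU reduction of a rasterised sealing map with Numba/CUDA.
# Data, thresholds and the area-of-interest mask are synthetic.
import numpy as np
from numba import cuda


@cuda.jit
def sealed_count_kernel(sealing, aoi_mask, counters):
    # counters[0] = sealed cells inside AOI, counters[1] = total cells inside AOI
    i = cuda.grid(1)
    if i < sealing.size:
        if aoi_mask[i]:
            cuda.atomic.add(counters, 1, 1)
            if sealing[i] > 0:
                cuda.atomic.add(counters, 0, 1)


# Synthetic 20 m resolution tile: 1 = impervious, 0 = pervious
sealing = np.random.randint(0, 2, size=4_000_000).astype(np.int32)
aoi_mask = np.random.rand(sealing.size) < 0.25   # hypothetical region query
counters = np.zeros(2, dtype=np.int64)

threads = 256
blocks = (sealing.size + threads - 1) // threads
sealed_count_kernel[blocks, threads](sealing, aoi_mask, counters)

print("sealed fraction in AOI:", counters[0] / counters[1])
```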
A study on state of Geospatial courses in Indian Universities
NASA Astrophysics Data System (ADS)
Shekhar, S.
2014-12-01
Today the world is dominated by three technologies: nanotechnology, biotechnology and geospatial technology. This creates a huge demand for experts in each field, both for disseminating knowledge and for innovative research. The prime need, therefore, is to train the existing fraternity to gain progressive knowledge of these technologies and impart the same to the student community. Geospatial technology faces a more peculiar problem than the other two because of its interdisciplinary, multi-disciplinary nature. It attracts students and mid-career professionals from various disciplines including Physics, Computer Science, Engineering, Geography, Geology, Agriculture, Forestry, Town Planning and so on; hence there is constant competition to grab and stabilize a position. Students of Master's degrees in geospatial science face two types of problem. The first is the lack of a unique identity in the academic field: they are neither exempted from the National Eligibility Test for lectureship nor given the opportunity to take the exam in geospatial science. The second is differential treatment by industry: the students are either given low-grade jobs or poorly paid for their work. Thus, the future of this course in the universities and its recognition in the academic and industrial worlds is a serious issue. Universities should make the course more job-oriented in consultation with industry, and industry should come forward to share its demands and requirements with the universities, so that the necessary changes in the curriculum can be made to meet industrial requirements.
Open cyberGIS software for geospatial research and education in the big data era
NASA Astrophysics Data System (ADS)
Wang, Shaowen; Liu, Yan; Padmanabhan, Anand
CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) that serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities, CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.
Streamlining geospatial metadata in the Semantic Web
NASA Astrophysics Data System (ADS)
Fugazza, Cristiano; Pepe, Monica; Oggioni, Alessandro; Tagliolato, Paolo; Carrara, Paola
2016-04-01
In the geospatial realm, data annotation and discovery rely on a number of ad hoc formats and protocols. These have been created to enable domain-specific use cases for which generalized search is not feasible. Metadata are at the heart of the discovery process; nevertheless, they are often neglected or encoded in formats that either are not aimed at efficient retrieval of resources or are plainly outdated. In particular, the quantum leap represented by the Linked Open Data (LOD) movement has not so far induced a consistent, interlinked baseline in the geospatial domain. In a nutshell, datasets, the scientific literature related to them, and ultimately the researchers behind these products are only loosely connected; the corresponding metadata are intelligible only to humans, duplicated on different systems, and seldom consistent. Instead, our workflow for metadata management envisages i) editing via customizable web-based forms, ii) encoding of records in any XML application profile, iii) translation into RDF (involving the semantic lift of metadata records), and finally iv) storage of the metadata as RDF and back-translation into the original XML format with added semantics-aware features. Phase iii) hinges on relating resource metadata to RDF data structures that represent keywords from code lists and controlled vocabularies, toponyms, researchers, institutes, and virtually any description one can retrieve (or directly publish) in the LOD Cloud. In the context of a distributed Spatial Data Infrastructure (SDI) built on free and open-source software, we detail phases iii) and iv) of our workflow for the semantics-aware management of geospatial metadata.
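A minimal sketch of the kind of semantic lift described in phase iii), using rdflib: a keyword and a creator from an XML record are expressed as links to controlled-vocabulary and identity resources rather than free text. The vocabulary namespace, record URI and ORCID are hypothetical.

```python
# Minimal semantic-lift sketch: turn record fields into RDF triples whose
# objects are resolvable LOD resources. All URIs are placeholders.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

VOCAB = Namespace("https://vocab.example.org/keywords/")   # hypothetical code list
DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
dataset = URIRef("https://sdi.example.org/metadata/lake-temperature-2015")  # hypothetical record URI
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Lake surface temperature, 2015")))
# Instead of a free-text keyword, point to a controlled-vocabulary concept:
g.add((dataset, DCAT.theme, VOCAB["surfaceTemperature"]))
# Instead of a plain author string, point to a researcher identifier:
g.add((dataset, DCTERMS.creator, URIRef("https://orcid.org/0000-0000-0000-0000")))

print(g.serialize(format="turtle"))
```

Back-translation into the original XML profile (phase iv) would then re-serialize these triples while keeping the links to the LOD Cloud as added, semantics-aware attributes.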
NASA Astrophysics Data System (ADS)
Shapiro, C. D.
2014-12-01
Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. It describes a world in which data and information are widely and broadly accessible, understandable, and usable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information; this pillar of data democracy is characterized by methods such as citizen science or crowdsourcing. A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information, knowledge that is critical to continued monitoring of the effectiveness of data democracy implementation and of its potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy on both the supply and demand sides. These opportunities range from relatively inexpensive efforts to reduce barriers to use, to the identification of situations in which participation in scientific efforts can be expanded, broadening involvement and reaching non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data democracy not only provides benefits to a greater population, it enhances the value of science.
Monitoring and evaluation of rowing performance using mobile mapping data
NASA Astrophysics Data System (ADS)
Mpimis, A.; Gikas, V.
2011-12-01
Traditionally, the term mobile mapping refers to a means of collecting geospatial data using mapping sensors mounted on a mobile platform. Historically, this process was mainly driven by the need for highway infrastructure mapping and transportation corridor inventories. However, recent advances in mapping sensor and telecommunication technologies are creating the opportunity for completely new, emergent application areas of mobile mapping to evolve rapidly. This article examines the potential of mobile mapping technology (MMT) in sports science, and in particular in competitive rowing. Notably, the working definition of mobile mapping in this study differs from the traditional one in that the end result is not the geospatial information acquired as the moving platform travels in space; instead, the interest lies in the moving platform (the rowing boat) itself and in its various subsystems, which are also in continuous motion.
NASA Astrophysics Data System (ADS)
Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim
2014-05-01
The Energiewende and the increasing scarcity of raw materials will lead to intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Prompted by the development of the European geospatial infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strived to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface for requesting geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is based solely on Free and Open Source Software and relies on the JavaScript API WebGL, which allows the interactive rendering of 2D and 3D graphics through GPU-accelerated physics and image processing as part of the web page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will alleviate spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. The project will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in decision processes (e-Governance). Yet the interoperability of the systems must be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
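As an illustration of the client-server interaction, the sketch below requests a 2D map layer from a GeoServer WMS with OWSLib. The service URL and layer name are placeholders, not the actual GDI-BB endpoints.

```python
# Hedged sketch of a WMS GetMap request against a GeoServer instance using OWSLib.
# The URL, layer name and bounding box are illustrative assumptions.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")  # hypothetical endpoint

img = wms.getmap(
    layers=["b3d:geological_map"],       # hypothetical layer name
    srs="EPSG:4326",
    bbox=(11.2, 51.3, 14.8, 53.6),       # approximate Brandenburg extent
    size=(800, 600),
    format="image/png",
    transparent=True,
)

with open("geological_map.png", "wb") as f:
    f.write(img.read())                  # geo-referenced map image served by the WMS
```

The 3D content (wells, seismic transects, the subsurface model) would be fetched analogously through the W3DS portrayal interface and rendered client-side with WebGL.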
International Conference of Applied Science and Technology for Infrastructure Engineering
NASA Astrophysics Data System (ADS)
Elvina Santoso, Shelvy; Hardianto, Ekky
2017-11-01
Preface: International Conference of Applied Science and Technology for Infrastructure Engineering (ICASIE) 2017. The International Conference of Applied Science and Technology for Infrastructure Engineering (ICASIE) 2017 successfully took place at the Swiss-Bell Inn Hotel, Surabaya, Indonesia, on August 5th 2017, organized by the Department of Civil Infrastructure Engineering, Faculty of Vocation, Institut Teknologi Sepuluh Nopember (ITS). This annual event aims to create synergies between government, the private sector, employers, practitioners, and academics. The conference has a different theme each year, and "MATERIAL FOR INFRASTRUCTURE ENGINEERING" was chosen as this year's main theme. In addition, we also provide a platform for various other sub-theme topics including, but not limited to, Geopolymer Concrete and Materials Technology; Structural Dynamics, Engineering, and Sustainability; Seismic Design and Control of Structural Vibrations; Innovative and Green Buildings; Project Management; Transportation and Highway Engineering; Geotechnical Engineering; Water Engineering and Resources Management; Surveying and Geospatial Engineering; Coastal Engineering; Geophysics; Energy; Electronic and Mechatronic; Industrial Process; and Data Mining. The lists of Organizers, Journal Editors, Steering Committee, International Scientific Committee, Chairman, and Keynote Speakers are available in this pdf.
Concept Maps as a Tool to Analyse College Students' Knowledge of Geospatial Concepts
ERIC Educational Resources Information Center
Oda, Katsuhiko
2016-01-01
This study focused on college students' development of conceptual knowledge in geographic information system (GIS). The aim of this study was to examine if and how students developed their conceptual knowledge during their enrollment in an introductory-level GIS course. Twelve undergraduate students constructed 36 concept maps and revised 24…
Enhancing The National Map Through Tactical Planning and Performance Monitoring
,
2008-01-01
Tactical planning and performance monitoring are initial steps toward improving 'the way The National Map works' and supporting the U.S. Geological Survey (USGS) Science Strategy. This Tactical Performance Planning Summary for The National Map combines information from The National Map 2.0 Tactical Plan and The National Map Performance Milestone Matrix. The National Map 2.0 Tactical Plan is primarily a working document to guide The National Map program's execution, production, and metrics monitoring for fiscal years (FY) 2008 and 2009. The Tactical Plan addresses data, products, and services, as well as supporting and enabling activities. The National Map's 2-year goal for FY 2008 and FY 2009 is to provide a range of geospatial products and services that further the National Spatial Data Infrastructure and underpin USGS science. To do this, the National Geospatial Program will develop a renewed understanding during FY 2008 of key customer needs and requirements, develop the infrastructure to support The National Map business model, modernize its business processes, and reengineer its workforce. Priorities for The National Map will be adjusted if necessary to respond to changes to the project that may impact resources, constrain timeframes, or change customer needs. The supporting and enabling activities that make it possible to produce the products and services of The National Map will include partnership activities, improved compatibility of systems, outreach, and integration of data themes.
Using Participatory Approach to Improve Availability of Spatial Data for Local Government
NASA Astrophysics Data System (ADS)
Kliment, T.; Cetl, V.; Tomič, H.; Lisiak, J.; Kliment, M.
2016-09-01
Nowadays, the availability of authoritative geospatial features for various data themes is becoming wider at global, regional and national levels. The reasons are the legislative frameworks for public sector information and the related spatial data infrastructure implementations, together with the emergence of initiatives such as open data and big data, which ensure that online geospatial information is made available to the digital single market, entrepreneurs and public bodies at both national and local levels. However, authoritative reference spatial data linking the geographic representation of properties to their owners are still missing in appropriate quantity and quality, even though these data represent a fundamental input for local governments regarding the register of buildings used for property tax calculations, the identification of illegal buildings, etc. We propose a methodology to improve this situation by applying the principles of participatory GIS and VGI to collect observations, update authoritative datasets and verify newly developed datasets of building areas used to calculate the property tax rates issued to their owners. The case study was performed within the district of the City of Požega in eastern Croatia in the summer of 2015 and resulted in a total of 16,072 updated or newly identified objects made available online for quality verification by citizens using open source geospatial technologies.
GENESI-DR Portal: a scientific gateway to distributed repositories
NASA Astrophysics Data System (ADS)
Goncalves, Pedro; Brito, Fabrice; D'Andria, Fabio; Cossu, Roberto; Fusco, Luigi
2010-05-01
GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories) is a European Commission (EC)-funded project, kicked off in early 2008 and led by ESA; partners include space agencies (DLR, ASI, CNES), both space and non-space data providers such as ENEA (I), Infoterra (UK), K-SAT (N), NILU (N) and JRC (EU), and industry such as Elsag Datamat (I), CS (F) and TERRADUE (I). GENESI-DR intends to meet the challenge of reducing the "time to science" for different Earth Science disciplines in the discovery, access and use (combining, integrating, processing, ...) of historical and recent Earth-related data from space, airborne and in-situ sensors, which are archived in large distributed repositories. "Discovering" which data are available on a "geospatial web" is one of the main challenges ES scientists have to face today. Some well-known data sets are referred to in many places and available from many sources; for core information with a common purpose, many copies are distributed, e.g., VMap0, Landsat, and SRTM. Other data sets in low or local demand may only be found in a few places and niche communities. Relevant services, results of analysis, applications and tools are accessible in a very scattered and uncoordinated way, often through individual initiatives from Earth Observation mission operators, scientific institutes dealing with ground measurements, service companies or data catalogues. In the discourse of Spatial Data Infrastructures, there are "catalogue services" - directories containing information on where spatial data and services can be found - and, for metadata "records" describing spatial data and services, there are "registries". The geospatial industry coins specifications for search interfaces, where it might do better to reach out to other information retrieval and Internet communities. These considerations are the basis for the GENESI-DR scientific portal, which adopts a simple model allowing the geo-spatial classification and discovery of information as a loosely connected federation of nodes. This network had, however, to be resilient to node failures and able to scale with the growing addition of new information about data and services. The GENESI-DR scientific portal is still evolving as the project deploys the different components amongst the different partners, but the aim is to provide the connection to information, establish rights, access it and in some cases apply algorithms using the computing power available on the infrastructure, all through simple interfaces. As information is discovered in the network, it can be further exploited, filtered or enhanced according to the user's goals. To implement this vision, two specialized graphical interfaces were designed for the portal. The first concentrates on text-based search of information, while the second provides command and control of submission and order status in a distributed processing environment. The text search uses natural language features that extract the spatial and temporal components from the user query. These are then propagated to the nodes by mapping them to OpenSearch extensions, and the results are returned to the user as an aggregated list of resources. These can either be access points to dataset series or services that can be further analysed and processed. At this stage, the user is presented with dedicated interfaces that correspond to the context of the action being performed.
Be it bulk data download, data processing or data mining, the different services offer specialized interfaces that are integrated into the portal. Overall, the GENESI-DR project identifies best practices and a supporting context for the use of a minimal abstract model to loosely connect a federation of Digital Repositories. The apparent lack of cost effectiveness of the Spatial Data Infrastructures effort in developing "catalogue services" is overcome by trimming the use cases to the most common and relevant. The GENESI-DR scientific portal is, as such, the visible front-end of a dedicated infrastructure providing transparent access to information and allowing Earth Science communities to easily and quickly derive objective information and share knowledge across all environmentally sensitive domains.
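A sketch of how a query with spatial and temporal components, as extracted from the user's natural-language input, could be propagated to a repository node via OpenSearch Geo and Time extension parameters. The endpoint and parameter values are assumptions for illustration.

```python
# Hedged sketch of an OpenSearch query using the Geo (bbox) and Time (start/end)
# extension parameters. The catalogue URL is hypothetical.
import requests

OPENSEARCH_URL = "https://catalogue.example.org/opensearch"  # hypothetical node

params = {
    "q": "soil moisture",                 # free-text terms extracted from the user query
    "bbox": "9.0,36.0,19.0,47.0",         # geo:box as west,south,east,north
    "start": "2009-06-01T00:00:00Z",      # time:start
    "end": "2009-08-31T23:59:59Z",        # time:end
    "format": "atom",
}

response = requests.get(OPENSEARCH_URL, params=params, timeout=60)
response.raise_for_status()
print(response.text[:400])  # Atom feed of matching dataset series or services, truncated
```

The aggregated list shown to the portal user is simply the union of such per-node results, which is what keeps the federation loosely coupled.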
A web based spatial decision supporting system for land management and soil conservation
NASA Astrophysics Data System (ADS)
Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.
2015-02-01
Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) but also many land degradation problems, and it has proved extremely difficult to provide operational answers to these demands and problems. Here we aim to demonstrate that a Spatial Decision Support System based on a geospatial cyber-infrastructure (GCI) can embody all of the above, producing a smart system for supporting decision making in agriculture, forestry and urban planning with respect to the landscape. In this paper, we discuss the methods and results of a special kind of GCI architecture, one that is highly focused on soil and land conservation (the SOILCONSWEB-LIFE+ project). The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry and urban planning issues through the web. The system has been applied to and tested in an area of about 20 000 ha in the south of Italy, within the framework of a European LIFE+ project. As a case study, the paper reports results from two different applications dealing with agriculture (an olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists: it shows the potential benefit of overcoming the current disciplinary fragmentation over landscape issues by offering, through a smart web-based system, truly integrated geospatial knowledge that may be directly and freely used by any end user (http://www.landconsultingweb.eu). This may help bridge the last, and most important, divide between scientists working on the landscape and end users.
NASA Astrophysics Data System (ADS)
Simonis, Ingo
2015-04-01
Traditional Spatial Data Infrastructures focus on aspects such as the description and discovery of geospatial data, the integration of these data into processing workflows, and the representation of fusion or other data analysis results. Though many interoperability agreements still need to be worked out to achieve a satisfying level of interoperability within large-scale initiatives such as INSPIRE, new technologies, use cases and requirements are constantly emerging from the user community. This paper focuses on three aspects that have come up recently: the integration of social media data into SDIs, synchronization between datasets used by field workers in shared-resource environments, and the generation and maintenance of data for mixed online/offline situations that can be easily packed, delivered, modified, and synchronized with reference data sets. The work described in this paper results from the latest testbed executed by the Open Geospatial Consortium (OGC). The testbed is part of the Interoperability Program (IP), which constitutes a significant part of the OGC standards development process. The IP has a number of instruments to enhance geospatial standards and technologies, such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Expert Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus-based standards and best practices. The latest global activity, Testbed-11, aims at exploring new technologies and architectural approaches to enrich and extend traditional spatial data infrastructures with data from social media, improved data synchronization, and the capability to take data to the field in new synchronized data containers called GeoPackages. Social media sources are a valuable supplement for providing up-to-date information in distributed environments. Following an uncoordinated crowdsourcing approach, social media data can be both overwhelming in volume and questionable in accuracy and legitimacy. Testbed-11 explores how best to make use of such sources of information and how to deal with the inherent issues of data from platforms such as OpenStreetMap, Twitter, tumblr, flickr, Snapchat, Facebook, Instagram, YouTube, Vimeo, Panoramio, Pinterest, Picasa or Storyful. Further important aspects highlighted here are the synchronization of data and the capability to take complex data sets of any size to the field on mobile devices while keeping them in sync with reference data stores. In emergency management situations in particular, it is crucial to ensure properly synchronized data sets across different types of data stores and applications. Often data are taken to the field on mobile devices, where they get updated or annotated. Though bandwidth continually improves, requirements on data quality and complexity grow in parallel, and intermittent connectivity is paired with high security requirements that have to be fulfilled. This paper discusses the latest approaches using synchronization services and synchronized GeoPackages, the new container format for geospatial data.
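To illustrate the GeoPackage container discussed above, the sketch below writes a handful of field observations to a GeoPackage and reads them back with GeoPandas; the file, layer and attribute names are made up.

```python
# Illustrative round trip through a GeoPackage, the SQLite-based container format
# for taking geospatial data to the field. File and layer names are hypothetical.
import geopandas as gpd
from shapely.geometry import Point

# Field observations collected on a mobile device
gdf = gpd.GeoDataFrame(
    {"name": ["shelter A", "damaged bridge"], "status": ["open", "closed"]},
    geometry=[Point(14.51, 46.05), Point(14.53, 46.07)],
    crs="EPSG:4326",
)

# Pack the data for offline use in the field ...
gdf.to_file("field_pack.gpkg", layer="observations", driver="GPKG")

# ... and read it back, e.g. before synchronising with the reference data store
roundtrip = gpd.read_file("field_pack.gpkg", layer="observations")
print(roundtrip)
```

Because the whole container is a single file, it can be shipped over intermittent connections and reconciled with the reference store once connectivity returns.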
Information technology developments within the national biological information infrastructure
Cotter, G.; Frame, M.T.
2000-01-01
Looking out an office window or exploring a community park, one can easily see the tremendous challenges that biological information presents to the computer science community. Biological information varies in format and content depending on whether it pertains to a particular species (e.g. the Brown Tree Snake) or to a specific ecosystem, which often includes multiple species, land use characteristics, and geospatially referenced information. The complexity and uniqueness of each individual species or ecosystem do not easily lend themselves to today's computer science tools and applications. To address the challenges that the biological enterprise presents, the National Biological Information Infrastructure (NBII) (http://www.nbii.gov) was established in 1993. The NBII is designed to address these issues on a national scale within the United States, and through international partnerships abroad. This paper discusses current computer science efforts within the National Biological Information Infrastructure Program and the future computer science research needed to address the ever-growing issues related to our Nation's biological concerns.
The Effectiveness of a Geospatial Technologies-Integrated Curriculum to Promote Climate Literacy
NASA Astrophysics Data System (ADS)
Anastasio, D. J.; Bodzin, A. M.; Peffer, T.; Sahagian, D. L.; Cirucci, L.
2011-12-01
This study examined the effectiveness of a geospatial technologies-integrated climate change curriculum (http://www.ei.lehigh.edu/eli/cc/) in promoting climate literacy in an urban school district. Five 8th grade Earth and Space Science classes in an urban middle school (Bethlehem, Pennsylvania), consisting of three different ability-level tracks, participated in the study. Data gathering methods included pre/post-test assessments, daily classroom observations, daily teacher meetings, and examination of student-produced artifacts. Data were gathered using a climate change literacy assessment instrument designed to measure students' climate change content knowledge. The items included distractors that address misunderstandings and knowledge deficits about climate change identified in the existing literature. Paired-sample t-test analyses were conducted to compare the pre- and post-test assessment results, and the results of these analyses were used to compare overall gains as well as gains by ability-level track. Overall, use of the climate change curriculum produced significant improvement in urban middle school students' understanding of climate change concepts. Effect sizes were large (ES>0.8) and significant (p<0.001) for the entire assessment and for each ability-level subgroup. Findings from classroom observations, assessments embedded in the curriculum, and the examination of all student artifacts revealed that the use of geospatial technologies enabled middle school students to improve their knowledge of climate change and their spatial thinking and reasoning skills.
Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies
NASA Astrophysics Data System (ADS)
Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.
2007-12-01
The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts to provide unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g. WaterML and the Observation Data Model) for users to access the data (e.g. HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near real time. OGC SWE proposes a revolutionary concept for web-connected and controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration/fusion among heterogeneous sources, data filtering, or support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We report our progress on building such a framework, in which multiple agencies' sensor data and hydro-model outputs (with map layers) are integrated and disseminated in a geospatial browser (e.g. Microsoft Virtual Earth). This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.
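A hedged sketch of pulling observations from an OGC Sensor Observation Service (SOS) endpoint, one of the SWE interfaces mentioned above, using the standard key-value-pair binding. The service URL, offering and observed-property identifiers are hypothetical, not those of the agencies named in the abstract.

```python
# Hedged sketch of an SOS 1.0.0 GetObservation request via the KVP binding.
# Endpoint, offering and property URN are placeholders.
import requests

SOS_ENDPOINT = "https://example.org/sos"   # hypothetical SWE service

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "RIVER_STAGE",                           # hypothetical offering
    "observedProperty": "urn:ogc:def:property:stage",    # hypothetical property URN
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
    "eventTime": "2007-08-01T00:00:00Z/2007-08-02T00:00:00Z",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:400])  # O&M observation document, truncated preview
```

In the proposed framework, responses like this would be routed through the service bus, transformed, and fused with model outputs before being pushed to the geospatial browser.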
NASA Astrophysics Data System (ADS)
Arozarena, A.; Villa, G.; Valcárcel, N.; Pérez, B.
2016-06-01
Remote sensing satellites, together with aerial and terrestrial platforms (mobile and fixed), produce nowadays huge amounts of data coming from a wide variety of sensors. These datasets serve as main data sources for the extraction of Geospatial Reference Information (GRI), constituting the "skeleton" of any Spatial Data Infrastructure (SDI). Since very different situations can be found around the world in terms of geographic information production and management, the generation of global GRI datasets seems extremely challenging. Remotely sensed data, due to its wide availability nowadays, is able to provide fundamental sources for any production or management system present in different countries. After several automatic and semiautomatic processes including ancillary data, the extracted geospatial information is ready to become part of the GRI databases. In order to optimize these data flows for the production of high quality geospatial information and to promote its use to address global challenges several initiatives at national, continental and global levels have been put in place, such as European INSPIRE initiative and Copernicus Programme, and global initiatives such as the Group on Earth Observation/Global Earth Observation System of Systems (GEO/GEOSS) and United Nations Global Geospatial Information Management (UN-GGIM). These workflows are established mainly by public organizations, with the adequate institutional arrangements at national, regional or global levels. Other initiatives, such as Volunteered Geographic Information (VGI), on the other hand may contribute to maintain the GRI databases updated. Remotely sensed data hence becomes one of the main pillars underpinning the establishment of a global SDI, as those datasets will be used by public agencies or institutions as well as by volunteers to extract the required spatial information that in turn will feed the GRI databases. This paper intends to provide an example of how institutional arrangements and cooperative production systems can be set up at any territorial level in order to exploit remotely sensed data in the most intensive manner, taking advantage of all its potential.
Existing Geospatial Knowledge of Gopher Tortoise Population and Abundance
2007-05-01
List of figures and tables (excerpt from the source report): Figure A1. Gopher tortoise Alabama habitat map; Table A3. Alabama data (Federal); Table A4. Alabama data (state).
GeoThentic: Designing and Assessing with Technology, Pedagogy, and Content Knowledge
ERIC Educational Resources Information Center
Doering, Aaron; Scharber, Cassandra; Miller, Charles; Veletsianos, George
2009-01-01
GeoThentic, an online teaching and learning environment, focuses on engaging teachers and learners in solving real-world geography problems through use of geospatial technologies. The design of GeoThentic is grounded on the technology, pedagogy, and content knowledge (TPACK) framework as a metacognitive tool. This paper describes how the TPACK…
Planetary Cartography - Activities and Current Challenges
NASA Astrophysics Data System (ADS)
Nass, Andrea; Di, Kaichang; Elgner, Stephan; van Gasselt, Stephan; Hare, Trent; Hargitai, Henrik; Karachevtseva, Irina; Kereszturi, Akos; Kersten, Elke; Kokhanov, Alexander; Manaud, Nicolas; Roatsch, Thomas; Rossi, Angelo Pio; Skinner, James, Jr.; Wählisch, Marita
2018-05-01
Maps are one of the most important tools for communicating geospatial information between producers and receivers. Geospatial data, tools, contributions in the geospatial sciences, and the communication of information and transmission of knowledge are matters of ongoing cartographic research. This applies to all topics and objects located on Earth or on any other body in our Solar System. In planetary science, cartography and mapping have a history dating back to the roots of telescopic space exploration and are now facing new technological and organizational challenges with the rise of new missions, new global initiatives and organizations, and opening research markets. The focus of this contribution is to introduce the community to the field of planetary cartography and its historic foundation, to highlight some of the organizations involved, and to emphasize the challenges that planetary cartography has to face today and in the near future.
NASA Astrophysics Data System (ADS)
Mansor, S. B.; Pormanafi, S.; Mahmud, A. R. B.; Pirasteh, S.
2012-08-01
In this study, a geospatial model for land use allocation was developed from the view of simulating the biological autonomous adaptability to the environment and the infrastructural preference. The model was developed based on a multi-agent genetic algorithm and was customized to accommodate the constraints set for the study area, namely resource saving and environmental friendliness. The model was then applied to solve practical multi-objective spatial optimization allocation problems of land use in the core region of the Menderjan Basin in Iran. The first task was to study the dominant crops and the economic suitability evaluation of the land. The second task was to determine the fitness function for the genetic algorithm. The third was to optimize the land use map using economic benefits. The results indicate that the proposed model has much better performance for solving complex multi-objective spatial optimization allocation problems and that it is a promising method for generating land use alternatives for further consideration in spatial decision-making.
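A deliberately simplified genetic-algorithm sketch (not the authors' multi-agent model) showing the basic loop of selection, crossover and mutation over chromosomes that assign a land-use class to each cell, with a hypothetical economic-suitability matrix acting as the fitness function.

```python
# Simplified single-objective GA for land-use allocation on a cell grid.
# Suitability scores, sizes and rates are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(0)
N_CELLS, N_CLASSES, POP, GENERATIONS = 100, 3, 40, 200
suitability = rng.random((N_CELLS, N_CLASSES))        # hypothetical economic suitability per cell/class

def fitness(chromosome):
    # Total economic benefit of assigning class chromosome[i] to cell i
    return suitability[np.arange(N_CELLS), chromosome].sum()

population = rng.integers(0, N_CLASSES, size=(POP, N_CELLS))

for _ in range(GENERATIONS):
    scores = np.array([fitness(c) for c in population])
    parents = population[np.argsort(scores)[-POP // 2:]]   # truncation selection: keep the better half
    children = []
    while len(children) < POP - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, N_CELLS)
        child = np.concatenate([a[:cut], b[cut:]])          # one-point crossover
        mutate = rng.random(N_CELLS) < 0.01                 # random mutation of class labels
        child[mutate] = rng.integers(0, N_CLASSES, mutate.sum())
        children.append(child)
    population = np.vstack([parents, children])

best = population[np.argmax([fitness(c) for c in population])]
print("best fitness:", fitness(best))
```

A multi-objective, multi-agent version would replace the single scalar fitness with several competing objectives (e.g. economic benefit versus resource saving) and negotiate trade-offs among agents rather than simply maximising one sum.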
Integrated web system of geospatial data services for climate research
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander
2016-04-01
Georeferenced datasets are currently actively used for the modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets based on a combination of web and GIS technologies within the spatial data infrastructure paradigm is presented. Following this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.
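The sketch below illustrates the kind of server-side reduction such a system performs on a georeferenced climate dataset, using xarray; the file, variable and coordinate names are assumptions for illustration.

```python
# Sketch of a spatial/temporal subset and reduction over a georeferenced
# climate dataset. File, variable and coordinate names are hypothetical.
import xarray as xr

ds = xr.open_dataset("reanalysis_t2m.nc")              # hypothetical georeferenced dataset

region_summer = ds["t2m"].sel(
    latitude=slice(70, 50),                            # many reanalyses store latitude descending
    longitude=slice(60, 120),
    time=slice("2000-06-01", "2000-08-31"),
)

# Area-mean time series and overall seasonal mean for the selected region
print(region_summer.mean(dim=("latitude", "longitude")))
print(float(region_summer.mean()))
```

In the described web system, requests of this kind would be issued through OGC-standard interfaces and the reduced results returned to the browser for mapping and plotting, so the multi-terabyte source data never leave the server.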
Monitoring Aircraft Motion at Airports by LIDAR
NASA Astrophysics Data System (ADS)
Toth, C.; Jozkow, G.; Koppanyi, Z.; Young, S.; Grejner-Brzezinska, D.
2016-06-01
Improving sensor performance, combined with better affordability, provides better object space observability and results in new applications. Remote sensing systems are primarily concerned with acquiring data on the static components of our environment, such as the topographic surface of the earth, transportation infrastructure, city models, etc. Observing the dynamic component of the object space is still rather rare in the geospatial application field; vehicle extraction and traffic flow monitoring are a few examples of using remote sensing to detect and model moving objects. Deploying a network of inexpensive LiDAR sensors along taxiways and runways can provide geospatial data that are rich both geometrically and temporally, so that the aircraft body can be extracted from the point cloud and motion parameters can then be estimated from consecutive point clouds. Acquiring accurate aircraft trajectory data is essential for improving aviation safety at airports. This paper reports initial experience obtained by using a network of four Velodyne VLP-16 sensors to acquire data along a runway segment.
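A minimal sketch, on synthetic data rather than actual VLP-16 scans, of the basic idea: segment the aircraft body from the flat ground by height, then estimate motion from the displacement of its centroid between two consecutive point clouds.

```python
# Toy motion estimation from two consecutive point clouds (synthetic data).
# Real pipelines would use registration/tracking rather than a bare centroid.
import numpy as np

rng = np.random.default_rng(1)
SCAN_INTERVAL = 0.1                     # assumed seconds between consecutive scans

def synthetic_scan(x_offset):
    ground = np.c_[rng.uniform(0, 100, 5000), rng.uniform(-20, 20, 5000), rng.normal(0, 0.05, 5000)]
    body = np.c_[rng.uniform(x_offset, x_offset + 40, 2000),
                 rng.uniform(-3, 3, 2000),
                 rng.uniform(1.0, 6.0, 2000)]           # fuselage points well above ground
    return np.vstack([ground, body])

def body_centroid(points, height_threshold=0.5):
    # Keep only points above the ground plane and average them
    return points[points[:, 2] > height_threshold].mean(axis=0)

scan_t0 = synthetic_scan(x_offset=10.0)
scan_t1 = synthetic_scan(x_offset=12.5)                 # aircraft taxied ~2.5 m between scans

displacement = body_centroid(scan_t1) - body_centroid(scan_t0)
speed = np.linalg.norm(displacement[:2]) / SCAN_INTERVAL
print(f"estimated ground speed: {speed:.1f} m/s")
```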
NASA Astrophysics Data System (ADS)
Ward, S. M.; Paulus, G.
2013-06-01
The Danube River basin has long been the location of significant flooding problems across central Europe. The last decade has seen a sharp increase in the frequency, duration and intensity of these flood events, unveiling a dire need for enhanced flood management policy and tools in the region. Located in the southern portion of Austria, the state of Carinthia has experienced a significant volume of intense flood impacts over the last decade. Although the Austrian government has acknowledged these issues, its remedial actions to date have been primarily structural. Continued focus on controlling the natural environment through infrastructure, while disregarding alternative forms of assessing flood exposure, will only act as a provisional solution to this inescapable risk. In an attempt to remedy this flaw, this paper highlights the application of geospatial predictive analytics and a spatial recovery index as a proxy for community resilience, as well as the cultural challenges associated with applying foreign models within an Austrian environment.
NASA Technical Reports Server (NTRS)
Cole, Tony A.; Wanik, David W.; Molthan, Andrew L.; Roman, Miguel O.; Griffin, Robert E.
2017-01-01
Natural and anthropogenic hazards are frequently responsible for disaster events, leading to damaged physical infrastructure, which can result in loss of electrical power for affected locations. Remotely-sensed, nighttime satellite imagery from the Suomi National Polar-orbiting Partnership (Suomi-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) can monitor power outages in disaster-affected areas through the identification of missing city lights. When combined with locally-relevant geospatial information, these observations can be used to estimate power outages, defined as geographic locations requiring manual intervention to restore power. In this study, we produced a power outage product based on Suomi-NPP VIIRS DNB observations to estimate power outages following Hurricane Sandy in 2012. This product, combined with known power outage data and ambient population estimates, was then used to predict power outages in a layered, feedforward neural network model. We believe this is the first attempt to synergistically combine such data sources to quantitatively estimate power outages. The VIIRS DNB power outage product was able to identify initial loss of light following Hurricane Sandy, as well as the gradual restoration of electrical power. The neural network model predicted power outages with reasonable spatial accuracy, achieving Pearson coefficients (r) between 0.48 and 0.58 across all folds. Our results show promise for producing a continental United States (CONUS)- or global-scale power outage monitoring network using satellite imagery and locally-relevant geospatial data.
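In the spirit of the layered, feedforward model described above, the sketch below trains a small scikit-learn network on synthetic radiance-change and population features and reports per-fold Pearson correlations; it is not the authors' network, data or validation scheme.

```python
# Illustrative feedforward regressor on synthetic DNB-style features.
# Feature construction and the outage target are made up for demonstration.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 2000
dnb_pre = rng.gamma(2.0, 5.0, n)                       # pre-event radiance proxy
dnb_post = dnb_pre * rng.uniform(0.1, 1.0, n)          # dimming after the storm
population = rng.lognormal(6, 1, n)                    # ambient population proxy
X = np.c_[dnb_pre, dnb_post, population]
y = (1 - dnb_post / dnb_pre) * population * 0.05 + rng.normal(0, 5, n)  # synthetic outage counts

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)

for fold, (train, test) in enumerate(KFold(n_splits=5, shuffle=True, random_state=0).split(X)):
    model.fit(X[train], y[train])
    r = np.corrcoef(y[test], model.predict(X[test]))[0, 1]
    print(f"fold {fold}: Pearson r = {r:.2f}")
```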
User-driven generation of standard data services
NASA Astrophysics Data System (ADS)
Díaz, Laura; Granell, Carlos; Gould, Michael; Huerta, Joaquín.
2010-05-01
Geospatial information systems are experiencing the shift from monolithic to distributed environments (Bernard, 2003). Current research on the discovery and access of geospatial resources in these distributed environments is being addressed by the deployment of interconnected Spatial Data Infrastructure (SDI) nodes at different scales to build a global spatial information infrastructure (Masser et al., 2008; Rajabifard et al., 2002). One of the challenges in implementing these global and multiscale SDIs is agreeing on common standards given the heterogeneity of the various stakeholders (Masser, 2005). In Europe, the European Commission took the INSPIRE initiative to monitor the development of European SDIs. The INSPIRE Directive addresses the need for web services to discover, view, transform, invoke, and download geospatial resources, which enable various stakeholders to share resources in an interoperable manner (INSPIRE, 2007). Such web services require technical specifications for the interoperability and harmonization of their SDIs (INSPIRE, 2007). Moreover, interoperability is ensured by a number of specification efforts, in the geo domain most prominently by ISO/TC 211 and the OpenGIS Consortium (OGC) (Bernard, 2003). Other research challenges regarding SDIs are, on the one hand, how users in charge of maintaining SDIs can handle complexity as the infrastructures grow, and, on the other hand, the fact that SDI maintenance and evolution should be guided (Béjar et al., 2009). There is thus a motivation to improve the complex deployment mechanisms in SDIs, since considerable expertise and time are needed to deploy resources and integrate them by means of standard services. In this context we present an architecture following the INSPIRE technical guidelines and therefore based on SDI principles. This architecture supports distributed applications and provides components to assist users in deploying and updating SDI resources. Mechanisms and components for the automatic generation and publication of standard geospatial services are therefore proposed. These mechanisms hide the underlying technology and let stakeholders wrap resources as standard services in order to share them in a transparent manner. These components are integrated in our architecture within the Service Framework node (module). Figure 1. Architecture components diagram. Figure 1 shows the components of the architecture. The Application Node provides the entry point for users to run distributed applications; this software component holds the user interface and the application logic. The Service Connector component provides the ability to connect to the services available in the middleware layer of the SDI and acts as a socket to OGC Web Services; for instance, the WMS component implements the OGC WMS specification, the standard recommended by the INSPIRE implementing rules as the View Service type. The Service Framework node contains several components; its main functionality is to assist users in wrapping and sharing geospatial resources, and it implements the proposed mechanisms to improve the availability and visibility of geospatial resources. The main components of this framework are the Data Wrapper, the Process Wrapper and the Service Publisher. The Data Wrapper and Process Wrapper components guide users in wrapping data and tools as standard services according to the INSPIRE implementing rules (availability).
The Service Publisher component aims at creating service metadata and publishing them in catalogues (visibility). Roughly speaking, all of these components are concerned with the idea of acting as a service generator and publisher, i.e., they get a resource (data or process) and return an INSPIRE service that will be published in catalogue services. References Béjar, R., Latre, M. Á., Nogueras-Iso, J., Muro-Medrano, P. R., Zarazaga-Soria, F. J. 2009. International Journal of Geographical Information Science, 23(3), 271-294. Bernard, L, U Einspanier, M Lutz & C Portele. Interoperability in GI Service Chains The Way Forward. In: M. Gould, R. Laurini & S. Coulondre (Eds.). 6th AGILE Conference on Geographic Information Science 2003, Lyon: 179-188. INSPIRE. Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community. (2007) Masser, I. GIS Worlds: Creating Spatial Data Infrastructures. Redlands, California. ESRI Press. (2005) Masser, I., Rajabifard, A., Williamson, I. 2008. Spatially enabling governments through SDI implementation. International Journal of Geographical Information Science. Vol. 22, No. 1, (2008) 5-20 Rajabifard, A., Feeney, M-E. F., Williamson, I. P. 2002. Future directions for SDI development. International Journal of Applied Earth Observation and Geoinformation 4 (2002) 11-22
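On the visibility side, once service metadata have been published they can be discovered through an OGC CSW catalogue. The sketch below shows such a discovery query with OWSLib; the catalogue endpoint and search term are placeholders, and the publication (Transaction) step itself is not shown.

```python
# Hedged sketch of discovering published service metadata through an OGC CSW
# catalogue with OWSLib. The endpoint URL and search term are hypothetical.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://catalogue.example.org/csw")   # hypothetical catalogue

query = PropertyIsLike("csw:AnyText", "%flood%")                 # free-text constraint
csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

for identifier, record in csw.records.items():
    print(identifier, "-", record.title)
```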
Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision
NASA Astrophysics Data System (ADS)
Miškolci, M.; Šafář, V.; Šrámková, R.
2016-06-01
The article describes the creation of an initial three-dimensional geodatabase, from planning and design through the definition of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD; its official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures of its revision. CGD ensures the proper collection, processing, storage, transfer and display of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel CGD runs on the MoD intranet; for users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model that completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and by measuring object properties. The basic vector model of the CGD (from photogrammetric processing) is then taken to the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD gives us the opportunity to understand the territory comprehensively in all three spatial dimensions. Every entity in the CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing and cartographic purposes, as well as for various special-purpose mapping, and has the ambition to cover the needs not only of the MoD but to become a reference model for the national geographical infrastructure.
NASA Astrophysics Data System (ADS)
Chaabane, M. S.; Abouali, N.; Boumeaza, T.; Zahouily, M.
2017-11-01
Today, prevention and risk management occupy an important part of public policy activities and are considered major components in the process of sustainable development of territories. Due to the expansion of IT processes, in particular the geomatics sciences, decision-makers increasingly request digital tools before, during and after natural disasters. Both geographic information systems (GIS) and remote sensing are considered fundamental geospatial tools which help to understand the evolution of risks, to analyze their temporality and to make the right decisions. The historic events of 1996, 2002 and 2010, which struck the city of Mohammedia and caused considerable damage to vital infrastructure and private property, require a thorough and rational analysis in order to learn from them and better manage flood phenomena. This article presents (i) the contribution of geospatial tools to the flood simulation of the Oued El Maleh at various return periods; these tools allow the demarcation of flood-risk areas and the simulation of floods under several scenarios (10-year, 20-year, 50-year, 100-year, 500-year and 1000-year floods); and (ii) a synthesis map combining the territorial stakes superposed on the flood scenarios at the different return periods.
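As a small worked example of the return periods listed above (not taken from the article), the probability of experiencing at least one T-year flood over an n-year planning horizon is 1 - (1 - 1/T)^n. The sketch below evaluates this for the scenarios mentioned in the abstract, assuming an arbitrary 30-year horizon.

```python
# Worked example: chance of at least one T-year flood in an n-year horizon.
def exceedance_probability(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one T-year event occurring in n years."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** horizon_years

for T in (10, 20, 50, 100, 500, 1000):
    p = exceedance_probability(T, horizon_years=30)
    print(f"{T:>4}-year flood: {p:.1%} chance of occurring at least once in 30 years")
```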
Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Hector J.
2016-01-01
The majority of restoration strategies in the wake of large-scale disasters have focused on short-term emergency response solutions. Few consider medium- to long-term restoration strategies to reconnect urban areas to national supply chain interdependent critical infrastructure systems (SCICI). These SCICI promote the effective flow of goods, services, and information vital to the economic vitality of an urban environment. To re-establish the connectivity that has been broken during a disaster between the different SCICI, relationships between these systems must be identified, formulated, and added to a common framework to form a system-level restoration plan. To accomplish this goal, a considerable collection of SCICI data is necessary. The aim of this paper is to review what data are required for model construction, the accessibility of these data, and their integration with each other. While a review of publicly available data reveals a dearth of real-time data to assist modeling of long-term recovery following an extreme event, a significant amount of static data does exist and these data can be used to model the complex interdependencies needed. For the sake of illustration, a particular SCICI (transportation) is used to highlight the challenges of determining the interdependencies and creating models capable of describing the complexity of an urban environment with the data publicly available. Integration of data derived from public domain sources is readily achieved in a geospatial environment, since geospatial infrastructure data are the most abundant data source; however, while significant quantities of data can be acquired through public sources, a significant effort is still required to gather, develop, and integrate these data from multiple sources to build a complete model. Therefore, while the continued availability of high-quality public information is essential for modeling efforts in the academic and government communities, a more streamlined approach to the real-time acquisition and integration of these data is also needed.
NASA Astrophysics Data System (ADS)
Chen, R. S.; MacManus, K.; Vinay, S.; Yetman, G.
2016-12-01
The Socioeconomic Data and Applications Center (SEDAC), one of 12 Distributed Active Archive Centers (DAACs) in the NASA Earth Observing System Data and Information System (EOSDIS), has developed a variety of operational spatial data services aimed at providing online access, visualization, and analytic functions for geospatial socioeconomic and environmental data. These services include: open web services that implement Open Geospatial Consortium (OGC) specifications such as Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS); spatial query services that support Web Processing Service (WPS) and Representational State Transfer (REST); and web map clients and a mobile app that utilize SEDAC and other open web services. These services may be accessed from a variety of external map clients and visualization tools such as NASA's WorldView, NOAA's Climate Explorer, and ArcGIS Online. More than 200 data layers related to population, settlements, infrastructure, agriculture, environmental pollution, land use, health, hazards, climate change and other aspects of sustainable development are available through WMS, WFS, and/or WCS. Version 2 of the SEDAC Population Estimation Service (PES) supports spatial queries through WPS and REST in the form of a user-defined polygon or circle. The PES returns an estimate of the population residing in the defined area for a specific year (2000, 2005, 2010, 2015, or 2020) based on SEDAC's Gridded Population of the World version 4 (GPWv4) dataset, together with measures of accuracy. The SEDAC Hazards Mapper and the recently released HazPop iOS mobile app enable users to easily submit spatial queries to the PES and see the results. SEDAC has developed an operational virtualized backend infrastructure to manage these services and support their continual improvement as standards change, new data and services become available, and user needs evolve. An ongoing challenge is to improve the reliability and performance of the infrastructure, in conjunction with external services, to meet both research and operational needs.
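The sketch below illustrates, in a hedged way, what a polygon-based population query against a service like the PES could look like over REST. The endpoint URL, parameter names, and response fields are hypothetical placeholders, not the documented SEDAC API; they only show the general pattern of posting a user-defined polygon and reading back a population estimate.

```python
# Illustrative polygon-based population query (hypothetical endpoint and fields).
import json
import requests

PES_URL = "https://example.org/population-estimation/rest/query"  # hypothetical

polygon = {
    "type": "Polygon",
    "coordinates": [[
        [-74.3, 40.5], [-73.7, 40.5], [-73.7, 40.9], [-74.3, 40.9], [-74.3, 40.5]
    ]],
}

payload = {"geometry": json.dumps(polygon), "year": 2015}
response = requests.post(PES_URL, data=payload, timeout=30)
response.raise_for_status()

result = response.json()
# The shape of the reply ("population") is assumed here for illustration only.
print("Estimated population:", result.get("population"))
```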
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, into an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and then applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
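Two of the band-math products named above are simple to express. The sketch below computes NDVI and NDMI from NumPy arrays standing in for co-registered image bands; the array values are synthetic and only illustrate the arithmetic, not the project's actual pipeline.

```python
# NDVI and NDMI from synthetic reflectance bands (red, NIR, SWIR).
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def ndmi(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)

# Synthetic 4x4 reflectance tiles, purely for demonstration.
rng = np.random.default_rng(0)
red, nir, swir = (rng.uniform(0.05, 0.6, (4, 4)) for _ in range(3))
print(ndvi(nir, red))
print(ndmi(nir, swir))
```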
Geospatial analysis of cyclist injury trends: an investigation in Melbourne, Australia.
Lawrence, Brendan M; Stevenson, Mark R; Oxley, Jennifer A; Logan, David B
2015-01-01
This study applied geospatial analysis to explore spatial trends in cycling-related injury in Melbourne, Australia, in order to identify an area where injury density was reducing against expectation. The crash characteristics and cycling environment of the identified area were examined to better understand factors related to cycling safety. Two methods were used to examine spatial trends in cycling-related injury. Firstly, cycling injury density was calculated using a kernel density estimation method for the years 2000 to 2011. This was used to examine patterns in injury density across Melbourne over an extended time period. Secondly, absolute change in injury density was calculated between 2005 and 2011. From this, a geographical area presenting a reduced injury density was selected for a case study, and crash characteristics of the area were obtained for the observational period. This led to discussion on which changes to the cycling environment, if any, may be associated with the reduced injury rate. Injury density in Melbourne had been progressively increasing between 2000 and 2011, with a nearly 3-fold increase in the peak injury density over that period. Decreases were observed in some locations between 2005 and 2011, and a geographical area to the southeast of Melbourne experienced a more significant decrease than others. This appeared to be associated with a combination of behavior and road infrastructure change, although a lack of data to verify change in cycling exposure prevented more definitive associations from being established. The apparent positive response of the injury rate to behavior and road infrastructure interventions is promising, yet the injury rate is unlikely to achieve the government's road safety target of 30% reduction in serious injuries by 2022. Moreover, the number of injuries sustained at the most common crash location appears to be increasing. Further research is necessary to discern which specific features of the urban road infrastructure have an effect on the risk of injury to a cyclist and which combination of features is consistent with a safe cycling environment.
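A minimal sketch of the kernel density estimation step described above, using synthetic crash coordinates in place of the study's injury data; the coordinate values and bandwidth defaults are illustrative only.

```python
# Kernel density estimation of point (crash) locations on a regular grid.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)
# Synthetic easting/northing coordinates of injury crashes (metres).
x = rng.normal(320_000, 2_000, 500)
y = rng.normal(5_810_000, 2_000, 500)

kde = gaussian_kde(np.vstack([x, y]))

# Evaluate the density surface on a regular grid covering the points.
gx, gy = np.meshgrid(
    np.linspace(x.min(), x.max(), 100),
    np.linspace(y.min(), y.max(), 100),
)
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print("Peak injury density (per unit area):", density.max())
```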
Modeling Being "Lost": Imperfect Situation Awareness
NASA Technical Reports Server (NTRS)
Middleton, Victor E.
2011-01-01
Being "lost" is an exemplar of imperfect Situation Awareness/Situation Understanding (SA/SU) -- information/knowledge that is uncertain, incomplete, and/or just wrong. Being "lost" may be a geo-spatial condition - not knowing/being wrong about where to go or how to get there. More broadly, being "lost" can serve as a metaphor for uncertainty and/or inaccuracy - not knowing/being wrong about how one fits into a larger world view, what one wants to do, or how to do it. This paper discusses using agent based modeling (ABM) to explore imperfect SA/SU, simulating geo-spatially "lost" intelligent agents trying to navigate in a virtual world. Each agent has a unique "mental map" -- its idiosyncratic view of its geo-spatial environment. Its decisions are based on this idiosyncratic view, but behavior outcomes are based on ground truth. Consequently, the rate and degree to which an agent's expectations diverge from ground truth provide measures of that agent's SA/SU.
NASA Astrophysics Data System (ADS)
Edsall, Robert; Hembree, Harvey
2018-05-01
The geospatial research and development team in the National and Homeland Security Division at Idaho National Laboratory was tasked with providing tools to derive insight from the substantial amount of data currently available - and continuously being produced - associated with the critical infrastructure of the US. This effort is in support of the Department of Homeland Security, whose mission includes the protection of this infrastructure and the enhancement of its resilience to hazards, both natural and human. We present geovisual-analytics-based approaches for analysis of vulnerabilities and resilience of critical infrastructure, designed so that decision makers, analysts, and infrastructure owners and managers can manage risk, prepare for hazards, and direct resources before and after an incident that might result in an interruption in service. Our designs are based on iterative discussions with DHS leadership and analysts, who in turn will use these tools to explore and communicate data in partnership with utility providers, law enforcement, and emergency response and recovery organizations, among others. In most cases these partners desire summaries of large amounts of data, but increasingly, our users seek the additional capability of focusing on, for example, a specific infrastructure sector, a particular geographic region or time period, or of examining data at a variety of generalization or aggregation levels. These needs align well with tenets of information-visualization design; in this paper, selected applications among those that we have designed are described and positioned within geovisualization, geovisual analytics, and information visualization frameworks.
Aron, Joan L
2006-11-01
This paper presents two case studies of the barriers to the use of geospatial data in the context of public health adaptation to climate change and variability. The first case study is on the hazards of coastal zone development in the United States with the main emphasis on Hurricane Katrina. An important barrier to the use of geospatial data is that the legal system does not support restrictions on land use intended to protect the coastal zone. Economic interests to develop New Orleans and the Mississippi River, both over the long term and the short term, had the effect of increasing the impact of the hurricane. The second case study is epidemics of climate-sensitive diseases with the main emphasis on malaria in Africa. Limits to model accuracy may present a problem in using climate data for an early warning system, and some geographic locations are likely to be more suitable than others. Costs of the system, including the costs of errors, may also inhibit implementation. Deriving societal benefits from geospatial data requires an understanding of the particular decision contexts and organizational processes in which knowledge is developed and used. The data by themselves will not usually generate a societal response. Scientists working in applications should develop partnerships to address the use of geospatial data for societal benefit.
a Public Platform for Geospatial Data Sharing for Disaster Risk Management
NASA Astrophysics Data System (ADS)
Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.
2013-01-01
Several studies have been conducted in Africa to assist local governments in addressing the risk situation related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations for reducing the risks and mitigating the impacts of disasters. Nevertheless, this data is not widely or efficiently distributed and often resides in remote storage solutions that are hard to reach. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making them widely available through the internet. Among these solutions, GeoNode, an open source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface allowing users, with little training, to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to allow access to only specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management, and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study scenario of setting up a Web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi to support the development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as other disaster-related information. Moreover, this platform will help to ensure that the data created by a number of past or ongoing projects are maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and other data from future disaster risk management projects will be added as well.
NASA Astrophysics Data System (ADS)
Fernandes, E. C.; Norbu, C.; Juizo, D.; Wangdi, T.; Richey, J. E.
2011-12-01
Landscapes, watersheds, and their downstream coastal and lacustrine zones are facing a series of challenges critical to their future, centered on the availability and distribution of water. Management options cover a range of issues, from bringing safe water to local villages for the rural poor, to developing adaptation strategies for both rural and urban populations and large infrastructure, to sustaining environmental flows and ecosystem services needed for natural and human-dominated ecosystems. These targets represent a very complex set of intersecting issues of scale, cross-sector science and technology, education, politics, and economics, and the desired sustainable development is closely linked to how the nominally responsible governmental Ministries respond to the information they have. In practice, such information and even perspectives are virtually absent in much of the developing world. A Dynamic Information Framework (DIF) is being designed as a knowledge platform whereby decision-makers in information-sparse regions can consider rigorous scenarios of alternative futures and obtain decision support for complex environmental and economic decisions. The DIF is a geospatial gateway, with functional components comprising base data layers; directed data layers focused on synthetic objectives; geospatially explicit, process-based, cross-sector simulation models (which require data from the directed data layers); facilitated input/output (including visualizations); and decision-support and scenario-testing capabilities. A fundamental aspect of a DIF is not only the convergence of multi-sector information, but how that information can be (a) integrated, (b) used for robust simulations and projections, and (c) conveyed to policymakers and stakeholders in the most compelling and visual manner. Examples are given of emerging applications. The ZambeziDIF was used to establish baselines for agriculture, biodiversity, and water resources in the lower Zambezi valley of Mozambique. The DrukDIF for Bhutan is moving from a test-of-concept to an operational phase, with uses ranging from extending local biodiversity to computing how much energy can be sold tomorrow based on waterflows today. The AralDIF is being developed for central Asia to serve as a neutral and transparent platform and as a catalyst for open discussion on water and energy linkages. The ImisoziDIF is now being ramped up in Rwanda to help guide the scaling up of agricultural practices and biodiversity from sites to the country. The Virtual Mekong Basin "tells the story" of the multiple issues facing the Mekong Basin.
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca
2013-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data
NASA Technical Reports Server (NTRS)
Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark;
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
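As a hedged sketch of the federation's search and discovery interface, the snippet below issues an HTTP query to a node's search service. The node hostname is a placeholder, and the endpoint path, query parameters, and response structure are assumptions to be checked against current ESGF documentation rather than a definitive description of the API.

```python
# Illustrative search query against a (hypothetical) federation node.
import requests

SEARCH_URL = "https://esgf-node.example.org/esg-search/search"  # hypothetical node

params = {
    "project": "CMIP5",
    "variable": "tas",            # near-surface air temperature
    "experiment": "historical",
    "format": "application/solr+json",
    "limit": 5,
}

response = requests.get(SEARCH_URL, params=params, timeout=30)
response.raise_for_status()

# Response structure assumed: Solr-style JSON with a "response.docs" list.
docs = response.json()["response"]["docs"]
for doc in docs:
    print(doc.get("id"), doc.get("title"))
```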
Exploring the Earth Using Deep Learning Techniques
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Research using deep neural networks has matured significantly in recent times, and there is now a surge of interest in applying such methods to Earth systems science and the geosciences. When these methods are combined with Big Data, we believe there are opportunities to significantly transform a number of areas relevant to researchers and policy makers. In particular, by using a combination of data from a range of satellite Earth observations as well as computer simulations from climate models and reanalysis, we can gain new insights into the information that is locked within the data. Global geospatial datasets describe a wide range of physical and chemical parameters, which are mostly available on regular grids covering large spatial and temporal extents. This makes them ideal candidates for deep learning methods. So far, these techniques have been successfully applied to image analysis through the use of convolutional neural networks. However, this is only one field of interest, and there is potential for many more use cases to be explored. Deep learning algorithms require fast access to large amounts of data in the form of tensors and make intensive use of processing resources to train their models. The Australian National Computational Infrastructure (NCI) has recently augmented its Raijin 1.2 PFlop supercomputer with hardware accelerators. Together with NCI's 3000 core high performance OpenStack cloud, these computational systems have direct access to NCI's 10+ PBytes of datasets and associated Big Data software technologies (see http://geonetwork.nci.org.au/ and http://nci.org.au/systems-services/national-facility/nerdip/). Effective use of these computing infrastructures requires that both the data and software are organised in a way that readily supports the deep learning software ecosystem. Deep learning software, such as the open source TensorFlow library, has allowed us to demonstrate the possibility of generating geospatial models by combining information from our different data sources. This opens the door to an exciting new way of generating products and extracting features that have previously been labour intensive. In this paper, we will explore some of these geospatial use cases and share some of the lessons learned from this experience.
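A minimal sketch of the convolutional-network idea for regular geospatial grids, using the TensorFlow/Keras API mentioned above. The grid size, the four stacked input fields, and the scalar regression target are illustrative choices and do not reproduce the authors' models or data.

```python
# Tiny CNN over synthetic gridded fields (illustrative only).
import numpy as np
import tensorflow as tf

# Synthetic training set: 256 samples of 32x32 grids with 4 input fields,
# regressing a single scalar per grid (e.g. an area-averaged quantity).
x = np.random.rand(256, 32, 32, 4).astype("float32")
y = x.mean(axis=(1, 2, 3)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 4)),
    tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
print("Sample prediction:", float(model.predict(x[:1], verbose=0)[0, 0]))
```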
OntoFire: an ontology-based geo-portal for wildfires
NASA Astrophysics Data System (ADS)
Kalabokidis, K.; Athanasis, N.; Vaitis, M.
2011-12-01
With the proliferation of geospatial technologies on the Internet, the role of geo-portals (i.e. gateways to Spatial Data Infrastructures) in the area of wildfire management is emerging. However, keyword-based techniques often frustrate users when looking for data of interest in geo-portal environments, while little attention has been paid to shifting from conventional keyword-based to navigation-based mechanisms. The presented OntoFire system is an ontology-based geo-portal about wildfires. Through the proposed navigation mechanisms, the relationships between the data can be discovered, which would otherwise not be possible when using conventional querying techniques alone. End users can use the browsing interface to find resources of interest by using the navigation mechanisms provided. Data providers can use the publishing interface to submit new metadata, or to modify or remove metadata in the catalogue. The proposed approach can improve the discovery of valuable information that is necessary to set priorities for disaster mitigation and prevention strategies. OntoFire aspires to be a focal point of integration and management of a very large amount of information, contributing in this way to the dissemination of knowledge and to the preparedness of the operational stakeholders.
NASA Astrophysics Data System (ADS)
Balaji Bhaskar, M. S.; Rosenzweig, J.; Shishodia, S.
2017-12-01
The objective of our activity is to improve students' understanding and interpretation of geospatial science and climate change concepts and their applications in the field of Environmental and Biological Sciences in the College of Science Engineering and Technology (COEST) at Texas Southern University (TSU) in Houston, TX. The GIS for Environment, Ecology and Microbiology courses were selected for curriculum infusion. A total of ten GIS hands-on lab modules, along with two NCAR (National Center for Atmospheric Research) lab modules on climate change, were implemented in the "GIS for Environment" course. GIS and Google Earth labs along with climate change lectures were infused into the Microbiology and Ecology courses. Critical thinking and empirical skills of the students were assessed in all the courses. The student learning outcomes of these courses include the ability of students to interpret geospatial maps and to demonstrate knowledge of the basic principles and concepts of GIS (Geographic Information Systems) and climate change. At the end of the courses, students had developed a comprehensive understanding of geospatial data, its applications to understanding climate change, and its interpretation at local and regional scales over multiple years.
ERIC Educational Resources Information Center
Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.
2005-01-01
In the context of the Knowledge Society, the convergence of knowledge and learning management is a critical milestone. "Intelligent Learning Infrastructure for Knowledge Intensive Organizations: A Semantic Web Perspective" provides state-of-the-art knowledge through a balanced theoretical and technological discussion. The semantic web perspective…
Developing standards for a national spatial data infrastructure
Wortman, Kathryn C.
1994-01-01
The concept of a framework for data and information linkages among producers and users, known as a National Spatial Data Infrastructure (NSDI), is built upon four corners: data, technology, institutions, and standards. Standards are paramount to increase the efficiency and effectiveness of the NSDI. Historically, data standards and specifications have been developed with a very limited scope - they were parochial, and even competitive in nature, and promoted the sharing of data and information within only a small community at the expense of more open sharing across many communities. Today, an approach is needed to grow and evolve standards to support open systems and provide consistency and uniformity among data producers. There are several significant ongoing activities in geospatial data standards: transfer or exchange, metadata, and data content. In addition, standards in other areas are under discussion, including data quality, data models, and data collection.
On the sensitivity of geospatial low impact development locations to the centralized sewer network.
Zischg, Jonatan; Zeisl, Peter; Winkler, Daniel; Rauch, Wolfgang; Sitzenfrei, Robert
2018-04-01
In the future, infrastructure systems will have to become smarter, more sustainable, and more resilient, requiring new methods of urban infrastructure design. In the field of urban drainage, green infrastructure is a promising design concept with proven benefits to runoff reduction, stormwater retention, pollution removal, and/or the creation of attractive living spaces. Such 'near-nature' concepts are usually distributed over the catchment area in small-scale units. In many cases, these above-ground structures interact with the existing underground pipe infrastructure, resulting in hybrid solutions. In this work, we investigate the effect of different placement strategies for low impact development (LID) structures on the hydraulic network performance of existing drainage networks. Based on a sensitivity analysis, geo-referenced maps are created which identify the most effective LID positions within the city framework (e.g. to improve network resilience). The methodology is applied to a case study to test the effectiveness of the approach and compare different placement strategies. The results show that with a simple targeted LID placement strategy, the flood performance is improved by an additional 34% compared to a random placement strategy. The developed map is easy to communicate and can be rapidly applied by decision makers when deciding on stormwater policies.
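As a rough illustration of the placement-strategy comparison described above, the sketch below ranks candidate nodes by a one-at-a-time sensitivity measure and compares the resulting targeted placement against a random one. The `simulate_flood_volume` function is a hypothetical surrogate standing in for a hydraulic model run; it is not the case study's model, and all numbers are invented.

```python
# Targeted vs. random LID placement using a hypothetical flood surrogate.
import random

def simulate_flood_volume(lid_nodes):
    """Hypothetical surrogate: flood volume drops more when LIDs sit on nodes
    with larger contributing areas (toy data, not the case study)."""
    contributing_area = {n: (n % 7 + 1) * 0.5 for n in range(100)}  # ha, toy values
    baseline = 1000.0
    return baseline - sum(10.0 * contributing_area[n] for n in lid_nodes)

nodes = list(range(100))
budget = 10  # number of LID units that can be placed

# Targeted strategy: rank candidate nodes by single-placement sensitivity.
sensitivity = {n: simulate_flood_volume(set()) - simulate_flood_volume({n}) for n in nodes}
targeted = set(sorted(nodes, key=lambda n: sensitivity[n], reverse=True)[:budget])

random.seed(0)
random_choice = set(random.sample(nodes, budget))

print("Flood volume, targeted placement:", simulate_flood_volume(targeted))
print("Flood volume, random placement:  ", simulate_flood_volume(random_choice))
```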
Bielicki, Jeffrey M.; Langenfeld, Julie K.; Tao, Zhiyuan; ...
2018-05-26
Hydrocarbon depleted fractured shale (HDFS) formations could be attractive for geologic carbon dioxide (CO2) storage. Shale formations may be able to leverage existing infrastructure, have larger capacities, and be more secure than saline aquifers. We compared regional storage capacities and integrated CO2 capture, transport, and storage systems that use HDFS with those that use saline aquifers in a region of the United States with extensive shale development that overlies prospective saline aquifers. We estimated HDFS storage capacities with a production-based method and costs by adapting methods developed for saline aquifers, and found that HDFS formations in this region might be able to store an estimated ~14× more CO2 on average than saline aquifers at the same location, at less cost. The potential for smaller Areas of Review and less investment in infrastructure accounted for up to 84% of the difference in estimated storage costs. We implemented an engineering-economic geospatial optimization model to determine and compare the viability of storage capacity for these two storage resources. Across the state-specific and regional scenarios we investigated, our results for this region suggest that integrated CCS systems using HDFS could be more centralized, require fewer pipelines, prioritize different routes for trunklines, and be 6.4-6.8% ($5-10/tCO2) cheaper than systems using saline aquifers. Overall, CO2 storage in HDFS could be technically and economically attractive and may lower barriers to large-scale CO2 storage if such formations can be permitted.
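The toy example below illustrates the kind of source-sink optimization the study's engineering-economic geospatial model performs, cast as a small linear program with SciPy. All costs, supplies, and capacities are invented for illustration and are not taken from the paper.

```python
# Toy CO2 source-sink assignment as a transportation-style linear program.
import numpy as np
from scipy.optimize import linprog

# Cost per tonne CO2 for each (source, sink) pair: transport + storage (toy values).
cost = np.array([[12.0, 18.0],
                 [15.0, 9.0]])
supply = np.array([4.0, 6.0])      # MtCO2/yr captured at each source
capacity = np.array([7.0, 5.0])    # MtCO2/yr injectable at each sink

c = cost.ravel()                    # decision variables x = [x00, x01, x10, x11]
A_eq = np.array([[1, 1, 0, 0],      # all captured CO2 must be shipped somewhere
                 [0, 0, 1, 1]])
A_ub = np.array([[1, 0, 1, 0],      # sink capacity limits
                 [0, 1, 0, 1]])
res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply,
              bounds=[(0, None)] * 4, method="highs")

print("Optimal flows (MtCO2/yr):\n", res.x.reshape(2, 2))
print("Total cost ($M/yr):", res.fun)
```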
Geospatial Standards and the Knowledge Generation Lifecycle
NASA Technical Reports Server (NTRS)
Khalsa, Siri Jodha S.; Ramachandran, Rahul
2014-01-01
Standards play an essential role at each stage in the sequence of processes by which knowledge is generated from geoscience observations, simulations and analysis. This paper provides an introduction to the field of informatics and the knowledge generation lifecycle in the context of the geosciences. In addition we discuss how the newly formed Earth Science Informatics Technical Committee is helping to advance the application of standards and best practices to make data and data systems more usable and interoperable.
NASA Astrophysics Data System (ADS)
Hedley, Mikell Lynne
2008-10-01
The purpose of the study was to use geospatial technologies to improve the spatial abilities and specific atmospheric science content knowledge of students in high schools and junior highs, primarily in high-needs schools. These technologies include remote sensing, geographic information systems, and global positioning systems. The program involved training the teachers in the use of the technologies at a five-day institute. Scientists who use the technologies in their research taught the basics of their use and the underlying science. Standards-based activities were used to integrate the technologies in the classroom setting. Students were tested before any instruction in the technologies and then tested on two further occasions. They used the technologies in field data collection and used that data in an inquiry-based project. Their projects were presented at a mini-science conference with scientists, teachers, parents, and other students in attendance. Significant differences were noted from pre-test to second post-test in both the spatial abilities and science sections. There was a gain in both spatial abilities and in specific atmospheric science content knowledge.
A knowledge infrastructure for occupational safety and health.
van Dijk, Frank J H; Verbeek, Jos H; Hoving, Jan L; Hulshof, Carel T J
2010-12-01
Occupational Safety and Health (OSH) professionals should use scientific evidence to support their decisions in policy and practice. Although examples from practice show that progress has been made in evidence-based decision making, there is a challenge to improve and extend the facilities that support knowledge translation in practice. A knowledge infrastructure that supports OSH practice should include scientific research, systematic reviews, practice guidelines, and other tools for professionals such as well accessible virtual libraries and databases providing knowledge, quality tools, and good learning materials. A good infrastructure connects facilities with each other and with practice. Training and education is needed for OSH professionals in the use of evidence to improve effectiveness and efficiency. New initiatives show that occupational health can profit from intensified international collaboration to establish a good functioning knowledge infrastructure.
Impacts of Geospatial Information for Decision Making
NASA Astrophysics Data System (ADS)
Pearlman, F.; Coote, A.; Friedl, L.; Stewart, M.
2012-12-01
Geospatial information contributes to decisions by both societal and individual decision-makers. More effective use of this information is essential as issues are increasingly complex and consequences can be critical for future economic and social development. To address this, a workshop brought together analysts, communicators, officials, and researchers from academia, government, non-governmental organizations, and the private sector. A range of policy issues, management needs, and resource requirements were discussed, and a wide array of analyses, geospatial data, methods of analysis, and metrics were presented for assessing and communicating the value of geospatial information. It is clear that there are many opportunities for integrating science and engineering disciplines with the social sciences to address societal issues that would benefit from using geospatial information and earth observations. However, these collaborations must have outcomes that can be easily communicated to decision makers. This generally requires succinct quantitative statements of value based on rigorous models and/or user testimonials of actual applications that save real money. An outcome of the workshop is to pursue the development of a community of practice or society that encompasses a wide range of scientific, social, management, and communication disciplines and fosters collaboration across specialties, helping to build trust across social and science aspects. A resource base is also necessary. This presentation will address approaches for creating a shared knowledge database containing a glossary of terms, reference materials, examples of case studies, and the potential applications for benefit analyses.
National Geospatial-Intelligence Agency Academic Research Program
NASA Astrophysics Data System (ADS)
Loomer, S. A.
2004-12-01
"Know the Earth.Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how other researchers and institutions can apply for grants under the program.
Geospatial information technology: an adjunct to service-based outreach and education.
Faruque, Fazlay; Hewlett, Peggy O; Wyatt, Sharon; Wilson, Kaye; Lofton, Susan; Frate, Dennis; Gunn, Jennie
2004-02-01
This exemplar highlights how geospatial information technology was effective in supporting academic practice, faculty outreach, and education initiatives at the University of Mississippi School of Nursing. Using this cutting-edge technology created a community-based prototype for fully integrating point-of-service research, practice, and academics into a cohesive strategy to influence change within the health care delivery system. This exemplar discusses ways this knowledge benefits practice and curriculum development; informs critical decision making affecting the people we serve; underscores the vital role nurses play in linking this technology to practice; and develops community residents as partners in their own health and that of the community.
WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets
NASA Astrophysics Data System (ADS)
Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.
2014-08-01
Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damages, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses were possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within his property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.
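As a simple illustration of the volumetric-change application mentioned above, the sketch below differences two elevation grids and converts the result into cut and fill volumes. The arrays and cell size are synthetic placeholders rather than the study's ALS or photogrammetric surfaces.

```python
# Cut/fill volumes from two gridded elevation surfaces (synthetic DEMs).
import numpy as np

cell_size = 2.0  # metres; assumed DEM resolution
rng = np.random.default_rng(7)

dem_before = 100.0 + rng.normal(0, 0.5, (500, 500))
dem_after = dem_before.copy()
dem_after[200:300, 200:300] -= 15.0   # simulated excavation

dz = dem_after - dem_before
cell_area = cell_size ** 2
cut = -dz[dz < 0].sum() * cell_area    # material removed (m^3)
fill = dz[dz > 0].sum() * cell_area    # material added (m^3)
print(f"Cut volume:  {cut:,.0f} m^3")
print(f"Fill volume: {fill:,.0f} m^3")
```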
USGS Geospatial Fabric and Geo Data Portal for Continental Scale Hydrology Simulations
NASA Astrophysics Data System (ADS)
Sampson, K. M.; Newman, A. J.; Blodgett, D. L.; Viger, R.; Hay, L.; Clark, M. P.
2013-12-01
This presentation describes the use of United States Geological Survey (USGS) data products and server-based resources for continental-scale hydrologic simulations. The USGS Modeling of Watershed Systems (MoWS) group provides a consistent national geospatial fabric built on NHDPlus. They have defined more than 100,000 hydrologic response units (HRUs) over the continental United States based on points of interest (POIs), split into left and right bank based on the corresponding stream segment. Geophysical attributes are calculated for each HRU and can be used to define parameters in hydrologic and land-surface models. The Geo Data Portal (GDP) project at the USGS Center for Integrated Data Analytics (CIDA) provides access to downscaled climate datasets and processing services via a web interface and Python modules for creating forcing datasets for any polygon (such as an HRU). These resources greatly reduce the labor required for creating model-ready data in-house, contributing to efficient and effective modeling applications. We will present an application of this USGS cyber-infrastructure for assessing the impacts of climate change on hydrology over the continental United States.
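A hedged sketch of the per-HRU forcing idea: zonal statistics of a gridded climate surface over HRU polygons. The file names are placeholders, and rasterstats is a third-party library chosen here for illustration; it is not part of the USGS tooling described above.

```python
# Zonal mean of a gridded forcing surface over HRU polygons (illustrative).
from rasterstats import zonal_stats

HRU_POLYGONS = "geospatial_fabric_hrus.shp"   # hypothetical HRU polygon file
PRECIP_GRID = "downscaled_precip_2050.tif"    # hypothetical downscaled climate grid

stats = zonal_stats(HRU_POLYGONS, PRECIP_GRID, stats=["mean"])

# One mean precipitation value per HRU, ready to feed a hydrologic model.
hru_precip = [s["mean"] for s in stats]
print(f"Computed mean forcing for {len(hru_precip)} HRUs")
```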
Real-time access of large volume imagery through low-bandwidth links
NASA Astrophysics Data System (ADS)
Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew
2010-04-01
Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded through rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access as an innovative approach based on standard off-the-shelf techniques to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution system is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.
Geospatial Data Standards for Indian Water Resources Systems
NASA Astrophysics Data System (ADS)
Goyal, A.; Tyagi, H.; Gosain, A. K.; Khosa, R.
2016-12-01
Sustainable management of water resources is fundamental to the socio-economic development of any nation. There is an increasing degree of dependency on digital geographical data for monitoring, planning, managing and preserving water resources and environmental quality. But the rising sophistication associated with the sharing of geospatial data among organizations or users demands the development of data standards for seamless information exchange among collaborators. Therefore, due to the realization that these datasets are vital for the efficient use of Geographical Information Systems, there is a growing emphasis on data standards for modeling, encoding and communicating spatial data. Real-world hydrologic interactions represented in a digital framework require geospatial standards that may vary across contexts such as governance, resource inventory, cultural diversity, identifiers, role and scale. Though the prevalent standards for hydrology data each address a particular need in a particular context, they lack a holistic approach. However, several worldwide initiatives, such as the Consortium for the Advancement of Hydrologic Sciences Inc. (USA), the Infrastructure for Spatial Information in the European Community (Europe) and the Australian Water Resources Information System, endeavour to address hydrology-specific spatial data standards in a more holistic manner. Unfortunately, there is no such provision for hydrology data exchange within the Indian community, and the existing standards fail to provide powerful communication of spatial hydrologic data. This study therefore investigates the shortcomings of the existing industry standards for hydrologic data models and then demonstrates a set of requirements for the effective exchange of hydrologic information in the Indian scenario.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander
2017-04-01
For comprehensive usage of large geospatial meteorological and climate datasets, it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. It is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for the comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at global and regional levels. The Web GIS consists of two basic software parts: (1) a server-side part comprising PHP applications of the SDI geoportal, which implements the interaction with the computational core backend and the WMS/WFS/WPS cartographical services and exposes an open API for browser-based client software; acting as the backend, it provides a limited set of procedures accessible via a standard HTTP interface; and (2) a front-end part, a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/), which implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as the basis for the Web GIS client development. In line with general INSPIRE requirements for data visualization, the Web GIS provides standard functionality such as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata. The specialized Web GIS client contains three basic tiers: a tier of NetCDF metadata in JSON format; a middleware tier of JavaScript objects implementing methods to work with the NetCDF metadata, the XML file of the selected calculation configuration (XML task) and the WMS/WFS/WPS cartographical services; and a graphical user interface tier of JavaScript objects implementing the general application business logic. The Web GIS provides the launching of computational processing services to support tasks in the area of environmental monitoring, as well as the presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
NASA Astrophysics Data System (ADS)
Murakami, S.; Takemoto, T.; Ito, Y.
2012-07-01
The Japanese government, local governments and businesses are working closely together to establish spatial data infrastructures in accordance with the Basic Act on the Advancement of Utilizing Geospatial Information (NSDI Act, established in August 2007). Spatial data infrastructures are urgently required not only to accelerate computerization of the public administration, but also to help restoration and reconstruction of the areas struck by the East Japan Great Earthquake and future disaster prevention and reduction. For the construction of a spatial data infrastructure, various guidelines have been formulated, but after an infrastructure is constructed there is the problem of maintaining it. In one case, an organization updates its spatial data only once every several years because of budget problems. Departments and sections update the data on their own without careful consideration. That upsets the quality control of the entire data system and the system loses integrity, which is crucial to a spatial data infrastructure. To ensure quality, it would ideally be desirable to update the data of the entire area every year, but that is virtually impossible considering the recent budget crunch. The method we suggest is to update only spatial data items of higher importance in order to maintain quality, rather than updating all the items across the board. We have explored a method of partially updating the data of two such features, roads and buildings, while ensuring the accuracy of locations. Using this method, data on roads and buildings that change greatly with time can be updated almost in real time, or at least within a year. The method will help increase the availability of a spatial data infrastructure. We have conducted an experiment on the spatial data infrastructure of a municipality using those data. As a result, we have found that it is possible to update data of both features almost in real time.
Spatial Information Technology Center at Fulton-Montgomery Community College
NASA Technical Reports Server (NTRS)
Flinton, Michael E.
2004-01-01
The Spatial Information Technology Center (SITC) at Fulton-Montgomery Community College (FMCC) continued to fulfill its mission and charter by successfully completing its third year of operations under Congressional funding and NASA sponsorship. Third year operations (01 Oct 02 - 30 Sep 03) have been funded and conducted utilizing two authorized Research Grants, NAG 13-00043 (via a one-year no-cost extension expiring Sep 03) and NAG 13-02053 (one-year no-cost extension expiring Sep 04). Drawdowns and reporting of fiscal activities for SITC operations continue to pass through the Institute for the Application of Geo-spatial Technology (IAGT) at Cayuga Community College in Auburn, New York. Fiscal activity of the Center is reported quarterly via SF 272 to IAGT; thus this report contains only a budgetary overview and forecast of future expenditures for the remaining funds of NAG 13-02053. Funds from NAG 13-00043 were exhausted during the fourth quarter of fiscal year FY02-03, which necessitated the initial drawdown of NAG 13-02053. The IAGT receives no compensation for administrative costs, as authorized and approved by NASA in each award budget. This report also includes the necessary addenda for each NAG award, as required by federal guidelines, though no reportable activities took place within this report period. Attached are the signed Report of New Technology/Inventions and a Final Property Report identifying qualifying equipment purchased by the Center. As an academic, economic and workforce development oriented program, the Center has made significant strides in bringing the technology, knowledge and applications of the spatial information technology field to the region it serves. Through the mission of the Center, the region's educational, economic development and workforce communities have become increasingly educated about the benefits of spatial (geospatial) technology, particularly in the region's K-12 arena. SITC continues to positively affect the region's education, employment and economic development, while expanding its services and operations designed to be customer driven, growing infrastructure and effecting systemic change.
Promoting Ecohealth through Geography and Governmental Partnerships
Ecohealth is truly interdisciplinary and now includes the relatively new field of exposure science. In 2012, the National Research Council released Exposure Science in the 21st Century: A Vision and a Strategy, in which application of geospatial knowledge and technology such as r...
Spatial information semantic query based on SPARQL
NASA Astrophysics Data System (ADS)
Xiao, Zhifeng; Huang, Lei; Zhai, Xiaofang
2009-10-01
How can the efficiency of spatial information queries be enhanced in today's fast-growing information age? We are rich in geospatial data but poor in up-to-date geospatial information and knowledge that are ready to be accessed by public users. This paper adopts an approach for querying spatial semantics by building an ontology in Web Ontology Language (OWL) format and introducing the SPARQL Protocol and RDF Query Language (SPARQL) to search spatial semantic relations. Establishing spatial semantics that support effective spatial reasoning is important for performing semantic queries. Compared to earlier keyword-based information retrieval techniques that rely on syntax, we use semantic approaches in our spatial query system. Semantic approaches need to be developed with an ontology, so we use OWL to describe spatial information extracted from the large-scale map of Wuhan. Spatial information expressed by an ontology with formal semantics is available to machines for processing and to people for understanding. The approach is illustrated by a case study that uses SPARQL to query geospatial ontology instances of Wuhan. The paper shows that using SPARQL to search OWL ontology instances can ensure the accuracy and applicability of the results. The results also indicate that constructing a geospatial semantic query system has positive effects on spatial query and retrieval.
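As a hedged illustration of the kind of SPARQL query described, the sketch below loads an OWL ontology with the Python rdflib library and runs a spatial-relation query; the file name, namespace and property names are invented for the example and do not reproduce the Wuhan ontology.

```python
# Illustrative only: the ontology file, namespace and properties are invented,
# not the actual Wuhan ontology used in the paper.
from rdflib import Graph

g = Graph()
g.parse("wuhan_spatial.owl", format="xml")   # OWL serialized as RDF/XML

query = """
PREFIX ex: <http://example.org/wuhan/spatial#>
SELECT ?road ?name
WHERE {
    ?road  a ex:Road ;
           ex:name ?name ;
           ex:crosses ?river .
    ?river ex:name "Yangtze River" .
}
"""

for row in g.query(query):
    print(row.road, row.name)    # each row binds the SELECT variables
```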
Development of a National Digital Geospatial Data Framework
1995-01-01
This proposal of a data framework to organize and enhance the activities of the geospatial data community to meet needs for basic themes of data was developed in response to a request in Executive Order 12906, Coordinating Geographic Data Acquisition and Access: The National Spatial Data Infrastructure (U.S. Executive Office of the President, 1994). The request stated: in consultation with State, local, and tribal governments and within 9 months of the date of this order, the FGDC shall submit a plan and schedule to OMB [U.S. Office of Management and Budget] for completing the initial implementation of a national digital geospatial data framework ("framework") by January 2000 and for establishing a process of ongoing data maintenance. The framework shall include geospatial data that are significant, in the determination of the FGDC, to a broad variety of users within any geographic area or nationwide. At a minimum, the plan shall address how the initial transportation, hydrology, and boundary elements of the framework might be completed by January 1998 in order to support the decennial census of 2000. The proposal was developed by representatives of local, regional, State, and Federal agencies under the auspices of the Federal Geographic Data Committee (FGDC). The individuals are listed in the appendix of this report. This Framework Working Group identified the purpose and goals for the framework; identified incentives for participation; defined the information content; developed preliminary technical, operational, and business contexts; specified the institutional roles needed; and developed a strategy for a phased implementation of the framework. Members of the working group presented the concepts of the framework for discussion at several national and regional public meetings. The draft of the report also was provided for public, written review. These discussions and reviews were the source of many improvements to the report. The FGDC approved the report for submission to the Office of Management and Budget on March 31, 1995.
Web catalog of oceanographic data using GeoNetwork
NASA Astrophysics Data System (ADS)
Marinova, Veselka; Stefanov, Asen
2017-04-01
Most of the data collected, analyzed and used by the Bulgarian oceanographic data center (BgODC) from scientific cruises, Argo floats, ferry boxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as those from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). GeoNetwork opensource was chosen because it is already widespread. This software is a free, effective and "cheap" solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Filling the catalog goes through these practical steps: • managing and storing data reliably within an MS SQL spatial database; • registering maps and data of various formats and sources in GeoServer (the most popular open source geospatial server, embedded with GeoNetwork); • adding metadata and publishing geospatial data from the GeoNetwork desktop. GeoServer and GeoNetwork are based on Java, so they require the installation of a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to be customized. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive due to the availability of many features necessary for producing "INSPIRE compliant" metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type and free-text search.
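Since GeoNetwork exposes its catalogue through the OGC CSW interface, a record search can be scripted as in the sketch below; the endpoint URL and search term are placeholders, assuming a standard GeoNetwork CSW deployment.

```python
# Assumes a standard GeoNetwork CSW endpoint; the URL and query term are placeholders.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://catalog.example.org/geonetwork/srv/eng/csw")

# Full-text search over the catalogue for temperature-related records
constraint = PropertyIsLike("csw:AnyText", "%temperature%")
csw.getrecords2(constraints=[constraint], maxrecords=10, esn="summary")

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```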
Prototyping an online wetland ecosystem services model using open model sharing standards
Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.
2011-01-01
Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging, as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models, as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from disparate sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
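The sketch below shows how a WPS-published model of this kind could be invoked from Python with OWSLib; the service URL, process identifier and inputs are hypothetical stand-ins, not the actual wetland services described here.

```python
# Hypothetical WPS endpoint, process identifier and inputs; a sketch of the
# invocation pattern only, not the actual wetland ecosystem-service models.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("https://wetlands.example.org/wps")

# List the processes the server advertises
for proc in wps.processes:
    print(proc.identifier, "-", proc.title)

# Execute a (hypothetical) water-storage process for one wetland basin
inputs = [
    ("wetlandBoundary", "https://wetlands.example.org/data/basin42.gml"),
    ("precipitationYear", "2010"),
]
execution = wps.execute("ppr:WaterStorage", inputs, output="storageEstimate")
monitorExecution(execution)          # poll until the asynchronous run finishes
print(execution.status)
```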
A GEO Initiative to Support the Sustainable Development Goals
NASA Astrophysics Data System (ADS)
Friedl, L.
2016-12-01
The United Nations Agenda 2030 serves as a global development agenda for progress on economic, social and environmental sustainability. These Sustainable Development Goals (SDGs) have a specific provision for the use of Earth observations and geospatial information to support progress. The international Group on Earth Observations (GEO) has a dedicated initiative focused on the SDGs. This initiative supports efforts to integrate Earth observations and geospatial information into national development and monitoring frameworks for the SDGs. It helps enable countries and stakeholders to leverage Earth observations to support the implementation, planning, measuring, monitoring, reporting, and evaluation of the SDGs. This paper will present an overview of the GEO initiative and ways that Earth observations support the development goals. It will address how information and knowledge can be shared on effective methods to apply Earth observations to the SDGs and their associated targets and indicators. It will also highlight some existing information sources and tools on the SDGs, which can help identify key approaches for developing a knowledge base.
NASA Astrophysics Data System (ADS)
Hudspeth, W. B.; Baros, S.; Barrett, H.; Savickas, J.; Erickson, J.
2015-12-01
WC WAVE (Western Consortium for Watershed Analysis, Visualization and Exploration) is a collaborative research project between the states of Idaho, Nevada, and New Mexico that is funded under the National Science Foundation's Experimental Program to Stimulate Competitive Research (EPSCoR). The goal of the project is to understand and document the effects of climate change on interactions between precipitation, vegetation growth, soil moisture and other landscape properties. These interactions are modeled within a framework we refer to as a virtual watershed (VW), a computer infrastructure that simulates watershed dynamics by linking scientific modeling, visualization, and data management components into a coherent whole. Developed and hosted at the Earth Data Analysis Center, University of New Mexico, the virtual watershed has a number of core functions, which include: a) streamlined access to data required for model initialization and boundary conditions; b) the development of analytic scenarios through interactive visualization of available data and the storage of model configuration options; and c) coupling of hydrological models through the rapid assimilation of model outputs into the data management system for access and use by subsequent models. The WC-WAVE virtual watershed accomplishes these functions through the provision of large-scale vector and raster data discovery, subsetting, and delivery via Open Geospatial Consortium (OGC) and REST web service standards. Central to the virtual watershed is the design and use of an innovative array of metadata elements that permits the stepwise coupling of diverse hydrological models (e.g. ISNOBAL, PRMS, CASiMiR) and input data to rapidly assess variation in outcomes under different climatic conditions. We present details on the architecture and functionality of the virtual watershed and results from three western U.S. watersheds, and discuss the benefits realized for watershed science by employing this integrated solution.
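A data subsetting call against such a REST layer might look like the sketch below; the base URL and query parameters are entirely hypothetical, intended only to illustrate the discovery-and-delivery pattern rather than the actual WC-WAVE API.

```python
import requests

# Hypothetical virtual-watershed endpoint and parameter names (not the real WC-WAVE API)
VW_BASE = "https://vwp.example.org/api/datasets"

params = {
    "model": "isnobal",                      # hypothetical filter keys
    "variable": "snow_water_equivalent",
    "bbox": "-116.2,43.0,-115.6,43.6",       # lon/lat subsetting window
    "start": "2011-01-01",
    "end": "2011-03-31",
    "format": "netcdf",
}
resp = requests.get(VW_BASE, params=params, timeout=60)
resp.raise_for_status()

# Save the subset for use as input to the next model in the chain
with open("swe_subset.nc", "wb") as f:
    f.write(resp.content)
```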
ERIC Educational Resources Information Center
Mathisen, Arve; Nerland, Monika
2012-01-01
This paper employs a socio-technical perspective to explore the role of complex work support systems in organising knowledge and providing opportunities for learning in professional work. Drawing on concepts from infrastructure studies, such systems are seen as work infrastructures which connect information, knowledge, standards and work…
Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)
NASA Astrophysics Data System (ADS)
Nebert, D. D.; Huang, Q.; Yang, C.
2013-12-01
Twenty-first century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, and cost estimation for cloud operations, are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists to consider in the adoption of cloud computing.
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to use real-time, up-to-date air quality information gathered by spatially distributed sensors in megacities, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionalities of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time. Interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is processed to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure in order to present an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system provides capabilities to retrieve SOS observations using WPS in a cascaded service chaining pattern for monitoring trends in timely sensor observations.
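To make the SOS interaction concrete, the sketch below issues a standard SOS 2.0 GetObservation request over the KVP binding using the Python requests library; the endpoint, offering and observed-property URIs are placeholders for whatever the Tehran deployment actually exposes.

```python
import requests

# Placeholder endpoint, offering and property URIs; the request parameters
# follow the generic OGC SOS 2.0 KVP binding, not this system's exact setup.
SOS_URL = "https://airquality.example.org/sos"

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "CO_Network",
    "observedProperty": "http://example.org/phenomena/CO",
    "temporalFilter": "om:phenomenonTime,2013-09-01T00:00:00Z/2013-09-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

resp = requests.get(SOS_URL, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:500])   # O&M XML observations, ready for AQI calculation
```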
GeoBrain Computational Cyber-laboratory for Earth Science Studies
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2009-12-01
Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet the wide-ranging needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
NASA Technical Reports Server (NTRS)
Rilee, Michael Lee; Kuo, Kwo-Sen
2017-01-01
The SpatioTemporal Adaptive Resolution Encoding (STARE) is a unifying scheme encoding geospatial and temporal information for organizing data on scalable computing/storage resources, minimizing expensive data transfers. STARE provides a compact representation that turns set-logic functions into integer operations, e.g. conditional sub-setting, taking into account representative spatiotemporal resolutions of the data in the datasets. STARE geo-spatiotemporally aligns data placements of diverse data on massive parallel resources to maximize performance. Automating important scientific functions (e.g. regridding) and computational functions (e.g. data placement) allows scientists to focus on domain-specific questions instead of expending their efforts and expertise on data processing. With STARE-enabled automation, SciDB (Scientific Database) plus STARE provides a database interface, reducing costly data preparation, increasing the volume and variety of interoperable data, and easing result sharing. Using SciDB plus STARE as part of an integrated analysis infrastructure dramatically eases combining diametrically different datasets.
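The toy sketch below is not the actual STARE encoding; it only illustrates the general idea of hierarchical integer keys, where a containment test ("does this observation fall inside that coarser cell?") reduces to an integer shift-and-compare, so that conditional subsetting and co-location become sort/merge operations on integers.

```python
# Toy illustration only (NOT the STARE scheme): interleave lat/lon refinement
# bits into an integer key so containment becomes an integer prefix test.

def quadkey(lat, lon, level):
    """Return an integer key for a lat/lon point at a given resolution level."""
    key = 0
    lat_min, lat_max = -90.0, 90.0
    lon_min, lon_max = -180.0, 180.0
    for _ in range(level):
        lat_mid = (lat_min + lat_max) / 2
        lon_mid = (lon_min + lon_max) / 2
        bit_lat = 1 if lat >= lat_mid else 0
        bit_lon = 1 if lon >= lon_mid else 0
        key = (key << 2) | (bit_lat << 1) | bit_lon
        lat_min, lat_max = (lat_mid, lat_max) if bit_lat else (lat_min, lat_mid)
        lon_min, lon_max = (lon_mid, lon_max) if bit_lon else (lon_min, lon_mid)
    return key

def contains(coarse_key, coarse_level, fine_key, fine_level):
    """Containment check expressed as an integer prefix comparison."""
    return (fine_key >> 2 * (fine_level - coarse_level)) == coarse_key

k_region = quadkey(40.0, -105.0, 5)        # coarse cell of a study region
k_obs = quadkey(40.01, -105.02, 12)        # fine cell of one observation
print(contains(k_region, 5, k_obs, 12))    # True: observation lies in the region cell
```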
NASA Astrophysics Data System (ADS)
Selsam, Peter; Schwartze, Christian
2016-10-01
Providing software solutions via the internet has been known for quite some time and is now an increasing trend marketed as "software as a service". Many business units accept the new methods and streamlined IT strategies by offering web-based infrastructures for external software usage, but geospatial applications featuring very specialized services or functionalities on demand are still rare. Originally applied in desktop environments, the ILMSimage tool for remote sensing image analysis and classification was modified in its communication structures and enabled to run on a high-power server, benefiting from Taverna software. On top of this, a GIS-like, web-based user interface guides the user through the different steps in ILMSimage. ILMSimage combines object-oriented image segmentation with pattern recognition features. Basic image elements form a construction set to model large image objects with diverse and complex appearance. There is no need for the user to set up detailed object definitions. Training is done by delineating one or more typical examples (templates) of the desired object using a simple vector polygon. The template can be large and does not need to be homogeneous. The template is completely independent from the segmentation. The object definition is done entirely by the software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deshmukh, Ranjit; Wu, Grace
The MapRE (Multi-criteria Analysis for Planning Renewable Energy) GIS (Geographic Information Systems) Tools are a set of ArcGIS tools to a) conduct site suitability analysis for wind and solar resources using inclusion and exclusion criteria, and create resource maps; b) create project opportunity areas and compute various attributes such as cost, distances to existing and planned infrastructure, and environmental impact factors; and c) calculate and update various attributes for already processed renewable energy zones. In addition, the MapRE data sets are geospatial data of renewable energy project opportunity areas and zones with pre-calculated attributes for several countries. These tools and data are available at mapre.lbl.gov.
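A minimal NumPy sketch of the inclusion/exclusion idea is shown below; the rasters are random stand-ins and the thresholds are placeholders, not MapRE's published criteria.

```python
import numpy as np

# Toy rasters on a common grid (values invented for illustration)
wind_speed = np.random.uniform(3.0, 9.0, size=(200, 200))   # m/s at hub height
slope = np.random.uniform(0.0, 30.0, size=(200, 200))       # percent
protected = np.random.rand(200, 200) < 0.15                  # exclusion mask

# Inclusion criteria (thresholds are placeholders, not MapRE's actual values)
suitable = (wind_speed >= 6.0) & (slope <= 20.0) & (~protected)

print("suitable cells:", int(suitable.sum()), "of", suitable.size)
```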
A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L
2011-01-01
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions (class-conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement for a large number of accurate training samples (10 to 30 times the number of dimensions), which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
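The authors' hybrid algorithm is not reproduced here, but the sketch below shows the general semi-supervised pattern it builds on, using scikit-learn's self-training wrapper around a simple generative classifier on a synthetic "multispectral" feature matrix.

```python
# Illustrative semi-supervised sketch (not the paper's hybrid algorithm):
# most pixels are unlabeled, and self-training propagates labels to them.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.semi_supervised import SelfTrainingClassifier

# Synthetic stand-in for a 6-band multispectral feature matrix with 3 classes
X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

rng = np.random.RandomState(0)
y_train = y.copy()
y_train[rng.rand(len(y)) < 0.95] = -1   # mark 95% of samples as unlabeled (-1)

base = GaussianNB()                     # stands in for the generative ML classifier
model = SelfTrainingClassifier(base).fit(X, y_train)
print("accuracy on all pixels:", model.score(X, y))
```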
Measuring the Interdisciplinary Impact of Using Geospatial Data with Remote Sensing Data
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; Schumacher, J.
2017-12-01
Various disciplines offer benefits to society by contributing to the scientific progress that informs the knowledge and decisions that improve the lives, safety, and conditions of people around the globe. In addition to disciplines within the natural sciences, other disciplines, including those in the social, health, and computer sciences, provide benefits to society by collecting, preparing, and analyzing data in the process of conducting research. Preparing geospatial environmental and socioeconomic data together with remote sensing data from satellite-based instruments for wider use by heterogeneous communities of users increases the potential impact of these data by enabling their use in different application areas and sectors of society. Furthermore, enabling wider use of scientific data can bring to bear resources and expertise that will improve reproducibility, quality, methodological transparency, interoperability, and improved understanding by diverse communities of users. In line with its commitment to open data, the NASA Socioeconomic Data and Applications Center (SEDAC), which focuses on human interactions in the environment, curates and disseminates freely and publicly available geospatial data for use across many disciplines and societal benefit areas. We describe efforts to broaden the use of SEDAC data and to publicly document their impact, assess the interdisciplinary impact of the use of SEDAC data with remote sensing data, and characterize these impacts in terms of their influence across disciplines by analyzing citations of geospatial data with remote sensing data within scientific journals.
Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities
NASA Astrophysics Data System (ADS)
Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.
2011-12-01
In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles, with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related framework and set of design principles to guide the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching diverse learners, including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a design-based research implementation study with urban middle school students. Findings showed that use of the Climate Change curriculum significantly improved urban middle school students' understanding of climate change concepts.
Cauzzi, Carlo; Fah, Donat; Wald, David J.; Clinton, John; Losey, Stephane; Wiemer, Stefan
2018-01-01
In Switzerland, nearly all historical Mw ~ 6 earthquakes have induced damaging landslides, rockslides and snow avalanches that, in some cases, also resulted in damage to infrastructure and loss of life. We describe the customisation to Swiss conditions of a globally calibrated statistical approach originally developed to rapidly assess earthquake-induced landslide likelihoods worldwide. The probability of occurrence of such earthquake-induced effects is modelled through a set of geospatial susceptibility proxies and peak ground acceleration. The predictive model is tuned to capture the observations from past events and optimised for near-real-time estimates based on USGS-style ShakeMaps routinely produced by the Swiss Seismological Service. Our emphasis is on the use of high-resolution geospatial datasets along with additional local information on ground failure susceptibility. Although calibrated on historical events with moderate magnitudes, the methodology presented in this paper also yields sensible results for recent low-magnitude events. The model is integrated in the Swiss ShakeMap framework. This study has high practical relevance to many Swiss ShakeMap stakeholders, especially those managing lifeline systems, and to other global users interested in conducting a similar customisation for their region of interest.
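A minimal sketch of a logistic model of this general form is given below; the susceptibility proxies and coefficients are placeholders, not the calibrated global or Swiss values.

```python
import numpy as np

# Minimal sketch of a logistic ground-failure model of the kind described:
# P(landslide) = 1 / (1 + exp(-(b0 + b1*ln(PGA) + b2*slope + b3*wetness)))
# The coefficients below are placeholders, NOT the calibrated model parameters.

def landslide_probability(pga_g, slope_deg, cti, b=(-6.0, 1.5, 0.08, 0.2)):
    b0, b_pga, b_slope, b_cti = b
    z = b0 + b_pga * np.log(pga_g) + b_slope * slope_deg + b_cti * cti
    return 1.0 / (1.0 + np.exp(-z))

# Grid of peak ground acceleration (from a ShakeMap) plus susceptibility proxies
pga = np.array([[0.05, 0.20], [0.35, 0.50]])      # g
slope = np.array([[5.0, 25.0], [30.0, 40.0]])     # degrees
cti = np.array([[4.0, 6.0], [7.0, 8.0]])          # compound topographic index
print(landslide_probability(pga, slope, cti))
```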
NASA Astrophysics Data System (ADS)
Clark, E. P.; Cosgrove, B.; Salas, F.
2016-12-01
As a significant step forward to transform NOAA's water prediction services, NOAA plans to implement a new National Water Model (NWM) Version 1.0 in August 2016. A continental-scale water resources model, the NWM is an evolution of the WRF-Hydro architecture developed by the National Center for Atmospheric Research (NCAR). The NWM will provide analyses and forecasts of flow for the 2.7 million stream reaches nationwide in the National Hydrography Dataset Plus v2 (NHDPlusV2) jointly developed by the USGS and EPA. The NWM also produces high-resolution water budget variables of snow, soil moisture, and evapotranspiration on a 1-km grid. NOAA's stakeholders require additional decision support applications to be built on these data. The Geo-intelligence division of the Office of Water Prediction is building new products and services that integrate output from the NWM with geospatial datasets such as infrastructure and demographics to better estimate the impacts of dynamic water resource states on community resiliency. This presentation will detail the methods and underlying information used to produce prototype water resources intelligence that is timely, actionable and credible. Moreover, it will explore the NWM's capability to support sector-specific decision support services.
Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus
NASA Astrophysics Data System (ADS)
Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.
2017-12-01
Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and support machine learning discoveries to render diverse types of geospatial data and facilitate interactive analysis. Key components of the system architecture include NASA's WebWorldWind, the Globus toolkit, and PostgreSQL, as well as other custom-built software modules.
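As a hedged illustration of two of the analytics toolbox components named above (principal-component analysis plus clustering), the sketch below runs both on a synthetic indicator matrix with scikit-learn; the data, dimensions and cluster count are invented.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Toy energy/water indicator matrix: rows = regions, columns = indicators
rng = np.random.RandomState(42)
X = rng.normal(size=(500, 12))   # e.g. withdrawals, generation, climate normals

X_std = StandardScaler().fit_transform(X)
pcs = PCA(n_components=3).fit_transform(X_std)               # component scores
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pcs)

print(np.bincount(labels))       # cluster sizes, ready to map back onto geography
```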
NASA Astrophysics Data System (ADS)
Haener, Rainer; Waechter, Joachim; Grellet, Sylvain; Robida, Francois
2017-04-01
Interoperability is the key factor in establishing scientific research environments and infrastructures, as well as in bringing together heterogeneous, geographically distributed risk management, monitoring, and early warning systems. Based on developments within the European Plate Observing System (EPOS), a reference architecture has been devised that comprises architectural blue-prints and interoperability models regarding the specification of business processes and logic as well as the encoding of data, metadata, and semantics. The architectural blueprint is developed on the basis of the so called service-oriented architecture (SOA) 2.0 paradigm, which combines intelligence and proactiveness of event-driven with service-oriented architectures. SOA 2.0 supports analysing (Data Mining) both, static and real-time data in order to find correlations of disparate information that do not at first appear to be intuitively obvious: Analysed data (e.g., seismological monitoring) can be enhanced with relationships discovered by associating them (Data Fusion) with other data (e.g., creepmeter monitoring), with digital models of geological structures, or with the simulation of geological processes. The interoperability model describes the information, communication (conversations) and the interactions (choreographies) of all participants involved as well as the processes for registering, providing, and retrieving information. It is based on the principles of functional integration, implemented via dedicated services, communicating via service-oriented and message-driven infrastructures. The services provide their functionality via standardised interfaces: Instead of requesting data directly, users share data via services that are built upon specific adapters. This approach replaces the tight coupling at data level by a flexible dependency on loosely coupled services. The main component of the interoperability model is the comprehensive semantic description of the information, business logic and processes on the basis of a minimal set of well-known, established standards. It implements the representation of knowledge with the application of domain-controlled vocabularies to statements about resources, information, facts, and complex matters (ontologies). Seismic experts for example, would be interested in geological models or borehole measurements at a certain depth, based on which it is possible to correlate and verify seismic profiles. The entire model is built upon standards from the Open Geospatial Consortium (Dictionaries, Service Layer), the International Organisation for Standardisation (Registries, Metadata), and the World Wide Web Consortium (Resource Description Framework, Spatial Data on the Web Best Practices). It has to be emphasised that this approach is scalable to the greatest possible extent: All information, necessary in the context of cross-domain infrastructures is referenced via vocabularies and knowledge bases containing statements that provide either the information itself or resources (service-endpoints), the information can be retrieved from. The entire infrastructure communication is subject to a broker-based business logic integration platform where the information exchanged between involved participants, is managed on the basis of standardised dictionaries, repositories, and registries. 
This approach also enables the development of Systems-of-Systems (SoS), which allow autonomous, large-scale, concurrent and distributed systems to interact cooperatively as a collective in a common environment.
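A toy sketch of the event-driven side of the SOA 2.0 idea is given below: services subscribe to topics on a broker and react to published events instead of being polled. This is an illustration of the pattern only, not the EPOS broker implementation, and all topic and station names are invented.

```python
from collections import defaultdict
from typing import Callable, Dict, List

# Toy message broker: services register interest in topics and are notified
# when events arrive, decoupling producers from consumers.
class Broker:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

broker = Broker()

# A hypothetical "fusion" service correlates seismic picks with creepmeter data
def correlate(event: dict) -> None:
    print("correlating", event["station"], "with creepmeter records")

broker.subscribe("seismic.pick", correlate)
broker.publish("seismic.pick", {"station": "CH.DAVOX", "time": "2017-04-01T00:00:00Z"})
```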
NASA Astrophysics Data System (ADS)
Lari, Z.; El-Sheimy, N.
2017-09-01
In recent years, the increasing incidence of climate-related disasters has tremendously affected our environment. In order to effectively manage and reduce the dramatic impacts of such events, the development of timely disaster management plans is essential. Since these disasters are spatial phenomena, timely provision of geospatial information is crucial for the effective development of response and management plans. Due to the inaccessibility of the affected areas and the limited budgets of first responders, timely acquisition of the required geospatial data for these applications is usually possible only using low-cost imaging and georeferencing sensors mounted on unmanned platforms. Despite rapid collection of the required data using these systems, available processing techniques are not yet capable of delivering geospatial information to responders and decision makers in a timely manner. To address this issue, this paper introduces a new technique for dense 3D reconstruction of the affected scenes which can deliver and improve the needed geospatial information incrementally. The approach is built on prior 3D knowledge of the scene and employs computationally efficient 2D triangulation, feature descriptor, feature matching and point verification techniques to optimize and speed up the dense 3D scene reconstruction procedure. To verify the feasibility and computational efficiency of the proposed approach, an experiment using a set of consecutive images collected onboard a UAV platform and prior low-density airborne laser scanning over the same area is conducted, and step-by-step results are provided. A comparative analysis of the proposed approach and an available image-based dense reconstruction technique is also conducted to prove the computational efficiency and competency of this technique for delivering geospatial information with pre-specified accuracy.
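Only the 2D feature-description and matching step is sketched below, using OpenCV's ORB detector and a brute-force matcher; the image file names are placeholders, and the prior-knowledge-guided verification and incremental densification described in the paper are not reproduced.

```python
import cv2

# Sketch of the 2D feature extraction/matching step only (ORB + brute force),
# not the authors' full prior-knowledge-guided dense reconstruction pipeline.
img1 = cv2.imread("uav_frame_000.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder files
img2 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

if matches:
    print(len(matches), "tentative correspondences; best distance:", matches[0].distance)
# Verified matches would then be checked against the prior laser-scanning surface.
```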
Academic research opportunities at the National Geospatial-Intelligence Agency(NGA)
NASA Astrophysics Data System (ADS)
Loomer, Scott A.
2006-05-01
The vision of the National Geospatial-Intelligence Agency (NGA) is to "Know the Earth...Show the Way." To achieve this vision, the NGA provides geospatial intelligence in all its forms and from whatever source (imagery, imagery intelligence, and geospatial data and information) to ensure the knowledge foundation for planning, decision, and action. Academia plays a key role in the NGA research and development program through the NGA Academic Research Program. This multi-disciplinary program of basic research in geospatial intelligence topics provides grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program are: *NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. *Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. *Intelligence Community Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how researchers and institutions can apply for grants under the program. In addition, other opportunities for academia to engage with NGA through training programs and recruitment are discussed.
Geospatial Technology Applications and Infrastructure in the Biological Resources Division
D'Erchia, Frank; Getter, James; D'Erchia, Terry D.; Root, Ralph; Stitt, Susan; White, Barbara
1998-01-01
Executive Summary -- Automated spatial processing technology such as geographic information systems (GIS), telemetry, and satellite-based remote sensing are some of the more recent developments in the long history of geographic inquiry. For millennia, humankind has endeavored to map the Earth's surface and identify spatial relationships. But the precision with which we can locate geographic features has increased exponentially with satellite positioning systems. Remote sensing, GIS, thematic mapping, telemetry, and satellite positioning systems such as the Global Positioning System (GPS) are tools that greatly enhance the quality and rapidity of analysis of biological resources. These technologies allow researchers, planners, and managers to more quickly and accurately determine appropriate strategies and actions. Researchers and managers can view information from new and varying perspectives using GIS and remote sensing, and GPS receivers allow the researcher or manager to identify the exact location of interest. These geospatial technologies support the mission of the U.S. Geological Survey (USGS) Biological Resources Division (BRD) and the Strategic Science Plan (BRD 1996) by providing a cost-effective and efficient method for collection, analysis, and display of information. The BRD mission is 'to work with others to provide the scientific understanding and technologies needed to support the sound management and conservation of our Nation's biological resources.' A major responsibility of the BRD is to develop and employ advanced technologies needed to synthesize, analyze, and disseminate biological and ecological information. As the Strategic Science Plan (BRD 1996) states, 'fulfilling this mission depends on effectively balancing the immediate need for information to guide management of biological resources with the need for technical assistance and long-range, strategic information to understand and predict emerging patterns and trends in ecological systems.' Information sharing plays a key role in nearly everything BRD does. The Strategic Science Plan discusses the need to (1) develop tools and standards for information transfer, (2) disseminate information, and (3) facilitate effective use of information. This effort centers around the National Biological Information Infrastructure (NBII) and the National Spatial Data Infrastructure (NSDI), components of the National Information Infrastructure. The NBII and NSDI are distributed electronic networks of biological and geographical data and information, as well as tools to help users around the world easily find and retrieve the biological and geographical data and information they need. The BRD is responsible for developing scientifically and statistically reliable methods and protocols to assess the status and trends of the Nation's biological resources. Scientists also conduct important inventory and monitoring studies to maintain baseline information on these same resources. Research on those species for which the Department of the Interior (DOI) has trust responsibilities (including endangered species and migratory species) involves laboratory and field studies of individual animals and the environments in which they live. Researchboth tactical and strategicis conducted at the BRD's 17 science centers and 81 field stations, 54 Cooperative Fish and Wildlife Research Units in 40 states, and at 11 former Cooperative Park Study Units. 
Studies encompass fish, birds, mammals, and plants, as well as their ecosystems and the surrounding landscape. Biological Resources Division researchers use a variety of scientific tools in their endeavors to understand the causes of biological and ecological trends. Research results are used by managers to predict environmental changes and to help them take appropriate measures to manage resources effectively. The BRD Geospatial Technology Program facilitates the collection, analysis, and dissemination of data and informat
Virtual Hubs for facilitating access to Open Data
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Latre, Miguel Á.; Ernst, Julia; Brumana, Raffaella; Brauman, Stefan; Nativi, Stefano
2015-04-01
In October 2014 the ENERGIC-OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". In ENERGIC-OD, Virtual Hubs are conceived as information systems supporting the full life cycle of Open Data: publishing, discovery and access. They facilitate the use of Open Data by lowering and possibly removing the main barriers which hamper geo-information (GI) usage by end-users and application developers. Data and data service heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort on accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use cases. Thus, beyond a certain extent, heterogeneity is irreducible, especially in interdisciplinary contexts. ENERGIC-OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonizing service interfaces, metadata and data models, enabling seamless discovery of and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC-OD will integrate several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services. ENERGIC-OD will deploy a set of five Virtual Hubs (VHs) at the national level in France, Germany, Italy, Poland and Spain, and an additional one at the European level. VHs will be provided according to the cloud Software-as-a-Service model. The main expected impact of VHs is the creation of new business opportunities by opening up access to Research Data and Public Sector Information. Therefore, ENERGIC-OD addresses not only end-users, who will have the opportunity to access the VHs through a geo-portal, but also application developers, who will be able to access VH functionalities through simple Application Programming Interfaces (API). The ENERGIC-OD Consortium will develop ten different applications on top of the deployed VHs. They aim to demonstrate how VHs facilitate the development of new and multidisciplinary applications based on the full exploitation of (open) GI, hence stimulating innovation and business activities.
Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data
NASA Technical Reports Server (NTRS)
Baxes, Gregory; Mixon, Brian; Linger, TIm
2013-01-01
Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and the associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data are structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
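The sketch below generates the kind of Region/Lod-gated NetworkLink KML that such a cascading strategy relies on, splitting a tile into four quadtree children whose links are fetched only once they occupy enough screen pixels; the URL and tile-naming scheme are placeholders, not the authors' server implementation.

```python
# Minimal sketch of dynamically generated KML NetworkLinks with Region/Lod
# gating (placeholder URL and tile names, not the method's actual grammar).

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <NetworkLink>
      <name>{name}</name>
      <Region>
        <LatLonAltBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonAltBox>
        <Lod><minLodPixels>256</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
      </Region>
      <Link>
        <href>https://data.example.org/kml?tile={name}</href>
        <viewRefreshMode>onRegion</viewRefreshMode>
      </Link>
    </NetworkLink>
  </Document>
</kml>"""

def child_links(north, south, east, west, name="0"):
    """Split a tile into its four quadtree children and emit one KML doc per child."""
    lat_mid, lon_mid = (north + south) / 2, (east + west) / 2
    quads = {
        name + "0": (north, lat_mid, lon_mid, west),   # NW
        name + "1": (north, lat_mid, east, lon_mid),   # NE
        name + "2": (lat_mid, south, lon_mid, west),   # SW
        name + "3": (lat_mid, south, east, lon_mid),   # SE
    }
    return {q: KML_TEMPLATE.format(name=q, north=n, south=s, east=e, west=w)
            for q, (n, s, e, w) in quads.items()}

print(child_links(50.0, 40.0, -100.0, -110.0)["00"][:200])
```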
Frame, M.T.; Cotter, G.; Zolly, L.; Little, J.
2002-01-01
Whether your vantage point is that of an office window or a national park, your view undoubtedly encompasses a rich diversity of life forms, all carefully studied or managed by some scientist, resource manager, or planner. A few simple calculations - the number of species, their interrelationships, and the many researchers studying them - and you can easily see the tremendous challenges that the resulting biological data presents to the information and computer science communities. Biological information varies in format and content: it may pertain to a particular species or an entire ecosystem; it can contain land use characteristics, and geospatially referenced information. The complexity and uniqueness of each individual species or ecosystem do not easily lend themselves to today's computer science tools and applications. To address the challenges that the biological enterprise presents, the National Biological Information Infrastructure (NBII) (http://www.nbii.gov) was established in 1993 on the recommendation of the National Research Council (National Research Council 1993). The NBII is designed to address these issues on a national scale, and through international partnerships. This paper discusses current information and computer science efforts within the National Biological Information Infrastructure Program, and future computer science research endeavors that are needed to address the ever-growing issues related to our nation's biological concerns. © 2003 by The Haworth Press, Inc. All rights reserved.
ERIC Educational Resources Information Center
Phillips, Daniel W.; Montello, Daniel R.
2015-01-01
Previous research has examined heuristics--simplified decision-making rules-of-thumb--for geospatial reasoning. This study examined at two locations the influence of beliefs about local coastline orientation on estimated directions to local and distant places; estimates were made immediately or after fifteen seconds. This study goes beyond…
ERIC Educational Resources Information Center
Bodzin, Alec; Anastasio, David; Sahagian, Dork; Henry, Jill Burrows
2016-01-01
A curriculum-linked professional development approach designed to support middle level science teachers' understandings about tectonics and geospatial pedagogical content knowledge was developed. This approach takes into account limited face-to-face professional development time and instead provides pedagogical support within the design of a…
A resource-oriented architecture for a Geospatial Web
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Nativi, Stefano
2010-05-01
In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this end, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic distributed computing infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. Conversely, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, it is necessary that its architecture satisfies all the REST constraints. One of them is of particular importance: the adoption of a uniform interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover, according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and, d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For example, for geospatial resources, subsetting, resampling, interpolation and coordinate reference system transformation are candidate functionalities for a uniform interface. 
However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really assume the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This allows citation and re-use of resources simply by providing the URI. OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML), while keeping the underlying protocol generic. Concerning point c) (self-descriptive messages), the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put into geospatial metadata models and specifications in recent years. Point d), hypermedia as the engine of application state, is actually where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact, existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following paths between interconnected resources, with the links between resources made explicit as hyperlinks. The adoption of Semantic Web solutions would allow defining not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would allow building an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier to accessing geospatial applications for non-specialists (as suggested by the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated/replicated in the Geospatial Web. The main drawbacks would be the following: • The uniform interface simplifies the overall system architecture (e.g. no service registry or service descriptors required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it is necessarily very simple, and complex interactions would therefore require several transfers. • In the geospatial domain, one of the most valuable types of resource is processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account these advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use-cases not covering all possible applications. 
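As an illustration of the uniform-interface idea discussed above, the sketch below shows how a client might address geospatial resources through plain HTTP verbs plus generic operations such as subsetting. The base URL, paths and query parameters are hypothetical and are not part of any existing specification.

```python
# Hypothetical sketch of a RESTful Geospatial Web client: every resource has a URI and is
# manipulated through a uniform interface (CRUD plus generic geospatial operations expressed
# as query parameters). None of these endpoints exist; they only illustrate the style.
import requests

BASE = "https://example.org/geoweb"

# Identification of resources: each coverage is addressed by its own URI.
coverage = requests.get(f"{BASE}/coverages/sst-2009", headers={"Accept": "application/json"})

# Candidate generic geospatial operations (subsetting, resampling) on the same uniform interface.
subset = requests.get(
    f"{BASE}/coverages/sst-2009",
    params={"bbox": "10,35,20,45", "time": "2009-06", "resample": "0.5deg"},
)

# Manipulation of resources through representations: update metadata via PUT.
requests.put(f"{BASE}/coverages/sst-2009/metadata", json={"title": "Sea surface temperature"})

# Hypermedia as the engine of application state: follow typed links embedded in the
# representation to reach related resources (e.g. the series this coverage derives from).
for link in coverage.json().get("links", []):
    if link.get("rel") == "derived-from":
        related = requests.get(link["href"])
```

The sketch also makes the stated drawback visible: because the interface stays generic, a workflow touching many resources requires several round trips rather than a single composite service call.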
The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
Assessing and Valuing Historical Geospatial Data for Decisions
NASA Astrophysics Data System (ADS)
Sylak-Glassman, E.; Gallo, J.
2016-12-01
We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data is widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data is used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys, to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data is collected in conjunction with all other EO data within a weighted framework, its contribution to meeting key Federal objectives can be specifically identified and evaluated in relationship to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.
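The value-tree roll-up described above can be sketched numerically. The toy example below propagates expert-elicited reliance weights from data sources through applications to weighted objectives; all names and figures are invented placeholders, not project results.

```python
# Toy sketch of a value-tree roll-up (illustrative numbers only, not study results).
# Each objective has a weight; each application serves objectives; each EO data source
# (including historical archives) has an elicited reliance share per application.
objective_weights = {"disaster_response": 0.6, "land_use_planning": 0.4}

application_to_objectives = {
    "flood_mapping": {"disaster_response": 1.0},
    "urban_growth_model": {"land_use_planning": 1.0},
}

# Expert-elicited reliance of each application on each data source (rows sum to 1).
reliance = {
    "flood_mapping": {"current_sar": 0.7, "historical_landsat": 0.3},
    "urban_growth_model": {"current_sar": 0.2, "historical_landsat": 0.8},
}

contribution = {}
for app, sources in reliance.items():
    # Weight of this application = sum of the weighted objectives it serves.
    app_weight = sum(objective_weights[obj] * share
                     for obj, share in application_to_objectives[app].items())
    for source, share in sources.items():
        contribution[source] = contribution.get(source, 0.0) + app_weight * share

total = sum(contribution.values())
for source, value in contribution.items():
    print(f"{source}: {value / total:.0%} of weighted objective support")
```

Because the historical archive sits in the same weighted tree as current data sources, its share of objective support falls out of the same calculation, which is the point made in the abstract.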
Integrated Sustainable Planning for Industrial Region Using Geospatial Technology
NASA Astrophysics Data System (ADS)
Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek
2012-07-01
Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as a most important and modern tool for mapping and monitoring various natural resources as well as amenities and infrastructure. The huge and voluminous spatial database generated from various remote sensing platforms needs proper management, including storage, retrieval, manipulation and analysis, to extract the desired information, which is beyond the capability of the human brain. This is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information processing tasks. The natural resources are a common heritage, which we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation where we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners since the beginning of the five-year plan strategy for industrial development. A number of projects were carried out in the individual Districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore), which gave fruitful results, but no serious effort has been made to involve the entire region, and no use has been made of the latest geospatial techniques (Remote Sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare the data for monitoring as well as for planning developmental activities in the future.
NASA Astrophysics Data System (ADS)
Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.
2016-12-01
The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by agreement between Pakistan and China in 1979 as a Friendship Highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region in China. Due to its complex geology and geomorphology, soil formation, steep slopes and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards such as land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, and lake outbursts. The damaging effects of these geohazards frequently jeopardize life in the region. To ascertain the nature and frequency of the disasters and to support vulnerability zoning, a rating and logistic analysis was carried out to investigate the spatiotemporal distribution of the natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully examined, while slope, aspect, elevation, profile curvature and rock hardness were calculated using different techniques. To assess the nature and intensity of the hazards, geospatial analyses were conducted and the magnitude of every factor was gauged using logistic regression; moreover, every relevant variable was integrated into the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zones (GVZ). The GVZ model findings were corroborated by reviews of hazards documented in recent years, and an accuracy of more than 88.1% was achieved. The study validated the model by highlighting the close agreement between the vulnerability mapping and past documented hazards, and the logistic regression model produced satisfactory results as judged by a receiver operating characteristic curve. The outcomes will be useful for sustainable land-use and infrastructure planning, mainly in high-risk zones, for reducing economic damage and improving community wellbeing.
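A hedged sketch of this kind of logistic-regression susceptibility modelling is given below using scikit-learn. The predictor names mirror those listed in the abstract, but the data are synthetic placeholders, not the study's inputs.

```python
# Sketch of logistic-regression hazard susceptibility modelling with ROC-based validation.
# Synthetic data stand in for the real geospatial layers (slope, aspect, elevation,
# profile curvature, rock hardness) used in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 60, n),       # slope (degrees)
    rng.uniform(0, 360, n),      # aspect (degrees)
    rng.uniform(1200, 4800, n),  # elevation (m)
    rng.normal(0, 1, n),         # profile curvature
    rng.uniform(1, 10, n),       # rock hardness index
])
# Synthetic "hazard occurred" label, loosely driven by slope for illustration only.
y = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-(X[:, 0] - 30) / 10))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Susceptibility = predicted probability of occurrence; the AUC summarises agreement
# with observed events, analogous to the ROC-based validation reported in the abstract.
susceptibility = model.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, susceptibility))
```

Mapping the predicted probabilities back onto the grid cells of the study area is what produces the vulnerability zoning surface that can then be checked against documented hazards.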
NASA Astrophysics Data System (ADS)
Sawayama, Shuhei; Nurdin, Nurjannah; Akbar AS, Muhammad; Sakamoto, Shingo X.; Komatsu, Teruhisa
2015-06-01
Coral reef ecosystems worldwide are now being harmed by various stresses accompanying the degradation of fish habitats, and thus knowledge of fish-habitat relationships is urgently required. Because conventional research methods were not practical for this purpose due to the lack of a geospatial perspective, we attempted to develop a research method integrating visual fish observation with a seabed habitat map and to expand knowledge to a two-dimensional scale. WorldView-2 satellite imagery of Spermonde Archipelago, Indonesia obtained in September 2012 was analyzed and classified into four typical substrates: live coral, dead coral, seagrass and sand. Overall classification accuracy of this map was 81.3% and considered precise enough for subsequent analyses. Three sub-areas (CC: continuous coral reef, BC: boundary of coral reef and FC: few live coral zone) around reef slopes were extracted from the map. Visual transect surveys for several fish species were conducted within each sub-area in June 2013. As a result, mean density (ind. / 300 m2) of Chaetodon octofasciatus, known as an obligate feeder of corals, was significantly higher at BC than at the others (p < 0.05), implying that this species' density is strongly influenced by the spatial configuration of its habitat, like the "edge effect." This indicates that future conservation procedures for coral reef fishes should consider not only coral cover but also its spatial configuration. The present study also indicates that the introduction of a geospatial perspective derived from remote sensing has great potential to advance conventional ecological studies on coral reef fishes.
Impacts of Permafrost on Infrastructure and Ecosystem Services
NASA Astrophysics Data System (ADS)
Trochim, E.; Schuur, E.; Schaedel, C.; Kelly, B. P.
2017-12-01
The Study of Environmental Arctic Change (SEARCH) program developed knowledge pyramids as a tool for advancing scientific understanding and making this information accessible for decision makers. Knowledge pyramids are being used to synthesize, curate and disseminate knowledge of changing land ice, sea ice, and permafrost in the Arctic. Each pyramid consists of a one- to two-page summary brief in broadly accessible language and literature organized by levels of detail, including syntheses and scientific building blocks. Three knowledge pyramids have been produced related to permafrost, covering carbon, infrastructure, and ecosystem services. Each brief answers key questions with high societal relevance framed in policy-relevant terms. The knowledge pyramids concerning infrastructure and ecosystem services were developed in collaboration with researchers specializing in the specific topic areas in order to identify the most pertinent issues and accurately communicate information for integration into policy and planning. For infrastructure, the main issue was the need to build consensus in the engineering and science communities for developing improved methods for incorporating data applicable to building infrastructure on permafrost. For ecosystem services, permafrost provides critical landscape properties which affect basic human needs including fuel and drinking water availability, access to hunting and harvest, and fish and wildlife habitat. Translating these broad and complex topics necessitated a systematic and iterative approach to identifying key issues and relating them succinctly to the best state-of-the-art research. The development of the knowledge pyramids prompted collaboration and synthesis across distinct research and engineering communities. The knowledge pyramids also provide a solid basis for policy development, and the format allows the content to be regularly updated as the research community advances.
Generation of Multiple Metadata Formats from a Geospatial Data Repository
NASA Astrophysics Data System (ADS)
Hudspeth, W. B.; Benedict, K. K.; Scott, S.
2012-12-01
The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication date, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance, or the trace of data sources and analytical methods used in a scientific analysis, for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery-format-independent data structure, and the delivery of PML, ISO, and FGDC documents to clients requesting those products.
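The multi-format delivery described above amounts to rendering one internal metadata record into several target encodings. The snippet below emits a minimal ISO 19139-flavoured fragment and an FGDC-flavoured fragment from the same Python dictionary; the record content and the heavily reduced element subsets are illustrative and are not Gstore's actual schema.

```python
# Sketch: one internal metadata record rendered into two target formats.
# The record and the (heavily reduced) element subsets are illustrative only; real
# ISO 19139 / FGDC CSDGM documents require many more mandatory elements.
import xml.etree.ElementTree as ET

record = {
    "title": "MODIS-derived vegetation index, New Mexico",
    "abstract": "Derived model product with processing provenance.",
    "west": -109.05, "east": -103.0, "south": 31.3, "north": 37.0,
}

def to_iso19139(rec):
    root = ET.Element("gmd:MD_Metadata", {"xmlns:gmd": "http://www.isotc211.org/2005/gmd"})
    ident = ET.SubElement(root, "gmd:identificationInfo")
    ET.SubElement(ident, "gmd:title").text = rec["title"]
    ET.SubElement(ident, "gmd:abstract").text = rec["abstract"]
    bbox = ET.SubElement(ident, "gmd:EX_GeographicBoundingBox")
    for side in ("west", "east", "south", "north"):
        tag = f"gmd:{side}BoundLongitude" if side in ("west", "east") else f"gmd:{side}BoundLatitude"
        ET.SubElement(bbox, tag).text = str(rec[side])
    return ET.tostring(root, encoding="unicode")

def to_fgdc(rec):
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    ET.SubElement(idinfo, "title").text = rec["title"]
    ET.SubElement(idinfo, "abstract").text = rec["abstract"]
    spdom = ET.SubElement(idinfo, "spdom")
    for tag, key in (("westbc", "west"), ("eastbc", "east"),
                     ("southbc", "south"), ("northbc", "north")):
        ET.SubElement(spdom, tag).text = str(rec[key])
    return ET.tostring(root, encoding="unicode")

print(to_iso19139(record))
print(to_fgdc(record))
```

Keeping the internal record format-independent, as the abstract describes, is what allows additional renderers (e.g. for provenance in PML) to be bolted on without touching the stored data.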
GODAN Local Farming Challenge 2017 - Encourage Geo-Innovation Solutions for Zero Hunger
NASA Astrophysics Data System (ADS)
Anand, Suchith; Hogan, Patrick; Brovelli, Maria; Schaap, Ben; Musker, Ruthie; Laperrière, André
2017-04-01
The initial ideas for Open Geospatial Science [1] were presented nearly a decade ago. They build upon the proposition of Open Science, which argues that scientific knowledge develops more rapidly and productively if openly shared (as early as is practical in the discovery process). The key ingredients that make Open Geospatial Science possible are enshrined in open principles, i.e.: open source geospatial software, open data, open standards, open educational resources, and open access to research publications. OpenCitySmart [2] is an initiative of Geo for All [3] that aims to develop a suite of tools for city-related infrastructure management (utilities, traffic, services, etc.). Its purpose will be to continually refine and add functionality that not only streamlines operational efficiency but also considers the need for sustainability and quality of urban life. OpenCitySmart employs open solutions to build richer tools that empower organisations and individuals to utilize spatial and non-spatial data alike. This will create opportunities for innovation both globally and locally. As the population of cities grows, the concern of food security will shift from rural to urban areas. Currently, nearly 800 million people struggle with debilitating hunger and malnutrition and can be found in every corner of the globe. That's one in every nine people, with the majority being women and children. The Global Open Data for Agriculture and Nutrition (GODAN) [4] initiative supports the proactive sharing of open data to make information about agriculture and nutrition available, accessible and usable to deal with the urgent challenge of ensuring world food security. A core principle behind GODAN is that a solution to Zero Hunger lies within existing, but often unavailable, agriculture and nutrition data. Through an online survey, GODAN found that the most needed data type across its 430+ partner network was geospatial data. Through the GODAN Europa Challenge we want to bring together researchers and students to work collaboratively on innovative ideas to create change using agriculture and nutrition data. The Europa Challenge is a world challenge, though we use the wisdom of Europe's INSPIRE Directive to guide project development. The Europa Challenge asks the world's best and brightest to deliver solutions serving city needs. With support from the NASA Europa Challenge [5], GODAN is launching a Local Farming Challenge. We welcome students to create innovative ideas that will help tackle the challenges of local farming in growing cities, using some aspect of the OpenCitySmart design and NASA's open source virtual globe technology, Web WorldWind. Ideas may include ways of optimally linking local farming communities directly with potential customers, tools for visualising spatio-temporal aspects of local farming, tools for helping to reduce wastage (for example, by linking with local food banks), and any number of solutions supporting our goal of Zero Hunger. 1. http://www.mdpi.com/journal/ijgi/special_issues/science-applications 2. https://wiki.osgeo.org/wiki/Opencitysmart 3. http://www.geoforall.org/ 4. http://www.godan.info 5. http://eurochallenge.como.polimi.it
7th IGRSM International Remote Sensing & GIS Conference and Exhibition
NASA Astrophysics Data System (ADS)
Shariff, Abdul Rashid Mohamed
2014-06-01
This proceedings volume consists of the peer-reviewed papers from the 7th IGRSM International Conference and Exhibition on Remote Sensing & GIS (IGRSM 2014), which was held on 21-22 April 2014 at Berjaya Times Square Hotel, Kuala Lumpur, Malaysia. The conference, with the theme Geospatial Innovation for Nation Building, was aimed at disseminating knowledge and sharing expertise and experiences in geospatial sciences in all aspects of application. It also aimed to build linkages between local and international professionals in this field and industry. Highlights of the conference included: Officiation by Y B Datuk Dr Abu Bakar bin Mohamad Diah, Deputy Minister of Science, Technology & Innovation. Keynote presentations by: Associate Professor Dr Francis Harvey, Chair of the Geographic Information Science Commission at the International Geographical Union (IGU) and Director of U-Spatial, University of Minnesota, US: The Next Age of Discovery and a Future in a Post-GIS World. Professor Dr Naoshi Kondo, Bio-Sensing Engineering, University of Kyoto, Japan: Mobile Fruit Grading Machine for Precision Agriculture. Datuk Ir Hj Ahmad Jamalluddin bin Shaaban, Director-General, National Hydraulic Research Institute of Malaysia (NAHRIM), Malaysia: Remote Sensing & GIS in Climate Change Analyses. Oral and poster presentations from 69 speakers, from both Malaysia (35) and abroad (34), covering areas of water resources management, urban sprawl & social mobility, agriculture, land use/cover mapping, infrastructure planning, disaster management, technology trends, environmental monitoring, atmospheric/temperature monitoring, and space applications for the environment. Post-conference workshops on: Space Applications for Environment (SAFE), which was organised by the Japan Aerospace Exploration Agency (JAXA); and Global Positioning System (GPS) Receiver Evaluation Using GPS Simulation, which was organised by the Science & Technology Research Institute for Defence (STRIDE) and sponsored by RFI Technologies Sdn. Bhd. and Aeroflex Inc. Two awards were presented by Dr Noordin Ahmad, Director-General of the National Space Agency, during the conference's closing ceremony: Best Paper Award: Dr Rizatus Shofiyati, Indonesian Center for Agricultural Land Resources Research and Development (ICALRD), Indonesia: Indonesian Drought Monitoring from Space. A Report of SAFE Activity: Assessment of Drought Impact on Rice Production in Indonesia by Satellite Remote Sensing and Dissemination with Web-GIS. Best Student Paper Award: Rosnani Rahman, Space Science Centre (ANGKASA), Institute of Climate Change, Universiti Kebangsaan Malaysia (UKM), Malaysia: Monitoring the Variability of Precipitable Water Vapor Over the Klang Valley, Malaysia During Flash Flood. The success of IGRSM 2014 was due to the commitment of many: authors, keynote speakers, session chairpersons, the organising and technical programme committees, student volunteers from Universiti Putra Malaysia (UPM), and many others in various roles. We acknowledge the sponsors of IGRSM 2014, namely Antaragrafik Systems Sdn. Bhd. and Geospatial Media and Communications Sdn. Bhd. We also thank all exhibitors and contributors: E J Motiwalla, Fajar Saintifik Sdn. Bhd., Bandwork GPS Solutions Sdn. Bhd., Tenaga Nasional Bhd., TSKAY Technology Sdn. Bhd., Geo Spatial Solutions Sdn. Bhd. and Accutac Sdn. Bhd. 
Associate Professor Sr Dr Abdul Rashid Mohamed Shariff Chairman 7th IGRSM International Remote Sensing & GIS Conference and Exhibition (IGRSM2014) President Institution of Geospatial and Remote Sensing Malaysia (IGRSM), 2012-2014
VISA: An Automatic Aware and Visual Aids Mechanism for Improving the Correct Use of Geospatial Data
NASA Astrophysics Data System (ADS)
Hong, J. H.; Su, Y. T.
2016-06-01
With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of "differences" implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be respectively used for presenting analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion, where standardized metadata serve as the known facts about the selected geospatial resources, algorithms analyze differences in the temporality and quality of those resources, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.
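A simplified illustration of the automatic-aware idea: the function below compares the temporal extent recorded in standardized metadata with a task's required time window and classifies each resource, a result a client could then map onto visual aids such as a symbol change or an informative window. The metadata fields, thresholds and example values are hypothetical, not the VISA implementation.

```python
# Hypothetical sketch of an automatic temporality check against task requirements.
# Metadata field names and the classification categories are illustrative only.
from datetime import date

def assess_temporal_fitness(resource_metadata, task_start, task_end):
    """Classify a geospatial resource's temporal fitness for a task time window."""
    begin = resource_metadata["temporal_extent"]["begin"]
    end = resource_metadata["temporal_extent"]["end"]
    if begin <= task_start and end >= task_end:
        return "fit"       # could be rendered with its normal symbol
    if end < task_start or begin > task_end:
        return "unfit"     # e.g. greyed-out symbol plus an informative window
    return "partial"       # e.g. a virtual layer showing only the covered interval

roads_2010 = {"title": "Road network 2010",
              "temporal_extent": {"begin": date(2010, 1, 1), "end": date(2010, 12, 31)}}
print(assess_temporal_fitness(roads_2010, date(2016, 1, 1), date(2016, 6, 30)))  # -> "unfit"
```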
An infrastructure for ontology-based information systems in biomedicine: RICORDO case study.
Wimalaratne, Sarala M; Grenon, Pierre; Hoehndorf, Robert; Gkoutos, Georgios V; de Bono, Bernard
2012-02-01
The article presents an infrastructure for supporting the semantic interoperability of biomedical resources based on the management (storing and inference-based querying) of their ontology-based annotations. This infrastructure consists of: (i) a repository to store and query ontology-based annotations; (ii) a knowledge base server with an inference engine to support the storage of and reasoning over ontologies used in the annotation of resources; (iii) a set of applications and services allowing interaction with the integrated repository and knowledge base. The infrastructure is being prototyped, developed and evaluated by the RICORDO project in support of the knowledge management of biomedical resources, including physiology and pharmacology models and associated clinical data. The RICORDO toolkit and its source code are freely available from http://ricordo.eu/relevant-resources. sarala@ebi.ac.uk.
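As a minimal sketch of querying ontology-based annotations of the kind such an infrastructure manages, the snippet below uses the rdflib library over a tiny in-memory graph. The namespace, annotation property and example terms are invented for illustration and are not the RICORDO schema.

```python
# Sketch: storing and querying ontology-based resource annotations with rdflib.
# The namespace, annotation property and example concepts are invented placeholders.
from rdflib import Graph, Namespace, URIRef

EX = Namespace("http://example.org/annot#")
g = Graph()

# Annotate two model resources with ontology terms (e.g. anatomy/physiology concepts).
g.add((URIRef("http://example.org/models/cardiac-model-1"), EX.annotatedWith, EX.HeartVentricle))
g.add((URIRef("http://example.org/models/renal-model-7"), EX.annotatedWith, EX.Nephron))

# Retrieve every resource annotated with a given concept.
query = """
PREFIX ex: <http://example.org/annot#>
SELECT ?resource WHERE { ?resource ex:annotatedWith ex:HeartVentricle . }
"""
for row in g.query(query):
    print(row.resource)
```

In the infrastructure described above, the same kind of query would additionally be answered against inferred annotations, since the knowledge base server reasons over the ontologies rather than matching stored triples only.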
ERIC Educational Resources Information Center
Vanderburg, Willem H.
2006-01-01
The role tradition played in preindustrial societies has been supplanted by the decisions of countless specialists organized by means of an intellectual and professional division of labor shaping a knowledge infrastructure that sustains these decisions. Three limitations of this knowledge system are discussed: (a) on the macrolevel, it imposes an…
Possibilities of Use of UAVS for Technical Inspection of Buildings and Constructions
NASA Astrophysics Data System (ADS)
Banaszek, Anna; Banaszek, Sebastian; Cellmer, Anna
2017-12-01
In recent years, Unmanned Aerial Vehicles (UAVs) have been used in various sectors of the economy. This is due to the development of new technologies for acquiring and processing geospatial data. The paper presents the results of experiments using UAV, equipped with a high resolution digital camera, for a visual assessment of the technical condition of the building roof and for the inventory of energy infrastructure and its surroundings. The usefulness of digital images obtained from the UAV deck is presented in concrete examples. The use of UAV offers new opportunities in the area of technical inspection due to the detail and accuracy of the data, low operating costs and fast data acquisition.
Harvesting geographic features from heterogeneous raster maps
NASA Astrophysics Data System (ADS)
Chiang, Yao-Yi
2010-11-01
Raster maps offer a great deal of geospatial information and are easily accessible compared to other geospatial data. However, harvesting geographic features locked in heterogeneous raster maps to obtain the geospatial information is challenging. This is because of the varying image quality of raster maps (e.g., scanned maps with poor image quality and computer-generated maps with good image quality), the overlapping geographic features in maps, and the typical lack of metadata (e.g., map geocoordinates, map source, and original vector data). Previous work on map processing is typically limited to a specific type of map and often relies on intensive manual work. In contrast, this thesis investigates a general approach that does not rely on any prior knowledge and requires minimal user effort to process heterogeneous raster maps. This approach includes automatic and supervised techniques to process raster maps for separating individual layers of geographic features from the maps and recognizing geographic features in the separated layers (i.e., detecting road intersections, generating and vectorizing road geometry, and recognizing text labels). The automatic technique eliminates user intervention by exploiting common map properties of how road lines and text labels are drawn in raster maps. For example, the road lines are elongated linear objects and the characters are small connected objects. The supervised technique utilizes labels of road and text areas to handle complex raster maps, or maps with poor image quality, and can process a variety of raster maps with minimal user input. The results show that the general approach can handle raster maps with varying map complexity, color usage, and image quality. By matching extracted road intersections to another geospatial dataset, we can identify the geocoordinates of a raster map and further align the raster map, the separated feature layers from the map, and the recognized features from the layers with the geospatial dataset. The road vectorization and text recognition results outperform state-of-the-art commercial products, with considerably less user input. The approach in this thesis allows us to make use of the geospatial information of heterogeneous maps locked in raster format.
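The property-based separation of text from road lines mentioned above can be sketched with simple connected-component analysis. The size and elongation thresholds below are placeholders rather than the thesis's tuned parameters, and real maps would need the additional supervised handling the abstract describes.

```python
# Sketch: separating small connected objects (candidate text) from elongated objects
# (candidate road lines) in a binarized raster map. Thresholds are illustrative only.
import numpy as np
from scipy import ndimage

def split_text_and_roads(binary_map, max_text_area=200, min_road_elongation=4.0):
    """Return (text_mask, road_mask) from a boolean foreground raster."""
    labels, n = ndimage.label(binary_map)
    text_mask = np.zeros_like(binary_map, dtype=bool)
    road_mask = np.zeros_like(binary_map, dtype=bool)
    for obj_slice, label_id in zip(ndimage.find_objects(labels), range(1, n + 1)):
        component = labels[obj_slice] == label_id
        area = component.sum()
        h = obj_slice[0].stop - obj_slice[0].start
        w = obj_slice[1].stop - obj_slice[1].start
        elongation = max(h, w) / max(1, min(h, w))
        if area <= max_text_area and elongation < min_road_elongation:
            text_mask[obj_slice] |= component    # small, compact: likely a character
        else:
            road_mask[obj_slice] |= component    # large or elongated: likely a road line
    return text_mask, road_mask

# Example on a tiny synthetic raster (True = foreground pixel).
demo = np.zeros((50, 50), dtype=bool)
demo[25, 5:45] = True        # a long horizontal "road"
demo[5:9, 5:9] = True        # a small blob standing in for a character
text, roads = split_text_and_roads(demo)
print(text.sum(), roads.sum())
```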
Geospatial Data Science Modeling (NREL)
NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers.
NASA Astrophysics Data System (ADS)
Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.
2009-04-01
The development of new technologies aimed at enhancing Web applications with dynamic data access was also the starting point for the development of geospatial Web applications. By means of these technologies, Web applications embed the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as Web Services enables Web applications to interoperate, i.e. to process requests from each other via a network. In particular, throughout the oceanographic community, modern geographical information systems based on geospatial Web Services are now being developed, or will be developed in the near future, with the capability of managing the information fully through Web-based geographical interfaces. The exploitation of the HNODC database through a Web-based application enhanced with Web Services and built with open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), as a national public oceanographic data provider and a member of the international network of oceanographic data centres (IOC/IODE), owns a very large volume of data and relevant information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing over 300,000 station records of physical, chemical and biological oceanographic information. The development of a modern Web application that allows end users worldwide to explore and navigate HNODC data through an interface capable of presenting geographical representations of the geo-information is today a fact. The application is built with state-of-the-art software components and tools such as: • geospatial and non-spatial Web Services mechanisms; • geospatial open source tools for the creation of dynamic geographical representations; • communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web Services systems. Roughly, the architecture can be described as follows: • At the back end, the open source PostgreSQL DBMS acts as the data storage mechanism, with more than one database schema because the geospatial and non-geospatial data are kept separate. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat serve as the Web service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial Web Services mechanism. 
Some modules of the platform are still under development, but their implementation will be completed in the near future. • A new Web user interface for the end user is based on an enhanced and customized version of the Mapbender GUI, a powerful Web Services client. For HNODC, the interoperability of Web Services is the big advantage of the developed platform, since in the future it can act as both a provider and a consumer of Web Services: • either as a data products provider for external SOA platforms; • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great example of data management integration and dissemination using such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with Web Services. Furthermore, when the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
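To illustrate how a client could consume WMS/WFS services of the kind described above, the snippet below uses the OWSLib library against a placeholder endpoint. The service URL and layer names are hypothetical, and OWSLib is our choice of client library for the sketch, not necessarily the one used by HNODC.

```python
# Hedged sketch: consuming OGC WMS/WFS endpoints with OWSLib.
# The service URL and layer/typename values are placeholders, not the HNODC services.
from owslib.wms import WebMapService
from owslib.wfs import WebFeatureService

wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))                      # advertised map layers

# Request a rendered map image for one layer over an area of interest.
img = wms.getmap(layers=["stations"], srs="EPSG:4326",
                 bbox=(19.0, 34.0, 29.0, 42.0), size=(600, 400),
                 format="image/png")
with open("stations.png", "wb") as f:
    f.write(img.read())

# Query the corresponding feature service for the underlying station records.
wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")
response = wfs.getfeature(typename=["stations"], bbox=(19.0, 34.0, 29.0, 42.0))
print(response.read()[:200])
```

Because both requests go through standard OGC interfaces, any other SOA platform could issue the same calls, which is the interoperability advantage highlighted in the abstract.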
A geospatial model of ambient sound pressure levels in the contiguous United States.
Mennitt, Daniel; Sherrill, Kirk; Fristrup, Kurt
2014-05-01
This paper presents a model that predicts measured sound pressure levels using geospatial features such as topography, climate, hydrology, and anthropogenic activity. The model utilizes random forest, a tree-based machine learning algorithm, which does not incorporate a priori knowledge of source characteristics or propagation mechanics. The response data encompasses 270 000 h of acoustical measurements from 190 sites located in National Parks across the contiguous United States. The explanatory variables were derived from national geospatial data layers and cross validation procedures were used to evaluate model performance and identify variables with predictive power. Using the model, the effects of individual explanatory variables on sound pressure level were isolated and quantified to reveal systematic trends across environmental gradients. Model performance varies by the acoustical metric of interest; the seasonal L50 can be predicted with a median absolute deviation of approximately 3 dB. The primary application for this model is to generalize point measurements to maps expressing spatial variation in ambient sound levels. An example of this mapping capability is presented for Zion National Park and Cedar Breaks National Monument in southwestern Utah.
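The modelling approach is standard enough to sketch: a random forest regressor fit to geospatial explanatory variables and scored with a median-absolute-deviation statistic, as below. The predictor names and the synthetic response stand in for the study's national data layers and are not its results.

```python
# Sketch of the modelling approach: random forest regression of sound pressure level on
# geospatial predictors, scored by median absolute deviation. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.uniform(0, 3000, n),   # elevation (m)
    rng.uniform(0, 1, n),      # fraction of nearby land that is developed
    rng.uniform(0, 50, n),     # distance to nearest road (km)
    rng.uniform(0, 2000, n),   # annual precipitation (mm)
])
# Synthetic L50 response: louder near development, quieter far from roads, plus noise.
y = 35 + 15 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0, 3, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)
model = RandomForestRegressor(n_estimators=300, random_state=1).fit(X_train, y_train)

mad = np.median(np.abs(model.predict(X_test) - y_test))
print(f"Median absolute deviation: {mad:.1f} dB")
```

Applying the fitted model to every cell of the national explanatory-variable grid is what generalizes the point measurements into the ambient sound level maps described in the abstract.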
Growing a Global Perspective: Utilizing Graduate Students as Scientists in the Classroom
NASA Astrophysics Data System (ADS)
Martinez, A.; Prouhet, T.; Kincaid, J.; Williams, N.; Simms, M.; Evans, R.
2006-12-01
Advancing Geospatial Skills in Science and Social Sciences (AGSSS) is a NSF GK12 program designed to produce scientists with an interest in and skills related to education by bringing graduate students (termed Fellows) into science and social science classrooms. The AGSSS program is unique in the GK-12 program because of its emphasis on spatial thinking with and through geospatial technologies. Spatial thinking is defined as the knowledge, skills, and habits of mind to use concepts of space, tools of representation, and processes of reasoning to structure problems, find answers and express solutions to these problems. Working collaboratively, Fellows assist teachers in using technologies (many freely available) such as virtual globes, GIS, GPS, NASA's ISSEarthKAM, and online databases. Fellows also customize existing curricula based on teacher requests to focus on spatial thinking and skill development. Preliminary results of the program reveal that students' use of geospatial technologies in interactive lessons that highlight real world processes and global perspectives encourages the development of higher order thinking skills. Fellows perceive three primary benefits: developing collaboration and communication skills, solidifying their own understandings of spatial thinking and becoming more aware and skilled in working in educational settings.
NASA Astrophysics Data System (ADS)
Santoro, E.
2017-05-01
The crisis management of a disaster, whether caused naturally or by human action, requires a thorough knowledge of the territory involved, with regard to both its terrain and its developed areas. Therefore, it is essential that the National Mapping and Cadastral Agencies (NMCAs) and all other public and scientific institutions responsible for the production of geospatial information closely co-operate in making their data in that field available. This crucial sharing of geographic information is a top-level priority, not only in a disaster emergency situation, but also for effective urban and environmental planning and Cultural Heritage protection and preservation. Geospatial data-sharing, responding to the needs of all institutions involved in disaster surveying operations, is fundamental, as a priority, to the task of avoiding loss of human lives. However, no less important is the acquisition, dissemination and use of this data, in addition to direct, "in-the-field" operations of specialists in geomatics, in order to preserve the Cultural Heritage located in the crisis area. It is in this context that an NMCA such as the Italian Military Geographic Institute (IGMI) plays a key role.
Creating a Flood Risk Index to Improve Community Resilience
NASA Astrophysics Data System (ADS)
Klima, K.; El Gammal, L.
2017-12-01
While flood risk reduction is an existing discourse and agenda in policy and insurance, vulnerabilities vary between communities; some communities may have aging infrastructure or an older/poorer population less able to absorb a flood, putting them at increased risk from the hazards. As a result, some are considering environmental justice aspects of flood risk reduction. To date, catastrophe models have focused on creating flood maps (e.g., NOAA's Sea Level Rise Viewer, Climate Central's Surging Seas), or on linking hydrological models to economic loss models (e.g., HEC-RAS + HAZUS). However, this approach may be highly inequitable between areas of different income (as well as other demographics). Some have begun work on combining hydrology with vulnerability information (e.g., USACE's North Atlantic Comprehensive Coastal Study). To our knowledge, no one has tried to adapt the more advanced and well-established heat risk framework to water risk by combining hydrology information (e.g., HEC-RAS, floodplain maps) with the social vulnerability (e.g., Cutter et al.) of the residents. This project will create a method to combine water hazard data with a derived water vulnerability index to help a community understand its current and future water risk. We will use the case study area of Pittsburgh, PA, which faces severe precipitation and riverine flooding hazards. Building on the present literature on factors influencing water vulnerability, contextualized to the Pittsburgh region, we will identify, quantify, and map the top factors impacting water vulnerability. We will combine these with flood maps to identify the geospatial distribution of water risk. This work will allow policy makers to identify location-specific aspects of water vulnerability and risk in any community, thus promoting environmental justice. It is possible that this type of original research will create maps of relative water risk that prove as understandable to the general public as other flood maps, and may also help to promote "just resilience". This presentation will present a method to combine water hazard data with a derived water vulnerability index, and will present work on the geospatial distribution of water risk in Pittsburgh, PA.
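A stripped-down sketch of the proposed combination of hazard and social vulnerability into a risk surface is shown below. The indicator names, weights and grid values are invented placeholders, not the Pittsburgh analysis.

```python
# Sketch: combining a flood hazard layer with a derived social vulnerability index.
# Grid values, indicator names and weights are invented placeholders.
import numpy as np

# Hypothetical gridded inputs aligned to the same cells (e.g. census block groups).
flood_depth = np.array([[0.0, 0.2, 1.5],
                        [0.0, 0.8, 2.3],
                        [0.1, 0.0, 0.5]])          # modelled flood depth (m)
pct_over_65 = np.array([[0.10, 0.25, 0.30],
                        [0.05, 0.15, 0.40],
                        [0.20, 0.10, 0.12]])
pct_low_income = np.array([[0.15, 0.35, 0.50],
                           [0.10, 0.20, 0.60],
                           [0.25, 0.05, 0.30]])

def normalize(layer):
    """Rescale a layer to 0-1 so indicators on different scales can be combined."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Simple weighted vulnerability index from demographic indicators.
vulnerability = 0.5 * normalize(pct_over_65) + 0.5 * normalize(pct_low_income)

# Risk as the product of normalized hazard and vulnerability (one of several possible rules).
risk = normalize(flood_depth) * vulnerability
print(np.round(risk, 2))
```

The choice of indicators, their weights, and the hazard-vulnerability combination rule are exactly the elements the project proposes to derive from the literature and the local context, so the numbers here are only structural.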
NASA Astrophysics Data System (ADS)
Branch, B. D.; Wegner, K.; Smith, S.; Schulze, D. G.; Merwade, V.; Jung, J.; Bessenbacher, A.
2013-12-01
It has long been the tradition of libraries to support literacy. Now, under the Executive Order "Making Open and Machine Readable the New Default for Government Information" of May 9, 2013, libraries also have a responsibility to support geospatial data, big data, earth science data and cyberinfrastructure data that may support STEM and stimulate the educational pipeline. (Such information can be found at http://www.whitehouse.gov/the-press-office/2013/05/09/executive-order-making-open-and-machine-readable-new-default-government-.) Provided is an Educational Data Curation Framework (EDCF) that has been initiated through Purdue research, geospatial data service engagement and outreach endeavors, for future consideration and application to meet the data science and climate literacy needs of future global citizens. In addition, endorsement of this framework by the GLOBE program may facilitate further EDCF implementations, discussion points and prototypes for libraries. The EDCF will also support teacher-led, place-based and large-scale climate or earth science learning systems in which knowledge of climate and earth science data is effectively transferred from higher-education research and cyberinfrastructure providers such as NOAA or NASA to K-12 teachers and school systems. The purpose of this effort is to establish best practices for a sustainable K-12 data science delivery system, or a GLOBE-provided system (http://vis.globe.gov/GLOBE/), where libraries manage data curation and data appropriateness as data reference experts for such digital data. Here, the Purdue University Libraries' GIS department works to support soils, LIDAR and water science data experiences that support teacher training for an EDCF development effort. Lastly, it should be noted that interdisciplinary collaboration and demonstration by library-supported outreach partners and national organizations such as the GLOBE program may best foster EDCF development. This trend in data science, where library roles may emerge, is consistent with NASA's Wavelength program at http://nasawavelength.org. Mr. Steven Smith, an outreach coordinator, led this Purdue University outreach activity involving the GLOBE program, with support from the Purdue University Libraries GIS department.
Norman, Laura M.; Donelson, Angela J.; Pfeifer, Edwin L.; Lam, Alven H.; Osborn, Kenneth J.
2004-01-01
The U.S. Department of Housing and Urban Development (HUD) and the U.S. Geological Survey (USGS) have developed a joint project to create Internet-enabled geographic information systems (GIS) that will help cities along the United States-Mexico border deal with issues related to colonias. HUD defines colonias as rural neighborhoods in the United States-Mexico border region that lack adequate infrastructure or housing and other basic services. They typically have high poverty rates that make it difficult for residents to pay for roads, sanitary water and sewer systems, decent housing, street lighting, and other services through assessment. Many Federal agencies recognize colonias designations and provide funding assistance. It is the intention of this project to empower Arizona-Sonora borderland neighborhoods and community members by recognizing them as colonias. This recognition will result in eligibility for available economic subsidies and accessibility to geospatial tools and information for urban planning. The steps to achieve this goal include delineation of colonia-like neighborhoods, identification of their urbanization over time, development of geospatial databases describing their infrastructure, and establishment of a framework for distributing Web-based GIS decision support systems. A combination of imagery and infrastructure information was used to help delineate colonia boundaries. A land-use change analysis, focused on urbanization in the cities over a 30-year timeframe, was implemented. The results of this project are being served over the Internet, providing data to the public as well as to participating agencies. One of the initial study areas for this project was the City of Douglas, Ariz., and its Mexican sister-city Agua Prieta, Sonora, which are described herein. Because of its location on the border, this twin-cities area is especially well suited to international manufacturing and commerce, which has, in turn, led to an uncontrolled spread of colonias. The USGS worked with local organizations in developing the Web-based GIS database. Community involvement ensured that the database and map server would meet the current and long-term needs of the communities and end users. Partners include Federal agencies, State agencies, county officials, town representatives, universities, and youth organizations, as well as interested local advocacy groups and individuals. A significant component of this project was development of relationships and partnerships in the border towns for facilitating binational approaches to land management.
New Generation Sensor Web Enablement
Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob
2011-01-01
Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
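To make the service-interface idea concrete, the snippet below issues a KVP GetObservation request to a Sensor Observation Service endpoint of the kind standardized by Sensor Web Enablement. The endpoint URL, offering and observed property are placeholders, and the request is a generic sketch rather than part of any particular deployment.

```python
# Hedged sketch: a KVP GetObservation request to an OGC Sensor Observation Service (SOS).
# The endpoint, offering and observedProperty values are placeholders.
import requests

SOS_ENDPOINT = "https://example.org/sos"

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "air_temperature_network",
    "observedProperty": "http://example.org/phenomena/AirTemperature",
    "temporalFilter": "om:phenomenonTime,2011-01-01T00:00:00Z/2011-01-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # O&M-encoded observations returned by the service
```

The point of the Sensor Web is that the same request works regardless of which manufacturer's hardware sits behind the service, because the interface and the Observations & Measurements encoding, not the sensor protocol, are what the client depends on.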
Visualizing and Understanding Socio-Environmental Dynamics in Baltimore
NASA Astrophysics Data System (ADS)
Zaitchik, B. F.; Omeara, K.; Guikema, S.; Scott, A.; Bessho, A.; Logan, T. M.
2015-12-01
The City of Baltimore, like any city, is the sum of its component neighborhoods, institutions, businesses, cultures, and, ultimately, its people. It is also an organism in its own right, with distinct geography, history, infrastructure, and environments that shape its residents even as it is shaped by them. Sometimes these interactions are obvious but often they are not; while basic economic patterns are widely documented, the distribution of socio-spatial and environmental connections often hides below the surface, as does the potential that those connections hold. Here we present results of a collaborative initiative on the geography, design, and policy of socio-environmental dynamics of Baltimore. Geospatial data derived from satellite imagery, demographic databases, social media feeds, infrastructure plans, and in situ environmental networks, among other sources, are applied to generate an interactive portrait of Baltimore City's social, health, and well-being dynamics. The layering of data serves as a platform for visualizing the interconnectedness of the City and as a database for modeling risk interactions, vulnerabilities, and strengths within and between communities. This presentation will provide an overview of project findings and highlight linkages to education and policy.
NASA Astrophysics Data System (ADS)
Hernández Ernst, Vera; Poigné, Axel; Los, Walter
2010-05-01
Understanding and managing the complexity of the biodiversity system in relation to global changes concerning land use and climate change, with their social and economic implications, is crucial to mitigate species loss and biodiversity changes in general. The sustainable development and exploitation of existing biodiversity resources require flexible and powerful infrastructures offering, on the one hand, access to large-scale databases of observations and measurements, to advanced analytical and modelling software, and to high performance computing environments and, on the other hand, the interlinkage of European scientific communities with each other and with national policies. The European Strategy Forum on Research Infrastructures (ESFRI) selected the "LifeWatch e-science and technology infrastructure for biodiversity research" as a promising development to construct facilities that contribute to meeting those challenges. LifeWatch collaborates with other selected initiatives (e.g. ICOS, ANAEE, NOHA, and LTER-Europe) to achieve the integration of the infrastructures at landscape and regional scales. This should result in a cooperating cluster of such infrastructures supporting an integrated approach for data capture and transmission, data management and harmonisation. In addition, facilities for exploration, forecasting, and presentation using heterogeneous and distributed data and tools should allow interdisciplinary scientific research at any spatial and temporal scale. LifeWatch is an example of a new generation of interoperable research infrastructures based on standards and a service-oriented architecture that allows for linkage with external resources and associated infrastructures. External data sources will be established data aggregators such as the Global Biodiversity Information Facility (GBIF) for species occurrences and other EU Networks of Excellence such as the Long-Term Ecological Research Network (LTER), GMES, and GEOSS for terrestrial monitoring, the MARBEF network for marine data, and the Consortium of European Taxonomic Facilities (CETAF) and its European Distributed Institute of Taxonomy (EDIT) for taxonomic data. But also "smaller" networks and "volunteer scientists" may send data (e.g. GPS-supported species observations) to a LifeWatch repository. Autonomously operating wireless environmental sensors and other smart hand-held devices will contribute to increased data capture activities. In this way LifeWatch will directly underpin the development of GEO BON, the biodiversity component of GEOSS, the Global Earth Observation System of Systems. To overcome the major technical difficulties imposed by the variety of current and future technologies, protocols, data formats, etc., LifeWatch will define and use common open interfaces. For this purpose, the LifeWatch Reference Model was developed during the preparatory phase, specifying the service-oriented architecture underlying the ICT infrastructure. The Reference Model identifies key requirements and key architectural concepts to support workflows for scientific in-silico experiments, tracking of provenance, and semantic enhancement, besides meeting the functional requirements mentioned before. It provides guidelines for the specification and implementation of services and information models, defining as well a number of generic services and models. 
Another key issue addressed by the Reference Model is that the cooperation of many developer teams residing in many European countries has to be organised to obtain compatible results; conformance with the specifications and policies of the Reference Model will therefore be required. The LifeWatch Reference Model is based on the ORCHESTRA Reference Model for geospatial-oriented architectures and service networks, which provides a generic framework and has been endorsed as best practice by the Open Geospatial Consortium (OGC). The LifeWatch infrastructure will allow (interdisciplinary) scientific researchers to collaborate by creating e-Laboratories or by composing e-Services which can be shared and jointly developed. To this end, a long-term vision for the LifeWatch Biodiversity Workbench Portal has been developed as a one-stop application for the LifeWatch infrastructure based on existing and emerging technologies. There the user can find all available resources such as data, workflows and tools, and access LifeWatch applications that integrate different resources and provide key capabilities such as resource discovery and visualisation, creation of workflows, creation and management of provenance, and the support of collaborative activities. While LifeWatch developers will construct components for solving generic LifeWatch tasks, users may add their own facilities to fulfil individual needs. Examples of applying the LifeWatch Reference Model and the LifeWatch Biodiversity Workbench Portal will be given.
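The abstract names GBIF as one of the established aggregators from which LifeWatch will draw species occurrence data. As a minimal illustration of what pulling such records from an external aggregator can look like, the sketch below queries GBIF's public occurrence REST API with Python; the species name and country filter are arbitrary examples, not LifeWatch-specific identifiers.

```python
# Minimal sketch: querying the public GBIF occurrence API, one of the external
# data aggregators named in the abstract. Species and country are arbitrary examples.
import requests

GBIF_OCCURRENCE_URL = "https://api.gbif.org/v1/occurrence/search"

def fetch_occurrences(scientific_name, country=None, limit=20):
    """Return a list of occurrence records for a species from GBIF."""
    params = {"scientificName": scientific_name, "limit": limit}
    if country:
        params["country"] = country  # ISO 3166-1 alpha-2 code
    response = requests.get(GBIF_OCCURRENCE_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("results", [])

if __name__ == "__main__":
    for rec in fetch_occurrences("Alnus glutinosa", country="NL", limit=5):
        print(rec.get("decimalLatitude"), rec.get("decimalLongitude"), rec.get("eventDate"))
```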
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Griffin, R.; Bugbee, K.
2015-12-01
Various organizations such as the Group on Earth Observations (GEO) have developed a structure for general thematic areas in Earth science research; however, the Climate Data Initiative (CDI) is addressing the challenging goal of organizing such datasets around core themes specifically related to climate change impacts. These thematic areas, which currently include coastal flooding, food resilience, ecosystem vulnerability, water, transportation, energy infrastructure, and human health, form the core of a new college course at the University of Alabama in Huntsville developed around real-world applications in the Earth sciences. The goal of this course is to educate students on the available data and the scope of GIS applications in Earth science across the CDI climate themes. Real-world applications and datasets serve as a pedagogical tool and provide a useful medium for instruction in scientific geospatial analysis and GIS software. With a wide range of potential research areas falling under the rubric of "Earth science", thematic foci can help to structure a student's understanding of the potential uses of GIS across sub-disciplines, while communicating core data processing concepts. The learning modules and use-case scenarios for this course demonstrate the potential applications of CDI data to undergraduate and graduate Earth science students.
NASA Astrophysics Data System (ADS)
Argenti, M.; Giannini, V.; Averty, R.; Bigagli, L.; Dumoulin, J.
2012-04-01
The EC FP7 ISTIMES project has the goal of realizing an ICT-based system exploiting distributed and local sensors for non-destructive electromagnetic monitoring in order to make critical transport infrastructures more reliable and safe. Greater situation awareness, thanks to real-time, detailed information and images of the monitored infrastructure status, improves decision capabilities for emergency management stakeholders. Web-enabled sensors and a service-oriented approach form the core of the architecture, providing a system that adopts open standards (e.g. OGC SWE, OGC CSW, etc.) and strives for full interoperability with other GMES and European Spatial Data Infrastructure initiatives as well as compliance with INSPIRE. The system exploits an open, easily scalable network architecture to accommodate a wide range of sensors, integrated with a set of tools for handling, analyzing and processing large data volumes from different organizations with different data models. Situation awareness tools are also integrated in the system. The definition of sensor observations and services follows a metadata model based on the ISO 19115 core set of metadata elements and the O&M model of OGC SWE. The ISTIMES infrastructure is based on an e-Infrastructure for geospatial data sharing, with a Data Catalog that implements the discovery services for sensor data retrieval, acting as a broker through static connections based on standard SOS and WNS interfaces; a Decision Support component which helps decision makers by providing support for data fusion, inference and the generation of situation indexes; a Presentation component which implements system-user interaction services for information publication and rendering by means of a web portal using SOA design principles; and a security framework using the Shibboleth open source middleware, based on the Security Assertion Markup Language, supporting Single Sign-On (SSO). ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663
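The architecture above retrieves sensor data through standard SOS interfaces. As a rough client-side sketch of what such a retrieval can look like, the example below issues an OGC SOS 2.0 KVP GetObservation request with Python; the endpoint URL, offering and observed property identifiers are hypothetical placeholders, not actual ISTIMES identifiers.

```python
# Sketch of an OGC SOS 2.0 KVP GetObservation request, in the spirit of the broker
# connections to sensor services described above. The endpoint URL, offering and
# observedProperty identifiers are hypothetical.
import requests

SOS_ENDPOINT = "https://example.org/istimes/sos"  # hypothetical service URL

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "bridge_strain_sensors",                 # hypothetical offering id
    "observedProperty": "urn:ogc:def:property:strain",   # hypothetical property URI
    "temporalFilter": "om:phenomenonTime,2012-01-01T00:00:00Z/2012-01-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # O&M XML document with the requested observations
```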
Monitoring of changes in areas of conflicts: the example of Darfur
NASA Astrophysics Data System (ADS)
Thunig, H.; Michel, U.
2012-10-01
Rapid change detection is used in cases of natural hazards and disasters, and quickly yields information on areas of damage. In many cases, the lack of information after catastrophic events obstructs relief measures within disaster management. Earthquakes, tsunamis, civil war, volcanic eruptions, droughts and floods have much in common: people are directly affected, and landscapes and buildings are destroyed. In every case, geospatial data is necessary to gain knowledge as a basis for decision support. Where to go first? Which infrastructure is usable? How much area is affected? These are essential questions that need to be answered before appropriate help can be organized. This paper focuses on change detection applications in areas where catastrophic events have caused rapid destruction, especially of man-made objects. Standard methods for automated change detection proved not to be sufficient; therefore a new method was developed and tested. The presented method allows fast detection and visualization of change in areas of crisis or catastrophe. New remote sensing methods are often developed without user-oriented aspects in mind, so organizations and authorities cannot use them for lack of remote sensing expertise; therefore a semi-automated procedure was developed. Within a transferable framework, the developed algorithm can be applied to remote sensing data across different investigation areas. Several case studies form the basis of the results. Through a coarse division into statistical units and segmentation into meaningful objects, the framework is able to handle different types of change. By means of an elaborated Temporal Change Index (TCI), only panchromatic datasets are needed to distinguish destroyed areas, unaffected areas, and areas where rebuilding has already started.
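The abstract names a Temporal Change Index derived from panchromatic data but does not give its formula. The sketch below therefore shows only an illustrative normalised-difference change measure on two co-registered panchromatic images, to indicate the kind of pre/post comparison such an index builds on; it is not the authors' TCI, and the threshold and simulated data are invented.

```python
# Illustrative per-pixel change measure for two co-registered panchromatic images.
# This is NOT the Temporal Change Index (TCI) of the abstract, whose formula is
# not given; it only shows the kind of pre/post comparison such an index builds on.
import numpy as np

def normalised_difference_change(before: np.ndarray, after: np.ndarray) -> np.ndarray:
    """Return a change measure in [-1, 1]; values near 0 indicate little change."""
    before = before.astype(np.float64)
    after = after.astype(np.float64)
    return (after - before) / (after + before + 1e-9)

def classify_change(index: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Label pixels: -1 = darkened (e.g. destruction), 0 = stable, 1 = brightened."""
    labels = np.zeros(index.shape, dtype=np.int8)
    labels[index < -threshold] = -1
    labels[index > threshold] = 1
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img_before = rng.integers(50, 200, size=(100, 100))
    img_after = img_before.copy()
    img_after[40:60, 40:60] = 30  # simulate a destroyed block
    change = classify_change(normalised_difference_change(img_before, img_after))
    print("changed pixels:", int(np.count_nonzero(change)))
```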
NASA's Geospatial Interoperability Office (GIO) Program
NASA Technical Reports Server (NTRS)
Weir, Patricia
2004-01-01
NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities, the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as Earth Science Gateway (ESG), Space Time Tool Kit and Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency-level requirements, GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium and ISO TC211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the Geographic Information/Geomatics technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to use these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query processing schema for distributed computing is presented, together with a method for the equivalent transformation of a global geospatial query into distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
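The core idea is the transformation of one global query into equivalent local SQL queries executed at autonomous peers, whose partial results are then merged. The sketch below illustrates that decomposition for a simple bounding-box query; the peer objects, table and column names are hypothetical stand-ins, and real peers would of course execute the SQL against their own databases rather than the stubbed functions used here.

```python
# Sketch of decomposing a global bounding-box query into per-peer SQL queries and
# merging the partial results, in the spirit of the paper's Equivalent Distributed
# Program. Peer connection objects, table and column names are hypothetical.
from typing import Dict, Iterable, List

LOCAL_SQL_TEMPLATE = (
    "SELECT id, name, lon, lat FROM {table} "
    "WHERE lon BETWEEN {minx} AND {maxx} AND lat BETWEEN {miny} AND {maxy}"
)

def build_local_queries(peers: Iterable[Dict], bbox: Dict[str, float]) -> List[Dict]:
    """Translate the global query into one equivalent SQL query per peer."""
    return [
        {"peer": p["name"], "sql": LOCAL_SQL_TEMPLATE.format(table=p["table"], **bbox)}
        for p in peers
    ]

def run_global_query(peers: List[Dict], bbox: Dict[str, float]) -> List[tuple]:
    """Execute the local queries (simulated here) and merge the partial results."""
    results: List[tuple] = []
    for job in build_local_queries(peers, bbox):
        peer = next(p for p in peers if p["name"] == job["peer"])
        results.extend(peer["execute"](job["sql"]))  # each peer runs its own SQL
    return results

if __name__ == "__main__":
    peers = [
        {"name": "peer_a", "table": "roads_a",
         "execute": lambda sql: [(1, "road-1", 10.1, 45.2)]},
        {"name": "peer_b", "table": "roads_b",
         "execute": lambda sql: [(7, "road-7", 10.4, 45.6)]},
    ]
    print(run_global_query(peers, {"minx": 10, "maxx": 11, "miny": 45, "maxy": 46}))
```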
Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost
NASA Astrophysics Data System (ADS)
Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.
2016-12-01
The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of and access to the datasets. An online user survey conducted within the project showed that the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionality before accessing them. In response, we are developing the Permafrost Information System PerSys, conceptualized as an open access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Mapping Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.
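Since the products are to be published through OGC WMS and WFS, a client can preview a raster product with a plain GetMap request. The sketch below shows such a WMS 1.3.0 request in Python; the service URL and layer name are hypothetical placeholders rather than actual PerSys identifiers.

```python
# Sketch of a WMS 1.3.0 GetMap request for previewing a published raster product.
# The service URL and layer name are hypothetical placeholders, not actual PerSys ids.
import requests

WMS_ENDPOINT = "https://example.org/persys/wms"  # hypothetical

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "globpermafrost:landcover_2015",   # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "60,-180,90,180",   # WMS 1.3.0 axis order for EPSG:4326 is lat,lon
    "width": 1024,
    "height": 256,
    "format": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
with open("landcover_preview.png", "wb") as fh:
    fh.write(response.content)
```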
Discovery of Marine Datasets and Geospatial Metadata Visualization
NASA Astrophysics Data System (ADS)
Schwehr, K. D.; Brennan, R. T.; Sellars, J.; Smith, S.
2009-12-01
NOAA's National Geophysical Data Center (NGDC) provides the deep archive of US multibeam sonar hydrographic surveys. NOAA stores the data as Bathymetric Attributed Grids (BAG; http://www.opennavsurf.org/), which are HDF5-formatted files containing gridded bathymetry, gridded uncertainty, and XML metadata. While NGDC provides the deep store and a basic ESRI ArcIMS interface to the data, additional tools need to be created to increase the frequency with which researchers discover hydrographic surveys that might be beneficial for their research. Using open source tools, we have created a draft Google Earth visualization of NOAA's complete collection of BAG files as of March 2009. Each survey is represented as a bounding box, an optional preview image of the survey data, and a pop-up placemark. The placemark contains a brief summary of the metadata and links to directly download the BAG survey files and the complete metadata file. Each survey is time-tagged so that users can search both in space and time for surveys that meet their needs. By creating this visualization, we aim to make the entire process of data discovery, validation of relevance, and download much more efficient for research scientists who may not be familiar with NOAA's hydrographic survey efforts or the BAG format. In the process of creating this demonstration, we have identified a number of improvements that can be made to the hydrographic survey process in order to make the results easier to use, especially with respect to metadata generation. With the combination of the NGDC deep archiving infrastructure, a Google Earth virtual globe visualization, and GeoRSS feeds of updates, we hope to increase the utilization of these high-quality gridded bathymetry data. This workflow applies equally well to LIDAR topography and bathymetry. Additionally, with proper referencing and geotagging in journal publications, we hope to close the loop and help the community create a true “Geospatial Scholar” infrastructure.
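Each survey is shown in Google Earth as a time-tagged bounding box with a pop-up placemark linking to the BAG download. The sketch below writes a minimal KML document for one such survey footprint using plain string formatting; the survey name, download URL, coordinates and dates are made-up examples, not actual NOAA survey metadata.

```python
# Minimal sketch: writing a KML placemark with a bounding-box footprint and a time
# span for one hydrographic survey, roughly as described for the BAG visualization.
# The survey name, links and coordinates below are made-up examples.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <description><![CDATA[<a href="{url}">Download BAG</a>]]></description>
    <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
    <Polygon><outerBoundaryIs><LinearRing><coordinates>
      {w},{s},0 {e},{s},0 {e},{n},0 {w},{n},0 {w},{s},0
    </coordinates></LinearRing></outerBoundaryIs></Polygon>
  </Placemark>
</kml>
"""

def survey_to_kml(name, url, bbox, begin, end):
    """bbox is (west, south, east, north) in decimal degrees."""
    w, s, e, n = bbox
    return KML_TEMPLATE.format(name=name, url=url, w=w, s=s, e=e, n=n,
                               begin=begin, end=end)

if __name__ == "__main__":
    kml = survey_to_kml("H12345 (example)", "https://example.org/H12345.bag",
                        (-70.9, 41.2, -70.6, 41.5), "2008-06-01", "2008-06-15")
    with open("survey_H12345.kml", "w", encoding="utf-8") as fh:
        fh.write(kml)
```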
Geospatial Data Management Platform for Urban Groundwater
NASA Astrophysics Data System (ADS)
Gaitanaru, D.; Priceputu, A.; Gogu, C. R.
2012-04-01
Due to the large number of civil work projects and research studies, large quantities of geo-data are produced for urban environments. These data are often redundant and are spread across different institutions and private companies. Time-consuming operations such as data processing and information harmonisation are the main reason data re-use is systematically avoided. Urban groundwater data show the same complex situation. Underground structures (subway lines, deep foundations, underground parking structures, and others), urban facility networks (sewer systems, water supply networks, heating conduits, etc.), drainage systems, surface water works and many others change continuously. As a consequence, their influence on groundwater changes systematically. However, these activities also provide a large quantity of data, so aquifer modelling and behaviour prediction can be done using monitored quantitative and qualitative parameters. Due to the rapid evolution of technology in the past few years, transferring large amounts of information through the internet has become a feasible solution for sharing geoscience data. Furthermore, standard platform-independent means to do this have been developed (specific mark-up languages such as GML, GeoSciML, WaterML, GWML, and CityGML). They allow large geospatial databases to be easily updated and shared over the internet, even between different companies or between research centres that do not necessarily use the same database structures. For Bucharest City (Romania), an integrated platform for groundwater geospatial data management is being developed under the framework of a national research project, "Sedimentary media modeling platform for groundwater management in urban areas" (SIMPA), financed by the National Authority for Scientific Research of Romania. The platform architecture is based on three components: a geospatial database, a desktop application (a complex set of hydrogeological and geological analysis tools) and a front-end geoportal service. The SIMPA platform makes use of mark-up transfer standards to provide a user-friendly application that can be accessed through the internet to query, analyse, and visualise geospatial data related to urban groundwater. The platform holds the information within the local groundwater geospatial databases, and the user is able to access this data through a geoportal service. The database architecture allows storing accurate and very detailed geological, hydrogeological, and infrastructure information that can be straightforwardly generalized and further upscaled. The geoportal service offers the possibility of querying a dataset from the spatial database. The query is coded in a standard mark-up language and sent to the server through the standard Hypertext Transfer Protocol (HTTP) to be processed by the local application. After validation of the query, the results are sent back to the user to be displayed by the geoportal application. The main advantage of the SIMPA platform is that it offers the user the possibility of making a primary multi-criteria query, which results in a smaller set of data to be analysed afterwards. This improves both the transfer process parameters and the user's means of creating the desired query.
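The geoportal query is encoded in a standard mark-up language and sent to the server over HTTP, but the abstract does not name the exact service interface. The sketch below uses an OGC WFS 2.0 GetFeature request as one plausible realisation of such a query; the service URL, feature type and bounding box are hypothetical, not actual SIMPA identifiers.

```python
# Sketch of a standards-based geoportal query over HTTP. The abstract does not name
# the exact interface, so an OGC WFS 2.0 GetFeature request is used here as one
# plausible realisation; the URL, type name and bounding box are hypothetical.
import requests

WFS_ENDPOINT = "https://example.org/simpa/wfs"  # hypothetical geoportal service

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "simpa:monitoring_well",      # hypothetical feature type
    "srsName": "EPSG:4326",
    "bbox": "26.0,44.3,26.2,44.5",             # approx. Bucharest extent (axis order depends on server)
    "count": 50,
}

response = requests.get(WFS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # GML feature collection describing the wells
```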
NASA Astrophysics Data System (ADS)
Johnson, A. B.
2012-12-01
Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly, and it has been difficult to create effective curricula as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation-supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curricula. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that helped educators define which competencies should be included in a specific course and the depth of instruction for each competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curricula integrating geospatial science and technology into geoscience programs.
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Kawamoto, Kensaku; Lobach, David F; Willard, Huntington F; Ginsburg, Geoffrey S
2009-03-23
In recent years, the completion of the Human Genome Project and other rapid advances in genomics have led to increasing anticipation of an era of genomic and personalized medicine, in which an individual's health is optimized through the use of all available patient data, including data on the individual's genome and its downstream products. Genomic and personalized medicine could transform healthcare systems and catalyze significant reductions in morbidity, mortality, and overall healthcare costs. Critical to the achievement of more efficient and effective healthcare enabled by genomics is the establishment of a robust, nationwide clinical decision support infrastructure that assists clinicians in their use of genomic assays to guide disease prevention, diagnosis, and therapy. Requisite components of this infrastructure include the standardized representation of genomic and non-genomic patient data across health information systems; centrally managed repositories of computer-processable medical knowledge; and standardized approaches for applying these knowledge resources against patient data to generate and deliver patient-specific care recommendations. Here, we provide recommendations for establishing a national decision support infrastructure for genomic and personalized medicine that fulfills these needs, leverages existing resources, and is aligned with the Roadmap for National Action on Clinical Decision Support commissioned by the U.S. Office of the National Coordinator for Health Information Technology. Critical to the establishment of this infrastructure will be strong leadership and substantial funding from the federal government. A national clinical decision support infrastructure will be required for reaping the full benefits of genomic and personalized medicine. Essential components of this infrastructure include standards for data representation; centrally managed knowledge repositories; and standardized approaches for leveraging these knowledge repositories to generate patient-specific care recommendations at the point of care.
A GeoServices Infrastructure for Near-Real-Time Access to Suomi NPP Satellite Data
NASA Astrophysics Data System (ADS)
Evans, J. D.; Valente, E. G.; Hao, W.; Chettri, S.
2012-12-01
The new Suomi National Polar-orbiting Partnership (NPP) satellite extends NASA's moderate-resolution, multispectral observations with a suite of powerful imagers and sounders to support a broad array of research and applications. However, NPP data products consist of a complex set of data and metadata files in highly specialized formats, which NPP's operational ground segment delivers to users only with several hours' delay. This severely limits their use in critical applications such as weather forecasting, emergency and disaster response, search and rescue, and other activities that require near-real-time access to satellite observations. Alternative approaches, based on distributed Direct Broadcast facilities, can reduce the delay in NPP data delivery from hours to minutes, and can make products more directly usable by practitioners in the field. To assess and fulfill this potential, we are developing a suite of software that couples Direct Broadcast data feeds with a streamlined, scalable processing chain and geospatial Web services, so as to permit many more time-sensitive applications to use NPP data. The resulting geoservices infrastructure links a variety of end-user tools and applications to NPP data from different sources, and to other rapidly changing geospatial data. By using well-known, standard software interfaces (such as OGC Web Services or OPeNDAP), this infrastructure serves a variety of end-user analysis and visualization tools, giving them access to datasets of arbitrary size and resolution and allowing them to request and receive tailored products on demand. The standards-based approach may also streamline data sharing among independent satellite receiving facilities, thus helping them to interoperate in providing frequent, composite views of continent-scale or global regions. To enable others to build similar or derived systems, the service components we are developing (based in part on the Community Satellite Processing Package (CSPP) from the University of Wisconsin and the International Polar-Orbiter Processing Package (IPOPP) from NASA) are being released as open source software. Furthermore, they are configured to operate in a cloud computing environment, so as to allow even small organizations to process and serve NPP data without large hardware investments, and to maintain near-real-time performance cost-effectively by growing and shrinking their use of computing resources to meet large, rapid fluctuations in end-user demand, data availability, and processing needs. (This is especially important for polar-orbiting satellites like NPP, which pass within range of a receiver only a few times each day.) We will discuss the design of the infrastructure, highlight its capabilities, and sketch its potential to facilitate broad access to satellite data processing and visualization, and to enhance near-real-time applications via distributed NPP data streams.
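Because the infrastructure exposes data through standard interfaces such as OPeNDAP, a client can open a remote dataset lazily and pull only the subset it needs. The sketch below does this with the netCDF4 library; the dataset URL, variable name and array dimensions are hypothetical placeholders, since no concrete endpoints are given in the abstract.

```python
# Sketch of client-side access to a served dataset over OPeNDAP using the netCDF4
# library. The dataset URL, variable name and dimensions are hypothetical
# placeholders; a real endpoint would be advertised by the geoservices
# infrastructure described above.
from netCDF4 import Dataset

OPENDAP_URL = "https://example.org/opendap/npp/viirs_sst_latest.nc"  # hypothetical

with Dataset(OPENDAP_URL) as ds:                   # opens lazily; no full download
    print(ds.variables.keys())                     # discover available variables
    sst = ds.variables["sea_surface_temperature"]  # hypothetical variable name
    # Request only a spatial subset; only these bytes travel over the network.
    subset = sst[0, 100:200, 100:200]
    print(subset.shape, float(subset.mean()))
```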
Improving the Slum Planning Through Geospatial Decision Support System
NASA Astrophysics Data System (ADS)
Shekhar, S.
2014-11-01
In India, a number of schemes and programmes have been launched from time to time in order to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have been only partially successful in dealing with these problems. The study of existing policies and programmes also showed that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than of participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display, tabular reporting capabilities and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake decision research, producing a large number of possible decision alternatives and providing opportunities to involve the community in decision-making. The objective is to help decision makers and planners find solutions through a quantitative spatial evaluation and verification process. The study investigates the options for slum development within the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules realizing the GSDSS were developed using ArcGIS and CommunityViz software for Gulbarga city.
Global patterns of current and future road infrastructure
NASA Astrophysics Data System (ADS)
Meijer, Johan R.; Huijbregts, Mark A. J.; Schotten, Kees C. G. J.; Schipper, Aafke M.
2018-06-01
Georeferenced information on road infrastructure is essential for spatial planning, socio-economic assessments and environmental impact analyses. Yet current global road maps are typically outdated or characterized by spatial bias in coverage. In the Global Roads Inventory Project we gathered, harmonized and integrated nearly 60 geospatial datasets on road infrastructure into a global roads dataset. The resulting dataset covers 222 countries and includes over 21 million km of roads, which is two to three times the total length in the currently best available country-based global roads datasets. We then related total road length per country to country area, population density, GDP and OECD membership, resulting in a regression model with an adjusted R² of 0.90, and found that the highest road densities are associated with densely populated and wealthier countries. Applying our regression model to future population densities and GDP estimates from the Shared Socioeconomic Pathway (SSP) scenarios, we obtained a tentative estimate of 3.0–4.7 million km of additional road length for the year 2050. Large increases in road length were projected for developing nations in some of the world's last remaining wilderness areas, such as the Amazon, the Congo basin and New Guinea. This highlights the need for accurate spatial road datasets to underpin strategic spatial planning and to reduce the impacts of roads in remaining pristine ecosystems.
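The study regresses total road length per country on country area, population density, GDP and OECD membership. The sketch below shows one way such a country-level regression could be set up with pandas and statsmodels on a tiny made-up table; it is illustrative only and does not reproduce the authors' actual model specification, transformations or data.

```python
# Illustrative country-level regression in the spirit of the study: road length as a
# function of area, population density, GDP and OECD membership. The toy data below
# are made up, and the specification is not the authors' actual model.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "road_km":     [4_500, 120_000, 640_000, 95_000, 1_050_000, 78_000],
    "area_km2":    [45_000, 310_000, 3_200_000, 240_000, 9_100_000, 110_000],
    "pop_density": [110, 95, 150, 420, 36, 270],
    "gdp_pc":      [5_200, 14_800, 9_300, 41_000, 56_000, 23_000],
    "oecd":        [0, 0, 0, 1, 1, 1],
})

model = smf.ols("road_km ~ area_km2 + pop_density + gdp_pc + oecd", data=data).fit()
print(model.params)
print("adjusted R^2:", model.rsquared_adj)

# A fitted model of this kind can then be applied to projected population density
# and GDP (e.g. from SSP scenarios) to estimate future road length per country.
```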
Advanced e-Infrastructures for Civil Protection applications: the CYCLOPS Project
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Ayral, P. A.; Fiorucci, P.; Pina, A.; Oliveira, J.; Sorani, R.
2009-04-01
During the full cycle of emergency management, Civil Protection operative procedures involve many actors belonging to several institutions (civil protection agencies, public administrations, research centers, etc.) playing different roles (decision-makers, data and service providers, emergency squads, etc.). In this context, the sharing of information is a vital requirement for making correct and effective decisions. Therefore a European-wide technological infrastructure providing distributed and coordinated access to different kinds of resources (data, information, services, expertise, etc.) could enhance existing Civil Protection applications and even enable new ones. Such a European Civil Protection e-Infrastructure should be designed taking into account the specific requirements of Civil Protection applications and the state of the art in the scientific and technological disciplines which could make emergency management more effective. In recent years Grid technologies have reached a mature state, providing a platform for secure and coordinated resource sharing between participants collected in so-called Virtual Organizations. Moreover, Earth and Space Sciences Informatics provides the conceptual tools for modeling the geospatial information shared in Civil Protection applications during its entire lifecycle. Therefore a European Civil Protection e-Infrastructure might be based on a Grid platform enhanced with Earth Sciences services. In the context of the 6th Framework Programme, the EU co-funded project CYCLOPS (CYber-infrastructure for CiviL protection Operative ProcedureS), ended in December 2008, addressed the problem of defining the requirements and identifying the research strategies and innovation guidelines towards an advanced e-Infrastructure for Civil Protection. Starting from the requirements analysis, CYCLOPS proposed an architectural framework for a European Civil Protection e-Infrastructure. This architectural framework has been evaluated through the development of prototypes of two operative applications used by the Italian Civil Protection for Wild Fires Risk Assessment (RISICO) and by the French Civil Protection for Flash Flood Risk Management (SPC-GD). The results of these studies and proofs of concept have been used as the basis for the definition of research and innovation strategies aiming at the detailed design and implementation of the infrastructure. In particular, the main research themes and topics to be addressed have been identified and detailed. Finally, the obstacles to the innovation required for the adoption of this infrastructure, and possible strategies to overcome them, have been discussed.
EPA Geospatial Quality Council Promoting Quality Assurance in the Geospatial Community
After establishing a foundation for the EPA National Geospatial Program, the EPA Geospatial Quality Council (GQC) is, in part, focusing on improving administrative efficiency in the geospatial community. To realize this goal, the GQC is developing Standard Operating Procedures (S...
International boundary experiences by the United Nations
NASA Astrophysics Data System (ADS)
Kagawa, A.
2013-12-01
Over the last few decades, the United Nations (UN) has been approached by the Security Council and Member States on international boundary issues. The United Nations regards the adequate delimitation and demarcation of international boundaries as a very important element for the maintenance of peace and security in fragile post-conflict situations and for the establishment of friendly relationships and cross-border cooperation between States. This paper will present the main principles and framework the United Nations applies to support the process of international boundary delimitation and demarcation. The United Nations is involved in international boundary issues following the principle of impartiality and neutrality and in its role as mediator. Since international boundary issues are multi-faceted, a range of expertise is required, and the United Nations Secretariat is in a good position to provide diverse expertise across its multiple departments. Expertise in departments ranging from legal, political and technical to administrative and logistical is mobilised in different ways to support Member States depending on their specific needs. This presentation aims to highlight some of the international boundary projects in which the United Nations Cartographic Section has been involved in order to provide technical support to different boundary requirements, as each international boundary issue requires specific focus and attention, whether in preparation, delimitation, demarcation or management. Increasingly, the United Nations is leveraging geospatial technology to facilitate the boundary delimitation and demarcation process between Member States. Through case studies ranging from Iraq - Kuwait, Israel - Lebanon (Blue Line), Eritrea - Ethiopia, Cyprus (Green Line) and Cameroon - Nigeria to Sudan - South Sudan, the presentation will illustrate how geospatial technology is increasingly used to carry out this support. In applying a range of geospatial solutions, good practices from preceding projects have been carried forward, but challenges and limitations have also been faced. These challenges should be seen as an opportunity to improve geospatial technology solutions in future international boundary projects. This presentation will also share the aspiration of the United Nations Cartographic Section to become a facilitator in geospatial technical aspects related to international boundary issues as it develops its geospatial institutional knowledge base and expertise. The presentation will conclude by emphasizing the need for more collaboration between the different actors dealing with geospatial technology on borderland issues in order to meet the main goal of the United Nations - to live and work together as "We the Peoples of the United Nations".
The Semi-opened Infrastructure Model (SopIM): A Frame to Set Up an Organizational Learning Process
NASA Astrophysics Data System (ADS)
Grundstein, Michel
In this paper, we introduce the "Semi-opened Infrastructure Model (SopIM)" implemented to deploy Artificial Intelligence and Knowledge-based Systems within a large industrial company. This model illustrates what could be two of the operating elements of the Model for General Knowledge Management within the Enterprise (MGKME) that are essential to set up the organizational learning process that leads people to appropriate and use concepts, methods and tools of an innovative technology: the "Ad hoc Infrastructures" element, and the "Organizational Learning Processes" element.
SERVIR: Environmental Decision Making in the Americas
NASA Technical Reports Server (NTRS)
Lapenta, William; Irwin, Dan
2008-01-01
SERVIR is a regional visualization and monitoring system for Mesoamerica that integrates satellite and other geospatial data for improved scientific knowledge and decision making by managers, researchers, students, and the general public. SERVIR addresses the nine societal benefit areas of the Global Earth Observation System of Systems (GEOSS). This talk will provide an overview of products and services available through SERVIR.
Spatial Knowledge Infrastructures - Creating Value for Policy Makers and Benefits for the Community
NASA Astrophysics Data System (ADS)
Arnold, L. M.
2016-12-01
The spatial data infrastructure is arguably one of the most significant advancements in the spatial sector. It's been a game changer for governments, providing for the coordination and sharing of spatial data across organisations and the provision of accessible information to the broader community of users. Today, however, end-users such as policy-makers require far more from these spatial data infrastructures. They want more than just data; they want the knowledge that can be extracted from data, and they don't want to have to download, manipulate and process data in order to get the knowledge they seek. It's time for the spatial sector to reduce its focus on data in spatial data infrastructures and take a more proactive step in emphasising and delivering the knowledge value. Nowadays, decision-makers want to be able to query the data at will to meet their immediate need for knowledge. This is a new value proposition for the decision-making consumer and will require a shift in thinking. This paper presents a model for a Spatial Knowledge Infrastructure and the underpinning methods that will realise a new real-time approach to delivering knowledge. The methods embrace the new capabilities afforded through the semantic web, domain and process ontologies and natural language query processing. Semantic Web technologies today have the potential to transform the spatial industry into more than just a distribution channel for data. The Semantic Web RDF (Resource Description Framework) enables meaning to be drawn from data automatically. While pushing data out to end-users will remain a central role for data producers, the power of the semantic web is that end-users have the ability to marshal a broad range of spatial resources via a query to extract knowledge from available data. This can be done without having to configure systems specifically for the end-user. All data producers need to do is make data accessible in RDF, and the spatial analytics does the rest.
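The model rests on data producers exposing resources in RDF so that end-users can extract knowledge directly through queries. The sketch below loads a tiny RDF graph with rdflib and answers a policy-style question with SPARQL; the vocabulary (the ex: namespace) and the facts are invented purely for illustration, and a real Spatial Knowledge Infrastructure would query producers' own published RDF resources.

```python
# Minimal sketch of query-driven knowledge extraction from RDF, as envisaged for a
# Spatial Knowledge Infrastructure. The vocabulary (ex:) and the facts are invented
# purely for illustration.
from rdflib import Graph

TURTLE_DATA = """
@prefix ex: <http://example.org/ski#> .
ex:Parcel1 ex:inSuburb ex:Greenfield ; ex:floodRisk "high" .
ex:Parcel2 ex:inSuburb ex:Greenfield ; ex:floodRisk "low" .
ex:Parcel3 ex:inSuburb ex:Riverton  ; ex:floodRisk "high" .
"""

QUERY = """
PREFIX ex: <http://example.org/ski#>
SELECT ?parcel ?suburb
WHERE { ?parcel ex:floodRisk "high" ; ex:inSuburb ?suburb . }
"""

g = Graph()
g.parse(data=TURTLE_DATA, format="turtle")
for parcel, suburb in g.query(QUERY):
    print(f"{parcel} in {suburb} is at high flood risk")
```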
Grid Enabled Geospatial Catalogue Web Service
NASA Technical Reports Server (NTRS)
Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush
2004-01-01
Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas, over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System and the service metadata standards from ISO 19119 to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also allows researchers to focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.
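GCWS extends the OGC CSW information model, so a useful point of reference is what a plain CSW catalogue search looks like from the client side. The sketch below issues a CSW 2.0.2 KVP GetRecords request with Python; the catalogue URL and search term are hypothetical, and the Grid/VO security layer described in the abstract is not shown.

```python
# Sketch of a client-side catalogue search using a plain OGC CSW 2.0.2 KVP
# GetRecords request. The catalogue URL and search term are hypothetical, and the
# Grid/VO security layer described in the abstract is not shown here.
import requests

CSW_ENDPOINT = "https://example.org/gcws/csw"  # hypothetical catalogue service

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "resultType": "results",
    "elementSetName": "summary",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%land cover%'",   # hypothetical search term
    "maxRecords": 10,
}

response = requests.get(CSW_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # csw:GetRecordsResponse XML with matching records
```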
Giovanni - The Bridge Between Data and Science
NASA Technical Reports Server (NTRS)
Liu, Zhong; Acker, James
2017-01-01
This article describes new features in the Geospatial Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni), a user-friendly online tool that enables visualization, analysis, and assessment of NASA Earth science data sets without downloading data and software. Since the satellite era began, data collected from Earth-observing satellites have been widely used in research and applications; however, using satellite-based data sets can still be a challenge to many. To facilitate data access and evaluation, as well as scientific exploration and discovery, the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) has developed Giovanni for a wide range of users around the world. This article describes the latest capabilities of Giovanni with examples, and discusses future plans for this innovative system.
Development of a Water Infrastructure Knowledge Database
This paper presents a methodology for developing a national database, as applied to water infrastructure systems, which includes both drinking water and wastewater. The database is branded as "WATERiD" and can be accessed at www.waterid.org. Water infrastructure in the U.S. is ag...
Geospatial considerations for a multiorganizational, landscape-scale program
O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.
2013-01-01
Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.
NASA Astrophysics Data System (ADS)
Patton, Ashley M.
2016-04-01
Reusing existing subsurface data can greatly cut the time and financial costs of site investigations, and reduce the uncertainty regarding ground conditions that can result in delays and overspend. In Hong Kong SAR it is common practice for consultancies to deposit records in the form of factual and interpretive reports, borehole logs and laboratory test data with the Geotechnical Engineering Office (GEO), which makes this information openly available for future investigative works. In addition to these deposits, other datasets available at GEO include, amongst others, landslide records, aerial photographs and as-built records. These archives are the first source of information about development sites in Hong Kong, and no investigation takes place without a thorough desk study. Increasingly these data are digital, and can be accessed through a GIS-based online portal. In the U.K. the British Geological Survey (BGS) acts as a custodian for geoscience data deposited by the public and private sectors on a voluntary basis, and encourages organisations to make their data publicly available through the BGS online data portals. The facility to deposit digital data via the BGS website has recently been launched and should increase the uptake of data sharing in the U.K. as it becomes easier for users to batch-upload records digitally. Issues regarding data ownership and confidentiality are being overcome by the establishment, in some cities, of knowledge exchange networks whose members, who sign up to view data, are expected under the terms of membership to deposit data. This has received backing from local government in some areas. The U.K. may not have the density of existing data that Hong Kong has, but as knowledge exchange gathers momentum the BGS datasets are expected to grow rapidly. In Europe there appears to be a reluctance to share data. However, escalating demand for land, greater redevelopment of brownfield sites and an ever-growing need to ensure future construction and infrastructure projects are sustainable and compliant with European environmental targets mean that reusing data may have a role to play in increasing subsurface knowledge, reducing unforeseen ground conditions and ultimately saving money. Data in the .ags format are particularly favourable due to their uniform nature. First-hand experience of the approach towards disseminating geospatial data in Hong Kong and the U.K. will be presented, examining the difference in attitudes regarding data sharing in the two territories and highlighting how it benefits ground investigations and geohazard assessment, with the hope that Europe can learn lessons for the future and change old habits. The different systems of data sharing used in Hong Kong and the U.K. will be discussed, and their strengths and weaknesses evaluated, with the aim of fostering a methodology for sharing geoscience data within Europe that benefits from the combined successes of both approaches and builds upon existing expertise.
Global polar geospatial information service retrieval based on search engine and ontology reasoning
Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang
2007-01-01
In order to improve the precision of access to polar geospatial information services on the web, a new methodology for retrieving global spatial information services, based on geospatial service search and ontology reasoning, is proposed: the geospatial service search is implemented to find coarse services from the web, and the ontology reasoning is designed to find refined services from among the coarse services. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.
Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang
2016-01-01
The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912
I Want It, You've Got It - Effectively Connect Users to Geospatial Resources
NASA Astrophysics Data System (ADS)
White, C. E.
2012-12-01
How do users of scientific data find what they need? How do they know where to look, what to look for, how to evaluate, and - if they find the right resource - then how to get it? When the data is of a geospatial nature, other factors also come into play - is the data in a format/projection compatible with other data being used, does the user have access to tools that can analyze and display the data to adequately evaluate it, and does the user have knowledge on how to manage that access - especially if the data is being exposed by web services. Supporting users to connect them to geospatial data in a continually evolving technological climate is a challenge that reaches deeply into all levels of data management. In this talk, we will discuss specific challenges in how users discover and access resources, and how Esri has evolved solutions over time to more effectively connect users to what they need. Some of the challenges - and current solutions - that will be discussed are: balancing a straightforward user experience with rich functionality, providing simple descriptions while maintaining complete metadata, enabling data access to work with an organization's content while being compatible with other organizations' access mechanisms, and the ability to publish data once yet share it in many venues.
Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India
NASA Astrophysics Data System (ADS)
Mohan, M.
2016-06-01
In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries of the world such as India. The study is based on the latest multi-date, multi-stage, multi-sensor and multi-resolution geospatial satellite imagery. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems have been used for 3-D geovisualisation, geospatial digital mapping and geospatial analysis in the development of smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information supports 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for the geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, which is also known as geovisualisation. As a result, increasing research interest is being directed towards geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.
NASA Astrophysics Data System (ADS)
Veenendaal, B.
2014-04-01
A wide range of geographic information science (GIScience) educational programs currently exist, the oldest now over 25 years. Offerings vary from those specifically focussed on geographic information science, to those that utilise geographic information systems in various applications and disciplines. Over the past two decades, there have been a number of initiatives to design curricula for GIScience, including the NCGIA Core Curriculum, GIS&T Body of Knowledge and the Geospatial Technology Competency Model developments. The rapid developments in geospatial technology, applications and organisations have added to the challenges that higher educational institutions face in order to ensure that GIScience education is relevant and responsive to the changing needs of students and industry. This paper discusses some of the challenges being faced in higher education in general, and GIScience education in particular, and outlines a flexible higher education curriculum framework for GIScience.
Integrating sea floor observatory data: the EMSO data infrastructure
NASA Astrophysics Data System (ADS)
Huber, Robert; Azzarone, Adriano; Carval, Thierry; Doumaz, Fawzi; Giovanetti, Gabriele; Marinaro, Giuditta; Rolin, Jean-Francois; Beranzoli, Laura; Waldmann, Christoph
2013-04-01
The European research infrastructure EMSO is a European network of fixed-point, deep-seafloor and water-column observatories deployed at key sites of the European continental margin and the Arctic. It aims to provide the technological and scientific framework for the investigation of environmental processes related to the interaction between the geosphere, biosphere and hydrosphere, and for sustainable management through long-term monitoring, including real-time data transmission. EMSO has been on the ESFRI (European Strategy Forum on Research Infrastructures) roadmap since 2006 and entered its construction phase in 2012. Within this framework, EMSO is contributing to large infrastructure integration projects such as ENVRI and COOPEUS. The EMSO infrastructure is geographically distributed across key sites in European waters, spanning from the Arctic through the Atlantic and the Mediterranean Sea to the Black Sea. It presently consists of thirteen sites identified by the scientific community according to their importance with respect to marine ecosystems, climate change and marine geohazards. The data infrastructure for EMSO is being designed as a distributed system. Presently, EMSO data collected during experiments at each EMSO site are locally stored and organized in catalogues or relational databases run by the responsible regional EMSO nodes. Three major institutions and their data centers currently offer access to EMSO data: PANGAEA, INGV and IFREMER. In continuation of the IT activities performed during EMSO's twin project ESONET, EMSO is now implementing the ESONET data architecture within an operational EMSO data infrastructure. EMSO aims to be compliant with relevant marine initiatives such as MyOceans, EUROSITES, EuroARGO, SEADATANET and EMODNET, as well as to meet the requirements of international and interdisciplinary projects such as COOPEUS, ENVRI, EUDAT and iCORDI. A major focus is therefore set on the standardization and interoperability of the EMSO data infrastructure. Besides common standards for metadata exchange such as OpenSearch and OAI-PMH, EMSO has chosen to implement core Open Geospatial Consortium (OGC) standards such as the Catalogue Service for the Web (CSW) and the Sensor Web Enablement (SWE) suite, including the Sensor Observation Service (SOS) and Observations and Measurements (O&M). Furthermore, strong integration efforts are currently being undertaken to harmonize data formats (e.g. NetCDF) as well as the ontologies and terminologies used. The presentation will also inform users about the discovery and visualization procedures for the EMSO data presently available.
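As an illustration of the kind of standards-based access the EMSO data infrastructure targets, the following is a minimal sketch of a Sensor Observation Service (SOS) 2.0 key-value-pair GetObservation request issued from Python. The endpoint URL, offering identifier and observed property are hypothetical placeholders, not actual EMSO identifiers.

import requests

SOS_ENDPOINT = "https://example.org/emso/sos"  # hypothetical endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "seafloor-observatory-1",             # hypothetical offering id
    "observedProperty": "sea_water_temperature",       # hypothetical property id
    "temporalFilter": "om:phenomenonTime,2013-01-01/2013-01-31",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
# the body is an O&M XML document containing the requested observations
print(response.text[:500])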
International Convergence on Geoscience Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Allison, M. L.; Atkinson, R.; Arctur, D. K.; Cox, S.; Jackson, I.; Nativi, S.; Wyborn, L. A.
2012-04-01
There is growing international consensus on addressing the challenges to cyber(e)-infrastructure for the geosciences. These challenges include: creating common standards and protocols; engaging the vast number of distributed data resources; establishing practices for recognition of and respect for intellectual property; developing simple data and resource discovery and access systems; building mechanisms to encourage the development of web service tools and workflows for data analysis; brokering the diverse disciplinary service buses; creating sustainable business models for the maintenance and evolution of information resources; and integrating the data management life cycle into the practice of science. Efforts around the world are converging towards the de facto creation of an integrated global digital data network for the geosciences based on common standards and protocols for data discovery and access, and a shared vision of distributed, web-based, open-source, interoperable data access and integration. Commonalities include the use of Open Geospatial Consortium (OGC) and ISO specifications and standardized data interchange mechanisms. For multidisciplinarity, mediation, adaptation, and profiling services have been successfully introduced to leverage the geosciences standards commonly used by the different geoscience communities, introducing a brokering approach which extends the basic SOA archetype. The principal challenges are less technical than cultural, social, and organizational: before we can make data interoperable, we must make people interoperable. These challenges are being met by increased coordination of development activities (technical, organizational, social) among leaders and practitioners in national and international efforts across the geosciences to foster commonalities across disparate networks. In doing so, we will 1) leverage and share resources and developments, 2) facilitate and enhance emerging technical and structural advances, 3) promote interoperability across scientific domains, 4) support the promulgation and institutionalization of agreed-upon standards, protocols, and practices, 5) enhance knowledge transfer not only across the community but into the domain sciences, 6) lower existing entry barriers for users and data producers, and 7) build on the existing disciplinary infrastructures, leveraging their service buses. All of these objectives are required for establishing a permanent and sustainable cyber(e)-infrastructure for the geosciences. The rationale for this approach is well articulated in the AuScope mission statement: "Many of these problems can only be solved on a national, if not global scale. No single researcher, research institution, discipline or jurisdiction can provide the solutions. We increasingly need to embrace e-Research techniques and use the internet not only to access nationally distributed datasets, instruments and compute infrastructure, but also to build online, 'virtual' communities of globally dispersed researchers." Multidisciplinary interoperability can be successfully pursued by adopting a "system of systems" or "network of networks" philosophy. This approach aims to: (a) supplement but not supplant system mandates and governance arrangements; (b) keep the existing capacities as autonomous as possible; (c) lower entry barriers; (d) build incrementally on existing infrastructures (information systems); and (e) incorporate heterogeneous resources by introducing distribution and mediation functionalities.
This approach has been adopted by the European INSPIRE (Infrastructure for Spatial Information in the European Community) initiative and by the international GEOSS (Global Earth Observation System of Systems) programme.
Enhancing Public Participation to Improve Natural Resources Science and its Use in Decision Making
NASA Astrophysics Data System (ADS)
Glynn, P. D.; Shapiro, C. D.; Liu, S. B.
2015-12-01
The need for broader understanding and involvement in science, coupled with social technology advances enabling crowdsourcing and citizen science, has created greater opportunities for public participation in the gathering, interpretation, and use of geospatial information. The U.S. Geological Survey (USGS) is developing guidance for USGS scientists, partners, and interested members of the public on when and how public participation can most effectively be used in the conduct of scientific activities. Public participation can provide important perspectives and knowledge that cannot be obtained through traditional scientific methods alone. Citizen engagement can also provide increased efficiencies to USGS science and additional benefits to society, including enhanced understanding, appreciation, and interest in geospatial information and its use in decision making. The USGS guidance addresses several fundamental issues by: (1) developing an operational definition of citizen or participatory science; (2) identifying the circumstances under which citizen science is appropriate for use and when its use is not recommended; (3) describing structured processes for the effective use of citizen science; and (4) defining the successful application of citizen science and identifying useful success metrics. The guidance is coordinated by the USGS Science and Decisions Center and developed by a multidisciplinary team of USGS scientists and managers. External perspectives will also be incorporated, as appropriate, to align with other efforts such as the White House Office of Science and Technology Policy (OSTP) Citizen Science and Crowdsourcing Toolkit for the Federal government. The guidance will include the development of an economic framework to assess the benefits and costs of geospatial information developed through participatory processes. This economic framework considers tradeoffs between the additional perspectives gained through enhanced participation and the costs of obtaining geospatial information from multiple sources.
Exploring Methodologies and Indicators for Cross-disciplinary Applications
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Pearlman, J.
2015-12-01
Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple the social sciences, including economics, psychology and sociology, with geospatial information. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making; it is a traditional basis for a use case and has been applied to geospatial information and other areas. A newer use case that applies indicators is meta-regression analysis, which is used to transfer socioeconomic benefit estimates across different geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables serve as indicators that provide a weighted average of the value of the nonmarket good or resource over a large region; the expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and analysing hazard policies. However, new methods are needed for integrating these disciplines into use cases, as an avenue for guiding the development of operational applications of geospatial information. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed, especially applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting this challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and the use of indicators that have applications across fields of interest.
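A minimal sketch of the kind of meta-regression benefit-transfer relationship alluded to above; the linear functional form and the symbols are illustrative assumptions, not taken from the presentation. Willingness-to-pay estimates from study sites s are regressed on site and study characteristics, and the fitted coefficients are then used to predict a value for an unstudied policy region p:

\begin{align}
  WTP_s &= \beta_0 + \sum_{k=1}^{K} \beta_k X_{sk} + \varepsilon_s, \qquad s = 1, \dots, S \\
  \widehat{WTP}_p &= \hat{\beta}_0 + \sum_{k=1}^{K} \hat{\beta}_k X_{pk}
\end{align}

Here the X_{sk} stand for the qualitative and quantitative indicator variables, and the second line is the transfer of the weighted-average value to the specific policy region.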
Development of the AuScope Australian Earth Observing System
NASA Astrophysics Data System (ADS)
Rawling, T.
2017-12-01
Advances in monitoring technology and significant investment in new national research initiatives will provide significant new opportunities for the delivery of novel geoscience data streams from across the Australian continent over the next decade. The AuScope Australian Earth Observing System (AEOS) is linking field and laboratory infrastructure across Australia to form a national sensor array focusing on the solid Earth. As such, AuScope is working with these programs to deploy observational infrastructure, including MT, passive seismic, and GNSS networks, across the entire Australian continent. Where possible, the observational grid will be co-located with strategic basement drilling in areas of shallow cover and tied to national reflection seismic and sampling transects. This integrated suite of distributed Earth observation and imaging sensors will provide unprecedented imaging fidelity of our crust, across all length and time scales, to fundamental and applied researchers in the earth, environmental and geospatial sciences. The AEOS will be the Earth science community's Square Kilometer Array (SKA) - a distributed telescope that looks INTO the Earth rather than away from it - a 10 million SKA. The AEOS is strongly aligned with other community strategic initiatives, including the UNCOVER research program, as well as other National Collaborative Research Infrastructure programs such as the Terrestrial Environmental Research Network (TERN) and the Integrated Marine Observing System (IMOS), providing an interdisciplinary collaboration platform across the earth and environmental sciences. There is also very close alignment between AuScope and similar international programs such as EPOS, the USArray and EarthCube - potential collaborative linkages we are currently in the process of pursuing more formally. The AuScope AEOS Infrastructure System is ultimately designed to enable the progressive construction, refinement and ongoing enrichment of a live, "FAIR", four-dimensional Earth model for the Australian continent and its immediate environs.
NASA Astrophysics Data System (ADS)
Vuorinen, Tommi; Korja, Annakaisa
2017-04-01
The FIN-EPOS consortium is a joint community of Finnish national research institutes tasked with operating and maintaining solid-Earth geophysical and geological observatories and laboratories in Finland. These national research infrastructures (NRIs) seek to join the EPOS research infrastructure (EPOS RI) and to further pursue Finland's participation as a founding member in EPOS ERIC (European Research Infrastructure Consortium). The current partners of FIN-EPOS are the University of Helsinki (UH), the University of Oulu (UO), the Finnish Geospatial Research Institute (FGI) of the National Land Survey (NLS), the Finnish Meteorological Institute (FMI), the Geological Survey of Finland (GTK), CSC - IT Center for Science, and MIKES Metrology at VTT Technical Research Centre of Finland Ltd. The consortium is hosted by the Institute of Seismology, UH (ISUH). The primary purpose of the consortium is to act as a coordinating body between the various NRIs and the EPOS RI. FIN-EPOS engages in the planning and development of the national EPOS RI and will provide support in the EPOS implementation phase (IP) for the partner NRIs. FIN-EPOS also promotes awareness of EPOS in Finland and is open to new partner NRIs that would benefit from participating in EPOS. The consortium additionally seeks to advance solid Earth science education, technologies and innovations in Finland and is actively engaged in Nordic co-operation and collaboration among solid Earth RIs. The main short-term objective of FIN-EPOS is to make the Finnish geoscientific data provided by the NRIs interoperable with the Thematic Core Services (TCS) in the EPOS IP. Consortium partners commit to applying and following the metadata and data format standards provided by EPOS. FIN-EPOS will also provide a national Finnish-language web portal where users are identified and their user rights to EPOS resources are defined.
Harmonizing Settlement, Infrastructure, and Population Data to Support Sustainable Development
NASA Astrophysics Data System (ADS)
Chen, R. S.; de Sherbinin, A. M.; Yetman, G.
2016-12-01
The geospatial data community has been developing global-scale georeferenced population, human settlements, and infrastructure data for more than two decades, pushing available technologies to process ever growing amounts of data and increase the resolution of the outputs. These population, settlement, and infrastructure data products have seen wide use in varied aspects of sustainable development, including agriculture, energy, water, health, land use, transportation, risk management, and climate impact assessment. However, in most cases, data development has been driven by the availability of specific data sources (e.g., census data, night-time lights, radar data, or moderate- to high-resolution imagery), rather than by an integrated view of how best to characterize human settlement patterns over time and space on multiple dimensions using diverse data sources. Such an integrated view would enhance our ability to observe, model, and predict where on the planet people live and work—in the past, present, and future—and under what conditions, i.e., in relationship not only to environmental systems, resources, extremes, and changes, but also to the human settlements and built infrastructure that mediate impacts on both people and the environment. We report here on a new international effort to improve understanding of the strengths and weaknesses of existing and planned georeferenced data products, and to create a collaborative community across the natural, social, health, engineering, and data sciences and the public and private sectors supporting data integration and coordination to meet sustainable development data needs. Opportunities exist to share data and expertise, coordinate activities, pool computing resources, reduce duplication, improve data quality and harmonization, and facilitate effective data use for sustainable development monitoring and decision making, especially with respect to the 17 Sustainable Development Goals adopted by the international community in September 2015.
PLANNING QUALITY IN GEOSPATIAL PROJECTS
This presentation will briefly review some legal drivers and present a structure for writing geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.
Automatic geospatial information Web service composition based on ontology interface matching
NASA Astrophysics Data System (ADS)
Xu, Xianbin; Wu, Qunyong; Wang, Qinmin
2008-10-01
With Web services technology, the functions of a WebGIS can be exposed as geospatial information services, helping to overcome the isolation of information in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, therefore plays a key role in fully exploiting geospatial information services. This paper proposes an automatic geospatial information Web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among the interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information Web service chain can be created from a number of candidate services. An implementation of the algorithm is also presented; its results show the feasibility of the approach and its promise for meeting the emerging demand for geospatial information Web service composition.
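A minimal sketch, assuming NLTK's WordNet corpus is installed, of the kind of interface matching the paper describes: scoring how well one service's output parameter matches another service's input parameter by semantic distance, and chaining the services when the score passes a threshold. The parameter names and the 0.5 threshold are illustrative assumptions, not values from the paper.

from nltk.corpus import wordnet as wn

def semantic_similarity(term_a: str, term_b: str) -> float:
    """Best path similarity between any WordNet senses of the two terms."""
    scores = [a.path_similarity(b)
              for a in wn.synsets(term_a)
              for b in wn.synsets(term_b)]
    scores = [s for s in scores if s is not None]
    return max(scores, default=0.0)

def can_chain(output_param: str, input_param: str, threshold: float = 0.5) -> bool:
    """Decide whether one service's output can feed another service's input."""
    return semantic_similarity(output_param, input_param) >= threshold

# e.g. chaining a geocoding service's "location" output into a map service's
# "coordinate" input
print(can_chain("location", "coordinate"))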
The Emerging Phenomenon of Knowledge Management.
ERIC Educational Resources Information Center
Broadbent, Marianne
1997-01-01
Clarifies the meaning of knowledge management and gives examples of organizations that overtly practice it. Outlines four steps in knowledge management: (1) making knowledge visible; (2) building knowledge intensity; (3) building knowledge infrastructure; and (4) developing a knowledge culture. Discusses managing people as assets, librarians as…
Experiences with Acquiring Highly Redundant Spatial Data to Support Driverless Vehicle Technologies
NASA Astrophysics Data System (ADS)
Koppanyi, Z.; Toth, C. K.
2018-05-01
As vehicle technology moves towards higher autonomy, the demand for highly accurate geospatial data is rapidly increasing, as accurate maps have huge potential for increasing safety. In particular, high-definition 3D maps, including road topography and infrastructure, as well as city models along transportation corridors, represent the necessary support for driverless vehicles. In this effort, a vehicle equipped with high-, medium- and low-resolution active and passive cameras acquired data in a typical traffic environment, represented here by the OSU campus, where GPS/GNSS data are available along with other navigation sensor data streams. The data streams can be used for two purposes. First, high-definition 3D maps can be created by integrating all the sensory data, and data analytics/big data methods can be tested for automatic object space reconstruction. Second, the data streams can support algorithmic research for driverless vehicle technologies, including object avoidance, navigation/positioning, detecting pedestrians and bicyclists, etc. Crucial cross-performance analyses of map database resolution and accuracy with respect to sensor performance metrics can be derived to achieve an economical solution for accurate driverless vehicle positioning. These, in turn, could provide essential information for optimizing the choice of geospatial map databases and sensor quality to support driverless vehicle technologies. The paper reviews the data acquisition and primary data processing challenges and performance results.
An Application Domain Extension to CityGML for immovable property taxation: A Turkish case study
NASA Astrophysics Data System (ADS)
Çağdaş, Volkan
2013-04-01
It is generally acknowledged that immovable property taxes are one of the main revenue sources for local government. The literature emphasizes that the administration of property taxes needs well-developed inventories or registers that provide complete and accurate records of the taxed properties and their legal-economic attributes. This requirement is generally fulfilled by Spatial Data Infrastructures (SDIs), in which the coordinated exchange and sharing of geospatial data is provided by separate registers/information systems such as cadastral systems and building and address registers. Recently, the Open Geospatial Consortium presented a core component of a 3D SDI in the form of an international domain standard for representing, storing and exchanging 3D city models. CityGML allows the semantic and 3D geometrical representation of physical objects but does not deal with the legal and administrative aspects of city objects, which are required for the process of property taxation. This paper outlines the development of an Application Domain Extension (ADE) for the immovable property taxation domain that expands the CityGML data model with the legal and administrative concepts defined in Turkish law. The study shows that this ADE could serve as a 3D national data model for municipal information systems and facilitate a more efficient taxation process, as well as providing data for urban planning, facility management and other municipal services.
Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing
NASA Astrophysics Data System (ADS)
Meng, X.
2012-07-01
The Field Ground Truthing Data Collector is one of the four key components of the NASA-funded ICCaRS project being developed in Southeast Michigan. The ICCaRS ground truthing toolkit provides comprehensive functions: 1) field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and the health conditions of ecosystems and environments in the vicinity of the flight field; 2) server synchronization functions, such as downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in the field for near-real-time validation; and 3) social network communication functions for direct technical assistance and pedagogical support, e.g., holding video-conference calls in the field with the supporting educators, scientists, and technologists, participating in webinars, or engaging in discussions with other learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructure, data models, coupling methods between distributed geospatial data processing and the field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained in the presentation. A pilot case study will also be demonstrated.
Dulin, Michael F; Tapp, Hazel; Smith, Heather A; de Hernandez, Brisa Urquieta; Coffman, Maren J; Ludden, Tom; Sorensen, Janni; Furuseth, Owen J
2012-09-11
Individual and community health are adversely impacted by disparities in health outcomes among disadvantaged and vulnerable populations. Understanding the underlying causes of variations in health outcomes is an essential step towards developing effective interventions to ameliorate inequalities and subsequently improve overall community health. Working at the neighborhood scale, this study examines multiple social determinants that can cause health disparities, including low neighborhood wealth, weak social networks, inadequate public infrastructure, the presence of hazardous materials in or near a neighborhood, and the lack of access to primary care services. The goal of this research is to develop innovative and replicable strategies to improve community health in disadvantaged communities such as newly arrived Hispanic immigrants. This project is taking place within a primary care practice-based research network (PBRN) using key principles of community-based participatory research (CBPR). Associations between social determinants and rates of hospitalizations, emergency department (ED) use, and ED use for primary care treatable or preventable conditions are being examined. Geospatial models are in development using both hospital and community level data to identify local areas where interventions to improve disparities would have the greatest impact. The developed associations between social determinants and health outcomes, as well as the geospatial models, will be validated using community surveys and qualitative methods. A rapidly growing and underserved Hispanic immigrant population will be the target of an intervention informed by the research process to impact utilization of primary care services and designed, deployed, and evaluated using the geospatial tools and qualitative research findings. The purpose of this intervention will be to reduce health disparities by improving access to, and utilization of, primary care and preventative services. The results of this study will demonstrate the importance of several novel approaches to ameliorating health disparities, including the use of CBPR, the effectiveness of community-based interventions to influence health outcomes by leveraging social networks, and the importance of primary care access in ameliorating health disparities.
A geospatial soil-based DSS to reconcile landscape management and land protection
NASA Astrophysics Data System (ADS)
Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio
2017-04-01
The implementation of UN Agenda 2030 may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other side, the high complexity embedded in the factual implementation of the SDGs and many other ambitious objectives (e.g. FAO goals) may cause new frustrations if these policy documents do not bring real progress. The scientific communities are asked to help disentangle this complexity and possibly identify a "way to go". This may help the large number of European directives (e.g. WFD, EIA), regulations and communications that aim to achieve a better environment but still face large difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution is motivated by a different perspective: the full implementation of the SDGs and of integrated land policies requires challenging some key overlooked issues, including full competence in (and capability to manage) landscape variability, its multi-functionality (e.g. agriculture/environment) and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site specific. The landscape and all the above issues have the soil as their pulsing heart. Accordingly, we aim to demonstrate the multiple benefits of using a smart geospatial decision support system (S-DSS) grounded in soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (southern Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including interaction with other land uses (such as landscape and urban planning), and thus simultaneously contributes to SDGs 2, 3, 13, 15, 15.3 and 16.7.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France). Programme on Educational Building.
This document summarizes themes developed and conclusions from the International Workshop on Educational Infrastructure. The opening topic was "Delivering Education and Training in the Knowledge Society." It was clear to participants that educational infrastructure must go hand-in-hand with reengineering processes to adjust to the needs…
Distributed Multi-interface Catalogue for Geospatial Data
NASA Astrophysics Data System (ADS)
Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.
2007-12-01
Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There exists a clear need to conceive and achieve solutions that implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and inventory systems. Prominent features include: 1) support for distributed queries over a hierarchical data model, including incremental queries (i.e. queries over collections, to be subsequently refined) and opaque/translucent chaining; 2) support for several client protocols through a compound front-end interface module, which allows a (growing) number of cataloguing standards, or profiles thereof, to be accommodated, including the OGC CSW interface, the ebRIM Application Profile (for Core ISO Metadata and other data models), and the ISO Application Profile. The presented catalog clearinghouse supports both the opaque and the translucent pattern for service chaining: the clearinghouse catalog may be configured either to completely hide the underlying federated services or to provide clients with service information. In both cases, the clearinghouse solution presents a higher-level interface (i.e. OGC CSW) which harmonizes multiple lower-level services (e.g. OGC CSW, WMS and WCS, THREDDS, etc.) and handles all control of and interaction with them. In the translucent case, the client has the option to directly access the lower-level services (e.g. to improve performance). In the GEOSS context, the solution has been tested both as a stand-alone user application and as a service framework. The first scenario allows a user to download a multi-platform client application and query a federation of cataloguing systems that can be customized at will.
The second scenario supports server-side deployment and can be flexibly adapted to several use cases, such as an intranet proxy, a catalog broker, etc.
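A minimal sketch of the query-distribution pattern described above: one keyword query is fanned out to several federated catalogue endpoints and the results are aggregated into a single response. The endpoint URLs, the plain keyword parameter and the JSON record structure are illustrative assumptions; a real clearinghouse would speak each catalogue's native protocol (OGC CSW GetRecords, THREDDS catalogue crawling, etc.) and perform richer mediation than simple de-duplication.

from concurrent.futures import ThreadPoolExecutor
import requests

FEDERATED_CATALOGUES = [                      # hypothetical endpoints
    "https://catalog-a.example.org/search",
    "https://catalog-b.example.org/search",
]

def query_catalogue(endpoint: str, keyword: str) -> list:
    try:
        resp = requests.get(endpoint, params={"q": keyword}, timeout=20)
        resp.raise_for_status()
        return resp.json().get("records", [])
    except requests.RequestException:
        return []  # a failing node must not break the whole federation

def clearinghouse_search(keyword: str) -> list:
    # distribute the query to all federated catalogues in parallel
    with ThreadPoolExecutor(max_workers=len(FEDERATED_CATALOGUES)) as pool:
        result_lists = pool.map(lambda url: query_catalogue(url, keyword),
                                FEDERATED_CATALOGUES)
    # "smart" aggregation reduced here to concatenation plus de-duplication by id
    merged, seen = [], set()
    for record in (r for results in result_lists for r in results):
        if record.get("id") not in seen:
            seen.add(record.get("id"))
            merged.append(record)
    return merged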
Exchanging the Context between OGC Geospatial Web clients and GIS applications using Atom
NASA Astrophysics Data System (ADS)
Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier
2013-04-01
Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom for redistributing references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds, together with direct data, and share this status with other users who need the same information and use client products from different vendors, in an interoperable way. The extensibility of the Atom format was essential to define a format that can be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, the document can be viewed in common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and download features. OWS Context uses GeoRSS so that the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT that transforms the Atom feed into an HTML5 document showing the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window and the extent of the view in world (geographic) coordinates to calculate the scale of the map. Then we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application, the MiraMon Map Browser, is able to write a context document and read it again to recover the context of a previous view, or to load a context generated by another application. The possibility of storing direct links to files in OWS Context is particularly interesting for desktop GIS solutions. This communication also presents the development made in the MiraMon desktop GIS solution to include OWS Context. The MiraMon software is able to deal with local files, web services and database connections. As in any other GIS solution, the MiraMon team designed its own file format (MiraMon Map, MMM) for storing and sharing the status of a GIS session. The new OWS Context format is now adopted as an interoperable substitute for the MMM. The extensibility of the format makes it possible to map concepts in the MMM to current OWS Context elements (such as titles, data links, extent, etc.) and to generate new elements able to include all the extra metadata not currently covered by OWS Context. These developments were done in the ninth edition of the OpenGIS Web Services Interoperability Experiment (OWS-9) and are demonstrated in this communication.
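A minimal sketch of the scale computation mentioned above: deriving an approximate map scale denominator from the client window size in pixels and the view extent in geographic coordinates. The 0.28 mm rendering pixel size is the common OGC convention, and the spherical-Earth metres-per-degree factor is a simplifying assumption; neither value is taken from the OWS Context work itself.

import math

METRES_PER_DEGREE = 111_320.0   # approximate length of one degree of longitude at the equator
PIXEL_SIZE_M = 0.00028          # common OGC rendering pixel size (0.28 mm)

def scale_denominator(width_px: int, min_lon: float, max_lon: float,
                      centre_lat: float) -> float:
    """Approximate 1:N scale denominator of the current view window."""
    extent_m = (max_lon - min_lon) * METRES_PER_DEGREE * math.cos(math.radians(centre_lat))
    ground_m_per_pixel = extent_m / width_px
    return ground_m_per_pixel / PIXEL_SIZE_M

# e.g. a 1024-pixel-wide window showing 10 degrees of longitude centred at 40N
print(round(scale_denominator(1024, 0.0, 10.0, 40.0)))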
75 FR 6056 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using the Web Services Business Process Execution Language (WS-BPEL) to develop them.
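A minimal sketch of the asynchrony pattern discussed above: the client submits a processing request, immediately receives a job reference, and polls for the result instead of blocking. The endpoint, the job document fields and the status values are illustrative assumptions and not taken from the OWS-6 workflow or from WS-BPEL, which orchestrates such interactions at a higher level.

import time
import requests

PROCESS_ENDPOINT = "https://example.org/wps/jobs"   # hypothetical endpoint

def submit_and_poll(payload: dict, poll_interval: float = 5.0) -> dict:
    # 1. initiate the request and resume immediately with a job reference
    job = requests.post(PROCESS_ENDPOINT, json=payload, timeout=30).json()
    status_url = job["statusUrl"]                    # assumed field name

    # 2. poll until the workflow reports completion or failure
    while True:
        status = requests.get(status_url, timeout=30).json()
        if status["state"] in ("succeeded", "failed"):   # assumed status values
            return status
        time.sleep(poll_interval)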
Making Temporal Search More Central in Spatial Data Infrastructures
NASA Astrophysics Data System (ADS)
Corti, P.; Lewis, B.
2017-10-01
A temporally enabled Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users, and tools intended to provide an efficient and flexible way to use spatial information which includes the historical dimension. One of the key software components of an SDI is the catalogue service which is needed to discover, query, and manage the metadata. A search engine is a software system capable of supporting fast and reliable search, which may use any means necessary to get users to the resources they need quickly and efficiently. These techniques may include features such as full text search, natural language processing, weighted results, temporal search based on enrichment, visualization of patterns in distributions of results in time and space using temporal and spatial faceting, and many others. In this paper we will focus on the temporal aspects of search which include temporal enrichment using a time miner - a software engine able to search for date components within a larger block of text, the storage of time ranges in the search engine, handling historical dates, and the use of temporal histograms in the user interface to display the temporal distribution of search results.
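A minimal sketch of the temporal-enrichment step described above: a toy "time miner" that pulls four-digit years out of free-text metadata and buckets the results into a decade histogram suitable for temporal faceting. A production time miner would handle varied date formats, explicit ranges and historical calendars far more carefully; this only illustrates the idea.

import re
from collections import Counter

YEAR_PATTERN = re.compile(r"\b(1[5-9]\d{2}|20\d{2})\b")   # years 1500-2099

def mine_years(text: str) -> list[int]:
    """Extract candidate years embedded within a larger block of text."""
    return [int(y) for y in YEAR_PATTERN.findall(text)]

def decade_histogram(metadata_records: list[str]) -> Counter:
    """Temporal facet: count of records per decade, for a histogram widget."""
    counts = Counter()
    for record in metadata_records:
        for year in mine_years(record):
            counts[year - year % 10] += 1
    return counts

records = ["Aerial survey flown in 1987, rectified 1991",
           "Coastal LiDAR campaign 2009-2012"]
print(decade_histogram(records))   # Counter({1980: 1, 1990: 1, 2000: 1, 2010: 1})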
NASA Astrophysics Data System (ADS)
Neff, K. L.; Farr, T.
2016-12-01
Aquifer subsidence due to groundwater abstraction poses a significant threat to aquifer sustainability and infrastructure. The need to prevent permanent compaction, in order to preserve aquifer storage capacity and protect infrastructure, calls for a better understanding of how compaction is related to groundwater abstraction and aquifer hydrogeology. The stress-strain relationship between hydraulic head changes and aquifer compaction has previously been observed to be hysteretic in both empirical and modeling studies. Here, subsidence data for central California's San Joaquin Valley, derived from interferometric synthetic aperture radar (InSAR) for the period 2007-2016, are examined relative to hydraulic head levels in monitoring and production wells collected by the California Department of Water Resources. Such a large and long-term data set is available for empirical analysis for the first time thanks to advances in InSAR data collection and geospatial data management. The California Department of Water Resources (DWR) funded this work to provide the background and an update on subsidence in the Central Valley to support future policy. Part of this work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with NASA.
Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach
NASA Astrophysics Data System (ADS)
Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.
2010-12-01
One of the main challenges in Earth science informatics is to build interoperability frameworks which allow users to discover, evaluate, and use information from different scientific domains. This requires addressing multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangements in order to develop flexible frameworks that allow a variety of loosely coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains, and to develop cross-domain ontologies to facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European Approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry and drought. The EuroGEOSS DAC federates both semantic services (e.g. SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e. what, where, when, etc.). Two different augmented discovery styles are supported: (a) automatic query expansion and (b) user-assisted query expansion. In the first case, the main discovery steps are: (i) the query keywords (the "what" constraint) are expanded with related concepts/terms retrieved from the set of federated semantic services, with a default expansion covering multilinguality; (ii) the resulting queries are submitted to the federated catalog services; (iii) the DAC performs a "smart" aggregation of the query results and returns them to the client. In the second case, the main discovery steps are: (i) the user browses the federated semantic repositories and selects the concepts/terms of interest; (ii) the DAC creates the set of geospatial queries based on the selected concepts/terms and submits them to the federated catalog services; (iii) the DAC performs a "smart" aggregation of the query results and returns them to the client. A graphical user interface (GUI) was also developed for testing and interacting with the DAC. The entire brokering framework is deployed in the context of the EuroGEOSS infrastructure and is used in two GEOSS AIP-3 use scenarios: the "e-Habitat Use Scenario" for the biodiversity and climate change topic, and the "Comprehensive Drought Index Use Scenario" for the water/drought topic.
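A minimal sketch of the "automatic query expansion" style outlined above: the "what" keyword is expanded with related concepts before the queries are sent to the federated catalogues and the results are aggregated. The tiny in-memory thesaurus stands in for the SKOS repositories federated by the DAC, and the catalogue search functions are toy stand-ins; both are purely illustrative assumptions.

RELATED_CONCEPTS = {     # hypothetical fragment of a SKOS-like thesaurus
    "drought": ["aridity", "water scarcity", "precipitation deficit"],
    "forest": ["woodland", "forestry", "tree cover"],
}

def expand_query(what: str) -> list[str]:
    """Step (i): expand the keyword with related concepts/terms."""
    return [what] + RELATED_CONCEPTS.get(what.lower(), [])

def augmented_discovery(what: str, catalogue_search_fns) -> list:
    """Steps (ii)-(iii): submit the expanded queries and aggregate the results."""
    results = []
    for term in expand_query(what):
        for search in catalogue_search_fns:
            results.extend(search(term))
    return results

# toy stand-in for one federated catalogue service
fake_catalogue = lambda term: [{"title": f"dataset about {term}", "source": "catalogue-A"}]
print(augmented_discovery("drought", [fake_catalogue]))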
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers with a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, the Earth and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC) Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). The data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. The data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
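As an illustration of the OGC services the portal exposes, the following is a minimal sketch of assembling a WMS 1.3.0 GetMap request in Python. The base URL and layer name are hypothetical placeholders; the request parameters themselves follow the WMS standard (with CRS:84 the BBOX axis order is longitude, latitude).

from urllib.parse import urlencode

WMS_BASE = "https://example.org/lmmp/wms"    # hypothetical endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "LRO_WAC_Mosaic",              # hypothetical layer name
    "STYLES": "",                            # default style
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

print(f"{WMS_BASE}?{urlencode(params)}")     # URL returning a rendered map image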
EPA GEOSPATIAL QUALITY COUNCIL
The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team, EPA/600/R-00/009) was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...
Geospatial Thinking of Information Professionals
ERIC Educational Resources Information Center
Bishop, Bradley Wade; Johnston, Melissa P.
2013-01-01
Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…
The University of Mississippi Geoinformatics Center (UMGC)
NASA Technical Reports Server (NTRS)
Easson, Gregory L.
2003-01-01
The overarching goal of the University of Mississippi Geoinformatics Center (UMGC) is to promote the application of geospatial information technologies through technology education, research support, and infrastructure development. During the initial two-year phase of operation, the UMGC successfully met those goals and is uniquely positioned to continue operation and to further expand into additional academic programs. At the end of the first funding cycle, the goals of the UMGC have been and are being met through research and educational activities in the original four participating programs: Biology, Computer and Information Science, Geology and Geological Engineering, and Sociology and Anthropology, with the School of Business joining the UMGC in early 2001. Each of these departments supports graduate students conducting research, has created combined teaching and research laboratories, and has supported faculty during the summer months.
Impact of highway construction on water bodies: a geospatial assessment.
Vijay, Ritesh; Kushwaha, Vikash K; Mardikar, Trupti; Labhasetwar, P K
2017-08-01
India has witnessed a massive infrastructure boom in the past few years. One such project is National Highway-7 (NH-7), a north-south highway connecting Kanyakumari, Tamil Nadu, to Varanasi, Uttar Pradesh, traversing many water bodies. The present study aims to assess the pre- and post-construction impact of the existing, new and widened NH-7 on the physical status of the water bodies, using remote sensing techniques. Satellite images spanning 22 years were procured and analysed for change detection in land use and land cover within the water bodies. The study indicates that construction activities have led to transformation within the water bodies, including reduction in area and interchange of land use and land cover classes, in turn leading to siltation and reduced recharge.
Impacts of climate change on public health in India: future research directions.
Bush, Kathleen F; Luber, George; Kotha, S Rani; Dhaliwal, R S; Kapil, Vikas; Pascual, Mercedes; Brown, Daniel G; Frumkin, Howard; Dhiman, R C; Hess, Jeremy; Wilson, Mark L; Balakrishnan, Kalpana; Eisenberg, Joseph; Kaur, Tanvir; Rood, Richard; Batterman, Stuart; Joseph, Aley; Gronlund, Carina J; Agrawal, Arun; Hu, Howard
2011-06-01
Climate change and associated increases in climate variability will likely further exacerbate global health disparities. More research is needed, particularly in developing countries, to accurately predict the anticipated impacts and inform effective interventions. Building on the information presented at the 2009 Joint Indo-U.S. Workshop on Climate Change and Health in Goa, India, we reviewed relevant literature and data, addressed gaps in knowledge, and identified priorities and strategies for future research in India. The scope of the problem in India is enormous, based on the potential for climate change and variability to exacerbate endemic malaria, dengue, yellow fever, cholera, and chikungunya, as well as chronic diseases, particularly among the millions of people who already experience poor sanitation, pollution, malnutrition, and a shortage of drinking water. Ongoing efforts to study these risks were discussed but remain scant. A universal theme of the recommendations developed was the importance of improving the surveillance, monitoring, and integration of meteorological, environmental, geospatial, and health data while working in parallel to implement adaptation strategies. It will be critical for India to invest in improvements in information infrastructure that are innovative and that promote interdisciplinary collaborations while embarking on adaptation strategies. This will require unprecedented levels of collaboration across diverse institutions in India and abroad. The data can be used in research on the likely impacts of climate change on health that reflect India's diverse climates and populations. Local human and technical capacities for risk communication and promoting adaptive behavior must also be enhanced.
EPA Geospatial Quality Council Strategic and Implementation Plan 2010 to 2015
The EPA Geospatial Quality Council (GQC) was created to promote and provide Quality Assurance guidance for the development, use, and products of geospatial science. The GQC was created when the gap between the EPA Quality Assurance (QA) and Geospatial communities was recognized. ...
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS
This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented including:
o Guidance for Geospatial Data Quality Assurance Project Plans.
o GPS - Tec...
NASA Astrophysics Data System (ADS)
McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.
2005-05-01
The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to now include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2005-12-01
The need for Earth science education to prepare students as a globally trained geoscience workforce has increased tremendously with the globalization of the economy. However, current academic programs often have difficulties in providing students with world-view training or experiences in a global context, owing to a lack of resources and suitable teaching technology. This paper presents a NASA-funded project with insights and solutions to this problem. The project aims to establish a geospatial data-rich learning and research environment that enables students, faculty and researchers from institutes all over the world to easily access, analyze and model the huge volume of NASA EOS data as if they possessed those vast resources locally at their desktops. With this environment, classroom demonstrations and training for students to address global climate and environmental issues for any part of the world are possible in any classroom with an Internet connection. Globalization and mobilization of Earth science education can thus be truly realized. The project, named NASA EOS Higher Education Alliance: Mobilization of NASA EOS Data and Information through Web Services and Knowledge Management Technologies for Higher Education Teaching and Research, is built on strong technology and infrastructure foundations, including web service technology, NASA EOS data resources, and open interoperability standards. An open, distributed, standards-compliant, interoperable web-based system, called GeoBrain, is being developed by this project to provide a data-rich on-line learning and research environment. The system allows users to dynamically and collaboratively develop interoperable, web-executable geospatial processing and analysis modules and models, and run them on-line against any part of the petabyte archives to get back customized information products rather than raw data. The system makes a data-rich, globally capable Earth science learning and research environment, backed by NASA EOS data and computing resources previously unavailable to students and professors, available at their desktops free of charge. In order to efficiently integrate this new environment into Earth science education and research, a NASA EOS Higher Education Alliance (NEHEA) has been formed. The core members of NEHEA consist of the GeoBrain development team, led by LAITS at George Mason University, and a group of Earth science educators selected through an open RFP process. NEHEA is an open and free alliance and welcomes Earth science educators around the world to join as associate members. NEHEA promotes international research and education collaborations in Earth science. NEHEA core members will provide technical support to NEHEA associate members for incorporating the data-rich learning environment into their teaching and research activities. The responsibilities of NEHEA education members include using the system in their research and teaching, providing feedback and requirements to the development team, exchanging information on the utilization of the system capabilities, participating in the system development, and developing new curricula and research around the environment provided by GeoBrain.
The Geospatial Web and Local Geographical Education
ERIC Educational Resources Information Center
Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.
2010-01-01
Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Cao, J.; Hu, K.; Gui, Z. P.; Wu, H. Y.; You, L.
2016-06-01
Efficiently discovering and applying geospatial information resources (GIRs) online is critical in the Earth science domain as well as for cross-disciplinary applications. However, achieving this is challenging due to the heterogeneity, complexity and privacy of online GIRs. In this article, GeoSquare, a collaborative online geospatial information sharing and geoprocessing platform, was developed to tackle this problem. Specifically, (1) GIR registration and multi-view query functions allow users to publish and discover GIRs more effectively. (2) Online geoprocessing and real-time execution status checking help users process data and conduct analysis without pre-installing cumbersome professional tools on their own machines. (3) A service chain orchestration function enables domain experts to contribute and share their domain knowledge with community members through workflow modeling. (4) User inventory management allows registered users to collect and manage their own GIRs, monitor their execution status, and track their own geoprocessing histories. In addition, to enhance the flexibility and capacity of GeoSquare, distributed storage and cloud computing technologies are employed. To support interactive teaching and training, GeoSquare adopts rich internet application (RIA) technology to create a user-friendly graphical user interface (GUI). Results show that GeoSquare can integrate and foster collaboration among dispersed GIRs, computing resources and people. As a result, educators and researchers can share and exchange resources in an efficient and harmonious way.
The National Mechatronic Platform. The basis of the educational programs in the knowledge society
NASA Astrophysics Data System (ADS)
Maties, V.
2016-08-01
The shift from the information society to the knowledge-based society, driven by the mechatronic revolution of the ninth decade of the last century, posed many challenges for education and research activities as well. The development of knowledge production calls for new educational technologies that stimulate initiative and creativity as a basis for increasing productivity in knowledge production. The paper presents details on the innovative potential of mechatronics as an educational environment for transdisciplinary learning and integral education. The basic infrastructure of that environment is based on mechatronic platforms. In order to develop knowledge production at the national level, specific structures must be established. The paper also details the structure of the National Mechatronic Platform as a true knowledge factory. The benefits of developing this specific infrastructure for knowledge production in the field of mechatronics are outlined as well.
NASA Astrophysics Data System (ADS)
Quesnel, K.; Ajami, N.; Urata, J.; Marx, A.
2017-12-01
Infrastructure modernization, information technology, and the internet of things are impacting urban water use. Advanced metering infrastructure (AMI), also known as smart meters, is one forthcoming technology that holds the potential to fundamentally shift the way customers use water and utilities manage their water resources. Broadly defined, AMI is a system and process used to measure, communicate, and analyze water use data at high resolution intervals at the customer or sub-customer level. There are many promising benefits of AMI systems, but there are also many challenges; consequently, AMI in the water sector is still in its infancy. In this study we provide insights into this emerging technology by taking advantage of the higher temporal and spatial resolution of water use data provided by these systems. We couple daily water use observations from AMI with monthly and bimonthly billing records to investigate water use trends, patterns, and drivers using a case study of the City of Redwood City, CA from 2007 through 2016. We look across sectors, with a particular focus on water use for urban irrigation. Almost half of Redwood City's irrigation accounts use recycled water, and we take this unique opportunity to investigate if the behavioral response for recycled water follows the water and energy efficiency paradox in which customers who have upgraded to more efficient devices end up using more of the commodity. We model potable and recycled water demand using geospatially explicit climate, demographic, and economic factors to gain insight into various water use drivers. Additionally, we use high resolution remote sensing data from the National Agricultural Imaging Program (NAIP) to observe how changes in greenness and impervious surface are related to water use. Using a series of statistical and unsupervised machine learning techniques, we find that water use has changed dramatically over the past decade corresponding to varying climatic regimes and drought cycles. Yet, these changes in demand are complex, and vary depending on sector, water type, and neighborhood norms.
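The kind of unsupervised analysis described above can be pictured with a minimal sketch; the file name, column names, resampling choices and number of clusters below are hypothetical and are not taken from the Redwood City study.

```python
# Minimal sketch: clustering daily AMI water-use profiles with k-means.
# Assumes a hypothetical CSV of per-account daily use; column names are illustrative only.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Load daily consumption records: one row per (account_id, date, gallons).
ami = pd.read_csv("ami_daily_use.csv", parse_dates=["date"])

# Pivot to one row per account, one column per day of the year (mean across years).
ami["doy"] = ami["date"].dt.dayofyear
profiles = ami.pivot_table(index="account_id", columns="doy",
                           values="gallons", aggfunc="mean").fillna(0.0)

# Standardise so accounts with large absolute use do not dominate the distance metric.
X = StandardScaler().fit_transform(profiles)

# Group accounts into a handful of seasonal-use patterns (k chosen arbitrarily here).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
profiles["cluster"] = kmeans.labels_
print(profiles["cluster"].value_counts())
```

Clusters produced this way could then be compared across sectors, water types (potable versus recycled) and drought periods, in the spirit of the analysis the abstract describes.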
NASA Astrophysics Data System (ADS)
Biswas, R.; Arya, K.; Deshpande, S. C.
2017-12-01
Sanitation is a daily water-human interaction, yet billions of people, mostly in developing countries, still lack access to improved public sanitation. This challenges the Millennium Development Goals across the globe. Economic growth with the provision of basic services alone cannot assure improvements in sanitation and health. Policymakers and researchers often focus on building infrastructural capacity without considering the empirical factors behind poor sanitation. What are these driving factors? Is there a nexus between sanitation and health? How is it spatially distributed? We conducted a geospatial assessment and exploratory regression to interpret spatial-distribution data and derive influential factors in the process. Mumbai is our test bed, where we accumulated and applied a total of 40 ward-wise attributes covering socio-demographic, spatial, service, disease and infrastructural data. The results indicate that higher population per toilet seat, numerous toilet issues, low toilet density and poor or moderate toilet condition may be the reasons behind the spread of diarrhoea. On the other hand, illiteracy, per capita waste generation, excreta overflow from toilets to open gutters (nallahs) and poor or moderate toilet condition may be the reasons for the spread of malaria. Strong associations were observed: the malaria model has an adjusted R² of 0.65 and the diarrhoea model 0.76. The identified variables are significant, with p-values <0.05 for the malaria model and <0.08 for the diarrhoea model. The spatial assessment identifies ward FS as the most vulnerable, followed by ward GN, considering poor public sanitation and excessive waste generation along with malaria and diarrhoea cases. This study and its methods may be useful for researchers, stakeholders and policymakers conducting further scientific studies in analogous cities, and for exploring policy amendments to mitigate poor sanitation practices that affect public health in contemporary societies.
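As a rough illustration of the exploratory-regression step described above (not the authors' actual model), a ward-level OLS fit might look like the sketch below; the file and attribute names are hypothetical stand-ins for the 40 ward-wise attributes.

```python
# Minimal sketch: ward-level exploratory regression of disease cases on sanitation attributes.
# Variable names are illustrative; the real study screened 40 ward-wise attributes.
import pandas as pd
import statsmodels.formula.api as smf

wards = pd.read_csv("mumbai_ward_attributes.csv")  # hypothetical input file

# Fit one candidate specification and inspect fit and significance,
# mirroring the adjusted R^2 / p-value criteria reported in the abstract.
model = smf.ols(
    "diarrhoea_cases ~ population_per_toilet_seat + toilet_density + poor_toilet_condition",
    data=wards,
).fit()

print("adjusted R^2:", round(model.rsquared_adj, 2))
print(model.pvalues.round(3))  # retain predictors below the chosen significance threshold
```

Exploratory regression typically repeats this fit over many candidate variable combinations and keeps only those meeting fit and significance thresholds.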
ERIC Educational Resources Information Center
Hogrebe, Mark C.; Tate, William F., IV
2012-01-01
In this chapter, "geospatial" refers to geographic space that includes location, distance, and the relative position of things on the earth's surface. Geospatial perspective calls for the addition of a geographic lens that focuses on place and space as important contextual variables. A geospatial view increases one's understanding of…
Geospatial Data Curation at the University of Idaho
ERIC Educational Resources Information Center
Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.
2012-01-01
The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…
2017-02-22
manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards
NASA Astrophysics Data System (ADS)
Connor, C. L.; Prakash, A.
2007-12-01
Alaska's secondary school teachers are increasingly required to provide Earth systems science (ESS) education that integrates student observations of local natural processes related to rapid climate change with geospatial datasets and satellite imagery using Geographic Information Systems (GIS) technology. Such skills are also valued in various employment sectors of the state, where job opportunities requiring Earth science and GIS training are increasing. The University of Alaska's EDGE (Experiential Discoveries in Geoscience Education) program has provided training and classroom resources for three cohorts of in-service Alaska science and math teachers in GIS and Earth systems science (2005-2007). Summer workshops include geologic field experiences, GIS instruction, computer equipment and technical support for groups of Alaska high school (HS) and middle school (MS) science teachers each June and their students in August. Since 2005, EDGE has increased Alaska science and math teachers' Earth science content knowledge and developed their GIS and computer skills. In addition, EDGE has guided teachers through a follow-up fall online course that provided more extensive ESS knowledge linked with classroom standards and course content directly transferable into their MS and HS science classrooms. EDGE teachers were mentored by university faculty and technical staff as they guided their own students through semester-scale, science-fair-style projects using student-collected geospatial data. EDGE program assessment indicates that all teachers have improved their ESS knowledge, GIS knowledge, and use of technology in their classrooms. More than 230 middle school students have learned GIS from EDGE teachers, and 50 EDGE secondary students have conducted original research related to landscape change and its impacts on their own communities. Longer-term EDGE goals include improving student performance on the newly implemented (spring 2008) 10th-grade, standards-based High School Qualifying Exam, recruiting first-generation college students, and increasing the number of Earth science majors in the University of Alaska system.
Knowledge Glyphs: Visualization Theory Development to Support C2 Practice
2006-03-01
interface's graphic structure (Calder and Linton, 2003). • 'Glyphs' as components of a typographical set (Microsoft Typography Standards). • 'DataGlyphs...MOOTW) factors MIL STD 2525's symbology set was designed for application in the context of geospatial representations - i.e., geographical maps. It is...the visual elements used to portray discrete entities. In a conventional windowing environment, such entities are likely to be graphically portrayed
Natural Assurance Scheme: A level playing field framework for Green-Grey infrastructure development.
Denjean, Benjamin; Altamirano, Mónica A; Graveline, Nina; Giordano, Raffaele; van der Keur, Peter; Moncoulon, David; Weinberg, Josh; Máñez Costa, María; Kozinc, Zdravko; Mulligan, Mark; Pengal, Polona; Matthews, John; van Cauwenbergh, Nora; López Gunn, Elena; Bresch, David N
2017-11-01
This paper proposes a conceptual framework to systematize the use of Nature-based Solutions (NBS) by integrating their resilience potential into a Natural Assurance Scheme (NAS), focusing on insurance value as the cornerstone for both awareness-raising and valuation. As such, one of its core goals is to align research and pilot projects with infrastructure development constraints and priorities. Under NAS, the integrated contribution of natural infrastructure to Disaster Risk Reduction is valued in the context of an identified growing need for climate-robust infrastructure. The potential NAS benefits and trade-offs are explored through the alternative lens of Disaster Resilience Enhancement (DRE). Such a system requires a joint effort of specific knowledge transfer from research groups and stakeholders to potential future NAS developers and investors. We therefore match the knowledge gaps with the operational stages of NAS development from a project designer's perspective. We start by highlighting the key role of the insurance industry in incentivizing and assessing disaster and slow-onset resilience enhancement strategies. In parallel, we position the public sector as a potential kick-starter of DRE initiatives through the existing initiatives and constraints of infrastructure procurement. From this perspective, the paper explores the required alignment of integrated water resources planning and public investment systems. Ultimately this will allow both planners and investors to design no-regret NBS and mixed grey-green infrastructure systems. As resources and constraints differ widely between infrastructure development contexts, the framework does not prescribe explicit methodological choices but presents the current limits of knowledge and know-how. In conclusion, the paper underlines the potential of NAS to ease the global water infrastructure gap by stressing the advantages of investment in the protection, enhancement and restoration of natural capital as an effective climate change adaptation investment. Copyright © 2017. Published by Elsevier Inc.
Integrating biodiversity distribution knowledge: toward a global map of life.
Jetz, Walter; McPherson, Jana M; Guralnick, Robert P
2012-03-01
Global knowledge about the spatial distribution of species is orders of magnitude coarser in resolution than other geographically-structured environmental datasets such as topography or land cover. Yet such knowledge is crucial in deciphering ecological and evolutionary processes and in managing global change. In this review, we propose a conceptual and cyber-infrastructure framework for refining species distributional knowledge that is novel in its ability to mobilize and integrate diverse types of data such that their collective strengths overcome individual weaknesses. The ultimate aim is a public, online, quality-vetted 'Map of Life' that for every species integrates and visualizes available distributional knowledge, while also facilitating user feedback and dynamic biodiversity analyses. First milestones toward such an infrastructure have now been implemented. Copyright © 2011 Elsevier Ltd. All rights reserved.
Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.
2013-12-01
Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying with SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bound or other criteria.
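A minimal sketch of the kind of transformation and query described above, using rdflib in Python; the namespace URI, property names and bounding-box values are hypothetical and only illustrate the RDF/SPARQL pattern, not the authors' actual vocabulary.

```python
# Minimal sketch: annotate a dataset with a bounding box in RDF and query it with SPARQL.
# The vocabulary below is a hypothetical stand-in for the proposed geo-spatial vocabulary.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

GEO = Namespace("http://example.org/geovocab#")  # hypothetical vocabulary
g = Graph()

ds = URIRef("http://example.org/data/cloud_product_001")
g.add((ds, RDF.type, GEO.Dataset))
g.add((ds, GEO.minLat, Literal(25.0, datatype=XSD.double)))
g.add((ds, GEO.maxLat, Literal(50.0, datatype=XSD.double)))
g.add((ds, GEO.minLon, Literal(-125.0, datatype=XSD.double)))
g.add((ds, GEO.maxLon, Literal(-65.0, datatype=XSD.double)))

# Discover datasets whose bounding box covers a point of interest.
query = """
PREFIX geo: <http://example.org/geovocab#>
SELECT ?ds WHERE {
  ?ds a geo:Dataset ;
      geo:minLat ?minLat ; geo:maxLat ?maxLat ;
      geo:minLon ?minLon ; geo:maxLon ?maxLon .
  FILTER (?minLat <= 36.0 && ?maxLat >= 36.0 &&
          ?minLon <= -97.0 && ?maxLon >= -97.0)
}
"""
for row in g.query(query):
    print(row.ds)
```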
Detmer, Don E
2003-01-01
Background Improving health in our nation requires strengthening four major domains of the health care system: personal health management, health care delivery, public health, and health-related research. Many avoidable shortcomings in the health sector that result in poor quality are due to inaccessible data, information, and knowledge. A national health information infrastructure (NHII) offers the connectivity and knowledge management essential to correct these shortcomings. Better health and a better health system are within our reach. Discussion A national health information infrastructure for the United States should address the needs of personal health management, health care delivery, public health, and research. It should also address relevant global dimensions (e.g., standards for sharing data and knowledge across national boundaries). The public and private sectors will need to collaborate to build a robust national health information infrastructure, essentially a 'paperless' health care system, for the United States. The federal government should assume leadership for assuring a national health information infrastructure as recommended by the National Committee on Vital and Health Statistics and the President's Information Technology Advisory Committee. Progress is needed in the areas of funding, incentives, standards, and continued refinement of a privacy (i.e., confidentiality and security) framework to facilitate personal identification for health purposes. Particular attention should be paid to NHII leadership and change management challenges. Summary A national health information infrastructure is a necessary step for improved health in the U.S. It will require a concerted, collaborative effort by both public and private sectors. If you cannot measure it, you cannot improve it. Lord Kelvin PMID:12525262
Geoinformatics: Transforming data to knowledge for geosciences
Sinha, A.K.; Malik, Z.; Rezgui, A.; Barnes, C.G.; Lin, K.; Heiken, G.; Thomas, W.A.; Gundersen, L.C.; Raskin, R.; Jackson, I.; Fox, P.; McGuinness, D.; Seber, D.; Zimmerman, H.
2010-01-01
An integrative view of Earth as a system, based on multidisciplinary data, has become one of the most compelling reasons for research and education in the geosciences. It is now necessary to establish a modern infrastructure that can support the transformation of data to knowledge. Such an information infrastructure for geosciences is contained within the emerging science of geoinformatics, which seeks to promote the utilization and integration of complex, multidisciplinary data in seeking solutions to geoscience-based societal challenges.
Prat, P; Aulinas, M; Turon, C; Comas, J; Poch, M
2009-01-01
Current management of sanitation infrastructures (sewer systems, wastewater treatment plants, receiving waters, bypasses, deposits, etc.) is not fulfilling the objectives of up-to-date legislation: to achieve a good ecological and chemical status of water bodies through integrated management. This has made it necessary to develop new methodologies that help decision makers improve management in order to achieve that status. Decision Support Systems (DSS) based on the Multi-Agent System (MAS) paradigm are promising tools for improving integrated management. When all the different agents involved interact, important new knowledge emerges. This knowledge can be used to build better DSS and improve wastewater infrastructure management, achieving the objectives set by legislation. The paper describes a methodology to acquire this knowledge through a Role Playing Game (RPG). It first introduces the wastewater problems, defines the RPG, and describes the relation between RPG and MAS. It then explains how the RPG was built, with two examples of game sessions and results. The paper finishes with a discussion of the uses of this methodology and future work.
Development of a flexible higher education curriculum framework for geographic information science
NASA Astrophysics Data System (ADS)
Veenendaal, B.
2014-04-01
A wide range of geographic information science (GIScience) educational programs currently exist, the oldest now over 25 years old. Offerings vary from those specifically focussed on geographic information science to those that utilise geographic information systems in various applications and disciplines. Over the past two decades, there have been a number of initiatives to design curricula for GIScience, including the NCGIA Core Curriculum, the GIS&T Body of Knowledge and the Geospatial Technology Competency Model. The rapid developments in geospatial technology, applications and organisations mean that curricula need to be constantly updated and developed to maintain currency and relevance. This paper reviews these curriculum initiatives and outlines a new and flexible GIScience higher education curriculum framework which complements and utilises existing curricula. The new framework was applied to the GIScience programs at Curtin University in Perth, Australia, which has surpassed 25 years of GIScience education. Some of the results of applying this framework are outlined and discussed.
VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi
2018-04-17
Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.
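As a schematic example of the exposure-modeling workflow geoAI supports (not a method from the commentary itself), the sketch below fits a machine-learning exposure model on hypothetical spatial predictors; all file, feature and pollutant names are invented.

```python
# Minimal sketch: a machine-learning exposure model on spatial predictors.
# Feature and file names are hypothetical; real geoAI workflows would add
# spatially aware validation and far richer (often remotely sensed) inputs.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

sites = pd.read_csv("monitoring_sites.csv")  # hypothetical monitor locations with predictors
features = ["road_density_1km", "ndvi_250m", "elevation", "population_density"]
X, y = sites[features], sites["no2_annual_mean"]

model = RandomForestRegressor(n_estimators=500, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("mean CV R^2:", round(scores.mean(), 2))

# Once trained on all monitors, the model can predict exposure on a prediction grid
# covering the study area (e.g., residential addresses of an epidemiologic cohort).
model.fit(X, y)
```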
A geospatial search engine for discovering multi-format geospatial data across the web
Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney
2014-01-01
The volume of publically available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease at which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with servers and websites where such data exist. The objective of this paper is to present a publically...
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach that uses AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with a conditional planning method, represent more precisely the situations of nonconformity with geodata quality that may occur during execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
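The role of such quality rules in a conditional plan can be pictured with the toy sketch below; the rule, services and threshold are invented for illustration and do not reproduce the authors' planner.

```python
# Toy sketch: a conditional branch in a geospatial service chain driven by a
# geodata-quality rule, in the spirit of conditional planning. All names are invented.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Step:
    name: str
    run: Callable[[dict], dict]


def meets_resolution_rule(metadata: dict) -> bool:
    """Quality rule: the input raster must be at least 30 m resolution."""
    return metadata.get("resolution_m", float("inf")) <= 30.0


def execute_chain(metadata: dict, main: List[Step], fallback: List[Step]) -> dict:
    """Choose the branch whose quality precondition holds, then run it."""
    branch = main if meets_resolution_rule(metadata) else fallback
    for step in branch:
        metadata = step.run(metadata)
    return metadata


# Hypothetical services: resample coarse data before classification, otherwise classify directly.
classify = Step("classify", lambda m: {**m, "classified": True})
resample = Step("resample", lambda m: {**m, "resolution_m": 30.0})

result = execute_chain({"resolution_m": 250.0},
                       main=[classify],
                       fallback=[resample, classify])
print(result)
```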
Economic assessment of the use value of geospatial information
Bernknopf, Richard L.; Shapiro, Carl D.
2015-01-01
Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
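In its simplest form, the VOI definition given above can be written as the expected-net-benefit difference below (a generic formulation; the notation is not taken from the paper).

```latex
% Generic formulation: a* denotes the chosen decision and NB the present-value
% net benefit; the expectation is over the uncertain state of the world.
\mathrm{VOI}
  \;=\;
  \mathbb{E}\!\left[\, NB\!\left(a^{*}_{\text{with information}}\right) \right]
  \;-\;
  \mathbb{E}\!\left[\, NB\!\left(a^{*}_{\text{without information}}\right) \right]
```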
Global Mapping Project - Applications and Development of Version 2 Dataset
NASA Astrophysics Data System (ADS)
Ubukawa, T.; Nakamura, T.; Otsuka, T.; Iimura, T.; Kishimoto, N.; Nakaminami, K.; Motojima, Y.; Suga, M.; Yatabe, Y.; Koarai, M.; Okatani, T.
2012-07-01
The Global Mapping Project aims to develop basic geospatial information covering the whole land area of the globe, named Global Map, through the cooperation of National Mapping Organizations (NMOs) around the world. The Global Map data can serve as a base for global geospatial infrastructure and are composed of eight layers: Boundaries, Drainage, Transportation, Population Centers, Elevation, Land Use, Land Cover and Vegetation. Global Map Version 1 was released in 2008, and Version 2 will be released in 2013, as the data are to be updated every five years. In 2009, the International Steering Committee for Global Mapping (ISCGM) adopted new specifications for Global Map Version 2, changing its format so that it is compatible with the international standards ISO 19136 and ISO 19115. With the support of the ISCGM secretariat, the participating countries are accelerating their data development toward completion of global coverage in 2013, while some countries have already released their Global Map Version 2 datasets since 2010. Global Map data are available from the Internet free of charge for non-commercial purposes and can be used to predict, assess, prepare for and cope with global issues when combined with other spatial data. There are many Global Map applications in various fields, and further utilization of the Global Map is expected. This paper summarises the activities toward the development of Global Map Version 2 as well as some examples of Global Map applications in various fields.
NASA Astrophysics Data System (ADS)
Morin, P. J.; Pundsack, J. W.; Carbotte, S. M.; Tweedie, C. E.; Grunow, A.; Lazzara, M. A.; Carpenter, P.; Sjunneskog, C. M.; Yarmey, L.; Bauer, R.; Adrian, B. M.; Pettit, J.
2014-12-01
The U.S. National Science Foundation Antarctic & Arctic Data Consortium (a2dc) is a collaboration of research centers and support organizations that provide polar scientists with data and tools to complete their research objectives. From searching historical weather observations to submitting geologic samples, polar researchers utilize the a2dc to search and contribute to the wealth of polar scientific and geospatial data. The goals of the Antarctic & Arctic Data Consortium are to increase visibility in the research community of the services provided by resource and support facilities. Closer integration of individual facilities into a "one stop shop" will make it easier for researchers to take advantage of services and products provided by consortium members. The a2dc provides a common web portal where investigators can go to access data and samples needed to build research projects, develop student projects, or to do virtual field reconnaissance without having to utilize expensive logistics to go into the field. Participation by the international community is crucial for the success of a2dc. There are 48 nations that are signatories of the Antarctic Treaty, and 8 sovereign nations in the Arctic. Many of these organizations have unique capabilities and data that would benefit US funded polar science and vice versa. We'll present an overview of the Antarctic & Arctic Data Consortium, current participating organizations, challenges & opportunities, and plans to better coordinate data through a geospatial strategy and infrastructure.
Ontology Based Quality Evaluation for Spatial Data
NASA Astrophysics Data System (ADS)
Yılmaz, C.; Cömert, Ç.
2015-08-01
Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that this will be replaced by semantic web services. The quality of the data provided is important for the decision-making process and the accuracy of transactions. Therefore, the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions. A methodology is required for data quality evaluation. There are studies on data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on its own classification of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components to devise and implement a rule-based approach with ontologies, using free and open-source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end users and processes. We have created an ontology conforming to the geospatial data and defined sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open-source software. To test data against the rules, sample GeoSPARQL queries associated with the specifications are created.
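The flavour of such a rule, expressed as a GeoSPARQL query, is sketched below; the endpoint URL, graph vocabulary and class names are hypothetical, and the query simply reports pairs of parcel geometries that overlap, a typical geometric-consistency violation (the pairwise scan is fine for an illustration but would need spatial indexing at scale).

```python
# Minimal sketch: testing a geometric-consistency rule with a GeoSPARQL query.
# The endpoint and data vocabulary are hypothetical; geo:/geof: are the standard GeoSPARQL namespaces.
from SPARQLWrapper import SPARQLWrapper, JSON

RULE_OVERLAPPING_PARCELS = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX ex:   <http://example.org/nsdi#>

SELECT ?a ?b WHERE {
  ?a a ex:Parcel ; geo:hasGeometry/geo:asWKT ?wktA .
  ?b a ex:Parcel ; geo:hasGeometry/geo:asWKT ?wktB .
  FILTER (?a != ?b)
  FILTER (geof:sfOverlaps(?wktA, ?wktB))   # parcels must not overlap
}
"""

sparql = SPARQLWrapper("http://example.org/nsdi/sparql")  # hypothetical GeoSPARQL endpoint
sparql.setQuery(RULE_OVERLAPPING_PARCELS)
sparql.setReturnFormat(JSON)
violations = sparql.query().convert()["results"]["bindings"]
print(f"{len(violations)} overlapping parcel pairs violate the rule")
```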
The NatCarb geoportal: Linking distributed data from the Carbon Sequestration Regional Partnerships
Carr, T.R.; Rich, P.M.; Bartley, J.D.
2007-01-01
The Department of Energy (DOE) Carbon Sequestration Regional Partnerships are generating the data for a "carbon atlas" of key geospatial data (carbon sources, potential sinks, etc.) required for rapid implementation of carbon sequestration on a broad scale. The NATional CARBon Sequestration Database and Geographic Information System (NatCarb) provides Web-based, nation-wide data access. Distributed computing solutions link partnerships and other publicly accessible repositories of geological, geophysical, natural resource, infrastructure, and environmental data. Data are maintained and enhanced locally, but assembled and accessed through a single geoportal. NatCarb, as a first attempt at a national carbon cyberinfrastructure (NCCI), assembles the data required to address technical and policy challenges of carbon capture and storage. We present a path forward to design and implement a comprehensive and successful NCCI. © 2007 The Haworth Press, Inc. All rights reserved.
Green pastures: Do US real estate prices respond to population health?
Nau, Claudia; Bishai, David
2018-01-01
We investigate whether communities with improving population health will subsequently experience rising real estate prices. Home price indices (HPIs) for 371 MSAs from 1990 to 2010 are regressed against life expectancy five years prior. HPIs come from the Federal Housing Finance Agency; life expectancy estimates come from the Institute for Health Metrics and Evaluation. Our analysis uses random and fixed effects models with a comprehensive set of controls. Life expectancy predicted increases in the HPI, controlling for potential confounders. We found that this effect varied spatially. Communities that invest their property tax revenue in public health infrastructure could benefit from a virtuous cycle of better health leading to higher property values. Communities that do not invest in health could enter vicious cycles, and this could widen geospatial health and wealth disparities. Copyright © 2017 Elsevier Ltd. All rights reserved.
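A stripped-down version of the fixed-effects specification described above could look like the following sketch; the file and column names are placeholders for the FHFA and IHME series, not the authors' data.

```python
# Minimal sketch: MSA fixed-effects regression of a home price index on lagged life expectancy.
# File and column names are placeholders; the real model includes a comprehensive set of controls.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("msa_panel.csv")  # one row per (msa, year)

# Lag life expectancy by five years within each MSA, as in the study design.
panel = panel.sort_values(["msa", "year"])
panel["life_exp_lag5"] = panel.groupby("msa")["life_expectancy"].shift(5)

# Two-way fixed effects absorb time-invariant MSA traits and common yearly shocks;
# additional controls would enter the formula alongside the lagged regressor.
fe_model = smf.ols("hpi ~ life_exp_lag5 + C(msa) + C(year)",
                   data=panel.dropna(subset=["life_exp_lag5"])).fit()
print(fe_model.params["life_exp_lag5"])
```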
EPA National Geospatial Data Policy
National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA
NASA Astrophysics Data System (ADS)
Asmi, A.; Sorvari, S.; Kutsch, W. L.; Laj, P.
2017-12-01
European long-term environmental research infrastructures (often referred to as ESFRI RIs) are the core facilities providing services for scientists in their quest to understand and predict the complex Earth system and its functioning, which requires long-term efforts to identify environmental changes (trends, thresholds and resilience, interactions and feedbacks). Many of the research infrastructures were originally developed to respond to the needs of their specific research communities; however, it is clear that strong collaboration among research infrastructures is needed to serve trans-boundary research, which requires exploring scientific questions at the intersection of different scientific fields, conducting joint research projects and developing concepts, devices and methods that can be used to integrate knowledge. European environmental research infrastructures have already worked together successfully for many years and have established a cluster - the ENVRI cluster - for their collaborative work. The ENVRI cluster acts as a collaborative platform where the RIs can jointly agree on common solutions for their operations, draft strategies and policies, and share best practices and knowledge. The supporting project for the ENVRI cluster, the ENVRIplus project, brings together 21 European research infrastructures and infrastructure networks to work on joint technical solutions, data interoperability, access management, training, strategies and dissemination efforts. The ENVRI cluster acts as a one-stop shop for multidisciplinary RI users, other collaborative initiatives, projects and programmes, and coordinates and implements jointly agreed RI strategies.
EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.
An Update on NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Goetz, S. J.; Miller, C. E.; Griffith, P. C.; Larson, E. K.; Kasischke, E. S.; Margolis, H. A.
2016-12-01
ABoVE is a NASA-led field campaign taking place in Alaska and western Canada over the next 8-10 years, with a wide range of interdisciplinary science objectives designed to address the extent to which ecosystems and society are vulnerable, or resilient, to environmental changes underway and expected. The first phase of ABoVE is underway, with a focus on ecosystem dynamics and ecosystem services objectives. Some 45 core and affiliated projects are currently included, and another 10-20 will be added in late 2016 with initiation of the airborne science component. The ABoVE leadership is fostering partnerships with several other major arctic and boreal research, management and policy initiatives. The Science Team is organized around science themes, with Working Groups (WGs) on vegetation, permafrost and hydrology, disturbance, carbon dynamics, wildlife and ecosystem services, and modeling. Despite the disciplinary science WGs, ABoVE research broadly focuses on the complex interdependencies and feedbacks across disciplines. Additional WGs focus on airborne science, geospatial products, core variables and standards, and stakeholder engagement - all supplemented by a range of infrastructure activities such as data management, cloud computing, laboratory and field support. Ultimately ABoVE research will improve our understanding of the consequences of environmental changes occurring across the study domain, as well as increase our confidence in making projections of ecosystem responses and vulnerability to changes taking place both within and outside the domain. ABoVE will also build a lasting legacy of research through an expanded knowledge base, the provision of key datasets archived for a broader network of researchers and resource managers, and the development of data products and knowledge designed to foster decision support and applied research partnerships with broad societal relevance. We will provide a brief status update of ABoVE activities and plans, including the upcoming airborne campaigns, science team meetings, and the potential for partnerships and engagement.
NASA Astrophysics Data System (ADS)
Czajkowski, M.; Shilliday, A.; LoFaso, N.; Dipon, A.; Van Brackle, D.
2016-09-01
In this paper, we describe and depict the Defense Advanced Research Projects Agency (DARPA)'s OrbitOutlook Data Archive (OODA) architecture. OODA is the infrastructure that DARPA's OrbitOutlook program has developed to integrate diverse data from various academic, commercial, government, and amateur space situational awareness (SSA) telescopes. At the heart of the OODA system is its world model - a distributed data store built to quickly query big data quantities of information spread out across multiple processing nodes and data centers. The world model applies a multi-index approach where each index is a distinct view on the data. This allows for analysts and analytics (algorithms) to access information through queries with a variety of terms that may be of interest to them. Our indices include: a structured global-graph view of knowledge, a keyword search of data content, an object-characteristic range search, and a geospatial-temporal orientation of spatially located data. In addition, the world model applies a federated approach by connecting to existing databases and integrating them into one single interface as a "one-stop shopping place" to access SSA information. In addition to the world model, OODA provides a processing platform for various analysts to explore and analytics to execute upon this data. Analytic algorithms can use OODA to take raw data and build information from it. They can store these products back into the world model, allowing analysts to gain situational awareness with this information. Analysts in turn would help decision makers use this knowledge to address a wide range of SSA problems. OODA is designed to make it easy for software developers who build graphical user interfaces (GUIs) and algorithms to quickly get started with working with this data. This is done through a multi-language software development kit that includes multiple application program interfaces (APIs) and a data model with SSA concepts and terms such as: space observation, observable, measurable, metadata, track, space object, catalog, expectation, and maneuver.
NASA Astrophysics Data System (ADS)
Oeldenberger, S.; Khaled, K. B.
2012-07-01
The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate the geospatial capacity development in North-Africa, providing the facilities for geospatial project and management training to regional government employees, university graduates, private individuals and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training, providing actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e. g. the ITC, and international organizations, such as the ISPRS, the ICA and the OGC. Through a close cooperation with African organizations, such as the AARSE, the RCMRD and RECTAS, the network and exchange of ideas, experiences, technology and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will be steering the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture and remote sensing training will be the main components of the practical training courses offered, to build geospatial capacity and ensure that AGSI graduates will have the appropriate skill-sets required for employment in the geospatial industry. Geospatial management courses and high-level seminars will be targeted at decision makers in government and industry to build awareness for geospatial applications and benefits. Online education will be developed together with international partners and internet-based activities will involve the public to familiarize them with geospatial data and its many applications.
Services Oriented Smart City Platform Based On 3d City Model Visualization
NASA Astrophysics Data System (ADS)
Prandi, F.; Soave, M.; Devigili, F.; Andreolli, M.; De Amicis, R.
2014-04-01
The rapid technological evolution that characterizes all the disciplines involved in the wide concept of smart cities is becoming a key factor in triggering true user-driven innovation. However, to extend the Smart City concept to a wide geographical target, an infrastructure is required that allows the integration of heterogeneous geographical information and sensor networks on a common technological ground. In this context 3D city models will play an increasingly important role in our daily lives and become an essential part of the modern city information infrastructure (Spatial Data Infrastructure). The work presented in this paper describes an innovative Service-Oriented Architecture software platform aimed at providing smart-city services on top of 3D urban models. 3D city models are the basis of many applications and can become the platform for integrating city information within the Smart Cities context. In particular the paper investigates how the efficient visualisation of 3D city models using different levels of detail (LODs) is one of the pivotal technological challenges in supporting Smart Cities applications. The goal is to provide the end user with realistic and abstract 3D representations of the urban environment and the ability to interact with the massive amount of semantic information contained in the geospatial 3D city model. The proposed solution, using OGC standards and a custom service to provide 3D city models, lets users consume the services and interact with the 3D model via the Web in a more effective way.
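One common way to realise LOD-based streaming (a generic pattern, not necessarily this platform's implementation) is to pick the level of detail from the camera-to-tile distance, as in the sketch below; the thresholds and LOD labels are illustrative only.

```python
# Minimal sketch: choosing a 3D city-model level of detail (LOD) from viewer distance.
# Thresholds and LOD labels are illustrative; real platforms tune them per dataset.
from math import dist

# (max_distance_in_metres, level_of_detail) pairs, from nearest to farthest.
LOD_THRESHOLDS = [
    (300.0,  "LOD3: detailed facades and roof structures"),
    (1500.0, "LOD2: generalised roof shapes"),
    (6000.0, "LOD1: extruded block models"),
]
FALLBACK = "LOD0: building footprints only"


def select_lod(camera_xyz, tile_center_xyz):
    """Return the LOD to request for a tile, given the camera position."""
    d = dist(camera_xyz, tile_center_xyz)
    for max_d, lod in LOD_THRESHOLDS:
        if d <= max_d:
            return lod
    return FALLBACK


print(select_lod((0.0, 0.0, 100.0), (900.0, 400.0, 0.0)))  # tile ~990 m away -> LOD2
```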
Geospatial Data Science Analysis | Geospatial Data Science | NREL
Geospatial Information is the Cornerstone of Effective Hazards Response
Newell, Mark
2008-01-01
Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.
A tool for exploring space-time patterns: an animation user research.
Ogao, Patrick J
2006-08-29
Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are seen in urban growth and census mapping - all critical aspects of a country's socio-economic development. In this paper, a user test study was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Three types of animation were used, namely passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a think-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting and providing explanations about observed geospatial phenomena. Also, exploring geospatial data structures using animation is best achieved with provocative interactive tools, as was seen with the inference-based animation. The visual methods employed using the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge discovery environment.
A National contribution to the GEO Science and Technology roadmap: GIIDA Project
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Mazzetti, Paolo; Guzzetti, Fausto; Oggioni, Alessandro; Pirrone, Nicola; Santolieri, Rosalia; Viola, Angelo; Tartari, Gianni; Santoro, Mattia
2010-05-01
The GIIDA (Gestione Integrata e Interoperativa dei Dati Ambientali) project is an initiative of the Italian National Research Council (CNR) launched in 2008 as an inter-departmental project, aiming to design and develop a multidisciplinary e-infrastructure (cyber-infrastructure) for the management, processing, and evaluation of Earth and Environmental resources, i.e. data, services, models, sensors, and best practices. GIIDA has been contributing to the implementation of the GEO (Group on Earth Observations) Science and Technology (S&T) roadmap by: (a) linking relevant S&T communities to GEOSS (the GEO System of Systems); (b) ensuring that GEOSS is built based on state-of-the-art science and technology. GIIDA co-ordinates the CNR's digital infrastructure development for Earth Observation resource sharing and cooperates with other national agencies and existing projects pursuing the same objective. For the CNR, GIIDA provides an interface to European and international interoperability programmes (e.g. INSPIRE and GMES). It builds a national network for dialogue and resolution of issues at varying scientific and technical levels. To achieve such goals, GIIDA introduced a set of guiding principles: • To shift from a "traditional" data-centric approach to a more advanced service-based solution for Earth System Science and Environmental information. • To shift the focus from Data to Information Spatial Infrastructures in order to support decision-making. • To be interoperable with analogous national (e.g. SINAnet and the INSPIRE National Infrastructure) and international initiatives (e.g. INSPIRE, GMES, SEIS, and GEOSS). • To reinforce the Italian presence in European and international programmes concerning digital infrastructures, geospatial information, and the Mega-Science approach. • To apply national and international Information Technology (IT) standards for achieving multi-disciplinary interoperability in the Earth and Space Sciences (e.g. ISO, OGC, CEN, CNIPA). In keeping with GEOSS, the GIIDA infrastructure adopts a System of Systems architectural approach in order to federate the existing systems managed by a set of recognized Thematic Areas (i.e. Risks, Biodiversity, Climate Change, Air Quality, Land and Water Quality, Ocean and Marine Resources, Joint Research and Public Administration Infrastructures). The GIIDA system of systems will contribute to developing multidisciplinary teams studying the global Earth systems in order to address the needs coming from the GEO Societal Benefit Areas (SBAs). GIIDA issued a Call for Pilots, receiving more than 20 high-level projects which are contributing to the GIIDA system development. A nationwide environmental research infrastructure must be interconnected with analogous digital infrastructures operated by other important stakeholders, such as public users and private companies. In fact, the long-term sustainability of a "System of Systems" requires synergies between all the involved stakeholders' domains: Users, Governance, Capacity provision, and Research. Therefore, in order to increase the effectiveness of the GIIDA contribution to a national environmental e-infrastructure, collaborations were activated with relevant actors from the other stakeholders' domains at the national level (e.g. ISPRA SINAnet).
ICT and Information Strategies for a Knowledge Economy: The Indian Experience
ERIC Educational Resources Information Center
Ghosh, Maitrayee; Ghosh, Ipsheet
2009-01-01
Purpose: The purpose of this paper is to describe the progress India has made in its move towards a knowledge-based economy with details of how the Indian Government has demonstrated its commitment to the development of fundamental pillars of knowledge sharing infrastructure, knowledge workers and a knowledge innovation system. Libraries are…
How Cities Think: Knowledge Co-Production for Urban Sustainability and Resilience
Tischa Muñoz-Erickson; Clark Miller; Thaddeus Miller
2017-01-01
Understanding and transforming how cities think is a crucial part of developing effective knowledge infrastructures for the Anthropocene. In this article, we review knowledge co-production as a popular approach in environmental and sustainability science communities to the generation of usable knowledge for sustainability and resilience. We present knowledge systems...
Metadata squared: enhancing its usability for volunteered geographic information and the GeoWeb
Poore, Barbara S.; Wolf, Eric B.; Sui, Daniel Z.; Elwood, Sarah; Goodchild, Michael F.
2013-01-01
The Internet has brought many changes to the way geographic information is created and shared. One aspect that has not changed is metadata. Static spatial data quality descriptions were standardized in the mid-1990s and cannot accommodate the current climate of data creation where nonexperts are using mobile phones and other location-based devices on a continuous basis to contribute data to Internet mapping platforms. The usability of standard geospatial metadata is being questioned by academics and neogeographers alike. This chapter analyzes current discussions of metadata to demonstrate how the media shift that is occurring has affected requirements for metadata. Two case studies of metadata use are presented—online sharing of environmental information through a regional spatial data infrastructure in the early 2000s, and new types of metadata that are being used today in OpenStreetMap, a map of the world created entirely by volunteers. Changes in metadata requirements are examined for usability, the ease with which metadata supports coproduction of data by communities of users, how metadata enhances findability, and how the relationship between metadata and data has changed. We argue that traditional metadata associated with spatial data infrastructures is inadequate and suggest several research avenues to make this type of metadata more interactive and effective in the GeoWeb.
Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...
The geospatial data quality REST API for primary biodiversity data
Otegui, Javier; Guralnick, Robert P.
2016-01-01
Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340
The geospatial data quality REST API for primary biodiversity data.
Otegui, Javier; Guralnick, Robert P
2016-06-01
We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under a GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online.
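As a rough illustration of how such a service might be called, the sketch below POSTs a single occurrence record to the API's base URL with the Python requests library. The base URL is taken from the abstract; the request shape and the Darwin Core-style field names are assumptions for illustration only, not the documented interface.

```python
# Minimal sketch of calling the Geospatial Data Quality API described above.
# The base URL comes from the abstract; the request shape and field names
# (Darwin Core-style terms) are assumptions for illustration only.
import json
import requests

API_URL = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

# A single occurrence record to check (hypothetical field names).
record = {
    "decimalLatitude": "42.85",
    "decimalLongitude": "-1.65",
    "countryCode": "ES",
    "scientificName": "Puma concolor",
}

# POST the record as JSON and inspect whatever quality flags come back.
response = requests.post(API_URL, json=record, timeout=30)
response.raise_for_status()
print(json.dumps(response.json(), indent=2))
```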
River predisposition to ice jams: a simplified geospatial model
NASA Astrophysics Data System (ADS)
De Munck, Stéphane; Gauthier, Yves; Bernier, Monique; Chokmani, Karem; Légaré, Serge
2017-07-01
Floods resulting from river ice jams pose a great risk to many riverside municipalities in Canada. The location of an ice jam is mainly influenced by channel morphology. The goal of this work was therefore to develop a simplified geospatial model to estimate the predisposition of a river channel to ice jams. Rather than predicting the timing of river ice breakup, the main question here was to predict where broken ice is susceptible to jamming based on the river's geomorphological characteristics. Thus, six parameters referred to in the literature as potential causes of ice jams were initially selected: presence of an island, narrowing of the channel, high sinuosity, presence of a bridge, confluence of rivers, and slope break. A GIS-based tool was used to generate the aforementioned factors over regularly spaced segments along the entire channel using available geospatial data. An ice jam predisposition index (IJPI) was calculated by combining the weighted optimal factors. Three Canadian rivers (province of Québec) were chosen as test sites. The resulting maps were assessed against historical observations and local knowledge. Results show that 77 % of the observed ice jam sites on record occurred in river sections that the model considered as having high or medium predisposition. This leaves 23 % of false-negative errors (missed occurrences). Between 7 and 11 % of the highly predisposed river sections did not have an ice jam on record (false-positive cases). Results, limitations, and potential improvements are discussed.
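The weighted-index idea described in this record can be illustrated with a minimal sketch; the factor weights and class thresholds below are invented placeholders, not the values calibrated by the authors.

```python
# Illustrative sketch of a weighted ice jam predisposition index (IJPI)
# computed per river segment, in the spirit of the model described above.
# The factor weights and thresholds below are placeholders, not the
# values used by the authors.
FACTOR_WEIGHTS = {
    "island": 0.20,
    "narrowing": 0.25,
    "high_sinuosity": 0.15,
    "bridge": 0.10,
    "confluence": 0.15,
    "slope_break": 0.15,
}

def ijpi(segment_factors: dict) -> float:
    """Weighted sum of binary factor presences (0 or 1) for one segment."""
    return sum(FACTOR_WEIGHTS[name] * float(present)
               for name, present in segment_factors.items())

def predisposition_class(index: float) -> str:
    """Map the index to low/medium/high classes (placeholder thresholds)."""
    if index >= 0.5:
        return "high"
    if index >= 0.25:
        return "medium"
    return "low"

segment = {"island": 1, "narrowing": 1, "high_sinuosity": 0,
           "bridge": 0, "confluence": 1, "slope_break": 0}
score = ijpi(segment)
print(score, predisposition_class(score))  # roughly 0.6 -> "high"
```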
NASA Astrophysics Data System (ADS)
Rousi, A. M.; Branch, B. D.; Kong, N.; Fosmire, M.
2013-12-01
In the Finnish National Spatial Strategy 2010-2015, Finland's Ministry of Agriculture and Forestry stated, among other things, that spatial data skills should support citizens' everyday activities and facilitate decision-making and the participation of citizens. Studies also predict that open data, particularly open spatial data, would, when their potential is fully realized, create a 15% increase in the turnover of Finnish private sector companies. Finnish libraries have a long tradition of serving at the heart of the Finnish information society. However, very few initiatives have been made to exploit the emerging possibilities of educating their users on open spatial data. The National Land Survey of Finland opened its data in 2012. Finnish technology university libraries, such as Aalto University Library, are open environments for all citizens and seem well suited to being among the first thriving entities in educating citizens on open geospatial data. There are, however, many obstacles to overcome, such as lack of knowledge about policies, lack of understanding of geospatial data services and insufficient know-how of GIS software among the personnel. This framework examines the benefits derived from an international collaboration between Purdue University Libraries and Aalto University Library to create local strategies for implementing open spatial data education initiatives in Aalto University Library's context. The results of this international collaboration are explicated for the benefit of the field as a whole.
Geomatics Education: Need Assessment
NASA Astrophysics Data System (ADS)
Vyas, A.
2014-11-01
The education system is divided into two classes: formal and informal. Formal education establishes the basis of theoretical and practical learning, whereas informal education is largely self-learning, learning from real-world projects. Generally, science and technology streams require formal methods of education; social and related aspects can be taught through other methods. Education is a medium through which the foundation of knowledge and skill is built. Statistics reveal an increasing trend in the literate population. This may be attributed to the level of urbanization and migration to cities in search of "white-collar jobs". As a result, a shift in the employment structure is observed from the primary sector to the secondary and tertiary sectors. Thomas Friedman, in his book `The World is Flat', describes the impact of globalization on the adoption of science and technology: the world has shrunk from large to tiny. One of the technologies to mention here is geospatial technology. With advancements in satellite remote sensing, geographical information systems and global positioning systems, database management has become an important subject area. Countries are allocating huge budgets to space technology, which includes education, training and research. Today many developing countries do not have base maps; they lack the systematic data and record keeping that are essential for governance, decision making and other development purposes. There is no trained manpower available, and no standard hardware and software have been identified. An imbalance is observed: while the government is promoting the use of geospatial technology, there is neither trained manpower nor the availability of experts to review the accuracy of the spatial data developed. There are very few universities which impart degree-level education, very few trained faculty members who provide standard education, and a lack of a standard syllabus. On the other hand, the industry requires highly skilled, highly experienced manpower. This is a low-equilibrium situation. Since the need is growing day by day and the shortage of skilled manpower is increasing, the need for geomatics education emerges. This paper researches the need assessment of education in the geospatial specialization. It emphasises the challenges and issues prevailing in geospatial education and in the specialized fields of remote sensing and GIS. The paper analyses the need assessment through all three actors: government, geospatial industry and education institutions.
Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from Ogc
NASA Astrophysics Data System (ADS)
Percivall, George; Simonis, Ingo
2016-06-01
The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for collaboration between geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC continues to work with government, research and industrial organizations to advance the state of geospatial standards for the full use of photogrammetry and remote sensing.
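To make the service families listed above concrete, here is a minimal sketch of a WCS 2.0.1 GetCoverage request built from standard key-value parameters; the endpoint URL and coverage identifier are hypothetical placeholders.

```python
# Sketch of a WCS 2.0.1 GetCoverage request built with standard KVP
# parameters; the service endpoint and coverage identifier are hypothetical.
import requests

WCS_ENDPOINT = "https://example.org/wcs"  # placeholder endpoint

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "landsat_scene_example",          # hypothetical coverage id
    "subset": ["Lat(45.0,46.0)", "Long(7.0,8.0)"],  # trim to an area of interest
    "format": "image/tiff",
}

response = requests.get(WCS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
with open("subset.tif", "wb") as f:
    f.write(response.content)  # coverage subset returned as GeoTIFF bytes
```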
Using Integrated Earth and Social Science Data for Disaster Risk Assessment
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; Yetman, G.
2016-12-01
Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominates. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied based on traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and how human and environmental systems respond to hazard events, as in the case of the Fukushima nuclear disaster that followed from the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support the use case of disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure. The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate the population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.
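A back-of-the-envelope version of the population-exposure query described above can be sketched as follows; the settlement CSV layout (lat, lon, population columns) and the search radius are assumptions for illustration, not the SEDAC tool's actual inputs.

```python
# Back-of-the-envelope sketch of the kind of exposure query described above:
# summing the population of settlement points within a radius of a hazard
# location. The input CSV layout (lat, lon, population columns) is assumed.
import csv
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def population_within(csv_path, hazard_lat, hazard_lon, radius_km):
    """Sum the population of all settlement points inside the radius."""
    total = 0.0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            d = haversine_km(hazard_lat, hazard_lon,
                             float(row["lat"]), float(row["lon"]))
            if d <= radius_km:
                total += float(row["population"])
    return total

# e.g. population within 50 km of an earthquake epicentre:
# print(population_within("settlements.csv", 38.3, 142.4, 50.0))
```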
TEODOOR, a blueprint for distributed terrestrial observation data infrastructures
NASA Astrophysics Data System (ADS)
Kunkel, Ralf; Sorg, Jürgen; Abbrent, Martin; Borg, Erik; Gasche, Rainer; Kolditz, Olaf; Neidl, Frank; Priesack, Eckart; Stender, Vivien
2017-04-01
TERENO (TERrestrial ENvironmental Observatories) is an initiative funded by the large research infrastructure program of the Helmholtz Association of Germany. Four observation platforms to facilitate the investigation of the consequences of global change for terrestrial ecosystems, and of their socioeconomic implications, were implemented and equipped from 2007 until 2013. Data collection, however, is planned to be performed for at least 30 years. TERENO provides series of system variables (e.g. precipitation, runoff, groundwater level, soil moisture, water vapor and trace gas fluxes) for the analysis and prognosis of global change consequences using integrated model systems, which will be used to derive efficient prevention, mitigation and adaptation strategies. Each platform is operated by a different Helmholtz institution, which maintains its local data infrastructure. Within the individual observatories, areas with intensive measurement programs have been implemented. Different sensors provide information on various physical parameters such as soil moisture, temperature, groundwater levels or gas fluxes. Sensor data from more than 900 stations are collected automatically at frequencies ranging from 20 s⁻¹ to 2 h⁻¹, summing up to about 2,500,000 data values per day. In addition, three weather radar devices create raster data at frequencies of 12 to 60 h⁻¹. The data are automatically imported into local relational database systems using a common data quality assessment framework, used to handle the processing and assessment of heterogeneous environmental observation data. Starting with the way data are imported into the data infrastructure, custom workflows are developed. Data levels implying the underlying data processing, stages of quality assessment and data accessibility are defined. In order to facilitate the acquisition, provision, integration, management and exchange of heterogeneous geospatial resources within a scientific and non-scientific environment, the distributed spatial data infrastructure TEODOOR (TEreno Online Data RepOsitORry) has been built up. The individual observatories are connected via OGC-compliant web services, while the TERENO Data Discovery Portal (DDP) enables data discovery, visualization and data access. Currently, free access to data from more than 900 monitoring stations is provided.
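As an illustration of the OGC-compliant access described above, the sketch below issues an SOS 2.0 GetObservation request with standard key-value parameters; the endpoint, offering and observed-property identifiers are placeholders, not actual TERENO identifiers.

```python
# Sketch of pulling observations from an OGC SOS 2.0 endpoint such as those
# federated by TEODOOR. The endpoint URL, offering and observed property
# identifiers are placeholders, not actual TERENO identifiers.
import requests

SOS_ENDPOINT = "https://example.org/sos/kvp"  # placeholder endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "soil_moisture_station_42",   # hypothetical offering
    "observedProperty": "SoilMoisture",       # hypothetical property
    "temporalFilter": "om:phenomenonTime,2017-01-01T00:00:00Z/2017-01-07T00:00:00Z",
}

response = requests.get(SOS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # beginning of the O&M/XML response
```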
NASA Astrophysics Data System (ADS)
Shew, A. M.; Ghosh, A.
2017-10-01
Remote sensing in the optical domain is widely used in agricultural monitoring; however, such initiatives pose a challenge for developing countries due to a lack of high-quality in situ information. Our proposed methodology could help developing countries bridge this gap by demonstrating the potential to quantify patterns of dry season rice production in Bangladesh. To analyze approximately 90,000 km² of cultivated land in Bangladesh at 30 m spatial resolution, we used two decades of remote sensing data from the Landsat archive and Google Earth Engine (GEE), a cloud-based geospatial data analysis platform built on Google infrastructure and capable of processing petabyte-scale remote sensing data. We reconstructed the seasonal patterns of vegetation indices (VIs) for each pixel using a harmonic time series (HTS) model, which minimizes the effects of missing observations and noise. Next, we combined the seasonality information of VIs with our knowledge of rice cultivation systems in Bangladesh to delineate rice areas in the dry season, which are predominantly hybrid and High Yielding Varieties (HYV). Based on historical Landsat imagery, the harmonic time series of vegetation indices (HTS-VIs) model estimated 4.605 million ha, 3.519 million ha, and 4.021 million ha of rice production for Bangladesh in 2005, 2010, and 2015, respectively. Fine spatial scale information on HYV rice over the last 20 years will greatly improve our understanding of double-cropped rice systems, the current status of production, and the potential for HYV rice adoption in Bangladesh during the dry season.
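A minimal sketch of a harmonic time series fit, in the spirit of the HTS model described above (one annual harmonic estimated by ordinary least squares), is shown below; it is a simplified illustration, not the authors' exact formulation.

```python
# Minimal sketch of a harmonic time series (HTS) fit to a vegetation index,
# in the spirit of the approach described above (one annual harmonic fitted
# by least squares); this is not the authors' exact formulation.
import numpy as np

def fit_harmonic(doy, vi):
    """Fit vi ~ a0 + a1*cos(2*pi*t) + a2*sin(2*pi*t), with t = day of year / 365."""
    t = np.asarray(doy, dtype=float) / 365.0
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t),
                         np.sin(2 * np.pi * t)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(vi, dtype=float), rcond=None)
    return coeffs  # [mean level, cosine term, sine term]

def reconstruct(coeffs, doy):
    """Evaluate the fitted seasonal curve on arbitrary days of year."""
    t = np.asarray(doy, dtype=float) / 365.0
    return coeffs[0] + coeffs[1] * np.cos(2 * np.pi * t) + coeffs[2] * np.sin(2 * np.pi * t)

# Sparse, noisy NDVI observations for one pixel (illustrative values).
doy = np.array([10, 40, 75, 120, 160, 200, 250, 300, 340])
ndvi = np.array([0.25, 0.35, 0.55, 0.70, 0.60, 0.45, 0.35, 0.28, 0.24])
coeffs = fit_harmonic(doy, ndvi)
smooth = reconstruct(coeffs, np.arange(1, 366))  # gap-free seasonal curve
print(coeffs, smooth.max())
```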
Impacts of Climate Change on Public Health in India: Future Research Directions
Bush, Kathleen F.; Luber, George; Kotha, S. Rani; Dhaliwal, R.S.; Kapil, Vikas; Pascual, Mercedes; Brown, Daniel G.; Frumkin, Howard; Dhiman, R.C.; Hess, Jeremy; Wilson, Mark L.; Balakrishnan, Kalpana; Eisenberg, Joseph; Kaur, Tanvir; Rood, Richard; Batterman, Stuart; Joseph, Aley; Gronlund, Carina J.; Agrawal, Arun; Hu, Howard
2011-01-01
Background Climate change and associated increases in climate variability will likely further exacerbate global health disparities. More research is needed, particularly in developing countries, to accurately predict the anticipated impacts and inform effective interventions. Objectives Building on the information presented at the 2009 Joint Indo–U.S. Workshop on Climate Change and Health in Goa, India, we reviewed relevant literature and data, addressed gaps in knowledge, and identified priorities and strategies for future research in India. Discussion The scope of the problem in India is enormous, based on the potential for climate change and variability to exacerbate endemic malaria, dengue, yellow fever, cholera, and chikungunya, as well as chronic diseases, particularly among the millions of people who already experience poor sanitation, pollution, malnutrition, and a shortage of drinking water. Ongoing efforts to study these risks were discussed but remain scant. A universal theme of the recommendations developed was the importance of improving the surveillance, monitoring, and integration of meteorological, environmental, geospatial, and health data while working in parallel to implement adaptation strategies. Conclusions It will be critical for India to invest in improvements in information infrastructure that are innovative and that promote interdisciplinary collaborations while embarking on adaptation strategies. This will require unprecedented levels of collaboration across diverse institutions in India and abroad. The data can be used in research on the likely impacts of climate change on health that reflect India’s diverse climates and populations. Local human and technical capacities for risk communication and promoting adaptive behavior must also be enhanced. PMID:21273162
Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data
NASA Astrophysics Data System (ADS)
Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii
2013-04-01
Over recent decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood Risk Directive). In order to enable operational flood monitoring and assessment of flood risk, it is required to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, integration of Grid and Sensor Web approaches is proposed [1]. Grid represents a distributed environment that integrates heterogeneous computing and storage resources administered by multiple organizations. Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, triggering of events by observed or predicted conditions, remote data access, and processing capabilities to generate and deliver data products. Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding the integration of Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from integration with a Grid platform such as the Globus Toolkit. The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecasts derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor, with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by Grid and cloud infrastructure it was possible to generate flood maps within 24-48 h after a trigger was raised. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.
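The event-triggering idea in this record can be sketched very simply: compare a forecast flood indicator for each basin against a threshold and emit a tasking request when it is exceeded. The threshold and record fields below are illustrative placeholders, not the rules used in the Namibia pilot.

```python
# Simplified sketch of the alert-triggering idea described above: compare a
# forecast flood indicator for each basin against a threshold and emit a
# tasking request when it is exceeded. Threshold and fields are placeholders.
from dataclasses import dataclass

@dataclass
class BasinForecast:
    basin_id: str
    flood_potential: float  # e.g. a CREST-style runoff index, arbitrary units

ALERT_THRESHOLD = 0.8  # placeholder threshold

def tasking_requests(forecasts):
    """Yield a tasking request for each basin whose forecast exceeds the threshold."""
    for f in forecasts:
        if f.flood_potential >= ALERT_THRESHOLD:
            yield {"basin": f.basin_id, "action": "request_EO_acquisition"}

forecasts = [BasinForecast("zambezi_upper", 0.65),
             BasinForecast("okavango_delta", 0.91)]
for req in tasking_requests(forecasts):
    print(req)
```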
Network Interdependency Modeling for Risk Assessment on Built Infrastructure Systems
2013-10-01
does begin to address infrastructure decay as a source of risk comes from the Department of Homeland Security (DHS). In 2009, the DHS Science and...network of connected edges and nodes. The National Research Council (2005) reported that the study of networks as a science and applications of...principles from this science are still in its early stages. As modern infrastructures have become more interlinked, knowledge of an infrastructure’s network
NASA Astrophysics Data System (ADS)
Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.
2015-04-01
To address the challenges of effective data handling faced by Small and Medium-Sized Enterprises (SMEs), a cloud-based infrastructure for accessing and processing Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To gain homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC) conformant catalogue, a consolidation module for format conversion and an OGC-conformant ordering framework. Metadata from various EO sources and with different standards are harvested, transferred to the OGC-conformant Earth Observation Product standard and inserted into the catalogue by a Metadata Harvester. The IDP can be accessed for search and ordering of the harvested datasets by the services implemented on the cloud infrastructure. Different land-surface services have been realised by the project partners using the implemented IDP and cloud infrastructure. The results of these are customer-ready products, as well as pre-products (e.g. atmospherically corrected EO data) that serve as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented. Searching and downloading of the SAR data can be performed in an automated way. With the implementation of the Sentinel-1 Toolbox and in-house software, processing of the datasets for further use, for example for Vista's snow monitoring, which delivers input to the flood forecast services, can also be performed automatically. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented. The tests comprised automated, synchronised processing of one entire Landsat 8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems. The results of these tests, showing a performance improvement by a factor of six, demonstrated the high flexibility and computing power of the cloud environment. To make full use of the cloud's capabilities, automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources into market-ready products can be realised, so that increasing customer needs and numbers can be satisfied without loss of accuracy or quality.
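To illustrate how an OGC-conformant catalogue such as the IDP might be queried, the sketch below sends a CSW 2.0.2 GetRecords request with standard key-value parameters; the catalogue URL is a placeholder, not the project's actual endpoint.

```python
# Sketch of a CSW 2.0.2 GetRecords query against an OGC-conformant catalogue
# such as the IDP described above; the catalogue URL is a placeholder.
import requests

CSW_ENDPOINT = "https://example.org/csw"  # placeholder catalogue endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "summary",
    "resultType": "results",
    "maxRecords": "10",
}

response = requests.get(CSW_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # beginning of the summary records (XML)
```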
NASA Astrophysics Data System (ADS)
Rogers, K. G.; Brondizio, E.; Roy, K.; Syvitski, J. P.
2016-12-01
Because of their low-lying elevations and large numbers of inhabitants and infrastructure, river deltas are ground zero for climate change impacts, particularly from sea-level rise and storm surges. The increased vulnerability of downstream delta communities to coastal flooding as a result of upstream engineering has been acknowledged for decades. What has received less attention is the sensitivity of deltas to the interactions of these processes and the increasing intensity of cultivation and irrigation in their coastal regions. Beyond basin-scale damming, regional infrastructure affects the movement of sediment and water on deltas, and combined with upstream modifications may exacerbate the risk of expanded tidal flooding, erosion of arable land, and salinization of soils and groundwater associated with sea level rise. To examine the social-biophysical feedbacks associated with regional-scale infrastructure, smallholder water management practices and coastal dynamics, a nested framework was applied to two districts of the coastal southwest region of Bangladesh. The two districts vary in tidal range, salinity, freshwater availability and socioeconomic structures, and are spatially varied in farmers' adaptations. Both districts contain numerous large embankment systems initially designed to protect cropland from tidal flooding, but that have been poorly maintained since their construction in the 1960s. The framework was co-produced using local-level stakeholder input collected during group interviews with rural farmers in 8 villages within the two districts, and explicitly accounts for engineered and natural biophysical variables as well as governance and institutional structures at 3 levels of analysis. Household survey results indicate that the presence or absence of embankments as a result of poor management and dynamic coastal processes is the primary control on freshwater availability and thus influences farming strategies, socioeconomic conditions and social positions in both districts. Local-scale interactions with the embankments are spatially heterogeneous, but geospatial analyses show the potential for these to collectively impact physical and social stability across a region already vulnerable to coastal flooding.
Vista-LA: Mapping methane-emitting infrastructure in the Los Angeles megacity
NASA Astrophysics Data System (ADS)
Carranza, Valerie; Rafiq, Talha; Frausto-Vicencio, Isis; Hopkins, Francesca M.; Verhulst, Kristal R.; Rao, Preeti; Duren, Riley M.; Miller, Charles E.
2018-03-01
Methane (CH4) is a potent greenhouse gas (GHG) and a critical target of climate mitigation efforts. However, actionable emission reduction efforts are complicated by large uncertainties in the methane budget on relevant scales. Here, we present Vista, a Geographic Information System (GIS)-based approach to map potential methane emissions sources in the South Coast Air Basin (SoCAB) that encompasses Los Angeles, an area with a dense, complex mixture of methane sources. The goal of this work is to provide a database that, together with atmospheric observations, improves methane emissions estimates in urban areas with complex infrastructure. We aggregated methane source location information into three sectors (energy, agriculture, and waste) following the frameworks used by the State of California GHG Inventory and the Intergovernmental Panel on Climate Change (IPCC) Guidelines for GHG Reporting. Geospatial modeling was applied to publicly available datasets to precisely geolocate facilities and infrastructure comprising major anthropogenic methane source sectors. The final database, Vista-Los Angeles (Vista-LA), is presented as maps of infrastructure known or expected to emit CH4. Vista-LA contains over 33 000 features concentrated on < 1 % of land area in the region. Currently, Vista-LA is used as a planning and analysis tool for atmospheric measurement surveys of methane sources, particularly for airborne remote sensing, and methane hotspot detection using regional observations. This study represents a first step towards developing an accurate, spatially resolved methane flux estimate for point sources in SoCAB, with the potential to address discrepancies between bottom-up and top-down methane emissions accounting in this region. The Vista-LA datasets and associated metadata are available from the Oak Ridge National Laboratory Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC; https://doi.org/10.3334/ORNLDAAC/1525).
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In the Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access to, and control of, simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify the scientific investments and cyberinfrastructure approaches required to improve the application of climate science research results to urban climate resilience.
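As a concrete, if simplified, illustration of the processing services exercised in the testbed, the sketch below invokes a hypothetical flood-model process through an OGC WPS 1.0.0 Execute request; the endpoint, process identifier and inputs are placeholders, not testbed artifacts.

```python
# Sketch of invoking a hypothetical flood-model process through an OGC
# WPS 1.0.0 Execute request, similar in spirit to the processing services
# exercised in the testbed; endpoint, process identifier and inputs are
# placeholders.
import requests

WPS_ENDPOINT = "https://example.org/wps"  # placeholder endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "flood_inundation_model",                 # hypothetical process
    "datainputs": "sea_level_rise_m=1.0;region=san_francisco_bay",
}

response = requests.get(WPS_ENDPOINT, params=params, timeout=120)
response.raise_for_status()
print(response.text[:500])  # ExecuteResponse XML with references to results
```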
Capacity Building on the Use of Earth Observation for Bridging the Gaps between Science and Policy
NASA Astrophysics Data System (ADS)
Thapa, R. B.; Bajracharya, B.
2017-12-01
Although geospatial technologies and Earth observation (EO) data are becoming more accessible, the lack of skilled human resources and institutional capacity is a major hurdle to their effective application in the Hindu Kush Himalayan (HKH) region. Designing efficient and cost-effective capacity building (CB) programs that fit the needs of different users of EO information for decision making will provide options for bridging the gaps in the region. This paper presents the strategies adopted by SERVIR-HKH as an attempt to strengthen the capacity of governments and development stakeholders in the region. The SERVIR-HKH hub plays a vital role in CB on EO applications by bringing together leading scientists from around the globe and the key national institutions and stakeholders in the region. We conducted country consultation workshops in Afghanistan, Bangladesh, Pakistan, and Nepal to identify national priorities, requirements and the capacity of institutions to utilize EO information in decision making. The needs assessments focused on the four thematic areas of SERVIR, and capacity gaps in the utilization of EO data in policy decisions were identified in thirteen key service areas. Geospatial capacities in GIT infrastructure, data, and human resources varied. Linking EO information to policy decisions is mostly lacking, and geospatial data sharing among institutions in the region is poor. We developed a capacity building strategy for the HKH region which bridges the gaps in a coordinated manner through customized training programs, institutional strengthening, coordination and regional cooperation. Using the strategy, we conducted training on FEWS NET remote sensing products for agro-climatological analysis, which focused on the technical interpretation and analysis of remote sensing and modeled products, e.g., CHIRPS, RFE2, CHIRTS, GFS, NDVI, GeoCLIM and GeoGLAM. Scientists from the USGS FEWS NET program delivered the training to mid-level managers and decision makers. We also carried out on-the-job training on wheat mapping using multi-sensor EO data for the co-development of methodologies and their implementation on a sustainable basis. In this presentation, we will also present the lessons learned from capacity building efforts at SERVIR-HKH and how we envision best practices for other SERVIR hubs.
ASIS '99 Knowledge: Creation, Organization and Use, Part III: Plenary Sessions.
ERIC Educational Resources Information Center
Proceedings of the ASIS Annual Meeting, 1999
1999-01-01
Describes the following sessions: "Knowledge Management: A Celebration of Humans Connected with Quality Information Objects" (Plenary Session 1); "Intellectual Property Rights and the Emerging Information Infrastructure" (Plenary Session 2); and "Knowledge: Creation, Organization and Use" (Conference Wrap-up Session). (AEF)
Assessing Embedded Geospatial Student Learning Outcomes
ERIC Educational Resources Information Center
Carr, John David
2012-01-01
Geospatial tools and technologies have become core competencies for natural resource professionals due to the monitoring, modeling, and mapping capabilities they provide. To prepare students with needed background, geospatial instructional activities were integrated across Forest Management; Natural Resources; Fisheries, Wildlife, &…
USDA-ARS?s Scientific Manuscript database
Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e.g., sampling of atmospheric constituents). The term geospatial data r...
Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)
NASA Astrophysics Data System (ADS)
Bishop, M. P.; Houser, C.; Lemmons, K.
2015-12-01
Traditional learning limits the potential for self-discovery and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or have yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South Texas Sand Sheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and its spatial predictions, and against the observed distribution of dune activity in 2010. Students perceived the analytical reasoning approach to be significantly better for understanding desertification than traditional lecture, and found that it promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
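A toy fuzzy cognitive map of the kind students might build for such an exercise can be sketched as follows; the concepts, edge weights and activation rule are illustrative and are not the course's expert model.

```python
# Toy fuzzy cognitive map (FCM) iteration of the kind students might build
# for the desertification exercise described above. Concepts, edge weights
# and the activation rule are illustrative, not the course's expert model.
import numpy as np

concepts = ["drought", "grazing_pressure", "vegetation_cover", "dune_activity"]

# W[i, j] = causal influence of concept i on concept j (values in [-1, 1]).
W = np.array([
    [0.0, 0.0, -0.7,  0.3],   # drought suppresses vegetation, aids dunes
    [0.0, 0.0, -0.5,  0.2],   # grazing pressure suppresses vegetation
    [0.0, 0.0,  0.0, -0.8],   # vegetation cover stabilises dunes
    [0.0, 0.0,  0.0,  0.0],   # dune activity is the outcome concept
])

def step(state, W):
    """One FCM update: new activation = sigmoid(previous state + weighted inputs)."""
    return 1.0 / (1.0 + np.exp(-(state + state @ W)))

state = np.array([0.9, 0.6, 0.5, 0.1])  # initial activations in [0, 1]
for _ in range(20):                      # iterate until roughly stable
    state = step(state, W)

print(dict(zip(concepts, np.round(state, 2))))
```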
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard and interoperable services: improve data discoverability, accessibility, and usability with metadata, catalogue and portal standards; achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications: Web Coverage Service (WCS) -- data; Web Map Service (WMS) -- pictures of data; Web Map Tile Service (WMTS) -- pictures of data tiles; Styled Layer Descriptors (SLD) -- rendered styles.
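The contrast drawn above (WMS returns pictures of data, whereas WCS returns the data itself) can be illustrated with a minimal WMS 1.3.0 GetMap request; the endpoint and layer name below are placeholders, not actual GES DISC identifiers.

```python
# Sketch contrasting the services listed above: WMS returns a picture of the
# data, whereas WCS would return the data itself. The endpoint and layer
# name are placeholders, not actual GES DISC identifiers.
import requests

WMS_ENDPOINT = "https://example.org/wms"  # placeholder endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "precipitation_monthly",   # hypothetical layer
    "STYLES": "",
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",           # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
with open("precipitation.png", "wb") as f:
    f.write(response.content)  # rendered map image
```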
Sieverling, Jennifer B.; Dietterle, Jeffrey
2014-01-01
The U.S. Geological Survey (USGS) is sponsoring the first The National Map Users Conference in conjunction with the eighth biennial Geographic Information Science (GIS) Workshop on May 10-13, 2011, in Lakewood, Colorado. The GIS Workshop will be held at the USGS National Training Center, located on the Denver Federal Center, Lakewood, Colorado, May 10-11. The National Map Users Conference will be held directly after the GIS Workshop at the Denver Marriott West, a convention hotel in the Lakewood, Colorado area, May 12-13. The National Map is designed to serve the Nation by providing geographic data and knowledge for government, industry, and public uses. The goal of The National Map Users Conference is to enhance communications and collaboration among the communities of users of and contributors to The National Map, including USGS, Department of the Interior, and other government GIS specialists and scientists, as well as the broader geospatial community. The USGS National Geospatial Program intends the conference to serve as a forum to engage users and more fully discover and meet their needs for the products and services of The National Map. The goal of the GIS Workshop is to promote advancement of GIS and related technologies and concepts as well as the sharing of GIS knowledge within the USGS GIS community. This collaborative opportunity for multi-disciplinary GIS and associated professionals will allow attendees to present and discuss a wide variety of geospatial-related topics. The Users Conference and Workshop collaboration will bring together scientists, managers, and data users who, through presentations, posters, seminars, workshops, and informal gatherings, will share accomplishments and progress on a variety of geospatial topics. During this joint event, attendees will have the opportunity to present or demonstrate their work; to develop their knowledge by attending hands-on workshops, seminars, and presentations given by professionals from USGS and other Federal Agencies, GIS related companies, and academia; and to network with other professionals to develop collaborative opportunities. Specific conference topics include scientific and modeling applications using The National Map, opportunities for partnerships, and advances in geospatial technologies. The first part of the week will be the GIS Workshop, offered as a pre-conference seminar. It will focus on hands-on GIS training and seminars concerning current topics of geospatial interest. The focus of the USGS GIS Workshop is to showcase specific techniques and concepts for using GIS in support of science. The presentations will be educational and not a marketing endeavor. To promote awareness of and interaction with selected USGS corporate and local science center data products, as well as promoting collaboration, a “GIS Olympics” event will be held Tuesday evening during the GIS Workshop. The second part of the week will feature interactive briefings and discussions on issues and opportunities of The National Map. The focus of the Users Conference will be on the role of The National Map in supporting science initiatives, emergency response, land and wildlife management, and other activities. All presentations at the Users Conference include use or innovations related to a The National Map data theme or application. On Wednesday evening, a poster session is being held as a combined event for all attendees and as a juncture between the events. On Thursday evening, the Henry Gannett Award will be presented. 
Additionally, poster awards will be presented. Several prominent speakers are featured at plenary sessions at The National Map Users Conference, including Deanna A. Archuleta, Deputy Assistant Secretary for Water and Science, Department of the Interior; Dr. Barbara P. Buttenfield, Professor of Geography at the University of Colorado in Boulder; best-selling author Frederick Reuss; and Dr. Joel Scheraga, Senior Advisor for Climate Adaptation, U.S. Environmental Protection Agency. Additionally, panel discussions have attracted participation from notable experts from government, academia, and the private sector. This Proceedings volume will serve as an activity reference for workshop attendees, as well as an archive of technical abstracts presented at the workshop. Author, co-author, and presenter names, affiliations, and contact information are listed with presentation titles with the abstracts. Some hands-on sessions are offered twice; in these instances, abstracts submitted for publication are presented in the proceedings on both days on which they are offered.
2014-09-01
Report documentation front matter: approved for public release, distribution unlimited; prepared for the Geospatial Research Laboratory, U.S. Army Engineer Research and Development Center (ERDC), U.S. Army Corps of Engineers, under Data Level Enterprise Tools; monitored by the ERDC Geospatial Research Laboratory, 7701 Telegraph Road, Alexandria, VA 22135.
78 FR 69393 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-19
.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...
77 FR 5820 - National Geospatial Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-06
... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY... that the Secretary of the Interior has renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations to the Federal Geographic Data Committee (FGDC), through...
THE NEVADA GEOSPATIAL DATA BROWSER
The Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) has developed the Nevada Geospatial Data Browser, a spatial data archive to centralize and distribute the geospatial data used to create the land cover, vertebrate habitat models, and land o...
Information Fusion for Feature Extraction and the Development of Geospatial Information
2004-07-01
of automated processing . 2. Requirements for Geospatial Information Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of
Geospatial data analysis using parallel processing; high-performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products. Research interests: rapid, web-based renewable resource analysis.
Geospatial Information Best Practices
2012-01-01
Spring 2012. By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint. The fact that geospatial information can be codified and... Operation Iraqi Freedom V (2007-2008) and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial...
NASA Astrophysics Data System (ADS)
Deng, M.; di, L.
2006-12-01
Higher education in the geosciences has the imminent goal of preparing students with modern geoscience knowledge and skills to meet the increased demand for trained professionals to work on the big challenges faced by geoscience disciplines, such as global environmental change, world energy supplies, and sustainable development. In order to reach this goal, geoscience education in post-secondary institutes worldwide has to attract and retain enough students and to train students with the knowledge and skills needed by society. Classroom innovations that encourage and support student investigations and research activities are key motivation mechanisms that help to reach the goal. This presentation describes the use of GeoBrain, an innovative geospatial knowledge system, as a powerful educational tool for motivating and facilitating innovative undergraduate and graduate teaching and research in the geosciences. Developed in a NASA-funded project, the GeoBrain system has adopted and implemented the latest Web services and knowledge management technologies to provide innovative methods for publishing, accessing, visualizing, and analyzing geospatial data and for building and sharing geoscience knowledge. It provides a data-rich online learning and research environment enabled by the wealth of data and information available in the NASA Earth Observing System (EOS) Data and Information System (EOSDIS). Students, faculty members, and researchers from institutes worldwide can easily access, analyze, and model with the huge amount of NASA EOS data just as if they possessed such vast resources locally at their desktops. The online environment provided by GeoBrain has brought significant positive changes to geoscience education in higher-education institutes because of its new concepts and technologies, motivation mechanisms, free exploration resources, and advanced geo-processing capabilities. With the system, teaching tasks that used to be very challenging or even impossible have become much easier or at least practical. For instance, dynamic classroom demonstration and training for students in dealing with data-intensive global climate and environmental change issues in real-world applications has become a pleasant experience rather than the struggle it was in the past. With GeoBrain, each student can be trained to handle multiple terabytes of EOS and other geospatial data in simulation and modeling for solving global-scale problems catering to their own interests, with a simple Internet-connected computer. Preliminary classroom use of GeoBrain in multiple universities has demonstrated that the system is very useful for facilitating the transition of both undergraduate and graduate students from learners to investigators. It has also shown that the system can improve teaching effectiveness, refine students' learning habits, and inspire students' interest in pursuing geoscience as their career. As an ongoing project, GeoBrain has not yet reached maturity; it will continue to improve its functionality and make advances in the above areas.
NASA Astrophysics Data System (ADS)
Santoro, M.; Dubois, G.; Schulz, M.; Skøien, J. O.; Nativi, S.; Peedell, S.; Boldrini, E.
2012-04-01
The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBAs) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond the simple sharing of data, as it encourages the connectivity of models (the GEOSS Model Web), an approach easing the handling of often complex multi-disciplinary questions such as understanding the impact of environmental and climatological factors on ecosystems and habitats. In the context of GEOSS Architecture Implementation Pilot - Phase 3 (AIP-3), the EC-funded EuroGEOSS and GENESIS projects have developed and successfully demonstrated the "eHabitat" use scenario dealing with the Climate Change and Biodiversity domains. Based on the EuroGEOSS multidisciplinary brokering infrastructure and on the DOPA (Digital Observatory for Protected Areas, see http://dopa.jrc.ec.europa.eu/), this scenario demonstrated how a GEOSS-based interoperability infrastructure can aid decision makers in assessing and possibly forecasting the irreplaceability of a given protected area, an essential indicator for assessing the criticality of the threats this protected area is exposed to. The "eHabitat" use scenario was advanced in the GEOSS Sprint to Plenary activity; the advanced scenario will include the "EuroGEOSS Data Access Broker" and a new version of the eHabitat model in order to support the use of uncertain data. The multidisciplinary interoperability infrastructure which is used to demonstrate the "eHabitat" use scenario is composed of the following main components: a) A Discovery Broker: this component is able to discover resources from a plethora of different and heterogeneous geospatial services, presenting them through a single, standard discovery service; b) A Discovery Augmentation Component (DAC): this component builds on existing discovery and semantic services in order to provide the infrastructure with semantics-enabled queries; c) A Data Access Broker: this component provides seamless access to heterogeneous remote resources via a single, standard service; d) Environmental Modeling Components (i.e. OGC WPS): these implement algorithms to predict the evolution of protected areas. This presentation introduces the advanced infrastructure developed to enhance the "eHabitat" use scenario. The presented infrastructure will be accessible through the GEO Portal and was used for demonstrating the "eHabitat" model at the last GEO Plenary Meeting (Istanbul, November 2011).
Geospatial Data for Computerisation of Public Administration in the Czech Republic
NASA Astrophysics Data System (ADS)
Cada, V.; Mildorf, T.
2011-08-01
The main aim of the eGovernment programme in the Czech Republic is to enhance the efficiency of public administration. The Digital Map of Public Administration (DMVS) should be composed of digital orthophotographs of the Czech Republic, digital and digitised cadastral maps, the digital purpose cadastral map (ÚKM) and a technical map of the municipality, where available. The DMVS project is part of the computerisation of public administration in the Czech Republic. The project enhances the productivity of government administration and also simplifies the processes between citizens and public administration. The DMVS project, which should be compliant with the INSPIRE (Infrastructure for Spatial Information in the European Community) initiative, generates a definite demand for geodata at the level of detail of a land data model. Clearly specified and required user needs are not met, however, due to inconsistencies in terminology, data management and level of detail.
The Updating of Geospatial Base Data
NASA Astrophysics Data System (ADS)
Alrajhi, Muhamad N.; Konecny, Gottfried
2018-04-01
Topographic mapping issues concern area coverage at different scales and map age. The age of a map is determined by the system of updating. The United Nations (UN-GGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit the use of various survey technologies to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.
Using GIS to Understand and Prioritise Worker Movements during the 2012 London Olympics
NASA Astrophysics Data System (ADS)
McGuinness, I. M.
2013-05-01
The performance of the transport network and the associated movement of people was one of the most critical elements to London's successful delivery of the 2012 Olympic Games. During the planning stages Transport for London asked the London Borough of Newham to mitigate the impact of the authority's 13 500 employees on transport infrastructure close to the Olympic Park. To achieve this, the authority needed to understand the geographic distribution of its workforce and the demand it placed on roads and local transport hubs. The authority's Geospatial Team led the research based on four cross-referenced data sources, and spatial analysis was used to determine priorities for special absence arrangements and a commissioned coach service. The research was used to support a targeted information campaign but also presented considerations on large-scale data collection, the use of Human Resources data, and the degree to which the movement of people can be measured and managed.
The QuakeSim Project: Web Services for Managing Geophysical Data and Applications
NASA Astrophysics Data System (ADS)
Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet
2008-04-01
We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.
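As a rough illustration of how a client might pull archival data from an OGC-standard service of the kind described above (not QuakeSim's actual endpoints), the sketch below builds a WFS GetFeature request from standard key-value-pair parameters; the service URL and the feature type name are hypothetical.

```python
# Sketch: building an OGC WFS GetFeature request with standard KVP parameters.
# The service URL and feature type are hypothetical, not QuakeSim's services.
from urllib.parse import urlencode
from urllib.request import urlopen

base_url = "https://example.org/wfs"  # placeholder endpoint
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "quake:gps_stations",   # hypothetical feature type
    "bbox": "-125.0,32.0,-114.0,42.0",  # lon/lat bounding box
    "outputFormat": "GML2",
}
url = f"{base_url}?{urlencode(params)}"
# response = urlopen(url).read()  # uncomment against a live endpoint
print(url)
```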
Flood trends and river engineering on the Mississippi River system
Pinter, N.; Jemberie, A.A.; Remo, J.W.F.; Heine, R.A.; Ickes, B.S.
2008-01-01
Along >4000 km of the Mississippi River system, we document that climate, land-use change, and river engineering have contributed to statistically significant increases in flooding over the past 100-150 years. Trends were tested using a database of >8 million hydrological measurements. A geospatial database of historical engineering construction was used to quantify the response of flood levels to each unit of engineering infrastructure. Significant climate- and/or land use-driven increases in flow were detected, but the largest and most pervasive contributors to increased flooding on the Mississippi River system were wing dikes and related navigational structures, followed by progressive levee construction. In the area of the 2008 Upper Mississippi flood, for example, about 2 m of the flood crest is linked to navigational and flood-control engineering. Systemwide, large increases in flood levels were documented at locations and at times of wing-dike and levee construction. Copyright 2008 by the American Geophysical Union.
Proposal for a Web Encoding Service (WES) for Spatial Data Transactions
NASA Astrophysics Data System (ADS)
Siew, C. B.; Peters, S.; Rahman, A. A.
2015-10-01
Web service utilization in Spatial Data Infrastructures (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of using the City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available web services in the OGC Web Services (OWS) suite, and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
Szabo, Jeff; Minamyer, Scott
2014-11-01
This report summarizes the current state of knowledge on the persistence of chemical contamination on drinking water infrastructure (such as pipes) along with information on decontamination should persistence occur. Decontamination options for drinking water infrastructure have been explored for some chemical contaminants, but important data gaps remain. In general, data on chemical persistence on drinking water infrastructure is available for inorganics such as arsenic and mercury, as well as select organics such as petroleum products, pesticides and rodenticides. Data specific to chemical warfare agents and pharmaceuticals was not found and data on toxins is scant. Future research suggestions focus on expanding the available chemical persistence data to other common drinking water infrastructure materials. Decontaminating agents that successfully removed persistent contamination from one infrastructure material should be used in further studies. Methods for sampling or extracting chemical agents from water infrastructure surfaces are needed. Published by Elsevier Ltd.
Research Practices, Evaluation and Infrastructure in the Digital Environment
ERIC Educational Resources Information Center
Houghton, John W.
2004-01-01
This paper examines changing research practices in the digital environment and draws out implications for research evaluation and the development of research infrastructure. Reviews of the literature, quantitative indicators of research activities and our own field research in Australia suggest that a new mode of knowledge production is emerging,…
Changing Research Practices and Research Infrastructure Development
ERIC Educational Resources Information Center
Houghton, John W.
2005-01-01
This paper examines changing research practices in the digital environment and draws out implications for the development of research infrastructure. Reviews of the literature, quantitative indicators of research activities and our own field research in Australia suggest that there is a new mode of knowledge production emerging, changing research…
US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOSPATIAL SOLUTIONS
In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...
Searches over graphs representing geospatial-temporal remote sensing data
Brost, Randolph; Perkins, David Nikolaus
2018-03-06
Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes, and connecting nodes with undirected edges representing either distance or adjacency relationships between objects and directed edges representing changes in time. Geospatial-temporal graph searches are made computationally efficient by taking advantage of characteristics of geospatial-temporal data in remote sensing images through the application of various graph search techniques.
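A minimal sketch of the graph construction described above, using networkx (our own choice for illustration, not the implementation in the patent): detected objects become nodes, spatial relations such as adjacency or distance become edges within an image, and directed edges link the same object across acquisition times.

```python
# Sketch of a geospatial-temporal graph: nodes are objects detected in remote
# sensing images, edges encode spatial relations (distance/adjacency) within an
# image, and directed temporal edges link the same object across acquisitions.
# networkx is an assumption for illustration, not the patent's implementation.
import networkx as nx

g = nx.DiGraph()

# Objects observed at two acquisition times (hypothetical labels).
g.add_node("building_A_t1", kind="building", time=1)
g.add_node("road_B_t1", kind="road", time=1)
g.add_node("building_A_t2", kind="building", time=2)

# Spatial relation within the first image, stored in both directions to mimic
# an undirected adjacency edge.
g.add_edge("building_A_t1", "road_B_t1", relation="adjacent", distance_m=12.0)
g.add_edge("road_B_t1", "building_A_t1", relation="adjacent", distance_m=12.0)

# Temporal relation: the same object tracked from time 1 to time 2.
g.add_edge("building_A_t1", "building_A_t2", relation="same_object", dt=1)

# Example search: objects adjacent to a road at time 1.
hits = [v for u, v, d in g.edges(data=True)
        if d["relation"] == "adjacent" and g.nodes[u]["kind"] == "road"]
print(hits)  # -> ['building_A_t1']
```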
Deductive Coordination of Multiple Geospatial Knowledge Sources
NASA Astrophysics Data System (ADS)
Waldinger, R.; Reddy, M.; Culy, C.; Hobbs, J.; Jarvis, P.; Dungan, J. L.
2002-12-01
Deductive inference is applied to choreograph the cooperation of multiple knowledge sources to respond to geospatial queries. When no one source can provide an answer, the response may be deduced from pieces of the answer provided by many sources. Examples of sources include (1) the Alexandria Digital Library Gazetteer, a repository that gives the locations for almost six million place names, (2) the CIA World Factbook, an online almanac with basic information about more than 200 countries, (3) the SRI TerraVision 3D Terrain Visualization System, which displays a flight-simulator-like interactive display of geographic data held in a database, (4) the NASA GDACC WebGIS client for searching satellite and other geographic data available through OpenGIS Consortium (OGC) Web Map Servers, and (5) the Northern Arizona University Latitude/Longitude Distance Calculator. Queries are phrased in English and are translated into logical theorems by the Gemini Natural Language Parser. The theorems are proved by SNARK, a first-order-logic theorem prover, in the context of an axiomatic geospatial theory. The theory embodies a representational scheme that takes into account the fact that the same place may have many names, and the same name may refer to many places. SNARK has built-in procedures (RCC8 and the Allen calculus, respectively) for reasoning about spatial and temporal concepts. External knowledge sources may be consulted by SNARK as the proof is in progress, so that most knowledge need not be stored axiomatically. The Open Agent Architecture (OAA) facilitates communication between sources that may be implemented on different machines in different computer languages. An answer to the query, in the form of text or an image, is extracted from the proof. Currently, three-dimensional images are displayed by TerraVision, but other displays are possible. The combined system is called Geo-Logica. Some example queries that can be handled by Geo-Logica include: (1) show the petrified forests in Oregon north of Portland, (2) show the lake in Argentina with the highest elevation, and (3) show the IGBP land cover classification, derived using MODIS, of Montana for July 2000. Use of a theorem prover allows sources to cooperate even if they adopt different notational conventions and representation schemes and were never designed to work together. New sources can be added without reprogramming the system, by providing axioms that advertise their capabilities. Future directions include entering into a dialogue with the user to clarify ambiguities, elaborate on previous questions, or provide new information necessary to answer the question. Of particular interest is dealing with temporally varying data, with answers displayed as animated images.
Enriching the Web Processing Service
NASA Astrophysics Data System (ADS)
Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer
2014-05-01
The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it defines the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-granular, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches have been tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use and a lot of integration effort. Our toolset RichWPS aims to provide a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested on the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated into Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance and publishing them as common processes. The server is therefore oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes using a WPS-T interface. In order to deal with the results of these processing workflows, a server-side extension enables the RichWPS Server and its clients to use WPS presentation directives (WPS-PD), a content-related enhancement of the standardized WPS schema. We identified essential requirements for the components of our toolset by applying two use cases. The first enables the simplified comparison of modeled and measured data, a common task in hydro-engineering to validate the accuracy of a model. An implementation of the workflow includes reading, harmonizing and comparing two datasets in NetCDF format. 2D water level data from the German Bight can be chosen, presented and evaluated in a web client with interactive plots. The second use case is motivated by the Marine Strategy Directive (MSD) of the EU, which demands monitoring, action plans and at least an evaluation of the ecological situation in the marine environment. Information techniques adapted to those of INSPIRE should be used. One of the parameters monitored and evaluated for the MSD is the expansion and quality of seagrass fields. With a view towards other evaluation parameters, we decompose the complex process of seagrass evaluation into reusable process steps and implement those packages as configurable WPS processes.
NASA Astrophysics Data System (ADS)
Pierleoni, Arnaldo; Casagrande, Luca; Bellezza, Michele; Casadei, Stefano
2010-05-01
The need for increasingly complex geospatial algorithms dedicated to the management of water resources, the fact that many of them require specific knowledge, and the need for dedicated computing machines have led to the necessity of centralizing and sharing all the server applications and plugins developed. For this purpose, a Web Processing Service (WPS) has been developed that makes available to users a range of geospatial analysis algorithms, geostatistical methods and remote sensing procedures, which can be used simply by providing data and input parameters and downloading the results. The core of the system infrastructure is GRASS GIS, which acts as a computational engine, providing more than 350 analysis modules and the opportunity to create new, ad hoc procedures. The WPS was implemented using PyWPS, written in Python, which is easily manageable and configurable. All these instruments are managed by a daemon named "Arcibald", created specifically to order the requests that come from users. If processes are already running, the system queues new requests, registering each one and running it only when the previous calculations have completed. However, each geoprocess has an indicator of the resources necessary to execute it, allowing geoprocesses that do not require excessive computing time to run in parallel. This assessment is also made in relation to the size of the input file provided. The WPS standard defines methods for accessing and running geoprocesses regardless of the client used; nevertheless, the project has also developed a graphical client specifically for accessing the resources. The client was built as a plugin for the QGIS software, which provides the most common tools for viewing and consulting geographically referenced data. The tool was tested using data taken during the bathymetric campaign at the Montedoglio Reservoir on the Tiber River in order to generate a digital model of the reservoir bed. Starting from a text file containing the coordinates and depths of the points (previously statistically treated to remove any inaccuracies), we used the QGIS plugin to connect to the Web service and started a cross-validation process in order to obtain the parameters to be used for interpolation. This makes it possible to highlight the morphological variations of reservoir basins due to silting and, therefore, to consider the actual capacity of the basin for a proper evaluation of the available water resource. Indeed, this is a critical step for the next phase of management. In this case, since the procedure is very long (on the order of days), the system automatically chooses to send the results via email. Moreover, once the invoked procedures end, the system allows the user to choose whether to share data and results or to remove all traces of the calculation, because in some cases sensitive data and information are used and sharing them could violate privacy policies. The entire project is built only with open-source software.
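The queuing behaviour attributed to the "Arcibald" daemon can be sketched with Python's standard library. This is an illustrative worker-queue pattern under our own assumptions (function names, the lightweight flag), not the project's actual code.

```python
# Illustrative sketch of a request-queuing daemon in the spirit of "Arcibald":
# heavy geoprocessing jobs run one at a time in arrival order, while jobs
# flagged as lightweight may run immediately. This is a simplified pattern of
# our own, not the project's actual implementation.
import queue
import threading
import time

heavy_jobs = queue.Queue()

def submit(job, lightweight=False):
    """Run small jobs right away; queue expensive ones."""
    if lightweight:
        threading.Thread(target=job, daemon=True).start()
    else:
        heavy_jobs.put(job)

def worker():
    """Process queued heavy jobs strictly in arrival order."""
    while True:
        job = heavy_jobs.get()
        try:
            job()
        finally:
            heavy_jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

# Example usage with placeholder jobs.
submit(lambda: print("quick statistics"), lightweight=True)
submit(lambda: time.sleep(0.1) or print("long interpolation finished"))
heavy_jobs.join()
```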
78 FR 32635 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-31
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to Add a New System of Records. SUMMARY: The National Geospatial-Intelligence Agency is establishing a new system of... information. FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency...
78 FR 35606 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System of Records. SUMMARY: The National Geospatial-Intelligence Agency is altering a system of records in.... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Security...
NASA Astrophysics Data System (ADS)
Leibovici, D. G.; Pourabdollah, A.; Jackson, M.
2011-12-01
Experts and decision-makers use or develop models to monitor global and local changes in the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Management tools dealing with qualitative and quantitative metadata measures of the quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, metadata profiles to be defined and retrieved via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the quality metadata, stored in the XPDL file. The focus is (a) on visual representations of the quality, summarizing the quality information retrieved either from the standardized metadata profiles of the components or from non-standard quality information, e.g. Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with a visualization pointing out, on the workflow graph itself, where the workflow needs to be improved with better data or better processes. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5. [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France. [3] OGC (2011) www.opengeospatial.org. [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008. [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria. [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK.
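To make the idea of carrying quality information through a workflow concrete, the sketch below applies the standard first-order (Gaussian) variance propagation formula to a two-step workflow. It is a generic illustration with invented sensitivities and error values; it is not the meta-propagation framework of refs [5]-[6].

```python
# Generic first-order variance propagation through a two-step workflow:
# var(y) ~= sum_i (df/dx_i)^2 * var(x_i), assuming independent inputs.
# This illustrates carrying quality measures along a workflow graph; it is
# not the meta-propagation method of refs [5]-[6]. All numbers are invented.

def propagate(partials, variances):
    """First-order variance propagation for one processing step."""
    return sum(p ** 2 * v for p, v in zip(partials, variances))

# Step 1: slope = f(dem); assume d(slope)/d(dem) ~ 0.8 (hypothetical).
var_dem = 1.5 ** 2                     # DEM error: 1.5 m std. dev.
var_slope = propagate([0.8], [var_dem])

# Step 2: runoff index = g(slope, rainfall); hypothetical sensitivities.
var_rain = 4.0 ** 2                    # rainfall error: 4 mm std. dev.
var_runoff = propagate([1.2, 0.05], [var_slope, var_rain])

print(f"slope variance  : {var_slope:.3f}")
print(f"runoff variance : {var_runoff:.3f}")
```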
SWOT analysis on National Common Geospatial Information Service Platform of China
NASA Astrophysics Data System (ADS)
Zheng, Xinyan; He, Biao
2010-11-01
Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under these circumstances, surveying and mapping in China is inevitably shifting from 4D product services to services centered on NCGISPC (the National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are increasingly obvious, owing to the emerging requirements of e-government construction, the remarkable development of IT, and the growing online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using SWOT (Strength, Weakness, Opportunity and Threat) analysis to compare it to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode for geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way everyone uses geospatial information services.
NASA Astrophysics Data System (ADS)
Kassab, Ala'; Liang, Steve; Gao, Yang
2010-12-01
Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. This goal requires integrating different, potentially numerous, sources of dynamic geospatial information on one side, and a large number of clients having heterogeneous and specific interests in the data on the other. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, pull-mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including geospatial event and geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations from wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
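As a toy illustration of matching geospatial events against subscriptions, the code below filters point events by topic and by containment in each subscriber's bounding box. This is our own simplified sketch, with invented subscriber names and coordinates; it is not the matching algorithm implemented in the RFERS prototype.

```python
# Toy sketch of geospatial publish/subscribe matching: an event (topic + point
# location) is delivered to every subscription whose topic matches and whose
# bounding box contains the point. A simplified illustration only, not the
# matching approach of the RFERS prototype.
from dataclasses import dataclass

@dataclass
class Subscription:
    subscriber: str
    topic: str
    bbox: tuple  # (min_lon, min_lat, max_lon, max_lat)

def matches(sub, topic, lon, lat):
    min_lon, min_lat, max_lon, max_lat = sub.bbox
    return (sub.topic == topic
            and min_lon <= lon <= max_lon
            and min_lat <= lat <= max_lat)

subs = [
    Subscription("fire_unit_7", "fire_incident", (-115.0, 50.5, -113.5, 51.5)),
    Subscription("weather_desk", "sensor_obs", (-120.0, 48.0, -110.0, 55.0)),
]

# A published event: a reported fire incident at a point location.
event_topic, lon, lat = "fire_incident", -114.1, 51.0
recipients = [s.subscriber for s in subs if matches(s, event_topic, lon, lat)]
print(recipients)  # -> ['fire_unit_7']
```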
Geodecision system for traceability and sustainable production of beef cattle in Brazil
NASA Astrophysics Data System (ADS)
Victoria, D. D.; Andrade, R. G.; Bolfe, L.; Batistella, M.; Pires, P. P.; Vicente, L. E.; Visoli, M. C.
2011-12-01
Beef cattle production sustainability depends on incorporating innovative tools and technologies, which must be easy to comprehend, economically viable and spatially explicit, into the registration of precise, reliable data about production practices. This research developed from the needs and demands for food safety and food quality in extensive beef cattle production within the scope of the policies of the Southern Cone and European Union countries. Initially, the OTAG project (Operational Management and Geodecisional Prototype to Track and Trace Agricultural Production) focused on the development of a prototype for cattle traceability. The aim of the project's next phase is to enhance the electronic devices used in the identification and positioning of the animals, and to incorporate more management and sanitary information. In addition, we intend to structure a database that enables the inclusion of a greater amount of geospatial information linked to environmental aspects, such as water deficit, vegetation vigour and degradation indices of pasture areas, among others. For the extraction of knowledge and the presentation of results, we propose the development of a user-friendly interface to facilitate the exploration of the textual, tabular and geospatial information useful to the user.
Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices
NASA Astrophysics Data System (ADS)
Kröpfl, M.; Buchmüller, D.; Leberl, F.
2012-07-01
Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite Systems (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and client software consuming the web service through an API. The application allows an insurance claim specialist to add claims, including a claim location, to a cloud-based database. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called in order to try to match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split between two developers.
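The proximity query used by the field agent's client can be illustrated with a great-circle (haversine) distance filter. The claim records, coordinates and the 5 km radius below are invented for the example; this is not the application's actual query service.

```python
# Sketch of a proximity query over claim locations using the haversine
# great-circle distance. Claim records and the 5 km radius are invented for
# illustration; this is not the application's actual query service.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

claims = [
    {"id": 101, "lat": 47.072, "lon": 15.438},
    {"id": 102, "lat": 47.500, "lon": 15.900},
]

agent_lat, agent_lon, radius_km = 47.070, 15.440, 5.0
nearby = [c for c in claims
          if haversine_km(agent_lat, agent_lon, c["lat"], c["lon"]) <= radius_km]
print(nearby)  # claim 101 is within 5 km of the agent
```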
Advancing the Implementation of Hydrologic Models as Web-based Applications
NASA Astrophysics Data System (ADS)
Dahal, P.; Tarboton, D. G.; Castronova, A. M.
2017-12-01
Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology and snow hydrology. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and the workflow involved is often difficult to reproduce. This work introduces a prototype infrastructure in the form of a web application that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open-source Tethys Platform was used to develop the web app's front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides the data preparation and processing capability supporting the backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
US EPA GLOBAL POSITIONING SYSTEMS - TECHNICAL IMPLEMENTATION GUIDANCE
The U.S. EPA Geospatial Quality Council (GQC) was formed in 1998 to provide Quality Assurance guidance for the development, use, and products of geospatial activities and research. The long-term goals of the GQC are expressed in a living document, currently the EPA Geospatial Qua...
Integration of Geospatial Science in Teacher Education
ERIC Educational Resources Information Center
Hauselt, Peggy; Helzer, Jennifer
2012-01-01
One of the primary missions of our university is to train future primary and secondary teachers. Geospatial sciences, including GIS, have long been excluded from teacher education curriculum. This article explains the curriculum revisions undertaken to increase the geospatial technology education of future teachers. A general education class…
75 FR 43497 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
...; System of Records AGENCY: National Geospatial-Intelligence Agency (NGA), DoD. ACTION: Notice to add a system of records. SUMMARY: The National Geospatial-Intelligence Agency (NGA) proposes to add a system of...-3808. SUPPLEMENTARY INFORMATION: The National Geospatial-Intelligence Agency notices for systems of...
Critical Infrastructure Protection- Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bofman, Ryan K.
Los Alamos National Laboratory (LANL) has been a key facet of Critical National Infrastructure since the nuclear bombing of Hiroshima exposed the nature of the Laboratory’s work in 1945. Common knowledge of the nature of sensitive information contained here presents a necessity to protect this critical infrastructure as a matter of national security. This protection occurs in multiple forms beginning with physical security, followed by cybersecurity, safeguarding of classified information, and concluded by the missions of the National Nuclear Security Administration.
NASA Astrophysics Data System (ADS)
McGowan, A. E.; Postlethwaite, V. R.; Pellatt, M. G.; Kohfeld, K. E.; Robinson, C.; Yakimishyn, J.; Chastain, S. G.
2016-12-01
Across the globe, seagrass habitats are recognized as highly productive systems, and have recently been characterized by their ability to store and sequester substantial amounts of organic carbon, known as 'blue carbon'. Unfortunately, seagrasses are among the most rapidly disappearing ecosystems on Earth due to anthropogenic activities and development. Given the paucity of geospatial information on the global abundance of blue carbon environments, the rate of seagrass habitat loss is uncertain. Recent studies indicate that the consequences of coastal ecosystem conversion are larger than predicted, particularly on Canada's Pacific coastline, where agricultural, forestry, and commercial developments have destroyed substantial amounts of seagrass habitat. This lack of knowledge hinders coastal habitat and blue carbon conservation planning and inhibits comprehensive policy development regarding coastal carbon management. This research quantitatively assesses various measures of above- and below-ground biomass and eelgrass shoot density, and incorporates geospatial data collected with remote sensing technologies from three seagrass meadows on the Pacific coast of British Columbia. Using ArcGIS software, the distribution, extent, and density of seagrass located in the Pacific Rim National Park Reserve and southern Clayoquot Sound will be used to contribute to the first set of continental maps of blue carbon habitats within North America led by the Commission for Environmental Cooperation. Further, these results will be integrated into a geospatial database on carbon accumulation rates in seagrass meadows on the Pacific coast of North America, providing a baseline for determining the role blue carbon habitats play in carbon mitigation on coastal British Columbia.
GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases
NASA Astrophysics Data System (ADS)
Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz
2015-07-01
Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from
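The signature/distance-function idea can be illustrated by treating a scene's signature as a normalized class histogram and comparing two scenes with the Jensen-Shannon distance. This is a generic sketch on synthetic rasters and is not necessarily one of GeoPAT's own signature or distance implementations.

```python
# Generic sketch of the scene-signature idea: each scene is summarized as a
# normalized histogram of pixel categories, and two scenes are compared with
# the Jensen-Shannon distance. A simplified illustration, not necessarily one
# of GeoPAT's built-in signatures or distance functions.
import numpy as np
from scipy.spatial.distance import jensenshannon

def scene_signature(scene, n_classes):
    """Normalized class-frequency histogram of a categorical raster patch."""
    counts = np.bincount(scene.ravel(), minlength=n_classes)
    return counts / counts.sum()

rng = np.random.default_rng(0)
scene_a = rng.integers(0, 5, size=(50, 50))                          # near-uniform class mix
scene_b = rng.choice(5, size=(50, 50), p=[0.6, 0.1, 0.1, 0.1, 0.1])  # dominated by class 0
scene_c = scene_a.copy()
scene_c[:5, :5] = 0                                                  # small local edit of scene_a

sig_a, sig_b, sig_c = (scene_signature(s, 5) for s in (scene_a, scene_b, scene_c))
# A and C (nearly identical patches) should be much closer than A and B.
print("A vs B:", jensenshannon(sig_a, sig_b))
print("A vs C:", jensenshannon(sig_a, sig_c))
```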
Universities--Drivers for Regional Innovation Culture and Competitiveness
ERIC Educational Resources Information Center
Muresan, Mihaela; Gogu, Emilia
2010-01-01
The actual infrastructure of the information society sustains the globalization trend and increases the importance of the information and knowledge. The development of the knowledge society is the direct consequence of the mix of economic, social and cultural processes, which involve the knowledge creation and its equitable distribution, access…
Knowledge Infrastructures for Solar Cities
ERIC Educational Resources Information Center
Vanderburg, Willem H.
2006-01-01
The evolution of contemporary cities into solar cities will be affected by the decisions of countless specialists according to an established intellectual and professional division of labor. These specialists belong to groups responsible for advancing and applying a body of knowledge, and jointly, these bodies of knowledge make up a knowledge…
Towards the Reconciliation of Knowledge Management and e-Collaboration Systems
ERIC Educational Resources Information Center
Le Dinh, Thang; Rinfret, Louis; Raymond, Louis; Dong Thi, Bich-Thuy
2013-01-01
Purpose: The purpose of this paper is to propose an intelligent infrastructure for the reconciliation of knowledge management and e-collaboration systems. Design/Methodology/Approach:Literature on e-collaboration, information management, knowledge management, learning process, and intellectual capital is mobilised in order to build the conceptual…
Mapping the Future Today: The Community College of Baltimore County Geospatial Applications Program
ERIC Educational Resources Information Center
Jeffrey, Scott; Alvarez, Jaime
2010-01-01
The Geospatial Applications Program at the Community College of Baltimore County (CCBC), located five miles west of downtown Baltimore, Maryland, provides comprehensive instruction in geographic information systems (GIS), remote sensing and global positioning systems (GPS). Geospatial techniques, which include computer-based mapping and remote…
Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools
ERIC Educational Resources Information Center
Zalles, Daniel R.; Manitakos, James
2016-01-01
Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE), a 4.5-year National Science Foundation funded project, explored the strategies that stimulate teacher commitment to the project's driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate,…
Fostering 21st Century Learning with Geospatial Technologies
ERIC Educational Resources Information Center
Hagevik, Rita A.
2011-01-01
Global positioning systems (GPS) receivers and other geospatial tools can help teachers create engaging, hands-on activities in all content areas. This article provides a rationale for using geospatial technologies in the middle grades and describes classroom-tested activities in English language arts, science, mathematics, and social studies.…
EPA GEOSPATIAL QUALITY COUNCIL STRATEGY PLAN FY-02
The EPA Geospatial Quality Council (GQC), previously known as the EPA GIS-QA Team - EPA/600/R-00/009, was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA...
Mapping and monitoring potato cropping systems in Maine: geospatial methods and land use assessments
USDA-ARS?s Scientific Manuscript database
Geospatial frameworks and GIS-based approaches were used to assess current cropping practices in potato production systems in Maine. Results from the geospatial integration of remotely-sensed cropland layers (2008-2011) and soil datasets for Maine revealed a four-year potato systems footprint estima...
The Virginia Geocoin Adventure: An Experiential Geospatial Learning Activity
ERIC Educational Resources Information Center
Johnson, Laura; McGee, John; Campbell, James; Hays, Amy
2013-01-01
Geospatial technologies have become increasingly prevalent across our society. Educators at all levels have expressed a need for additional resources that can be easily adopted to support geospatial literacy and state standards of learning, while enhancing the overall learning experience. The Virginia Geocoin Adventure supports the needs of 4-H…
ERIC Educational Resources Information Center
Reed, Philip A.; Ritz, John
2004-01-01
Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…
A Geospatial Online Instruction Model
ERIC Educational Resources Information Center
Rodgers, John C., III; Owen-Nagel, Athena; Ambinakudige, Shrinidhi
2012-01-01
The objective of this study is to present a pedagogical model for teaching geospatial courses through an online format and to critique the model's effectiveness. Offering geospatial courses through an online format provides avenues to a wider student population, many of whom are not able to take traditional on-campus courses. Yet internet-based…
lawn: An R client for the Turf JavaScript Library for Geospatial Analysis
lawn is an R package to provide access to the geospatial analysis capabilities in the Turf javascript library. Turf expects data in GeoJSON format. Given that many datasets are now available natively in GeoJSON providing an easier method for conducting geospatial analyses on thes...
ERIC Educational Resources Information Center
Lee, Ashley; Hobson, Joe; Bienkowski, Marie; Midgley, Steve; Currier, Sarah; Campbell, Lorna M.; Novoselova, Tatiana
2012-01-01
In this article, the authors describe an open-source, open-data digital infrastructure for sharing information about open educational resources (OERs) across disparate systems and platforms. The Learning Registry, which began as a project funded by the U.S. Departments of Education and Defense, currently has an active international community…
Modeling And Detecting Anomalies In Scada Systems
NASA Astrophysics Data System (ADS)
Svendsen, Nils; Wolthusen, Stephen
The detection of attacks and intrusions based on anomalies is hampered by the limits of specificity underlying the detection techniques. However, in the case of many critical infrastructure systems, domain-specific knowledge and models can impose constraints that potentially reduce error rates. At the same time, attackers can use their knowledge of system behavior to mask their manipulations, causing adverse effects to be observed only after a significant period of time. This paper describes elementary statistical techniques that can be applied to detect anomalies in critical infrastructure networks. A SCADA system employed in liquefied natural gas (LNG) production is used as a case study.
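An elementary statistical technique of the kind mentioned, a rolling-window z-score detector, is sketched below on synthetic sensor readings. The window size, threshold and injected manipulation are arbitrary assumptions, not values from the paper's case study.

```python
# Elementary anomaly detection on a synthetic SCADA-like sensor series using a
# rolling-window z-score. Window size and threshold are arbitrary choices for
# illustration, not values from the paper's LNG case study.
import numpy as np

rng = np.random.default_rng(1)
readings = rng.normal(loc=50.0, scale=1.0, size=500)  # normal operation
readings[300:305] += 8.0                              # injected manipulation

window, threshold = 50, 4.0
anomalies = []
for t in range(window, len(readings)):
    history = readings[t - window:t]
    z = (readings[t] - history.mean()) / (history.std() + 1e-9)
    if abs(z) > threshold:
        anomalies.append(t)

print("flagged samples:", anomalies)  # should include indices near 300
```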
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
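The Latin Hypercube approach to tracking error through a raster operation can be sketched with scipy's quasi-Monte Carlo module. The use of scipy here, and all rasters, sigmas and coefficients, are our own assumptions for illustration; REPTool itself is built on its own Python packages inside ArcGIS Desktop.

```python
# Sketch of Latin Hypercube error propagation for a simple raster model
# output = a * raster1 + b * raster2, where the inputs carry Gaussian error.
# scipy.stats.qmc is an assumption for illustration; REPTool uses its own
# Python-based packages within ArcGIS Desktop. All numbers are hypothetical.
import numpy as np
from scipy.stats import qmc, norm

n_samples = 200
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n_samples)                   # uniform samples in [0, 1)^2

# Map uniform samples to Gaussian errors for the two inputs (hypothetical sigmas).
err_r1 = norm.ppf(u[:, 0], loc=0.0, scale=0.5)  # raster 1 error, sigma = 0.5
err_r2 = norm.ppf(u[:, 1], loc=0.0, scale=2.0)  # raster 2 error, sigma = 2.0

# One raster cell with nominal values and model coefficients (all hypothetical).
r1, r2, a, b = 10.0, 40.0, 1.5, 0.25
outputs = a * (r1 + err_r1) + b * (r2 + err_r2)

print("mean output     :", outputs.mean())
print("output std. dev.:", outputs.std(ddof=1))
```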
An updated geospatial liquefaction model for global application
Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.
2017-01-01
We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second-best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.
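To illustrate the kind of model and evaluation described above (not the authors' fitted coefficients or data), the sketch below trains a logistic regression on synthetic geospatial proxies and scores it with ROC AUC and the Brier score; every variable and coefficient is invented.

```python
# Illustration of the modelling and evaluation workflow described above, on
# synthetic data: logistic regression of liquefaction occurrence on geospatial
# proxies, scored with ROC AUC and the Brier score. The proxies, coefficients
# and labels are invented; this is not the paper's fitted global model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(7)
n = 2000
pgv = rng.lognormal(mean=2.0, sigma=0.8, size=n)  # peak ground velocity (cm/s)
vs30 = rng.uniform(150, 760, size=n)              # slope-derived VS30 (m/s)
wtd = rng.exponential(scale=5.0, size=n)          # water table depth (m)

# Synthetic "true" relationship used only to generate labels.
logit = -2.0 + 0.015 * pgv - 0.004 * vs30 - 0.1 * wtd
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([np.log(pgv), vs30, wtd])
model = LogisticRegression(max_iter=1000).fit(X, y)
p = model.predict_proba(X)[:, 1]

print("ROC AUC    :", roc_auc_score(y, p))
print("Brier score:", brier_score_loss(y, p))
```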
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation for the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach for integrating multi-source geospatial data with different data models, structures and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data therefore come from these nodes, and the different datasets are heterogeneous. Based on the results of an analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: (1) location precision; (2) geometric representation; (3) up-to-date state; (4) attribute values; and (5) spatial relationships. The technical procedure is then developed, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.
Remote sensing applied to resource management
Henry M. Lachowski
1998-01-01
Effective management of forest resources requires access to current and consistent geospatial information that can be shared by resource managers and the public. Geospatial information describing our land and natural resources comes from many sources and is most effective when stored in a geospatial database and used in a geographic information system (GIS). The...
ERIC Educational Resources Information Center
Kulo, Violet; Bodzin, Alec
2013-01-01
Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…
Introduction to the Complex Geospatial Web in Geographical Education
ERIC Educational Resources Information Center
Papadimitriou, Fivos
2010-01-01
The Geospatial Web is emerging in the geographical education landscape in all its complexity. How will geographers and educators react? What are the most important facets of this development? After reviewing the possible impacts on geographical education, it can be conjectured that the Geospatial Web will eventually replace the usual geographical…
ERIC Educational Resources Information Center
Bodzin, Alec M.; Fu, Qiong; Bressler, Denise; Vallera, Farah L.
2015-01-01
Geospatially enabled learning technologies may enhance Earth science learning by placing emphasis on geographic space, visualization, scale, representation, and geospatial thinking and reasoning (GTR) skills. This study examined if and how a series of Web geographic information system investigations that the researchers developed improved urban…
Dylan Hettinger, Geospatial Data Scientist, Dylan.Hettinger@nrel.gov | 303-275-3750. Dylan Hettinger is a member of the Geospatial Data Science team within the Systems Modeling & Geospatial Data Science Group in the Strategic Energy Analysis Center.
ERIC Educational Resources Information Center
Hanley, Carol D.; Davis, Hilarie B.; Davey, Bradford T.
2012-01-01
As use of geospatial technologies has increased in the workplace, so has interest in using these technologies in the K-12 classroom. Prior research has identified several reasons for using geospatial technologies in the classroom, such as developing spatial thinking, supporting local investigations, analyzing changes in the environment, and…
The Sky's the Limit: Integrating Geospatial Tools with Pre-College Youth Education
ERIC Educational Resources Information Center
McGee, John; Kirwan, Jeff
2010-01-01
Geospatial tools, which include global positioning systems (GPS), geographic information systems (GIS), and remote sensing, are increasingly driving a variety of applications. Local governments and private industry are embracing these tools, and the public is beginning to demand geospatial services. The U.S. Department of Labor (DOL) reported that…
Geospatial Services in Special Libraries: A Needs Assessment Perspective
ERIC Educational Resources Information Center
Barnes, Ilana
2013-01-01
Once limited to geographers and mapmakers, Geographic Information Systems (GIS) has taken a growing central role in information management and visualization. Geospatial services run a gamut of different products and services from Google maps to ArcGIS servers to Mobile development. Geospatial services are not new. Libraries have been writing about…
A Geospatial Semantic Enrichment and Query Service for Geotagged Photographs
Ennis, Andrew; Nugent, Chris; Morrow, Philip; Chen, Liming; Ioannidis, George; Stan, Alexandru; Rachev, Preslav
2015-01-01
With the increasing abundance of technologies and smart devices, equipped with a multitude of sensors for sensing the environment around them, information creation and consumption has now become effortless. This, in particular, is the case for photographs, with vast amounts being created and shared every day. For example, at the time of this writing, Instagram users upload 70 million photographs a day. Nevertheless, it still remains a challenge to discover the "right" information for the appropriate purpose. This paper describes an approach to create semantic geospatial metadata for photographs, which can facilitate photograph search and discovery. To achieve this we have developed and implemented a semantic geospatial data model by which a photograph can be enriched with geospatial metadata extracted from several geospatial data sources, based on the raw low-level geo-metadata from a smartphone photograph. We present the details of our method and implementation for searching and querying the semantic geospatial metadata repository to enable a user or third-party system to find the information they are looking for. PMID:26205265
Citing geospatial feature inventories with XML manifests
NASA Astrophysics Data System (ADS)
Bose, R.; McGarva, G.
2006-12-01
Today published scientific papers include a growing number of citations for online information sources that either complement or replace printed journals and books. We anticipate this same trend for cartographic citations used in the geosciences, following advances in web mapping and geographic feature-based services. Instead of using traditional libraries to resolve citations for print material, the geospatial citation life cycle will include requesting inventories of objects or geographic features from distributed geospatial data repositories. Using a case study from the UK Ordnance Survey MasterMap database, which is illustrative of geographic object-based products in general, we propose citing inventories of geographic objects using XML feature manifests. These manifests: (1) serve as a portable listing of sets of versioned features; (2) could be used as citations within the identification portion of an international geospatial metadata standard; (3) could be incorporated into geospatial data transfer formats such as GML; but (4) can be resolved only with comprehensive, curated repositories of current and historic data. This work has implications for any researcher who foresees the need to make or resolve references to online geospatial databases.
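A minimal example of what such a feature manifest might look like, built with Python's standard library. The namespace, feature identifiers and version numbers are invented for illustration and do not follow any Ordnance Survey or ISO schema.

```python
# Minimal sketch of an XML manifest citing a set of versioned geographic
# features. The namespace, TOIDs and version numbers are invented; this does
# not follow any Ordnance Survey or ISO metadata schema.
import xml.etree.ElementTree as ET

NS = "urn:example:feature-manifest"  # hypothetical namespace
ET.register_namespace("fm", NS)

manifest = ET.Element(f"{{{NS}}}manifest", attrib={"created": "2006-10-01"})
source = ET.SubElement(manifest, f"{{{NS}}}source")
source.text = "OS MasterMap Topography Layer"  # the cited repository

for toid, version in [("osgb1000000000001", "3"), ("osgb1000000000002", "7")]:
    ET.SubElement(manifest, f"{{{NS}}}feature",
                  attrib={"id": toid, "version": version})

print(ET.tostring(manifest, encoding="unicode"))
```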
ERIC Educational Resources Information Center
Cho, Taejun
2011-01-01
Knowledge is one of the most important assets for surviving in the modern business environment. The effective management of that asset mandates continuous adaptation by organizations, and requires employees to strive to improve the company's work processes. Organizations attempt to coordinate their unique knowledge with traditional means as well…
Examining the target levels of state renewable portfolio standards
NASA Astrophysics Data System (ADS)
Helwig, Laurence Douglas
At present 37 U.S. states have passed Renewable Portfolio Standards (RPS) or have a legislatively driven goal that supports investment in renewable energy (RE) technologies. Previous research has identified economic, governmental, ideological and infrastructural characteristics as key predictors of policy adoption and renewable energy deployment efforts (Carley, 2009; Davis & Davis, 2009; Bohn & Lant, 2009; Lyon & Yin, 2010). To date, only a few studies have investigated the target levels of renewable portfolio standards. Carley & Miller (2012) found that policies of differing stringencies were motivated by systematically different factors that included governmental ideology. The purpose of this dissertation is to replicate and expand upon earlier models that predicted RPS adoption and RE deployment efforts by adding regulatory, infrastructural and spatial characteristics to predict RPS target levels. Hypotheses were tested using three alternative measurements of RPS target level strength to determine to what extent a combination of explanatory variables explains variation in policy target levels. Multivariate linear regression and global spatial autocorrelation results indicated that multiple state internal determinants influenced RPS target levels, including average electricity price, state government ideology and, to a lesser extent, actual RE potential capacity. In addition, diffusion effects indicated that states set their RPS target levels lower than those of their neighboring states, and a local geospatial clustering effect was observed in the target levels of a group of northeastern states.
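The style of analysis described here, ordinary least squares on state-level predictors plus a global Moran's I for spatial autocorrelation, can be sketched with synthetic data; the variables, coefficients and weights matrix below are stand-ins for illustration, not the dissertation's data or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 37  # states with an RPS, as in the abstract

# Synthetic stand-ins for the predictors named in the abstract
# (electricity price, government ideology, RE potential capacity).
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta_true = np.array([10.0, 2.5, 1.8, 0.4])
y = X @ beta_true + rng.normal(scale=1.0, size=n)   # RPS target level (%)

# Ordinary least squares via least-squares solve.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated coefficients:", np.round(beta_hat, 2))

# Global Moran's I with a hypothetical row-standardised contiguity matrix W.
W = rng.random((n, n))
np.fill_diagonal(W, 0.0)
W = (W > 0.9).astype(float)                         # sparse, arbitrary "neighbours"
W = W / np.maximum(W.sum(axis=1, keepdims=True), 1)
z = y - y.mean()
moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
print("Moran's I:", round(moran_I, 3))
```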
Digital Earth - A sustainable Earth
NASA Astrophysics Data System (ADS)
Mahavir
2014-02-01
All life, particularly human, cannot be sustainable unless complemented with shelter, poverty reduction, provision of basic infrastructure and services, equal opportunities and social justice. Yet, in the context of cities, it is believed that they can accommodate more and more people, endlessly, regardless of their carrying capacity and increasing ecological footprint. The 'inclusion' of bringing more and more people into the purview of development is often limited to social and economic inclusion rather than spatial and ecological inclusion. Economic investment decisions are also not always supported with spatial planning decisions. Most planning for a sustainable Earth, be it at the level of a rural settlement, city, region, nation or the globe, fails on the capacity and capability fronts. In India, for example, out of some 8,000 towns and cities, Master Plans exist for only about 1,800. A chapter on sustainability or environment is neither statutorily compulsory nor a norm for these Master Plans. Geospatial technologies, including Remote Sensing, GIS, the Indian National Spatial Data Infrastructure (NSDI), the Indian National Urban Information Systems (NUIS), the Indian Environmental Information System (ENVIS) and the Indian National GIS (NGIS), have the potential to map, analyse, visualize and support sustainable development decisions based on participatory social, economic and spatial inclusion. A sustainable Earth, at all scales, is a logical and natural outcome of a digitally mapped, conceived and planned Earth. Digital Earth, in fact, itself offers a platform to dovetail the ecological, social and economic considerations in transforming it into a sustainable Earth.
The Importance of Biodiversity E-infrastructures for Megadiverse Countries
Canhos, Dora A. L.; Sousa-Baena, Mariane S.; de Souza, Sidnei; Maia, Leonor C.; Stehmann, João R.; Canhos, Vanderlei P.; De Giovanni, Renato; Bonacelli, Maria B. M.; Los, Wouter; Peterson, A. Townsend
2015-01-01
Addressing the challenges of biodiversity conservation and sustainable development requires global cooperation, support structures, and new governance models to integrate diverse initiatives and achieve massive, open exchange of data, tools, and technology. The traditional paradigm of sharing scientific knowledge through publications is not sufficient to meet contemporary demands that require not only the results but also data, knowledge, and skills to analyze the data. E-infrastructures are key in facilitating access to data and providing the framework for collaboration. Here we discuss the importance of e-infrastructures of public interest and the lack of long-term funding policies. We present the example of Brazil’s speciesLink network, an e-infrastructure that provides free and open access to biodiversity primary data and associated tools. SpeciesLink currently integrates 382 datasets from 135 national institutions and 13 institutions from abroad, openly sharing ~7.4 million records, 94% of which are associated to voucher specimens. Just as important as the data is the network of data providers and users. In 2014, more than 95% of its users were from Brazil, demonstrating the importance of local e-infrastructures in enabling and promoting local use of biodiversity data and knowledge. From the outset, speciesLink has been sustained through project-based funding, normally public grants for 2–4-year periods. In between projects, there are short-term crises in trying to keep the system operational, a fact that has also been observed in global biodiversity portals, as well as in social and physical sciences platforms and even in computing services portals. In the last decade, the open access movement propelled the development of many web platforms for sharing data. Adequate policies unfortunately did not follow the same tempo, and now many initiatives may perish. PMID:26204382
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
Introduction to geospatial semantics and technology workshop handbook
Varanka, Dalia E.
2012-01-01
The workshop is a tutorial on introductory geospatial semantics with hands-on exercises using standard Web browsers. The workshop is divided into two sections, general semantics on the Web and specific examples of geospatial semantics using data from The National Map of the U.S. Geological Survey and the Open Ontology Repository. The general semantics section includes information and access to publicly available semantic archives. The specific session includes information on geospatial semantics with access to semantically enhanced data for hydrography, transportation, boundaries, and names. The Open Ontology Repository offers open-source ontologies for public use.
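For a self-contained taste of the kind of semantic query the exercises cover, here is a small Python sketch using a toy hydrography graph rather than actual data from The National Map or the Open Ontology Repository; the namespace and property names are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# A toy graph standing in for semantically enhanced hydrography data;
# the namespace and properties are illustrative, not The National Map's.
EX = Namespace("http://example.org/hydro#")
g = Graph()
g.add((EX.reach_1001, RDF.type, EX.StreamReach))
g.add((EX.reach_1001, EX.gnisName, Literal("Clear Creek")))
g.add((EX.reach_1001, EX.flowsInto, EX.reach_1002))

query = """
PREFIX ex: <http://example.org/hydro#>
SELECT ?reach ?name WHERE {
    ?reach a ex:StreamReach ;
           ex:gnisName ?name .
}
"""
for reach, name in g.query(query):
    print(reach, name)
```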
Nasir, Zaheer Ahmad; Campos, Luiza Cintra; Christie, Nicola; Colbeck, Ian
2016-08-01
Exposure to airborne biological hazards in an ever-expanding urban transport infrastructure and a highly diverse mobile population is of growing concern, in terms of both public health and biosecurity. The existing policies and practices on design, construction and operation of these infrastructures may have severe implications for airborne disease transmission, particularly in the event of a pandemic or intentional release of biological agents. This paper reviews existing knowledge on airborne disease transmission in different modes of transport, highlights the factors enhancing the vulnerability of transport infrastructures to airborne disease transmission, discusses the potential protection measures and identifies the research gaps in order to build a bioresilient transport infrastructure. The unification of security and public health research, inclusion of public health security concepts at the design and planning phase, and a holistic system approach involving all the stakeholders over the life cycle of transport infrastructure hold the key to mitigating the challenges posed by biological hazards in the twenty-first century transport infrastructure.
ERIC Educational Resources Information Center
Draper, Darryl C.
2013-01-01
The increased accessibility of technology and Internet connections has enabled organizations to provide their workforces with the opportunity to engage in distributed education. "Harnessing this innovation calls for organizational and technological infrastructures that support the interplay of knowledge and knowing" (Cook & Brown, 1999, p. 381).…
QSIA--A Web-Based Environment for Learning, Assessing and Knowledge Sharing in Communities
ERIC Educational Resources Information Center
Rafaeli, Sheizaf; Barak, Miri; Dan-Gur, Yuval; Toch, Eran
2004-01-01
This paper describes a Web-based and distributed system named QSIA that serves as an environment for learning, assessing and knowledge sharing. QSIA--Questions Sharing and Interactive Assignments--offers a unified infrastructure for developing, collecting, managing and sharing of knowledge items. QSIA enhances collaboration in authoring via online…
Knowledge Cultures and the Shaping of Work-Based Learning: The Case of Computer Engineering
ERIC Educational Resources Information Center
Nerland, Monika
2008-01-01
This paper examines how the knowledge culture of computer engineering--that is, the ways in which knowledge is produced, distributed, accumulated and collectively approached within this profession--serve to construct work-based learning in specific ways. Typically, the epistemic infrastructures take the form of information structures with a global…
Knowledge base for v-Embryo: Information Infrastructure for in silico modeling
Computers, imaging technologies, and the worldwide web have assumed an important role in augmenting traditional learning. Resources to disseminate multimedia information across platforms, and the emergence of communal knowledge environments, facilitate the visualization of diffi...
Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing
NASA Astrophysics Data System (ADS)
Tang, Jingyin; Matyas, Corene J.
2018-02-01
Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing or HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
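The decoupled client-server idea can be illustrated schematically: the client assembles the geoprocessing call as text (here a real arcpy tool name, but no arcpy import is needed locally) and ships it to a remote worker that does have the library installed. This is not arc4nix's actual API, only a sketch of the meta-programming pattern described, and the job payload format is hypothetical.

```python
import json
import textwrap

def build_remote_job(tool_name, params):
    """Assemble, as plain text, the geoprocessing call to be run remotely.

    Mirrors the meta-programming idea described above: the client never
    imports the geoprocessing library itself; it only generates code for a
    server that has it installed. Names below are illustrative only.
    """
    call = ", ".join(f"{k}={v!r}" for k, v in params.items())
    return textwrap.dedent(f"""
        import arcpy                      # available only on the server
        result = arcpy.{tool_name}({call})
        print(result)
    """).strip()

job = build_remote_job("Clip_analysis", {
    "in_features": "storm_tracks.shp",
    "clip_features": "study_area.shp",
    "out_feature_class": "tracks_clipped.shp",
})

# In a real deployment this payload would be sent to the execution server
# (e.g. over HTTP or a task queue); here we just show the generated job.
print(json.dumps({"language": "python", "source": job}, indent=2))
```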
Understanding needs and barriers to using geospatial tools for public health policymaking in China.
Kim, Dohyeong; Zhang, Yingyuan; Lee, Chang Kil
2018-05-07
Despite growing popularity of using geographical information systems and geospatial tools in public health fields, these tools are only rarely implemented in health policy management in China. This study examines the barriers that could prevent policy-makers from applying such tools to actual managerial processes related to public health problems that could be assisted by such approaches, e.g. evidence-based policy-making. A questionnaire-based survey of 127 health-related experts and other stakeholders in China revealed that there is a consensus on the needs and demands for the use of geospatial tools, which shows that there is a more unified opinion on the matter than so far reported. Respondents pointed to lack of communication and collaboration among stakeholders as the most significant barrier to the implementation of geospatial tools. Comparison of survey results to those emanating from a similar study in Bangladesh revealed different priorities concerning the use of geospatial tools between the two countries. In addition, the follow-up in-depth interviews highlighted the political culture specific to China as a critical barrier to adopting new tools in policy development. Other barriers included concerns over the limited awareness of the availability of advanced geospatial tools. Taken together, these findings can facilitate a better understanding among policy-makers and practitioners of the challenges and opportunities for widespread adoption and implementation of a geospatial approach to public health policy-making in China.
NASA Astrophysics Data System (ADS)
XIA, J.; Yang, C.; Liu, K.; Huang, Q.; Li, Z.
2013-12-01
Big Data is becoming increasingly important in almost all scientific domains, especially in geoscience, where hundreds to millions of sensors are collecting data about the Earth continuously (Whitehouse News 2012). With the explosive growth of data, various Geospatial Cyberinfrastructure (GCI) (Yang et al. 2010) components are developed to manage geospatial resources and provide data access for the public. These GCIs are accessed intensively by different users on a daily basis. However, little research has been done to analyze the spatiotemporal patterns of user behavior, which could be critical to the management of Big Data and the operation of GCIs (Yang et al. 2011). For example, the spatiotemporal distribution of end users helps us better arrange and locate GCI computing facilities. A better indexing and caching mechanism could be developed based on the spatiotemporal pattern of user queries. In this paper, we use the GEOSS Clearinghouse as an example to investigate spatiotemporal patterns of user behavior in GCIs. The investigation results show that user behavior is heterogeneous but exhibits patterns across space and time. Identified patterns include (1) high-access-frequency regions; (2) local interests; (3) periodic accesses and rush hours; (4) spiking access. Based on the identified patterns, this presentation reports several solutions to better support the operation of the GEOSS Clearinghouse and other GCIs. Keywords: Big Data, EarthCube, CyberGIS, Spatiotemporal Thinking and Computing, Data Mining, User Behavior. References: Fayyad, U. M., Piatetsky-Shapiro, G., Smyth, P., & Uthurusamy, R. 1996. Advances in knowledge discovery and data mining. Whitehouse. 2012. Obama administration unveils 'BIG DATA' initiative: announces $200 million in new R&D investments. Whitehouse. Retrieved from http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf [Accessed 14 June 2013]. Yang, C., Wu, H., Huang, Q., Li, Z., & Li, J. 2011. Using spatial principles to optimize distributed computing for enabling the physical science discoveries. Proceedings of the National Academy of Sciences, 108(14), 5498-5503. doi:10.1073/pnas.0909315108. Yang, C., Raskin, R., Goodchild, M., & Gahegan, M. 2010. Geospatial Cyberinfrastructure: Past, present and future. Computers, Environment and Urban Systems, 34(4), 264-277. doi:10.1016/j.compenvurbsys.2010.04.001
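A minimal sketch of how such access-log mining might look, using a few synthetic log entries and pandas to surface rush hours and high-access-frequency regions; it is not the clearinghouse's actual analysis pipeline, and the log fields are assumptions.

```python
import pandas as pd

# Synthetic clearinghouse access log; real GEOSS logs are not reproduced here.
log = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2013-06-01 09:15", "2013-06-01 09:40", "2013-06-01 14:05",
        "2013-06-02 09:22", "2013-06-02 21:50", "2013-06-03 09:05",
    ]),
    "country": ["US", "US", "DE", "US", "CN", "US"],
})

# Rush hours: how many requests arrive in each hour of the day.
rush_hours = log["timestamp"].dt.hour.value_counts().sort_index()

# High-access-frequency regions: requests per country of origin.
hot_regions = log["country"].value_counts()

print(rush_hours)
print(hot_regions)
```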
NASA Astrophysics Data System (ADS)
Chaudhary, A.
2017-12-01
Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https://github.com/OpenDataAnalytics/gaia). In this presentation, we will discuss core features of each of these tools and present lessons learned on handling large data in the context of data management, analysis and visualization.
Aiello, Danielle P.; Torregrosa, Alicia; Jason, Allyson L.; Fuentes, Tracy L.; Josberger, Edward G.
2008-01-01
This report summarizes existing geospatial data and monitoring programs for the Puget Sound Basin in northwestern Washington. This information was assembled as a preliminary data-development task for the U.S. Geological Survey (USGS) Puget Sound Integrated Landscape Monitoring (PSILM) pilot project. The PSILM project seeks to support natural resource decision-making by developing a 'whole system' approach that links ecological processes at the landscape level to the local level (Benjamin and others, 2008). Part of this effort will include building the capacity to provide cumulative information about impacts that cross jurisdictional and regulatory boundaries, such as cumulative effects of land-cover change and shoreline modification, or region-wide responses to climate change. The PSILM project study area is defined as the 23 HUC-8 (hydrologic unit code) catchments that comprise the watersheds that drain into Puget Sound and their near-shore environments. The study area includes 13 counties and more than four million people. One goal of the PSILM geospatial database is to integrate spatial data collected at multiple scales across the Puget Sound Basin marine and terrestrial landscape. The PSILM work plan specifies an iterative process that alternates between tasks associated with data development and tasks associated with research or strategy development. For example, an initial work-plan goal was to delineate the study area boundary. Geospatial data required to address this task included data from ecological regions, watersheds, jurisdictions, and other boundaries. This assemblage of data provided the basis for identifying larger research issues and delineating the study-area boundary based on these research needs. Once the study-area boundary was agreed upon, the next iteration between data development and research activities was guided by questions about data availability, data extent, data abundance, and data types. This report is not intended as an exhaustive compilation of all available geospatial data; rather, it is a collection of information about geospatial data that can be used to help answer the suite of questions posed after the study-area boundary was defined. This information will also be useful to the PSILM team for future project tasks, such as assessing monitoring gaps, exploring monitoring-design strategies, identifying and deriving landscape indicators and metrics, and visual geographic communication. The two main geospatial data types referenced in this report - base-reference layers and monitoring data - originated from numerous and varied sources. In addition to collecting information and metadata about the base-reference layers, the data were also collected for project needs, such as developing maps for visual communication among team members and with outside groups. In contrast, only information about the data was typically required for the monitoring data. The information on base-reference layers and monitoring data included in this report is only as detailed as what was readily available from the sources themselves. Although this report may appear to lack consistency between data records, the varying degrees of detail contained in it are merely a reflection of varying source detail. This compilation is just a beginning. All data listed are also being catalogued in spreadsheets and knowledge-management systems. Our efforts are continual as we develop a geospatial catalog for the PSILM pilot project.
ERIC Educational Resources Information Center
Gaudet, Cyndi; Annulis, Heather; Kmiec, John
2010-01-01
The Geospatial Technology Apprenticeship Program (GTAP) pilot was designed as a replicable and sustainable program to enhance workforce skills in geospatial technologies to best leverage a $30 billion market potential. The purpose of evaluating GTAP was to ensure that investment in this high-growth industry was adding value. Findings from this…
USDA-ARS?s Scientific Manuscript database
The development of sensors that provide geospatial information on crop and soil conditions has been a primary success for precision agriculture. However, further developments are needed to integrate geospatial data into computer algorithms that spatially optimize crop production while considering po...
Toward Information Infrastructure Studies: Ways of Knowing in a Networked Environment
NASA Astrophysics Data System (ADS)
Bowker, Geoffrey C.; Baker, Karen; Millerand, Florence; Ribes, David
This article presents Information Infrastructure Studies, a research area that takes up some core issues in digital information and organization research. Infrastructure Studies simultaneously addresses the technical, social, and organizational aspects of the development, usage, and maintenance of infrastructures in local communities as well as global arenas. While infrastructure is understood as a broad category referring to a variety of pervasive, enabling network resources such as railroad lines, plumbing and pipes, electrical power plants and wires, this article focuses on information infrastructure, such as computational services and help desks, or federating activities such as scientific data repositories and archives spanning the multiple disciplines needed to address such issues as climate warming and the biodiversity crisis. These are elements associated with the internet and, frequently today, associated with cyberinfrastructure or e-science endeavors. We argue that a theoretical understanding of infrastructure provides the context for needed dialogue between design, use, and sustainability of internet-based infrastructure services. This article outlines a research area and outlines overarching themes of Infrastructure Studies. Part one of the paper presents definitions for infrastructure and cyberinfrastructure, reviewing salient previous work. Part two portrays key ideas from infrastructure studies (knowledge work, social and political values, new forms of sociality, etc.). In closing, the character of the field today is considered.
Geospatial Technology Strategic Plan 1997-2000
D'Erchia, Frank; D'Erchia, Terry D.; Getter, James; McNiff, Marcia; Root, Ralph; Stitt, Susan; White, Barbara
1997-01-01
Executive Summary -- Geospatial technology applications have been identified in many U.S. Geological Survey Biological Resources Division (BRD) proposals for grants awarded through internal and partnership programs. Because geospatial data and tools have become more sophisticated, accessible, and easy to use, BRD scientists frequently are using these tools and capabilities to enhance a broad spectrum of research activities. Bruce Babbitt, Secretary of the Interior, has acknowledged--and lauded--the important role of geospatial technology in natural resources management. In his keynote address to more than 5,500 people representing 87 countries at the Environmental Systems Research Institute Annual Conference (May 21, 1996), Secretary Babbitt stated, '. . .GIS [geographic information systems], if properly used, can provide a lot more than sets of data. Used effectively, it can help stakeholders to bring consensus out of conflict. And it can, by providing information, empower the participants to find new solutions to their problems.' This Geospatial Technology Strategic Plan addresses the use and application of geographic information systems, remote sensing, satellite positioning systems, image processing, and telemetry; describes methods of meeting national plans relating to geospatial data development, management, and serving; and provides guidance for sharing expertise and information. Goals are identified along with guidelines that focus on data sharing, training, and technology transfer. To measure success, critical performance indicators are included. The ability of the BRD to use and apply geospatial technology across all disciplines will greatly depend upon its success in transferring the technology to field biologists and researchers. The Geospatial Technology Strategic Planning Development Team coordinated and produced this document in the spirit of this premise. Individual Center and Program managers have the responsibility to implement the Strategic Plan by working within the policy and guidelines stated herein.
Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre
2017-07-01
As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the pace of research with spatially referenced human subjects data.
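As a rough illustration of the de-duplication use case, the sketch below blinds record keys with a keyed hash so two registries can intersect identifiers without exchanging the underlying names or locations. This is a deliberately simplified stand-in for the secure multi-party protocols the article develops, not the authors' scheme, and the key and record fields are hypothetical.

```python
import hashlib
import hmac

# Simplified keyed-hash matching: each registry hashes its record keys with a
# shared secret so identifiers can be compared for duplicates without
# exchanging the underlying attributes.
SHARED_KEY = b"agreed-out-of-band"

def blind(record_key: str) -> str:
    return hmac.new(SHARED_KEY, record_key.lower().encode(), hashlib.sha256).hexdigest()

registry_a = {blind("doe|jane|1960-04-01|ohio")}
registry_b = {blind("doe|jane|1960-04-01|ohio"), blind("roe|sam|1975-09-12|kentucky")}

duplicates = registry_a & registry_b
print(f"{len(duplicates)} candidate duplicate record(s) across registries")
```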
Carswell, William J.
2011-01-01
increases the efficiency of the Nation's geospatial community by improving communications about geospatial data, products, services, projects, needs, standards, and best practices. The NGP comprises seven major components (described below) that are managed as a unified set. For example, The National Map establishes data standards and identifies geographic areas where specific types of geospatial data need to be incorporated into The National Map. Partnership Network Liaisons work with Federal, State, local, and tribal partners to help acquire the data. Geospatial technical operations ensure the quality control, integration, and availability to the public of the data acquired. The Emergency Operations Office provides the requirements to The National Map and, during emergencies and natural disasters, provides rapid dissemination of information and data targeted to the needs of emergency responders. The National Atlas uses data from The National Map and other sources to make small-scale maps and multimedia articles about the maps.
Revelation of `Hidden' Balinese Geospatial Heritage on A Map
NASA Astrophysics Data System (ADS)
Soeria Atmadja, Dicky A. S.; Wikantika, Ketut; Budi Harto, Agung; Putra, Daffa Gifary M.
2018-05-01
Bali is not just about beautiful nature. It also has a unique and interesting cultural heritage, including `hidden' geospatial heritage. Tri Hita Karana is a Hindu concept of life consisting of the human relations to God, to other humans and to nature (Parahiyangan, Pawongan and Palemahan). Based on it - in terms of geospatial aspects - the Balinese derived their spatial orientation, spatial planning and layout, measurement, as well as color and typography. Introducing this particular heritage would be a very interesting contribution to Bali tourism. In response to these issues, a question arises on how to reveal this unique and highly valuable geospatial heritage on a map which can be used to introduce and disseminate it to tourists. Symbols (patterns and colors), orientation, distance, scale, layout and toponymy are well known as elements of a map. There is a chance to apply Balinese geospatial heritage in representing these map elements.
Advanced European Network of E-Infrastructures for Astronomy with the SKA
NASA Astrophysics Data System (ADS)
Massardi, Marcella
2017-11-01
Here, I present the AENEAS (Advanced European Network of E-infrastructures for Astronomy with the SKA) project, which has been funded under the Horizon 2020 Work Programme call "Research and Innovation Actions for International Co-operation on high-end e-infrastructure requirements" supporting the Square Kilometre Array (SKA). INAF is contributing to all the AENEAS work packages and is leading WP5 - Access and Knowledge Creation (WP leader M. Massardi, IRA-ARC), with participants from IRA (Brand, Nanni, Venturi), OACT (Becciani, Costa, Umana) and OATS (Smareglia, Knapic, Taffoni).
Earth Science Informatics - Overview
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2015-01-01
Over the last 10-15 years, significant advances have been made in information management, an increasing number of individuals are entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes nearly 150 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. Keywords: Remote Sensing; Earth Science Informatics; Data Systems; Data Services; Metadata
NASA Astrophysics Data System (ADS)
Kuldeep, K.
2016-02-01
In India, most rivers form large natural islands due to changes in their course. However, identification of river islands suitable for the construction of eco-friendly parks or tourist destinations is a very challenging task, since these islands are exposed to river flooding. River islands that are least vulnerable to the impact of severe flooding can be suitable places for constructing tourism destinations such as eco-friendly parks and hotels. The study involves a two-step approach, viz. automatic extraction of river islands and model development for flood inundation mapping, for the extraction of eco-friendly tourism destinations. In this study, automatic extraction of the river islands has been carried out using a knowledge-based classification approach. Satellite data acquired by Indian Remote Sensing Satellite sensors such as LISS-III, together with the Cartosat-1 DEM, have been used for the analyses. In the first step, satellite imagery has been broadly categorized into five landuse/cover classes, viz. Water, Sand, Islands, Settlements and Cropland. Extraction of those islands which remain unaffected during severe flooding has been accomplished with flood inundation mapping carried out in HEC-GeoRAS within a GIS environment. The model utilizes four primary inputs, viz. geometry of the river (DEM, slope), time series data of water surface elevation, landuse/cover, and location of rain gauge stations, for flood inundation mapping. This paper also investigates the applicability of the eco-island concept to include protection of wetlands, management of land resources, sustainable use of natural resources and construction of ecological parks/hotels. The output of the study will be very useful for government authorities in stabilizing the economy and enhancing the tourism infrastructure.
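The knowledge-based (rule-based) classification step can be illustrated with a toy two-band scene and made-up thresholds; the rules and values below only convey the style of approach, not the study's actual decision rules or data.

```python
import numpy as np

# Toy two-band scene (NIR and red reflectance); thresholds are invented and
# merely illustrate the rule-based ("knowledge-based") classification style.
nir = np.array([[0.03, 0.08, 0.45], [0.30, 0.50, 0.07], [0.25, 0.40, 0.45]])
red = np.array([[0.05, 0.30, 0.10], [0.28, 0.12, 0.05], [0.35, 0.09, 0.30]])

ndvi = (nir - red) / (nir + red + 1e-9)

classes = np.full(nir.shape, "Island", dtype=object)
classes[ndvi < 0.1] = "Water"                              # low NDVI
classes[(ndvi < 0.1) & (red > 0.25)] = "Sand"              # low NDVI, bright in red
classes[ndvi > 0.4] = "Cropland"                           # dense vegetation
classes[(ndvi >= 0.1) & (ndvi <= 0.4) & (red > 0.25)] = "Settlement"

print(classes)
```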
NASA Astrophysics Data System (ADS)
Kuldeep, K.
2015-12-01
In India, most rivers form large natural islands due to changes in their course. However, identification of suitable river islands is a very challenging task, since these islands are exposed to river flooding. River islands with the least vulnerability to the impacts of severe flooding can be suitable places for constructing tourism destinations such as eco-friendly parks and hotels. The study involves a two-step approach, viz. automatic extraction of river islands and model development for flood inundation mapping, for the extraction of eco-friendly tourism destinations. In this study, automatic extraction of the river islands has been carried out using a knowledge-based classification approach. Satellite data acquired by Indian Remote Sensing Satellite sensors such as LISS-III, together with the Cartosat-1 DEM, have been used for the analyses. In the first step, satellite imagery has been broadly categorized into five landuse/cover classes, viz. Water, Sand, Islands, Settlements and Cropland. Extraction of those islands which remain unaffected during severe flooding has been accomplished with flood inundation mapping carried out in HEC-GeoRAS within a GIS environment. The model utilizes four primary inputs, viz. geometry of the river (DEM, slope), time series data of water surface elevation, landuse/cover, and location of rain gauge stations, for flood inundation mapping. This paper also investigates the applicability of the eco-island concept to include protection of wetlands, management of land resources, sustainable use of natural resources and construction of ecological parks/hotels. The output of the study will be very helpful for government authorities in stabilizing the economy and enhancing the tourism infrastructure.
Earth Science Informatics - Overview
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2017-01-01
Over the last 10-15 years, significant advances have been made in information management, an increasing number of individuals are entering the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are on-going. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes over 180 organizations (government, academic and commercial) dedicated to managing, delivering and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating the ESI activities among the space agencies. The talk will present an overview of current efforts in ESI and the role members of IEEE GRSS play, and discuss recent developments in data preservation and provenance.
Visualization and Ontology of Geospatial Intelligence
NASA Astrophysics Data System (ADS)
Chan, Yupo
Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced in the prevalence of location-based services, afforded by ubiquitous cell phone usage. It is also manifested by the popularity of such internet engines as Google Earth. As we commute to work or travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.
Intelligent services for discovery of complex geospatial features from remote sensing imagery
NASA Astrophysics Data System (ADS)
Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo
2013-09-01
Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.
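The spatial-composition idea can be sketched with a few toy geometries: elementary features become candidate components of a complex feature when they satisfy distance and containment rules. The layout, threshold and rule below are hypothetical, not the paper's ontology or workflow.

```python
from shapely.geometry import Point, Polygon

# Elementary features as simple geometries (hypothetical coordinates in metres
# of a local grid); in the paper these would come from distributed WFSs.
buildings = [Point(50, 50).buffer(20), Point(400, 400).buffer(20)]
cooling_towers = [Point(90, 60).buffer(8)]
fence = Polygon([(0, 0), (200, 0), (200, 200), (0, 200)])

def looks_like_complex_facility(building, towers, perimeter, max_dist=60.0):
    """Toy composition rule: a building near a cooling tower, both inside a
    common fenced perimeter, is flagged as a candidate component of a complex
    feature. Illustrates the idea of spatial-semantic rules only."""
    near_tower = any(building.distance(t) <= max_dist for t in towers)
    enclosed = building.within(perimeter) and all(t.within(perimeter) for t in towers)
    return near_tower and enclosed

for i, b in enumerate(buildings):
    if looks_like_complex_facility(b, cooling_towers, fence):
        print(f"building {i} is a candidate component of a complex facility")
```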
Finding geospatial pattern of unstructured data by clustering routes
NASA Astrophysics Data System (ADS)
Boustani, M.; Mattmann, C. A.; Ramirez, P.; Burke, W.
2016-12-01
Today the majority of data generated has a geospatial context to it, whether in attribute form as a latitude or longitude, as the name of a location, or cross-referenceable using other means such as an external gazetteer or location service. Our research is interested in exploiting geospatial location and context in unstructured data such as that found on the web in HTML pages, images, videos, documents, and other areas, and in structured information repositories found on intranets, in scientific environments, and elsewhere. We are working together on the DARPA MEMEX project to exploit open-source software tools such as the Lucene Geo Gazetteer, Apache Tika, Apache Lucene, and Apache OpenNLP to automatically extract, and make meaning out of, geospatial information. In particular, we are interested in unstructured descriptors, e.g. a phone number or a named entity, and the ability to automatically learn geospatial paths related to these descriptors. For example, a particular phone number may represent an entity that travels on a monthly basis, according to easily identifiable and sometimes more difficult to track patterns. We will present a set of automatic techniques to extract descriptors and then geospatially infer their paths across unstructured data.
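One way to picture the path-learning step: group geotagged mentions of a descriptor by time and read off the sequence of places. The records below are synthetic and the logic is only a sketch, not the MEMEX pipeline or its tools.

```python
from collections import defaultdict

# Synthetic (descriptor, timestamp, place) tuples such as might be produced by
# entity extraction plus gazetteer lookup; not actual pipeline output.
mentions = [
    ("+1-555-0100", "2016-01-05", "Los Angeles"),
    ("+1-555-0100", "2016-02-07", "Las Vegas"),
    ("+1-555-0199", "2016-01-20", "Houston"),
    ("+1-555-0100", "2016-03-02", "Phoenix"),
]

paths = defaultdict(list)
for descriptor, when, place in sorted(mentions, key=lambda m: m[1]):
    paths[descriptor].append((when, place))

for descriptor, hops in paths.items():
    route = " -> ".join(place for _, place in hops)
    print(f"{descriptor}: {route}")
```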
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
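For readers unfamiliar with such chains, the sketch below shows the two-step data flow a geospatial BPEL workflow typically encodes, a WFS GetFeature whose GML output feeds a WPS, expressed as plain HTTP calls against hypothetical endpoints and layer names; a BPEL engine such as the one described would express this declaratively rather than as a script.

```python
import requests

# Hypothetical endpoints; an engine like the one described above orchestrates
# such calls from a declarative workflow document instead of a script.
WFS_URL = "https://example.org/wfs"
WPS_URL = "https://example.org/wps"

# Step 1: retrieve features (as GML) from a Web Feature Service.
gml = requests.get(WFS_URL, params={
    "service": "WFS", "version": "1.1.0", "request": "GetFeature",
    "typeName": "demo:watersheds",
}, timeout=30).text

# Step 2: pass the intermediate GML to a Web Processing Service. The exact
# Execute payload is service-specific and omitted; it is posted as-is here
# purely to show the data flow between the two services.
resp = requests.post(WPS_URL, data=gml,
                     headers={"Content-Type": "application/xml"}, timeout=60)
print("workflow step 2 returned HTTP", resp.status_code)
```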
National Geographic FieldScope: Tools for Engaging a Range of Audiences in Citizen Science
NASA Astrophysics Data System (ADS)
OConnor, S.; Takaki, E.
2013-12-01
Recognizing the promise of projects that engage non-scientists in scientific research as a context for informal science learning, National Geographic set out in 2009 to develop a technology infrastructure to support public participation in scientific research (PPSR), or citizen science, projects. As a result, NG has developed a web-based platform called FieldScope to host projects in which geographically distributed participants submit local observations or measurements to a shared database. This project is motivated by the observation that historically citizen science initiatives have been siloed using different technologies, and that these projects rarely provide participants with the opportunity to participate in data analysis or any other aspects of the scientific process except for collecting and contributing data. Therefore, FieldScope has been designed to support data visualization and analysis using geospatial technologies and aims to develop social networking tools for communicating and discussing findings. Since educational impact is the project's primary priority, FieldScope is also being designed with usability by novices in mind. In addition to engaging novices in participation in citizen science, the design of the application is also meant to engage students and others in working with geospatial technologies, in this case, web-based GIS. The project's goal is to create a single, powerful infrastructure for PPSR projects that any organization can use to create their own project and support their own community of participants. The FieldScope environment will serve as a hosting environment for PPSR projects following the model of hosted communities of practice that has become widespread on the web. The goal is to make FieldScope a publicly-available resource for any PPSR project on a no- or low-cost basis. It will also make synergies possible between projects that are collecting related data in the same geographic area. NG is now in the fourth year of an NSF grant to bring this vision for FieldScope to life. The project is structured around key collaborations with 'conveners' of existing citizen science initiatives. These existing initiatives include Project BudBurst, the Association of Zoos and Aquariums FrogWatch USA, and the Alice Ferguson Fund's Trash Free Potomac Initiative. These groups are serving as testbed partners, building their citizen science projects with the FieldScope development tools and hosting their communities within the FieldScope infrastructure. Through outcomes research and evaluation, these testbeds will provide much-needed evidence about the value of citizen science for learning and the conditions that can maximize those outcomes. Presenters will share findings to date of the project, including a demonstration of the technology using examples from our convening partners from the NSF project, as well as other communities using FieldScope, ranging from citizen science initiatives in the Yukon River watershed aimed at engaging indigenous Alaskan populations to a wide-spread initiative across the Chesapeake Bay watershed designed for students and environmental education program participants.
Information gathering, management and transferring for geospatial intelligence
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-07-01
Information is a key subject in modern organization operations. The success of joint and combined operations with organizational partners depends on the accurate information and knowledge flow concerning the operations theatre: provision of resources, environment evolution, market locations, where and when an event occurred. Now as in the past, we cannot conceive of modern operations without maps and geospatial information (GI). Information and knowledge management is fundamental to the success of organizational decisions in an uncertain environment. Georeferenced information management is a knowledge-management process: it begins with raw data and ends with generated knowledge. GI and intelligence systems allow us to integrate all other forms of intelligence and can be a main platform to process and display geospatial-time-referenced events. Combining explicit knowledge with people's know-how generates a continuous learning cycle that supports real-time decisions, mitigates the fog of everyday competition and provides knowledge supremacy. Extending the preliminary analysis done in [1], this work applies exploratory factor analysis to a questionnaire about GI and intelligence management in an organization, allowing future lines of action to be identified for improving information sharing and exploiting the full potential of this important resource.
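The exploratory factor analysis step can be sketched with synthetic Likert-style responses and scikit-learn; the generated data and two-factor structure below are assumptions for illustration, not the survey's items or results.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Synthetic questionnaire responses (60 respondents x 6 items) generated from
# two latent factors, standing in for the questionnaire data analysed above.
latent = rng.normal(size=(60, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2],
                     [0.1, 0.9], [0.0, 0.8], [0.2, 0.7]])
responses = latent @ loadings.T + rng.normal(scale=0.3, size=(60, 6))

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)
print(np.round(fa.components_, 2))   # estimated loadings per item
```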
Nickerson, David; Atalag, Koray; de Bono, Bernard; Geiger, Jörg; Goble, Carole; Hollmann, Susanne; Lonien, Joachim; Müller, Wolfgang; Regierer, Babette; Stanford, Natalie J; Golebiewski, Martin; Hunter, Peter
2016-04-06
Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome.
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: components used to store, manage, and retrieve data. Data management includes knowledge bases, database management ... Application Development Tools and Methods: X/Open and POSIX APIs, Integrated Design Support System (IDS), Knowledge-Based Systems (KBS), Application ... IDEF1x), Yourdon, Jackson System Design (JSD), Knowledge-Based Systems (KBSs), Structured Systems Development (SSD), Semantic Unification Meta-Model
Lindsay K. Campbell; Erika S. Svendsen; Lara A. Roman
2016-01-01
Cities are increasingly engaging in sustainability efforts and investment in green infrastructure, including large-scale urban tree planting campaigns. In this context, researchers and practitioners are working jointly to develop applicable knowledge for planning and managing the urban forest. This paper presents three case studies of knowledge co-production in the...
NASA Astrophysics Data System (ADS)
Une, Hiroshi; Nakano, Takayuki
2018-05-01
Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to recent developments in geospatial information technology, this information has become even more essential for disaster response activities. Advances in web mapping technology allow us to better understand the situation by overlaying various location-specific data on web base maps and identifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disasters and topography. Geospatial information technology can support proper preparation for, and emergency responses to, disasters by individuals and local communities through hazard mapping and other information services on mobile devices. Geospatial information technology is thus playing an increasingly vital role at all stages of disaster risk management and response. Acknowledging this vital role, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly highlights the importance of utilizing geospatial information technology for disaster risk reduction. This presentation reports recent practical applications of geospatial information technology for disaster risk management and response.
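As an illustration of the web-map overlay workflow mentioned above, the following is a minimal sketch assuming the folium Python library and made-up point data; the locations and layers are hypothetical and do not come from the presentation.

# Illustrative sketch only: overlaying location-specific data on a web base map.
# The library choice (folium) and the sample points are assumptions.
import folium

# Hypothetical shelter locations (latitude, longitude, label).
shelters = [
    (38.2682, 140.8694, "Shelter A"),
    (38.2601, 140.8820, "Shelter B"),
]

m = folium.Map(location=[38.2682, 140.8694], zoom_start=12)   # web base map
for lat, lon, name in shelters:
    folium.Marker([lat, lon], popup=name).add_to(m)           # overlay the point data

m.save("disaster_overlay.html")  # open in a browser to inspect the overlay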
Mapping a Difference: The Power of Geospatial Visualization
NASA Astrophysics Data System (ADS)
Kolvoord, B.
2015-12-01
Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students who want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GSTs to hone their spatial thinking skills and to carry out extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem-solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of these technologies on students' spatial thinking skills, and discuss the successes and challenges of developing K-12 partnerships centered on geospatial visualization.
ERIC Educational Resources Information Center
Special Libraries Association, New York, NY.
These conference proceedings address the key issues relating to the National Information Infrastructure, including social policy, cultural issues, government policy, and technological applications. The goal is to provide the knowledge and resources needed to conceptualize and think clearly about this topic. Proceedings include: "Opening…
NASA Astrophysics Data System (ADS)
Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.
2016-06-01
Unmanaged placement, size, location, structure and content of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in the urban centres of Pakistan. Under the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, the responsible authorities are hampered in enforcing them effectively by the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for regularizing advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on the provisions and restrictions prescribed in regulatory documents, while the user interface supports visualization and scenario evaluation, showing whether a new board would affect the existing linear density on a particular road and whether it would violate any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built with open geospatial tools, including OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets, namely the road network, the locations of existing billboards, and building parcels with land-use information, to perform the analysis. Locational suitability is calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS that assists in solving space-related iterative decision challenges concerning outdoor advertisements. Employing such a system will lead to more effective implementation of regulations, resulting in visual harmony and aesthetic improvement in urban communities.
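To illustrate the scoring step, the following is a minimal sketch of AHP weight derivation (the principal eigenvector of a pairwise comparison matrix) followed by a weighted linear combination; the criteria, judgement values and candidate scores are invented for illustration and are not taken from the paper.

# Illustrative sketch only: AHP weights plus weighted linear combination (WLC)
# for scoring one candidate billboard site. Criteria, pairwise judgements and
# scores are assumptions, not values from the study.
import numpy as np

# Pairwise comparison matrix for three hypothetical criteria:
# road linear density, distance to restricted land uses, visibility.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# AHP weights: principal eigenvector of A, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Normalised criterion scores (0-1) for one candidate location.
scores = np.array([0.4, 0.9, 0.7])

# WLC: overall suitability is the weighted sum of the criterion scores.
suitability = float(weights @ scores)
print("weights:", np.round(weights, 3), "suitability:", round(suitability, 3))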
Lilley, Rebbecca; Kool, Bridget; Davie, Gabrielle; de Graaf, Brandon; Ameratunga, Shanthi N; Reid, Pararangi; Civil, Ian; Dicker, Bridget; Branas, Charles C
2017-02-09
Traumatic injury is a leading cause of premature death and health loss in New Zealand. Outcomes following injury are highly time-sensitive, and timely access of critically injured patients to advanced hospital trauma care services can improve injury survival. This cross-sectional study will investigate, for the first time in New Zealand, the epidemiology and geographic location of prehospital fatal injuries in relation to access to prehospital emergency services. Electronic coronial case files for the period 2008-2012 will be reviewed to identify cases of prehospital fatal injury across New Zealand. The project will combine epidemiological and geospatial methods in three research phases: (1) identification, enumeration, description and geocoding of prehospital injury deaths using existing electronic injury data sets; (2) geocoding of advanced hospital-level care providers and emergency land and air ambulance services to determine the current theoretical service coverage within a specified time period; and (3) synthesis of the information from phases 1 and 2 using geospatial methods to determine the number of prehospital injury deaths located in areas without timely access to advanced-level hospital care. The findings of this research will identify opportunities to optimise access to advanced-level hospital care in New Zealand and so increase the chances of survival from serious injury. The resulting epidemiological and geospatial analyses will advance knowledge for injury prevention and health service quality improvement towards better patient outcomes following serious injury in New Zealand and similar countries.
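As a sketch of the kind of geospatial synthesis planned in phases 2 and 3, the following Python fragment flags hypothetical incident locations that fall outside a notional hospital coverage area; the coordinates are invented, shapely is an assumed tool choice, and a straight-line buffer merely stands in for the travel-time modelling a real analysis would use.

# Illustrative sketch only: which geocoded incident locations fall outside a
# notional service-coverage area. Coordinates are hypothetical (projected, metres)
# and the 30 km buffer is a stand-in for proper travel-time coverage.
from shapely.geometry import Point
from shapely.ops import unary_union

hospitals = [Point(0, 0), Point(50_000, 20_000)]               # advanced-care hospitals
coverage = unary_union([h.buffer(30_000) for h in hospitals])  # combined reach

# Hypothetical geocoded prehospital injury deaths.
incidents = {"case_1": Point(5_000, 4_000), "case_2": Point(90_000, 80_000)}

for case_id, location in incidents.items():
    status = "within" if coverage.contains(location) else "outside"
    print(f"{case_id}: {status} timely-access coverage")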
A high-precision rule-based extraction system for expanding geospatial metadata in GenBank records.
Tahsin, Tasnia; Weissenbacher, Davy; Rivera, Robert; Beard, Rachel; Firago, Mari; Wallstrom, Garrick; Scotch, Matthew; Gonzalez, Graciela
2016-09-01
The metadata reflecting the location of the infected host (LOIH) of virus sequences in GenBank often lacks specificity. This work seeks to enhance this metadata by extracting more specific geographic information from related full-text articles and mapping it to latitudes/longitudes using knowledge derived from external geographical databases. We developed a rule-based information extraction framework for linking GenBank records to the latitude/longitude of the LOIH. Our system first extracts existing geospatial metadata from GenBank records and attempts to improve it by seeking additional, relevant geographic information from text and tables in related full-text PubMed Central articles. The final extracted locations of the records, based on data assimilated from these sources, are then disambiguated and mapped to their respective geo-coordinates. We evaluated our approach on a manually annotated dataset comprising 5728 GenBank records for the influenza A virus. We found the precision, recall, and F-measure of our system for linking GenBank records to the latitudes/longitudes of their LOIH to be 0.832, 0.967, and 0.894, respectively. Our system had a high level of accuracy for linking GenBank records to the geo-coordinates of the LOIH. However, it can be further improved by expanding our database of geospatial data, incorporating spell correction, and enhancing the rules used for extraction. Overall, the system performs reasonably well for linking GenBank records for the influenza A virus to the geo-coordinates of their LOIH based on record metadata and information extracted from related full-text articles.
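To make the extraction and evaluation steps concrete, the following is a minimal sketch of a rule-based gazetteer lookup and of the precision/recall/F-measure arithmetic used to score such a system; the gazetteer entries, matching rule and counts are hypothetical and are not the authors' rules or data.

# Illustrative sketch only: a toy rule that maps place names found in record text to
# gazetteer coordinates, plus precision/recall/F1 arithmetic. All values are assumptions.
import re

GAZETTEER = {  # hypothetical place -> (latitude, longitude)
    "shanghai": (31.2304, 121.4737),
    "hong kong": (22.3193, 114.1694),
}

def extract_coordinates(text):
    """Return (place, (lat, lon)) pairs for gazetteer names matched in free text."""
    text = text.lower()
    return [(place, coords) for place, coords in GAZETTEER.items()
            if re.search(r"\b" + re.escape(place) + r"\b", text)]

print(extract_coordinates("Influenza A virus (A/Shanghai/02/2013(H7N9)) segment 4"))

# Evaluation: precision = tp / (tp + fp), recall = tp / (tp + fn),
# F1 = harmonic mean of precision and recall. The counts below are hypothetical.
tp, fp, fn = 90, 18, 3
precision, recall = tp / (tp + fp), tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 3), round(recall, 3), round(f1, 3))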