Sample records for geospatial decision support

  1. Decision Performance Using Spatial Decision Support Systems: A Geospatial Reasoning Ability Perspective

    ERIC Educational Resources Information Center

    Erskine, Michael A.

    2013-01-01

    As many consumer and business decision makers are utilizing Spatial Decision Support Systems (SDSS), a thorough understanding of how such decisions are made is crucial for the information systems domain. This dissertation presents six chapters encompassing a comprehensive analysis of the impact of geospatial reasoning ability on…

  2. Supporting Timely Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools

    DTIC Science & Technology

    2014-05-22

    attempted to respond to the advances in technology and the growing power of geographical information system (GIS) tools. However, the doctrine...Geospatial intelligence (GEOINT), Geographical information systems (GIS) tools, Humanitarian Assistance/Disaster Relief (HA/DR), 2010 Haiti Earthquake...Humanitarian Assistance/Disaster Relief (HA/DR) Decisions Through Geospatial Intelligence (GEOINT) and Geographical Information Systems (GIS) Tools

  3. Data to Decisions: Valuing the Societal Benefit of Geospatial Information

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Kain, D.

    2016-12-01

    The March 10-11, 2016 GEOValue workshop on "Data to Decisions" was aimed at creating a framework for the identification and implementation of best practices that capture the societal value of geospatial information for both public and private uses. The end-to-end information flow starts with the earth observation and data acquisition systems, includes the full range of processes from geospatial information to decision support systems, and concludes with the end user. Case studies, which will be described in this presentation, were identified for a range of applications. The goal was to demonstrate and compare approaches to valuation of geospatial information and forge a path forward for research that leads to standards of practice.

  4. Leveraging the geospatial advantage

    Treesearch

    Ben Butler; Andrew Bailey

    2013-01-01

    The Wildland Fire Decision Support System (WFDSS) web-based application leverages geospatial data to inform strategic decisions on wildland fires. A specialized data team, working within the Wildland Fire Management Research Development and Application group (WFM RD&A), assembles authoritative national-level data sets defining values to be protected. The use of...

  5. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making, and then directly to the mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain situational awareness of potential and current conditions, possible impacts on populations and infrastructure, and other key information. E-DECIDER and the Clearinghouse have worked together to address many of these issues and challenges to deliver interoperable, authoritative decision support products.

  6. Quantum Leap in Cartography as a requirement of Sustainable Development of the World

    NASA Astrophysics Data System (ADS)

    Tikunov, Vladimir S.; Tikunova, Iryna N.; Eremchenko, Eugene N.

    2018-05-01

    Sustainable development is one of the most important challenges for humanity and one of the priorities of the United Nations. Achieving sustainability of the whole World is a main goal of management at all levels, from personal to local to global. Therefore, decision making should be supported by a relevant geospatial information system. Nevertheless, classical geospatial products, maps and GIS, violate a fundamental demand of the 'situational awareness' concept, a well-known philosophy of decision-making: the same representation of the situation within the same volume of time and space for all decision-makers. Basic mapping principles like generalization and projections split the universal single model of the situation into a number of separate and inconsistent replicas. This leads to a wrong understanding of the situation and, ultimately, to incorrect decisions. In other words, the quality of sustainable development depends on effective decision-making support based on a universal, global, scale-independent and projection-independent model. This new way of interacting with geospatial information is a quantum leap in the cartographic method. It is implemented in the so-called 'Digital Earth' paradigm and geospatial services like Google Earth. A comparison of both methods, as well as possibilities for implementing Digital Earth in sustainable development activities, is discussed.

  7. Data for Renewable Energy Planning, Policy, and Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L

    Reliable, robust, and validated data are critical for informed planning, policy development, and investment in the clean energy sector. The Renewable Energy (RE) Explorer was developed to support data-driven renewable energy analysis that can inform key renewable energy decisions globally. This document presents the types of geospatial and other data at the core of renewable energy analysis and decision making. Individual data sets used to inform decisions vary in relation to spatial and temporal resolution, quality, and overall usefulness. From Data to Decisions, a complementary geospatial data and analysis decision guide, provides an in-depth view of these and other considerations to enable data-driven planning, policymaking, and investment. Data support a wide variety of renewable energy analyses and decisions, including technical and economic potential assessment, renewable energy zone analysis, grid integration, risk and resiliency identification, electrification, and distributed solar photovoltaic potential. This fact sheet provides information on the types of data that are important for renewable energy decision making using the RE Data Explorer or similar types of geospatial analysis tools.

  8. Improving the Slum Planning Through Geospatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Shekhar, S.

    2014-11-01

    In India, a number of schemes and programmes have been launched from time to time to promote integrated city development and to enable slum dwellers to gain access to basic services. Despite the use of geospatial technologies in planning, the local, state and central governments have been only partially successful in dealing with these problems. The study of existing policies and programmes also showed that when the government is the sole provider or mediator, GIS can become a tool of coercion rather than of participatory decision-making. It has also been observed that local-level administrators who have adopted geospatial technology for local planning continue to base decision-making on existing political processes. At this juncture, a geospatial decision support system (GSDSS) can provide a framework for integrating database management systems with analytical models, graphical display, tabular reporting capabilities and the expert knowledge of decision makers. This assists decision-makers in generating and evaluating alternative solutions to spatial problems. During this process, decision-makers undertake decision research, producing a large number of possible decision alternatives and providing opportunities to involve the community in decision making. The objective is to help decision makers and planners find solutions through a quantitative spatial evaluation and verification process. The study investigates options for slum development within the formal framework of RAY (Rajiv Awas Yojana), an ambitious programme of the Indian Government for slum development. The software modules for realizing the GSDSS were developed using the ArcGIS and Community-VIZ software for Gulbarga city.

  9. School Mapping and Geospatial Analysis of the Schools in Jasra Development Block of India

    NASA Astrophysics Data System (ADS)

    Agrawal, S.; Gupta, R. D.

    2016-06-01

    GIS is a collection of tools and techniques that work on geospatial data and are used in analysis and decision making. Education is an inherent part of any civil society. Proper educational facilities generate high-quality human resources for any nation. Therefore, government needs an efficient system that can help in analysing the current state of education and its progress. Government also needs a system that can support decision making and policy framing. GIS can serve these requirements not only for government but also for the general public. In order to meet the standards of human development, it is necessary for the government and decision makers to keep a close watch on the existing education policy and its implementation. School mapping plays an important role in this respect. School mapping consists of building a geospatial database of schools that supports infrastructure development, policy analysis and decision making. The present research work is an attempt to support the Right to Education (RTE) and Sarv Sikha Abhiyaan (SSA) programmes run by the Government of India through the use of GIS. School mapping of the study area is performed, followed by geospatial analysis. This work will help in assessing the present status of educational infrastructure in the Jasra block of Allahabad district, India.
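
    As one hedged illustration of the kind of geospatial analysis such a school database supports, the sketch below flags habitations lying within a 1 km service area of any school; the file names, layer contents, and distance threshold are assumptions, not the study's actual data or criteria.

```python
# Illustrative service-area check for a school-mapping database. File names,
# layer contents, and the 1 km threshold are assumptions for this sketch only.
import geopandas as gpd

schools = gpd.read_file("jasra_schools.gpkg")          # hypothetical point layer of schools
habitations = gpd.read_file("jasra_habitations.gpkg")  # hypothetical point layer of habitations

# Reproject to a metric CRS so buffer distances are in metres (UTM 44N covers Allahabad).
schools = schools.to_crs(epsg=32644)
habitations = habitations.to_crs(epsg=32644)

# Habitations within 1 km of any school are treated as "served".
service_area = schools.buffer(1000).unary_union
habitations["served"] = habitations.geometry.within(service_area)
print(habitations["served"].value_counts())
```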

  10. Improved satellite and geospatial tools for pipeline operator decision support systems.

    DOT National Transportation Integrated Search

    2017-01-06

    Under Cooperative Agreement No. OASRTRS-14-H-CAL, California Polytechnic State University San Luis Obispo (Cal Poly), partnered with C-CORE, MDA, PRCI, and Electricore to design and develop improved satellite and geospatial tools for pipeline operato...

  11. Decision Modeling for Socio-Cultural Data

    DTIC Science & Technology

    2011-02-01

    REFERENCES [1] Malczewski, J. (1999) GIS and Multicriteria Decision Analysis. John Wiley and Sons, New York. [2] Ehrgott, M., and Gandibleux, X. (Eds...up, nonexclusive, irrevocable worldwide license to use, modify, reproduce, release, perform, display, or disclose the work by or on behalf of the...criteria decision analysis (MCDA), into a geospatial environment to support decision making for campaign management. Our development approach supports

  12. USER-CUSTOMIZED ENVIRONMENTAL MAPPING AND DECISION SUPPORT USING NASA WORLD WIND AND DOE GENIE PRO SOFTWARE

    EPA Science Inventory

    Effective environmental stewardship requires timely geospatial information about ecology and environment for informed environmental decision support. Unprecedented public access to high resolution imagery from earth-looking sensors via online virtual earth browsers ...

  13. Geospatial decision support framework for critical infrastructure interdependency assessment

    NASA Astrophysics Data System (ADS)

    Shih, Chung Yan

    Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems and emergency services, are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruption in the supply chain may cause profound cascading effects on other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and that the availability of electric power could be reduced within one to two months. Therefore, this research aimed at representing and assessing the interdependencies between coal supply, transportation and energy production. A geospatial decision support framework was proposed, established and applied to analyze interdependency-related disruption impacts. By utilizing a data warehousing approach, geospatial and non-geospatial data were retrieved, integrated and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usage; finding alternative coal suppliers).
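
    As a toy illustration of the cascading-disruption idea (a sketch only, not the framework's actual transportation model), infrastructure dependencies can be held in a directed graph and the cascade read off as the set of downstream nodes:

```python
# Toy cascading-disruption sketch using a dependency graph. The node names,
# the coal/rail/power topology, and the "everything downstream is impacted"
# rule are illustrative assumptions, not the framework's transportation model.
import networkx as nx

g = nx.DiGraph()
# Edges point from a supplier to the infrastructure that depends on it.
g.add_edge("coal_mine_A", "rail_line_1")
g.add_edge("rail_line_1", "power_plant_X")
g.add_edge("power_plant_X", "county_42_grid")

def impacted(graph: nx.DiGraph, disrupted: str) -> set:
    """Return every node downstream of the disrupted component (its cascade set)."""
    return set(nx.descendants(graph, disrupted))

# A rail disruption cascades to the plant it feeds and the county that plant serves.
print(impacted(g, "rail_line_1"))   # {'power_plant_X', 'county_42_grid'}
```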

  14. An Environmental Decision Support System for Spatial Assessment and Selective Remediation

    EPA Science Inventory

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates environmental assessment tools for effective problem-solving. The software integrates modules for GIS, visualization, geospatial analysis, statistical analysis, human health and ecolog...

  15. Economic assessment of the use value of geospatial information

    USGS Publications Warehouse

    Bernknopf, Richard L.; Shapiro, Carl D.

    2015-01-01

    Geospatial data inform decision makers. An economic model that involves application of spatial and temporal scientific, technical, and economic data in decision making is described. The value of information (VOI) contained in geospatial data is the difference between the net benefits (in present value terms) of a decision with and without the information. A range of technologies is used to collect and distribute geospatial data. These technical activities are linked to examples that show how the data can be applied in decision making, which is a cultural activity. The economic model for assessing the VOI in geospatial data for decision making is applied to three examples: (1) a retrospective model about environmental regulation of agrochemicals; (2) a prospective model about the impact and mitigation of earthquakes in urban areas; and (3) a prospective model about developing private–public geospatial information for an ecosystem services market. Each example demonstrates the potential value of geospatial information in a decision with uncertain information.
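
    A compact notational sketch of that definition may help; the symbols below are assumptions introduced for illustration, not the paper's own notation.

```latex
% Hedged notational sketch of the VOI definition quoted above.
%   NB_{I,t} : net benefit in year t of the decision made with the geospatial information
%   NB_{0,t} : net benefit in year t of the decision made without it
%   r, T     : discount rate and decision horizon, so both terms are in present-value form
\[
  \mathrm{VOI} \;=\; \sum_{t=0}^{T} \frac{\mathbb{E}\!\left[NB_{I,t}\right] - \mathbb{E}\!\left[NB_{0,t}\right]}{(1+r)^{t}}
\]
```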

  16. Combining forest inventory, satellite remote sensing, and geospatial data for mapping forest attributes of the conterminous United States

    Treesearch

    Mark Nelson; Greg Liknes; Charles H. Perry

    2009-01-01

    Analysis and display of forest composition, structure, and pattern provides information for a variety of assessments and management decision support. The objective of this study was to produce geospatial datasets and maps of conterminous United States forest land ownership, forest site productivity, timberland, and reserved forest land. Satellite image-based maps of...

  17. NASA's Geospatial Interoperability Office(GIO)Program

    NASA Technical Reports Server (NTRS)

    Weir, Patricia

    2004-01-01

    NASA produces vast amounts of information about the Earth from satellites, supercomputer models, and other sources. These data are most useful when made easily accessible to NASA researchers and scientists, to NASA's partner Federal Agencies, and to society as a whole. A NASA goal is to apply its data for knowledge gain, decision support and understanding of Earth and other planetary systems. The NASA Earth Science Enterprise (ESE) Geospatial Interoperability Office (GIO) Program leads the development, promotion and implementation of information technology standards that accelerate and expand the delivery of NASA's Earth system science research through integrated systems solutions. Our overarching goal is to make it easy for decision-makers, scientists and citizens to use NASA's science information. NASA's Federal partners currently participate with NASA and one another in the development and implementation of geospatial standards to ensure the most efficient and effective access to one another's data. Through the GIO, NASA participates with its Federal partners in implementing interoperability standards in support of E-Gov and the associated President's Management Agenda initiatives by collaborating on standards development. Through partnerships with government, private industry, education and communities, the GIO works towards enhancing the ESE Applications Division in the area of National Applications and decision support systems. The GIO provides geospatial standards leadership within NASA, represents NASA on the Federal Geographic Data Committee (FGDC) Coordination Working Group, chairs the FGDC's Geospatial Applications and Interoperability Working Group (GAI), and supports development and implementation efforts such as the Earth Science Gateway (ESG), the Space Time Tool Kit and the Web Map Services (WMS) Global Mosaic. The GIO supports NASA in the collection and dissemination of geospatial interoperability standards needs and progress throughout the agency, including areas such as ESE Applications, the SEEDS Working Groups, the Facilities Engineering Division (Code JX) and NASA's Chief Information Offices (CIO). With these agency-level requirements, the GIO leads, brokers and facilitates efforts to develop, implement, influence and fully participate in standards development internationally, federally and locally. The GIO also represents NASA in the OpenGIS Consortium (OGC) and ISO TC211. The OGC has made considerable progress with regard to relations with other open standards bodies, namely ISO, W3C and OASIS. ISO TC211 is the Geographic Information/Geomatics technical committee that works towards standardization in the field of digital geographic information. The GIO focuses on seamless access to data, applications of data, and enabling technologies furthering the interoperability of distributed data. Through teaming within the Applications Directorate and partnerships with government, private industry, education and communities, the GIO works towards the data application goals of NASA, the ESE Applications Directorate, and our Federal partners by managing projects in four categories: Geospatial Standards and Leadership, Geospatial One Stop, Standards Development and Implementation, and National and NASA Activities.

  18. An Intelligent Polar Cyberinfrastructure to Support Spatiotemporal Decision Making

    NASA Astrophysics Data System (ADS)

    Song, M.; Li, W.; Zhou, X.

    2014-12-01

    In the era of big data, the polar sciences face an urgent demand for intelligent approaches to support precise and effective spatiotemporal decision-making. A service-oriented cyberinfrastructure has the advantages of seamlessly integrating distributed computing resources and aggregating a variety of geospatial data derived from Earth observation networks. This paper focuses on building a smart service-oriented cyberinfrastructure to support intelligent question answering over polar datasets. The innovations of this polar cyberinfrastructure include: (1) a problem-solving environment that parses geospatial questions in natural language, builds geoprocessing rules, composes atomic processing services and executes the entire workflow; (2) a self-adaptive spatiotemporal filter that is capable of refining query constraints through semantic analysis; (3) a dynamic visualization strategy to support results animation and statistics in multiple spatial reference systems; and (4) a user-friendly online portal to support collaborative decision-making. By means of this polar cyberinfrastructure, we intend to facilitate the integration of distributed and heterogeneous Arctic datasets and the comprehensive analysis of multiple environmental elements (e.g. snow, ice, permafrost) to provide a better understanding of environmental variation in circumpolar regions.

  19. Creating Ecosystem Services Indices with EnviroAtlas Metrics

    EPA Science Inventory

    To support the well-being of future generations, ecosystem services (ES) need to be fully understood and evaluated by decision-makers. Geospatial tools, such as the EnviroAtlas, allow decision-makers, urban planners, public health professionals, and other stakeholders to view and...

  20. All Source Solution Decision Support Products Created for Stennis Space Center in Response to Hurricane Katrina

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; Graham, William D.

    2007-01-01

    In the aftermath of Hurricane Katrina and in response to the needs of SSC (Stennis Space Center), NASA required the generation of decision support products with a broad range of geospatial inputs. Applying a systems engineering approach, the NASA ARTPO (Applied Research and Technology Project Office) at SSC evaluated the Center's requirements and source data quality. ARTPO identified data and information products that had the potential to meet decision-making requirements; included were remotely sensed data ranging from high-spatial-resolution aerial images through high-temporal-resolution MODIS (Moderate Resolution Imaging Spectroradiometer) products. Geospatial products, such as FEMA's (Federal Emergency Management Agency's) Advisory Base Flood Elevations, were also relevant. Where possible, ARTPO applied SSC calibration/validation expertise to both clarify the quality of various data source options and to validate that the inputs that were finally chosen met SSC requirements. ARTPO integrated various information sources into multiple decision support products, including two maps: Hurricane Katrina Inundation Effects at Stennis Space Center (highlighting surge risk posture) and Vegetation Change In and Around Stennis Space Center: Katrina and Beyond (highlighting fire risk posture).

  21. California Earthquake Clearinghouse: Advocating for, and Advancing, Collaboration and Technology Interoperability, Between the Scientific and Emergency Response Communities, to Produce Actionable Intelligence for Situational Awareness, and Decision Support

    NASA Astrophysics Data System (ADS)

    Rosinski, A.; Beilin, P.; Colwell, J.; Hornick, M.; Glasscoe, M. T.; Morentz, J.; Smorodinsky, S.; Millington, A.; Hudnut, K. W.; Penn, P.; Ortiz, M.; Kennedy, M.; Long, K.; Miller, K.; Stromberg, M.

    2015-12-01

    The Clearinghouse provides emergency management and response professionals and the scientific and engineering communities with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes or tsunamis. Clearinghouse activations include participation from Federal, State and local government, law enforcement, fire, EMS, emergency management, public health, environmental protection, the military, public and non-governmental organizations, and the private sector. For the August 24, 2014 S. Napa earthquake, over 100 people from 40 different organizations participated during the 3-day Clearinghouse activation. Every organization has its own role and responsibility in disaster response; however, all require authoritative data about the disaster for rapid hazard assessment and situational awareness. The Clearinghouse has been proactive in fostering collaboration and sharing Essential Elements of Information across disciplines. The Clearinghouse-led collaborative promotes the use of standard formats and protocols to allow existing technology to transform data into meaningful incident-related content and to enable data to be used by the largest number of participating Clearinghouse partners, thus providing responding personnel with enhanced real-time situational awareness, rapid hazard assessment, and more informed decision-making in support of response and recovery. The Clearinghouse efforts address national priorities outlined in USGS Circular 1242, the Plan to Coordinate NEHRP Post-Earthquake Investigations, and in S. 740, the Geospatial Data Act of 2015 (Sen. Orrin Hatch, R-UT), to streamline and coordinate geospatial data infrastructure, maximizing geospatial data in support of the Robert T. Stafford Act. Finally, the US Dept. of Homeland Security Geospatial Management Office recognized the Clearinghouse's data sharing efforts as a Best Practice to be included in the forthcoming 2015 HLS Geospatial Concept of Operations.

  22. EVALUATING HYDROLOGICAL RESPONSE TO FORECASTED LAND-USE CHANGE: SCENARIO TESTING WITH THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA) TOOL

    EPA Science Inventory

    Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions...

  23. Bridging the Gap Between Surveyors and the Geo-Spatial Society

    NASA Astrophysics Data System (ADS)

    Müller, H.

    2016-06-01

    For many years FIG, the International Federation of Surveyors, has been trying to bridge the gap between surveyors and the geospatial society as a whole, and the geospatial industries in particular. Traditionally the surveying profession contributed to the good of society by creating and maintaining highly precise and accurate geospatial databases, based on in-depth knowledge of spatial reference frameworks. Furthermore, in many countries surveyors may be entitled to make decisions about land divisions and boundaries. By managing information spatially, surveyors are increasingly developing into the role of geo-data managers. Job assignments in this context include data entry management, data and process quality management, design of formal and informal systems, information management, consultancy and land management, all in close cooperation with many different stakeholders. Future tasks will include the integration of geospatial information into e-government and e-commerce systems. This list of professional tasks underpins the capabilities of surveyors to contribute to high-quality geospatial data and information management. In that way modern surveyors support the needs of a geospatial society. The paper discusses several approaches to defining the role of the surveyor within the modern geospatial society.

  24. Unmanned aircraft systems for transportation decision support.

    DOT National Transportation Integrated Search

    2016-11-30

    Our nation relies on accurate geospatial information to map, measure, and monitor transportation infrastructure and the surrounding landscapes. This project focused on the application of Unmanned Aircraft systems (UAS) as a novel tool for improving e...

  25. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on accurate information and knowledge flows concerning the operational theatre: provision of resources, environment evolution, targets' locations, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical geospatial information integration is critical for the decision cycle. Information and knowledge management are fundamental to clarifying an environment full of uncertainty. Geospatial information (GI) management arises as a branch of information and knowledge management, responsible for the conversion process from raw data collected by human or electronic sensors to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatial-time-referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure as well as the requirements for a future software system development.

  26. Exploring U.S. Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production and transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization and browsing, no geospatial query capability, and no online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination and querying system is proposed. It uses a Web-service-based service-oriented architecture, adopts open-standard geospatial information science technology and OGC specifications and standards, and re-uses functions and algorithms from GeoBrain Technology (developed by George Mason University). This system provides capabilities for online geospatial crop information access, query and online analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables online exploration of US cropland without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.
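
    As a hedged illustration of what such standards-based access looks like, the sketch below issues a plain OGC WMS 1.3.0 GetMap request for a cropland layer; the endpoint URL and layer name are hypothetical placeholders, and only the parameter names come from the WMS standard.

```python
# Sketch of an OGC WMS 1.3.0 GetMap request for a cropland layer.
# The endpoint URL and layer name are hypothetical placeholders; only the
# KVP parameter names come from the OGC WMS 1.3.0 standard.
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "cdl_2009",              # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "36.9,-91.5,40.6,-87.5",   # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
resp = requests.get("https://example.gov/cdl/wms", params=params, timeout=60)
resp.raise_for_status()
with open("cdl_subset.png", "wb") as f:
    f.write(resp.content)
```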

  27. Tribal-Focused Environmental Risk and Sustainability Tool (Tribal-FERST) Fact Sheet

    EPA Pesticide Factsheets

    The Tribal-Focused Environmental Risk and Sustainability Tool (Tribal-FERST) is a web-based geospatial decision support tool that will provide tribes with easy access to the best available human health and ecological science.

  28. A Python Geospatial Language Toolkit

    NASA Astrophysics Data System (ADS)

    Fillmore, D.; Pletzer, A.; Galloy, M.

    2012-12-01

    The volume and scope of geospatial data archives, such as collections of satellite remote sensing or climate model products, has been rapidly increasing and will continue to do so in the near future. The recently launched (October 2011) Suomi National Polar-orbiting Partnership satellite (NPP), for instance, is the first of a new generation of Earth observation platforms that will monitor the atmosphere, oceans, and ecosystems, and its suite of instruments will generate several terabytes each day in the form of multi-spectral images and derived datasets. Full exploitation of such data for scientific analysis and decision support applications has become a major computational challenge. Geophysical data exploration and knowledge discovery could benefit, in particular, from intelligent mechanisms for extracting and manipulating subsets of data relevant to the problem of interest. Potential developments include enhanced support for natural language queries and directives to geospatial datasets. The translation of natural language (that is, human spoken or written phrases) into complex but unambiguous objects and actions can be based on a context, or knowledge domain, that represents the underlying geospatial concepts. This poster describes a prototype Python module that maps English phrases onto basic geospatial objects and operations. This module, along with the associated computational geometry methods, enables the resolution of natural language directives that include geographic regions of arbitrary shape and complexity.
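
    A toy sketch of the phrase-to-geospatial-object mapping idea follows; the vocabulary, region definitions, and function names are invented for illustration and are not the prototype module's API.

```python
# Toy mapping of English phrases onto geospatial objects and operations.
# The vocabulary, region boxes, and interpret() function are invented for
# illustration; they are not the prototype module's actual API.
import re
from shapely.geometry import box

REGIONS = {                     # named regions as simple lon/lat bounding boxes
    "arctic": box(-180, 66.5, 180, 90),
    "tropics": box(-180, -23.5, 180, 23.5),
}
OPERATIONS = {
    "area of": lambda geom: geom.area,
    "bounds of": lambda geom: geom.bounds,
}

def interpret(phrase: str):
    """Resolve 'operation + region' phrases such as 'area of the Arctic'."""
    for op_name, op in OPERATIONS.items():
        match = re.match(rf"{op_name}\s+(?:the\s+)?(\w+)", phrase.lower())
        if match and match.group(1) in REGIONS:
            return op(REGIONS[match.group(1)])
    raise ValueError(f"could not interpret: {phrase!r}")

print(interpret("area of the Arctic"))   # area of the bounding box, in square degrees
```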

  29. Geospatial Data Fusion and Multigroup Decision Support for Surface Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Osidele, O.; Green, R. T.; Xie, H.

    2010-12-01

    Social networking and social media have gained significant popularity and brought fundamental changes to many facets of our everyday life. With the ever-increasing adoption of GPS-enabled gadgets and technology, location-based content is likely to play a central role in social networking sites. While location-based content is not new to the geoscience community, where geographic information systems (GIS) are extensively used, the delivery of useful geospatial data to targeted user groups for decision support is new. Decision makers and modelers ought to make more effective use of the new web-based tools to expand the scope of environmental awareness education, public outreach, and stakeholder interaction. Environmental decision processes are often rife with uncertainty and controversy, requiring integration of multiple sources of information and compromises between diverse interests. Fusing multisource, multiscale environmental data for multigroup decision support is a challenging task. Toward this goal, a multigroup decision support platform should strive to achieve transparency, impartiality, and timely synthesis of information. The latter criterion often constitutes a major technical bottleneck for traditional GIS-based media, which feature large file or image sizes and require special processing before web deployment. Many tools and design patterns have appeared in recent years to ease the situation somewhat. In this project, we explore the use of Web 2.0 technologies for “pushing” location-based content to the multiple groups involved in surface water quality management and decision making. In particular, our granular bottom-up approach facilitates effective delivery of information to the most relevant user groups. Our location-based content includes in-situ and remotely sensed data disseminated by NASA and other national and local agencies. Our project is demonstrated for managing the total maximum daily load (TMDL) program in the Arroyo Colorado coastal river basin in Texas. The overall design focuses on assigning spatial information to decision support elements and on efficiently using Web 2.0 technologies to relay scientific information to the nonscientific community. We conclude that (i) social networking, if appropriately used, has great potential for mitigating the difficulty associated with multigroup decision making; (ii) all potential stakeholder groups should be involved in creating a useful decision support system; and (iii) environmental decision support systems should be considered a must-have, instead of an optional, component of TMDL decision support projects. Acknowledgment: This project was supported by NASA grant NNX09AR63G.

  30. National Geospatial-Intelligence Agency Academic Research Program

    NASA Astrophysics Data System (ADS)

    Loomer, S. A.

    2004-12-01

    "Know the Earth.Show the Way." In fulfillment of its vision, the National Geospatial-Intelligence Agency (NGA) provides geospatial intelligence in all its forms and from whatever source-imagery, imagery intelligence, and geospatial data and information-to ensure the knowledge foundation for planning, decision, and action. To achieve this, NGA conducts a multi-disciplinary program of basic research in geospatial intelligence topics through grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program (NARP) are: - NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. - Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. - Director of Central Intelligence Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports and how other researchers and institutions can apply for grants under the program.

  31. Geospatial Thinking of Information Professionals

    ERIC Educational Resources Information Center

    Bishop, Bradley Wade; Johnston, Melissa P.

    2013-01-01

    Geospatial thinking skills inform a host of library decisions including planning and managing facilities, analyzing service area populations, facility site location, library outlet and service point closures, as well as assisting users with their own geospatial needs. Geospatial thinking includes spatial cognition, spatial reasoning, and knowledge…

  32. Forest climate change Vulnerability and Adaptation Assessment in Himalayas

    NASA Astrophysics Data System (ADS)

    Chitale, V. S.; Shrestha, H. L.; Agarwal, N. K.; Choudhurya, D.; Gilani, H.; Dhonju, H. K.; Murthy, M. S. R.

    2014-11-01

    Forests offer an important basis for creating and safeguarding more climate-resilient communities across the Hindu Kush Himalayan region. Assessing forest ecosystem vulnerability to climate change and developing a knowledge base to identify and support relevant adaptation strategies are recognized as urgent needs. Multi-scale adaptation strategies involve increasing complexity at higher levels in terms of data requirements, vulnerability understanding and the decision making needed to choose a particular adaptation strategy. We present here how such complexities can be addressed and how adaptation decisions can be supported either directly by open-source remote-sensing-based forestry products or by geospatial analysis and modelled products. Forest vulnerability under a changing climate scenario, coupled with increasing social dependence on forests, was studied using the IPCC landscape-scale vulnerability framework in the Chitwan-Annapurna Landscape (CHAL) in Nepal. Around twenty layers of geospatial information on climate, forest biophysical characteristics and forest social dependence were used to assess forest vulnerability and associated adaptation needs using self-learning decision-tree-based approaches. Increases in forest fires and evapotranspiration and a reduction in productivity were observed under the changing climate scenario. Adaptation measures for enhancing productivity, improving resilience, and reducing or avoiding pressure are identified with spatial specificity to support suitable decision making. The study provides a spatial analytical framework to evaluate a multitude of parameters to understand vulnerabilities and assess the scope for alternative adaptation strategies with spatial explicitness.
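
    As a minimal, hedged sketch of a decision-tree classification over stacked geospatial layers, the snippet below uses three made-up feature layers and labels; these placeholders are not the CHAL study's twenty input layers.

```python
# Minimal decision-tree sketch for classifying grid-cell vulnerability from
# stacked geospatial layers. The three features and the labels are made-up
# placeholders, not the CHAL study's actual inputs.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row is one grid cell: [fire_frequency, evapotranspiration_anomaly,
# social_dependence_index]; labels: 0 = low vulnerability, 1 = high vulnerability.
X = np.array([[0.1, 0.2, 0.3],
              [0.8, 0.7, 0.9],
              [0.2, 0.1, 0.4],
              [0.9, 0.8, 0.7]])
y = np.array([0, 1, 0, 1])

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(clf.predict([[0.7, 0.6, 0.8]]))   # -> [1], i.e. high vulnerability
```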

  33. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbeds will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches needed to improve the application of climate science research results to urban climate resilience.
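
    As a hedged sketch of the standards involved, the snippet below issues a plain OGC WCS 2.0.1 GetCoverage request for a flood-depth coverage encoded as NetCDF; the endpoint URL, coverage id, and axis labels are placeholders, not the testbed's actual service.

```python
# Sketch of an OGC WCS 2.0.1 GetCoverage request for a flood-depth coverage.
# The endpoint URL, coverage id, and axis labels are hypothetical placeholders;
# only the KVP parameter names come from the OGC WCS 2.0 standard.
import requests

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "flood_depth_sfbay",                    # hypothetical coverage id
    "format": "application/x-netcdf",                     # NetCDF output, as used in the testbed
    "subset": ["Lat(37.3,38.2)", "Long(-122.6,-121.7)"],  # spatial subset, axis labels vary by service
}
resp = requests.get("https://example.org/wcs", params=params, timeout=120)
resp.raise_for_status()
with open("flood_depth_subset.nc", "wb") as f:
    f.write(resp.content)
```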

  34. Infrastructure for the Geospatial Web

    NASA Astrophysics Data System (ADS)

    Lake, Ron; Farley, Jim

    Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.

  35. Regulating outdoor advertisement boards: employing a spatial decision support system to control urban visual pollution

    NASA Astrophysics Data System (ADS)

    Wakil, K.; Hussnain, MQ; Tahir, A.; Naeem, M. A.

    2016-06-01

    Unmanaged placement, size, location, structure and contents of outdoor advertisement boards have resulted in severe urban visual pollution and deterioration of the socio-physical living environment in urban centres of Pakistan. As per the regulatory instruments, the approval decision for a new advertisement installation is supposed to be based on the locational density of existing boards and their proximity or remoteness to certain land uses. In cities where regulatory tools for the control of advertisement boards exist, responsible authorities are handicapped in effective implementation due to the absence of geospatial analysis capacity. This study presents the development of a spatial decision support system (SDSS) for regularization of advertisement boards in terms of their location and placement. The knowledge module of the proposed SDSS is based on provisions and restrictions prescribed in regulatory documents, while the user interface allows visualization and scenario evaluation to understand whether a new board will affect the existing linear density on a particular road and whether it violates any buffer restrictions around a particular land use. Technically, the proposed SDSS is a web-based solution built from open geospatial tools such as OpenGeo Suite, GeoExt, PostgreSQL, and PHP. It uses three key data sets, namely the road network, locations of existing billboards, and building parcels with land use information, to perform the analysis. Locational suitability has been calculated using pairwise comparison through the analytical hierarchy process (AHP) and weighted linear combination (WLC). Our results indicate that open geospatial tools can be helpful in developing an SDSS that can assist in solving space-related iterative decision challenges on outdoor advertisements. Employing such a system will result in effective implementation of regulations, resulting in visual harmony and aesthetic improvement in urban communities.
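
    A minimal sketch of the AHP-plus-weighted-linear-combination scoring described above follows; the three criteria, the pairwise judgments, and the candidate's scores are made-up placeholders, not values from the study.

```python
# Minimal AHP + weighted-linear-combination sketch for a billboard-location
# suitability score. Criteria, pairwise judgments, and candidate scores are
# made-up placeholders, not the study's values.
import numpy as np

# Pairwise comparison matrix for (road_density, landuse_buffer, visibility).
pairwise = np.array([[1.0, 3.0, 2.0],
                     [1/3, 1.0, 0.5],
                     [0.5, 2.0, 1.0]])

# AHP weights: normalised principal eigenvector of the pairwise matrix.
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Weighted linear combination over criterion scores normalised to [0, 1].
candidate_scores = np.array([0.8, 0.6, 0.9])   # one hypothetical candidate site
suitability = float(weights @ candidate_scores)
print(weights.round(3), round(suitability, 3))
```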

  36. GEOSPATIAL QA

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  37. Geospatial decision support systems for societal decision making

    USGS Publications Warehouse

    Bernknopf, R.L.

    2005-01-01

    While science provides reliable information to describe and understand the earth and its natural processes, it can contribute more. There are many important societal issues in which scientific information can play a critical role. Science can add greatly to policy and management decisions to minimize loss of life and property from natural and man-made disasters, to manage water, biological, energy, and mineral resources, and in general, to enhance and protect our quality of life. However, the link between science and decision-making is often complicated and imperfect. Technical language and methods surround scientific research and the dissemination of its results. Scientific investigations often are conducted under different conditions, with different spatial boundaries, and in different timeframes than those needed to support specific policy and societal decisions. Uncertainty is not uniformly reported in scientific investigations. If society does not know that data exist, what the data mean, where to use the data, or how to include uncertainty when a decision has to be made, then science gets left out (or misused) in the decision-making process. This paper is about using Geospatial Decision Support Systems (GDSS) for quantitative policy analysis. Integrated natural and social science methods and tools in a Geographic Information System that respond to decision-making needs can be used to close the gap between science and society. The GDSS has been developed so that nonscientists can pose "what if" scenarios to evaluate hypothetical outcomes of policy and management choices. In this approach decision makers can evaluate the financial and geographic distribution of potential policy options and their societal implications. Actions, based on scientific information, can be taken to mitigate hazards, protect our air and water quality, preserve the planet's biodiversity, promote balanced land use planning, and judiciously exploit natural resources. Applications using the GDSS have demonstrated the benefits of utilizing science for policy decisions. Investment in science reduces decision-making uncertainty, and reducing that uncertainty has economic value.

  38. AGWA: The Automated Geospatial Watershed Assessment Tool to Inform Rangeland Management

    EPA Science Inventory

    Do you want a relatively easy-to-use tool, specifically designed to use ecological information, to assess the effects of rangeland soil and water conservation practices on rangeland erosion? New Decision Support Tools (DSTs) that are easy to use, incorporate ecological concepts and rangel...

  39. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  40. Applying Geospatial Technologies for International Development and Public Health: The USAID/NASA SERVIR Program

    NASA Technical Reports Server (NTRS)

    Hemmings, Sarah; Limaye, Ashutosh; Irwin, Dan

    2011-01-01

    Background: SERVIR, the Regional Visualization and Monitoring System, helps people use Earth observations and predictive models based on data from orbiting satellites to make timely decisions that benefit society. SERVIR operates through a network of regional hubs in Mesoamerica, East Africa, and the Hindu Kush-Himalayas. USAID and NASA support SERVIR, with the long-term goal of transferring SERVIR capabilities to the host countries. Objective/Purpose: The purpose of this presentation is to describe how the SERVIR system helps the SERVIR regions cope with eight areas of societal benefit identified by the Group on Earth Observations (GEO): health, disasters, ecosystems, biodiversity, weather, water, climate, and agriculture. This presentation will describe environmental health applications of data in the SERVIR system, as well as ongoing and future efforts to incorporate additional health applications into the SERVIR system. Methods: This presentation will discuss how the SERVIR Program makes environmental data available for use in environmental health applications. SERVIR accomplishes its mission by providing member nations with access to geospatial data and predictive models, information visualization, training and capacity building, and partnership development. SERVIR conducts needs assessments in partner regions, develops custom applications of Earth observation data, and makes NASA and partner data available through an online geospatial data portal at SERVIRglobal.net. Results: Decision makers use SERVIR to improve their ability to monitor air quality, extreme weather, biodiversity, and changes in land cover. In the past several years, the system has been used over 50 times to respond to environmental threats such as wildfires, floods, landslides, and harmful algal blooms. Given that the SERVIR regions are experiencing increased stress from larger climate variability than historically observed, SERVIR provides information to support the development of adaptation strategies for nations affected by climate change. Conclusions: SERVIR is a platform for collaboration and cross-agency coordination, international partnerships, and delivery of web-based geospatial information services and applications. SERVIR makes a variety of geospatial data available for use in studies of environmental health outcomes.

  41. GeoProMT: A Collaborative Platform Supporting Natural Hazards Project Management From Assessment to Resilience

    NASA Astrophysics Data System (ADS)

    Renschler, C.; Sheridan, M. F.; Patra, A. K.

    2008-05-01

    The impact and consequences of extreme geophysical events (hurricanes, floods, wildfires, volcanic flows, mudflows, etc.) on properties and processes should be continuously assessed by a well-coordinated interdisciplinary research and outreach approach addressing risk assessment and resilience. Communication between the various disciplines and stakeholders involved is the key to a successful implementation of an integrated risk management plan. These issues become apparent at the level of decision support tools for extreme events/disaster management in natural and managed environments. The Geospatial Project Management Tool (GeoProMT) is a collaborative platform for research and training to document and communicate the fundamental steps in transforming information for extreme events at various scales for analysis and management. GeoProMT is an internet-based interface for the management of shared geo-spatial and multi-temporal information such as measurements, remotely sensed images, and other GIS data. This tool enhances collaborative research activities and the ability to assimilate data from diverse sources by integrating information management. This facilitates a better understanding of natural processes and enhances the integrated assessment of resilience against both the slow and fast onset of hazard risks. Fundamental to understanding and communicating complex natural processes are: (a) representation of spatiotemporal variability, extremes, and uncertainty of environmental properties and processes in the digital domain, (b) transformation of their spatiotemporal representation across scales (e.g. interpolation, aggregation, disaggregation) during data processing and modeling in the digital domain, and designing and developing tools for (c) geo-spatial data management, (d) geo-spatial process modeling and effective implementation, and (e) supporting decision- and policy-making in natural resources and hazard management at various spatial and temporal scales of interest. GeoProMT is useful for researchers, practitioners, and decision-makers, because it provides an integrated environmental system assessment and data management approach that considers the spatial and temporal scales and variability in natural processes. Particularly at the occurrence or onset of extreme events, it can utilize the latest data sources that are available at variable scales, combine them with existing information, and update assessment products such as risk and vulnerability assessment maps. Because integrated geo-spatial assessment requires careful consideration of all the steps in utilizing data, modeling and decision-making formats, each step in the sequence must be assessed in terms of how information is being scaled. At the process scale various geophysical models (e.g. TITAN, LAHARZ, or many other examples) are appropriate for incorporation in the tool. Some examples that illustrate our approach include: 1) coastal parishes impacted by Hurricane Rita (Southwestern Louisiana), 2) watersheds affected by extreme rainfall-induced debris flows (Madison County, Virginia; Panabaj, Guatemala; Casita, Nicaragua), and 3) the potential for pyroclastic flows to threaten a city (Tungurahua, Ecuador). This research was supported by the National Science Foundation.

  2. Topographic and physicochemical controls on soil denitrification potential in prior converted croplands located on the Delmarva Peninsula, USA

    USDA-ARS?s Scientific Manuscript database

    Topography and soil physiochemical characteristics exert substantial controls on denitrification in agricultural lands. In order to depict these controls at a landscape scale for decision support applications, metrics (i.e., proxies) must be developed based on commonly available geospatial data. In ...

  3. A Geospatial Data Recommender System based on Metadata and User Behaviour

    NASA Astrophysics Data System (ADS)

    Li, Y.; Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; Finch, C. J.; McGibbney, L. J.

    2017-12-01

    Earth observations are produced at high velocity by real-time sensors, yielding tera- to petabytes of geospatial data daily. Discovering and accessing the right data within this massive volume is like finding a needle in a haystack. To help researchers find the right data for study and decision support, a great deal of research on improving search performance has been proposed, including recommendation algorithms. However, few papers have discussed how to implement a recommendation algorithm in a geospatial data retrieval system. To address this problem, we propose a recommendation engine that improves the discovery of relevant geospatial data by mining and utilizing metadata and user behavior data: 1) metadata-based recommendation considers the correlation of each attribute (i.e., spatiotemporal, categorical, and ordinal) to the data to be found; in particular, a phrase extraction method is used to improve the accuracy of the description similarity; 2) user behavior data are utilized to predict the interest of a user through collaborative filtering; 3) an integration method is designed to combine the results of the above two methods to achieve better recommendations. Experiments show that in the hybrid recommendation list, all precisions are larger than 0.8 from position 1 to 10.
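
    A minimal sketch of the hybrid idea described above, assuming toy interaction data, a precomputed metadata-similarity score, and a hypothetical blending weight (this is not the authors' implementation):

      # Hybrid recommendation sketch: blend metadata similarity with item-based
      # collaborative filtering. All data, names, and weights are made up.
      import numpy as np

      def cosine(a, b):
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      # Rows = users, columns = datasets; 1 = the user downloaded/viewed the dataset.
      interactions = np.array([[1, 0, 1, 0],
                               [1, 1, 0, 0],
                               [0, 1, 1, 1]], dtype=float)
      # Precomputed metadata similarity of each dataset to the current query
      # (e.g. spatiotemporal overlap plus description-phrase similarity).
      metadata_score = np.array([0.9, 0.2, 0.7, 0.4])

      def recommend(user_idx, alpha=0.6):
          # Item-based CF: score each dataset by its similarity to items the user already used.
          cf_score = np.array([
              max((cosine(interactions[:, j], interactions[:, k])
                   for k in np.flatnonzero(interactions[user_idx]) if k != j), default=0.0)
              for j in range(interactions.shape[1])
          ])
          hybrid = alpha * metadata_score + (1 - alpha) * cf_score   # simple weighted blend
          return np.argsort(hybrid)[::-1]                            # ranked dataset indices

      print(recommend(user_idx=0))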

  4. NOAA's National Water Model - Integration of National Water Model with Geospatial Data creating Water Intelligence

    NASA Astrophysics Data System (ADS)

    Clark, E. P.; Cosgrove, B.; Salas, F.

    2016-12-01

    As a significant step forward to transform NOAA's water prediction services, NOAA plans to implement a new National Water Model (NWM) Version 1.0 in August 2016. A continental-scale water resources model, the NWM is an evolution of the WRF-Hydro architecture developed by the National Center for Atmospheric Research (NCAR). The NWM will provide analyses and forecasts of flow for the 2.7 million stream reaches nationwide in the National Hydrography Dataset Plus v2 (NHDPlusV2) jointly developed by the USGS and EPA. The NWM also produces high-resolution water budget variables of snow, soil moisture, and evapotranspiration on a 1-km grid. NOAA's stakeholders require additional decision support applications to be built on these data. The Geo-intelligence division of the Office of Water Prediction is building new products and services that integrate output from the NWM with geospatial datasets such as infrastructure and demographics to better estimate the impacts of dynamic water resource states on community resiliency. This presentation will detail the methods and underlying information used to produce prototype water resources intelligence that is timely, actionable, and credible. Moreover, it will explore the NWM's capability to support sector-specific decision support services.
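
    A minimal sketch of the kind of integration described above, joining NWM reach forecasts with NHDPlus reach attributes and co-located infrastructure; all reach IDs, thresholds, and column names are hypothetical:

      # Flag forecast reaches above a flood threshold and list potentially exposed assets.
      import pandas as pd

      forecast = pd.DataFrame({          # e.g. parsed from NWM channel routing output
          "comid": [101, 102, 103],
          "streamflow_cms": [850.0, 12.5, 430.0],
      })
      flood_threshold = pd.DataFrame({   # e.g. flood-stage flow per NHDPlus reach
          "comid": [101, 102, 103],
          "flood_flow_cms": [600.0, 50.0, 500.0],
      })
      infrastructure = pd.DataFrame({    # e.g. bridges/hospitals snapped to reaches
          "comid": [101, 101, 103],
          "asset": ["bridge", "hospital", "bridge"],
      })

      merged = forecast.merge(flood_threshold, on="comid")
      merged["flooding"] = merged["streamflow_cms"] > merged["flood_flow_cms"]
      impacts = (merged[merged["flooding"]]
                 .merge(infrastructure, on="comid", how="left")
                 .groupby("comid")["asset"].apply(list))
      print(impacts)   # reaches forecast to flood, with potentially exposed assets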

  5. Lowering the barriers for accessing distributed geospatial big data to advance spatial data science: the PolarHub solution

    NASA Astrophysics Data System (ADS)

    Li, W.

    2017-12-01

    Data is the crux of science. The widespread availability of big data today is of particular importance for fostering new forms of geospatial innovation. This paper reports a state-of-the-art solution that addresses a key cyberinfrastructure research problem: providing ready access to big, distributed geospatial data resources on the Web. We first formulate this data-access problem and introduce its indispensable elements, including identifying the cyber-location, space and time coverage, theme, and quality of the dataset. We then propose strategies to tackle each data-access issue and make the data more discoverable and usable for geospatial data users and decision makers. Among these strategies is large-scale web crawling as a key technique to support automatic collection of online geospatial data that are highly distributed, intrinsically heterogeneous, and known to be dynamic. To better understand the content and scientific meanings of the data, methods including space-time filtering, ontology-based thematic classification, and service quality evaluation are incorporated. To serve a broad scientific user community, these techniques are integrated into an operational data crawling system, PolarHub, which is also an important cyberinfrastructure building block to support effective data discovery. A series of experiments were conducted to demonstrate the outstanding performance of the PolarHub system. We expect this work to contribute significantly to building the theoretical and methodological foundation for data-driven geography and the emerging spatial data science.
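
    A conceptual sketch of the crawling step, not PolarHub's actual code: fetch a hypothetical seed page, extract candidate OGC endpoints, and probe each with a GetCapabilities request:

      import re
      import requests

      SEED = "https://example.org/data-catalog.html"   # hypothetical seed URL
      OGC_HINT = re.compile(r'https?://[^\s"\'<>]+(?:wms|wfs|wcs)[^\s"\'<>]*', re.IGNORECASE)

      def crawl(seed_url: str) -> list[str]:
          page = requests.get(seed_url, timeout=10).text
          candidates = set(OGC_HINT.findall(page))
          live = []
          for url in candidates:
              try:
                  resp = requests.get(url, params={"service": "WMS",
                                                   "request": "GetCapabilities"}, timeout=10)
                  if resp.ok and b"Capabilities" in resp.content:
                      live.append(url)                 # looks like a live OGC endpoint
              except requests.RequestException:
                  pass                                 # unreachable or not a service
          return live

      if __name__ == "__main__":
          print(crawl(SEED))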

  6. Web-services-based spatial decision support system to facilitate nuclear waste siting

    NASA Astrophysics Data System (ADS)

    Huang, L. Xinglai; Sheng, Grant

    2006-10-01

    The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before and subsequently has created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine the problem definition, and generate and evaluate multiple alternatives for a decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is composed of distributed web services that either have their own functions or provide different geospatial data and may reside on different computers and in different locations. The WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual, and methodological challenges and issues associated with developing web-services-based spatial decision support systems are described.

  7. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes, geographically accurate, with 30- and 15-meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in its geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.
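
    The sketch below shows how any OGC-compliant WMS layer could be consumed with the OWSLib client; the endpoint and layer name are placeholders, since the original JPL service is not assumed to still be online:

      from owslib.wms import WebMapService

      wms = WebMapService("https://example.org/wms", version="1.1.1")  # hypothetical endpoint
      print(list(wms.contents))                    # advertised layers

      img = wms.getmap(
          layers=["global_mosaic"],                # hypothetical layer name
          srs="EPSG:4326",
          bbox=(-10.0, 35.0, 5.0, 45.0),           # lon/lat window over Iberia
          size=(800, 533),
          format="image/png",
      )
      with open("mosaic_clip.png", "wb") as f:
          f.write(img.read())                      # GetMap responses are file-like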

  8. Geospatial information technology: an adjunct to service-based outreach and education.

    PubMed

    Faruque, Fazlay; Hewlett, Peggy O; Wyatt, Sharon; Wilson, Kaye; Lofton, Susan; Frate, Dennis; Gunn, Jennie

    2004-02-01

    This exemplar highlights how geospatial information technology was effective in supporting academic practice, faculty outreach, and education initiatives at the University of Mississippi School of Nursing. Using this cutting-edge technology created a community-based prototype for fully integrating point-of-service research, practice, and academics into a cohesive strategy to influence change within the health care delivery system. This exemplar discusses ways this knowledge benefits practice and curriculum development; informs critical decision making affecting the people we serve; underscores the vital role nurses play in linking this technology to practice; and develops community residents as partners in their own health and that of the community.

  9. Ecosystem services and emergent vulnerability in managed ecosystems: A geospatial decision-support tool

    Treesearch

    Colin M. Beier; Trista M. Patterson; F. Stuart Chapin III

    2008-01-01

    Managed ecosystems experience vulnerabilities when ecological resilience declines and key flows of ecosystem services become depleted or lost. Drivers of vulnerability often include local management actions in conjunction with other external, larger scale factors. To translate these concepts to management applications, we developed a conceptual model of feedbacks...

  10. Linking Data Access to Geospatial Data Models to Applications at Local to National Scales: The Estuary Data Mapper

    EPA Science Inventory

    The U.S. Environmental Protection Agency (US EPA) is developing e-Estuary, a decision-support system for Clean Water Act applications in coastal management. E-Estuary has three elements: an estuarine geo-referenced relational database, watershed GIS coverages, and tools to suppo...

  11. Academic research opportunities at the National Geospatial-Intelligence Agency(NGA)

    NASA Astrophysics Data System (ADS)

    Loomer, Scott A.

    2006-05-01

    The vision of the National Geospatial-Intelligence Agency (NGA) is to "Know the Earth...Show the Way." To achieve this vision, the NGA provides geospatial intelligence in all its forms and from whatever source (imagery, imagery intelligence, and geospatial data and information) to ensure the knowledge foundation for planning, decision, and action. Academia plays a key role in the NGA research and development program through the NGA Academic Research Program. This multi-disciplinary program of basic research in geospatial intelligence topics provides grants and fellowships to the leading investigators, research universities, and colleges of the nation. This research provides the fundamental science support to NGA's applied and advanced research programs. The major components of the NGA Academic Research Program are: * NGA University Research Initiatives (NURI): Three-year basic research grants awarded competitively to the best investigators across the US academic community. Topics are selected to provide the scientific basis for advanced and applied research in NGA core disciplines. * Historically Black College and University - Minority Institution Research Initiatives (HBCU-MI): Two-year basic research grants awarded competitively to the best investigators at Historically Black Colleges and Universities, and Minority Institutions across the US academic community. * Intelligence Community Post-Doctoral Research Fellowships: Fellowships providing access to advanced research in science and technology applicable to the intelligence community's mission. The program provides a pool of researchers to support future intelligence community needs and develops long-term relationships with researchers as they move into career positions. This paper provides information about the NGA Academic Research Program, the projects it supports, and how researchers and institutions can apply for grants under the program. In addition, other opportunities for academia to engage with NGA through training programs and recruitment are discussed.

  12. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computational resources and the greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent to this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks, and it requires additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and a geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising MapServer, PostGIS, and 52 North, with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
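
    As a small illustration of the recommended stack, the sketch below queries the geospatially enabled database (PostGIS via psycopg2) for sub-basins in an area of interest before handing them to a model run; connection details, table, and column names are hypothetical:

      import psycopg2

      conn = psycopg2.connect(dbname="hydro", user="app", password="secret", host="localhost")
      with conn, conn.cursor() as cur:
          # Select sub-basins intersecting the user's area of interest (WGS84 bbox).
          cur.execute(
              """
              SELECT basin_id, ST_AsGeoJSON(geom)
              FROM subbasins
              WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326));
              """,
              (-111.95, 40.45, -111.60, 40.80),
          )
          for basin_id, geojson in cur.fetchall():
              print(basin_id, len(geojson))   # hand each sub-basin to the model runner here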

  13. Foundations for context-aware information retrieval for proactive decision support

    NASA Astrophysics Data System (ADS)

    Mittu, Ranjeev; Lin, Jessica; Li, Qingzhe; Gao, Yifeng; Rangwala, Huzefa; Shargo, Peter; Robinson, Joshua; Rose, Carolyn; Tunison, Paul; Turek, Matt; Thomas, Stephen; Hanselman, Phil

    2016-05-01

    Intelligence analysts and military decision makers are faced with an onslaught of information. From the now ubiquitous presence of intelligence, surveillance, and reconnaissance (ISR) platforms providing large volumes of sensor data, to vast amounts of open source data in the form of news reports, blog postings, or social media postings, the amount of information available to a modern decision maker is staggering. Whether tasked with leading a military campaign or providing support for a humanitarian mission, being able to make sense of all the information available is a challenge. Due to the volume and velocity of this data, automated tools are required to help support reasoned, human decisions. In this paper we describe several automated techniques that are targeted at supporting decision making. Our approaches include modeling the kinematics of moving targets as motifs; developing normalcy models and detecting anomalies in kinematic data; automatically classifying the roles of users in social media; and modeling geo-spatial regions based on the behavior that takes place in them. These techniques cover a wide range of potential decision maker needs.
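
    One of the listed techniques, detecting anomalies in kinematic data, can be sketched in toy form with a robust z-score on track speeds; the real systems use motif and normalcy models, and the data and threshold below are made up:

      import numpy as np

      speeds = np.array([9.8, 10.1, 10.3, 9.9, 10.0, 31.5, 10.2, 9.7])  # m/s samples of one track

      median = np.median(speeds)
      mad = np.median(np.abs(speeds - median)) + 1e-9      # median absolute deviation
      robust_z = 0.6745 * (speeds - median) / mad          # ~N(0,1) under normal behaviour

      anomalies = np.flatnonzero(np.abs(robust_z) > 3.5)   # common MAD-based cutoff
      print("anomalous samples:", anomalies)               # flags the 31.5 m/s burst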

  14. Blowing in the wind: evaluating wind energy projects on the national forests

    Treesearch

    Kerry Schlichting; Evan Mercer

    2011-01-01

    The 650 million ac of federal lands are facing increased scrutiny for wind energy development. As a result, the US Forest Service has been directed to develop policies and procedures for siting wind energy projects. We incorporate geospatial site suitability analysis with applicable policy and management principles to illustrate the use of a Spatial Decision Support...

  15. Geospatial Data Sciences | Energy Analysis | NREL

    Science.gov Websites

    NREL's geospatial data sciences combine demographics and the earth's physical geography to provide the foundation for energy analysis and decision-making, with geospatial analysis built on geographic information system (GIS) capabilities.

  16. The Federal Geospatial Platform a shared infrastructure for publishing, discovering and exploiting public data and spatial applications.

    NASA Astrophysics Data System (ADS)

    Dabolt, T. O.

    2016-12-01

    The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts, and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov are useful starting points for data search, they can quickly frustrate end users seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services that delivers trusted, consistent data in open formats to all users, as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice on topics ranging from the 16 National Geospatial Data Asset themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others', or check it out on your own at https://www.geoplatform.gov/

  17. Impacts of Geospatial Information for Decision Making

    NASA Astrophysics Data System (ADS)

    Pearlman, F.; Coote, A.; Friedl, L.; Stewart, M.

    2012-12-01

    Geospatial information contributes to decisions by both societal and individual decision-makers. More effective use of this information is essential as issues are increasingly complex and consequences can be critical for future economic and social development. To address this, a workshop brought together analysts, communicators, officials, and researchers from academia, government, non-governmental organizations, and the private sector. A range of policy issues, management needs, and resource requirements was discussed, and a wide array of analyses, geospatial data, methods of analysis, and metrics was presented for assessing and communicating the value of geospatial information. It is clear that there are many opportunities for integrating science and engineering disciplines with the social sciences to address societal issues that would benefit from using geospatial information and earth observations. However, these collaborations must have outcomes that can be easily communicated to decision makers. This generally requires succinct quantitative statements of value based on rigorous models, user testimonials of actual applications that save real money, or both. An outcome of the workshop is to pursue the development of a community of practice or society that encompasses a wide range of scientific, social, management, and communication disciplines and fosters collaboration across specialties, helping to build trust between social and scientific perspectives. A resource base is also necessary. This presentation will address approaches for creating a shared knowledge database containing a glossary of terms, reference materials, and examples of case studies, as well as the potential applications for benefit analyses.

  18. REACT Real-Time Emergency Action Coordination Tool

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Recently the Emergency Management Operations Center (EMOC) of St. Tammany Parish turned to the Technology Development and Transfer Office (TDTO) of NASA's Stennis Space Center (SSC) for help in combating the problems associated with water inundation. Working through a Dual-Use Development Agreement, the Technology Development and Transfer Office, EMOC, and a small geospatial applications company named Nvision provided the parish with a new front-line defense. REACT, the Real-time Emergency Action Coordination Tool, is a decision support system that integrates disparate information to enable more efficient decision making by emergency management personnel.

  19. Mapping the world: cartographic and geographic visualization by the United Nations Geospatial Information Section (formerly Cartographic Section)

    NASA Astrophysics Data System (ADS)

    Kagawa, Ayako; Le Sourd, Guillaume

    2018-05-01

    Within United Nations Secretariat activities, mapping began in 1946, and by 1951 the need for maps had increased and an office with a team of cartographers was established. Since then, with the development of technologies including the internet, remote sensing, unmanned aerial systems, relational database management, and information systems, geospatial information provides an ever-increasing variety of support to the work of the Organization for planning of operations, decision-making, and monitoring of crises. However, the need for maps itself has remained unchanged. This presentation aims to highlight some of the cartographic representation styles over the decades by reviewing the evolution of selected maps by the office, and noting the changing cognitive and semiotic aspects of cartographic and geographic visualization required by the United Nations. Through presentation and analysis of these maps, the changing dynamics of the Organization in information management can be reflected, with a reminder of the continuing and expanding deconstructionist role of the cartographer, now a geospatial information management expert.

  20. Nebraska NativeGEM (Geospatial Extension Model)

    NASA Technical Reports Server (NTRS)

    Bowen, Brent

    2004-01-01

    This proposal, Nebraska NativeGEM (Geospatial Extension Model), features a unique diversity component stemming from the exceptional reputation NNSGC has built by delivering geospatial science experiences to Nebraska's Native Americans. For 7 years, NNSGC has partnered with the 2 tribal colleges and 4 reservation school districts in Nebraska to form the Nebraska Native American Outreach Program (NNAOP), a partnership among tribal community leaders, academia, tribal schools, and industry reaching close to 1,000 Native American youth and over 1,200 community members (Lehrer & Zendajas, 2001). NativeGEM addresses all three key components of Cooperative State Research, Education, and Extension Service (CSREES) goals for advancing decision support, education, and workforce development through the GES. The existing long-term commitments that the NNSGC and the GES have in these areas allow for the pursuit of a broad range of activities. NativeGEM builds upon these existing successful programs and collaborations. Outcomes and metrics for each proposed project are detailed in the Approach section of this document.

  1. Simultaneous Visualization of Different Utility Networks for Disaster Management

    NASA Astrophysics Data System (ADS)

    Semm, S.; Becker, T.; Kolbe, T. H.

    2012-07-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for reaching appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper will provide a conceptual approach on how to simplify, aggregate, and visualize multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.
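
    The simplification and aggregation step can be illustrated conceptually, though this is not the authors' algorithm, by collapsing pass-through junctions of a small, made-up network so that only sources, sinks, and branch points remain:

      import networkx as nx

      g = nx.Graph()
      g.add_edges_from([("plant", "j1"), ("j1", "j2"), ("j2", "hospital"),
                        ("j2", "j3"), ("j3", "shelter")])   # j1..j3 are plain pipe junctions

      def aggregate(graph: nx.Graph) -> nx.Graph:
          simplified = graph.copy()
          changed = True
          while changed:
              changed = False
              for node in list(simplified.nodes):
                  nbrs = list(simplified.neighbors(node))
                  if node.startswith("j") and len(nbrs) == 2:   # pass-through junction
                      simplified.add_edge(nbrs[0], nbrs[1])     # bridge its two neighbours
                      simplified.remove_node(node)
                      changed = True
          return simplified

      # plant, hospital, and shelter now connect through the single branch point j2
      print(aggregate(g).edges)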

  2. Barriers to use of geospatial data for adaptation to climate change and variability: case studies in public health.

    PubMed

    Aron, Joan L

    2006-11-01

    This paper presents two case studies of the barriers to the use of geospatial data in the context of public health adaptation to climate change and variability. The first case study is on the hazards of coastal zone development in the United States with the main emphasis on Hurricane Katrina. An important barrier to the use of geospatial data is that the legal system does not support restrictions on land use intended to protect the coastal zone. Economic interests to develop New Orleans and the Mississippi River, both over the long term and the short term, had the effect of increasing the impact of the hurricane. The second case study is epidemics of climate-sensitive diseases with the main emphasis on malaria in Africa. Limits to model accuracy may present a problem in using climate data for an early warning system, and some geographic locations are likely to be more suitable than others. Costs of the system, including the costs of errors, may also inhibit implementation. Deriving societal benefits from geospatial data requires an understanding of the particular decision contexts and organizational processes in which knowledge is developed and used. The data by themselves will not usually generate a societal response. Scientists working in applications should develop partnerships to address the use of geospatial data for societal benefit.

  3. Geospatial Engineering

    DTIC Science & Technology

    2017-02-22

    manages operations through guidance, policies, programs, and organizations. The NSG is designed to be a mutually supportive enterprise that...deliberate technical design and deliberate human actions. Geospatial engineer teams (GETs) within the geospatial intelligence cells are the day-to-day...standards working group and are designated by the AGC Geospatial Acquisition Support Directorate as required for interoperability. Applicable standards

  4. Business models for implementing geospatial technologies in transportation decision-making

    DOT National Transportation Integrated Search

    2007-03-31

    This report describes six State DOTs business models for implementing geospatial technologies. It provides a comparison of the organizational factors influencing how Arizona DOT, Delaware DOT, Georgia DOT, Montana DOT, North Carolina DOT, and Okla...

  5. The Infusion of Dust Model Model Outputs into Public Health Decision Making - an Examination of Differential Adoption of SOAP and Open Geospatial Consortium Service Products into Public Health Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.

    2008-12-01

    Since 2004 the Earth Data Analysis Center, in collaboration with researchers at the University of Arizona and George Mason University, with funding from NASA, has been developing a service-oriented architecture (SOA) that acquires remote sensing, meteorological forecast, and observed ground level particulate data (EPA AirNow) from NASA, NOAA, and DataFed through a variety of standards-based service interfaces. These acquired data are used to initialize and set boundary conditions for the execution of the Dust Regional Atmospheric Model (DREAM) to generate daily 48-hour dust forecasts, which are then published via a combination of Open Geospatial Consortium (OGC) services (WMS and WCS), basic HTTP request-based services, and SOAP services. The goal of this work has been to develop services that can be integrated into existing public health decision support systems (DSS) to provide enhanced environmental data (i.e. ground surface particulate concentration estimates) for use in epidemiological analysis, public health warning systems, and syndromic surveillance systems. While the project has succeeded in deploying these products into the target systems, there has been differential adoption of the different service interface products, with the simple OGC and HTTP interfaces generating much greater interest from DSS developers and researchers than the more complex SOAP service interfaces. This paper reviews the SOA developed as part of this project and provides insights into how different service models may have a significant impact on the infusion of Earth science products into decision-making processes and systems.
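
    The adoption gap is easy to see in practice: a forecast surface can be pulled from a simple OGC/HTTP interface with a single key-value-pair request, as in the sketch below (the endpoint, coverage name, and window are hypothetical), whereas an equivalent SOAP call requires constructing and parsing an XML envelope:

      import requests

      params = {
          "service": "WCS",
          "version": "1.0.0",
          "request": "GetCoverage",
          "coverage": "dream_dust_surface",        # hypothetical 48-h dust forecast layer
          "bbox": "-115,31,-102,38",               # southwestern US window
          "crs": "EPSG:4326",
          "format": "GeoTIFF",
          "width": "512",
          "height": "256",
      }
      resp = requests.get("https://example.org/ogc", params=params, timeout=30)
      resp.raise_for_status()
      with open("dust_forecast.tif", "wb") as f:
          f.write(resp.content)   # drop the GeoTIFF straight into a public-health DSS/GIS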

  6. Exploring Methodologies and Indicators for Cross-disciplinary Applications

    NASA Astrophysics Data System (ADS)

    Bernknopf, R.; Pearlman, J.

    2015-12-01

    Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic, and environmental knowledge to formulate indicators and methods. There are use cases that couple geospatial information with the social sciences, including economics, psychology, and sociology. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making; it is a traditional basis for a use case and has been applied to geospatial information and other areas. A new use case that applies indicators is meta-regression analysis, which is used to transfer estimates of socioeconomic benefits across geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables serve as indicators, providing a weighted average of value for the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and analyzing hazard policies. However, new methods are needed for integrating these disciplines into use cases, an avenue for guiding the development of operational applications of geospatial information. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed, especially applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and the use of indicators across fields of interest.
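
    A toy sketch of the meta-regression benefit-transfer idea follows; the willingness-to-pay values and indicators are fabricated for illustration and are not from any cited study:

      import pandas as pd
      import statsmodels.api as sm

      # Prior valuation studies: WTP plus one quantitative and one qualitative indicator.
      studies = pd.DataFrame({
          "wtp_per_household": [42.0, 55.0, 31.0, 60.0, 48.0, 38.0],
          "median_income_k":   [52, 68, 45, 71, 60, 50],
          "wetland_site":      [1, 1, 0, 1, 0, 0],
      })
      X = sm.add_constant(studies[["median_income_k", "wetland_site"]])
      model = sm.OLS(studies["wtp_per_household"], X).fit()

      # Transfer the fitted relationship to a new, unstudied region.
      new_region = pd.DataFrame({"const": [1.0], "median_income_k": [58], "wetland_site": [1]})
      print(model.params)
      print("transferred WTP estimate:", float(model.predict(new_region)[0]))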

  7. Delivery of Forecasted Atmospheric Ozone and Dust for the New Mexico Environmental Public Health Tracking System - An Open Source Geospatial Solution

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Sanchez-Silva, R.; Cavner, J. A.

    2010-12-01

    New Mexico's Environmental Public Health Tracking System (EPHTS), funded by the Centers for Disease Control (CDC) Environmental Public Health Tracking Network (EPHTN), aims to improve health awareness and services by linking health effects data with levels and frequency of environmental exposure. As a public health decision-support system, EPHTS includes state-of-the-art statistical analysis tools; geospatial visualization tools; data discovery, extraction, and delivery tools; and environmental/public health linkage information. As part of its mandate, EPHTS issues public health advisories and forecasts of environmental conditions that have consequences for human health. Through a NASA-funded partnership between the University of New Mexico and the University of Arizona, NASA Earth Science results are fused into two existing models (the Dust Regional Atmospheric Model (DREAM) and the Community Multiscale Air Quality (CMAQ) model) in order to improve forecasts of atmospheric dust, ozone, and aerosols. The results and products derived from the outputs of these models are made available to an open-source mapping component of the New Mexico EPHTS. In particular, these products are integrated into a Django content management system using GeoDjango, GeoAlchemy, and other OGC-compliant geospatial libraries written in the Python and C++ programming languages. The resultant mapping system provides indicator-based thematic mapping, data delivery, and analytical capabilities. DREAM and CMAQ outputs can be inspected, via REST calls, through temporal and spatial subsetting of the atmospheric concentration data across analytical units employed by the public health community. This paper describes details of the architecture and integration of NASA Earth Science into the EPHTS decision-support system.
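
    The kind of REST subsetting call described above might look like the sketch below; the endpoint, path, and parameter names are purely illustrative and are not the actual New Mexico EPHTS API:

      import requests

      resp = requests.get(
          "https://ephts.example.org/api/air-quality/ozone",   # placeholder endpoint
          params={
              "start": "2010-07-01T00:00:00Z",
              "end": "2010-07-03T00:00:00Z",
              "unit": "county",            # analytical unit used by the health community
              "fips": "35001",             # Bernalillo County, NM
              "stat": "daily_max_8hr",
          },
          timeout=30,
      )
      resp.raise_for_status()
      for record in resp.json():           # e.g. one record per county per day
          print(record)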

  8. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  9. A geospatial soil-based DSS to reconcile landscape management and land protection

    NASA Astrophysics Data System (ADS)

    Manna, Piero; Basile, Angelo; Bonfante, Antonello; D'Antonio, Amedeo; De Michele, Carlo; Iamarino, Michela; Langella, Giuliano; Florindo Mileti, Antonio; Pileri, Paolo; Vingiani, Simona; Terribile, Fabio

    2017-04-01

    The implementation of UN Agenda 2030 may represent a great opportunity to place soil science at the heart of many Sustainable Development Goals (e.g. SDGs 2, 3, 13, 15, 15.3, 16.7). On the other hand, the high complexity embedded in the factual implementation of the SDGs and many other ambitious objectives (e.g. FAO goals) may cause new frustrations if these policy documents do not bring real progress. The scientific communities are asked to help disentangle this complexity and possibly identify a "way to go". This may help the large number of European directives (e.g. WFD, EIA), regulations, and communications that aim to achieve a better environment but still face large difficulties in their full implementation (e.g. COM2015/120; COM2013/683). This contribution is motivated by a different perspective: the full implementation of the SDGs and integrated land policies requires challenging some key overlooked issues, including full competence in (and capability to manage) landscape variability, its multi-functionality (e.g. agriculture/environment), and its dynamic nature (many processes, including crop growth and the fate of pollutants, are dynamic); moreover, it requires supporting actions at a very detailed local scale, since many processes and problems are site specific. The landscape and all the above issues have the soil as their pulsing heart. Accordingly, we aim to demonstrate the multiple benefits of using a smart geoSpatial Decision Support System (S-DSS) grounded in soil modelling, called SOILCONSWEB (an EU LIFE+ project and its extensions). It is a freely accessible web platform based on a Geospatial Cyber-Infrastructure (GCI) and developed in Valle Telesina (South Italy) over an area of 20,000 ha. It supports multilevel decision-making in agriculture and the environment, including interaction with other land uses (such as landscape and urban planning), and thus simultaneously delivers on SDGs 2, 3, 13, 15, 15.3, and 16.7.

  10. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools.
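
    For readers unfamiliar with the platform, a minimal Earth Engine Python API sketch of this style of server-side analysis is shown below; it requires an authenticated Earth Engine account, and the asset ID, bands, and region are examples rather than those used in the cited studies:

      import ee

      ee.Initialize()

      region = ee.Geometry.Rectangle([-122.6, 37.2, -121.8, 37.9])        # SF Bay area
      landsat = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")             # Landsat 8 surface reflectance
                 .filterBounds(region)
                 .filterDate("2020-06-01", "2020-09-01"))

      composite = landsat.median()                    # summer median composite
      ndvi = composite.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")

      stats = ndvi.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=30)
      print(stats.getInfo())                          # computation runs server-side; only the result returns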

  11. When Informationists Get Involved: the CHICA-GIS Project.

    PubMed

    Whipple, Elizabeth C; Odell, Jere D; Ralston, Rick K; Liu, Gilbert C

    2013-01-01

    Child Health Improvement through Computer Automation (CHICA) is a computer decision support system (CDSS) that interfaces with existing electronic medical record systems (EMRS) and delivers "just-in-time" patient-relevant guidelines to physicians during the clinical encounter and accurately captures structured data from all who interact with the system. "Delivering Geospatial Intelligence to Health Care Professionals (CHICA-GIS)" (1R01LM010923-01) expands the medical application of Geographic Information Systems (GIS) by integrating a geographic information system with CHICA. To provide knowledge management support for CHICA-GIS, three informationists at the Indiana University School of Medicine were awarded a supplement from the National Library of Medicine. The informationists will enhance CHICA-GIS by: improving the accuracy and accessibility of information, managing and mapping the knowledge which undergirds the CHICA-GIS decision support tool, supporting community engagement and consumer health information outreach, and facilitating the dissemination of new CHICA-GIS research results and services.

  12. A web based spatial decision supporting system for land management and soil conservation

    NASA Astrophysics Data System (ADS)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-02-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) but also many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a Spatial Decision Support System based on geospatial cyber-infrastructure (GCI) can embody all of the above, so producing a smart system for supporting decision making for agriculture, forestry and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on soil and land conservation (SOILCONSWEB-LIFE+ project). The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry and urban planning issues through the web. The system has been applied to and tested in an area of about 20 000 ha in the South of Italy, within the framework of a European LIFE+ project. The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart web based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (http://www.landconsultingweb.eu). This may help bridge the last, very important divide between scientists working on the landscape and end users.

  13. A Web-based spatial decision supporting system for land management and soil conservation

    NASA Astrophysics Data System (ADS)

    Terribile, F.; Agrillo, A.; Bonfante, A.; Buscemi, G.; Colandrea, M.; D'Antonio, A.; De Mascellis, R.; De Michele, C.; Langella, G.; Manna, P.; Marotta, L.; Mileti, F. A.; Minieri, L.; Orefice, N.; Valentini, S.; Vingiani, S.; Basile, A.

    2015-07-01

    Today it is evident that there are many contrasting demands on our landscape (e.g. food security, more sustainable agriculture, higher income in rural areas, etc.) as well as many land degradation problems. It has been proved that providing operational answers to these demands and problems is extremely difficult. Here we aim to demonstrate that a spatial decision support system based on geospatial cyberinfrastructure (GCI) can address all of the above, so producing a smart system for supporting decision making for agriculture, forestry, and urban planning with respect to the landscape. In this paper, we discuss methods and results of a special kind of GCI architecture, one that is highly focused on land management and soil conservation. The system allows us to obtain dynamic, multidisciplinary, multiscale, and multifunctional answers to agriculture, forestry, and urban planning issues through the Web. The system has been applied to and tested in an area of about 20 000 ha in the south of Italy, within the framework of a European LIFE+ project (SOILCONSWEB). The paper reports - as a case study - results from two different applications dealing with agriculture (olive growth tool) and environmental protection (soil capability to protect groundwater). Developed with the help of end users, the system is starting to be adopted by local communities. The system indirectly explores a change of paradigm for soil and landscape scientists. Indeed, the potential benefit is shown of overcoming current disciplinary fragmentation over landscape issues by offering - through a smart Web-based system - truly integrated geospatial knowledge that may be directly and freely used by any end user (www.landconsultingweb.eu). This may help bridge the last very important divide between scientists working on the landscape and end users.

  14. Intergraph video and images exploitation capabilities

    NASA Astrophysics Data System (ADS)

    Colla, Simone; Manesis, Charalampos

    2013-08-01

    The current paper focuses on the capture, fusion, and processing of aerial imagery in order to leverage full motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. Unmanned aerial vehicles (UAV) have provided critical real-time surveillance and operational support to military organizations, and are a key source of intelligence, particularly when integrated with other geospatial data. In the current workflow, the UAV operators first plan the flight by using flight planning software. During the flight the UAV sends a live video stream directly to the field to be processed by Intergraph software, which generates and disseminates georeferenced images through a service-oriented architecture based on the ERDAS Apollo suite. The raw video-based data sources provide the most recent view of a situation and can augment other forms of geospatial intelligence - such as satellite imagery and aerial photos - to provide a richer, more detailed view of the area of interest. To effectively use video as a source of intelligence, however, the analyst needs to seamlessly fuse the video with these other types of intelligence, such as map features and annotations. Intergraph has developed an application that automatically generates mosaicked, georeferenced images and tags along the video route, which can then be seamlessly integrated with other forms of static data, such as aerial photos, satellite imagery, or geospatial layers and features. Consumers will finally have the ability to use a single, streamlined system to complete the entire geospatial information lifecycle: capturing geospatial data using sensor technology; processing vector, raster, and terrain data into actionable information; managing, fusing, and sharing geospatial data and video together; and finally, rapidly and securely delivering integrated information products, ensuring individuals can make timely decisions.

  15. The Virginia Geocoin Adventure: An Experiential Geospatial Learning Activity

    ERIC Educational Resources Information Center

    Johnson, Laura; McGee, John; Campbell, James; Hays, Amy

    2013-01-01

    Geospatial technologies have become increasingly prevalent across our society. Educators at all levels have expressed a need for additional resources that can be easily adopted to support geospatial literacy and state standards of learning, while enhancing the overall learning experience. The Virginia Geocoin Adventure supports the needs of 4-H…

  16. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models, damage and loss estimates, and aftershock forecasts that were posted to a NASA information site and delivered directly to end-users such as USAID, OFDA, World Bank, and UNICEF.

  17. Generalized Cartographic and Simultaneous Representation of Utility Networks for Decision-Support Systems and Crisis Management in Urban Environments

    NASA Astrophysics Data System (ADS)

    Becker, T.; König, G.

    2015-10-01

    Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting relevant information to the involved actors. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific analysis throughout the decision-making process. Meaningful cartographic presentation is needed for coordinating the activities of crisis managers in a highly dynamic situation, since operators' attention span and their spatial memories are limiting factors during the perception and interpretation process. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for making well thought-out and appropriate decisions. Although utility networks are among the most complex and most frequently required systems in urban environments, meaningful cartographic presentations of multiple utility networks for disaster management do not exist. Therefore, an optimized visualization of utility infrastructure for emergency response procedures is proposed. The article will describe a conceptual approach on how to simplify, aggregate, and visualize multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.

  18. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, Peng; Gong, Jianya; Di, Liping

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
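
    For reference, discovering records from a standard OGC CSW endpoint, the baseline interface this work extends, can be done with OWSLib as sketched below; the endpoint URL is a placeholder:

      from owslib.csw import CatalogueServiceWeb
      from owslib.fes import PropertyIsLike

      csw = CatalogueServiceWeb("https://example.org/csw")       # hypothetical CSW endpoint
      query = PropertyIsLike("csw:AnyText", "%land cover%")      # keyword constraint
      csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

      for rec_id, rec in csw.records.items():
          print(rec_id, "|", rec.title)                          # candidate datasets/services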

  19. The PANTHER User Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus

    2015-09-01

    This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.

  20. WEB-GIS Decision Support System for CO2 storage

    NASA Astrophysics Data System (ADS)

    Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela

    2013-04-01

    The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess, and disseminate types of information that are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and richer in options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly web-based application that uses GIS capabilities. The MDSS can be exploited by experts for CO2 injection and storage in deep saline aquifers. The main objective of the MDSS is to help experts take decisions based on large, structured sets of data and information. In order to achieve this objective, the MDSS has a geospatial object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) Open source - the entire platform (MDSS) is based on open-source technologies: (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, (5) add-ons, etc. (ii) Multiple database connections - the MDSS is capable of connecting to different databases located on different server machines. (iii) Desktop user experience - the MDSS architecture and design follow the structure of desktop software. (iv) Communication - the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify, and download data within the application. The architecture of the system involves one database and a modular application composed of: (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The database component is built using the PostgreSQL and PostGIS open-source technologies. The visualization module allows the user to view data on CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, (3) 3D visualization. The analysis module allows the user to perform analyses such as injectivity, containment, and capacity analysis. The risk assessment module focuses on the site risk matrix approach. The guidelines module contains the methodological guidelines for CO2 injection and storage in deep saline aquifers.

  1. GeoSearch: A lightweight broking middleware for geospatial resources discovery

    NASA Astrophysics Data System (ADS)

    Gui, Z.; Yang, C.; Liu, K.; Xia, J.

    2012-12-01

    With petabytes of geodata and thousands of geospatial web services available over the Internet, it is critical to support geoscience research and applications by finding the best-fit geospatial resources from these massive and heterogeneous resources. Developments over the past decades have put many service components into operation to facilitate geospatial resource management and discovery. However, efficient and accurate geospatial resource discovery is still a big challenge for the following reasons: 1) Entry barriers (also called "learning curves") hinder the usability of discovery services for end users. Different portals and catalogues adopt various access protocols, metadata formats and GUI styles to organize, present and publish metadata, and it is hard for end users to learn all these technical details and differences. 2) The cost of federating heterogeneous services is high. To provide sufficient resources and facilitate data discovery, many registries adopt a periodic harvesting mechanism to retrieve metadata from other federated catalogues. These time-consuming processes lead to network and storage burdens, data redundancy, and the overhead of maintaining data consistency. 3) Heterogeneous semantics complicate data discovery. Since keyword matching is still the primary search method in many operational discovery services, search accuracy (precision and recall) is hard to guarantee. Semantic technologies (such as semantic reasoning and similarity evaluation) offer a solution to these issues; however, integrating semantic technologies with existing services is challenging due to the extensibility limitations of the service frameworks and metadata templates. 4) The capabilities for helping users make a final selection are inadequate. Most existing search portals lack intuitive and diverse information visualization methods and functions (sort, filter) to present, explore and analyze search results. Furthermore, the presentation of value-added information (such as service quality and user feedback), which conveys important decision-supporting information, is missing. To address these issues, we prototyped a distributed search engine, GeoSearch, based on a brokering middleware framework to search, integrate and visualize heterogeneous geospatial resources. Specifically, 1) a lightweight discovery broker is developed to conduct distributed search; the broker retrieves metadata records for geospatial resources and additional information from dispersed services (portals and catalogues) and other systems on the fly. 2) A quality monitoring and evaluation broker (i.e., QoS Checker) is developed and integrated to provide quality information for geospatial web services. 3) Semantics-assisted search and relevance evaluation functions are implemented by loosely interoperating with an ESIP Testbed component. 4) Sophisticated information and data visualization functionalities and tools are assembled to improve the user experience and assist resource selection.
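
    A rough sketch of the broker pattern described above: a keyword query fans out to several remote catalogues concurrently and the responses are merged on the fly, without periodic harvesting. The endpoint URLs, query parameter names and JSON layout are hypothetical placeholders rather than GeoSearch's actual interfaces:

        # Hypothetical broker sketch: fan a keyword query out to several catalogues,
        # merge the results and de-duplicate them by record identifier.
        import requests
        from concurrent.futures import ThreadPoolExecutor

        # Placeholder endpoints; a real broker would adapt to each catalogue's protocol.
        ENDPOINTS = [
            "https://catalogue-a.example.org/search",
            "https://catalogue-b.example.org/search",
        ]

        def search_one(url, keyword):
            try:
                resp = requests.get(url, params={"q": keyword, "format": "json"}, timeout=10)
                resp.raise_for_status()
                return resp.json().get("records", [])
            except requests.RequestException:
                return []  # a failing catalogue should not break the whole search

        def broker_search(keyword):
            with ThreadPoolExecutor(max_workers=len(ENDPOINTS)) as pool:
                batches = pool.map(lambda url: search_one(url, keyword), ENDPOINTS)
            merged = {}
            for batch in batches:
                for record in batch:
                    merged.setdefault(record.get("id"), record)  # de-duplicate by id
            return list(merged.values())

        if __name__ == "__main__":
            print(len(broker_search("land cover")), "records found")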

  2. Visualization and Ontology of Geospatial Intelligence

    NASA Astrophysics Data System (ADS)

    Chan, Yupo

    Recent events have deepened our conviction that many human endeavors are best described in a geospatial context. This is evidenced by the prevalence of location-based services afforded by ubiquitous cell phone usage. It is also manifested in the popularity of Internet tools such as Google Earth. As we commute to work or travel on business or pleasure, we make decisions based on the geospatial information provided by such location-based services. When corporations devise their business plans, they also rely heavily on such geospatial data. By definition, local, state and federal governments provide services according to geographic boundaries. One estimate suggests that 85 percent of data contain spatial attributes.

  3. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    ERIC Educational Resources Information Center

    Kulo, Violet; Bodzin, Alec

    2013-01-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade…

  4. A Dynamic Information Framework: A Multi-Sector, Geospatial Gateway for Environmental Conservation and Adaptation to Climate Change

    NASA Astrophysics Data System (ADS)

    Fernandes, E. C.; Norbu, C.; Juizo, D.; Wangdi, T.; Richey, J. E.

    2011-12-01

    Landscapes, watersheds, and their downstream coastal and lacustrine zones are facing a series of challenges critical to their future, centered on the availability and distribution of water. Management options cover a range of issues, from bringing safe water to local villages for the rural poor, to developing adaptation strategies for both rural and urban populations and large infrastructure, to sustaining the environmental flows and ecosystem services needed for natural and human-dominated ecosystems. These targets represent a very complex set of intersecting issues of scale, cross-sector science and technology, education, politics, and economics, and the desired sustainable development is closely linked to how the nominally responsible governmental ministries respond to the information they have. In practice, such information, and even such perspectives, are virtually absent in much of the developing world. A Dynamic Information Framework (DIF) is being designed as a knowledge platform whereby decision-makers in information-sparse regions can consider rigorous scenarios of alternative futures and obtain decision support for complex environmental and economic decisions. The DIF is a geospatial gateway, with functional components of base data layers; directed data layers focused on synthetic objectives; geospatially explicit, process-based, cross-sector simulation models (requiring data from the directed data layers); facilitated input/output (including visualizations); and decision support and scenario-testing capabilities. A fundamental aspect of a DIF is not only the convergence of multi-sector information, but how that information can be (a) integrated, (b) used for robust simulations and projections, and (c) conveyed to policymakers and stakeholders in the most compelling, and visual, manner. Examples are given of emerging applications. The ZambeziDIF was used to establish baselines for agriculture, biodiversity, and water resources in the lower Zambezi valley of Mozambique. The DrukDIF for Bhutan is moving from a test of concept to an operational phase, with uses ranging from extending local biodiversity to computing how much energy can be sold tomorrow based on waterflows today. The AralDIF is being developed to serve as a neutral and transparent platform and as a catalyst for open discussion on water and energy linkages in central Asia. The ImisoziDIF is now being ramped up in Rwanda to help guide the scaling up of agricultural practices and biodiversity from individual sites to the whole country. The Virtual Mekong Basin "tells the story" of the multiple issues facing the Mekong Basin.

  5. Adoption of Geospatial Systems towards evolving Sustainable Himalayan Mountain Development

    NASA Astrophysics Data System (ADS)

    Murthy, M. S. R.; Bajracharya, B.; Pradhan, S.; Shestra, B.; Bajracharya, R.; Shakya, K.; Wesselmann, S.; Ali, M.; Bajracharya, S.; Pradhan, S.

    2014-11-01

    Natural resource dependence of mountain communities, rapid social and developmental change, disaster proneness and climate change are conceived as the critical factors regulating sustainable Himalayan mountain development. The Himalayan region, with its particular geographic setting and great physical and cultural diversity, presents a formidable challenge to collecting and managing data and information and to understanding its varied socio-ecological settings. Recent advances in earth observation, near real-time data and in-situ measurements, in combination with information and communication technology, have transformed the way we collect, process and generate information and how we use such information for societal benefits. Glacier dynamics, land cover change, disaster risk reduction systems, food security and ecosystem conservation are a few thematic areas where geospatial information and knowledge have significantly contributed to informed decision making over the region. The emergence and adoption of near real-time systems, unmanned aerial vehicles (UAV), broad-scale citizen science (crowd-sourcing), mobile services and mapping, and cloud computing have paved the way towards developing automated environmental monitoring systems, enhanced scientific understanding of geophysical and biophysical processes, coupled management of socio-ecological systems and community-based adaptation models tailored to mountain-specific environments. There are differentiated capacities among the ICIMOD regional member countries with regard to utilization of earth observation and geospatial technologies. The region can greatly benefit from a coordinated and collaborative approach to capture the opportunities offered by earth observation and geospatial technologies. Regional-level data sharing, knowledge exchange, a Himalayan GEO supporting geospatial platforms, spatial data infrastructure, and unique region-specific satellite systems to address trans-boundary challenges would go a long way towards evolving sustainable Himalayan livelihoods.

  6. Attenuation of Storm Surge Flooding By Wetlands in the Chesapeake Bay: An Integrated Geospatial Framework Evaluating Impacts to Critical Infrastructure

    NASA Astrophysics Data System (ADS)

    Khalid, A.; Haddad, J.; Lawler, S.; Ferreira, C.

    2014-12-01

    Areas along the Chesapeake Bay and its tributaries are extremely vulnerable to hurricane flooding, as evidenced by the costly effects and severe impacts of recent storms along the Virginia coast, such as Hurricane Isabel in 2003 and Hurricane Sandy in 2012. Coastal wetlands, in addition to their ecological importance, are expected to mitigate the impact of storm surge by acting as a natural protection against hurricane flooding. Quantifying such interactions helps to provide a sound scientific basis to support planning and decision making. Using storm surge flooding from various historical hurricanes, simulated using a coupled hydrodynamic wave model (ADCIRC-SWAN), we propose an integrated framework yielding a geospatial identification of the capacity of Chesapeake Bay wetlands to protect critical infrastructure. Spatial identification of Chesapeake Bay wetlands is derived from the National Wetlands Inventory (NWI), National Land Cover Database (NLCD), and the Coastal Change Analysis Program (C-CAP). Inventories of population and critical infrastructure are extracted from US Census block data and FEMA's HAZUS-Multi Hazard geodatabase. Geospatial and statistical analyses are carried out to develop a relationship between wetland land cover, hurricane flooding, population and infrastructure vulnerability. These analyses result in the identification and quantification of populations and infrastructure in flooded areas that lie within a reasonable buffer surrounding the identified wetlands. Our analysis thus produces a spatial perspective on the potential for wetlands to attenuate hurricane flood impacts in critical areas. Statistical analysis will support hypothesis testing to evaluate the benefits of wetlands from a flooding and storm-surge attenuation perspective. Results from geospatial analysis are used to identify where interactions with critical infrastructure are relevant in the Chesapeake Bay.
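
    A small sketch of the buffer-and-overlay step in the geospatial analysis described above, written with geopandas (an assumption; the study does not name its GIS toolkit). File names, the 500 m buffer distance, the projected CRS and the use of a recent geopandas sjoin signature are all illustrative assumptions:

        # Hypothetical sketch: count critical-infrastructure points that fall inside
        # flooded areas within a buffer around mapped wetlands.
        import geopandas as gpd

        wetlands = gpd.read_file("wetlands.shp").to_crs(epsg=26918)          # UTM 18N, metres
        flood = gpd.read_file("storm_surge_extent.shp").to_crs(epsg=26918)
        infra = gpd.read_file("critical_infrastructure.shp").to_crs(epsg=26918)

        # Buffer the wetlands (placeholder 500 m) and keep only flooded area inside that buffer.
        buffer_zone = gpd.GeoDataFrame(geometry=wetlands.buffer(500), crs=wetlands.crs)
        flooded_near_wetlands = gpd.overlay(flood, buffer_zone, how="intersection")

        # Infrastructure points located in flooded areas near wetlands.
        exposed = gpd.sjoin(infra, flooded_near_wetlands, predicate="within")
        print(len(exposed), "exposed facilities within the wetland buffer")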

  7. The National Geospatial Technical Operations Center

    USGS Publications Warehouse

    Craun, Kari J.; Constance, Eric W.; Donnelly, Jay; Newell, Mark R.

    2009-01-01

    The United States Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) provides geospatial technical expertise in support of the National Geospatial Program in its development of The National Map, National Atlas of the United States, and implementation of key components of the National Spatial Data Infrastructure (NSDI).

  8. Leveraging geospatial data, technology, and methods for improving the health of communities: priorities and strategies from an expert panel convened by the CDC.

    PubMed

    Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L

    2010-04-01

    In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.

  9. CELL5M: A geospatial database of agricultural indicators for Africa South of the Sahara.

    PubMed

    Koo, Jawoo; Cox, Cindy M; Bacou, Melanie; Azzarri, Carlo; Guo, Zhe; Wood-Sichra, Ulrike; Gong, Queenie; You, Liangzhi

    2016-01-01

    Recent progress in large-scale georeferenced data collection is widening opportunities for combining multi-disciplinary datasets from biophysical to socioeconomic domains, advancing our analytical and modeling capacity. Granular spatial datasets provide critical information necessary for decision makers to identify target areas, assess baseline conditions, prioritize investment options, set goals and targets, and monitor impacts. However, key challenges in reconciling data across themes, scales and borders restrict our capacity to produce global and regional maps and time series. This paper provides the overview, structure and coverage of CELL5M, an open-access database of geospatial indicators at 5 arc-minute grid resolution, and introduces a range of analytical applications and use cases. CELL5M covers a wide set of agriculture-relevant domains for all countries in Africa South of the Sahara and supports our understanding of the multi-dimensional spatial variability inherent in farming landscapes throughout the region.
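
    As a point of reference for the 5 arc-minute structure, the sketch below shows one common way to index a global 5 arc-minute grid (roughly 10 km at the equator) and map a longitude/latitude pair to its cell; the row/column origin used here is a generic illustration and not necessarily the CELL5M internal scheme:

        # Illustrative 5 arc-minute (1/12 degree) global grid indexing.
        # The row/column convention below is an assumption for demonstration only.
        CELL = 5.0 / 60.0                      # cell size in degrees
        N_COLS = int(round(360 / CELL))        # 4320 columns
        N_ROWS = int(round(180 / CELL))        # 2160 rows

        def cell_index(lon, lat):
            """Return (row, col) of the 5 arc-minute cell containing (lon, lat),
            with row 0 at 90N and col 0 at 180W."""
            col = int((lon + 180.0) / CELL)
            row = int((90.0 - lat) / CELL)
            # clamp points that sit exactly on the outer edge of the grid
            return min(row, N_ROWS - 1), min(col, N_COLS - 1)

        print(cell_index(36.8, -1.3))   # a point near Nairobi (placeholder coordinates)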

  10. Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.

    PubMed

    Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

    2011-07-01

    The Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information, with the ability to extrapolate across different scales, with applications to risk assessment models and decision-making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the 1986 Chernobyl accident in Soviet Ukraine. Copyright © 2011 SETAC.

  11. Initial PDS4 Support for the Geospatial Data Abstraction Library (GDAL)

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2018-04-01

    We introduce initial support for PDS4 within the Geospatial Data Abstraction Library (GDAL). Both highlights and limitations are presented, as well as a short discussion on methods for supporting a GDAL-based workflow for PDS4 conversions.
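
    A minimal sketch of the kind of GDAL-based conversion workflow mentioned above, assuming a GDAL build whose PDS4 driver is enabled and using a placeholder label file name; it illustrates the general technique rather than the specific workflow described in the abstract:

        # Hypothetical conversion of a PDS4 product (XML label) to GeoTIFF with GDAL.
        from osgeo import gdal

        gdal.UseExceptions()

        src = gdal.Open("observation.xml")      # PDS4 label file (placeholder name)
        print(src.RasterXSize, src.RasterYSize, src.GetProjection())

        # gdal.Translate handles the format conversion and keeps the georeferencing.
        gdal.Translate("observation.tif", src, format="GTiff")
        src = None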

  12. FY 2018 Grant Announcement: FY2018 Support for Geospatial Analysis Support

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency’s (EPA) Chesapeake Bay Program Office (CBPO) is announcing a Request for Proposals (RFP) for applicants to provide the Chesapeake Bay Program (CBP) partners with a proposal for providing geospatial analysis support

  13. EnviroAtlas: Providing Nationwide Geospatial Ecosystem Goods and Services Indicators and Indices to Inform Decision-Making, Research, and Education

    EPA Science Inventory

    EnviroAtlas is a multi-organization effort led by the US Environmental Protection Agency to develop, host and display a large suite of nation-wide geospatial indicators and indices of ecosystem services. This open access tool allows users to view, analyze, and download a wealth o...

  14. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all those problems becomes an important requirement. The Grid promotes and facilitates secure interoperation of heterogeneous, distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and message transfer based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web service interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols, the advantages offered by Grid technology - such as providing secure interoperability between distributed geospatial resources - and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific complex geospatial functionality and the high computational power and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of the geographical models. In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main function is exposed through the enviroGRIDS Portal and, consequently, through end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the single entry point for the user into the system, and the portal presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
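
    For orientation, the following is a minimal sketch of an OGC Web Map Service request of the kind discussed above, using the OWSLib Python client (an assumption; the abstract does not name a client library) against a placeholder service URL, layer name and bounding box:

        # Hypothetical WMS GetMap request via OWSLib; URL, layer and bbox are placeholders.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
        print(list(wms.contents))               # layers advertised by the service

        img = wms.getmap(
            layers=["catchment:land_cover"],    # placeholder layer name
            srs="EPSG:4326",
            bbox=(27.0, 41.0, 42.0, 49.0),      # rough Black Sea catchment extent
            size=(800, 600),
            format="image/png",
            transparent=True,
        )
        with open("land_cover.png", "wb") as out:
            out.write(img.read())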

  15. A comparison of geospatially modeled fire behavior and fire management utility of three data sources in the southeastern United States

    Treesearch

    LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard

    2012-01-01

    Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...

  16. New directions in valuing geospatial information - how to value geospatial information for policy and business decisions in the future

    NASA Astrophysics Data System (ADS)

    Smart, A. C.

    2014-12-01

    Governments are increasingly asking for more evidence of the benefits of investing in geospatial data and infrastructure before committing funds. They are looking for a clearer articulation of the economic, environmental and social benefits than has been possible in the past. The development of techniques has accelerated in the past five years as governments and industry have become more involved in the capture and use of geospatial data. However, evaluation practitioners have struggled to answer these emerging questions. The paper explores the types of questions that decision makers are asking and discusses the different approaches and methods that have been used recently to answer them. It explores the need for better business case models. The emerging approaches are then discussed and their attributes reviewed. These include methods of analysing tangible economic benefits, intangible benefits and societal benefits. The paper explores the use of value chain analysis and real options analysis to better articulate the impacts on international competitiveness and how to value the potential benefits of innovations enabled by the geospatial data that are produced. The paper concludes by illustrating the potential for these techniques in current and future decision making.

  17. Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support

    NASA Astrophysics Data System (ADS)

    Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.

    2017-12-01

    The approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some of the already available and useful practical developments are described, including city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided for the Kiev urban area.

  18. Stakeholder-driven geospatial modeling for assessing tsunami vertical-evacuation strategies in the U.S. Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Wood, N. J.; Schmidtlein, M.; Schelling, J.; Jones, J.; Ng, P.

    2012-12-01

    Recent tsunami disasters, such as the 2010 Chilean and 2011 Tohoku events, demonstrate the significant life loss that can occur from tsunamis. Many coastal communities in the world are threatened by near-field tsunami hazards that may inundate low-lying areas only minutes after a tsunami begins. Geospatial integration of demographic data and hazard zones has identified potential impacts on populations in communities susceptible to near-field tsunami threats. Pedestrian-evacuation models build on these geospatial analyses to determine if individuals in tsunami-prone areas will have sufficient time to reach high ground before tsunami-wave arrival. Areas where successful evacuations are unlikely may warrant vertical-evacuation (VE) strategies, such as berms or structures designed to aid evacuation. The decision of whether and where VE strategies are warranted is complex. Such decisions require an interdisciplinary understanding of tsunami hazards, land cover conditions, demography, community vulnerability, pedestrian-evacuation models, land-use and emergency-management policy, and decision science. Engagement with the at-risk population and local emergency managers in VE planning discussions is critical because resulting strategies include permanent structures within a community and their local ownership helps ensure long-term success. We present a summary of an interdisciplinary approach to assess VE options in communities along the southwest Washington coast (U.S.A.) that are threatened by near-field tsunami hazards generated by Cascadia subduction zone earthquakes. Pedestrian-evacuation models based on an anisotropic approach that uses path-distance algorithms were merged with population data to forecast the distribution of at-risk individuals within several communities as a function of travel time to safe locations. A series of community-based workshops helped identify potential VE options in these communities, collectively known as "Project Safe Haven" at the State of Washington Emergency Management Division. Models of the influence of stakeholder-driven VE options identified changes in the type and distribution of at-risk individuals. Insights from VE use and performance as an aid to evacuations from the 2011 Tohoku tsunami helped to inform the meetings and the analysis. We developed geospatial tools to automate parts of the pedestrian-evacuation models to support the iterative process of developing VE options and forecasting changes in population exposure. Our summary presents the interdisciplinary effort to forecast population impacts from near-field tsunami threats and to develop effective VE strategies to minimize fatalities in future events.
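
    A toy sketch of the path-distance idea behind the pedestrian-evacuation modelling described above: travel time accumulates over a cost surface outward from designated safe cells. It uses scikit-image's minimum-cost-path routine with invented grid values, so it illustrates the technique only and is not the model used in the study:

        # Toy least-cost (path-distance) travel-time surface from safe cells.
        # Grid, cell size and walking speed are invented for illustration.
        import numpy as np
        from skimage.graph import MCP_Geometric

        CELL_SIZE = 30.0      # metres per grid cell
        SPEED = 1.1           # walking speed in m/s

        # Per-cell traversal cost in seconds (uniform here; slope and land cover would vary it).
        cost = np.full((200, 200), CELL_SIZE / SPEED)
        cost[80:120, 80:120] *= 3.0            # e.g. slower travel through dense vegetation

        # Safe cells (high ground or a vertical-evacuation refuge).
        safe_cells = [(0, 0), (199, 150)]

        mcp = MCP_Geometric(cost)              # diagonal moves weighted by sqrt(2)
        travel_time, _ = mcp.find_costs(starts=safe_cells)

        print("max evacuation time (minutes):", travel_time.max() / 60.0)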

  19. Geo-portal as a planning instrument: supporting decision making and fostering market potential of Energy efficiency in buildings

    NASA Astrophysics Data System (ADS)

    Cuca, Branka; Brumana, Raffaella; Oreni, Daniela; Iannaccone, Giuliana; Sesana, Marta Maria

    2014-03-01

    Steady technological progress has led to noticeable advancement in disciplines associated with Earth observation, enabling information about changing scenarios, both natural and urban, to be delivered in (almost) real time. In particular, over the last few decades the analysis and monitoring of built environments has required integration of the local scale with the wider territorial framework. The progress of Geographic Information (GI) science has provided significant advancements in spatial analysis, while the near-universal availability of the Internet has ensured a fast and constant exchange of geo-information, even for everyday users' requirements. Due to its descriptive and semantic nature, geospatial information is capable of providing a complete overview of a certain phenomenon and of predicting its implications within the natural, social and economic context. However, in order to integrate geospatial data into decision-making processes, it is necessary to provide a specific context that is well supported by verified data. This paper investigates the potential of geo-portals as planning instruments developed to share multi-temporal/multi-scale spatial data, responding to specific end-users' demands in the case of Energy efficiency in Buildings (EeB) across European countries. The case study concerns the GeoCluster geo-portal and mapping tool (Project GE2O, FP7), built upon a GeoClustering methodology for mapping indicators relevant to energy efficiency technologies in the construction sector.

  20. Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from Ogc

    NASA Astrophysics Data System (ADS)

    Percivall, George; Simonis, Ingo

    2016-06-01

    The necessity of open standards for effective sharing and use of remote sensing continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.

  1. Tsunami vertical-evacuation planning in the U.S. Pacific Northwest as a geospatial, multi-criteria decision problem

    USGS Publications Warehouse

    Wood, Nathan; Jones, Jeanne; Schelling, John; Schmidtlein, Mathew

    2014-01-01

    Tsunami vertical-evacuation (TVE) refuges can be effective risk-reduction options for coastal communities with local tsunami threats but no accessible high ground for evacuations. Deciding where to locate TVE refuges is a complex risk-management question, given the potential for conflicting stakeholder priorities and multiple, suitable sites. We use the coastal community of Ocean Shores (Washington, USA) and the local tsunami threat posed by Cascadia subduction zone earthquakes as a case study to explore the use of geospatial, multi-criteria decision analysis for framing the locational problem of TVE siting. We demonstrate a mixed-methods approach that uses potential TVE sites identified at community workshops, geospatial analysis to model changes in pedestrian evacuation times for TVE options, and statistical analysis to develop metrics for comparing population tradeoffs and to examine influences in decision making. Results demonstrate that no one TVE site can save all at-risk individuals in the community and each site provides varying benefits to residents, employees, customers at local stores, tourists at public venues, children at schools, and other vulnerable populations. The benefit of some proposed sites varies depending on whether or not nearby bridges will be functioning after the preceding earthquake. Relative rankings of the TVE sites are fairly stable under various criteria-weighting scenarios but do vary considerably when comparing strategies to exclusively protect tourists or residents. The proposed geospatial framework can serve as an analytical foundation for future TVE siting discussions.
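
    To make the weighted, multi-criteria comparison concrete, the sketch below scores a handful of candidate refuge sites under different criteria-weight scenarios; the site names, criterion scores and weights are invented placeholders, not values from the Ocean Shores analysis:

        # Toy weighted-sum multi-criteria comparison of candidate TVE refuge sites.
        # All numbers below are invented placeholders for illustration.
        import numpy as np

        sites = ["School", "Marina", "City park"]
        criteria = ["residents_served", "tourists_served", "evac_time_gain", "cost_penalty"]

        # Normalized (0-1) benefit scores per site; cost enters as a negative criterion.
        scores = np.array([
            [0.8, 0.3, 0.7, -0.4],
            [0.4, 0.9, 0.5, -0.6],
            [0.6, 0.5, 0.6, -0.3],
        ])

        weight_scenarios = {
            "balanced":          np.array([0.3, 0.3, 0.3, 0.1]),
            "protect_tourists":  np.array([0.1, 0.6, 0.2, 0.1]),
            "protect_residents": np.array([0.6, 0.1, 0.2, 0.1]),
        }

        for name, w in weight_scenarios.items():
            ranking = np.argsort(scores @ w)[::-1]          # best site first
            print(name, "->", [sites[i] for i in ranking])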

  2. Geospatial analysis of spaceborne remote sensing data for assessing disaster impacts and modeling surface runoff in the built-environment

    NASA Astrophysics Data System (ADS)

    Wodajo, Bikila Teklu

    Every year, coastal disasters such as hurricanes and floods claim hundreds of lives and severely damage homes, businesses, and lifeline infrastructure. This research was motivated by the 2005 Hurricane Katrina disaster, which devastated the Mississippi and Louisiana Gulf Coast. The primary objective was to develop a geospatial decision-support system for extracting built-up surfaces and estimating disaster impacts using spaceborne remote sensing satellite imagery. Pre-Katrina 1-m Ikonos imagery of a 5 km x 10 km area of Gulfport, Mississippi, was used as source data to develop the built-up area and natural surfaces (BANS) classification methodology. Autocorrelation values of 0.6 or higher among the spectral reflectance values of groundtruth pixels were used to select spectral bands and to establish the BANS decision criteria of unique ranges of reflectance values. Surface classification results using GeoMedia Pro geospatial analysis for the Gulfport sample areas, based on the BANS criteria and manually drawn polygons, were within +/-7% of the groundtruth, and the difference between the BANS results and the groundtruth was statistically not significant. BANS is a significant improvement over other supervised classification methods, which showed only 50% correctly classified pixels. The storm debris and erosion estimation (SDE) methodology was developed from analysis of pre- and post-Katrina surface classification results for the Gulfport samples. The SDE severity-level criteria considered hurricane and flood damage and the vulnerability of the inhabited built environment. A linear regression model, with a Pearson R-value of +0.93, was developed for predicting SDE as a function of pre-disaster percent built-up area. SDE predictions for the Gulfport sample areas used for validation were within +/-4% of calculated values. The damage cost model considered maintenance, rehabilitation and reconstruction costs related to infrastructure damage and community impacts of Hurricane Katrina. The developed models were implemented for a study area along I-10, considering the predominantly flood-induced damage in New Orleans. The BANS methodology was calibrated for 0.6-m QuickBird2 multispectral imagery of the Karachi Port area in Pakistan; the results were accurate to within +/-6% of the groundtruth. Due to its computational simplicity, the unit hydrograph method is recommended for geospatial visualization of surface runoff in the built environment using BANS surface classification maps and elevation data. Key words: geospatial analysis, satellite imagery, built-environment, hurricane, disaster impacts, runoff.
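
    A simplified sketch of a decision-rule classification of the kind BANS describes, where each pixel is labelled built-up or natural according to whether its band reflectances fall inside predefined ranges; the band ranges and array dimensions here are invented and do not reproduce the actual BANS criteria:

        # Toy range-based built-up vs. natural surface classification.
        # The reflectance ranges below are invented placeholders, not the BANS thresholds.
        import numpy as np

        rng = np.random.default_rng(0)
        # Fake 4-band image (bands, rows, cols), reflectance scaled 0-1.
        image = rng.random((4, 100, 100))

        # Hypothetical per-band reflectance ranges characterizing built-up surfaces.
        BUILTUP_RANGES = {0: (0.25, 0.55), 1: (0.30, 0.60), 2: (0.35, 0.70), 3: (0.20, 0.50)}

        builtup = np.ones(image.shape[1:], dtype=bool)
        for band, (lo, hi) in BUILTUP_RANGES.items():
            builtup &= (image[band] >= lo) & (image[band] <= hi)

        classified = np.where(builtup, 1, 0)   # 1 = built-up, 0 = natural surface
        print("percent built-up:", 100.0 * classified.mean())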

  3. Geospatial Information is the Cornerstone of Effective Hazards Response

    USGS Publications Warehouse

    Newell, Mark

    2008-01-01

    Every day there are hundreds of natural disasters world-wide. Some are dramatic, whereas others are barely noticeable. A natural disaster is commonly defined as a natural event with catastrophic consequences for living things in the vicinity. Those events include earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, and wildfires. Man-made disasters are events that are caused by man either intentionally or by accident, and that directly or indirectly threaten public health and well-being. These occurrences span the spectrum from terrorist attacks to accidental oil spills. To assist in responding to natural and potential man-made disasters, the U.S. Geological Survey (USGS) has established the Geospatial Information Response Team (GIRT) (http://www.usgs.gov/emergency/). The primary purpose of the GIRT is to ensure rapid coordination and availability of geospatial information for effective response by emergency responders, and land and resource managers, and for scientific analysis. The GIRT is responsible for establishing monitoring procedures for geospatial data acquisition, processing, and archiving; discovery, access, and delivery of data; anticipating geospatial needs; and providing relevant geospatial products and services. The GIRT is focused on supporting programs, offices, other agencies, and the public in mission response to hazards. The GIRT will leverage the USGS Geospatial Liaison Network and partnerships with the Department of Homeland Security (DHS), National Geospatial-Intelligence Agency (NGA), and Northern Command (NORTHCOM) to coordinate the provisioning and deployment of USGS geospatial data, products, services, and equipment. The USGS geospatial liaisons will coordinate geospatial information sharing with State, local, and tribal governments, and ensure geospatial liaison back-up support procedures are in place. The GIRT will coordinate disposition of USGS staff in support of DHS response center activities as requested by DHS. The GIRT is a standing team that is available during all hazard events and is on high alert during the hurricane season from June through November each year. To track all of the requirements and data acquisitions processed through the team, the GIRT will use the new Emergency Request Track (ER Track) tool. Currently, the ER Track is only available to USGS personnel.

  4. Situational Awareness Geospatial Application (iSAGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based, or custom data source and display it geospatially; allows user-friendly conduct of spatial analysis using custom-developed tools; searches complex Geographic Information System (GIS) databases and accesses high resolution imagery. iSAGA has application at the federal, state and local levels of emergency response, consequence management, law enforcement, emergency operations and other decision makers as a tool to provide complete, visual, situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  5. Benefits of using Open Geo-spatial Data for valorization of Cultural Heritage: GeoPan app

    NASA Astrophysics Data System (ADS)

    Cuca, Branka; Previtali, Mattia; Barazzetti, Luigi; Brumana, Raffaella

    2017-04-01

    Experts consider spatial data to be one of the categories of Public Sector Information (PSI) whose exchange is particularly important. At the same time, an initiative with a broad vision such as the Digital Agenda for Europe emphasizes intelligent processing of information as an essential factor for tackling the challenges of contemporary society. In this context, Open Data are considered crucial in addressing environmental pressures, energy efficiency, land use and climate change, pollution and traffic management. Furthermore, Open Data are thought to have an important impact on more informed decision making and policy creation in multiple domains, which can be addressed even through the "apps" of our smart devices. Activities performed in the ENERGIC OD project - "European NEtwork for Redistributing Geospatial Information to user Communities - Open Data" - have led to some first conclusions on the use and re-use of geo-spatial Open Data by means of Virtual Hubs, an innovative method for brokering geo-spatial information. This paper illustrates some main benefits of using Open Geo-spatial Data for the valorisation of Cultural Heritage through the case of an innovative app called "GeoPan Atl@s". GeoPan, set in the dynamic policy context described above, aims to provide in a common platform all information valuable for sustainable territorial development, in particular material on the history and changes of the cultural landscapes of the Lombardy region. Furthermore, this innovative app is used as a test-bed to facilitate and encourage a more active exchange and exploitation of open geo-spatial information for the valorisation of cultural heritage and landscapes. The aim of this practice is also to achieve more active participation of experts, VGI communities and citizens, and a higher awareness of the multiple possible uses of historic and contemporary geo-spatial information for smarter decision making.

  6. A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology

    NASA Astrophysics Data System (ADS)

    Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli

    2007-06-01

    Based on an in-depth analysis of the current status and existing problems of GIS technology applications in epidemiology, this paper proposes a method and process for establishing a spatial decision support system for AIDS epidemic prevention by integrating COM GIS, spatial database, GPS, remote sensing and communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues in constructing a spatial decision support system for AIDS epidemic prevention is how to integrate AIDS spreading models with GIS. The paper first describes the capabilities of GIS applications in AIDS epidemic prevention, and then discusses some mature epidemic spreading models from which computation parameters can be extracted. Furthermore, a technical schema is proposed for integrating the AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model-running platforms share a common spatial database and the computing results can be spatially visualized in desktop or Web GIS clients. Finally, a complete solution for establishing a decision support system for AIDS epidemic prevention is offered based on the model integration methods and ESRI COM GIS software packages. The overall decision support system is composed of data acquisition sub-systems, network communication sub-systems, model integration sub-systems, AIDS epidemic information spatial database sub-systems, AIDS epidemic information querying and statistical analysis sub-systems, AIDS epidemic dynamic surveillance sub-systems, AIDS epidemic information spatial analysis and decision support sub-systems, as well as AIDS epidemic information publishing sub-systems based on Web GIS.
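
    As a rough illustration of coupling an epidemic spreading model to a spatial layer as described above, the sketch below runs a basic SIR model per district and leaves the resulting attack rate ready to be joined to a GIS boundary layer by district ID; the district names, populations and rate parameters are invented, and SIR is used here only as a generic stand-in for the models discussed in the paper:

        # Toy per-district SIR model; outputs could be joined to a GIS layer on district_id.
        # Populations, seeds and parameters below are invented placeholders.
        districts = {"D01": 250_000, "D02": 120_000, "D03": 80_000}
        BETA, GAMMA, DAYS = 0.25, 0.10, 365     # transmission rate, recovery rate, horizon

        prevalence = {}
        for district_id, population in districts.items():
            s, i, r = population - 10.0, 10.0, 0.0     # 10 initial cases per district
            for _ in range(DAYS):
                new_infections = BETA * s * i / population
                recoveries = GAMMA * i
                s -= new_infections
                i += new_infections - recoveries
                r += recoveries
            prevalence[district_id] = r / population   # cumulative attack rate

        # The dict keyed by district_id can then be joined to district polygons for mapping.
        print(prevalence)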

  7. Examining the Effect of Enactment of a Geospatial Curriculum on Students' Geospatial Thinking and Reasoning

    NASA Astrophysics Data System (ADS)

    Bodzin, Alec M.; Fu, Qiong; Kulo, Violet; Peffer, Tamara

    2014-08-01

    A potential method for teaching geospatial thinking and reasoning (GTR) is through geospatially enabled learning technologies. We developed an energy resources geospatial curriculum that included learning activities with geographic information systems and virtual globes. This study investigated how 13 urban middle school teachers implemented and varied the enactment of the curriculum with their students and examined which teacher- and student-level factors accounted for students' GTR posttest achievement. Data included biweekly implementation surveys from teachers and energy resources content and GTR pre- and posttest achievement measures from 1,049 students. Students significantly increased both their energy resources content knowledge and their GTR skills related to energy resources at the end of the curriculum enactment. Both multiple regression and hierarchical linear modeling found that students' initial GTR abilities and gains in energy content knowledge were significant explanatory variables for their geospatial achievement at the end of curriculum enactment, p < .001. Teacher enactment factors, including adherence to implementing the critical components of the curriculum and the number of years the teachers had taught the curriculum, did not have significant effects on students' geospatial posttest achievement. The findings from this study provide evidence that learning with geospatially enabled learning technologies can support GTR among urban middle-level learners.

  8. The Future of Geospatial Standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Simonis, I.

    2016-12-01

    The OGC is an international not-for-profit standards development organization (SDO) committed to making quality standards for the geospatial community. A community of more than 500 member organizations with more than 6,000 people registered at the OGC communication platform drives the development of standards that are freely available for anyone to use and to improve sharing of the world's geospatial data. OGC standards are applied in a variety of application domains including Environment, Defense and Intelligence, Smart Cities, Aviation, Disaster Management, Agriculture, Business Development and Decision Support, and Meteorology. Profiles help to apply information models to different communities, thus adapting to particular needs of that community while ensuring interoperability by using common base models and appropriate support services. Other standards address orthogonal aspects such as handling of Big Data, Crowd-sourced information, Geosemantics, or container for offline data usage. Like most SDOs, the OGC develops and maintains standards through a formal consensus process under the OGC Standards Program (OGC-SP) wherein requirements and use cases are discussed in forums generally open to the public (Domain Working Groups, or DWGs), and Standards Working Groups (SWGs) are established to create standards. However, OGC is unique among SDOs in that it also operates the OGC Interoperability Program (OGC-IP) to provide real-world testing of existing and proposed standards. The OGC-IP is considered the experimental playground, where new technologies are researched and developed in a user-driven process. Its goal is to prototype, test, demonstrate, and promote OGC Standards in a structured environment. Results from the OGC-IP often become requirements for new OGC standards or identify deficiencies in existing OGC standards that can be addressed. This presentation will provide an analysis of the work advanced in the OGC consortium including standards and testbeds, where we can extract a trend for the future of geospatial standards. We see a number of key elements in focus, but simultaneously a broadening of standards to address particular communities' needs.

  9. The Efficacy of Educative Curriculum Materials to Support Geospatial Science Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Bodzin, Alec; Peffer, Tamara; Kulo, Violet

    2012-01-01

    Teaching and learning about geospatial aspects of energy resource issues requires that science teachers apply effective science pedagogical approaches to implement geospatial technologies into classroom instruction. To address this need, we designed educative curriculum materials as an integral part of a comprehensive middle school energy…

  10. Center of Excellence for Geospatial Information Science research plan 2013-18

    USGS Publications Warehouse

    Usery, E. Lynn

    2013-01-01

    The U.S. Geological Survey Center of Excellence for Geospatial Information Science (CEGIS) was created in 2006 and since that time has provided research primarily in support of The National Map. The presentations and publications of the CEGIS researchers document the research accomplishments that include advances in electronic topographic map design, generalization, data integration, map projections, sea level rise modeling, geospatial semantics, ontology, user-centered design, volunteer geographic information, and parallel and grid computing for geospatial data from The National Map. A research plan spanning 2013–18 has been developed extending the accomplishments of the CEGIS researchers and documenting new research areas that are anticipated to support The National Map of the future. In addition to extending the 2006–12 research areas, the CEGIS research plan for 2013–18 includes new research areas in data models, geospatial semantics, high-performance computing, volunteered geographic information, crowdsourcing, social media, data integration, and multiscale representations to support the Three-Dimensional Elevation Program (3DEP) and The National Map of the future of the U.S. Geological Survey.

  11. Development of Geospatial Map Based Election Portal

    NASA Astrophysics Data System (ADS)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Government of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs such as DDA, MCD, DJB, the State Election Department and DMRC, for the benefit of all citizens of the GNCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of the NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the planning and management needs of the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of the NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct area boundaries of voter areas of polling stations, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP could help achieve not only the desired transparency and ease in the planning process but also facilitate the management of elections through efficient and effective tools. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  12. A Cloud-enabled Service-oriented Spatial Web Portal for Facilitating Arctic Data Discovery, Integration, and Utilization

    NASA Astrophysics Data System (ADS)

    dias, S. B.; Yang, C.; Li, Z.; XIA, J.; Liu, K.; Gui, Z.; Li, W.

    2013-12-01

    Global climate change has become one of the biggest concerns for humankind in the 21st century due to its broad impacts on society and ecosystems across the world. The Arctic has been observed to be one of the regions most vulnerable to climate change. In order to understand the impacts of climate change on the natural environment, ecosystems, biodiversity and more in the Arctic region, and thus to better support the planning and decision-making process, cross-disciplinary research is required to monitor and analyze changes in the Arctic such as water, sea level and biodiversity. Conducting such research demands the efficient utilization of various geospatially referenced data, web services and information related to the Arctic region. In this paper, we propose a cloud-enabled and service-oriented Spatial Web Portal (SWP) to support the discovery, integration and utilization of Arctic-related geospatial resources, serving as a building block of the polar cyberinfrastructure. This SWP leverages the following techniques: 1) a hybrid searching mechanism combining centralized local search, distributed catalogue search and specialized Internet search for effectively discovering Arctic data and web services from multiple sources; 2) a service-oriented, quality-enabled framework for seamless integration and utilization of various geospatial resources; and 3) a cloud-enabled parallel spatial index building approach to facilitate near-real-time resource indexing and searching. A proof-of-concept prototype is developed to demonstrate the feasibility of the proposed SWP, using an example of analyzing the Arctic snow cover change over the past 50 years.
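
    A minimal sketch of the spatial-indexing idea behind point 3 above, using the Python rtree package to build an R-tree over resource bounding boxes and answer a spatial search; the resource IDs and footprints are invented, and the actual SWP builds its index in parallel on cloud resources rather than in a single process like this:

        # Toy R-tree index over geospatial resource bounding boxes (minx, miny, maxx, maxy).
        # The resource footprints below are invented placeholders.
        from rtree import index

        resources = {
            1: (-170.0, 60.0, -140.0, 75.0),   # e.g. an Alaskan snow-cover dataset
            2: (-60.0, 70.0, -20.0, 84.0),     # e.g. a Greenland data service
            3: (10.0, 65.0, 60.0, 80.0),       # e.g. a Siberian observation series
        }

        idx = index.Index()
        for rid, bbox in resources.items():
            idx.insert(rid, bbox)

        # Which resources intersect a query window over the Beaufort Sea area?
        query = (-160.0, 68.0, -130.0, 78.0)
        print(sorted(idx.intersection(query)))   # -> [1]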

  13. Development of a spatial decision support system for flood risk management in Brazil that combines volunteered geographic information with wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Horita, Flávio E. A.; Albuquerque, João Porto de; Degrossi, Lívia C.; Mendiondo, Eduardo M.; Ueyama, Jó

    2015-07-01

    Effective flood risk management requires updated information to ensure that the correct decisions can be made. This can be provided by Wireless Sensor Networks (WSN), which are a low-cost means of collecting updated information about rivers. Another valuable resource is Volunteered Geographic Information (VGI), a comparatively new means of improving the coverage of monitored areas because it is able to supply supplementary information to the WSN and thus support decision-making in flood risk management. However, there still remains the problem of how to combine WSN data with VGI. In this paper, an attempt is made to investigate AGORA-DS, a Spatial Decision Support System (SDSS) that makes flood risk management more effective by combining these data sources, i.e. WSN with VGI. This approach is built on a conceptual model that complies with the interoperable standards laid down by the Open Geospatial Consortium (OGC) - e.g. Sensor Observation Service (SOS) and Web Feature Service (WFS) - and seeks to combine and present unified information in a web-based decision support tool. This work was deployed in a real scenario of flood risk management in the town of São Carlos in Brazil. The evidence obtained from this deployment confirmed that interoperable standards can support the integration of data from distinct data sources. In addition, it showed that VGI is able to provide information about areas of the river basin that lack data because no appropriate station is present, and hence provides valuable support for the WSN data. It can thus be concluded that AGORA-DS is able to combine information provided by WSN and VGI and to provide useful information for supporting flood risk management.
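
    To give a concrete flavour of the OGC interfaces named above, the sketch below issues a standard SOS 2.0 GetObservation request using plain HTTP key-value parameters and stubs out a VGI feed to be combined with the sensor observations; the service URL, offering and property identifiers, and the VGI record layout are all hypothetical placeholders, not AGORA-DS endpoints:

        # Hypothetical SOS 2.0 GetObservation request (KVP binding) plus a stub VGI feed.
        # The URL and identifiers are placeholders; only the parameter names follow the SOS spec.
        import requests

        SOS_URL = "https://example.org/sos/service"
        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "river_gauge_offering",            # placeholder offering id
            "observedProperty": "water_level",             # placeholder property id
            "responseFormat": "http://www.opengis.net/om/2.0",
        }
        resp = requests.get(SOS_URL, params=params, timeout=10)
        sensor_xml = resp.text                             # O&M XML to be parsed downstream

        # Stub VGI reports (e.g. harvested from a crowdsourcing app) to combine with sensor data.
        vgi_reports = [
            {"lon": -47.89, "lat": -22.02, "water_level_m": 1.8, "source": "volunteer"},
        ]
        print(len(sensor_xml), "bytes of sensor observations;", len(vgi_reports), "VGI reports")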

  14. The Impact of a Geospatial Technology-Supported Energy Curriculum on Middle School Students' Science Achievement

    NASA Astrophysics Data System (ADS)

    Kulo, Violet; Bodzin, Alec

    2013-02-01

    Geospatial technologies are increasingly being integrated in science classrooms to foster learning. This study examined whether a Web-enhanced science inquiry curriculum supported by geospatial technologies promoted urban middle school students' understanding of energy concepts. The participants included one science teacher and 108 eighth-grade students classified in three ability level tracks. Data were gathered through pre/posttest content knowledge assessments, daily classroom observations, and daily reflective meetings with the teacher. Findings indicated a significant increase in the energy content knowledge for all the students. Effect sizes were large for all three ability level tracks, with the middle and low track classes having larger effect sizes than the upper track class. Learners in all three tracks were highly engaged with the curriculum. Curriculum effectiveness and practical issues involved with using geospatial technologies to support science learning are discussed.

  15. The Impact of Professional Development in Natural Resource Investigations Using Geospatial Technologies

    ERIC Educational Resources Information Center

    Hanley, Carol D.; Davis, Hilarie B.; Davey, Bradford T.

    2012-01-01

    As use of geospatial technologies has increased in the workplace, so has interest in using these technologies in the K-12 classroom. Prior research has identified several reasons for using geospatial technologies in the classroom, such as developing spatial thinking, supporting local investigations, analyzing changes in the environment, and…

  16. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. In order to use these local resources together to solve larger geospatial information processing problems that relate to an overall situation, and with the support of peer-to-peer computing technologies, we propose in this paper a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems in heterogeneous GIS environments. First, a geospatial query process schema for distributed computing, as well as a method for equivalent transformation from a global geospatial query to distributed local queries at the SQL (Structured Query Language) level, is presented to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
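
    The idea of rewriting a global query as equivalent local SQL queries can be illustrated with a small, self-contained sketch. The peers below are in-memory SQLite databases with an invented schema; this is a conceptual illustration only, not the paper's implementation.

      # Conceptual sketch: a global bounding-box query is rewritten as the same local SQL
      # query executed at each peer, and the partial results are merged.
      import sqlite3

      def make_peer(rows):
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE roads (id INTEGER, name TEXT, minx REAL, miny REAL, maxx REAL, maxy REAL)")
          conn.executemany("INSERT INTO roads VALUES (?, ?, ?, ?, ?, ?)", rows)
          return conn

      peers = [
          make_peer([(1, "A1", 0, 0, 5, 5), (2, "B2", 10, 10, 20, 20)]),
          make_peer([(3, "C3", 4, 4, 8, 8)]),
      ]

      # Global query: all roads intersecting the box (minx=3, miny=3, maxx=9, maxy=9),
      # expressed as one SQL predicate applied unchanged at every peer.
      local_sql = """SELECT id, name FROM roads
                     WHERE NOT (maxx < ? OR minx > ? OR maxy < ? OR miny > ?)"""
      bbox = (3, 9, 3, 9)   # (qminx, qmaxx, qminy, qmaxy)

      results = []
      for peer in peers:
          results.extend(peer.execute(local_sql, bbox).fetchall())

      print(results)   # merged answer to the global query: [(1, 'A1'), (3, 'C3')]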

  17. Model My Watershed: A high-performance cloud application for public engagement, watershed modeling and conservation decision support

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Tarboton, D. G.; Horsburgh, J. S.; Mayorga, E.; McFarland, M.; Robbins, A.; Haag, S.; Shokoufandeh, A.; Evans, B. M.; Arscott, D. B.

    2017-12-01

    The Model My Watershed Web app (https://app.wikiwatershed.org/) and the BiG-CZ Data Portal (http://portal.bigcz.org/) are web applications that share a common codebase and a common goal: to deliver high-performance discovery, visualization, and analysis of geospatial data through an intuitive user interface in the web browser. Model My Watershed (MMW) was designed as a decision support system for watershed conservation implementation. The BiG-CZ Data Portal was designed to provide context and background data for research sites. Users begin by creating an Area of Interest via an automated watershed delineation tool, a free-draw tool, selection of a predefined area such as a county or USGS Hydrologic Unit Code (HUC), or upload of a custom polygon. Both web apps visualize and provide summary statistics of land use, soil groups, streams, climate, and other geospatial information. MMW then allows users to run a watershed model to simulate different scenarios of human impacts on stormwater runoff and water quality. The BiG-CZ Data Portal allows users to search for scientific and monitoring data within the Area of Interest, and it also serves as a prototype for the upcoming Monitor My Watershed web app. Both systems integrate with CUAHSI cyberinfrastructure, including visualizing observational data from the CUAHSI Water Data Center and storing user data via CUAHSI HydroShare. Both systems also integrate with the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a system for crowd-sourcing environmental monitoring data using open-source sensor stations (http://envirodiy.org/mayfly/) and based on the Observations Data Model v2.

  18. Intelligence, mapping, and geospatial exploitation system (IMAGES)

    NASA Astrophysics Data System (ADS)

    Moellman, Dennis E.; Cain, Joel M.

    1998-08-01

    This paper provides further detail to one facet of the battlespace visualization concept described in last year's paper Battlespace Situation Awareness for Force XXI. It focuses on the National Imagery and Mapping Agency (NIMA) goal to 'provide customers seamless access to tailorable imagery, imagery intelligence, and geospatial information.' This paper describes the Intelligence, Mapping, and Geospatial Exploitation System (IMAGES), an exploitation element capable of CONUS baseplant operations or field deployment to provide NIMA geospatial information collaboratively into a reconnaissance, surveillance, and target acquisition (RSTA) environment through the United States Imagery and Geospatial Information System (USIGS). In a baseplant CONUS setting IMAGES could be used to produce foundation data to support mission planning. In the field it could be directly associated with a tactical sensor receiver or ground station (e.g. UAV or UGV) to provide near real-time and mission-specific RSTA to support mission execution. This paper provides the IMAGES functional-level design; describes the technologies, their interactions and interdependencies; and presents a notional operational scenario to illustrate the system flexibility. Using as a system backbone an intelligent software agent technology called the Open Agent Architecture (OAA), IMAGES combines multimodal data entry, natural language understanding, and perceptual and evidential reasoning for system management. Configured to be DII COE compliant, it would utilize, to the extent possible, COTS applications software for data management, processing, fusion, exploitation, and reporting. It would also be modular, scalable, and reconfigurable. This paper describes how the OAA achieves data synchronization and enables the necessary level of information to be rapidly available to various command echelons for making informed decisions. The reasoning component will provide for the best information to be developed in the timeline available, and it will also provide statistical pedigree data. This pedigree data provides both the uncertainties associated with the information and an audit trail cataloging the raw data sources and the processing/exploitation applied to derive the final product. Collaboration provides for a close union between the information producer(s)/exploiter(s) and the information user(s), as well as between local and remote producer(s)/exploiter(s). From a military operational perspective, IMAGES is a step toward further uniting NIMA with its customers and further blurring the dividing line between operational command and control (C2) and its supporting intelligence activities. IMAGES also provides a foundation for reachback to remote data sources, data stores, application software, and computational resources for achieving 'just-in-time' information delivery, all of which is transparent to the analyst or operator employing the system.

  19. A geospatial framework for dynamic route planning using congestion prediction in transportation systems.

    DOT National Transportation Integrated Search

    2011-01-01

    The goal of this research is to develop an end-to-end data-driven system, dubbed TransDec (short for Transportation Decision-Making), to enable decision-making queries in transportation systems with dynamic, real-time and historical data. With Trans...

  20. Real-time notification and improved situational awareness in fire emergencies using geospatial-based publish/subscribe

    NASA Astrophysics Data System (ADS)

    Kassab, Ala'; Liang, Steve; Gao, Yang

    2010-12-01

    Emergency agencies seek to maintain situational awareness and effective decision making through continuous monitoring of, and real-time alerting about, sources of information regarding current incidents and developing fire hazards. This goal requires integrating different, potentially numerous, sources of dynamic geospatial information on the one side, and a large number of clients having heterogeneous and specific interests in the data on the other side. In such scenarios, the traditional request/reply communication style may function inefficiently, as it is based on point-to-point, synchronous, pull-mode interaction between consumer clients and information providers/services. In this work, we propose Geospatial-based Publish/Subscribe, an interaction framework that serves as middleware for real-time transacting of spatially related information of interest, termed geospatial events, in distributed systems. Expressive data models, including the geospatial event and the geospatial subscription, as well as an efficient matching approach for fast dissemination of geospatial events to interested clients, are introduced. The proposed interaction framework is realized through the development of a Real-Time Fire Emergency Response System (RFERS) prototype. The prototype is designed for transacting several topics of geospatial events that are crucial within the context of fire emergencies, including GPS locations of emergency assets, meteorological observations from wireless sensors, fire incident reports, and temporal sequences of remote sensing images of active wildfires. The performance of the system prototype has been evaluated in order to demonstrate its efficiency.
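
    The core of such a broker is the matching step between incoming geospatial events and registered subscriptions. The following Python sketch, using shapely and entirely hypothetical topics, clients, and areas of interest, illustrates one simple form of that matching; the paper's actual matching approach is more elaborate.

      # Simplified matching step of a geospatial publish/subscribe broker: deliver an event
      # to every subscription whose topic matches and whose area contains the event location.
      from shapely.geometry import Point, box

      subscriptions = [
          {"client": "fire_ops_center", "topic": "fire_incident",
           "area": box(-115.0, 50.8, -114.0, 51.3)},
          {"client": "met_service", "topic": "sensor_observation",
           "area": box(-116.0, 50.0, -113.0, 52.0)},
      ]

      def match(event):
          """Return the clients that should receive this geospatial event."""
          location = Point(event["lon"], event["lat"])
          return [s["client"] for s in subscriptions
                  if s["topic"] == event["topic"] and s["area"].contains(location)]

      event = {"topic": "fire_incident", "lon": -114.3, "lat": 51.1}
      print(match(event))   # -> ['fire_ops_center']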

  1. A lake-centric geospatial database to guide research and inform management decisions in an Arctic watershed in northern Alaska experiencing climate and land-use changes

    USGS Publications Warehouse

    Jones, Benjamin M.; Arp, Christopher D.; Whitman, Matthew S.; Nigro, Debora A.; Nitze, Ingmar; Beaver, John; Gadeke, Anne; Zuck, Callie; Liljedahl, Anna K.; Daanen, Ronald; Torvinen, Eric; Fritz, Stacey; Grosse, Guido

    2017-01-01

    Lakes are dominant and diverse landscape features in the Arctic, but conventional land cover classification schemes typically map them as a single uniform class. Here, we present a detailed lake-centric geospatial database for an Arctic watershed in northern Alaska. We developed a GIS dataset consisting of 4362 lakes that provides information on lake morphometry, hydrologic connectivity, surface area dynamics, surrounding terrestrial ecotypes, and other important conditions describing Arctic lakes. Analysis of the geospatial database relative to fish and bird survey data reveals relationships with lake depth and hydrologic connectivity, which are being used to guide research and aid in the management of aquatic resources in the National Petroleum Reserve in Alaska. Further development of similar geospatial databases is needed to better understand and plan for the impacts of ongoing climate and land-use changes occurring across lake-rich landscapes in the Arctic.

  2. Open cyberGIS software for geospatial research and education in the big data era

    NASA Astrophysics Data System (ADS)

    Wang, Shaowen; Liu, Yan; Padmanabhan, Anand

    CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) that serve various research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities, CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.

  3. The Value of Information - Accounting for a New Geospatial Paradigm

    NASA Astrophysics Data System (ADS)

    Pearlman, J.; Coote, A. M.

    2014-12-01

    A new frontier in the consideration of socio-economic benefit is valuing information as an asset, often referred to as infonomics. Conventional financial practice does not easily provide a mechanism for valuing information, and yet for many of the largest corporations, such as Google and Facebook, information is clearly their principal asset. The problem is exacerbated for public sector organizations, as those that are information-centric rather than information-enabled are relatively few (statistics, archiving, and mapping agencies are perhaps the only examples), so the issue is not at the top of the agenda for government. However, it is a hugely important issue when valuing geospatial data and information. Geospatial data allows public institutions to operate and facilitates the provision of essential services for emergency response and national defense. In this respect, geospatial data is strongly analogous to other types of public infrastructure, such as utilities and roads. The use of geospatial data is widespread, from companies in the transportation or construction sectors to individuals planning daily events. The categorization of geospatial data as infrastructure is critical to decisions related to investment in its management, maintenance, and upgrade over time. Geospatial data depreciates in the same way that physical infrastructure depreciates: it needs to be maintained, otherwise its functionality and value in use decline. We have coined the term geo-infonomics to encapsulate the concept. This presentation will develop the arguments around its importance and current avenues of research.

  4. Use of geospatial data to predict downstream influence of coal mining in Appalachia

    EPA Science Inventory

    A 2001 Supreme Court decision first called into question whether some headwater streams could be considered jurisdictional under the Clean Water Act. A subsequent decision then required that non-navigable waters must be "relatively permanent" or "possess a significant nexus" to ...

  5. Enhancing Public Participation to Improve Natural Resources Science and its Use in Decision Making

    NASA Astrophysics Data System (ADS)

    Glynn, P. D.; Shapiro, C. D.; Liu, S. B.

    2015-12-01

    The need for a broader understanding of, and involvement in, science, coupled with social technology advances enabling crowdsourcing and citizen science, has created greater opportunities for public participation in the gathering, interpretation, and use of geospatial information. The U.S. Geological Survey (USGS) is developing guidance for USGS scientists, partners, and interested members of the public on when and how public participation can most effectively be used in the conduct of scientific activities. Public participation can provide important perspectives and knowledge that cannot be obtained through traditional scientific methods alone. Citizen engagement can also make USGS science more efficient and provide additional benefits to society, including enhanced understanding, appreciation, and interest in geospatial information and its use in decision making. The USGS guidance addresses several fundamental issues by: 1) developing an operational definition of citizen or participatory science; 2) identifying the circumstances under which citizen science is appropriate for use and when its use is not recommended; 3) describing structured processes for effective use of citizen science; and 4) defining the successful application of citizen science and identifying useful success metrics. The guidance is coordinated by the USGS Science and Decisions Center and developed by a multidisciplinary team of USGS scientists and managers. External perspectives will also be incorporated, as appropriate, to align with other efforts such as the White House Office of Science and Technology Policy (OSTP) Citizen Science and Crowdsourcing Toolkit for the Federal government. The guidance will include the development of an economic framework to assess the benefits and costs of geospatial information developed through participatory processes. This framework weighs the additional perspectives gained through enhanced participation against the costs of obtaining geospatial information from multiple sources.

  6. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with JavaScript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploring earthquake and volcano data sets, a subduction and elevation profile tool that facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS is platform independent and can be used on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation, using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  7. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to, and architectures of, workflow code that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Services, Phase 6 (OWS-6) initiative, is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges of using the Web Services Business Process Execution Language (WS-BPEL) to develop them.
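
    One common asynchronous pattern mentioned above is submit-then-poll: the client starts a long-running process and checks its status later instead of blocking. A minimal Python sketch is shown below; the processing endpoint and JSON response fields are hypothetical, not an OGC-specified interface.

      # Sketch of an asynchronous submit-then-poll client for a long-running geospatial process.
      import time
      import requests

      WPS_URL = "https://example.org/processing/jobs"   # hypothetical asynchronous endpoint

      job = requests.post(WPS_URL, json={
          "process": "ndvi_composite",
          "inputs": {"region": "arctic_slope", "year": 2010},
      }).json()

      status_url = job["statusUrl"]          # returned immediately; work continues server-side

      while True:
          status = requests.get(status_url).json()
          if status["state"] in ("succeeded", "failed"):
              break
          time.sleep(30)                     # client is free to do other work between polls

      print(status["state"], status.get("resultUrl"))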

  8. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling the Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. These demonstrations showed a standards-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancements required at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.

  9. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    PubMed Central

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. PMID:22206355
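
    A common MCDA aggregation for spatial problems is a weighted linear combination of normalized criterion layers. The short numpy sketch below is purely illustrative; the criteria, weights, and grids are invented and are not drawn from the paper.

      # Weighted linear combination of gridded criteria, one simple MCDA aggregation.
      import numpy as np

      # Hypothetical 3x3 criterion grids, each already scaled to 0-1 (1 = higher priority).
      disease_risk      = np.array([[0.9, 0.7, 0.2], [0.6, 0.8, 0.1], [0.3, 0.4, 0.5]])
      pop_vulnerability = np.array([[0.5, 0.9, 0.3], [0.4, 0.7, 0.2], [0.6, 0.1, 0.8]])
      intervention_cost = np.array([[0.2, 0.4, 0.9], [0.3, 0.5, 0.7], [0.8, 0.6, 0.1]])

      weights = {"risk": 0.5, "vulnerability": 0.3, "cost": 0.2}

      # Cost is a "less is better" criterion, so invert it before aggregation.
      priority = (weights["risk"] * disease_risk
                  + weights["vulnerability"] * pop_vulnerability
                  + weights["cost"] * (1.0 - intervention_cost))

      print(np.round(priority, 2))   # cells with the highest scores are candidate target areas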

  10. Learn More

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.

  11. Get Data

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.

  12. Basic Information

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.

  13. Geospatial Modeling of Asthma Population in Relation to Air Pollution

    NASA Technical Reports Server (NTRS)

    Kethireddy, Swatantra R.; Tchounwou, Paul B.; Young, John H.; Luvall, Jeffrey C.; Alhamdan, Mohammad

    2013-01-01

    Current observations indicate that asthma prevalence is growing every year in the United States, but the specific reasons for this are not well understood. This study stems from an ongoing research effort to investigate the spatio-temporal behavior of asthma and its relatedness to air pollution. The association between environmental variables such as air quality and asthma-related health issues across the State of Mississippi is investigated using Geographic Information Systems (GIS) tools and applications. Health data concerning asthma, obtained from the Mississippi State Department of Health (MSDH) for the 9-year period 2003-2011, and air pollutant concentration data (PM2.5), collected from USEPA web resources, are analyzed geospatially to establish the impacts of air quality on human health, specifically related to asthma. Disease mapping using geospatial techniques provides valuable insights into the spatial nature, variability, and association of asthma with air pollution. Asthma patient hospitalization data for Mississippi have been analyzed and mapped using quantitative choropleth techniques in ArcGIS, with patients geocoded to their respective ZIP codes. Potential air pollutant sources, such as Interstate highways and industries, together with other land use data, have been integrated in a common geospatial platform to understand their adverse contribution to human health. Existing hospitals and emergency clinics are also incorporated into the analysis to further understand their proximity and ease of access from patient locations. At the current level of analysis and understanding, the spatial distribution of asthma is concentrated in the populations of ZIP code regions on the Gulf Coast, along the Interstates of the south, and in counties of northeast Mississippi. It is also found that asthma is prevalent in most of the urban population. This GIS-based project would be useful for health risk assessment and for providing information support to administrators and decision makers for establishing satellite clinics in the future.
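
    The choropleth mapping step described above can be sketched in a few lines with geopandas; the file names, column names, and rate calculation below are hypothetical stand-ins for the MSDH and USEPA datasets used in the study.

      # Illustrative choropleth: join hospitalization counts to ZIP code polygons and map a rate.
      import geopandas as gpd
      import pandas as pd

      zips = gpd.read_file("ms_zip_codes.shp")                 # polygon layer with a 'ZIP' column (hypothetical)
      cases = pd.read_csv("asthma_hospitalizations.csv")       # columns: ZIP, cases, population (hypothetical)

      zips = zips.merge(cases, on="ZIP")
      zips["rate_per_1000"] = 1000.0 * zips["cases"] / zips["population"]

      ax = zips.plot(column="rate_per_1000", cmap="OrRd", legend=True,
                     edgecolor="gray", linewidth=0.2)
      ax.set_title("Asthma hospitalizations per 1,000 residents by ZIP code")
      ax.figure.savefig("asthma_choropleth.png", dpi=200)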

  14. Development of Geospatial Map Based Portal for New Delhi Municipal Council

    NASA Astrophysics Data System (ADS)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

    Geospatial Delhi Limited (GSDL) is a Government of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of the National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, and DMRC, for the benefit of all citizens of the NCTD. This paper describes the development of the Geospatial Map based Portal (GMP) for the New Delhi Municipal Council (NDMC) of the NCT of Delhi. The GMP has been developed as a map-based spatial decision support system (SDSS) for planning and development of the NDMC area by the NDMC department, and it offers built-in information-searching tools (identification of locations, nearest utility locations, distance measurement, etc.) for the citizens of the NCTD. The GMP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET technology. The GMP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMP include Circle, Division, and Sub-division boundaries of departments pertaining to the New Delhi Municipal Council; parcels of residential, commercial, and government buildings; basic amenities (police stations, hospitals, schools, banks, ATMs, fire stations, etc.); over-ground and underground utility network lines; roads; and railway features. The GMP could help achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the development and management of the NDMC area. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  15. Development of Geospatial Map Based Portal for Delimitation of Mcd Wards

    NASA Astrophysics Data System (ADS)

    Gupta, A. Kumar Chandra; Kumar, P.; Sharma, P. Kumar

    2017-09-01

    Geospatial Delhi Limited (GSDL) is a Government of NCT of Delhi company formed to provide the geospatial information of the National Capital Territory of Delhi (NCTD) to the Government of the National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department, and DMRC, for the benefit of all citizens of the NCTD. This paper describes the development of the Geospatial Map based Portal for Delimitation of MCD Wards (GMPDW) and the election of the 3 Municipal Corporations of the NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for the delimitation of MCD wards and the drawing of peripheral ward boundaries, to support planning and management of the MCD election process by the State Election Commission, and as an MCD election-related information-searching tool (polling stations, MCD wards, assembly constituencies, etc.) for the citizens of the NCTD. The GMPDW is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with .NET technology. The GMPDW is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMPDW include Enumeration Block (EB) and Enumeration Block Group (EBG) boundaries of the citizens of Delhi, Assembly Constituency, Parliamentary Constituency, and Election District boundaries, landmark locations of polling stations, and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMPDW could help achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for the management of the MCD election. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  16. A Comprehensive Optimization Strategy for Real-time Spatial Feature Sharing and Visual Analytics in Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Li, W.; Shao, H.

    2017-12-01

    For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response, and decision-making. This is especially true for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications: their rich geometry and property information facilitates the development of interactive, efficient, and intelligent data analysis and visualization applications. However, the big-data characteristics of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: 1) pre-computed and on-the-fly generalization, which automatically determines the proper simplification level through the introduction of an appropriate distance tolerance (ADT) to meet various visualization requirements while speeding up simplification; 2) a progressive attribute transmission method to reduce data size and therefore the service response time; and 3) compressed data transmission with dynamic selection of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to widen the use of web services providing vector data to support real-time spatial feature sharing, visual analytics, and decision-making.
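
    Two of the optimizations above, scale-dependent simplification and compressed transmission, can be illustrated with a short Python sketch using shapely and gzip. The tolerance values are illustrative and do not reproduce the paper's appropriate-distance-tolerance (ADT) model.

      # Scale-dependent geometry simplification plus gzip compression before transmission.
      import gzip
      import json
      from shapely.geometry import LineString, mapping

      coastline = LineString([(0, 0), (0.001, 0.0005), (0.002, 0.0), (0.5, 0.3), (1.0, 0.31), (2.0, 1.0)])

      # Coarser zoom levels tolerate a larger simplification distance (illustrative values).
      tolerance_by_zoom = {12: 0.0001, 8: 0.01, 4: 0.1}
      simplified = {z: coastline.simplify(tol, preserve_topology=True)
                    for z, tol in tolerance_by_zoom.items()}

      # Serialize the generalized geometry and gzip it before sending to the client.
      payload = json.dumps(mapping(simplified[8])).encode("utf-8")
      compressed = gzip.compress(payload)
      print(len(payload), "bytes raw ->", len(compressed), "bytes gzipped")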

  17. Evaluating hydrological response to forecasted land-use change—scenario testing with the automated geospatial watershed assessment (AGWA) tool

    USGS Publications Warehouse

    Kepner, William G.; Semmens, Darius J.; Hernandez, Mariano; Goodrich, David C.

    2009-01-01

    Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions to maintain the sustainable nature of our ecosystem services now and into the future. During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial-analysis technologies have been used to develop landscape information that can be integrated with hydrologic models to determine long-term change and make predictive inferences about the future. Two diverse case studies in northwest Oregon (Willamette River basin) and southeastern Arizona (San Pedro River) were examined in regard to future land use scenarios relative to their impact on surface water conditions (e.g., sediment yield and surface runoff) using hydrologic models associated with the Automated Geospatial Watershed Assessment (AGWA) tool. The base reference grid for land cover was modified in both study locations to reflect stakeholder preferences 20 to 60 yrs into the future, and the consequences of landscape change were evaluated relative to the selected future scenarios. The two studies provide examples of integrating hydrologic modeling with a scenario analysis framework to evaluate plausible future forecasts and to understand the potential impact of landscape change on ecosystem services.

  18. Trusted Data Sharing and Imagery Workflow for Disaster Response in Partnership with the State of California

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Aubrey, A. D.; Rosinski, A.; Morentz, J.; Beilin, P.; Jones, D.

    2016-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical for decision makers, improving their ability to anticipate requirements and provide appropriate resources for response. Key information on the nature, magnitude, and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness is often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in its response efforts, particularly in regularly scheduled statewide exercises such as the 2016 Cascadia Rising NLE, the May 2015 Capstone/SoCal NLE/Ardent Sentry exercises, and the August 2014 South Napa earthquake activation, and we plan to participate in upcoming exercises with the National Guard (Vigilant Guard 17) and the USGS (HayWired). Our efforts over the past several years have aimed to enable coordination between research scientists, applied scientists, and decision makers in order to reduce duplication of effort, maximize information sharing, translate scientific results into actionable information for decision makers, and increase situational awareness. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse. Products delivered include map layers that form part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration and the SpotOnResponse field analysis application. We are also exploring new capabilities for real-time collaboration using GeoCollaborate®. XchangeCore allows real-time, two-way information sharing, enabling users to create merged datasets from multiple providers; SpotOnResponse provides web-enabled secure information exchange, collaboration, and field analysis for responders; and GeoCollaborate® enables users to access, share, manipulate, and interact across disparate platforms, connecting public and private sector agencies and organizations rapidly on the same map at the same time and allowing collaborative decision making on the same datasets simultaneously.

  19. Interoperable Data Access Services for NOAA IOOS

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2008-12-01

    The Integrated Ocean Observing System (IOOS) is intended to enhance our ability to collect, deliver, and use ocean information. The goal is to support research and decision-making by providing data on our open oceans, coastal waters, and Great Lakes in the formats, rates, and scales required by scientists, managers, businesses, governments, and the public. The US National Oceanic and Atmospheric Administration (NOAA) is the lead agency for IOOS. NOAA's IOOS office supports the development of regional coastal observing capability and promotes data management efforts to increase data accessibility. Geospatial web services have been established at NOAA data providers including the National Data Buoy Center (NDBC), the Center for Operational Oceanographic Products and Services (CO-OPS), and CoastWatch, and at regional data provider sites. Services established include Open-source Project for a Network Data Access Protocol (OpenDAP), Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), and OGC Web Coverage Service (WCS). These services provide integrated access to data holdings that have been aggregated at each center from multiple sources. We wish to collaborate with other groups to improve our service offerings to maximize interoperability and enhance cross-provider data integration, and to share common service components such as registries, catalogs, data conversion, and gateways. This paper will discuss the current status of NOAA's IOOS efforts and possible next steps.
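
    As a small illustration of client-side use of one of these services, the sketch below opens a hypothetical OPeNDAP aggregation with xarray and subsets it remotely; the URL and variable names are invented, not actual NDBC or CO-OPS endpoints.

      # Sketch of subsetting an OPeNDAP-served dataset with xarray (hypothetical URL and variables).
      import xarray as xr

      ds = xr.open_dataset("https://example.noaa.gov/thredds/dodsC/buoy/sst_aggregation")

      # Subset sea-surface temperature for a region and time window without downloading
      # the full archive; only the requested slices are transferred over the network.
      sst = ds["sea_surface_temperature"].sel(
          lat=slice(25.0, 31.0), lon=slice(-90.0, -80.0),
          time=slice("2008-06-01", "2008-06-30"),
      )
      print(sst.mean().values)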

  20. NHDPlus (National Hydrography Dataset Plus)

    EPA Pesticide Factsheets

    NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.

  1. NHDPlusHR: A national geospatial framework for surface-water information

    USGS Publications Warehouse

    Viger, Roland; Rea, Alan H.; Simley, Jeffrey D.; Hanson, Karen M.

    2016-01-01

    The U.S. Geological Survey is developing a new geospatial hydrographic framework for the United States, called the National Hydrography Dataset Plus High Resolution (NHDPlusHR), that integrates a diversity of the best-available information, robustly supports ongoing dataset improvements, enables hydrographic generalization to derive alternate representations of the network while maintaining feature identity, and supports modern scientific computing and Internet accessibility needs. This framework is based on the High Resolution National Hydrography Dataset, the Watershed Boundaries Dataset, and elevation from the 3-D Elevation Program, and will provide an authoritative, high precision, and attribute-rich geospatial framework for surface-water information for the United States. Using this common geospatial framework will provide a consistent basis for indexing water information in the United States, eliminate redundancy, and harmonize access to, and exchange of water information.

  2. Best Practices for Preparing Interoperable Geospatial Data

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Beaty, T. W.

    2010-12-01

    Geospatial data is critically important for a wide scope of research and applications: carbon cycle and ecosystem studies, climate change, land use and urban planning, environmental protection, and more. Geospatial data is created by different organizations using different methods, from remote sensing observations, field surveys, and model simulations, and is stored in various formats. As a result, geospatial data is diverse and heterogeneous, which creates a significant barrier to sharing and using it, especially when targeting a broad user community. Many efforts have been made to address different aspects of using geospatial data by improving its interoperability. For example, the specification for Open Geospatial Consortium (OGC) catalog services defines a standard way for geospatial information discovery, while the OGC Web Coverage Service (WCS) and OPeNDAP define interoperable protocols for geospatial data access. But the reality is that having standard mechanisms for data discovery and access is not enough: the geospatial data content itself has to be organized in standard, easily understandable, and readily usable formats. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) archives data and information relevant to biogeochemical dynamics, ecological data, and environmental processes. The Modeling and Synthesis Thematic Data Center (MAST-DC) prepares and distributes both input and output data of carbon cycle models and provides data support for synthesis and terrestrial model inter-comparison at multiple scales. Both of these NASA-funded data centers compile and distribute a large amount of diverse geospatial data and have broad user communities, including GIS users, Earth science researchers, and ecosystem modeling teams. The ORNL DAAC and MAST-DC address the geospatial data interoperability issue by standardizing data content and feeding it into a well-designed Spatial Data Infrastructure (SDI) that provides interoperable mechanisms to advertise, visualize, and distribute the standardized geospatial data. In this presentation, we summarize the lessons learned and the best practices for geospatial data standardization. The presentation will describe how diverse and historical data archived at the ORNL DAAC were converted into standard, non-proprietary formats; what tools were used to make the conversion; how spatial and temporal information is properly captured in a consistent manner; how to name a data file or a variable to make it both human-friendly and semantically interoperable; how the NetCDF file format and the CF convention can promote data usage in the ecosystem modeling community; how standardized geospatial data can be fed into OGC Web Services to support on-demand data visualization and access; and how metadata should be collected and organized so that they can be discovered through standard catalog services.
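
    As a small, hypothetical example of the kind of standardization discussed (not the ORNL DAAC or MAST-DC tooling), the sketch below writes a gridded variable to NetCDF with CF-style coordinate and unit metadata using xarray.

      # Writing a small CF-style NetCDF file with xarray; values and extents are made up.
      import numpy as np
      import xarray as xr

      lat = np.arange(60.0, 61.0, 0.25)
      lon = np.arange(-150.0, -149.0, 0.25)
      gpp = xr.DataArray(
          np.random.rand(lat.size, lon.size),
          dims=("lat", "lon"),
          coords={"lat": lat, "lon": lon},
          name="gpp",
          # standard_name taken from the CF standard name table; verify against the current table
          attrs={"standard_name": "gross_primary_productivity_of_biomass_expressed_as_carbon",
                 "units": "kg m-2 s-1"},
      )
      gpp["lat"].attrs = {"standard_name": "latitude", "units": "degrees_north"}
      gpp["lon"].attrs = {"standard_name": "longitude", "units": "degrees_east"}

      ds = gpp.to_dataset()
      ds.attrs["Conventions"] = "CF-1.6"
      ds.to_netcdf("gpp_sample_cf.nc")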

  3. EnviroAtlas: Providing Nationwide Geospatial Ecosystem Goods and Services Indicators and Indices to Inform Decision-Making, Research, and Education

    NASA Astrophysics Data System (ADS)

    Neale, A. C.

    2016-12-01

    EnviroAtlas is a multi-organization effort led by the US Environmental Protection Agency to develop, host and display a large suite of nation-wide geospatial indicators and indices of ecosystem services. This open access tool allows users to view, analyze, and download a wealth of geospatial data and other resources related to ecosystem goods and services. More than 160 national indicators of ecosystem service supply, demand, and drivers of change provide a framework to inform decisions and policies at multiple spatial scales, educate a range of audiences, and supply data for research. A higher resolution component is also available, providing over 100 data layers for finer-scale analyses for selected communities across the US. The ecosystem goods and services data are organized into seven general ecosystem benefit categories: clean and plentiful water; natural hazard mitigation; food, fuel, and materials; climate stabilization; clean air; biodiversity conservation; and recreation, culture, and aesthetics. Each indicator is described in terms of how it is important to human health or well-being. EnviroAtlas includes data describing existing ecosystem markets for water quality and quantity, biodiversity, wetland mitigation, and carbon credits. This presentation will briefly describe the EnviroAtlas data and tools and how they are being developed and used in ongoing research studies and in decision-making contexts.

  4. Implementing a Web-Based Decision Support System to Spatially and Statistically Analyze Ecological Conditions of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.

    2014-12-01

    The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to incorporate climate change impacts into mitigation and adaptation strategies. However, there are few processes in place for conducting quantitative assessments of forest conditions in relation to mountain hydrology while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) that provides easy access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. These data are featured within three components of the DSS: the Mapping Viewer, the Statistical Analysis Portal, and the Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once areas of interest are targeted, the Statistical Analysis Portal provides subbasin-level statistics for each variable over time by utilizing Plotly, a recently developed web-based data analysis and visualization tool. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without needing to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.

  5. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
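
    The event-triggering step can be sketched simply: poll the USGS earthquake GeoJSON feed and act on events above a magnitude threshold. The feed URL below follows the public USGS summary-feed pattern but should be verified before use, and the threshold and downstream action are illustrative, not E-DECIDER's actual logic.

      # Sketch of an event trigger: poll the USGS earthquake GeoJSON feed and flag large events.
      import requests

      FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/4.5_day.geojson"
      MIN_MAGNITUDE = 6.0

      quakes = requests.get(FEED, timeout=30).json()
      for feature in quakes["features"]:
          mag = feature["properties"]["mag"]
          if mag is not None and mag >= MIN_MAGNITUDE:
              lon, lat, depth_km = feature["geometry"]["coordinates"]
              place = feature["properties"]["place"]
              # In a system like E-DECIDER, this is where modeling and product
              # generation would be kicked off for the triggering event.
              print(f"M{mag:.1f} {place} ({lat:.2f}, {lon:.2f}, depth {depth_km} km)")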

  6. The African Geospatial Sciences Institute (agsi): a New Approach to Geospatial Training in North Africa

    NASA Astrophysics Data System (ADS)

    Oeldenberger, S.; Khaled, K. B.

    2012-07-01

    The African Geospatial Sciences Institute (AGSI) is currently being established in Tunisia as a non-profit, non-governmental organization (NGO). Its objective is to accelerate geospatial capacity development in North Africa, providing facilities for geospatial project and management training to regional government employees, university graduates, private individuals, and companies. With typical course durations between one and six months, including part-time programs and long-term mentoring, its focus is on practical training that provides actual project execution experience. The AGSI will complement formal university education and will work closely with geospatial certification organizations and the geospatial industry. In the context of closer cooperation between neighboring North Africa and the European Community, the AGSI will be embedded in a network of several participating European and African universities, e.g. the ITC, and international organizations, such as the ISPRS, the ICA, and the OGC. Through close cooperation with African organizations, such as the AARSE, the RCMRD, and RECTAS, the network and exchange of ideas, experiences, technology, and capabilities will be extended to Saharan and sub-Saharan Africa. A board of trustees will steer the AGSI operations and will ensure that practical training concepts and contents are certifiable and can be applied within a credit system to graduate and post-graduate education at European and African universities. The geospatial training activities of the AGSI are centered on a facility with approximately 30 part- and full-time general staff and lecturers in Tunis during the first year. The AGSI will operate a small aircraft with a medium-format aerial camera and a compact LIDAR instrument for local, community-scale data capture. Surveying training, the photogrammetric processing of aerial images, GIS data capture, and remote sensing training will be the main components of the practical training courses offered, to build geospatial capacity and ensure that AGSI graduates have the appropriate skill sets required for employment in the geospatial industry. Geospatial management courses and high-level seminars will be targeted at decision makers in government and industry to build awareness of geospatial applications and benefits. Online education will be developed together with international partners, and internet-based activities will involve the public to familiarize them with geospatial data and its many applications.

  7. Recent innovation of geospatial information technology to support disaster risk management and responses

    NASA Astrophysics Data System (ADS)

    Une, Hiroshi; Nakano, Takayuki

    2018-05-01

    Geographic location is one of the most fundamental and indispensable information elements in the field of disaster response and prevention. For example, in the case of the Tohoku Earthquake in 2011, aerial photos taken immediately after the earthquake greatly improved information sharing among different government offices and facilitated rescue and recovery operations, and maps prepared after the disaster assisted in the rapid reconstruction of affected local communities. Thanks to the recent development of geospatial information technology, this information has become even more essential for disaster response activities. Advances in web mapping technology allow us to better understand the situation by overlaying various location-specific data on base maps on the web and specifying the areas on which activities should be focused. Through 3-D modelling technology, we can gain a more realistic understanding of the relationship between disasters and topography. Geospatial information technology can support proper preparation for, and emergency responses to, disasters by individuals and local communities through hazard mapping and other information services using mobile devices. Thus, geospatial information technology is playing an increasingly vital role in all stages of disaster risk management and response. In acknowledging geospatial information's vital role in disaster risk reduction, the Sendai Framework for Disaster Risk Reduction 2015-2030, adopted at the Third United Nations World Conference on Disaster Risk Reduction, repeatedly notes the importance of utilizing geospatial information technology for disaster risk reduction. This presentation reports recent practical applications of geospatial information technology for disaster risk management and responses.

  8. Enriching The National Map database for multi-scale use: Introducing the VisibilityFilter attribution

    USGS Publications Warehouse

    Stauffer, Andrew J.; Webinger, Seth; Roche, Brittany

    2016-01-01

    The U.S. Geological Survey's (USGS) National Geospatial Technical Operations Center is prototyping and evaluating the ability to filter data through a range of scales using 1:24,000-scale datasets from The National Map (TNM) as the source. A "VisibilityFilter" attribute is under evaluation that can be added to all TNM vector data themes and will permit filtering of data to eight target scales between 1:24,000 and 1:5,000,000, thus defining each feature's smallest applicable scale of use. For the prototype implementation, map specifications for the 1:100,000- and 1:250,000-scale USGS Topographic Map Series are being utilized to define the feature content appropriate at fixed mapping scales and to guide generalization decisions, which are documented in a ScaleMaster diagram. This paper defines the VisibilityFilter attribute, the generalization decisions made for each TNM data theme, and how these decisions are embedded into the data to support efficient data filtering.
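
    A sketch of how such an attribute could drive scale-based filtering is shown below. The encoding assumed here (the attribute stores the denominator of the feature's smallest applicable scale) and the file name are illustrative; the USGS prototype may encode the attribute differently.

      # Scale-based filtering with a VisibilityFilter-style attribute (assumed encoding).
      import geopandas as gpd

      streams = gpd.read_file("nhd_flowlines_24k.gpkg")      # hypothetical TNM extract
      map_scale = 500_000                                    # target display scale 1:500,000

      # Keep only features whose smallest applicable scale is at least as coarse
      # as the requested map scale.
      visible = streams[streams["VisibilityFilter"] >= map_scale]
      print(f"{len(visible)} of {len(streams)} features shown at 1:{map_scale:,}")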

  9. Geospatial Technology in Disease Mapping, E- Surveillance and Health Care for Rural Population in South India

    NASA Astrophysics Data System (ADS)

    Praveenkumar, B. A.; Suresh, K.; Nikhil, A.; Rohan, M.; Nikhila, B. S.; Rohit, C. K.; Srinivas, A.

    2014-11-01

    Providing healthcare to rural populations has been a challenge for medical service providers, especially in developing countries. For this to be effective, scalable, and sustainable, certain strategic decisions have to be taken during the planning phase. There is also a big gap between the services available and the availability of doctors and medical resources in rural areas. The use of information technology can help address this deficiency to a good extent. In this paper, a mobile application has been developed to gather data from the field. A cloud-based interface has been developed to store the data in the cloud for effective usage and management. A decision-tree-based solution developed in this paper helps in diagnosing a patient based on his or her health parameters. Interactive geospatial maps have been developed to provide an effective data visualization facility. This will help both the user community and decision makers carry out long-term strategic planning.
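
    A toy sketch of the decision-tree triage idea is shown below using scikit-learn; the features, thresholds, and labels are invented for illustration, have no clinical basis, and do not reproduce the paper's actual decision tree.

      # Toy decision-tree classifier over hypothetical health parameters.
      from sklearn.tree import DecisionTreeClassifier

      # Hypothetical training records: [temperature_C, systolic_bp, heart_rate]
      X = [[36.8, 118, 72], [39.2, 100, 110], [37.1, 160, 88],
           [38.5, 95, 120], [36.6, 125, 70], [40.0, 90, 130]]
      y = ["routine", "refer", "monitor", "refer", "routine", "refer"]

      model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

      # A field health worker enters a new patient's readings from the mobile app.
      new_patient = [[38.9, 98, 115]]
      print(model.predict(new_patient))   # e.g. ['refer']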

  10. A Framework for an Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the coming years, with an expanding need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, the proliferation of open data, and the use of open standards are gaining significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions within this initiative and the growth and maturity of geospatial Open Source software prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, and ASPRS) and software vendors (e.g., Esri, Oracle, and Red Hat). They focus on different fields of expertise and differ in the level and form of examination, offered for a wide range of fees. The development of the certification framework presented here is based on an analysis of diverse body-of-knowledge concepts, namely the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways: an online survey about the relevance of Open Source was conducted and evaluated with 105 respondents worldwide, and 15 interviews (face-to-face or by telephone) with experts in different countries provided additional insights into Open Source usage and certification. The findings led to a certification framework of three main categories with eleven sub-categories in total, i.e., "Certified Open Source Geospatial Data Associate / Professional", "Certified Open Source Geospatial Analyst Remote Sensing & GIS", "Certified Open Source Geospatial Cartographer", "Certified Open Source Geospatial Expert", "Certified Open Source Geospatial Associate Developer / Professional Developer", "Certified Open Source Geospatial Architect". Each certification is described by pre-conditions, scope and objectives, course content, recommended software packages, target group, expected benefits, and the methods of examination. Examinations can be complemented by proof of professional career paths and achievements, which requires peer evaluation, and recertification is required after a few years. The concept seeks accreditation by the OSGeo Foundation (and other bodies) and the support of a group of geospatial scientific institutions in order to achieve wide international acceptance for this Open Source geospatial certification model. A business case for Open Source certification and a corresponding SWOT model are examined to support the goals of the Geo-For-All initiative of the ICA-OSGeo pact.

  11. The Implementation of a Geospatial Information Technology (GIT)-Supported Land Use Change Curriculum with Urban Middle School Learners to Promote Spatial Thinking

    ERIC Educational Resources Information Center

    Bodzin, Alec M.

    2011-01-01

    This study investigated whether a geospatial information technology (GIT)-supported science curriculum helped students in an urban middle school understand land use change (LUC) concepts and enhanced their spatial thinking. Five 8th grade earth and space science classes in an urban middle school consisting of three different ability level tracks…

  12. International outreach for promoting open geoscience content in Finnish university libraries - libraries as the advocates of citizen science awareness on emerging open geospatial data repositories in Finnish society

    NASA Astrophysics Data System (ADS)

    Rousi, A. M.; Branch, B. D.; Kong, N.; Fosmire, M.

    2013-12-01

    In the Finnish National Spatial Strategy 2010-2015, Finland's Ministry of Agriculture and Forestry stated, among other things, that spatial data skills should support citizens' everyday activities and facilitate citizens' decision-making and participation. Studies also predict that open data, particularly open spatial data, could, if their potential were fully realized, increase the turnover of Finnish private-sector companies by 15%. Finnish libraries have a long tradition of serving at the heart of the Finnish information society. However, very few initiatives have so far been taken to exploit the emerging possibilities of educating library users on open spatial data. The National Survey of Finland opened its data in 2012. Finnish technology university libraries, such as Aalto University Library, are open environments for all citizens and seem well suited to being among the first to educate citizens on open geospatial data. There are, however, many obstacles to overcome, such as a lack of knowledge about policies, limited understanding of geospatial data services, and insufficient GIS software know-how among library personnel. This framework examines the benefits derived from an international collaboration between Purdue University Libraries and Aalto University Library to create local strategies for implementing open spatial data education initiatives in Aalto University Library's context. The results of this international collaboration are explicated for the benefit of the field as a whole.

  13. The Geoinformatica free and open source software stack

    NASA Astrophysics Data System (ADS)

    Jolma, A.

    2012-04-01

    The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules covering a wide range of purposes, and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data stores such as the FOSS RDBMS PostgreSQL, with its geospatial extension PostGIS, can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster data, vector data, and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement that rasters fit in memory. Newer research conducted on the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research that is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and in Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools can be defined, used, and studied as extensions with specific features. Linking with external libraries is possible using Perl's foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specialized hydrological FOSS.
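
    Geoinformatica itself exposes its raster algebra through Perl; purely as a language-neutral illustration of the kind of GDAL-backed map algebra the stack provides, the Python sketch below (hypothetical DEM file name and threshold) reads an elevation grid and counts low-lying cells.

    ```python
    # Minimal sketch, assuming a hypothetical DEM file and an arbitrary 10 m
    # threshold; this is generic GDAL map algebra, not the Geoinformatica API.
    import numpy as np
    from osgeo import gdal

    ds = gdal.Open("dem.tif")                  # hypothetical DEM file
    band = ds.GetRasterBand(1)
    dem = band.ReadAsArray().astype(float)

    nodata = band.GetNoDataValue()
    if nodata is not None:
        dem[dem == nodata] = np.nan

    # Simple map-algebra style query: count cells lower than 10 m elevation.
    valid = dem[~np.isnan(dem)]
    print("low-lying cells (< 10 m):", int(np.count_nonzero(valid < 10.0)))
    ```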

  14. To ontologise or not to ontologise: An information model for a geospatial knowledge infrastructure

    NASA Astrophysics Data System (ADS)

    Stock, Kristin; Stojanovic, Tim; Reitsma, Femke; Ou, Yang; Bishr, Mohamed; Ortmann, Jens; Robertson, Anne

    2012-08-01

    A geospatial knowledge infrastructure consists of a set of interoperable components, including software, information, hardware, procedures and standards, that work together to support advanced discovery and creation of geoscientific resources, including publications, data sets and web services. The focus of the work presented is the development of such an infrastructure for resource discovery. Advanced resource discovery is intended to support scientists in finding resources that meet their needs, and focuses on representing the semantic details of the scientific resources, including the detailed aspects of the science that led to the resource being created. This paper describes an information model for a geospatial knowledge infrastructure that uses ontologies to represent these semantic details, including knowledge about domain concepts, the scientific elements of the resource (analysis methods, theories and scientific processes) and web services. This semantic information can be used to enable more intelligent search over scientific resources, and to support new ways to infer and visualise scientific knowledge. The work describes the requirements for semantic support of a knowledge infrastructure, and analyses the different options for information storage based on the twin goals of semantic richness and syntactic interoperability to allow communication between different infrastructures. Such interoperability is achieved by the use of open standards, and the architecture of the knowledge infrastructure adopts such standards, particularly from the geospatial community. The paper then describes an information model that uses a range of different types of ontologies, explaining those ontologies and their content. The information model was successfully implemented in a working geospatial knowledge infrastructure, but the evaluation identified some issues in creating the ontologies.

  15. The National Map product and services directory

    USGS Publications Warehouse

    Newell, Mark R.

    2008-01-01

    As one of the cornerstones of the U.S. Geological Survey's (USGS) National Geospatial Program (NGP), The National Map is a collaborative effort among the USGS and other Federal, state, and local partners to improve and deliver topographic information for the Nation. It has many uses ranging from recreation to scientific analysis to emergency response. The National Map is easily accessible for display on the Web, as products, and as downloadable data. The geographic information available from The National Map includes orthoimagery (aerial photographs), elevation, geographic names, hydrography, boundaries, transportation, structures, and land cover. Other types of geographic information can be added to create specific types of maps. Of major importance, The National Map currently is being transformed to better serve the geospatial community. The USGS National Geospatial Program Office (NGPO) was established to provide leadership for placing geographic knowledge at the fingertips of the Nation. The office supports The National Map, Geospatial One-Stop (GOS), National Atlas of the United States®, and the Federal Geographic Data Committee (FGDC). This integrated portfolio of geospatial information and data supports the essential components of delivering the National Spatial Data Infrastructure (NSDI) and capitalizing on the power of place.

  16. Geospatial tools for data-sharing : case studies of select transportation agencies

    DOT National Transportation Integrated Search

    2014-09-01

    This report provides case studies from 23 State Departments of Transportation (DOTs) and others that are developing, using, and maintaining a variety of geospatial applications and tools to support GDC goals. The report also summarizes the state of t...

  17. Qualitative-Geospatial Methods of Exploring Person-Place Transactions in Aging Adults: A Scoping Review.

    PubMed

    Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri

    2017-06-01

    Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation.

  18. MyGeoHub: A Collaborative Geospatial Research and Education Platform

    NASA Astrophysics Data System (ADS)

    Kalyanam, R.; Zhao, L.; Biehl, L. L.; Song, C. X.; Merwade, V.; Villoria, N.

    2017-12-01

    Scientific research is increasingly collaborative and globally distributed; research groups now rely on web-based scientific tools and data management systems to simplify their day-to-day collaborative workflows. However, such tools often lack seamless interfaces, requiring researchers to contend with manual data transfers, annotation, and sharing. MyGeoHub is a web platform that supports out-of-the-box, seamless workflows involving data ingestion, metadata extraction, analysis, sharing, and publication. MyGeoHub is built on the HUBzero cyberinfrastructure platform and adds general-purpose software building blocks (GABBs) for geospatial data management, visualization, and analysis. A data management building block, iData, processes geospatial files, extracting metadata for keyword and map-based search while enabling quick previews. iData is pervasive, allowing access through a web interface, scientific tools on MyGeoHub, or even mobile field devices via a data service API. GABBs includes a Python map library as well as map widgets that, in a few lines of code, generate complete geospatial visualization web interfaces for scientific tools. GABBs also includes powerful tools that can be used with no programming effort. The GeoBuilder tool provides an intuitive wizard for importing multi-variable, geo-located time series data (typical of sensor readings and GPS trackers) to build visualizations supporting data filtering and plotting. MyGeoHub has been used in tutorials at scientific conferences and in educational activities for K-12 students. MyGeoHub is also constantly evolving; the recent addition of Jupyter and R Shiny notebook environments enables reproducible, richly interactive geospatial analyses and applications ranging from simple pre-processing to published tools. MyGeoHub is not a monolithic geospatial science gateway; instead, it supports diverse needs, ranging from a feature-rich data management system to complex scientific tools and workflows.
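
    The GABBs widgets are specific to the HUBzero tool environment, so no attempt is made here to reproduce their API. As a generic illustration of how few lines a web-map visualization can take in Python, the sketch below uses the folium library with made-up station coordinates and readings.

    ```python
    # Generic sketch (folium, not the GABBs widget API): a web map of a few
    # hypothetical sensor stations, saved as a standalone HTML page.
    import folium

    stations = [
        ("Station A", 40.42, -86.91, 12.4),
        ("Station B", 40.48, -86.99, 9.7),
        ("Station C", 40.37, -86.85, 15.1),
    ]

    m = folium.Map(location=[40.42, -86.91], zoom_start=11)
    for name, lat, lon, reading in stations:
        folium.Marker([lat, lon], popup=f"{name}: {reading}").add_to(m)

    m.save("stations_map.html")   # open in a browser to explore interactively
    ```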

  19. Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events

    USGS Publications Warehouse

    Dinitz, Laura B.; Taketa, Richard A.

    2013-01-01

    This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.

  20. The U.S. Geological Survey cartographic and geographic information science research activities 2006-2010

    USGS Publications Warehouse

    Usery, E. Lynn

    2011-01-01

    The U.S. Geological Survey (USGS) produces geospatial databases and topographic maps for the United States of America. A part of that mission includes conducting research in geographic information science (GIScience) and cartography to support mapping and improve the design, quality, delivery, and use of geospatial data and topographic maps. The Center of Excellence for Geospatial Information Science (CEGIS) was established by the USGS in January 2006 as a part of the National Geospatial Program Office. CEGIS (http://cegis.usgs.gov) evolved from a team of cartographic researchers at the Mid-Continent Mapping Center. The team became known as the Cartographic Research group and was supported by the Cooperative Topographic Mapping, Geographic Analysis and Monitoring, and Land Remote Sensing programs of the Geography Discipline of the USGS from 1999-2005. In 2006, the Cartographic Research group and its projects (http://carto-research.er.usgs.gov/) became the core of CEGIS staff and research. In 2006, CEGIS research became focused on The National Map (http://nationalmap.gov).

  1. EPA Guidance for Geospatially Related Quality Assurance Project Plans

    EPA Pesticide Factsheets

    This March 2003 document discusses EPA's Quality Assurance (QA) Project Plan as a tool for project managers and planners to document the type and quality of data and information needed for making environmental decisions

  2. SERVIR: From Space to Village

    NASA Technical Reports Server (NTRS)

    Irwin, Dan

    2012-01-01

    SERVIR is a NASA/USAID partnership to improve environmental management and resilience to climate change by strengthening the capacity of governments and other key stakeholders to integrate earth observation information and geospatial technologies into development decision-making

  3. Geospatial optimization of siting large-scale solar projects

    USGS Publications Warehouse

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet; Gerritsen, Margot; Diffendorfer, James E.; Haines, Seth S.

    2014-01-01

    guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  4. Enabling joined-up decision making with geotemporal information

    NASA Astrophysics Data System (ADS)

    Smith, M. J.; Ahmed, S. E.; Purves, D. W.; Emmott, S.; Joppa, L. N.; Caldararu, S.; Visconti, P.; Newbold, T.; Formica, A. F.

    2015-12-01

    While the use of geospatial data to assist in decision making is becoming increasingly common, the use of geotemporal information, that is, information that can be indexed by both geographical space and time, is much rarer. I will describe our scientific research and software development efforts intended to advance the availability and use of geotemporal information in general. I will show two recent examples of "stacking" geotemporal information to support land use decision making in the Brazilian Amazon and Kenya, involving data-constrained predictive models and empirically derived datasets of road development, deforestation, carbon, agricultural yields, water purification, and poverty alleviation services, and will show how we use trade-off analyses and constraint reasoning algorithms to explore the costs and benefits of different decisions. For the Brazilian Amazon we explore trade-offs involved in different deforestation scenarios, while for Kenya we explore the impacts of conserving forest to support international carbon conservation initiatives (REDD+). I will also illustrate the cloud-based software tools we have developed to enable anyone to access geotemporal information, gridded (e.g. climate) or non-gridded (e.g. protected areas), for the past, present, or future and incorporate such information into their analyses (e.g. www.fetchclimate.org), including how we train new predictive models on such data using Bayesian techniques; on this latter point I will show how we combine satellite and ground-measured data with predictive models to forecast how crops might respond to climate change.

  5. SERVIR: Environmental Decision Making in the Americas

    NASA Technical Reports Server (NTRS)

    Lapenta, William; Irwin, Dan

    2008-01-01

    SERVIR is a regional visualization and monitoring system for Mesoamerica that integrates satellite and other geospatial data for improved scientific knowledge and decision making by managers, researchers, students, and the general public. SERVIR addresses the nine societal benefit areas of the Global Earth Observation System of Systems (GEOSS). This talk will provide an overview of products and services available through SERVIR.

  6. Temporal geospatial analysis of secondary school students’ examination performance

    NASA Astrophysics Data System (ADS)

    Nik Abd Kadir, ND; Adnan, NA

    2016-06-01

    Malaysia's Ministry of Education has improved the organization of its data by establishing a geographical information system (GIS) school database. However, no further analysis has been done using geospatial analysis tools. Mapping has emerged as a communication tool and an effective way to publish digital and statistical data such as school performance results. The objective of this study is to analyse secondary school students' science and mathematics scores in the Sijil Pelajaran Malaysia Examination from 2010 to 2014 for Kelantan's state schools, with the aid of GIS software and geospatial analysis. School performance according to school grade point average (GPA), from Grade A to Grade G, was interpolated and mapped, and query analysis was carried out using geospatial tools. This study will benefit the education sector in analysing student performance not only in Kelantan but across Malaysia, and mapping offers a good way to publish the results in support of better planning and decision making to prepare young Malaysians for the challenges of the education system.
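
    The interpolation step mentioned above can be illustrated with a short, hedged sketch (not the study's actual workflow): point-based school GPA values at invented coordinates are interpolated onto a regular grid with SciPy, producing the kind of surface that would then be mapped in a GIS.

    ```python
    # Illustrative sketch only: interpolating point-based school GPA scores onto a
    # grid. Coordinates and GPA values are hypothetical; real work would use
    # projected school locations.
    import numpy as np
    from scipy.interpolate import griddata

    # (x, y) school locations in an arbitrary projected CRS, and their GPA scores
    schools = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 4.0], [7.0, 6.0], [4.0, 8.0]])
    gpa = np.array([3.1, 5.8, 4.2, 6.5, 2.9])

    grid_x, grid_y = np.meshgrid(np.linspace(0, 8, 50), np.linspace(0, 8, 50))
    surface = griddata(schools, gpa, (grid_x, grid_y), method="linear")

    print("interpolated GPA at the centre of the grid:", surface[25, 25])
    ```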

  7. Not Just a Game … When We Play Together, We Learn Together: Interactive Virtual Environments and Gaming Engines for Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Shipman, J. S.; Anderson, J. W.

    2017-12-01

    An ideal tool for ecologists and land managers to investigate the impacts of both projected environmental changes and policy alternatives is the creation of immersive, interactive, virtual landscapes. As a new frontier in visualizing and understanding geospatial data, virtual landscapes require a new toolbox for data visualization that includes traditional GIS tools as well as less common ones such as the Unity3d game engine. Game engines provide capabilities not only to explore data but to build and interact with dynamic models collaboratively. These virtual worlds can be used to display and illustrate data in ways that are often more understandable and plausible to both stakeholders and policy makers than traditional maps. Within this context we will present funded research that has been developed utilizing virtual landscapes for geographic visualization and decision support among varied stakeholders. We will highlight the challenges and lessons learned when developing interactive virtual environments that require large multidisciplinary team efforts with varied competencies. The results will emphasize the importance of visualization and interactive virtual environments and the link with emerging research disciplines within Visual Analytics.

  8. Geospatial Analysis of Climate-Related Changes in North American Arctic Ecosystems and Implications for Terrestrial Flora and Fauna

    NASA Astrophysics Data System (ADS)

    Amirazodi, S.; Griffin, R.

    2016-12-01

    Climate change induces range shifts among many terrestrial species in Arctic regions. At best, warming often forces poleward migration if a stable environment is to be maintained. At worst, marginal ecosystems may disappear entirely without a contiguous shift allowing migratory escape to similar environs. These changing migration patterns and poleward range expansion push species into higher latitudes where ecosystems are less stable and more sensitive to change. This project focuses on ecosystem geography and interspecies relationships and interactions by analyzing seasonality and changes over time in variables including the following: temperature, precipitation, vegetation, physical boundaries, population demographics, permafrost, sea ice, and food and water availability. Publicly available data from remote sensing platforms are used throughout, and processed with both commercially available and open sourced GIS tools. This analysis describes observed range changes for selected North American species, and attempts to provide insight into the causes and effects of these phenomena. As the responses to climate change are complex and varied, the goal is to produce the aforementioned results in an easily understood set of geospatial representations to better support decision making regarding conservation prioritization and enable adaptive responses and mitigation strategies.

  9. Geospatial Based Information System Development in Public Administration for Sustainable Development and Planning in Urban Environment

    NASA Astrophysics Data System (ADS)

    Kouziokas, Georgios N.

    2016-09-01

    It is generally agreed that governmental authorities should actively encourage the development of an efficient framework of information and communication technology initiatives so as to advance and promote sustainable development and planning strategies. This paper presents a prototype information system for public administration designed to facilitate public management and decision making for sustainable development and planning. The system was developed using several programming languages and programming tools, together with a database management system (DBMS) for storing and managing urban data of many kinds. Furthermore, geographic information systems were incorporated into the system to make it possible for the authorities to deal with issues of a spatial nature, such as spatial planning. The developed system provides technology-based management of geospatial information and of environmental and crime data for the urban environment, aiming to improve public decision making and to contribute to more efficient sustainable development and planning.

  10. Applying a Geospatial Visualization Based on USSD Messages to Real Time Identification of Epidemiological Risk Areas in Developing Countries: A Case of Study of Paraguay.

    PubMed

    Ochoa, Silvia; Talavera, Julia; Paciello, Julio

    2015-01-01

    The identification of epidemiological risk areas is one of the major problems in public health, and information management strategies are needed to facilitate prevention and control of disease in the affected areas. This paper presents a model to optimize the geographical data collection of suspected or confirmed disease occurrences using Unstructured Supplementary Service Data (USSD) mobile technology, considering its wide adoption even in developing countries such as Paraguay. A Geographic Information System (GIS) is proposed for visualizing potential epidemiological risk areas in real time, with the aim of supporting decision making and the implementation of prevention or contingency programs for public health.
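
    A minimal sketch of the data-collection idea is given below, with an invented USSD report format and locality codes (the paper's actual message structure is not reproduced here): structured reports are decoded and aggregated per locality so that a GIS layer can shade potential risk areas.

    ```python
    # Hypothetical sketch: aggregating USSD-style case reports per locality so a
    # GIS layer can shade potential risk areas. The message format, disease and
    # locality codes, and the locality-to-coordinate lookup are invented.
    from collections import Counter

    # "*123*<disease>*<locality>#" is an assumed report format
    raw_reports = ["*123*DEN*ASU#", "*123*DEN*CDE#", "*123*ZIK*ASU#", "*123*DEN*ASU#"]

    locality_coords = {"ASU": (-25.30, -57.63), "CDE": (-25.51, -54.61)}  # hypothetical

    counts = Counter()
    for msg in raw_reports:
        _, _, disease, locality = msg.strip("#").split("*")
        counts[(disease, locality)] += 1

    for (disease, locality), n in counts.items():
        lat, lon = locality_coords[locality]
        print(f"{disease} reports in {locality} ({lat}, {lon}): {n}")
    ```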

  11. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  12. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  13. Modeling Being "Lost": Imperfect Situation Awareness

    NASA Technical Reports Server (NTRS)

    Middleton, Victor E.

    2011-01-01

    Being "lost" is an exemplar of imperfect Situation Awareness/Situation Understanding (SA/SU) -- information/knowledge that is uncertain, incomplete, and/or just wrong. Being "lost" may be a geo-spatial condition - not knowing/being wrong about where to go or how to get there. More broadly, being "lost" can serve as a metaphor for uncertainty and/or inaccuracy - not knowing/being wrong about how one fits into a larger world view, what one wants to do, or how to do it. This paper discusses using agent based modeling (ABM) to explore imperfect SA/SU, simulating geo-spatially "lost" intelligent agents trying to navigate in a virtual world. Each agent has a unique "mental map" -- its idiosyncratic view of its geo-spatial environment. Its decisions are based on this idiosyncratic view, but behavior outcomes are based on ground truth. Consequently, the rate and degree to which an agent's expectations diverge from ground truth provide measures of that agent's SA/SU.

  14. Effects of ensemble and summary displays on interpretations of geospatial uncertainty data.

    PubMed

    Padilla, Lace M; Ruginski, Ian T; Creem-Regehr, Sarah H

    2017-01-01

    Ensemble and summary displays are two widely used methods to represent visual-spatial uncertainty; however, there is disagreement about which is the most effective technique to communicate uncertainty to the general public. Visualization scientists create ensemble displays by plotting multiple data points on the same Cartesian coordinate plane. Despite their use in scientific practice, it is more common in public presentations to use visualizations of summary displays, which scientists create by plotting statistical parameters of the ensemble members. While prior work has demonstrated that viewers make different decisions when viewing summary and ensemble displays, it is unclear what components of the displays lead to diverging judgments. This study aims to compare the salience of visual features - or visual elements that attract bottom-up attention - as one possible source of diverging judgments made with ensemble and summary displays in the context of hurricane track forecasts. We report that salient visual features of both ensemble and summary displays influence participant judgment. Specifically, we find that salient features of summary displays of geospatial uncertainty can be misunderstood as displaying size information. Further, salient features of ensemble displays evoke judgments that are indicative of accurate interpretations of the underlying probability distribution of the ensemble data. However, when participants use ensemble displays to make point-based judgments, they may overweight individual ensemble members in their decision-making process. We propose that ensemble displays are a promising alternative to summary displays in a geospatial context but that decisions about visualization methods should be informed by the viewer's task.
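
    The two display types can be mocked up quickly for illustration. The sketch below (synthetic tracks, not the study's hurricane forecasts) renders the same set of forecast trajectories once as an ensemble display and once as a summary display showing the mean with a spread band.

    ```python
    # Illustrative sketch (matplotlib, synthetic tracks): the same forecast shown as
    # an ensemble display (individual members) and as a summary display (mean track
    # with a spread band), the two presentation styles compared in the study.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 30)
    members = [t * 10 + np.cumsum(rng.normal(0, 0.25, t.size)) for _ in range(20)]

    fig, (ax_ens, ax_sum) = plt.subplots(1, 2, figsize=(9, 3), sharey=True)

    for m in members:                                   # ensemble display
        ax_ens.plot(t, m, color="steelblue", alpha=0.4)
    ax_ens.set_title("Ensemble display")

    stack = np.vstack(members)                          # summary display
    mean, sd = stack.mean(axis=0), stack.std(axis=0)
    ax_sum.plot(t, mean, color="darkred")
    ax_sum.fill_between(t, mean - 2 * sd, mean + 2 * sd, color="darkred", alpha=0.2)
    ax_sum.set_title("Summary display (mean and 2-SD band)")

    plt.tight_layout()
    plt.savefig("uncertainty_displays.png")
    ```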

  15. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The web application presented here supports plant production and works with the Neo4j graph database shell to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is fully justified in the case of agricultural holdings that have a wide range of crop types and sizes. In addition, the authors tested the option of using Microsoft Cognitive Services within the application to enable image analysis through the services provided. The presented application was designed using ASP.NET MVC technology and a wide range of leading IT tools.
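
    A minimal sketch of this kind of graph query is given below, assuming a local Neo4j instance and a hypothetical data model in which field nodes carry crop and coordinate properties and are linked to observation nodes holding a vegetation-index value; it uses the official Neo4j Python driver and Cypher.

    ```python
    # Minimal sketch, assuming a local Neo4j instance and a hypothetical data model
    # in which (:Field) nodes carry name and centroid coordinates and are linked to
    # (:Observation) nodes holding a vegetation-index (NDVI) value.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    CYPHER = """
    MATCH (f:Field)-[:HAS_OBSERVATION]->(o:Observation)
    WHERE o.ndvi < $threshold
    RETURN f.name AS field, f.lat AS lat, f.lon AS lon, o.ndvi AS ndvi
    """

    with driver.session() as session:
        # Flag fields whose latest observation suggests poor crop condition.
        for record in session.run(CYPHER, threshold=0.4):
            print(record["field"], record["lat"], record["lon"], record["ndvi"])

    driver.close()
    ```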

  16. Rapid Data Delivery System (RDDS)

    USGS Publications Warehouse

    Cress, Jill J.; Goplen, Susan E.

    2007-01-01

    Since the start of the active 2000 summer fire season, the U. S. Geological Survey (USGS) Rocky Mountain Geographic Science Center (RMGSC) has been actively engaged in providing crucial and timely support to Federal, State, and local natural hazards monitoring, analysis, response, and recovery activities. As part of this support, RMGSC has developed the Rapid Data Delivery System (RDDS) to provide emergency and incident response teams with timely access to geospatial data. The RDDS meets these needs by combining a simple web-enabled data viewer for the selection and preview of vector and raster geospatial data with an easy to use data ordering form. The RDDS viewer also incorporates geospatial locations for current natural hazard incidents, including wildfires, earthquakes, hurricanes, and volcanoes, allowing incident responders to quickly focus on their area of interest for data selection.

  17. LDRD final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brost, Randolph C.; McLendon, William Clarence,

    2013-01-01

    Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
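
    The report's own graph structure is not reproduced here; as a generic illustration of a graph whose nodes carry both a location and a validity interval, the sketch below builds a tiny semantic graph with networkx and runs a combined spatial-and-temporal query over it. All features, coordinates, and intervals are invented.

    ```python
    # Generic sketch (networkx), not the report's implementation: a small semantic
    # graph whose nodes carry a location and a validity interval, queried by both
    # spatial proximity and temporal coverage.
    import math
    import networkx as nx

    G = nx.Graph()
    G.add_node("building_1", kind="building", lon=-106.60, lat=35.10, start=2010, end=2013)
    G.add_node("road_7", kind="road", lon=-106.61, lat=35.11, start=2008, end=2013)
    G.add_node("pond_3", kind="water", lon=-106.90, lat=35.40, start=2012, end=2013)
    G.add_edge("building_1", "road_7", relation="adjacent_to")

    def near(a, b, max_deg=0.05):
        return math.dist((a["lon"], a["lat"]), (b["lon"], b["lat"])) <= max_deg

    def active(attrs, year):
        return attrs["start"] <= year <= attrs["end"]

    # Query: features near building_1 that were present in 2012.
    anchor = G.nodes["building_1"]
    hits = [n for n, attrs in G.nodes(data=True)
            if n != "building_1" and near(anchor, attrs) and active(attrs, 2012)]
    print(hits)   # -> ['road_7']
    ```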

  18. United States Geological Survey (USGS) Natural Hazards Response

    USGS Publications Warehouse

    Lamb, Rynn M.; Jones, Brenda K.

    2012-01-01

    The primary goal of U.S. Geological Survey (USGS) Natural Hazards Response is to ensure that the disaster response community has access to timely, accurate, and relevant geospatial products, imagery, and services during and after an emergency event. To accomplish this goal, products and services provided by the National Geospatial Program (NGP) and Land Remote Sensing (LRS) Program serve as a geospatial framework for mapping activities of the emergency response community. Post-event imagery and analysis can provide important and timely information about the extent and severity of an event. USGS Natural Hazards Response will also support the coordination of remotely sensed data acquisitions, image distribution, and authoritative geospatial information production as required for use in disaster preparedness, response, and recovery operations.

  19. Geospatial Tools for Evaluating Ecosystems in Lakes and Ponds of the Northeastern US

    EPA Science Inventory

    Northeastern lakes benefit residents and visitors by providing valuable ecosystem services such as nutrient retention, recreational opportunities, and aesthetic value. Concurrently, however, complex changes such as landscape change, population growth, and management decisions influ...

  20. Cloud-Based Data Sharing Connects Emergency Managers

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Under an SBIR contract with Stennis Space Center, Baltimore-based StormCenter Communications Inc. developed an improved interoperable platform for sharing geospatial data over the Internet in real time, information that is critical for decision makers in emergency situations.

  1. Progress of Interoperability in Planetary Research for Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Gaddis, L. R.

    2015-12-01

    For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes), or any data that can be tied to the surface of a planetary body (including moons, comets, or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTiff, GeoJpeg2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Mapping Services (simple image maps), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access to higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of the OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications as well as public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and the challenges that remain.
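
    For readers unfamiliar with the OGC services named above, the sketch below shows what a WMS 1.3.0 GetMap request looks like in practice; the endpoint URL, layer name, and CRS code are placeholders rather than an actual planetary service.

    ```python
    # Generic sketch of an OGC WMS 1.3.0 GetMap request of the kind described above;
    # the endpoint, layer name, and CRS are placeholders, not a real service.
    import requests

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "mars_shaded_relief",          # hypothetical layer
        "STYLES": "",
        "CRS": "EPSG:4326",                      # placeholder; planetary CRS codes differ
        "BBOX": "-30,0,30,90",                   # WMS 1.3.0 + EPSG:4326: minlat,minlon,maxlat,maxlon
        "WIDTH": 1024,
        "HEIGHT": 683,
        "FORMAT": "image/png",
    }

    response = requests.get("https://example.org/planetary/wms", params=params, timeout=60)
    response.raise_for_status()
    with open("relief_subset.png", "wb") as f:
        f.write(response.content)
    ```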

  2. OnEarth: An Open Source Solution for Efficiently Serving High-Resolution Mapped Image Products

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Plesea, L.; Hall, J. R.; Roberts, J. T.; Cechini, M. F.; Schmaltz, J. E.; Alarcon, C.; Huang, T.; McGann, J. M.; Chang, G.; Boller, R. A.; Ilavajhala, S.; Murphy, K. J.; Bingham, A. W.

    2013-12-01

    This presentation introduces OnEarth, a server side software package originally developed at the Jet Propulsion Laboratory (JPL), that facilitates network-based, minimum-latency geolocated image access independent of image size or spatial resolution. The key component in this package is the Meta Raster Format (MRF), a specialized raster file extension to the Geospatial Data Abstraction Library (GDAL) consisting of an internal indexed pyramid of image tiles. Imagery to be served is converted to the MRF format and made accessible online via an expandable set of server modules handling requests in several common protocols, including the Open Geospatial Consortium (OGC) compliant Web Map Tile Service (WMTS) as well as Tiled WMS and Keyhole Markup Language (KML). OnEarth has recently transitioned to open source status and is maintained and actively developed as part of GIBS (Global Imagery Browse Services), a collaborative project between JPL and Goddard Space Flight Center (GSFC). The primary function of GIBS is to enhance and streamline the data discovery process and to support near real-time (NRT) applications via the expeditious ingestion and serving of full-resolution imagery representing science products from across the NASA Earth Science spectrum. Open source software solutions are leveraged where possible in order to utilize existing available technologies, reduce development time, and enlist wider community participation. We will discuss some of the factors and decision points in transitioning OnEarth to a suitable open source paradigm, including repository and licensing agreement decision points, institutional hurdles, and perceived benefits. We will also provide examples illustrating how OnEarth is integrated within GIBS and other applications.
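
    As a rough illustration of the tiled access pattern OnEarth serves, the sketch below issues an OGC WMTS key-value GetTile request; the endpoint, tile matrix set, and time value are placeholders (the layer name follows the GIBS public naming style), and real endpoints and URL patterns should be taken from the GIBS documentation.

    ```python
    # Generic sketch of an OGC WMTS key-value GetTile request of the kind OnEarth
    # answers; endpoint, tile matrix set, indices, and date are placeholders.
    import requests

    params = {
        "SERVICE": "WMTS",
        "REQUEST": "GetTile",
        "VERSION": "1.0.0",
        "LAYER": "MODIS_Terra_CorrectedReflectance_TrueColor",  # GIBS-style layer name
        "STYLE": "default",
        "TILEMATRIXSET": "EPSG4326_250m",                       # placeholder
        "TILEMATRIX": "3",
        "TILEROW": 2,
        "TILECOL": 5,
        "FORMAT": "image/jpeg",
        "TIME": "2013-12-01",                                   # placeholder date dimension
    }

    r = requests.get("https://example.org/onearth/wmts.cgi", params=params, timeout=60)
    r.raise_for_status()
    with open("tile.jpg", "wb") as f:
        f.write(r.content)
    ```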

  3. SERVIR: From Space to Village. A Regional Monitoring and Visualization System For Environmental Management Using Satellite Applications For Sustainable Development

    NASA Technical Reports Server (NTRS)

    Sever, Tom; Stahl, H. Philip; Irwin, Dan; Lee, Daniel

    2007-01-01

    NASA is committed to providing technological support and expertise to regional and national organizations for earth science monitoring and analysis. This commitment is exemplified by NASA's long-term relationship with Central America. The focus of these efforts has primarily been to measure the impact of human development on the environment and to provide data for the management of human settlement and expansion in the region. Now, NASA is planning to extend and expand this capability to other regions of the world, including Africa and the Caribbean. NASA began using satellite imagery over twenty-five years ago to locate important Maya archeological sites in Mesoamerica and to quantify the effect of deforestation on those sites. Continuing that mission, NASA has partnered with the U.S. Agency for International Development (USAID), the World Bank, the Water Center for the Humid Tropics of Latin America and the Caribbean (CATHALAC), and the Central American Commission for Environment and Development (CCAD) to develop SERVIR (Sistema Regional de Visualizacion y Monitoreo) for the Mesoamerican Biological Corridor. SERVIR has become one of the most important aspects of NASA's geospatial efforts in Central America by establishing a common access portal for information that affects the lives, livelihood, and future of everyone in the region. SERVIR, most commonly referred to as a regional visualization and monitoring system, is a scientific and technological platform that integrates satellite and other geospatial data sets to generate tools for improved decision-making capabilities. It has a collection of data and models that are easily accessible to earth science managers, first responders, NGOs (non-governmental organizations), and a host of others. SERVIR is currently used to monitor and forecast ecological changes as well as to provide information for decision support during severe events such as forest fires, red tides, and tropical storms. Additionally, SERVIR addresses the nine societal benefit areas of the Global Earth Observation System of Systems (GEOSS): disasters, ecosystems, biodiversity, weather, water, climate, health, agriculture, and energy.

  4. The AgESGUI geospatial simulation system for environmental model application and evaluation

    USDA-ARS?s Scientific Manuscript database

    Practical decision making in spatially-distributed environmental assessment and management is increasingly being based on environmental process-based models linked to geographical information systems (GIS). Furthermore, powerful computers and Internet-accessible assessment tools are providing much g...

  5. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of open data and data from collaborative mapping projects has increased interest in tools, procedures, and methods for handling spatially related information. Free and Open Source Software projects devoted to geospatial data handling are gaining considerable success, as the use of interoperable formats and protocols allows users to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software landscape to their specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should naturally be encouraged to take part in the development process of these software projects, as this represents a very agile way for several institutions to interact. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different disciplines in an international setting. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation and planning (SPICE) are distributed along with their source code, and the interaction between users and developers is often very close, creating a continuum between the two roles. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats, and network protocols make it possible to extend existing tools and methods developed for Earth-based problems to the study of other solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial work will be presented, along with the benefits, and solutions to the possible drawbacks, of the effort required to use, support, and contribute to these projects.

  6. Preserving the Finger Lakes for the Future: A Prototype Decision Support System for Water Resource Management, Open Space, and Agricultural Protection

    NASA Technical Reports Server (NTRS)

    Brower, Robert

    2004-01-01

    This report summarizes the activity conducted under NASA Grant NAG13-02059, entitled "Preserving the Finger Lakes for the Future: A Prototype Decision Support System for Water Resources Management, Open Space and Agricultural Protection," for the period of September 26, 2003 to September 25, 2004. The RACNE continues to utilize the services of its affiliate, the Institute for the Application of Geospatial Technology at Cayuga Community College, Inc. (IAGT), for the purposes of this project under its permanent operating agreement with IAGT. IAGT is a 501(c)(3) not-for-profit corporation created by the RACNE for the purpose of carrying out its programmatic and administrative mission. The "Preserving the Finger Lakes for the Future" project has progressed and evolved as planned, with the continuation or initiation of a number of program facets at the programmatic, technical, and inter-agency levels. The project has grown, starting with the well-received core concept of the Virtual Management Operations Center (VMOC), to the functional Watershed Virtual Management Operations Center (W-VMOC) prototype, to the more advanced Finger Lakes Decision Support System (FLDSS) prototype, deployed for evaluation and assessment to a wide variety of agencies and organizations in the Finger Lakes region and beyond. This suite of tools offers the advanced, compelling functionality of interactive 3D visualization interfaced with 2D mapping, all accessed via the Internet or virtually any kind of distributed computer network.

  7. Two decision-support tools for assessing the potential effects of energy development on hydrologic resources as part of the Energy and Environment in the Rocky Mountain Area interactive energy atlas

    USGS Publications Warehouse

    Linard, Joshua I.; Matherne, Anne Marie; Leib, Kenneth J.; Carr, Natasha B.; Diffendorfer, James E.; Hawkins, Sarah J.; Latysh, Natalie; Ignizio, Drew A.; Babel, Nils C.

    2014-01-01

    The U.S. Geological Survey project—Energy and Environment in the Rocky Mountain Area (EERMA)—has developed a set of virtual tools in the form of an online interactive energy atlas for Colorado and New Mexico to facilitate access to geospatial data related to energy resources, energy infrastructure, and natural resources that may be affected by energy development. The interactive energy atlas currently (2014) consists of three components: (1) a series of interactive maps; (2) downloadable geospatial datasets; and (3) decision-support tools, including two maps related to hydrologic resources discussed in this report. The hydrologic-resource maps can be used to examine the potential effects of energy development on hydrologic resources with respect to (1) groundwater vulnerability, by using the depth to water, recharge, aquifer media, soil media, topography, impact of the vadose zone, and hydraulic conductivity of the aquifer (DRASTIC) model, and (2) landscape erosion potential, by using the revised universal soil loss equation (RUSLE). The DRASTIC aquifer vulnerability index value for the two-State area ranges from 48 to 199. Higher values, indicating greater relative aquifer vulnerability, are centered in south-central Colorado, areas in southeastern New Mexico, and along riparian corridors in both States—all areas where the water table is relatively close to the land surface and the aquifer is more susceptible to surface influences. As calculated by the RUSLE model, potential mean annual erosion, as soil loss in units of tons per acre per year, ranges from 0 to 12,576 over the two-State area. The RUSLE model calculated low erosion potential over most of Colorado and New Mexico, with predictions of highest erosion potential largely confined to areas of mountains or escarpments. An example is presented of how a fully interactive RUSLE model could be further used as a decision-support tool to evaluate the potential hydrologic effects of energy development on a site-specific basis and to explore the effectiveness of various mitigation practices.
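
    Both indices reduce to simple per-cell arithmetic, which the hedged sketch below works through with hypothetical ratings and factor values; only the standard DRASTIC weights and the multiplicative RUSLE equation A = R * K * LS * C * P come from the methods themselves.

    ```python
    # Worked per-cell sketch of the two indices named above. The ratings and factor
    # values are hypothetical; only the formulas (standard DRASTIC weights and the
    # multiplicative RUSLE equation) come from the methods themselves.

    # DRASTIC: weighted sum of seven rated parameters (Depth to water, Recharge,
    # Aquifer media, Soil media, Topography, Impact of vadose zone, Conductivity).
    drastic_weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
    cell_ratings = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 10, "I": 8, "C": 4}  # 1-10 scale
    drastic_index = sum(drastic_weights[k] * cell_ratings[k] for k in drastic_weights)
    print("DRASTIC vulnerability index:", drastic_index)   # falls within the reported 48-199 range

    # RUSLE: A = R * K * LS * C * P (annual soil loss, e.g. tons per acre per year)
    R, K, LS, C, P = 45.0, 0.28, 1.6, 0.12, 1.0            # hypothetical factor values
    A = R * K * LS * C * P
    print("RUSLE annual soil loss estimate:", round(A, 2))
    ```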

  8. Keeping Current and Increasing The Effectiveness of the Decision-Making Process and the Interoperability in the Digital Age: Geospatial Intelligence and Geospatial Information Systems’ Applications in the Military and Intelligence Fields for the Mexican Navy

    DTIC Science & Technology

    2008-12-01

    Indexed full-text fragments from this record mention, as an example of analysis in the GEOINT context, the Sistema Nacional de Seguridad Pública, SNSP (National System of Public Security), and a GIS named Sistema de Información Geográfica del Atlas Nacional de Riesgos (GIS national risk atlas); a cited reference points to a Sage Publications research guide, 3rd ed. (London; Thousand Oaks, Calif.: Sage Publications, 1999), http://www.loc.gov/catdir/toc/fy042/99214121.html.

  9. Wind Maps | Geospatial Data Science | NREL

    Science.gov Websites

    Wind Prospector: a GIS application that supports resource assessment and data exploration for wind development. The page also presents a collection of wind maps and assessments detailing the wind resource, including the national wind resource assessment, from NREL's Geospatial Data Science Team.

  10. Spatial Thinking Assists Geographic Thinking: Evidence from a Study Exploring the Effects of Geospatial Technology

    ERIC Educational Resources Information Center

    Metoyer, Sandra; Bednarz, Robert

    2017-01-01

    This article provides a description and discussion of an exploratory research study that examined the effects of using geospatial technology (GST) on high school students' spatial skills and spatial-relations content knowledge. It presents results that support the use of GST to teach spatially dependent content. It also provides indication of an…

  11. CROSS-CUTTING QA ISSUES INVOLVING GEOSPATIAL SCIENCES, CHEMISTRY, INFORMATION MANAGEMENT, AND LAW

    EPA Science Inventory

    The Agency spends hundreds of millions of dollars annually collecting and processing environmental data for scientific research and regulatory decision making. In addition, the regulated community may spend as much or more each year responding to Agency compliance requirements. ...

  12. Geospatial Tools for Prevention of Urban Floods Case Study: River of EL Maleh (city of Mohammedia - Morocco)

    NASA Astrophysics Data System (ADS)

    Chaabane, M. S.; Abouali, N.; Boumeaza, T.; Zahouily, M.

    2017-11-01

    Today, prevention and risk management occupy an important part of public policy and are considered major components of the sustainable development of territories. With the expansion of information technology, and of the geomatics sciences in particular, decision-makers increasingly request digital tools for use before, during, and after natural disasters. Geographic information systems (GIS) and remote sensing are fundamental geospatial tools that help to understand the evolution of risks, to analyze their temporality, and to make the right decisions. The historic events of 1996, 2002, and 2010 that struck the city of Mohammedia, causing substantial damage to vital infrastructure and private property, call for a thorough and rational analysis so that future floods can be better managed. This article presents (i) the contribution of geospatial tools to flood simulation for the Oued El Maleh at various return periods; these tools allow the delineation of flood-risk areas and the simulation of several scenarios (10-year, 20-year, 50-year, 100-year, 500-year, and 1,000-year floods); and (ii) a synthesis map combining the territorial stakes superposed on the flood scenarios at the different return periods.

  13. Environmental impact assessment of transportation projects: An analysis using an integrated GIS, remote sensing, and spatial modeling approach

    NASA Astrophysics Data System (ADS)

    El-Gafy, Mohamed Anwar

    Transportation projects affect the environment, and the general pollution and damage caused by roads are closely associated with the level of economic activity. Although Environmental Impact Assessments (EIAs) depend on geospatial information, there are no fixed rules for how to conduct an assessment; the objective of each assessment is dictated case by case, based on the information and analyses required. The conventional EIA process is time consuming because it involves a large number of dependent and independent variables, each with different consequences. Drawing on satellite remote sensing technology and Geographic Information Systems (GIS), this research presents a new framework for the analysis phase of the EIA for transportation projects based on the integration of remote sensing, GIS, and spatial modeling. By combining the merits of the map-overlay method and the matrix method, the framework comprehensively analyzes the environmental vulnerability around a road and the road's impact on the environment. The framework is expected to: (1) improve the quality of the decision-making process; (2) apply to both urban and inter-urban projects, regardless of transport mode; and (3) present data and analyses that support decision-makers and allow them to present the material at public hearings in a simple manner. Case studies of transportation projects in the State of Florida illustrate the use of the decision-support framework and demonstrate its capabilities. This cohesive, integrated system facilitates rational decisions through cost-effective coordination of environmental information and data management that can be tailored to specific projects, and it supports collecting, organizing, analyzing, archiving, and coordinating the information needed for technical and policy transportation decisions.
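
    The map-overlay idea mentioned above can be illustrated with a minimal sketch, assuming the environmental themes have already been rasterized to a common grid; the layer names, cell values, and weights below are hypothetical and only show the weighted-overlay mechanics, not the paper's actual framework.

      import numpy as np

      # Hypothetical 4x4 rasters (0-1 scores) for three environmental themes
      # along a proposed road corridor; in practice these would come from GIS layers.
      land_cover_sensitivity = np.array([[0.2, 0.4, 0.9, 0.8],
                                         [0.1, 0.3, 0.7, 0.9],
                                         [0.1, 0.2, 0.5, 0.6],
                                         [0.0, 0.1, 0.3, 0.4]])
      proximity_to_wetlands  = np.array([[0.0, 0.1, 0.8, 1.0],
                                         [0.0, 0.2, 0.6, 0.9],
                                         [0.0, 0.1, 0.4, 0.5],
                                         [0.0, 0.0, 0.2, 0.3]])
      noise_exposure         = np.array([[0.5, 0.5, 0.6, 0.7],
                                         [0.4, 0.5, 0.6, 0.8],
                                         [0.3, 0.4, 0.5, 0.6],
                                         [0.2, 0.3, 0.4, 0.5]])

      # Weighted overlay: matrix-method-style weights (must sum to 1).
      weights = {"land_cover": 0.5, "wetlands": 0.3, "noise": 0.2}
      vulnerability = (weights["land_cover"] * land_cover_sensitivity
                       + weights["wetlands"] * proximity_to_wetlands
                       + weights["noise"] * noise_exposure)

      print(np.round(vulnerability, 2))  # composite vulnerability surface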

  14. International boundary experiences by the United Nations

    NASA Astrophysics Data System (ADS)

    Kagawa, A.

    2013-12-01

    Over the last few decades, the United Nations (UN) has been approached by the Security Council and Member States on international boundary issues. The United Nations regards the adequate delimitation and demarcation of international boundaries as a very important element for the maintenance of peace and security in fragile post-conflict situations and for the establishment of friendly relationships and cross-border cooperation between States. This paper presents the main principles and framework the United Nations applies to support international boundary delimitation and demarcation activities. The United Nations is involved in international boundary issues following the principle of impartiality and neutrality and its role as mediator. Because international boundary issues are multi-faceted, a range of expertise is required, and the United Nations Secretariat is well positioned to provide diverse expertise across its departments: legal, political, technical, administrative, and logistical capabilities are mobilised in different ways to support Member States, depending on their specific needs. This presentation highlights some of the international boundary projects in which the United Nations Cartographic Section has been involved in order to provide technical support, since each boundary issue requires specific focus and attention, whether in preparation, delimitation, demarcation, or management. Increasingly, the United Nations is leveraging geospatial technology to facilitate the boundary delimitation and demarcation process between Member States. Case studies ranging from Iraq - Kuwait, Israel - Lebanon (Blue Line), Eritrea - Ethiopia, Cyprus (Green Line), and Cameroon - Nigeria to Sudan - South Sudan illustrate how geospatial technology is increasingly used to carry out this support. In applying a range of geospatial solutions, good practices from preceding projects have been carried forward, but challenges and limitations have also been encountered; these challenges should be seen as opportunities to improve geospatial technology solutions in future international boundary projects. The presentation also shares the aspirations of the United Nations Cartographic Section to become a facilitator in geospatial technical aspects related to international boundary issues as it develops its geospatial institutional knowledge base and expertise. It concludes by emphasizing the need for more collaboration between the different actors dealing with geospatial technology on borderland issues in order to meet the main goal of the United Nations - to live and work together as "We the Peoples of the United Nations".

  15. Potentiality of rainwater harvesting for an urban community in Bangladesh

    NASA Astrophysics Data System (ADS)

    Akter, Aysha; Ahmed, Shoukat

    2015-09-01

    Because of their cost effectiveness, rainwater harvesting (RWH) systems are already practiced in some rural parts of Bangladesh but rarely in urban areas. This paper evaluates the potential of RWH systems in South Agrabad, in Chittagong city, which receives an average annual precipitation of 3000 mm yet experiences both water scarcity and urban flooding in the same year. The adopted approach was a multicriteria decision analysis based on the Analytic Hierarchy Process (AHP), with roof area, slope, drainage density, and runoff coefficient as the evaluation criteria. A hydrologic model (HEC-HMS), supported by the Geospatial Hydrologic Modeling Extension, was used to simulate the precipitation-runoff process; the model outcomes showed that RWH could reduce stagnant storm water by up to 26% while supplementing the city water supply by up to 20 liters/person/day annually. By assigning suitable weights to the evaluation criteria and their associated features in ArcGIS 9.3, the study area was divided into three potential zones, i.e., good, moderate, and poor, covering 19%, 64%, and 17% of the total area, respectively. It is therefore envisaged that AHP combined with HEC-HMS could provide important guidance for decision support systems, not only for urban areas but also in the wider sub-basin/basin context.
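
    The AHP weighting step mentioned above can be sketched in a few lines; the pairwise-comparison values below are hypothetical (not those used in the study) and only demonstrate how criterion weights and a consistency check are derived from a Saaty-scale comparison matrix.

      import numpy as np

      # Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for the four
      # criteria used in the study: roof area, slope, drainage density, runoff coefficient.
      criteria = ["roof area", "slope", "drainage density", "runoff coefficient"]
      A = np.array([
          [1.0, 3.0, 5.0, 3.0],
          [1/3, 1.0, 3.0, 1.0],
          [1/5, 1/3, 1.0, 1/3],
          [1/3, 1.0, 3.0, 1.0],
      ])

      # The principal eigenvector of the comparison matrix gives the AHP weights.
      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.abs(np.real(eigvecs[:, np.argmax(np.real(eigvals))]))
      weights = principal / principal.sum()

      # Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.90 for n = 4.
      lam_max = np.max(np.real(eigvals))
      ci = (lam_max - len(A)) / (len(A) - 1)
      cr = ci / 0.90

      for name, w in zip(criteria, weights):
          print(f"{name:>20}: {w:.3f}")
      print(f"consistency ratio: {cr:.3f}")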

  16. Capacity Building on the Use of Earth Observation for Bridging the Gaps between Science and Policy

    NASA Astrophysics Data System (ADS)

    Thapa, R. B.; Bajracharya, B.

    2017-12-01

    Although geospatial technologies and Earth observation (EO) data are becoming more accessible, the lack of skilled human resources and institutional capacity remains a major hurdle to their effective application in the Hindu Kush Himalayan (HKH) region. Designing efficient and cost-effective capacity building (CB) programs that fit the needs of different users of EO information for decision making offers a way to bridge these gaps. This paper presents the strategies adopted by SERVIR-HKH to strengthen the capacity of governments and development stakeholders in the region. The SERVIR-HKH hub plays a vital role in CB on EO applications by bringing together leading scientists from around the globe with key national institutions and stakeholders in the region. We conducted country consultation workshops in Afghanistan, Bangladesh, Pakistan, and Nepal to identify national priorities, requirements, and the capacity of institutions to use EO information in decision making. The needs assessments focused on the four thematic areas of SERVIR, and capacity gaps in the use of EO data for policy decisions were identified in thirteen key service areas. Geospatial capacities in GIT infrastructure, data, and human resources varied; the link between EO information and policy decisions is largely missing, and provision for geospatial data sharing among institutions in the region is poor. We developed a capacity building strategy for the HKH region that bridges these gaps in a coordinated manner through customized training programs, institutional strengthening, coordination, and regional cooperation. Using the strategy, we conducted training on FEWS NET remote sensing products for agro-climatological analysis, focused on technical interpretation and analysis of remote sensing and modeled products, e.g., CHIRPS, RFE2, CHIRTS, GFS, NDVI, GeoCLIM, and GeoGLAM. Scientists from the USGS FEWS NET program delivered the training to mid-level managers and decision makers. We also carried out on-the-job training on wheat mapping using multi-sensor EO data for the co-development of methodologies and their implementation on a sustainable basis. In this presentation, we will also share lessons learned from the capacity building efforts at SERVIR-HKH and how we envision best practices for other SERVIR hubs.

  17. Impact of Drought on Groundwater and Soil Moisture - A Geospatial Tool for Water Resource Management

    NASA Astrophysics Data System (ADS)

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

    For many decades, recurring droughts in different regions of the US have been negatively impacting ecosystems and economic sectors. Oklahoma and Texas suffered from exceptional and extreme droughts in 2011-2014, with almost 95% of the state areas affected (Drought Monitor, 2015). In 2011 alone, drought losses reached around $1.6 billion in the Oklahoma agricultural sector (Stotts 2011) and $7.6 billion in Texas agriculture (Fannin 2012). While surface water is among the immediate indicators of drought conditions, it does not translate directly to groundwater resources, which are the main source of irrigation water. Both surface water and groundwater are susceptible to drought, but groundwater depletion is a long-term process and might not show immediately. Understanding groundwater availability is therefore crucial for designing water management strategies and sustainable water use in agriculture and other economic sectors. This paper presents an interactive, geospatially weighted evaluation model, and accompanying tool, for analyzing groundwater resources that can be used for decision support in water management. The tool combines groundwater and soil moisture changes in Oklahoma and Texas in 2003-2014, the most important indicators of agricultural and hydrological drought. The model allows for the analysis of temporal and geospatial long-term drought at the county level and can be expanded to other regions in the US and the world. The model has been validated against the Palmer Drought Severity Index to account for other indicators of meteorological drought. It can serve as a basis for an upcoming socio-economic and environmental analysis of short- and long-term drought events in different geographic regions.
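
    The abstract does not give the model's actual weighting scheme; the sketch below only illustrates, under invented weights, values, and county names, how county-level groundwater and soil-moisture anomalies might be blended into a single weighted drought indicator of the kind described.

      # Hypothetical county records: standardized anomalies (z-scores) of
      # groundwater storage and soil moisture; negative values indicate deficits.
      counties = {
          "County A": {"groundwater_z": -1.8, "soil_moisture_z": -2.1},
          "County B": {"groundwater_z": -0.4, "soil_moisture_z": -1.2},
          "County C": {"groundwater_z":  0.3, "soil_moisture_z": -0.1},
      }

      # Illustrative weights: groundwater reflects long-term (hydrological) drought,
      # soil moisture reflects short-term (agricultural) drought.
      W_GW, W_SM = 0.6, 0.4

      def drought_indicator(rec):
          """Return a weighted combination of the two standardized anomalies."""
          return W_GW * rec["groundwater_z"] + W_SM * rec["soil_moisture_z"]

      for name, rec in counties.items():
          score = drought_indicator(rec)
          label = "severe" if score < -1.5 else "moderate" if score < -0.5 else "normal"
          print(f"{name}: {score:+.2f} ({label})")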

  18. Digital Mapping and Environmental Characterization of National Wild and Scenic River Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Bosnall, Peter; Hetrick, Shelaine L

    2013-09-01

    Spatially accurate geospatial information is required to support decision-making regarding sustainable future hydropower development. Under a memorandum of understanding among several federal agencies, a pilot study was conducted to map a subset of National Wild and Scenic Rivers (WSRs) at a higher resolution and provide a consistent methodology for mapping WSRs across the United States and across agency jurisdictions. A subset of rivers (segments falling under the jurisdiction of the National Park Service) were mapped at a high resolution using the National Hydrography Dataset (NHD). The spatial extent and representation of river segments mapped at NHD scale were compared with the prevailing geospatial coverage mapped at a coarser scale. Accurately digitized river segments were linked to environmental attribution datasets housed within the Oak Ridge National Laboratory's National Hydropower Asset Assessment Program database to characterize the environmental context of WSR segments. The results suggest that both the spatial scale of hydrography datasets and the adherence to written policy descriptions are critical to accurately mapping WSRs. The environmental characterization provided information to deduce generalized trends in either the uniqueness or the commonness of environmental variables associated with WSRs. Although WSRs occur in a wide range of human-modified landscapes, environmental data layers suggest that they provide habitats important to terrestrial and aquatic organisms and recreation important to humans. Ultimately, the research findings herein suggest that there is a need for accurate, consistent mapping of the National WSRs across the agencies responsible for administering each river. Geospatial applications examining potential landscape and energy development require accurate sources of information, such as data layers that portray realistic spatial representations.

  19. Two Contrasting Approaches to Building High School Teacher Capacity to Teach About Local Climate Change Using Powerful Geospatial Data and Visualization Technology

    NASA Astrophysics Data System (ADS)

    Zalles, D. R.

    2011-12-01

    The presentation will compare and contrast two different place-based approaches to helping high school science teachers use geospatial data visualization technology to teach about climate change in their local regions. The approaches are being used in the development, piloting, and dissemination of two projects for high school science led by the author: the NASA-funded Data-enhanced Investigations for Climate Change Education (DICCE) and the NSF funded Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE). DICCE is bringing an extensive portal of Earth observation data, the Goddard Interactive Online Visualization and Analysis Infrastructure, to high school classrooms. STORE is making available data for viewing results of a particular IPCC-sanctioned climate change model in relation to recent data about average temperatures, precipitation, and land cover for study areas in central California and western New York State. Across the two projects, partner teachers of academically and ethnically diverse students from five states are participating in professional development and pilot testing. Powerful geospatial data representation technologies are difficult to implement in high school science because of challenges that teachers and students encounter navigating data access and making sense of data characteristics and nomenclature. Hence, on DICCE, the researchers are testing the theory that by providing a scaffolded technology-supported process for instructional design, starting from fundamental questions about the content domain, teachers will make better instructional decisions. Conversely, the STORE approach is rooted in the perspective that co-design of curricular materials among researchers and teacher partners that work off of "starter" lessons covering focal skills and understandings will lead to the most effective utilizations of the technology in the classroom. The projects' goals and strategies for student learning proceed from research suggesting that students will be more engaged and able to utilize prior knowledge better when seeing the local and hence personal relevance of climate change and other pressing contemporary science-related issues. In these projects, the students look for climate change trends in geospatial Earth System data layers from weather stations, satellites, and models in relation to global trends. They examine these data to (1) reify what they are learning in science class about meteorology, climate, and ecology, (2) build inquiry skills by posing and seeking answers to research questions, and (3) build data literacy skills through experience generating appropriate data queries and examining data output on different forms of geospatial representations such as maps, elevation profiles, and time series plots. Teachers also are given the opportunity to have their students look at geospatially represented census data from the tool Social Explorer (http://www.socialexplorer.com/pub/maps/home.aspx) in order to better understand demographic trends in relation to climate change-related trends in the Earth system. Early results will be reported about teacher professional development and student learning, gleaned from interviews and observations.

  20. Business models for implementing geospatial technologies in transportation decision-making : GIS-T symposium, April 8, 2009.

    DOT National Transportation Integrated Search

    2009-04-08

    In 2005 and 2006, the Federal Highway Administration (FHWA) Office of Interstate and Border Planning (HEPI), along with several state transportation executives, conducted a series of site visits to transportation agencies and GIS vendors to identify ...

  1. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.

  2. Ready or Not? Assessing Change Readiness for Implementation of the Geospatial Technology Competency Model[c

    ERIC Educational Resources Information Center

    Annulis, Heather M.; Gaudet, Cyndi H.

    2007-01-01

    A shortage of a qualified and skilled workforce exists to meet the demands of the geospatial industry (NASA, 2002). Solving today's workforce issues requires new and innovative methods and techniques for this high growth, high technology industry. One tool to support workforce development is a competency model which can be used to build a…

  3. GeoNotebook: Browser based Interactive analysis and visualization workflow for very large climate and geospatial datasets

    NASA Astrophysics Data System (ADS)

    Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.

    2016-12-01

    Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data, and combined with the Jupyter notebook and libraries like matplotlib/Basemap it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets, and this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets, we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
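
    The GeoNotebook extension itself is not shown here; as a minimal, hedged example of the underlying Python stack the abstract mentions, the snippet below reads a vector dataset with GeoPandas (which uses Fiona/GDAL) and renders a static map with Matplotlib. The input file path and attribute name are hypothetical.

      import geopandas as gpd
      import matplotlib.pyplot as plt

      # Hypothetical input: any vector file readable by Fiona/GDAL
      # (Shapefile, GeoPackage, GeoJSON, ...). Replace with a real dataset.
      gdf = gpd.read_file("watersheds.geojson")

      # Reproject to a metric CRS and draw a simple choropleth of polygon area.
      gdf = gdf.to_crs(epsg=3857)
      gdf["area_km2"] = gdf.geometry.area / 1e6

      ax = gdf.plot(column="area_km2", legend=True, figsize=(8, 6))
      ax.set_title("Watershed area (km^2) - static Matplotlib rendering")
      plt.savefig("watersheds.png", dpi=150)

    This is exactly the kind of static, per-cell output the abstract contrasts with GeoNotebook's single dynamic map that feeds interactions back into the Python environment.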

  4. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    NASA Astrophysics Data System (ADS)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server and other functions to run in the native Python environment. It uses functional programming and metaprogramming to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in Cloud Computing and HPC environments with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.

  5. Public health, GIS, and the internet.

    PubMed

    Croner, Charles M

    2003-01-01

    Internet access and use of georeferenced public health information for GIS application will be an important and exciting development for the nation's Department of Health and Human Services and other health agencies in this new millennium. Technological progress toward public health geospatial data integration, analysis, and visualization of space-time events using the Web portends eventual robust use of GIS by public health and other sectors of the economy. Increasing Web resources from distributed spatial data portals and global geospatial libraries, and a growing suite of Web integration tools, will provide new opportunities to advance disease surveillance, control, and prevention, and insure public access and community empowerment in public health decision making. Emerging supercomputing, data mining, compression, and transmission technologies will play increasingly critical roles in national emergency, catastrophic planning and response, and risk management. Web-enabled public health GIS will be guided by Federal Geographic Data Committee spatial metadata, OpenGIS Web interoperability, and GML/XML geospatial Web content standards. Public health will become a responsive and integral part of the National Spatial Data Infrastructure.

  6. Open Source and Open Standard based decision support system: the example of lake Verbano floods management.

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Antonovic, Milan; Pozzoni, Maurizio; Graf, Andrea

    2015-04-01

    The Locarno area (Canton Ticino, Switzerland) is exposed to lake floods with a return period of about 7-8 years. The risk is of particular concern because the area lies in a floodplain that has registered, over the last decades, a great increase in settlement and in the value of real estate. Moreover, small differences in lake level may produce a significant increase in flooded area because of the very low average slope of the terrain. While fatalities are generally not registered, significant economic costs are incurred, e.g., damage to real estate, interruption of activities, evacuation and relocation, and environmental damage. After important events registered in 1978, 1993, 2000, 2002 and 2014, the local stakeholders invested time and money in setting up an up-to-date decision support system that allows for the reduction of risks. Thanks to impressive technological advances, the visionary concept of the Digital Earth (Gore 1992, 1998) is being realized: geospatial coverages and monitoring system data are increasingly available on the Web and, more importantly, in standard formats. As a result, it is now possible to develop innovative decision support systems (Molinari et al. 2013) that mash up several information sources and offer special features for evaluating risk scenarios. In agreement with this view, the authors have recently developed a new Web system whose design is based on the Service Oriented Architecture pattern. Open source software (e.g., Geoserver, PostGIS, OpenLayers) has been used throughout the system, and geospatial open standards (e.g., SOS, WMS, WFS) are the pillars it relies on. SITGAP 2.0, implemented in collaboration with the Civil Protection of Locarno e Vallemaggia, combines a number of data sources, such as the Federal Register of Buildings and Dwellings, the Cantonal Register of residents, the cadastral survey, the cantonal hydro-meteorological monitoring observations, the MeteoSwiss weather forecasts, and others. As a result of this orchestration of data, SITGAP 2.0 provides features that allow users, for example, to be informed of active alarms, to visualize lake level forecasts and the associated flooded areas, to evaluate and map exposed elements and people, and to plan and manage evacuations by searching for people living in particular areas or buildings, registering evacuation actions, and searching for evacuated people. The system architecture and functionalities, considerations on the integration and accessibility of the underlying information, and the lessons learnt from using the system during the last floods of November 2014 provide interesting discussion points for the identification of current and future needs.
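
    SITGAP 2.0's own endpoints are not given in the abstract; as a generic sketch of one of the OGC standards it relies on, the request below issues a Sensor Observation Service (SOS) 2.0 GetObservation call with the requests library. The service URL, offering, and observed-property identifiers are placeholders, not the system's real configuration.

      import requests

      # Placeholder SOS 2.0 endpoint and identifiers; substitute a real service.
      SOS_URL = "https://example.org/sos"

      params = {
          "service": "SOS",
          "version": "2.0.0",
          "request": "GetObservation",
          "offering": "LAKE_LEVEL_STATION",    # hypothetical offering id
          "observedProperty": "water-height",  # hypothetical property id
          "responseFormat": "application/json",
      }

      resp = requests.get(SOS_URL, params=params, timeout=30)
      resp.raise_for_status()
      print(resp.text[:500])  # first part of the returned observation document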

  7. Alternative Land-Use Method for Spatially Informed Watershed Management Decision Making Using SWAT

    EPA Science Inventory

    In this study, a modification is proposed to the Soil and Water Assessment Tool (SWAT) to enable identification of areas where the implementation of best management practices would likely result in the most significant improvement in downstream water quality. To geospatially link...

  8. A Practice Approach of Multi-source Geospatial Data Integration for Web-based Geoinformation Services

    NASA Astrophysics Data System (ADS)

    Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.

    2014-04-01

    Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprise, and the public. It is vital to keep geospatial data fresh, accurate, and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation, and geo search. One of the major problems we face is data acquisition, and for us the integration of multi-source geospatial data is the main means of acquiring it. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures, and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal and provides online geoinformation services over the internet, the e-government network, and a classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial, and municipal. The geospatial data therefore come from these nodes, and the different datasets are heterogeneous. Based on an analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: (1) location precision; (2) geometric representation; (3) up-to-date state; (4) attribute values; and (5) spatial relationships. The technical procedure is then worked out, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements, and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrates the applicability of the principles, procedure, and method of multi-source geospatial data integration.

  9. Knowledge to Action - Understanding Natural Hazards-Induced Power Outage Scenarios for Actionable Disaster Responses

    NASA Astrophysics Data System (ADS)

    Kar, B.; Robinson, C.; Koch, D. B.; Omitaomu, O.

    2017-12-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 identified the following four priorities to prevent and reduce disaster risks: i) understanding disaster risk; ii) strengthening governance to manage disaster risk; iii) investing in disaster risk reduction for resilience and; iv) enhancing disaster preparedness for effective response, and to "Build Back Better" in recovery, rehabilitation and reconstruction. While forecasting and decision making tools are in place to predict and understand future impacts of natural hazards, the knowledge to action approach that currently exists fails to provide updated information needed by decision makers to undertake response and recovery efforts following a hazard event. For instance, during a tropical storm event advisories are released every two to three hours, but manual analysis of geospatial data to determine potential impacts of the event tends to be time-consuming and a post-event process. Researchers at Oak Ridge National Laboratory have developed a Spatial Decision Support System that enables real-time analysis of storm impact based on updated advisory. A prototype of the tool that focuses on determining projected power outage areas and projected duration of outages demonstrates the feasibility of integrating science with decision making for emergency management personnel to act in real time to protect communities and reduce risk.

  10. An extreme events laboratory to provide network centric collaborative situation assessment and decision making

    NASA Astrophysics Data System (ADS)

    Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.

    2009-05-01

    Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.

  11. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
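
    As a hedged illustration of the standardized access such a WMS endpoint provides, the snippet below assembles an OGC WMS 1.1.1 GetMap request with the requests library; the server URL and layer name are placeholders, not the actual TOPS service.

      import requests

      # Placeholder WMS endpoint and layer; substitute the real service and layer.
      WMS_URL = "https://example.org/wms"

      params = {
          "service": "WMS",
          "version": "1.1.1",
          "request": "GetMap",
          "layers": "tops:gross_primary_production",  # hypothetical layer name
          "styles": "",
          "srs": "EPSG:4326",
          "bbox": "-125,32,-114,42",   # lon/lat bounding box over California
          "width": 512,
          "height": 512,
          "format": "image/png",
      }

      resp = requests.get(WMS_URL, params=params, timeout=60)
      resp.raise_for_status()
      with open("gpp_map.png", "wb") as f:
          f.write(resp.content)  # rendered map image returned by the server

    Any standards-compliant client, from a web mapping application to a desktop GIS, can issue the same request, which is the interoperability point the abstract makes.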

  12. Rethinking GIS Towards The Vision Of Smart Cities Through CityGML

    NASA Astrophysics Data System (ADS)

    Guney, C.

    2016-10-01

    Smart cities present a substantial growth opportunity in the coming years. The role of GIS in the smart city ecosystem is to integrate the different data acquired by sensors in real time and to provide better decisions, more efficiency, and improved collaboration. A semantically enriched vision of GIS will help evolve smart cities into tomorrow's much smarter cities, since geospatial/location data and applications are a key ingredient of the smart city vision. However, the geospatial information communities need to debate whether 3D Web and mobile GIS technology is ready for smart cities. This research places an emphasis on the challenges that virtual 3D city models face on the road to smarter cities.

  13. Spatial Thinking: Precept for Understanding Operational Environments

    DTIC Science & Technology

    2016-06-10

    A Computer Movie Simulating Urban Growth in the Detroit Region,” 236. 29 U.S. National Research Council, Learning to Think Spatially: GIS as a... children and spatial language, the article focuses on the use of geospatial information systems (GIS) as a support mechanism for learning to think... Thinking, Cognition, Learning, Geospatial, Operating Environment, Space Perception

  14. Emerging Geospatial Sharing Technologies in Earth and Space Science Informatics

    NASA Astrophysics Data System (ADS)

    Singh, R.; Bermudez, L. E.

    2013-12-01

    The Open Geospatial Consortium (OGC) mission is to serve as a global forum for the collaboration of developers and users of spatial data products and services, and to advance the development of international standards for geospatial interoperability. The OGC coordinates with over 400 institutions in the development of geospatial standards. In recent years, two main trends have been disrupting geospatial applications: mobile devices and context sharing. People now have more and more mobile devices to support their work and personal lives. Mobile devices are intermittently connected to the internet and have smaller computing capacity than a desktop computer. Based on this trend, a new OGC file format standard called GeoPackage will enable greater geospatial data sharing on mobile devices. GeoPackage is perhaps best understood as the natural evolution of Shapefiles, which have been the predominant lightweight geodata sharing format for two decades. However, the Shapefile format is extremely limited. Four major shortcomings are that only vector points, lines, and polygons are supported; property names are constrained by the dBASE format; multiple files are required to encode a single data set; and multiple Shapefiles are required to encode multiple data sets. A more modern lingua franca for geospatial data is long overdue. GeoPackage fills this need with support for vector data, image tile matrices, and raster data, and it builds upon a database container, SQLite, that is self-contained, single-file, cross-platform, serverless, transactional, and open source. A GeoPackage, in essence, is a set of SQLite database tables whose content and layout is described in the candidate GeoPackage Implementation Specification available at https://portal.opengeospatial.org/files/?artifact_id=54838&version=1. The second trend is the sharing of client 'contexts'. When users are looking at an article or a product on the web, they can easily share this information with colleagues or friends via an email that includes URLs (links to web resources) and attachments (inline data). In the case of geospatial information, a user would like to share a map created from different OGC sources, which may include, for example, WMS and WFS links and GML and KML annotations. The emerging OGC file format for this is the OGC Web Services Context Document (OWS Context), which allows clients to reproduce a map previously created by someone else. Context sharing is important in a variety of domains, from emergency response, where fire, police, and emergency medical personnel need to work off a common map, to multi-national military operations, where coalition forces need to share common data sources but have cartographic displays in different languages and symbology sets. OWS Context documents can be written in XML (building upon the Atom Syndication Format) or JSON. This presentation will provide an introduction to GeoPackage and OWS Context and how they can be used to advance the sharing of Earth and space science information.
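
    Because a GeoPackage is a SQLite database with a specified schema, its contents can be listed with Python's standard library alone; the sketch below queries the mandatory gpkg_contents table of a hypothetical file (the file name is an assumption).

      import sqlite3

      # Hypothetical GeoPackage file; any .gpkg produced by GDAL or QGIS will do.
      with sqlite3.connect("observations.gpkg") as conn:
          # gpkg_contents is the required table that catalogs every data set
          # (vector features, tile matrices, rasters) stored in the package.
          rows = conn.execute(
              "SELECT table_name, data_type, srs_id FROM gpkg_contents"
          ).fetchall()

      for table_name, data_type, srs_id in rows:
          print(f"{table_name}: {data_type} (SRS id {srs_id})")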

  15. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    NASA Astrophysics Data System (ADS)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architecture style. A middleware named REST Converter is added on top of the legacy catalogue service to support a RESTful style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service is developed to demonstrate the applicability of the approach.
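
    The REST Converter itself is not reproduced here; as an illustrative sketch only, the minimal Flask service below shows the resource-handler pattern the paper describes, exposing provenance records as addressable resources. The route, record identifier, and record contents are all hypothetical, and an in-memory dictionary stands in for the legacy catalogue backend.

      from flask import Flask, jsonify, abort

      app = Flask(__name__)

      # Stand-in for the legacy catalogue backend, purely for illustration.
      PROVENANCE = {
          "ndvi-2013-07": {
              "product": "NDVI composite",
              "derived_from": ["MOD09GA tiles"],
              "process": "maximum-value compositing",
          }
      }

      @app.route("/provenance/<record_id>", methods=["GET"])
      def get_provenance(record_id):
          """Resource handler: return one provenance record, or 404 if unknown."""
          record = PROVENANCE.get(record_id)
          if record is None:
              abort(404)
          return jsonify(record)

      if __name__ == "__main__":
          app.run(port=8080)

    In the paper's architecture, a dispatcher routes each incoming resource request to one of six such handlers, which in turn translate it into calls against the existing CSW.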

  16. An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web

    NASA Astrophysics Data System (ADS)

    Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.

    2013-09-01

    Environmental monitoring systems deal with time-sensitive issues that require quick responses in emergency situations. Handling sensor observations in near real-time and obtaining valuable information from them are challenging issues in these systems from a technical and scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to use real-time, up-to-date air quality information gathered by spatially distributed sensors in mega cities, employing sensor web technology to develop monitoring and early warning systems. Urban air quality monitoring systems can use the functionality of a geospatial information system as a platform for analysing, processing, and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. The system presented here uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. Interoperability challenges can be overcome by using this standard framework. The presented system provides capabilities to collect, transfer, share, and process air quality sensor data and to disseminate air quality status in real time. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is examined to discover emergency situations, and if necessary air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure to provide an interoperable air quality monitoring system that supports disaster management systems with real-time information. The developed system was tested on Tehran air pollution sensors by calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users in emergency cases by sending warning e-mails. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework, and the system can retrieve SOS observations using WPS in a cascaded service-chaining pattern to monitor trends in timely sensor observations.
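
    A hedged sketch of the standard piecewise-linear AQI interpolation applied to CO concentrations is given below. The breakpoint table follows the familiar US EPA 8-hour CO form; a deployed system such as the one described may use different national breakpoints, and the example concentration is invented.

      # Piecewise-linear AQI:  I = (I_hi - I_lo)/(C_hi - C_lo) * (C - C_lo) + I_lo
      # Breakpoints below are the US EPA 8-hour CO categories (ppm).
      CO_BREAKPOINTS = [
          (0.0, 4.4, 0, 50),       # Good
          (4.5, 9.4, 51, 100),     # Moderate
          (9.5, 12.4, 101, 150),   # Unhealthy for sensitive groups
          (12.5, 15.4, 151, 200),  # Unhealthy
          (15.5, 30.4, 201, 300),  # Very unhealthy
          (30.5, 40.4, 301, 400),  # Hazardous
          (40.5, 50.4, 401, 500),  # Hazardous
      ]

      def co_aqi(concentration_ppm):
          """Return the AQI sub-index for an 8-hour CO concentration."""
          for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
              if c_lo <= concentration_ppm <= c_hi:
                  return round((i_hi - i_lo) / (c_hi - c_lo)
                               * (concentration_ppm - c_lo) + i_lo)
          raise ValueError("concentration outside AQI breakpoint range")

      print(co_aqi(11.2))  # hypothetical observation -> AQI in the 101-150 band

    In the described workflow, an index value crossing a warning threshold is what triggers the e-mail notification to registered users.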

  17. Factors affecting species distribution predictions: A simulation modeling experiment

    Treesearch

    Gordon C. Reese; Kenneth R. Wilson; Jennifer A. Hoeting; Curtis H. Flather

    2005-01-01

    Geospatial species sample data (e.g., records with location information from natural history museums or annual surveys) are rarely collected optimally, yet are increasingly used for decisions concerning our biological heritage. Using computer simulations, we examined factors that could affect the performance of autologistic regression (ALR) models that predict species...

  18. Get a Grip on Demographics with Geospatial Technology

    ERIC Educational Resources Information Center

    Raymond, Randall E.

    2009-01-01

    Aging school infrastructure, changing population dynamics, decreased funding, and increased accountability for reporting school success all require today's school business officials to combine a variety of disparate data sets into a coherent system that enables effective and efficient decision making. School business officials are required to: (1)…

  19. A Land-Use-Planning Simulation Using Google Earth

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Cirucci, Lori

    2009-01-01

    Google Earth (GE) is proving to be a valuable tool in the science classroom for understanding the environment and making responsible environmental decisions (Bodzin 2008). GE provides learners with a dynamic mapping experience using a simple interface with a limited range of functions. This interface makes geospatial analysis accessible and…

  20. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that represents a workflow [2]. Managing tools, dealing with qualitative and quantitative metadata measures of the quality associated with a workflow, are therefore required for the modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing, for example, one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) first through the provision of a metadata profile for the quality of processes, and (ii) through providing a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the outputs of the workflow, once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with the visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself. [1] Leibovici, DG, Hobona, G, Stock, K, Jackson, M (2009) Qualifying geospatial workflow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language. Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG, Pourabdollah, A, Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A, Leibovici, DG, Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK

  1. Mapping a Difference: The Power of Geospatial Visualization

    NASA Astrophysics Data System (ADS)

    Kolvoord, B.

    2015-12-01

    Geospatial Technologies (GST), such as GIS, GPS and remote sensing, offer students and teachers the opportunity to study the "why" of where. By making maps and collecting location-based data, students can pursue authentic problems using sophisticated tools. The proliferation of web- and cloud-based tools has made these technologies broadly accessible to schools. In addition, strong spatial thinking skills have been shown to be a key factor in supporting students that want to study science, technology, engineering, and mathematics (STEM) disciplines (Wai, Lubinski and Benbow) and pursue STEM careers. Geospatial technologies strongly scaffold the development of these spatial thinking skills. For the last ten years, the Geospatial Semester, a unique dual-enrollment partnership between James Madison University and Virginia high schools, has provided students with the opportunity to use GST's to hone their spatial thinking skills and to do extended projects of local interest, including environmental, geological and ecological studies. Along with strong spatial thinking skills, these students have also shown strong problem solving skills, often beyond those of fellow students in AP classes. Programs like the Geospatial Semester are scalable and within the reach of many college and university departments, allowing strong engagement with K-12 schools. In this presentation, we'll share details of the Geospatial Semester and research results on the impact of the use of these technologies on students' spatial thinking skills, and discuss the success and challenges of developing K-12 partnerships centered on geospatial visualization.

  2. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
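
    The USGS model service itself is not named in the abstract; as a generic sketch of how a WPS-published model can be discovered and invoked, the snippet below issues standard WPS 1.0.0 key-value-pair requests with the requests library. The endpoint, process identifier, and input names are placeholders.

      import requests

      # Placeholder WPS 1.0.0 endpoint and process id; substitute a real service.
      WPS_URL = "https://example.org/wps"

      # 1. Discover the processes the service offers.
      caps = requests.get(WPS_URL, params={
          "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"},
          timeout=30)
      caps.raise_for_status()

      # 2. Execute a (hypothetical) wetland hydrology process with literal inputs.
      execute = requests.get(WPS_URL, params={
          "service": "WPS",
          "version": "1.0.0",
          "request": "Execute",
          "identifier": "wetland_water_balance",          # hypothetical process id
          "DataInputs": "basin_id=0107;start_year=2000",  # hypothetical inputs
      }, timeout=120)
      execute.raise_for_status()
      print(execute.text[:500])  # beginning of the WPS ExecuteResponse XML

    Because the interface is the generic WPS contract rather than a model-specific API, any compliant tool or system can invoke the shared model in the same way, which is the interoperability argument the abstract makes.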

  3. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protected Act Initiative.

    PubMed

    Cravens, Amanda E

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study-which draws on data from approximately 60 semi-structured interviews and an online survey--examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  4. Negotiation and Decision Making with Collaborative Software: How MarineMap `Changed the Game' in California's Marine Life Protected Act Initiative

    NASA Astrophysics Data System (ADS)

    Cravens, Amanda E.

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study—which draws on data from approximately 60 semi-structured interviews and an online survey—examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.

  5. Key recovery factors for the August 24, 2014, South Napa Earthquake

    USGS Publications Warehouse

    Hudnut, Kenneth W.; Brocher, Thomas M.; Prentice, Carol S.; Boatwright, John; Brooks, Benjamin A.; Aagaard, Brad T.; Blair, James Luke; Fletcher, Jon Peter B.; Erdem, Jemile; Wicks, Chuck; Murray, Jessica R.; Pollitz, Fred F.; Langbein, John O.; Svarc, Jerry L.; Schwartz, David P.; Ponti, Daniel J.; Hecker, Suzanne; DeLong, Stephen B.; Rosa, Carla M.; Jones, Brenda; Lamb, Rynn M.; Rosinski, Anne M.; McCrink, Timothy P.; Dawson, Timothy E.; Seitz, Gordon G.; Glennie, Craig; Hauser, Darren; Ericksen, Todd; Mardock, Dan; Hoirup, Don F.; Bray, Jonathan D.; Rubin, Ron S.

    2014-01-01

    Through discussions between the Federal Emergency Management Agency (FEMA) and the U.S. Geological Survey (USGS) following the South Napa earthquake, it was determined that several key decision points would be faced by FEMA for which additional information should be sought and provided by USGS and its partners. This report addresses the four tasks that were agreed to. These tasks are (1) assessment of ongoing fault movement (called afterslip) especially in the Browns Valley residential neighborhood, (2) assessment of the shaking pattern in the downtown area of the City of Napa, (3) improvement of information on the fault hazards posed by the West Napa Fault System (record of past earthquakes and slip rate, for example), and (4) imagery acquisition and data processing to provide overall geospatial information support to FEMA.

  6. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    PubMed

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. These concerns not only challenge a basic tenet underlying the advancement of science, by posing substantial obstacles to the sharing of data needed to validate research results, but also prevent certain research projects from being conducted in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the pace of research with spatially referenced human subjects data.
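
    The article describes a secure multi-party system; the fragment below is only a simplified, single-key illustration of the general idea behind privacy-preserving de-duplication (use case 1): registries compare keyed hashes of identifiers rather than the identifiers themselves. The key handling and record identifiers are contrived, and this is not the authors' protocol.

      import hmac
      import hashlib

      # Shared secret key; in a real multi-party protocol no single registry
      # would hold this key in the clear (a deliberate simplification here).
      SECRET_KEY = b"example-shared-key"

      def pseudonym(record_id: str) -> str:
          """Keyed hash of an identifier; registries exchange these pseudonyms
          instead of the raw identifiers."""
          return hmac.new(SECRET_KEY, record_id.encode("utf-8"),
                          hashlib.sha256).hexdigest()

      # Each registry hashes its own identifiers locally...
      registry_a = {pseudonym(r) for r in ["ID-111-22-3333", "ID-222-33-4444"]}
      registry_b = {pseudonym(r) for r in ["ID-222-33-4444", "ID-555-66-7777"]}

      # ...and duplicate cases are found by intersecting the hashed sets,
      # without either registry revealing its raw identifiers.
      print(len(registry_a & registry_b))  # -> 1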

  7. Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine

    NASA Astrophysics Data System (ADS)

    Boehm, J.; Liu, K.; Alis, C.

    2016-06-01

    In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore natural to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by the existing big data frameworks. Instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications for scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
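
    A minimal sketch of the ingestion pattern described above, assuming PySpark: the file list is distributed across the cluster and each worker runs a conventional single-threaded point cloud reader on its own subset of files. The tile paths and the read_xyz reader are placeholders for a real LAS/LAZ format library.

      from pyspark import SparkContext

      sc = SparkContext(appName="PointCloudIngestion")

      # Paths to point cloud tiles on shared storage (placeholder listing).
      files = ["/data/tiles/tile_%04d.las" % i for i in range(1024)]

      def read_xyz(path):
          """Placeholder for a call into an existing single-CPU point cloud
          file format library; should return an iterable of (x, y, z) tuples."""
          return []

      # Distribute the file list, then let each worker ingest its own files --
      # the map-function "sideloading" idea from the abstract.
      points = sc.parallelize(files, numSlices=len(files)).flatMap(read_xyz)
      print(points.count())  # total number of ingested points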

  8. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level (BTS/DOT), the state level (VDOT), and in industry (Intergraph). CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project aims to: 1) develop and deploy an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) build a WFS service that returns data conforming to the drafted ANSI/INCITS L1 standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) version 3.0 or higher; 3) integrate the OGC WFS with CEOSR's clearinghouse nodes; 4) establish a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develop WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Web Feature Service is demonstrated to be more efficient for sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
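
    A brief sketch of the kind of request a WFS client would issue against such a service; the endpoint URL and feature type name are hypothetical, but the key-value parameters follow the OGC WFS 1.1.0 GetFeature convention referenced above.

      import requests

      # Hypothetical WFS endpoint serving transportation framework data.
      WFS_URL = "https://example.org/wfs"

      params = {
          "service": "WFS",
          "version": "1.1.0",
          "request": "GetFeature",
          "typename": "framework:Roads",            # assumed feature type
          "bbox": "-77.5,38.7,-77.0,39.0,EPSG:4326",
          "maxFeatures": "50",
      }

      response = requests.get(WFS_URL, params=params)
      response.raise_for_status()

      # The response is a GML feature collection that any WFS-compliant
      # client or GIS can parse and render.
      print(response.text[:300])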

  9. Prototyping an online wetland ecosystem services model using open model sharing standards

    USGS Publications Warehouse

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging because modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from disparate sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.

  10. E-Learning in Photogrammetry, Remote Sensing and Spatial Information Science

    NASA Astrophysics Data System (ADS)

    Vyas, Anjana; König, Gerhard

    2016-06-01

    Science and technology are evolving by leaps and bounds. Advances in GI-Science for the natural and built environment help improve the quality of life. Learning through education and training needs to keep pace with those advancements, which plays a vital role in the utilization of technology. New technologies that create new opportunities have enabled Geomatics to broaden its horizon (skills and competencies). Government policies and decisions support the use of geospatial science in various sectors of governance. Mapping, land management, urban planning, environmental planning, and industrialization are some of the areas where geomatics has become a baseline for decision making at the national level. There is a need to bridge the gap between developments in geospatial science and their utilization and implementation. To prepare a framework for standardisation it is important to understand the theories of education and prevailing practices, with articulate goals exploring a variety of teaching techniques. E-learning is a practice shaped for facilitating learning and improving performance by creating, using and managing appropriate technological processes and resources through digital and network-enabled technology. It is a shift from traditional education or training to ICT-based flexible and collaborative learning built around a community of learners, academia, professionals, experts and facilitators. Development in e-learning has focused on computer-assisted learning, which has become popular because of its potential for providing more flexible access to content and instruction at any time, from any place (Means et al., 2009). With the advent of geospatial technology and fast development in software and hardware, the demand for skilled manpower is increasing, and the need is for training, education, research and dissemination. This suggests inter-organisational cooperation between academia, industry and government, and international collaboration. There is a need to adopt a multi-specialisation approach to examine the issues and challenges of research in such a valued topic as education and training in multi-disciplinary areas. Learning involves a change in an individual's knowledge and ability to perform a skill, participate and communicate. There is considerable variation among theories about the nature of this change. This paper, which derives from a scientific research grant received from ISPRS, presents summary results from assessing various theories and methods of evaluating learning through education, and the system and structure of such learning for GeoInformatics.

  11. High performance geospatial and climate data visualization using GeoJS

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: to create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics; to develop an extensible library that can combine data from multiple sources and render them using multiple backends; and to build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring data and analysis regarding 1) the human trafficking domain, 2) New York City taxi drop-offs and pick-ups, and 3) the Ebola outbreak. GeoJS supports advanced visualization features such as picking, selecting, and clustering. It also supports 2D contour plots, vector plots, heat maps, and geospatial graphs.

  12. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  13. Summary of hydrologic modeling for the Delaware River Basin using the Water Availability Tool for Environmental Resources (WATER)

    USGS Publications Warehouse

    Williamson, Tanja N.; Lant, Jeremiah G.; Claggett, Peter; Nystrom, Elizabeth A.; Milly, Paul C.D.; Nelson, Hugh L.; Hoffman, Scott A.; Colarullo, Susan J.; Fischer, Jeffrey M.

    2015-11-18

    The Water Availability Tool for Environmental Resources (WATER) is a decision support system for the nontidal part of the Delaware River Basin that provides a consistent and objective method of simulating streamflow under historical, forecasted, and managed conditions. In order to quantify the uncertainty associated with these simulations, however, streamflow and the associated hydroclimatic variables of potential evapotranspiration, actual evapotranspiration, and snow accumulation and snowmelt must be simulated and compared to long-term, daily observations from sites. This report details model development and optimization, statistical evaluation of simulations for 57 basins ranging from 2 to 930 km² and from 11.0 to 99.5 percent forested cover, and how this statistical evaluation of daily streamflow relates to simulating environmental changes and management decisions that are best examined at monthly time steps normalized over multiple decades. The decision support system provides a database of historical spatial and climatic data for simulating streamflow for 2001–11, in addition to land-cover and general circulation model forecasts that focus on 2030 and 2060. WATER integrates geospatial sampling of landscape characteristics, including topographic and soil properties, with a regionally calibrated hillslope-hydrology model, an impervious-surface model, and hydroclimatic models that were parameterized by using three hydrologic response units: forested, agricultural, and developed land cover. This integration enables the regional hydrologic modeling approach used in WATER without requiring site-specific optimization or the assumption of stationary conditions inherent in statistical models.
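
    To illustrate the monthly, multi-decade view mentioned above, the sketch below aggregates a daily streamflow series to long-term monthly means; the series here is synthetic and stands in for an actual WATER simulation.

      import numpy as np
      import pandas as pd

      # Synthetic stand-in for a simulated daily streamflow record (m^3/s).
      days = pd.date_range("1981-01-01", "2010-12-31", freq="D")
      daily = pd.Series(
          np.random.lognormal(mean=1.0, sigma=0.5, size=len(days)),
          index=days, name="streamflow_m3s")

      # Monthly means, then the average of each calendar month across the
      # full multi-decade period -- the normalized monthly view at which
      # management decisions are typically examined.
      monthly = daily.resample("MS").mean()
      normalized = monthly.groupby(monthly.index.month).mean()
      print(normalized.round(2))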

  14. Relating Local to Global Spatial Knowledge: Heuristic Influence of Local Features on Direction Estimates

    ERIC Educational Resources Information Center

    Phillips, Daniel W.; Montello, Daniel R.

    2015-01-01

    Previous research has examined heuristics--simplified decision-making rules-of-thumb--for geospatial reasoning. This study examined at two locations the influence of beliefs about local coastline orientation on estimated directions to local and distant places; estimates were made immediately or after fifteen seconds. This study goes beyond…

  15. RacerGISOnline: Enhancing Learning in Marketing Classes with Web-Based Business GIS

    ERIC Educational Resources Information Center

    Miller, Fred L.; Mangold, W. Glynn; Roach, Joy; Brockway, Gary; Johnston, Timothy; Linnhoff, Stefan; McNeely, Sam; Smith, Kathy; Holmes, Terence

    2014-01-01

    Geographic Information Systems (GIS) offer geospatial analytical tools with great potential for applications in marketing decision making. However, for various reasons, the rate of adoption of these tools in academic marketing programs has lagged behind that of marketing practitioners. RacerGISOnline is an innovative approach to integrating these…

  16. Integrating Geographic Information Systems in Business School Curriculum: An Initial Example

    ERIC Educational Resources Information Center

    King, Michael A.; Arnette, Andrew N.

    2011-01-01

    Geographic information systems have experienced rapid growth and user adoption over the last four decades, due to an increasing value to the business community. However, business schools are not teaching geospatial concepts and the related location intelligence to their students. This curriculum decision seems completely at odds with business'…

  17. Interoperability in planetary research for geospatial data analysis

    NASA Astrophysics Data System (ADS)

    Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara

    2018-01-01

    For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Map Tile Service (cached image tiles), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or in the process of being researched within the planetary geospatial community.
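
    As a small example of the OGC service pattern listed above, the sketch below builds a WMS 1.3.0 GetMap request; the endpoint, layer name, and coordinate reference system are hypothetical (planetary services typically define body-specific CRS codes).

      import requests

      # Hypothetical WMS endpoint serving a planetary basemap mosaic.
      WMS_URL = "https://example.org/planetary/wms"

      params = {
          "service": "WMS",
          "version": "1.3.0",
          "request": "GetMap",
          "layers": "mars_mola_shaded_relief",   # assumed layer name
          "styles": "",
          "crs": "EPSG:4326",                    # assumed CRS for illustration
          "bbox": "-45,0,0,45",                  # lat/lon axis order in 1.3.0
          "width": "512",
          "height": "512",
          "format": "image/png",
      }

      resp = requests.get(WMS_URL, params=params)
      resp.raise_for_status()
      with open("basemap_tile.png", "wb") as f:
          f.write(resp.content)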

  18. Data Democracy and Decision Making: Enhancing the Use and Value of Geospatial Data and Scientific Information

    NASA Astrophysics Data System (ADS)

    Shapiro, C. D.

    2014-12-01

    Data democracy is a concept that has great relevance to the use and value of geospatial data and scientific information. Data democracy describes a world in which data and information are widely and broadly accessible, understandable, and useable. The concept operationalizes the public good nature of scientific information and provides a framework for increasing benefits from its use. Data democracy encompasses efforts to increase accessibility to geospatial data and to expand participation in its collection, analysis, and application. These two pillars are analogous to demand and supply relationships. Improved accessibility, or demand, includes increased knowledge about geospatial data and low barriers to retrieval and use. Expanded participation, or supply, encompasses a broader community involved in developing geospatial data and scientific information. This pillar of data democracy is characterized by methods such as citizen science or crowd sourcing.A framework is developed for advancing the use of data democracy. This includes efforts to assess the societal benefits (economic and social) of scientific information. This knowledge is critical to continued monitoring of the effectiveness of data democracy implementation and of potential impact on the use and value of scientific information. The framework also includes an assessment of opportunities for advancing data democracy both on the supply and demand sides. These opportunities include relatively inexpensive efforts to reduce barriers to use as well as the identification of situations in which participation can be expanded in scientific efforts to enhance the breadth of involvement as well as expanding participation to non-traditional communities. This framework provides an initial perspective on ways to expand the "scientific community" of data users and providers. It also describes a way forward for enhancing the societal benefits from geospatial data and scientific information. As a result, data democracy not only provides benefits to a greater population, it enhances the value of science.

  19. Celebrating ten years of collaboration

    USGS Publications Warehouse

    Cushing, W. Matthew

    2017-01-01

    Since the GEOSUR Program launched in 2007, the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center has had the honor of collaborating with CAF, PAIGH, and others supporting the Latin America GEOSUR Program. The catalyst for starting the program was the convergence of regional geospatial activities in which USGS, PAIGH, and CAF had been involved; the partners seized the opportunity to consolidate these efforts and increase the sharing of geospatial information at national and regional levels.

  20. Measuring the Interdisciplinary Impact of Using Geospatial Data with Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; Schumacher, J.

    2017-12-01

    Various disciplines offer benefits to society by contributing to the scientific progress that informs the knowledge and decisions that improve the lives, safety, and conditions of people around the globe. In addition to disciplines within the natural sciences, other disciplines, including those in the social, health, and computer sciences, provide benefits to society by collecting, preparing, and analyzing data in the process of conducting research. Preparing geospatial environmental and socioeconomic data together with remote sensing data from satellite-based instruments for wider use by heterogeneous communities of users increases the potential impact of these data by enabling their use in different application areas and sectors of society. Furthermore, enabling wider use of scientific data can bring to bear resources and expertise that will improve reproducibility, quality, methodological transparency, interoperability, and improved understanding by diverse communities of users. In line with its commitment to open data, the NASA Socioeconomic Data and Applications Center (SEDAC), which focuses on human interactions in the environment, curates and disseminates freely and publicly available geospatial data for use across many disciplines and societal benefit areas. We describe efforts to broaden the use of SEDAC data and to publicly document their impact, assess the interdisciplinary impact of the use of SEDAC data with remote sensing data, and characterize these impacts in terms of their influence across disciplines by analyzing citations of geospatial data with remote sensing data within scientific journals.

  1. One map policy (OMP) implementation strategy to accelerate mapping of regional spatial planing (RTRW) in Indonesia

    NASA Astrophysics Data System (ADS)

    Hasyim, Fuad; Subagio, Habib; Darmawan, Mulyanto

    2016-06-01

    The preparation of spatial planning documents requires basic geospatial information and accurate thematic data. Recently these issues have become important because spatial planning maps are an integral attachment of the draft regional act on spatial planning (PERDA). The geospatial information needed for the preparation of spatial planning maps can be divided into two major groups: (i) basic geospatial information (IGD), consisting of Indonesian topographic maps (RBI), coastal and marine environmental maps (LPI), and the geodetic control network, and (ii) thematic geospatial information (IGT). Currently, most local governments in Indonesia have not finished their draft regulations on spatial planning due to several constraints, including technical aspects. Some constraints in mapping for spatial planning are as follows: the availability of large-scale basic geospatial information, the availability of mapping guidelines, and human resources. The ideal conditions to be achieved for spatial planning maps are: (i) the availability of updated geospatial information at the scale needed for spatial planning maps, (ii) guidelines for spatial planning mapping to support local governments in completing their PERDA, and (iii) capacity building of local government human resources to complete spatial planning maps. The OMP strategies formulated to achieve these conditions are: (i) accelerating IGD at scales of 1:50,000, 1:25,000 and 1:5,000, (ii) accelerating the mapping and integration of thematic geospatial information (IGT) through stocktaking of availability and mapping guidelines, (iii) developing mapping guidelines and disseminating spatial utilization information, and (iv) training human resources in mapping technology.

  2. A multi-service data management platform for scientific oceanographic products

    NASA Astrophysics Data System (ADS)

    D'Anca, Alessandro; Conte, Laura; Nassisi, Paola; Palazzo, Cosimo; Lecci, Rita; Cretì, Sergio; Mancini, Marco; Nuzzo, Alessandra; Mirto, Maria; Mannarini, Gianandrea; Coppini, Giovanni; Fiore, Sandro; Aloisio, Giovanni

    2017-02-01

    An efficient, secure and interoperable data platform solution has been developed in the TESSA project to provide fast navigation and access to the data stored in the data archive, as well as standards-based metadata management support. The platform mainly targets scientific users and high-level situational sea awareness services such as decision support systems (DSS). These datasets are accessible through the following three main components: the Data Access Service (DAS), the Metadata Service and the Complex Data Analysis Module (CDAM). The DAS allows access to data stored in the archive by providing interfaces for different protocols and services for downloading, variable selection, data subsetting and map generation. The Metadata Service is the heart of the information system for the TESSA products and completes the overall infrastructure for data and metadata management. This component enables data search and discovery and addresses interoperability by exploiting widely adopted standards for geospatial data. Finally, the CDAM represents the back-end of the TESSA DSS by performing on-demand complex data analysis tasks.
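
    A generic illustration of the variable selection and subsetting operations a data access service of this kind exposes, assuming xarray and a NetCDF product; the file name, variable, and coordinate names are assumptions.

      import xarray as xr

      # Open an oceanographic product (file and variable names assumed).
      ds = xr.open_dataset("sea_surface_temperature.nc")

      # Select one variable and subset it in space and time -- the same kind
      # of operation a data access service performs before delivery.
      subset = ds["sst"].sel(
          lat=slice(35.0, 45.0),
          lon=slice(10.0, 20.0),
          time=slice("2015-06-01", "2015-06-30"),
      )

      subset.to_netcdf("sst_subset_june2015.nc")
      print(subset.sizes)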

  3. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the government has taken substantive strides in advancing the development and application of digital technology, promoting the evolution of e-government and its informatization. Meanwhile, cloud computing, a service model that connects huge resource pools to provide a variety of IT services, has become a relatively mature technical pattern backed by extensive study and massive practical application. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  4. A Collaborative Decision Environment for UAV Operations

    NASA Technical Reports Server (NTRS)

    D'Ortenzio, Matthew V.; Enomoto, Francis Y.; Johan, Sandra L.

    2005-01-01

    NASA is developing Intelligent Mission Management (IMM) technology for science missions employing long-endurance unmanned aerial vehicles (UAVs). The IMM ground-based component is the Collaborative Decision Environment (CDE), a ground system that provides the Mission/Science team with situational awareness, collaboration, and decision-making tools. The CDE is used for pre-flight planning, mission monitoring, and visualization of acquired data. It integrates external data products used for planning and executing a mission, such as weather, satellite data products, and topographic maps, by leveraging established and emerging Open Geospatial Consortium (OGC) standards to acquire external data products via the Internet, and an industry-standard geographic information system (GIS) toolkit for visualization. As a Science/Mission team may be geographically dispersed, the CDE is capable of providing access to remote users across wide area networks using Web Services technology. A prototype CDE is being developed for an instrument checkout flight on a manned aircraft in the fall of 2005, in preparation for a full deployment in support of the US Forest Service and NASA Ames Western States Fire Mission in 2006.

  5. Fast Tracking Data to Informed Decisions: An Advanced Information System to Improve Environmental Understanding and Management (Invited)

    NASA Astrophysics Data System (ADS)

    Minsker, B. S.; Myers, J.; Liu, Y.; Bajcsy, P.

    2010-12-01

    Emerging sensing and information technology are rapidly creating a new paradigm for environmental research and management, in which data from multiple sensors and information sources can guide real-time adaptive observation and decision making. This talk will provide an overview of emerging cyberinfrastructure and three case studies that illustrate their potential: combined sewer overflows in Chicago, hypoxia in Corpus Christi Bay, Texas, and sustainable agriculture in Illinois. An advanced information system for real-time decision making and visual geospatial analytics will be presented as an example of cyberinfrastructure that enables easier implementation of numerous real-time applications.

  6. a New Approach for Progressive Dense Reconstruction from Consecutive Images Based on Prior Low-Density 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Lari, Z.; El-Sheimy, N.

    2017-09-01

    In recent years, the increasing incidence of climate-related disasters has tremendously affected our environment. In order to effectively manage and reduce the dramatic impacts of such events, the development of timely disaster management plans is essential. Since these disasters are spatial phenomena, timely provision of geospatial information is crucial for effective development of response and management plans. Due to the inaccessibility of affected areas and the limited budgets of first responders, timely acquisition of the required geospatial data for these applications is usually possible only using low-cost imaging and georeferencing sensors mounted on unmanned platforms. Despite rapid collection of the required data using these systems, available processing techniques are not yet capable of delivering geospatial information to responders and decision makers in a timely manner. To address this issue, this paper introduces a new technique for dense 3D reconstruction of the affected scenes which can deliver and improve the needed geospatial information incrementally. This approach is implemented based on prior 3D knowledge of the scene and employs computationally efficient 2D triangulation, feature descriptor, feature matching and point verification techniques to optimize and speed up the dense 3D scene reconstruction procedure. To verify the feasibility and computational efficiency of the proposed approach, an experiment using a set of consecutive images collected onboard a UAV platform and prior low-density airborne laser scanning over the same area is conducted, and step-by-step results are provided. A comparative analysis of the proposed approach and an available image-based dense reconstruction technique is also conducted to demonstrate the computational efficiency and competency of this technique for delivering geospatial information with pre-specified accuracy.
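
    A generic sketch of the feature detection and matching step between two consecutive frames, using OpenCV's binary ORB descriptors for speed; it is not the authors' specific descriptor or verification scheme, and the frame file names are placeholders.

      import cv2

      # Two consecutive UAV frames (file names assumed).
      img1 = cv2.imread("frame_0001.jpg", cv2.IMREAD_GRAYSCALE)
      img2 = cv2.imread("frame_0002.jpg", cv2.IMREAD_GRAYSCALE)

      # Detect binary ORB features, which are cheap to compute and to match.
      orb = cv2.ORB_create(nfeatures=2000)
      kp1, des1 = orb.detectAndCompute(img1, None)
      kp2, des2 = orb.detectAndCompute(img2, None)

      # Brute-force Hamming matching with cross-checking to drop weak pairs.
      matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
      matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
      print(len(matches), "candidate correspondences")

      # The matched points would then be verified (for example against the
      # prior low-density point cloud) before being triangulated into new
      # 3D points.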

  7. Geospatial Applications on Different Parallel and Distributed Systems in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Rodila, D.; Bacu, V.; Gorgan, D.

    2012-04-01

    The execution of Earth Science applications and services on parallel and distributed systems has become a necessity, especially due to the large amounts of geospatial data these applications require and the large geographical areas they cover. The parallelization of these applications addresses important performance issues and can range from task parallelism to data parallelism. Parallel and distributed architectures such as Grid, Cloud, and Multicore seem to offer the necessary functionalities to solve important problems in the Earth Science domain: storage, distribution, management, processing and security of geospatial data, and execution of complex processing through task and data parallelism. A main goal of the FP7-funded project enviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is the development of a Spatial Data Infrastructure targeting this catchment region, together with standardized and specialized tools for storing, analyzing, processing and visualizing the geospatial data concerning this area. For achieving these objectives, enviroGRIDS deals with the execution of different Earth Science applications, such as hydrological models, geospatial Web services standardized by the Open Geospatial Consortium (OGC), and others, on parallel and distributed architectures to maximize the obtained performance. This presentation analyzes the integration and execution of geospatial applications on different parallel and distributed architectures and the possibility of choosing among these architectures, based on application characteristics and user requirements, through a specialized component. Versions of the proposed platform have been used in the enviroGRIDS project in different use cases, such as the execution of geospatial Web services on both Web and Grid infrastructures [2] and the execution of SWAT hydrological models on both Grid and Multicore architectures [3]. The current focus is to integrate the Cloud infrastructure into the proposed platform; Cloud computing is still a paradigm with critical problems to be solved despite great efforts and investments. Cloud computing offers a new way of delivering resources while using a large set of old as well as new technologies and tools to provide the necessary functionalities. The main challenges in Cloud computing, most of them also identified in the Open Cloud Manifesto 2009, concern resource management and monitoring, data and application interoperability and portability, security, scalability, software licensing, etc. We propose a platform able to execute different geospatial applications on different parallel and distributed architectures such as Grid, Cloud, and Multicore, with the possibility of choosing among these architectures based on application characteristics and complexity, user requirements, required performance, cost, etc. The execution redirection to a selected architecture is realized through a specialized component and offers a flexible way of achieving the best performance given the existing constraints.

  8. GEO Label: User and Producer Perspectives on a Label for Geospatial Data

    NASA Astrophysics Data System (ADS)

    Lush, V.; Lumsden, J.; Masó, J.; Díaz, P.; McCallum, I.

    2012-04-01

    One of the aims of the Science and Technology Committee (STC) of the Group on Earth Observations (GEO) was to establish a GEO Label, a label to certify geospatial datasets and their quality. As proposed, the GEO Label will be used as a value indicator for geospatial data and datasets accessible through the Global Earth Observation System of Systems (GEOSS). It is suggested that the development of such a label will significantly improve user recognition of the quality of geospatial datasets and that its use will help promote trust in datasets that carry the established GEO Label. Furthermore, the GEO Label is seen as an incentive to data providers. At the moment GEOSS contains a large amount of data and is constantly growing. Taking this into account, a GEO Label could assist in searching by providing users with visual cues of dataset quality and possibly relevance; a GEO Label could effectively stand as a decision support mechanism for dataset selection. Currently our project, GeoViQua, together with EGIDA and ID-03, is undertaking research to define and evaluate the concept of a GEO Label. The development and evaluation process will be carried out in three phases. In phase I we have conducted an online survey (GEO Label Questionnaire) to identify initial user and producer views on a GEO Label and its potential role. In phase II we will conduct a further study presenting GEO Label examples based on phase I, and will elicit feedback on these examples under controlled conditions. In phase III we will create physical prototypes which will be used in a human subject study. The most successful prototypes will then be put forward as potential GEO Label options. At the moment we are in phase I, where we developed an online questionnaire to collect the initial GEO Label requirements and to identify the role that a GEO Label should serve from the user and producer standpoint. The GEO Label Questionnaire consists of generic questions to identify whether users and producers believe a GEO Label is relevant to geospatial data; whether they want a single "one-for-all" label or separate labels that each serve a particular role; the functions that would be most relevant for a GEO Label to carry; and the functionality that users and producers would like to see in the common rating and review systems they use. To distribute the questionnaire, relevant user and expert groups were contacted at meetings or by email. At this stage we have successfully collected over 80 valid responses from geospatial data users and producers. This communication will provide a comprehensive analysis of the survey results, indicating to what extent the users surveyed in phase I value a GEO Label, and suggesting in what directions a GEO Label may develop. Potential GEO Label examples based on the results of the survey will be presented for use in phase II.

  9. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition to foster sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping gains in popularity. However, little attention has been paid to the fact that information must be presented in a useful way to reach city planners and policy makers. Above all, the importance of visualisation tools to support collaboration, analytical reasoning, problem solving and decision-making in analysing and planning processes has been underestimated. In this paper, we describe how an interactive mental map tool has been developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand drawn mental maps approach to an interactive mental map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, coping and adaptation strategies with remote sensing data and modern technology of map making. This newly developed interactive mapping tool allowed for insights into different locally-constructed realities and facilitated the communication of results to the wider public and respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool bears potential also for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, location of public toilets or defecation sites.

  10. Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.

    2009-12-01

    Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly impacts the triggering and behavior of urban flooding. However, no general-purpose tools yet exist for deriving rainfall data and rendering them in real time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e., urban hydrologic unit) scale in real time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach which takes advantage of the scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming-data-aware semantic content management repository, web-based Google Earth/Maps, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. The animated, color-coded rainfall map for the sewersheds can be played in real time as a movie, using time-aware KML inside browser-based Google Earth, for visually analyzing the spatiotemporal patterns of rainfall intensity. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Our further work includes incorporating additional data (such as basement flooding event data) or physics-based predictive models that can be used for more integrated data-driven decision support.
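
    A minimal sketch of how one time-stamped, color-coded KML placemark for a sewershed polygon and a single radar time step might be written; the polygon coordinates, time window, and color ramp are placeholders rather than the project's actual encoding.

      from datetime import datetime, timedelta

      def rainfall_placemark(name, coords, value_mm_hr, start, minutes=5):
          """Build a time-aware, color-coded KML Placemark for one sewershed;
          coords is a closed ring of (lon, lat) pairs (placeholder values)."""
          end = start + timedelta(minutes=minutes)
          # Crude blue-to-red ramp encoded as KML aabbggrr hex (placeholder).
          level = min(int(value_mm_hr / 50.0 * 255), 255)
          color = "ff{:02x}00{:02x}".format(255 - level, level)
          ring = " ".join("%f,%f,0" % (lon, lat) for lon, lat in coords)
          return (
              "<Placemark><name>%s</name>"
              "<TimeSpan><begin>%s</begin><end>%s</end></TimeSpan>"
              "<Style><PolyStyle><color>%s</color></PolyStyle></Style>"
              "<Polygon><outerBoundaryIs><LinearRing><coordinates>%s"
              "</coordinates></LinearRing></outerBoundaryIs></Polygon>"
              "</Placemark>"
              % (name, start.strftime("%Y-%m-%dT%H:%M:%SZ"),
                 end.strftime("%Y-%m-%dT%H:%M:%SZ"), color, ring))

      print(rainfall_placemark(
          "Sewershed 12",
          [(-87.65, 41.85), (-87.64, 41.85), (-87.64, 41.86), (-87.65, 41.85)],
          value_mm_hr=18.0, start=datetime(2009, 9, 13, 14, 0)))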

  11. Utilizing Arc Marine Concepts for Designing a Geospatially Enabled Database to Support Rapid Environmental Assessment

    DTIC Science & Technology

    2009-07-01

    data were recognized as being largely geospatial and thus a GIS was considered the most reasonable way to proceed. The PostgreSQL suite of software also...for the ESRI (2009) geodatabase environment but is applicable to this PostgreSQL-based system. We then introduce and discuss spatial reference...PostgreSQL database using a PostgreSQL ODBC connection. This procedure identified 100 tables with 737 columns. This is after the removal of two

  12. Bottlenecks in Geospatial Data-Driven Decision-Making for Natural Disaster Management: A Case Study of Forest Fire Prevention and Control in Guatemala's Maya Biosphere Reserve

    NASA Astrophysics Data System (ADS)

    Berenter, J. S.; Mueller, J. M.; Morrison, I.

    2016-12-01

    Annual forest fires are a source of great economic and environmental cost in the Maya Biosphere Reserve (MBR), a region of high ecological and historical value in Guatemala's department of Petén. Scarce institutional resources, limited local response capacity, and difficult terrain place a premium on the use of Earth observation data for forest fire management in the MBR, but also present significant institutional barriers to optimizing the value of this data. Drawing upon key informant interviews and a contingent valuation survey of national and local actors conducted during a three-year performance evaluation of the USAID/NASA Regional Visualization and Monitoring System (SERVIR), this paper traces the flow of SERVIR data from acquisition to decision in order to assess the institutional and contextual factors affecting the value of Earth observation data for forest fire management in the MBR. Findings indicate that the use of satellite data for forest fire management in the MBR is widespread and multi-dimensional: historical assessments of land use and land cover, fire scarring, and climate data help central-level fire management agencies identify and regulate fire-sensitive areas; regular monitoring and dissemination of climate data enables coordination between agricultural burning activities and fire early warning systems; and daily satellite detection of thermal anomalies in land surface temperature permits first responders to monitor and react to "hotspot" activity. Findings also suggest, however, that while the decentralized operations of Petén's fire management systems foster the use of Earth observation data, systemic bottlenecks, including budgetary constraints, inadequate data infrastructure and interpretation capacity, and obstacles to regulatory enforcement, impede the flow of information and use of technology and thus impact the value of that data, particularly in remote and under-resourced areas of the MBR. A geographic expansion and fortification of support systems for use of Earth observation data is thus required to maximize the value of data-driven forest fire management in the MBR. Findings further validate a need for continued cooperation between scientific and governance institutions to disseminate and integrate geospatial data into environmental decision-making.

  13. Integrated national-scale assessment of wildfire risk to human and ecological values

    Treesearch

    Matthew P. Thompson; David E. Calkin; Mark A. Finney; Alan A. Ager; Julie W. Gilbertson-Day

    2011-01-01

    The spatial, temporal, and social dimensions of wildfire risk are challenging U.S. federal land management agencies to meet societal needs while maintaining the health of the lands they manage. In this paper we present a quantitative, geospatial wildfire risk assessment tool, developed in response to demands for improved risk-based decision frameworks. The methodology...

  14. Facilitating Data-Intensive Education and Research in Earth Science through Geospatial Web Services

    ERIC Educational Resources Information Center

    Deng, Meixia

    2009-01-01

    The realm of Earth science (ES) is increasingly data-intensive. Geoinformatics research attempts to robustly smooth and accelerate the flow of data to information, information to knowledge, and knowledge to decisions and to supply necessary infrastructure and tools for advancing ES. Enabling easy access to and use of large volumes of ES data and…

  15. Geospatial resources for the geologic community: The USGS National Map

    USGS Publications Warehouse

    Witt, Emitt C.

    2015-01-01

    Geospatial data are a key component of investigating, interpreting, and communicating the geological sciences. Locating geospatial data can be time-consuming, which detracts from time spent on a study because these data are not obviously placed in central locations or are served from many disparate databases. The National Map of the US Geological Survey is a publicly available resource for accessing the geospatial base map data needs of the geological community from a central location. The National Map data are available through a viewer and download platform providing access to eight primary data themes, plus the US Topo and scanned historical topographic maps. The eight themes are elevation, orthoimagery, hydrography, geographic names, boundaries, transportation, structures, and land cover, and they are being offered for download as predefined tiles in formats supported by leading geographic information system software. Data tiles are periodically refreshed to capture the most current content and are an efficient method for disseminating and receiving geospatial information. Elevation data, for example, are offered as a download from the National Map as 1° × 1° tiles for the 10- and 30- m products and as 15′ × 15′ tiles for the higher-resolution 3-m product. Vector data sets with smaller file sizes are offered at several tile sizes and formats. Partial tiles are not a download option—any prestaged data that intersect the requesting bounding box will be, in their entirety, part of the download order. While there are many options for accessing geospatial data via the Web, the National Map represents authoritative sources of data that are documented and can be referenced for citation and inclusion in scientific publications. Therefore, National Map products and services should be part of a geologist’s first stop for geospatial information and data.

  16. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  17. Design and Implementation of a WebGIS for Improving the Quality of Exploration Decisions at Sin-Quyen Copper Mine, Northern Vietnam

    NASA Astrophysics Data System (ADS)

    Quang Truong, Xuan; Luan Truong, Xuan; Nguyen, Tuan Anh; Nguyen, Dinh Tuan; Cong Nguyen, Chi

    2017-12-01

    The objective of this study is to design and implement a WebGIS Decision Support System (WDSS) for reducing uncertainty and improving the quality of exploration decisions in the Sin-Quyen copper mine, northern Vietnam. The main distinctive feature of the Sin-Quyen deposit is the unusual composition of its ores. Computers and software applied to the exploration problem have had a significant impact on the exploration process over the past 25 years, but up until now no online system has been developed. The system was built entirely on open source technology and the Open Geospatial Consortium Web Services (OWS). The input data include remote sensing (RS) imagery, Geographical Information System (GIS) layers, and data from exploration drillholes; the drillhole exploration data sets were designed as a geodatabase and stored in PostgreSQL. The WDSS must be able to process exploration data and support users in accessing 2-dimensional (2D) or 3-dimensional (3D) cross-sections and maps of the exploration boreholes and drill holes. The interface was designed to interact with base maps (e.g., Digital Elevation Model, Google Maps, OpenStreetMap) and thematic maps (e.g., land use and land cover, administrative map, drillhole exploration map), and to provide GIS functions (such as creating a new map, updating an existing map, querying, and statistical charts). In addition, the system provides geological cross-sections of ore bodies based on Inverse Distance Weighting (IDW), nearest neighbour interpolation, and Kriging methods (e.g., Simple Kriging, Ordinary Kriging, Indicator Kriging and CoKriging). The results based on the available data (23 borehole exploration data sets) indicate that the best method for estimating geological cross-sections of ore bodies in the Sin-Quyen copper mine is Ordinary Kriging. The WDSS could provide useful information to improve drilling efficiency in mineral exploration and to support management decision making.
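
    Of the interpolation methods listed, IDW is the simplest to sketch. The short example below is a minimal Python illustration with invented borehole coordinates and grade values, showing the distance-weighted averaging idea; note that the study itself found Ordinary Kriging, not IDW, to perform best on its data.

```python
import numpy as np

def idw(sample_xy, sample_values, query_xy, power=2.0):
    """Inverse Distance Weighting: estimate values at query points as a
    distance-weighted average of the sampled borehole values."""
    sample_xy = np.asarray(sample_xy, dtype=float)
    sample_values = np.asarray(sample_values, dtype=float)
    query_xy = np.asarray(query_xy, dtype=float)

    # Pairwise distances between query points and sample points.
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    d = np.where(d == 0, 1e-12, d)          # avoid division by zero at samples
    w = 1.0 / d ** power                    # closer samples get larger weights
    return (w * sample_values).sum(axis=1) / w.sum(axis=1)

# Hypothetical borehole locations (x, y) and ore-grade values.
boreholes = [(0, 0), (100, 0), (0, 100), (100, 100)]
grades = [1.2, 0.8, 0.5, 1.0]
print(idw(boreholes, grades, [(50, 50), (10, 10)]))
```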

  18. Developing Energy Literacy in US Middle-Level Students Using the Geospatial Curriculum Approach

    NASA Astrophysics Data System (ADS)

    Bodzin, Alec M.; Fu, Qiong; Peffer, Tamara E.; Kulo, Violet

    2013-06-01

    This quantitative study examined the effectiveness of a geospatial curriculum approach to promote energy literacy in an urban school district and examined factors that may account for energy content knowledge achievement. An energy literacy measure was administered to 1,044 eighth-grade students (ages 13-15) in an urban school district in Pennsylvania, USA. One group of students received instruction with a geospatial curriculum approach (geospatial technologies (GT)) and another group of students received 'business as usual' (BAU) curriculum instruction. For the GT students, findings revealed statistically significant gains from pretest to posttest (p < 0.001) on knowledge of energy resource acquisition, energy generation, storage and transport, and energy consumption and conservation. The GT students had year-end energy content knowledge scores significantly higher than those who learned with the BAU curriculum (p < 0.001, with a large effect size). A multiple regression found that prior energy content knowledge was the only significant predictor of year-end energy content knowledge achievement for the GT students (p < 0.001). The findings support the conclusion that implementing a geospatial curriculum approach with learning activities focused on the spatial nature of energy resources can improve the energy literacy of urban middle-level students.

  19. Towards a framework for geospatial tangible user interfaces in collaborative urban planning

    NASA Astrophysics Data System (ADS)

    Maquil, Valérie; Leopold, Ulrich; De Sousa, Luís Moreira; Schwartz, Lou; Tobias, Eric

    2018-04-01

    The increasing complexity of urban planning projects today requires new approaches to better integrate stakeholders with different professional backgrounds throughout a city. Traditional tools used in urban planning are designed for experts and offer little opportunity for participation and collaborative design. This paper introduces the concept of geospatial tangible user interfaces (GTUI) and reports on the design and implementation as well as the usability of such a GTUI to support stakeholder participation in collaborative urban planning. The proposed system uses physical objects to interact with large digital maps and geospatial data projected onto a tabletop. It is implemented using a PostGIS database, a web map server providing OGC web services, the computer vision framework reacTIVision, a Java-based TUIO client, and GeoTools. We describe how a GTUI has been instantiated and evaluated within the scope of two case studies related to real-world collaborative urban planning scenarios. Our results confirm the feasibility of the proposed GTUI solution to (a) instantiate different urban planning scenarios, (b) support collaboration, and (c) ensure acceptable usability.
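
    One core step in such a tabletop interface, turning a tracked object's position on the table into georeferenced map coordinates, reduces to a simple linear transform. The sketch below is a minimal Python illustration; the normalized-coordinate convention and the projected map extent are assumptions, not details taken from the paper's implementation.

```python
def table_to_map(x_norm, y_norm, map_bounds):
    """Map a tangible object's normalized tabletop position (0..1, with y
    increasing downwards, as in many tracking frameworks) to georeferenced
    coordinates, given the bounding box of the map projected on the table."""
    min_x, min_y, max_x, max_y = map_bounds
    map_x = min_x + x_norm * (max_x - min_x)
    map_y = max_y - y_norm * (max_y - min_y)   # flip the y axis
    return map_x, map_y

# Hypothetical projected extent (metres) of the tabletop map.
bounds = (668000.0, 6394000.0, 672000.0, 6398000.0)
print(table_to_map(0.25, 0.5, bounds))  # (669000.0, 6396000.0)
```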

  20. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will support uncertainty visualization and the exploration of data provenance, as well as machine-learning-driven discovery, to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
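
    Dynamic time warping, one of the analytics listed, aligns series that are similar in shape but shifted or stretched in time (for example, two seasonal demand curves). Below is a minimal textbook implementation in Python; the example series are invented.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D series."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# The second series repeats its first sample, so the warped distance is zero.
print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 0, 1, 2, 3, 2, 1]))  # 0.0
```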

  2. A Spatial Data Infrastructure to Share Earth and Space Science Data

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Mazzetti, P.; Bigagli, L.; Cuomo, V.

    2006-05-01

    A Spatial Data Infrastructure (SDI), also known as a Geospatial Data Infrastructure, is fundamentally a mechanism to facilitate the sharing and exchange of geospatial data. An SDI is a scheme necessary for the effective collection, management, access, delivery and utilization of geospatial data; it is important for objective decision making and sound land-based policy, for supporting economic development, and for encouraging socially and environmentally sustainable development. As far as data model and semantics are concerned, a valuable and effective SDI should be able to cross the boundaries between the Geographic Information System/Science (GIS) and Earth and Space Science (ESS) communities. Hence, an SDI should be able to discover, access and share information and data produced and managed by both the GIS and ESS communities in an integrated way. In other terms, an SDI must be built on a conceptual and technological framework which abstracts the nature and structure of the shared datasets: feature-based data or Imagery, Gridded and Coverage Data (IGCD). ISO TC211 and the Open Geospatial Consortium provided important artifacts to build up this framework. In particular, the OGC Web Services (OWS) initiatives and several Interoperability Experiments (e.g. the GALEON IE) are extremely useful for this purpose. We present an SDI solution which is able to manage both GIS and ESS datasets. It is based on OWS and other well-accepted or promising technologies, such as UNIDATA netCDF and CDM, ncML and ncML-GML. Moreover, it uses a specific technology to implement a distributed and federated system of catalogues: the GI-Cat. This technology performs data model mediation and protocol adaptation tasks. It is used to implement a metadata clearinghouse service realizing a common (federal) catalogue model which is based on the ISO 19115 core metadata for geo-datasets. Nevertheless, other well-accepted or standard catalogue data models can be easily implemented as the common view (e.g. OGC CS-W, the forthcoming INSPIRE discovery metadata model, etc.). The proposed solution has been conceived and developed for building up the "Lucan SDI", the SDI of the Italian Basilicata Region. It aims to connect the following data providers and users: the National River Basin Authority of Basilicata, the Regional Environmental Agency, the Land Management & Cadastre Regional Authorities, the Prefecture, the Regional Civil Protection Centers, the National Research Council Institutes in Basilicata, academia, and several SMEs.

  3. GIS-based Geospatial Infrastructure of Water Resource Assessment for Supporting Oil Shale Development in Piceance Basin of Northwestern Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew D; Mattson, Earl D

    Oil shale deposits of the Green River Formation (GRF) in Northwestern Colorado, Southwestern Wyoming, and Northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich fine-grained sedimentary rock that contains significant amounts of kerogen from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Due to these potential environmental impacts of oil shale development, water usage issues need to be further studied. A basin-wide baseline for oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models including, but not limited to, three-dimensional (3D) geologic, energy resource development system, and surface water models. Such a centralized geospatial infrastructure made it possible to generate model inputs directly from the same database and to indirectly couple the different models through inputs/outputs. This ensures consistency of analyses conducted by researchers from different institutions, and helps decision makers balance the water budget based on the spatial distribution of the oil shale and water resources and the spatial variations of the geologic, topographic, and hydrogeological characterization of the basin. This endeavor encountered many technical challenges and had not been attempted in the past for any oil shale basin. The database built during this study remains valuable for any future studies involving oil shale and water resource management in the Piceance Basin. The methodology applied in the development of the GIS-based geospatial infrastructure can be readily adapted by other professionals to develop database structures for similar basins.
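
    The roughly three-to-one water-to-oil ratio quoted above translates directly into basin-scale water demand. The short Python calculation below works through a purely hypothetical production scenario; the production rate is illustrative and is not a figure from the study.

```python
# Back-of-the-envelope water demand implied by the ~3:1 water-to-oil ratio.
BARRELS_PER_DAY_OIL = 100_000      # hypothetical production scenario
WATER_TO_OIL_RATIO = 3             # approx. volumes of water per volume of oil
GALLONS_PER_BARREL = 42
GALLONS_PER_ACRE_FOOT = 325_851

water_barrels_per_day = BARRELS_PER_DAY_OIL * WATER_TO_OIL_RATIO
acre_feet_per_year = (water_barrels_per_day * GALLONS_PER_BARREL * 365
                      / GALLONS_PER_ACRE_FOOT)
print(f"{water_barrels_per_day:,} bbl/day of water "
      f"is about {acre_feet_per_year:,.0f} acre-feet/year")
```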

  4. Mobile Traffic Alert and Tourist Route Guidance System Design Using Geospatial Data

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.; Mishra, S.; Gupta, A.

    2017-09-01

    The present study describes an integrated system for traffic data collection and alert warning. Geographical-information-based decision making related to traffic destinations and routes is supported through the design. The system includes a geospatial database holding a profile relating to a user of a mobile device. The processing and understanding of scanned maps and other digital data inputs leads to route guidance. The system includes a server configured to receive traffic information relating to a route and location information relating to the mobile device. The server is configured to send a traffic alert to the mobile device when the traffic information and the location information indicate that the mobile device is traveling toward traffic congestion. The proposed system uses geospatial and mobile data sets pertaining to the city of Bangalore in India. It is envisaged as a route-guidance and alert-relaying information system for tourists, notifying them of proximity to sites worth seeing in the city they have entered. The system is modular in architecture, and the novelty lies in the integration of different modules carrying different technologies into a complete traffic information system. The generic information processing and delivery system has been tested and found functional and fast over the test geospatial domains. In a restricted prototype with geo-referenced route data, the required information was delivered correctly to designated cell numbers over sustained trials, with an average delivery time of 27.5 seconds (maximum 50, minimum 5 seconds). Testing with traffic geo-data sets is underway.
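
    The alerting condition, a device that is both near congestion and moving toward it, reduces to a distance check plus a comparison between the device heading and the bearing to the congestion point. The sketch below is a minimal Python illustration; the coordinates, the 5 km radius, and the 45° tolerance are assumptions, not parameters of the described system.

```python
import math

def heading_towards(dev_lat, dev_lon, dev_heading_deg, cong_lat, cong_lon,
                    max_angle_deg=45.0, max_dist_km=5.0):
    """Rough server-side test: is the device both near a congestion point
    and heading towards it?"""
    r = 6371.0                                   # Earth radius, km
    p1, p2 = math.radians(dev_lat), math.radians(cong_lat)
    dphi = math.radians(cong_lat - dev_lat)
    dlmb = math.radians(cong_lon - dev_lon)

    # Great-circle distance (haversine).
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist_km = 2 * r * math.asin(math.sqrt(a))

    # Initial bearing from the device to the congestion point.
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = math.degrees(math.atan2(y, x)) % 360

    angle_off = abs((dev_heading_deg - bearing + 180) % 360 - 180)
    return dist_km <= max_dist_km and angle_off <= max_angle_deg

# A device in central Bangalore heading roughly north-east towards a jam.
print(heading_towards(12.9716, 77.5946, 40.0, 12.9900, 77.6100))  # True
```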

  5. Description of Existing Data for Integrated Landscape Monitoring in the Puget Sound Basin, Washington

    USGS Publications Warehouse

    Aiello, Danielle P.; Torregrosa, Alicia; Jason, Allyson L.; Fuentes, Tracy L.; Josberger, Edward G.

    2008-01-01

    This report summarizes existing geospatial data and monitoring programs for the Puget Sound Basin in northwestern Washington. This information was assembled as a preliminary data-development task for the U.S. Geological Survey (USGS) Puget Sound Integrated Landscape Monitoring (PSILM) pilot project. The PSILM project seeks to support natural resource decision-making by developing a 'whole system' approach that links ecological processes at the landscape level to the local level (Benjamin and others, 2008). Part of this effort will include building the capacity to provide cumulative information about impacts that cross jurisdictional and regulatory boundaries, such as cumulative effects of land-cover change and shoreline modification, or region-wide responses to climate change. The PSILM project study area is defined as the 23 HUC-8 (hydrologic unit code) catchments that comprise the watersheds that drain into Puget Sound and their near-shore environments. The study area includes 13 counties and more than four million people. One goal of the PSILM geospatial database is to integrate spatial data collected at multiple scales across the Puget Sound Basin marine and terrestrial landscape. The PSILM work plan specifies an iterative process that alternates between tasks associated with data development and tasks associated with research or strategy development. For example, an initial work-plan goal was to delineate the study area boundary. Geospatial data required to address this task included data from ecological regions, watersheds, jurisdictions, and other boundaries. This assemblage of data provided the basis for identifying larger research issues and delineating the study-area boundary based on these research needs. Once the study-area boundary was agreed upon, the next iteration between data development and research activities was guided by questions about data availability, data extent, data abundance, and data types. This report is not intended as an exhaustive compilation of all available geospatial data; rather, it is a collection of information about geospatial data that can be used to help answer the suite of questions posed after the study-area boundary was defined. This information will also be useful to the PSILM team for future project tasks, such as assessing monitoring gaps, exploring monitoring-design strategies, identifying and deriving landscape indicators and metrics, and visual geographic communication. The two main geospatial data types referenced in this report - base-reference layers and monitoring data - originated from numerous and varied sources. In addition to collecting information and metadata about the base-reference layers, the data also were collected for project needs, such as developing maps for visual communication among team members and with outside groups. In contrast, only information about the data was typically required for the monitoring data. The information on base-reference layers and monitoring data included in this report is only as detailed as what was readily available from the sources themselves. Although this report may appear to lack consistency between data records, the varying degrees of detail contained in this report merely reflect varying source detail. This compilation is just a beginning. All data listed are also being catalogued in spreadsheets and knowledge-management systems. Our efforts are continual as we develop a geospatial catalog for the PSILM pilot project.

  6. A GEO Initiative to Support the Sustainable Development Goals

    NASA Astrophysics Data System (ADS)

    Friedl, L.

    2016-12-01

    The United Nations Agenda 2030 serves as a global development agenda for progress on economic, social and environmental sustainability. These Sustainable Development Goals (SDG) have a specific provision for the use of Earth observations and geospatial information to support progress. The international Group on Earth Observations, GEO, has a dedicated initiative focused on the SDGs. This initiative supports efforts to integrate Earth observations and geospatial information into national development and monitoring frameworks for the SDGs. It helps enable countries and stakeholders to leverage Earth observations to support the implementation, planning, measuring, monitoring, reporting, and evaluation of the SDGs. This paper will present an overview of the GEO initiative and ways that Earth observations support the development goals. It will address how information and knowledge can be shared on effective methods to apply Earth observations to the SDGs and their associated targets and indicators. It will also highlight some existing information sources and tools on the SDGs, which can help identify key approaches for developing a knowledge base.

  7. NebHydro: Sharing Geospatial Data to Support Water Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on the development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual exploration of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydrologic variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux- and Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources in one place. The study indicates that internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering, and suggests that such a system can support advanced hydrological data access and analysis tools to improve the utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS Server.

  8. Decision Support System for efficient irrigation water management in a semi-arid environment

    NASA Astrophysics Data System (ADS)

    Khan, M. A.; Islam, M.; Hafeez, M. M.; Flugel, W. A.

    2009-12-01

    A significant increase in agricultural productivity over the last few decades has protected the world from episodes of hunger and food shortages. Water management in irrigated agriculture was instrumental in achieving those gains. Water resources are under high pressure due to rapid population growth and increased competition among various sectors. Access to reliable data on water availability, quantity and quality can provide the necessary foundation for sound management of water resources. There are many traditional methods for matching water demand and supply; however, imbalances between demand and supply remain inevitable. It is possible to reduce these imbalances considerably through the development of an appropriate irrigation water management tool that takes into account various factors such as soil type, irrigation water supply, and crop water demand. All components of the water balance need to be understood and quantified for efficient and sustainable management of water resources. Application of an intelligent Decision Support System (DSS) is becoming significant. A DSS incorporates knowledge and expertise within the decision support framework. It is an integrated set of data, functions, models and other relevant information that efficiently processes input data, simulates models and displays the results in a user-friendly format. It supports the decision-making process by helping to analyse the problem and explore various scenarios so that the most appropriate water management decision can be made. This paper deals with the Coleambally Irrigation Area (CIA) located in the Murrumbidgee catchment, NSW, Australia. An Integrated River Information System, Coleambally IRIS, has been developed to improve irrigation water management from the farm to the sub-system and system level. It is a web-based information management system with a focus on time series and geospatial hydrological, climatic and remote sensing data, including land cover class, surface temperature, soil moisture, Normalized Difference Vegetation Index (NDVI), Leaf Area Index (LAI) and Evapotranspiration (ET). Coleambally IRIS provides a user-friendly environment for data input and output, and an adaptable set of functions for data analysis, management and decision making to develop strategies for sustainable irrigation water management. Coleambally IRIS is used to assist the managers of the irrigation service provider and the farmers in their decision making by providing relevant information over the web. The developed DSS has been used in practice to manage irrigation water under the current drought conditions. The DSS will be further extended for forecasting future irrigation water demand.
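
    The water-balance bookkeeping that such a DSS must quantify can be illustrated with a minimal root-zone balance, sketched in Python below. The variable names, units (millimetres), and numbers are assumptions for illustration, not details of the Coleambally IRIS implementation.

```python
def daily_soil_water_balance(storage, rain, irrigation, et_crop, capacity):
    """One step of a simple root-zone water balance: storage is updated with
    rainfall and irrigation, depleted by crop evapotranspiration, and any
    excess above the soil's holding capacity is treated as drainage/runoff."""
    storage = storage + rain + irrigation - et_crop
    drainage = max(0.0, storage - capacity)
    storage = min(max(storage, 0.0), capacity)
    return storage, drainage

# Illustrative values in millimetres for one day.
store, drained = daily_soil_water_balance(storage=60.0, rain=2.0,
                                          irrigation=10.0, et_crop=7.5,
                                          capacity=120.0)
print(store, drained)  # 64.5 0.0
```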

  9. Accuracy assessment of maps of forest condition: Statistical design and methodological considerations [Chapter 5]

    Treesearch

    Raymond L. Czaplewski

    2003-01-01

    No thematic map is perfect. Some pixels or polygons are not accurately classified, no matter how well the map is crafted. Therefore, thematic maps need metadata that sufficiently characterize the nature and degree of these imperfections. To decision-makers, an accuracy assessment helps judge the risks of using imperfect geospatial data. To analysts, an accuracy...

  10. DIY Geospatial Web Service Chains: GeoChaining Makes It Easy

    NASA Astrophysics Data System (ADS)

    Wu, H.; You, L.; Gui, Z.

    2011-08-01

    It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge. Geospatial practitioners may lack knowledge about web services and service chains. End users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards may help to lower the learning curve and improve modelling skills so that beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed based on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc. into a unified platform. By employing a Virtual Globe, users can intuitively visualize raw data and results produced by GeoChaining. All of these features allow users to easily start using GWSC, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSC.

  11. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    NASA Astrophysics Data System (ADS)

    Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.

    2014-02-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry is a key initiative for maintaining and controlling green space in our country. Integration of remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing and developing sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to provide information on potential urban forest sites, based on qualitative and quantitative criteria, by using priority parameter ranking in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, land use, soil groups and hydrology, a Digital Elevation Model (DEM), and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlighted that priority-ranking MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.
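
    The pairwise-comparison step at the heart of such an MCDM workflow can be sketched briefly: a reciprocal comparison matrix is converted into criterion weights (here via the principal eigenvector, in the style of AHP), and the weights then score a candidate site. The matrix, the three-criterion subset, and the site scores below are invented for illustration and are not values from the study.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three of the criteria above:
# slope, soil texture, drainage. Entry [i, j] states how much more important
# criterion i is judged to be than criterion j.
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3., 1.0, 2.0],
                     [1/5., 1/2., 1.0]])

# Priority weights from the principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(pairwise)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()

# Suitability of a candidate site: weighted sum of its normalized 0-1 scores.
site_scores = np.array([0.8, 0.6, 0.9])
print(weights.round(3), round(float(weights @ site_scores), 3))
```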

  12. Distributed Multi-interface Catalogue for Geospatial Data

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Bigagli, L.; Mazzetti, P.; Mattia, U.; Boldrini, E.

    2007-12-01

    Several geosciences communities (e.g. atmospheric science, oceanography, hydrology) have developed tailored data and metadata models and service protocol specifications for enabling online data discovery, inventory, evaluation, access and download. These specifications are conceived either by profiling geospatial information standards or by extending well-accepted geosciences data models and protocols in order to capture more semantics. These artifacts have generated a set of related catalog and inventory services characterizing different communities, initiatives and projects. In fact, these geospatial data catalogs are discovery and access systems that use metadata as the target for queries on geospatial information. The indexed and searchable metadata provide a disciplined vocabulary against which intelligent geospatial search can be performed within or among communities. There exists a clear need to conceive and achieve solutions to implement interoperability among geosciences communities, in the context of the more general geospatial information interoperability framework. Such solutions should provide search and access capabilities across catalogs, inventory lists and their registered resources. Thus, the development of catalog clearinghouse solutions is a near-term challenge in support of fully functional and useful infrastructures for spatial data (e.g. INSPIRE, GMES, NSDI, GEOSS). This implies the implementation of components for query distribution and virtual resource aggregation. These solutions must implement distributed discovery functionalities in a heterogeneous environment, requiring metadata profile harmonization as well as protocol adaptation and mediation. We present a catalog clearinghouse solution for the interoperability of several well-known cataloguing systems (e.g. OGC CSW, THREDDS catalog and data services). The solution implements consistent resource discovery and evaluation over a dynamic federation of several well-known cataloguing and inventory systems. Prominent features include: 1) support for distributed queries over a hierarchical data model, including incremental queries (i.e. queries over collections, to be subsequently refined) and opaque/translucent chaining; 2) support for several client protocols through a compound front-end interface module, which accommodates a (growing) number of cataloguing standards, or profiles thereof, including the OGC CSW interface, the ebRIM Application Profile (for Core ISO Metadata and other data models), and the ISO Application Profile. The presented catalog clearinghouse supports both the opaque and the translucent pattern for service chaining. In fact, the clearinghouse catalog may be configured either to completely hide the underlying federated services or to provide clients with service information. In both cases, the clearinghouse solution presents a higher-level interface (i.e. OGC CSW) which harmonizes multiple lower-level services (e.g. OGC CSW, WMS and WCS, THREDDS, etc.), and handles all control of and interaction with them. In the translucent case, the client has the option to directly access the lower-level services (e.g. to improve performance). In the GEOSS context, the solution has been tested both as a stand-alone user application and as a service framework. The first scenario allows a user to download a multi-platform client application and query a federation of cataloguing systems, which the user can customize at will. The second scenario supports server-side deployment and can be flexibly adapted to several use cases, such as intranet proxy, catalog broker, etc.
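
    The clearinghouse's core behaviour, distributing one discovery query across heterogeneous catalogues and aggregating the answers, can be sketched independently of any particular protocol. The Python sketch below is a minimal illustration; the record structure and the stand-in catalogue callables are assumptions and do not reflect the GI-Cat API.

```python
from concurrent.futures import ThreadPoolExecutor

def distributed_search(query, catalogues):
    """Fan a discovery query out to a federation of catalogue services and
    merge the results, de-duplicating records by identifier. Each entry in
    `catalogues` is a callable wrapping one remote service (CSW, THREDDS,
    etc.) behind a common (query -> list-of-records) interface."""
    merged = {}
    with ThreadPoolExecutor() as pool:
        for records in pool.map(lambda search: search(query), catalogues):
            for rec in records:
                merged.setdefault(rec["id"], rec)   # first source wins
    return list(merged.values())

# Two stand-in "catalogues" returning overlapping records.
cat_a = lambda q: [{"id": "sst-2007", "title": "Sea surface temperature"}]
cat_b = lambda q: [{"id": "sst-2007", "title": "Sea surface temperature"},
                   {"id": "dem-10m", "title": "Elevation model"}]
print(distributed_search("temperature", [cat_a, cat_b]))
```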

  13. Assessing and Valuing Historical Geospatial Data for Decisions

    NASA Astrophysics Data System (ADS)

    Sylak-Glassman, E.; Gallo, J.

    2016-12-01

    We will present a method for assessing the use and valuation of historical geospatial data and information products derived from Earth observations (EO). Historical data are widely used in the establishment of baseline reference cases, time-series analysis, and Earth system modeling. Historical geospatial data are used in diverse application areas, such as risk assessment in the insurance and reinsurance industry, disaster preparedness and response planning, historical demography, land-use change analysis, and paleoclimate research, among others. Establishing the current value of previously collected data, often from EO systems that are no longer operating, is difficult since the costs associated with their preservation, maintenance, and dissemination are current, while the costs associated with their original collection are sunk. Understanding their current use and value can aid in funding decisions about the data management infrastructure and workforce allocation required to maintain their availability. Using a value-tree framework to trace the application of data from EO systems, sensors, networks, and surveys to weighted key Federal objectives, we are able to estimate the relative contribution of individual EO systems, sensors, networks, and surveys to meeting those objectives. The analysis relies on a modified Delphi method to elicit relative levels of reliance on individual EO data inputs, including historical data, from subject matter experts. This results in the identification of a representative portfolio of all EO data used to meet key Federal objectives. Because historical data are considered in conjunction with all other EO data within a weighted framework, their contribution to meeting key Federal objectives can be specifically identified and evaluated in relation to other EO data. The results of this method could be applied to better understand and project the long-term value of data from current and future EO systems.
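
    The roll-up this describes, objective weights combined with elicited reliance on each data source, is essentially a weighted apportionment. The toy Python sketch below illustrates the idea; the objectives, systems, weights, and reliance scores are all invented.

```python
# Toy value-tree roll-up: objective weights combined with expert-elicited
# reliance scores give each observing system's relative contribution.
objective_weights = {"disaster_response": 0.6, "land_use_planning": 0.4}
reliance = {  # reliance[objective][system], elicited on an arbitrary scale
    "disaster_response": {"historical_archive": 0.3, "current_satellite": 0.7},
    "land_use_planning": {"historical_archive": 0.5, "current_satellite": 0.5},
}

contribution = {}
for objective, weight in objective_weights.items():
    scores = reliance[objective]
    total = sum(scores.values())
    for system, score in scores.items():
        contribution[system] = contribution.get(system, 0.0) + weight * score / total

print({k: round(v, 2) for k, v in contribution.items()})
# {'historical_archive': 0.38, 'current_satellite': 0.62}
```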

  14. Using Esri Story Map Technology to Demonstrate SERVIR Global Success Stories

    NASA Astrophysics Data System (ADS)

    Adams, E. C.; Flores, A.; Muench, R.; Coulter, D.; Limaye, A. S.; Irwin, D.

    2016-12-01

    A joint development initiative of the National Aeronautics and Space Administration (NASA) and the United States Agency for International Development (USAID), SERVIR works in partnership with leading regional organizations world-wide to help developing countries build their capacity to use information provided by Earth observing satellites and geospatial technologies for managing climate and weather risks, food security and agriculture, land use change, water resources, and natural disaster response. The SERVIR network currently includes four regional hubs: Eastern and Southern Africa, Hindu-Kush-Himalaya, the Lower Mekong region, and West Africa, and has completed project activities in the Mesoamerica region. SERVIR has activities in over 40 countries, has developed 70 custom tools, and has collaborated with 155 institutions to apply current state-of-the-art science and technology to decision making. Many of these efforts have the potential to continue to influence decision-making at new institutions throughout the globe; however, engaging those stakeholders and society while maintaining a global brand identity is challenging. Esri story map technologies have allowed the SERVIR network to highlight the applications of SERVIR projects. Conventional communication approaches have been used in SERVIR to share success stories of our geospatial projects; however, the power of Esri storytelling offers a great opportunity to convey effectively the impacts of the geospatial solutions provided through SERVIR to end users. This paper will present use cases of how Esri story map technologies are being used across the SERVIR network to effectively communicate science to SERVIR users and the general public. The easy-to-use design templates and interactive user interface are ideal for highlighting SERVIR's diverse products. In addition, the SERVIR team hopes to continue using story maps for project outreach and user engagement.

  15. Linking Science and Management in an Interactive Geospatial, Multi-Criterion, Structured Decision Support Framework: Use Case Studies of the "Future Forests" Geo-visualization and Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Pontius, J.; Duncan, J.

    2017-12-01

    Land managers are often faced with balancing management activities to accomplish a diversity of management objectives, in systems faced with many stress agents. Advances in ecosystem modeling provide a rich source of information to inform management. Coupled with advances in decision support techniques and computing capabilities, interactive tools are now accessible for a broad audience of stakeholders. Here we present one such tool designed to capture information on how climate change may impact forested ecosystems, and how that impact varies spatially across the landscape. This tool integrates empirical models of current and future forest structure and function in a structured decision framework that allows users to customize weights for multiple management objectives and visualize suitability outcomes across the landscape. Combined with climate projections, the resulting products allow stakeholders to compare the relative success of various management objectives on a pixel by pixel basis and identify locations where management outcomes are most likely to be met. Here we demonstrate this approach with the integration of several of the preliminary models developed to map species distributions, sugar maple health, forest fragmentation risk and hemlock vulnerability to hemlock woolly adelgid under current and future climate scenarios. We compare three use case studies with objective weightings designed to: 1) Identify key parcels for sugarbush conservation and management, 2) Target state lands that may serve as hemlock refugia from hemlock woolly adelgid induced mortality, and 3) Examine how climate change may alter the success of managing for both sugarbush and hemlock across privately owned lands. This tool highlights the value of flexible models that can be easily run with customized weightings in a dynamic, integrated assessment that allows users to hone in on their potentially complex management objectives, and to visualize and prioritize locations across the landscape. It also demonstrates the importance of including climate considerations for long-term management. This merging of scientific knowledge with the diversity of stakeholder needs is an important step towards using science to inform management and policy decisions.
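
    The pixel-by-pixel weighting the tool exposes to users can be sketched as a weighted overlay of suitability rasters, as in the Python snippet below. The layer names, weights, and the tiny 2 × 2 grids are invented for illustration; the actual tool works with the empirical models named above.

```python
import numpy as np

def suitability_surface(layers, weights):
    """Combine per-objective suitability rasters (each scaled 0-1) into a
    single surface using user-specified weights, pixel by pixel."""
    weights = np.asarray(weights, float)
    weights = weights / weights.sum()              # normalize user weights
    stack = np.stack(layers)                       # (n_objectives, rows, cols)
    return np.tensordot(weights, stack, axes=1)    # weighted sum per pixel

# Tiny invented example: 2x2 rasters for two hypothetical objectives.
sugarbush = np.array([[0.9, 0.2], [0.4, 0.7]])
hemlock_refugia = np.array([[0.1, 0.8], [0.5, 0.6]])
print(suitability_surface([sugarbush, hemlock_refugia], weights=[2, 1]))
```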

  16. Varying geospatial analyses to assess climate risk and adaptive capacity in a hotter, drier Southwestern United States

    NASA Astrophysics Data System (ADS)

    Elias, E.; Reyes, J. J.; Steele, C. M.; Rango, A.

    2017-12-01

    Assessing vulnerability of agricultural systems to climate variability and change is vital in securing food systems and sustaining rural livelihoods. Farmers, ranchers, and forest landowners rely on science-based, decision-relevant, and localized information to maintain production, ecological viability, and economic returns. This contribution synthesizes a collection of research on the future of agricultural production in the American Southwest (SW). Research was based on a variety of geospatial methodologies and datasets to assess the vulnerability of rangelands and livestock, field crops, specialty crops, and forests in the SW to climate-risk and change. This collection emerged from the development of regional vulnerability assessments for agricultural climate-risk by the U.S. Department of Agriculture (USDA) Climate Hub Network, established to deliver science-based information and technologies to enable climate-informed decision-making. Authors defined vulnerability differently based on their agricultural system of interest, although each primarily focuses on biophysical systems. We found that an inconsistent framework for vulnerability and climate risk was necessary to adequately capture the diversity, variability, and heterogeneity of SW landscapes, peoples, and agriculture. Through the diversity of research questions and methodologies, this collection of articles provides valuable information on various aspects of SW vulnerability. All articles relied on geographic information systems technology, with highly variable levels of complexity. Agricultural articles used National Agricultural Statistics Service data, either as tabular county level summaries or through the CropScape cropland raster datasets. Most relied on modeled historic and future climate information, but with differing assumptions regarding spatial resolution and temporal framework. We assert that it is essential to evaluate climate risk using a variety of complementary methodologies and perspectives. In addition, we found that spatial analysis supports informed adaptation, within and outside the SW United States. The persistence and adaptive capacity of agriculture in the water-limited Southwest serves as an instructive example and may offer solutions to reduce future climate risk.

  17. Estimating the Socio-economic Impact of Earth Observing Data in Sonoma County

    NASA Astrophysics Data System (ADS)

    Green, K.; Gaffney, K.; Escobar, V. M.; Tukman, M.

    2016-12-01

    In 2013, NASA's Carbon Monitoring System Applications Effort funded a ROSES proposal from the University of Maryland to develop a prototype measuring, reporting and verification (MRV) system based on commercial off-the-shelf (COTS) remote sensing and analysis capabilities to support ecomarket infrastructure in Sonoma County, California. One of the goals of the project is to identify how stakeholder needs and requirements can be integrated during the creation and implementation of MRV systems to provide effective decision support and compliance capabilities, along with better-informed policy decisions. NASA funding was pooled with that from Sonoma County, USGS, and others for the creation of multiple high-resolution, county-wide geospatial products. The project included the acquisition and processing of Q1 lidar and 6-inch, 4-band multispectral imagery for the entire county of Sonoma, which the county makes available to the public for download at http://sonomavegmap.org, http://opentopography.org/, and https://coast.noaa.gov. To understand the value of the county's ortho-imagery and lidar products to users, the county initiated a survey of users in the spring of 2016. Survey questions were developed by Sonoma County, NASA, and consultants, and a link to them in SurveyMonkey was sent out to 400+ individuals signed up to receive the project's newsletters (www.sonomavegmap.org). This presentation will summarize the results and key findings of the survey.

  18. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high-quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an impressive navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (JavaScript) also makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be applied to 3D models. However, major GIS-based functionalities in combination with the above-mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) and motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in the ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this, we developed and made available to the research community an open geospatial software application prototype that provides high-level capabilities for dynamically creating user-defined virtual geospatial worlds populated by selected animated and moving 3D models at user-specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

  19. Economic Valuation for Improved Water Quality: Analyzing the Public's Preferences Using Geospatial Analysis

    NASA Astrophysics Data System (ADS)

    Tsagarakis, Konstantinos P.; Mavragani, Amaryllis; Gemitzi, Alexandra

    2017-04-01

    As the subject of water quality in the European Union becomes ever more important, public awareness is of significant importance in exploring ways towards the implementation of better water quality. Over the last decade, significant steps in this direction have been taken in the EU, such as Directive 2008/105/EC, Directive 2013/39/EU, the Groundwater Directive, and Decision 2015/495. What has been suggested so far is that public participation and information levels are relatively low in some EU countries. This paper focuses on providing a review of economic valuation in the EU and in regions with degraded waters by applying geospatial techniques. Overall, it is shown that public awareness and information levels are crucial in better assessing the issues that arise due to water quality and in better implementing EU legislation.

  20. Data Quality, Provenance and IPR Management services: their role in empowering geospatial data suppliers and users

    NASA Astrophysics Data System (ADS)

    Millard, Keiran

    2015-04-01

    This paper looks at the current experiences of geospatial users and suppliers and how they have been limited by the lack of suitable frameworks for managing and communicating data quality, data provenance and intellectual property rights (IPR). Current political and technological drivers mean that increasing volumes of geospatial data are available through a plethora of different products and services, and whilst this is inherently a good thing it does create a new generation of challenges. This paper considers two examples of where these issues have been examined and looks at the challenges and possible solutions from a data user and data supplier perspective. The first example is the IQmulus project, which is researching fusion environments for big geospatial point clouds and coverages. The second example is the EU Emodnet programme, which is establishing thematic data portals for public marine and coastal data. IQmulus examines big geospatial data: data from sources such as LIDAR, SONAR and numerical simulations. These data are simply too big for routine and ad hoc analysis, yet with the right infrastructure in place they could yield a myriad of disparate, readily usable information products. IQmulus is researching how to deliver this infrastructure technically, but a financially sustainable delivery depends on being able to track and manage ownership and IPR across the numerous data sets being processed. This becomes complex when the data are composed of multiple overlapping coverages, but managing it allows users to be delivered highly bespoke products that meet their budget and technical needs. The Emodnet programme delivers harmonised marine data at the EU scale across seven thematic portals. As part of the Emodnet programme, a series of 'check points' has been initiated to examine how useful these services and other public data services actually are for solving real-world problems. One key finding is that users have been confused by the fact that data from the same source often appear across multiple platforms, and that current ISO 19115-style metadata catalogues do not help the vast majority of users in making data selections. To address this, we have looked at approaches used in the leisure industry. This industry has established tools to support users in selecting the best hotel for their needs from the metadata available, supported by peer-to-peer rating. We have looked into how this approach can support users in selecting the best data to meet their needs.

  1. The Hazards Data Distribution System update

    USGS Publications Warehouse

    Jones, Brenda K.; Lamb, Rynn M.

    2010-01-01

    After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/

  2. Synergy Between Individual and Institutional Capacity Building: Examples from the NASA DEVELOP National Program

    NASA Astrophysics Data System (ADS)

    Ross, K. W.; Childs-Gleason, L. M.; Favors, J.; Rogers, L.; Ruiz, M. L.; Allsbrook, K. N.

    2016-12-01

    The NASA DEVELOP National Program seeks to simultaneously build capacity to use Earth observations in early career and transitioning professionals while building capacity with institutional partners to apply Earth observations in conducting operations, making decisions, or informing policy. Engaging professionals in this manner lays the foundation of the NASA DEVELOP experience and provides a fresh perspective into institutional challenges. This energetic engagement of people in the emerging workforce elicits heightened attention and greater openness to new resources and processes from project partners. This presentation will describe how NASA DEVELOP provides over 350 opportunities for individuals to engage with over 140 partners per year. It will discuss how the program employs teaming approaches, logistical support, and access to science expertise to facilitate increased awareness and use of NASA geospatial information. It will conclude with examples of how individual/institutional capacity building synergies have led to useful capacity building outcomes.

  3. Integration of Airborne Aerosol Prediction Systems and Vegetation Phenology to Track Pollen for Asthma Alerts in Public Health Decision Support Systems

    NASA Technical Reports Server (NTRS)

    Luvall, Jeffrey C.; Sprigg, William A.; Huete, Alfredo; Pejanovic, Goran; Nickovic, Slobodan; Krapfl, Heide; Budge, Amy; Zelicoff, Alan; VandeWater, Peter K.; Levetin, Estelle; hide

    2009-01-01

    The residual signal indicates that the pollen event may influence the seasonal signal to an extent that would allow detection, given accurate QA filtering and BRDF corrections. MODIS daily reflectances increased during the pollen season. The DREAM model was successfully modified for use with pollen (PREAM) and may provide 24-36 hour running pollen forecasts. Publicly available pollen forecasts are linked to general weather patterns and roughly known species phenologies. These are too coarse for timely health interventions. PREAM addresses this key data gap so that targeted intervention measures can be determined temporally and geospatially. The New Mexico Department of Health (NMDOH), as part of its Environmental Public Health Tracking Network (EPHTN), would use PREAM as a tool for alerting the public in advance of pollen bursts, intervening to reduce the health impact on at-risk asthma populations.

  4. Spatial modelling of arsenic distribution and human health effects in Lake Victoria basin, Tanzania

    NASA Astrophysics Data System (ADS)

    Ijumulana, Julian; Mtalo, Felix; Bhattacharya, Prosun

    2016-04-01

    Increasing incidences of naturally occurring geogenic pollutants in drinking water sources and the associated human health risks are two major challenges requiring detailed knowledge to support decision-making processes at various levels. Knowledge of the presence, location and extent of environmental contamination is needed to develop mitigation measures that achieve the required standards. In this study we are developing a GIS-based model to detect and predict drinking water pollutants at the identified hotspots and to monitor their variation in space. In addition, the mobility of pollutants within the affected region needs to be evaluated using topographic and hydrogeological data. Based on these geospatial data on contaminant distribution, the spatial relationship between As and F contamination and reported human health effects (such as dental caries, dental fluorosis, skeletal fluorosis and bone crippling, and skin and other cancers) can be modeled to support interventions for safe drinking water supplies.

  5. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill siting in Kurdistan Province, western Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe

    2009-10-15

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing a significant amount of spatial information, for which GIS is an important tool in land use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land, followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate or analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. In order to evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed, representing the cumulative effects of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria resulting from field observation. Using this suitability index, 15 different sites were visited and, based on the numerical evaluation provided by the MCDA, the most suitable sites were determined.

  6. Integrating multi-criteria decision analysis for a GIS-based hazardous waste landfill sitting in Kurdistan Province, western Iran.

    PubMed

    Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran

    2009-10-01

    The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve a significant amount of spatial information, which GIS can process as an important tool for land-use suitability analysis. This paper presents a multi-criteria decision analysis alongside a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically underdeveloped region. GIS was used to perform an initial screening process that eliminated unsuitable land, followed by a multi-criteria decision analysis (MCDA) to identify the most suitable sites using information provided by regional experts with reference to newly chosen criteria. Using 21 exclusionary criteria as input layers, masked maps were prepared. By creating various intermediate analysis map layers, a final overlay map was obtained representing candidate areas for hazardous waste landfill sites. To evaluate the different landfill sites produced by the overlay, a landfill suitability index system was developed representing the cumulative effect of the relative importance (weights) and suitability values of 14 non-exclusionary criteria, including several criteria derived from field observation. Using this suitability index, 15 different sites were visited, and the most suitable sites were determined based on the numerical evaluation provided by the MCDA.
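
    The two-stage screening-plus-weighting workflow described in the two records above can be illustrated with a small raster sketch: boolean exclusionary layers are combined into a mask, then a weighted linear suitability index is computed over the remaining land. The layers, weights and counts below are synthetic placeholders, not the study's actual criteria.

      # Minimal sketch of a two-stage GIS/MCDA suitability analysis on synthetic rasters.
      import numpy as np

      rng = np.random.default_rng(2)
      shape = (300, 300)

      # Stage 1: exclusionary criteria (True = excluded), e.g. buffers around rivers,
      # settlements, protected areas. Any excluded cell is removed from consideration.
      exclusion_layers = [rng.random(shape) < 0.05 for _ in range(21)]
      excluded = np.any(exclusion_layers, axis=0)

      # Stage 2: non-exclusionary criteria scaled to 0..1, combined with weights
      # (weights would normally come from expert judgement, e.g. via AHP).
      criteria = np.stack([rng.random(shape) for _ in range(14)])
      weights = rng.random(14)
      weights /= weights.sum()

      suitability = np.tensordot(weights, criteria, axes=1)     # weighted linear combination
      suitability[excluded] = np.nan                            # drop excluded land

      best = np.unravel_index(np.nanargmax(suitability), shape)
      print("most suitable cell (row, col):", best)
      print("share of land excluded:", excluded.mean())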

  7. Lsiviewer 2.0 - a Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization systems have predominantly been applications that are installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, rendering and visualization has been the most prevalent approach in Web GIS. While client devices have become functionally more powerful in recent years, this model has largely ignored them and remains a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and interactive styling, with a reduced load on the server. The developed system supports multiple geospatial vector formats and can be integrated with other web-based services such as WMS and WFS. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on identical hardware.

  8. A Curriculum-Linked Professional Development Approach to Support Teachers' Adoption of Web GIS Tectonics Investigations

    ERIC Educational Resources Information Center

    Bodzin, Alec; Anastasio, David; Sahagian, Dork; Henry, Jill Burrows

    2016-01-01

    A curriculum-linked professional development approach designed to support middle level science teachers' understandings about tectonics and geospatial pedagogical content knowledge was developed. This approach takes into account limited face-to-face professional development time and instead provides pedagogical support within the design of a…

  9. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which imposes major limitations in terms of processing power, storage, accessibility and availability. A feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The application demonstrates a framework for developing specialized cloud geospatial applications that require only a web browser. It serves as a collaborative geospatial platform: multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, provides a real-time multi-user collaboration platform, uses interoperable code and components, and is flexible in accommodating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: (1) data infrastructure (DI), (2) support for water resources modelling (WRM), and (3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The presented solution is a prototype and can be used as a foundation for developing specialized cloud geospatial applications. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of the services.

  10. Renewable Energy Data Explorer User Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Sarah L; Grue, Nicholas W; Tran, July

    This publication provides a user guide for the Renewable Energy Data Explorer and technical potential tool within the Explorer. The Renewable Energy Data Explorer is a dynamic, web-based geospatial analysis tool that facilitates renewable energy decision-making, investment, and deployment. It brings together renewable energy resource data and other modeled or measured geographic information system (GIS) layers, including land use, weather, environmental, population density, administrative, and grid data.

  11. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  12. Open Standards in Practice: An OGC China Forum Initiative

    NASA Astrophysics Data System (ADS)

    Yue, Peng; Zhang, Mingda; Taylor, Trevor; Xie, Jibo; Zhang, Hongping; Tong, Xiaochong; Yu, Jinsongdi; Huang, Juntao

    2016-11-01

    Open standards such as OGC standards can be used to improve interoperability and support machine-to-machine interaction over the Web. In the Big Data era, standard-based data and processing services from various vendors can be combined to automate the extraction of information and knowledge from heterogeneous and large volumes of geospatial data. This paper introduces an ongoing OGC China Forum initiative, which will demonstrate how OGC standards can benefit the interaction among multiple organizations in China. The ability to share data and processing functions across organizations using standard services could change the traditionally manual interactions in their business processes and provide on-demand decision support results through online service integration. In the initiative, six organizations are involved in two “MashUp” scenarios on disaster management: one derives flood maps for Poyang Lake, Jiangxi, and the other generates turbidity maps on demand for the East Lake in Wuhan, China. The two scenarios engage different organizations from the Chinese community by integrating their sensor observations, data, and processing services, and improve the automation of the data analysis process using open standards.

  13. Mathematical models application for mapping soils spatial distribution on the example of the farm from the North of Udmurt Republic of Russia

    NASA Astrophysics Data System (ADS)

    Dokuchaev, P. M.; Meshalkina, J. L.; Yaroslavtsev, A. M.

    2018-01-01

    A comparative analysis of soil geospatial modelling using multinomial logistic regression, decision trees, random forest, regression trees and support vector machine algorithms was conducted. Visual interpretation of the digital maps obtained and their comparison with the existing map, together with a quantitative assessment of the overall accuracy of individual soil group detection and of the models' kappa, showed that multinomial logistic regression, support vector machine, and random forest models can be used reliably for spatial prediction of the conditional soil group distribution in the study area. Detection was most accurate for lightly and moderately eroded sod-podzolic soils (Phaeozems Albic). Next, according to the mean overall prediction accuracy, were non-eroded and warp sod-podzolic soils, as well as sod-gley soils (Umbrisols Gleyic) and alluvial soils (Fluvisols Dystric, Umbric). Heavily eroded sod-podzolic and gray forest soils (Phaeozems Albic) were detected worst of all by the automatic classification methods.
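
    A model comparison of this kind can be sketched with scikit-learn, reporting the overall accuracy and Cohen's kappa used in the record above. The feature matrix and soil-group labels below are synthetic placeholders standing in for terrain covariates, not the study's data.

      # Minimal sketch: compare classifiers for soil-group prediction with overall
      # accuracy and Cohen's kappa. Inputs are synthetic.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score, cohen_kappa_score
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      # Synthetic stand-in for terrain covariates (slope, curvature, wetness index, ...)
      X, y = make_classification(n_samples=2000, n_features=8, n_informative=6,
                                 n_classes=5, n_clusters_per_class=1, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                          random_state=0)

      models = {
          "multinomial logistic regression": LogisticRegression(max_iter=1000),
          "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
          "support vector machine": SVC(kernel="rbf"),
      }

      for name, model in models.items():
          model.fit(X_train, y_train)
          pred = model.predict(X_test)
          print(f"{name:32s} overall accuracy={accuracy_score(y_test, pred):.3f} "
                f"kappa={cohen_kappa_score(y_test, pred):.3f}")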

  14. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.

  15. The road to NHDPlus — Advancements in digital stream networks and associated catchments

    USGS Publications Warehouse

    Moore, Richard B.; Dewald, Thomas A.

    2016-01-01

    A progression of advancements in Geographic Information Systems techniques for hydrologic network and associated catchment delineation has led to the production of the National Hydrography Dataset Plus (NHDPlus). NHDPlus is a digital stream network for hydrologic modeling with catchments and a suite of related geospatial data. Digital stream networks with associated catchments provide a geospatial framework for linking and integrating water-related data. Advancements in the development of NHDPlus are expected to continue to improve the capabilities of this national geospatial hydrologic framework. NHDPlus is built upon the medium-resolution NHD and, like NHD, was developed by the U.S. Environmental Protection Agency and U.S. Geological Survey to support the estimation of streamflow and stream velocity used in fate-and-transport modeling. Catchments included with NHDPlus were created by integrating vector information from the NHD and from the Watershed Boundary Dataset with the gridded land surface elevation as represented by the National Elevation Dataset. NHDPlus is an actively used and continually improved dataset. Users recognize the importance of a reliable stream network and associated catchments. The NHDPlus spatial features and associated data tables will continue to be improved to support regional water quality and streamflow models and other user-defined applications.
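
    The value-added drainage accumulation that makes catchment-linked stream networks such as NHDPlus useful for modeling can be illustrated with a toy directed graph. The reach IDs and areas below are invented, and the accumulation logic is a simplified sketch rather than the NHDPlus production workflow.

      # Minimal sketch: accumulate catchment area downstream along a stream network.
      import networkx as nx

      # Each node is a reach/catchment; edges point downstream.
      g = nx.DiGraph()
      local_area_km2 = {101: 4.2, 102: 3.1, 103: 6.5, 104: 2.2, 105: 8.0}
      g.add_nodes_from(local_area_km2)
      g.add_edges_from([(101, 103), (102, 103), (103, 105), (104, 105)])

      # Accumulate drainage area in topological order (upstream before downstream).
      accumulated = {}
      for node in nx.topological_sort(g):
          accumulated[node] = local_area_km2[node] + sum(
              accumulated[up] for up in g.predecessors(node)
          )

      print(accumulated)   # the outlet reach 105 drains the full 24.0 km2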

  16. Information gathering, management and transferring for geospatial intelligence

    NASA Astrophysics Data System (ADS)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-07-01

    Information is a key asset in modern organizational operations. The success of joint and combined operations with partner organizations depends on an accurate flow of information and knowledge concerning the operational theatre: provision of resources, evolution of the environment, location of markets, and where and when events occurred. Now as in the past, modern operations cannot be conceived without maps and geospatial information (GI). Information and knowledge management is fundamental to the success of organizational decisions in an uncertain environment. Georeferenced information management is a knowledge management process: it begins with raw data and ends with generated knowledge. GI and intelligence systems allow all other forms of intelligence to be integrated and can serve as a main platform to process and display events referenced in space and time. Combining explicit knowledge with people's know-how generates a continuous learning cycle that supports real-time decisions, mitigates the fog of everyday competition, and provides knowledge supremacy. Extending the preliminary analysis done in [1], this work applies exploratory factor analysis to a questionnaire about GI and intelligence management in a company, allowing future lines of action to be identified to improve information sharing and to exploit the full potential of this important resource.

  17. Classification of rocky headlands in California with relevance to littoral cell boundary delineation

    USGS Publications Warehouse

    George, Douglas A.; Largier, John L.; Storlazzi, Curt D.; Barnard, Patrick L.

    2015-01-01

    Despite extensive studies of hydrodynamics and sediment flux along beaches, there is little information on the processes, pathways and timing of water and sediment transport around rocky headlands. In this study, headlands along the California coast are classified to advance understanding of headland dynamics and littoral cell boundaries in support of improved coastal management decisions. Geomorphological parameters for 78 headlands were quantified from geological maps, remote-sensing imagery, navigational charts, and shoreline geospatial databases. K-means cluster analysis grouped the headlands into eight distinct classes based on headland perimeter, bathymetric slope ratio, and the headland apex angle. Wave data were used to investigate the potential for sediment transport around the headland types and determine the efficacy of the headland as a littoral cell boundary. Four classes of headland appear to function well as littoral cell boundaries, with headland size (e.g., perimeter or area) and a marked change in nearshore bathymetry across the headland being relevant attributes. About half of the traditional California littoral cell boundaries align with headland classes that are expected to perform poorly in blocking alongshore sediment transport, calling into question these boundaries. Better definition of these littoral cell boundaries is important for regional sediment management decisions.
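
    The clustering step described above can be sketched with scikit-learn: standardize the three headland metrics and group the headlands into eight classes with k-means. The 78 headlands below are synthetic, and the value ranges are illustrative assumptions rather than the study's measurements.

      # Minimal sketch: k-means classification of headlands from three metrics.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_headlands = 78
      features = np.column_stack([
          rng.lognormal(mean=1.0, sigma=0.6, size=n_headlands),   # perimeter (km)
          rng.uniform(0.2, 3.0, size=n_headlands),                # bathymetric slope ratio
          rng.uniform(20, 170, size=n_headlands),                 # apex angle (degrees)
      ])

      X = StandardScaler().fit_transform(features)                # put metrics on one scale
      labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

      for cls in range(8):
          members = labels == cls
          print(f"class {cls}: {members.sum():2d} headlands, "
                f"mean perimeter {features[members, 0].mean():.2f} km")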

  18. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    NASA Astrophysics Data System (ADS)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which provide probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and the design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm that they better meet the needs of users.

  19. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system to categorize events and issue information that reaches users directly. At present, no web-enabled information system exists that can disseminate messages after event evaluation in real time. The research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system that makes use of geospatial data on widely used platforms. Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The other benefit, conversely, is the expansion of the spatial data infrastructure to utilize the sensor web dynamically and in real time for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart city platforms with sensor web and spatial information by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on GeoNode, QGIS and Java, which bind together most of the functionality of the Internet, the sensor web and, nowadays, the Internet of Things (superseding the Internet of Sensors). In a nutshell, the project delivers a generalized, real-time, accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.

  20. Soil Monitor: an advanced and freely accessible platform to challenge soil sealing in Italy

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Domenico Moccia, Francesco; Munafò, Michele; Terribile, Fabio

    2017-04-01

    Soil sealing is known to be one of the most serious soil degradation processes, since it greatly disturbs or removes essential ecosystem services. Although important policy documents (the Roadmap to a Resource Efficient Europe, the SDGs) promise to mitigate this problem, there are still no signs of change, and soil sealing continues to increase globally. We believe immediate action is required to reduce the distance between the grand policy declarations and the poor availability of operational - and scientifically robust - tools to challenge soil sealing. These tools must be able to support the decisions made by the people who manage and control soil sealing, namely urban and landscape planning professionals and authorities. In this contribution, we demonstrate that soil sealing can be effectively challenged by the implementation of a dedicated geospatial cyberinfrastructure. The platform we are developing - named Soil Monitor - is now a well-functioning prototype freely available at http://www.soilmonitor.it/. It has been developed by research scientists from different disciplines. The national authority for environmental protection (ISPRA) provided the dataset, while INU (the Italian association of urban planners) tested the soil sealing and urban planning indicators. More generally, Soil Monitor has been designed to support the Italian policy documents connected to soil sealing (AS 1181, AS 2383, L. 22 May 2015, n. 68; L. 28 December, n. 221). It thus connects many different aspects of soil sealing, including science, community, policy and economy. Soil Monitor performs geospatial computation in real time to support decision making in landscape planning, with the aim of measuring soil sealing in order to mitigate it and, in particular, of identifying actions to achieve land degradation neutrality. The web platform covers the whole of Italy, even though it is "country-agnostic". Data are processed at a very high spatial resolution (10-20 m), which is a "must" for effective landscape planning. Computation is designed to be highly scalable, enabling real-time responses over a customised range of spatial extents, and high-demand calculations are implemented as advanced parallel codes running fast on GPUs (Graphics Processing Units). For any Italian area of interest drawn or selected by the user, the analysis includes real-time quantification of (i) land use changes at different times, (ii) rural landscape fragmentation, (iii) loss of ecosystem services after new urbanisation, and (iv) the potential impact of new green corridors. A library of parallel routines based on the CUDA (Compute Unified Device Architecture) framework is being built to enable the easy implementation of new indicators for measuring land state and degradation.
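
    The core land-take calculation behind a tool of this kind can be sketched as a cross-tabulation of two classified land-cover rasters of the same area, reporting how much non-sealed land became sealed. The class codes, rasters and resolution below are synthetic assumptions, not Soil Monitor's implementation.

      # Minimal sketch: quantify newly sealed land between two classified rasters.
      import numpy as np

      SEALED, NON_SEALED = 1, 0
      rng = np.random.default_rng(4)

      lc_2010 = (rng.random((1000, 1000)) < 0.12).astype(np.uint8)   # 12% sealed at t1
      lc_2020 = lc_2010.copy()
      newly_sealed_mask = (lc_2010 == NON_SEALED) & (rng.random(lc_2010.shape) < 0.02)
      lc_2020[newly_sealed_mask] = SEALED

      cell_area_m2 = 10 * 10                                          # assumed 10 m resolution
      new_sealing_km2 = ((lc_2010 == NON_SEALED) & (lc_2020 == SEALED)).sum() \
          * cell_area_m2 / 1e6
      print(f"land newly sealed between the two dates: {new_sealing_km2:.2f} km2")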

  1. Data Management Supporting the U.S. Extended Continental Shelf Project

    NASA Astrophysics Data System (ADS)

    Lim, E.; Henderson, J. F.; Warnken, R.; McLean, S. J.; Varner, J. D.; Mcquinn, E.; LaRocque, J.

    2013-12-01

    The U.S. Extended Continental Shelf (ECS) Project is a multi-agency collaboration led by the U.S. Department of State whose mission is to establish the full extent of the continental shelf of the United States consistent with international law. Since 2003, the U.S. has been actively collecting bathymetric, seismic, and other geophysical data and geologic samples required to delineate its outer limits in accordance with Article 76 of the UN Convention on the Law of the Sea. In 2007, the U.S. ECS Task Force designated the National Geophysical Data Center (NGDC) to serve as both the Data Management lead and the Data Archive and Integration Center for the U.S. ECS Project. NGDC, one of three National Oceanic and Atmospheric Administration (NOAA) Offices active in the ECS Project, has the primary responsibility to provide a common infrastructure and a means to integrate the data supporting, and products resulting from ECS analysis. One of the key challenges in the ECS project is the requirement to track the provenance of data and derived products. Final ECS analyses may result in hundreds of points that define a new maritime boundary that is our extended continental shelf. These points will be developed in a rigorous process of analysis encompassing potentially thousands of raw datasets and derived products. NGDC has spent the past two years planning, designing, and partially implementing the Information Management System (IMS), a highly functional, interactive software system that serves as the master database for the ECS Project. The purpose of this geospatial database is to archive, access, and manage the primary data, derivative data and products, associated metadata, information and decisions that will form the U.S. submission. The IMS enables team members to manage ECS data in a consistent way while maintaining institutional memory and the rationale behind decisions. The IMS contains two major components: First, a catalog that acts as the interface to the IMS by organizing the data and products and assisting in populating submission document templates. Second, a web map viewer that geospatially displays the data and products. These components enable dispersed team members to manage ECS data consistently, to track the provenance of data and derived products used in the analyses, and to display analyses using a dynamic web map service. This poster illustrates the importance of data management within the ECS project and focuses on the implementation of the IMS and its use supporting the final determination of a new maritime boundary for the U.S.

  2. RECOVER - An Automated Burned Area Emergency Response Decision Support System for Post-fire Rehabilitation Management of Savanna Ecosystems in the Western US

    NASA Astrophysics Data System (ADS)

    Weber, K.; Schnase, J. L.; Carroll, M.; Brown, M. E.; Gill, R.; Haskett, G.; Gardner, T.

    2013-12-01

    In partnership with the Department of Interior's Bureau of Land Management (BLM) and the Idaho Department of Lands (IDL), we are building and evaluating the RECOVER decision support system. RECOVER - which stands for Rehabilitation Capability Convergence for Ecosystem Recovery - is an automatically deployable, context-aware decision support system for savanna wildfires that brings together in a single application the information necessary for post-fire rehabilitation decision-making and long-term ecosystem monitoring. RECOVER uses state-of-the-art cloud-based data management technologies to improve performance, reduce cost, and provide site-specific flexibility for each fire. The RECOVER Server uses Integrated Rule-Oriented Data System (iRODS) data grid technology deployed in the Amazon Elastic Compute Cloud (EC2). The RECOVER Client is an Adobe Flex web map application that is able to provide a suite of convenient GIS analytical capabilities. In a typical use scenario, the RECOVER Server is provided a wildfire name and geospatial extent. The Server then automatically gathers Earth observational data and other relevant products from various geographically distributed data sources. The Server creates a database in the cloud where all relevant information about the wildfire is stored. This information is made available to the RECOVER Client and ultimately to fire managers through their choice of web browser. The Server refreshes the data throughout the burn and subsequent recovery period (3-5 years) with each refresh requiring two minutes to complete. Since remediation plans must be completed within 14 days of a fire's containment, RECOVER has the potential to significantly improve the decision-making process. RECOVER adds an important new dimension to post-fire decision-making by focusing on ecosystem rehabilitation in semiarid savannas. A novel aspect of RECOVER's approach involves the use of soil moisture estimates, which are an important but difficult-to-obtain element of post-fire rehabilitation planning. We will use downscaled soil moisture data from three primary observational sources to begin evaluation of soil moisture products and build the technology needed for RECOVER to use future SMAP products. As a result, RECOVER, BLM, and the fire applications community will be ready customers for data flowing out of new NASA missions, such as NPP, LDCM, and SMAP.

  3. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    NASA Astrophysics Data System (ADS)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
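
    A comparison of gridded population models over a region can be sketched with the Earth Engine Python API by summing each raster within the same geometry. The asset IDs, year filter and bounding box below are assumptions for illustration and should be checked against the Earth Engine data catalog; this is not the application described above.

      # Minimal sketch: sum two population rasters over one region in Earth Engine.
      import ee

      ee.Initialize()

      region = ee.Geometry.Rectangle([36.6, -1.5, 37.1, -1.1])   # example bounding box

      datasets = {
          # Assumed asset IDs; verify against the Earth Engine data catalog before use.
          "WorldPop 2020": ee.ImageCollection("WorldPop/GP/100m/pop")
              .filter(ee.Filter.eq("year", 2020)).mosaic(),
          "GPWv4.11 count": ee.ImageCollection("CIESIN/GPWv411/GPW_Population_Count").first(),
      }

      for name, image in datasets.items():
          total = image.reduceRegion(
              reducer=ee.Reducer.sum(),
              geometry=region,
              scale=100,            # nominal resolution in metres
              maxPixels=1e9,
          )
          print(name, total.getInfo())   # dictionary of band name -> summed population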

  4. Increasing the value of geospatial informatics with open approaches for Big Data

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in testbed reports and a White Paper published by the OGC. The White Paper identifies the following use cases: collection and ingest (remotely sensed data processing; data stream processing); prepare and structure (SQL and NoSQL databases; data linking; feature identification); analytics and visualization (spatial-temporal analytics; machine learning; data exploration); and modeling and prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include: open cloud computing (avoiding vendor lock-in through API interoperability and application portability); open source extensions (implementing geospatial data representations in projects from Apache, LocationTech, and OSGeo, and investigating parallelization strategies for N-dimensional spatial data); geospatial data representations (schemas that improve processing and analysis using geospatial concepts such as features, coverages and DGGS, and geospatial encodings such as NetCDF and GeoPackage); big linked geodata (linked data methods scaled to big geodata); and analysis-ready data (supporting "download as a last resort" and "analytics as a service", and promoting elements common to data cubes).
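
    The hierarchical-indexing idea behind discrete global grids can be illustrated with a toy quadtree key: points that share a key prefix fall in the same coarser cell, which turns spatial joins and aggregation over big data into simple prefix operations. This is only a sketch of hierarchical indexing, not a standards-conformant DGGS (those use equal-area cells).

      # Toy illustration: index a latitude/longitude into a hierarchical quadtree cell ID.
      def quadkey(lat: float, lon: float, levels: int) -> str:
          """Return a quadtree cell ID of the given depth for a lat/lon point."""
          lat_min, lat_max = -90.0, 90.0
          lon_min, lon_max = -180.0, 180.0
          digits = []
          for _ in range(levels):
              lat_mid = (lat_min + lat_max) / 2
              lon_mid = (lon_min + lon_max) / 2
              digit = 0
              if lon >= lon_mid:
                  digit += 1
                  lon_min = lon_mid
              else:
                  lon_max = lon_mid
              if lat >= lat_mid:
                  digit += 2
                  lat_min = lat_mid
              else:
                  lat_max = lat_mid
              digits.append(str(digit))
          return "".join(digits)

      print(quadkey(48.8566, 2.3522, 10))   # Paris
      print(quadkey(48.8600, 2.3400, 10))   # nearby point, shares a long key prefix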

  5. Digital Earth - A sustainable Earth

    NASA Astrophysics Data System (ADS)

    Mahavir

    2014-02-01

    All life, particularly human life, cannot be sustainable unless complemented with shelter, poverty reduction, provision of basic infrastructure and services, equal opportunities and social justice. Yet, in the context of cities, it is believed that they can accommodate more and more people, endlessly, regardless of their carrying capacity and increasing ecological footprint. 'Inclusion', in the sense of bringing more and more people into the purview of development, is often limited to social and economic inclusion rather than spatial and ecological inclusion. Economic investment decisions are also not always supported by spatial planning decisions. Most planning for a sustainable Earth, be it at the level of a rural settlement, city, region, nation or the globe, fails on the capacity and capability fronts. In India, for example, out of some 8,000 towns and cities, Master Plans exist for only about 1,800. A chapter on sustainability or the environment is neither statutorily compulsory nor the norm for these Master Plans. Geospatial technologies, including Remote Sensing, GIS, the Indian National Spatial Data Infrastructure (NSDI), the Indian National Urban Information System (NUIS), the Indian Environmental Information System (ENVIS), and the Indian National GIS (NGIS), have the potential to map, analyse, visualize and take sustainable development decisions based on participatory social, economic and spatial inclusion. A sustainable Earth, at all scales, is a logical and natural outcome of a digitally mapped, conceived and planned Earth. Digital Earth, in fact, itself offers a platform to dovetail ecological, social and economic considerations in transforming it into a sustainable Earth.

  6. Task and Progress of Iaeg-Sdgs Wggi in Monitoring Sdgs Through a `GEOGRAPHIC Location' Lens

    NASA Astrophysics Data System (ADS)

    Geng, W.; Chen, J.; Zhang, H. P.; Xu, K.

    2018-04-01

    In September 2015, the 193 Member States of the United Nations (UN) unanimously adopted the 2030 Agenda for Sustainable Development and its 17 Sustainable Development Goals (SDGs), aiming to transform the world over the next 15 years (ESDN, 2016). To meet the ambitions and demands of the 2030 Agenda, the global indicator framework must adequately and systematically address the issue of alternative data sources and methodologies, including geospatial information and Earth observations in the context of geographic location (UN-GGIM, 2016). For this purpose, the Inter-Agency and Expert Group on Sustainable Development Goal Indicators (IAEG-SDGs) created the Working Group on Geospatial Information (IAEG-SDGs: WGGI) to make full use of geospatial data in measuring and monitoring the SDGs. The Working Group reviewed the global indicators through a `geographic location' lens to identify those whose production geospatial information can significantly support, and analyzed the related methodological and measurement issues. This paper discusses the progress made in monitoring the SDGs since the establishment of the IAEG-SDGs: WGGI, as well as the existing problems, appropriate solutions and plans for the next stage of work.

  7. Topologically Consistent Models for Efficient Big Geo-Spatio Data Distribution

    NASA Astrophysics Data System (ADS)

    Jahn, M. W.; Bradley, P. E.; Doori, M. Al; Breunig, M.

    2017-10-01

    Geo-spatio-temporal topology models are likely to become a key concept to check the consistency of 3D (spatial space) and 4D (spatial + temporal space) models for emerging GIS applications such as subsurface reservoir modelling or the simulation of energy and water supply of mega or smart cities. Furthermore, the data management for complex models consisting of big geo-spatial data is a challenge for GIS and geo-database research. General challenges, concepts, and techniques of big geo-spatial data management are presented. In this paper we introduce a sound mathematical approach for a topologically consistent geo-spatio-temporal model based on the concept of the incidence graph. We redesign DB4GeO, our service-based geo-spatio-temporal database architecture, on the way to the parallel management of massive geo-spatial data. Approaches for a new geo-spatio-temporal and object model of DB4GeO meeting the requirements of big geo-spatial data are discussed in detail. Finally, a conclusion and outlook on our future research are given on the way to support the processing of geo-analytics and -simulations in a parallel and distributed system environment.
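
    The incidence-graph idea described above can be illustrated with a small consistency check: lower-dimensional cells (nodes, edges) are registered as the boundaries of higher-dimensional cells (edges, faces), and the graph is checked for referential and cardinality consistency. The cell names and rules below are illustrative, not DB4GeO's actual model.

      # Minimal sketch: incidence graph of nodes, edges and faces with a consistency check.
      nodes = {"n1", "n2", "n3"}
      edges = {                       # edge -> its two bounding nodes
          "e1": ("n1", "n2"),
          "e2": ("n2", "n3"),
          "e3": ("n3", "n1"),
      }
      faces = {                       # face -> its bounding edges
          "f1": ("e1", "e2", "e3"),
      }

      def consistent(nodes, edges, faces) -> bool:
          """Check referential and cardinality consistency of the incidence graph."""
          for edge, (a, b) in edges.items():
              if a not in nodes or b not in nodes or a == b:
                  return False
          for face, boundary in faces.items():
              if len(boundary) < 3 or any(e not in edges for e in boundary):
                  return False
          return True

      print(consistent(nodes, edges, faces))   # True for this triangular face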

  8. Sustainable Development of Research Capacity in West Africa based on the GLOWA Volta Project

    NASA Astrophysics Data System (ADS)

    Liebe, Jens R.; Rogmann, Antonio; Falk, Ulrike; Amisigo, Barnabas; Nyarko, Kofi; Harmsen, Karl; Vlek, Paul L. G.

    2010-05-01

    The Sustainable Development of Research Capacity (SDRC) in West Africa is an 18-month project, funded by the German Ministry of Education and Research, to strengthen research capacity, give access to data and models, and support the establishment of the newly formed Volta Basin Authority. The SDRC project largely builds on the results and models developed in the framework of the GLOWA Volta Project (GVP), a nine-year, interdisciplinary research project (May 2000 - May 2009). The GVP's central objectives were to analyze the physical and socio-economic determinants of the hydrological cycle in the Volta Basin in the face of global change, and to develop scientifically sound decision support resources. Another major achievement of the GVP was its extensive capacity building. Of the 81 participating students (57 Ph.D.s), 44 originated from West Africa, and 85% of the West African graduates returned to their home countries. The SDRC makes use of the wide range of research results and decision support tools developed in the course of the GVP. It is based on three columns: I. knowledge transfer and strengthening of human capacity, focusing on training in modeling the onset of the rainy season, hydrological, economic, and hydro-economic modeling, and the training of geospatial database managers; II. strengthening of infrastructural research capacity by supporting a research instrumentation network through the operation and transfer of a weather station network and a network of tele-transmitted stream gauges; and III. the transfer of a publicly accessible online Geoportal for the dissemination of various geospatial data and research results. At the center of the SDRC effort is the strengthening of the Volta Basin Authority, a river basin authority with a transnational mandate, especially through the transfer of the Geoportal and the associated training and promotion efforts. The Geoportal is an effort to overcome the data scarcity previously observed in the Volta Basin, and represents the first comprehensive, publicly accessible data- and meta-database for the Volta Basin. The Geoportal can be used to search for data, for interactive mapping or the download of ready-made maps, and to publish and share new data and research results. Local institutions are actively involved in acquiring data for the Geoportal and are trained in its operation. For the contributing institutions, the ability to manage data access and use rights (publicly available, available to defined user groups, available upon request) is of great importance. It allows them to publish the existence of their data and facilitate access to it without sacrificing their ownership rights. The Geoportal can be accessed at http://131.220.109.6/Geoportal

  9. The Wildland Fire Emissions Information System: Providing information for carbon cycle studies with open source geospatial tools

    NASA Astrophysics Data System (ADS)

    French, N. H.; Erickson, T.; McKenzie, D.

    2008-12-01

    A major goal of the North American Carbon Program is to resolve uncertainties in understanding and managing the carbon cycle of North America. As carbon modeling tools become more comprehensive and spatially oriented, accurate datasets to spatially quantify carbon emissions from fire are needed, and these data resources need to be accessible to users for decision-making. Under a new NASA Carbon Cycle Science project, Drs. Nancy French and Tyler Erickson, of the Michigan Technological University, Michigan Tech Research Institute (MTRI), are teaming with specialists with the USDA Forest Service Fire and Environmental Research Applications (FERA) team to provide information for mapping fire-derived carbon emissions to users. The project focus includes development of a web-based system to provide spatially resolved fire emissions estimates for North America in a user-friendly environment. The web-based Decision Support System will be based on a variety of open source technologies. The Fuel Characteristic Classification System (FCCS) raster map of fuels and MODIS-derived burned area vector maps will be processed using the Geographic Data Abstraction Library (GDAL) and OGR Simple Features Library. Tabular and spatial project data will be stored in a PostgreSQL/PostGIS, a spatially enabled relational database server. The browser-based user interface will be created using the Django web page framework to allow user input for the decision support system. The OpenLayers mapping framework will be used to provide users with interactive maps within the browser. In addition, the data products will be made available in standard open data formats such as KML, to allow for easy integration into other spatial models and data systems.
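
    One step of the GDAL/OGR processing described above can be sketched as follows: rasterize a burned-area polygon layer onto the grid of a fuels raster and sum the fuel loading inside the burn. The file paths "fccs_fuels.tif" and "burned_area.shp" and the fuel units are hypothetical, and this is a simplified sketch rather than the project's actual emissions pipeline.

      # Minimal sketch: rasterize burned-area polygons and sum fuel loading inside the burn.
      import numpy as np
      from osgeo import gdal, ogr

      fuels_ds = gdal.Open("fccs_fuels.tif")                 # hypothetical fuel loading raster (kg/m2)
      fuels = fuels_ds.GetRasterBand(1).ReadAsArray().astype(float)

      # Create an in-memory mask raster aligned with the fuels grid.
      drv = gdal.GetDriverByName("MEM")
      mask_ds = drv.Create("", fuels_ds.RasterXSize, fuels_ds.RasterYSize, 1, gdal.GDT_Byte)
      mask_ds.SetGeoTransform(fuels_ds.GetGeoTransform())
      mask_ds.SetProjection(fuels_ds.GetProjection())

      burn_ds = ogr.Open("burned_area.shp")                  # hypothetical burned-area polygons
      gdal.RasterizeLayer(mask_ds, [1], burn_ds.GetLayer(), burn_values=[1])

      burned = mask_ds.GetRasterBand(1).ReadAsArray() == 1
      gt = fuels_ds.GetGeoTransform()
      cell_area_m2 = abs(gt[1] * gt[5])                      # pixel width * pixel height

      fuel_in_burn_kg = np.nansum(fuels[burned]) * cell_area_m2
      print(f"fuel loading within burned area: {fuel_in_burn_kg:.3e} kg")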

  10. Incorporating Resilience into Transportation Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connelly, Elizabeth; Melaina, Marc

    To aid decision making for developing transportation infrastructure, the National Renewable Energy Laboratory has developed the Scenario Evaluation, Regionalization and Analysis (SERA) model. The SERA model is a geospatially and temporally oriented model that has been applied to determine optimal production and delivery scenarios for hydrogen, given resource availability and technology cost and performance, for use in fuel cell vehicles. In addition, the SERA model has been applied to plug-in electric vehicles.

  11. Modelling a suitable location for Urban Solid Waste Management using AHP method and GIS -A geospatial approach and MCDM Model

    NASA Astrophysics Data System (ADS)

    Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.

    2016-12-01

    Multi-criteria decision making (MCDM) is an analytical method for deriving an appropriate result or decision in an environment with multiple, often conflicting, criteria. Geospatial approaches (e.g., remote sensing and GIS) are likewise advanced technical means of collecting, processing and analyzing various spatial data. GIS and remote sensing, together with MCDM techniques, therefore provide a strong platform for solving complex decision-making problems, and the combination is used very effectively in site selection for urban solid waste management. The most popular MCDM technique is the weighted linear combination (WLC) method, while the Analytic Hierarchy Process (AHP) is another popular and consistent technique used worldwide for dependable decision making. The main objective of this study is to develop an AHP model as an MCDM technique, combined with a geographic information system (GIS), to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste needs an appropriate landfill site that considers the environmental, geological, social and technical aspects of the region. An MCDM model was generated from five classes of criteria related to environmental, geological, social and technical factors using the AHP method, and the resulting weights were input to GIS to produce the final model of suitable locations for urban solid waste management. The final suitability analysis shows that 12.2% of the study area, corresponding to 22.89 km2, is suitable. The study area is Keraniganj sub-district of Dhaka district, Bangladesh, a densely populated area that currently has an unmanaged waste disposal system and lacks suitable landfill sites for waste dumping.
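
    The AHP weighting step used in studies like this can be sketched as follows: derive criterion weights from a pairwise comparison matrix via its principal eigenvector and check the consistency ratio (CR below 0.1 is conventionally considered acceptable). The comparison values and criterion names are illustrative, not the study's.

      # Minimal sketch: AHP priority weights and consistency ratio from a pairwise matrix.
      import numpy as np

      # Pairwise comparisons (Saaty 1-9 scale) for four hypothetical criteria,
      # e.g. distance to settlements, groundwater depth, land cover, slope.
      A = np.array([
          [1.0, 3.0, 5.0, 7.0],
          [1/3, 1.0, 3.0, 5.0],
          [1/5, 1/3, 1.0, 3.0],
          [1/7, 1/5, 1/3, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      lambda_max = eigvals[k].real
      ci = (lambda_max - n) / (n - 1)            # consistency index
      ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index for n criteria
      cr = ci / ri                               # consistency ratio

      print("criterion weights:", np.round(weights, 3))
      print("consistency ratio:", round(cr, 3))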

  12. GIS-and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Wei; Minnick, Matthew; Geza, Mengistu

    2012-09-30

    The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as "baseline data" for creating solutions for water resource management and for supporting decision making on oil shale resource development. The project ended on September 30, 2012. This final project report presents the key findings from the project activity, the major accomplishments, and the expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase, as well as other deliverables stored on digital storage media, will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and the data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for linking the database with the surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides baseline data for further study of oil shale development and for identifying further data collection needs. The 3D geological model provides a better understanding, through data interpolation and visualization techniques, of the Piceance Basin structure and the spatial distribution of the oil shale resources. The surface water/groundwater models quantify the water shortage and improve understanding of the spatial distribution of the available water resources. The energy resource development systems model reveals the phase shift between water usage and oil shale production, which will facilitate better planning for oil shale development. Detailed descriptions of the key findings from the project activity, the major accomplishments, and the expected impacts of the research are given in the "ACCOMPLISHMENTS, RESULTS, AND DISCUSSION" section of this report.

  13. ISTIMES Integrated System for Transport Infrastructures Surveillance and Monitoring by Electromagnetic Sensing

    NASA Astrophysics Data System (ADS)

    Argenti, M.; Giannini, V.; Averty, R.; Bigagli, L.; Dumoulin, J.

    2012-04-01

    The EC FP7 ISTIMES project has the goal of realizing an ICT-based system exploiting distributed and local sensors for non-destructive electromagnetic monitoring in order to make critical transport infrastructures more reliable and safe. Higher situation awareness, thanks to real-time and detailed information and images of the monitored infrastructure status, improves decision capabilities for emergency management stakeholders. Web-enabled sensors and a service-oriented approach form the core of the architecture, providing a system that adopts open standards (e.g. OGC SWE, OGC CSW, etc.) and strives for full interoperability with other GMES and European Spatial Data Infrastructure initiatives as well as compliance with INSPIRE. The system exploits an open, easily scalable network architecture to accommodate a wide range of sensors, integrated with a set of tools for handling, analyzing and processing large data volumes from different organizations with different data models. Situation awareness tools are also integrated in the system. The definition of sensor observations and services follows a metadata model based on the ISO 19115 core set of metadata elements and the O&M model of OGC SWE. The ISTIMES infrastructure is based on an e-Infrastructure for geospatial data sharing, with a Data Catalog that implements the discovery services for sensor data retrieval, acting as a broker through static connections based on standard SOS and WNS interfaces; a Decision Support component that helps decision makers by providing support for data fusion, inference and the generation of situation indexes; a Presentation component that implements system-user interaction services for information publication and rendering by means of a web portal using SOA design principles; and a security framework using the Shibboleth open source middleware, based on the Security Assertion Markup Language, supporting Single Sign-On (SSO). ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663

  14. Geospatial Data Science Modeling | Geospatial Data Science | NREL

    Science.gov Websites

    NREL uses geospatial data science modeling to develop innovative models and tools for energy professionals, project developers, and consumers.

  15. E-learning based distance education programme on Remote Sensing and Geoinformation Science - An initiative of IIRS

    NASA Astrophysics Data System (ADS)

    Karnatak, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.; Srivastav, S. K.; Gupta, P. K.

    2014-11-01

    IIRS initiated its interactive distance education based capacity building under the IIRS outreach programme in 2007; more than 15,000 students have been trained in the field of geospatial technology using satellite-based interactive terminals and internet-based learning with the A-View software. Over the last decade, the utilization of Internet technology by different user groups in society has emerged as a technological revolution that directly affects human life. The Internet is used extensively in India for various purposes, from entertainment to critical decision making in government machinery. The role of internet technology is very important for capacity building in any discipline, since it can satisfy the needs of the maximum number of users in the minimum time. To further enhance the outreach of geospatial science and technology, IIRS has initiated e-learning based certificate courses of different durations. The contents of the e-learning based capacity building programme are developed for various target user groups, including mid-career professionals, researchers, academia, fresh graduates, and user-department professionals from different State and Central Government ministries. The official website of IIRS e-learning is hosted at http://elearning.iirs.gov.in. The contents of the IIRS e-learning programme are flexible for anytime, anywhere learning, keeping in mind the demands of a geographically dispersed audience and their requirements. The programme is comprehensive, with a variety of online delivery modes; it is interactive, easy to learn from, and has a proper blend of concepts and practicals to elicit students' full potential. The course content of this programme includes Image Statistics, Basics of Remote Sensing, Photogrammetry and Cartography, Digital Image Processing, Geographical Information Systems, the Global Positioning System, Customization of Geospatial Tools and Applications of Geospatial Technologies. The syllabus of the courses reflects the latest developments and trends in geospatial science and technology, with a specific focus on Indian case studies for geospatial applications. Learning is made available through interactive 2D and 3D animations, audio and video for practical demonstrations, and software operations with free data applications. The learning methods are implemented to make the programme more interactive and learner-centric, with practical examples of real-world problems.

  16. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pabian, Frank V

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost among the geospatial tools are the 'digital virtual globes' (i.e., Google Earth, Virtual Earth, etc.), which are far better than the previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest and can be critical to: (1) site familiarization and true geospatial context awareness; (2) pre-inspection planning; (3) onsite orientation and navigation; (4) post-inspection reporting; (5) site monitoring over time for changes; (6) verification of states' site declarations and input to State Evaluation reports; and (7) a common basis for discussions among all interested parties (Member States). Additionally, as open sources, such virtual globes can also provide a new, essentially free means to conduct broad-area searches for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet blogs and wiki layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), which can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (i.e., Google's SketchUp 6, newly available in 2007), when used in conjunction with these digital globes, can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments, including walk-arounds or fly-arounds, and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).

  17. Earth Observations to Assess Impact of Hurricane Katrina on John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Graham, William D.; Ross, Kenton W.

    2007-01-01

    The peril from hurricanes to Space Operations Centers is real and is forecast to continue; Katrina, Rita, and Wilma of 2005 and Charley, Frances, Ivan, and Jeanne of 2004 are sufficient motivation for NASA to develop a multi-Center plan for preparedness and response. As was demonstrated at SSC (Stennis Space Center) in response to Hurricane Katrina, NASA Centers are efficiently activated as local command centers, playing host to Federal and State agencies and first responders to coordinate and provide evacuation, relocation, response, and recovery activities. Remote sensing decision support provides critical insight for managing NASA infrastructure and for assisting Center decision makers. Managers require geospatial information to manage the federal city. Immediately following Katrina, SSC's power and network connections were disabled, hardware was inoperative, technical staff was displaced and/or out of contact, and graphical decision support tools were non-existent or less than fully effective. Despite these circumstances, the SSC EOC (Emergency Operations Center) implemented response operations to assess damage and to activate recovery plans. To assist Center managers, the NASA ASP (Applied Sciences Program) made its archive of high-resolution data over the site available. In the weeks and months after the immediate crisis, NASA supplemented this data with high-resolution, post-Katrina imagery over SSC and much of the affected coastal areas. Much of the high-resolution imagery was made available through the Department of Defense Clear View contract and was distributed through the U.S. Geological Survey Center for Earth Resources Observation and Science "Hurricane Katrina Disaster Response" Web site. By integrating multiple image data types with other information sources, ASP applied an all-source solutions approach to develop decision support tools that enabled managers to respond to critical issues, such as expedient access to infrastructure and deployment of resources, provision of temporary shelter, logistical control of critical supplies, and the mobilization and coordination of assets from ground crews to aircraft/airspace management. Furthermore, ASP developed information products that illustrate risks to SSC's infrastructure from surge, inundation, and flood. Current plans include developing wind-risk prototype products for refinement and adoption into EOC plans.

  18. The Huaihe Basin Water Resource and Water Quality Management Platform Implemented with a Spatio-Temporal Data Model

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhang, W.; Yan, C.

    2012-07-01

    Presently, planning and assessment for maintenance, renewal, and decision making in watershed hydrology, water resource management, and water quality assessment are evolving toward complex, spatially explicit regional environmental assessments. These problems have to be addressed with object-oriented spatio-temporal data models that can store, manage, query, and visualize historical and updated basic information concerning watershed hydrology, water resource management, and water quality, as well as compute and evaluate watershed environmental conditions, so as to provide online forecasting to policy makers and relevant authorities in support of decision making. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and effective use of such complex models by resource managers. Success depends on an integrated approach that brings together scientific, education, and training advances made across many individual disciplines, modified to fit the needs of the individuals and groups who must write, implement, evaluate, and adjust their watershed management plans. The Centre for Hydro-science Research, Nanjing University, in cooperation with the relevant watershed management authorities, has developed a WebGIS management platform to facilitate this complex process. Improving the management of watersheds over the Huaihe basin through the development, promotion, and use of a web-based, user-friendly, geospatial watershed management data and decision support system (WMDDSS) involved many difficulties. Given the spatial and temporal characteristics of historical and currently available meteorological, hydrological, geographical, environmental, and other relevant information, we designed an object-oriented spatio-temporal data model that combines spatial, attribute, and temporal information to implement the management system. Using this system, we can update, query, and analyze environmental information as well as manage historical data; a visualization tool helps the user interpret results so as to provide scientific support for decision making. The utility of the system has been demonstrated through its use in watershed management and environmental assessments.
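
    To make the idea of an object-oriented spatio-temporal data model concrete, the following is a minimal sketch in which each feature keeps a queryable history of timestamped attribute states. The station identifiers, attribute names, and point geometry are illustrative assumptions, not the data model actually implemented for the Huaihe platform.

```python
# A minimal sketch of an object-oriented spatio-temporal feature, assuming a
# simple (lon, lat) point geometry and attribute names chosen for illustration;
# this is not the Huaihe platform's actual schema.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List, Tuple


@dataclass
class TimestampedState:
    valid_from: date                 # start of the period this state describes
    attributes: Dict[str, float]     # e.g. {"COD_mg_per_L": 18.4, "flow_m3_s": 950.0}


@dataclass
class MonitoringStation:
    station_id: str
    geometry: Tuple[float, float]    # (lon, lat); a real model would hold richer geometry
    history: List[TimestampedState] = field(default_factory=list)

    def update(self, when: date, **attributes: float) -> None:
        """Append a new state instead of overwriting, so history stays queryable."""
        self.history.append(TimestampedState(when, dict(attributes)))

    def state_at(self, when: date) -> Dict[str, float]:
        """Return the most recent state valid at the given date."""
        valid = [s for s in self.history if s.valid_from <= when]
        return max(valid, key=lambda s: s.valid_from).attributes if valid else {}


# Example: store two water-quality snapshots and query the earlier one.
station = MonitoringStation("HR-017", (117.0, 33.6))
station.update(date(2010, 6, 1), COD_mg_per_L=22.1, flow_m3_s=820.0)
station.update(date(2011, 6, 1), COD_mg_per_L=18.4, flow_m3_s=950.0)
print(station.state_at(date(2010, 12, 31)))   # -> the 2010 snapshot
```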

  19. Student Focused Geospatial Curriculum Initiatives: Internships and Certificate Programs at NCCU

    NASA Astrophysics Data System (ADS)

    Vlahovic, G.; Malhotra, R.

    2009-12-01

    This paper reports recent efforts by the Department of Environmental, Earth and Geospatial Sciences faculty at North Carolina Central University (NCCU) to develop a leading geospatial sciences program that will be considered a model for other Historically Black College/University (HBCU) peers nationally. NCCU was established in 1909 and is the nation’s first state supported public liberal arts college funded for African Americans. In the most recent annual ranking of America’s best black colleges by the US News and World Report (Best Colleges 2010), NCCU was ranked 10th in the nation. As one of only two HBCUs in the southeast offering an undergraduate degree in Geography (McKee, J.O. and C. V. Dixon. Geography in Historically Black Colleges/ Universities in the Southeast, in The Role of the South in Making of American Geography: Centennial of the AAG, 2004), NCCU is uniquely positioned to positively affect talent and diversity of the geospatial discipline in the future. Therefore, successful creation of research and internship pathways for NCCU students has national implications because it will increase the number of minority students joining the workforce and applying to PhD programs. Several related efforts will be described, including research and internship projects with Fugro EarthData Inc., Center for Remote Sensing and Mapping Science at the University of Georgia, Center for Earthquake Research and Information at the University of Memphis and the City of Durham. The authors will also outline requirements and recent successes of ASPRS Provisional Certification Program, developed and pioneered as collaborative effort between ASPRS and NCCU. This certificate program allows graduating students majoring in geospatial technologies and allied fields to become provisionally certified by passing peer-review and taking the certification exam. At NCCU, projects and certification are conducted under the aegis of the Geospatial Research, Innovative Teaching and Service (GRITS) Center housed in the Department of Environmental, Earth and Geospatial Sciences. The GRITS center was established in 2006 with funding from the National Science Foundation to promote the learning and application of geospatial technologies. Since then GRITS has been a hub for Geographical Information Science (GIS) curriculum development, faculty and professional GIS workshops, grant writing and outreach efforts. The Center also serves as a contact point for partnerships with other universities, national organizations and businesses in the geospatial arena - and as a result, opens doors to the professional world for our graduate and undergraduate students.

  20. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    USGS Publications Warehouse

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
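
    The abstract does not reproduce the LUPM's equations, but the comparison it describes rests on expected-value reasoning: weigh the cost of a mitigation investment against the expected loss it avoids under an uncertain hazard event. The sketch below is a generic, textbook version of that calculation with entirely hypothetical numbers; it is not the LUPM itself.

```python
# A generic expected-loss comparison in the spirit of scenario-based risk
# analysis; probabilities, costs, and effectiveness values are hypothetical.

def expected_loss(p_event: float, exposed_value: float, damage_fraction: float) -> float:
    """Expected loss over the planning horizon for one hazard scenario."""
    return p_event * exposed_value * damage_fraction


def net_benefit(p_event, exposed_value, damage_fraction, mitigation_cost, effectiveness):
    """Avoided expected loss minus the cost of the mitigation investment."""
    baseline = expected_loss(p_event, exposed_value, damage_fraction)
    mitigated = expected_loss(p_event, exposed_value, damage_fraction * (1.0 - effectiveness))
    return (baseline - mitigated) - mitigation_cost


# Hypothetical scenario: 10% chance of a damaging event over the horizon,
# $500M of exposed value, 30% damage fraction, $8M retrofit that halves damage.
print(net_benefit(0.10, 500e6, 0.30, 8e6, 0.50))  # positive -> retrofit pays off in expectation
```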

  1. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. The data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
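
    As a rough illustration of a REST-style geoprocessing service that turns geocoded case locations into a density surface, here is a minimal sketch. The endpoint name, payload shape, grid size, and the use of Flask with a Gaussian kernel density estimate are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of a density-map geoprocessing endpoint (assumed design, not the
# authors' service): POST a list of [lon, lat] points, receive a gridded density.
import numpy as np
from flask import Flask, jsonify, request
from scipy.stats import gaussian_kde

app = Flask(__name__)


@app.route("/density", methods=["POST"])
def density():
    # Expected payload: {"points": [[lon, lat], ...], "cells": 50}
    payload = request.get_json(force=True)
    pts = np.asarray(payload["points"], dtype=float).T          # shape (2, n)
    cells = int(payload.get("cells", 50))

    kde = gaussian_kde(pts)                                      # kernel density estimate
    xs = np.linspace(pts[0].min(), pts[0].max(), cells)
    ys = np.linspace(pts[1].min(), pts[1].max(), cells)
    gx, gy = np.meshgrid(xs, ys)
    grid = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(cells, cells)

    return jsonify({"x": xs.tolist(), "y": ys.tolist(), "density": grid.tolist()})


if __name__ == "__main__":
    app.run(port=8080)  # POST /density with a point list to get a density surface back
```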

  2. Designing a Hydro-Economic Collaborative Computer Decision Support System: Approaches, Best Practices, Lessons Learned, and Future Trends

    NASA Astrophysics Data System (ADS)

    Rosenberg, D. E.

    2008-12-01

    Designing and implementing a hydro-economic computer model to support or facilitate collaborative decision making among multiple stakeholders or users can be challenging and daunting. Collaborative modeling is distinct from, and more difficult than, non-collaborative efforts because of the large number of users with different backgrounds; disagreement or conflict among stakeholders regarding problem definitions, modeling roles, and analysis methods; and evolving ideas of model scope and scale and needs for information and analysis as stakeholders interact, use the model, and learn about the underlying water system. This presentation reviews the lifecycle of collaborative model making and identifies some key design decisions that stakeholders and model developers must make to develop robust and trusted, verifiable and transparent, integrated and flexible, and ultimately useful models. It advances some best practices to implement and program these decisions. Among these best practices are: 1) modular development of data-aware input, storage, manipulation, results-recording, and presentation components, plus ways to couple and link to other models and tools; 2) explicitly structuring both the input data and the metadata that describe data sources, who acquired the data, gaps, and modifications or translations made to put the data in a form usable by the model; 3) providing in-line documentation of model inputs, assumptions, calculations, and results, plus ways for stakeholders to document their own model use and share results with others; and 4) flexible programming with graphical, object-oriented properties and elements that allow users or the model maintainers to easily see and modify the spatial, temporal, or analysis scope as the collaborative process moves forward. We draw on examples of these best practices from the existing literature, the author's prior work, and some new applications just underway. The presentation concludes by identifying some future directions for collaborative modeling, including geospatial display and analysis, real-time operations, and internet-based tools, plus the design and programming needed to implement these capabilities.
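
    Best practice (2) above, pairing every model input with provenance metadata, can be sketched very simply. The field names and example values below are illustrative assumptions rather than the author's schema.

```python
# A minimal sketch of carrying provenance metadata alongside a model input so
# stakeholders can see where numbers came from; field names are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Provenance:
    source: str                       # agency, report, or gauge the data came from
    acquired_by: str                  # who collected or compiled it
    known_gaps: List[str] = field(default_factory=list)
    modifications: List[str] = field(default_factory=list)   # unit conversions, infilling, etc.


@dataclass
class ModelInput:
    name: str
    units: str
    values: List[float]
    provenance: Provenance


inflow = ModelInput(
    name="monthly_reservoir_inflow",
    units="million m^3",
    values=[42.0, 38.5, 51.2],
    provenance=Provenance(
        source="State hydrologic survey, 2007 yearbook",
        acquired_by="basin authority staff",
        known_gaps=["Feb 2006 gauge outage"],
        modifications=["converted from cfs", "gap filled by linear interpolation"],
    ),
)
print(inflow.provenance.modifications)
```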

  3. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods The data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  4. A discussion for integrating INSPIRE with volunteered geographic information (VGI) and the vision for a global spatial-based platform

    NASA Astrophysics Data System (ADS)

    Demetriou, Demetris; Campagna, Michele; Racetin, Ivana; Konecny, Milan

    2017-09-01

    INSPIRE is the EU's authoritative Spatial Data Infrastructure (SDI), in which each Member State provides access to its spatial data across a wide spectrum of data themes to support policy making. In contrast, Volunteered Geographic Information (VGI) is a type of user-generated geographic information in which volunteers use the web and mobile devices to create, assemble and disseminate spatial information. There are similarities and differences between SDIs and VGI initiatives, as well as advantages and disadvantages. Thus, the integration of these two data sources would enhance what is offered to end users, helping decision makers and the wider community to solve complex spatial problems, manage emergency situations and obtain useful information for people's daily activities. Although some efforts in this direction have arisen, several key issues need to be considered and resolved. Beyond this integration, the vision is the development of a global integrated GIS platform, which extends the capabilities of a typical data hub by embedding online spatial and non-spatial applications, to deliver both static and dynamic outputs to support planning and decision making. In this context, this paper discusses the challenges of integrating INSPIRE with VGI and outlines a generic framework for creating a global integrated web-based GIS platform. The tremendously fast evolution of Web and geospatial technologies suggests that this "super" global geo-system is not far away.

  5. High-End Scientific Computing

    EPA Pesticide Factsheets

    EPA uses high-end scientific computing, geospatial services and remote sensing/imagery analysis to support EPA's mission. The Center for Environmental Computing (CEC) assists the Agency's program offices and regions to meet staff needs in these areas.

  6. ASPECT (Airborne Spectral Photometric Environmental Collection Technology) Fact Sheet

    EPA Pesticide Factsheets

    This multi-sensor screening tool provides infrared and photographic images with geospatial, chemical, and radiological data within minutes to support emergency responses, home-land security missions, environmental surveys, and climate monitoring missions.

  7. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972) (RASOR: www.rasor-project.eu, grant number: 606888)
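
    To give a flavour of the kind of higher-level workflow described above, here is a minimal sketch that computes built-up area fraction from a classified raster using GDAL and NumPy. The class code (1 = built-up) and file name are assumptions; the SENSUM workflows are considerably richer than this.

```python
# Minimal sketch of one vulnerability proxy (built-up area extent) computed as the
# built-up fraction of a classified raster; assumptions: class 1 = built-up.
import numpy as np
from osgeo import gdal


def built_up_fraction(classified_raster: str, built_up_class: int = 1) -> float:
    """Fraction of valid pixels labelled as built-up in a classification raster."""
    ds = gdal.Open(classified_raster)
    band = ds.GetRasterBand(1)
    data = band.ReadAsArray()
    nodata = band.GetNoDataValue()

    valid = data != nodata if nodata is not None else np.ones(data.shape, dtype=bool)
    built = np.logical_and(valid, data == built_up_class)
    return float(built.sum()) / float(valid.sum())


if __name__ == "__main__":
    print(built_up_fraction("landcover_classification.tif"))  # hypothetical input file
```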

  8. Development and deployment of a water-crop-nutrient simulation model embedded in a web application

    NASA Astrophysics Data System (ADS)

    Langella, Giuliano; Basile, Angelo; Coppola, Antonio; Manna, Piero; Orefice, Nadia; Terribile, Fabio

    2016-04-01

    Scientific research on environmental and agricultural issues has long devoted considerable effort to the development and application of models for prediction and simulation in the spatial and temporal domains. This is done by studying and observing natural processes (e.g. rainfall, water and chemical transport in soils, crop growth) whose spatio-temporal behaviour can be reproduced, for instance, to predict irrigation and fertilizer requirements and yield quantity and quality. In this work a mechanistic model to simulate water flow and solute transport in the soil-plant-atmosphere continuum is presented. This desktop computer program was written according to the specific requirements of developing web applications. The model addresses the following issues together: (a) water balance and (b) solute transport; (c) crop modelling; (d) GIS interoperability; (e) embedability in web-based geospatial Decision Support Systems (DSS); (f) adaptability to different scales of application; and (g) ease of code modification. We maintained the desktop character of the program in order to further develop (e.g. integrate novel features) and run the key program modules for testing and validation purposes, but we also developed a middleware component that allows the model to run simulations directly over the web, without software to be installed. The GIS capabilities allow the web application to run simulations in a user-defined region of interest (delimited on a geographical map) without the need to specify the proper combination of model parameters. This is possible because the geospatial database collects information on pedology, climate, crop parameters and soil hydraulic characteristics. Pedological attributes include the spatial distribution of key soil data such as soil profile horizons and texture. Further, hydrological parameters are selected according to the knowledge of the spatial distribution of soils. The availability and definition of these attributes in the geospatial domain allows simulation outputs at different spatial scales. Two different applications were implemented using the same framework but with different configurations of the software components making up the physically based modelling chain: an irrigation tool simulating water requirements and their timing, and a fertilization tool for optimizing, in particular, mineral nitrogen additions.
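
    For orientation only, the following is a deliberately simple daily "bucket" water balance that illustrates the kind of state update an irrigation tool iterates. The actual model in the abstract solves mechanistic water flow and solute transport in the soil-plant-atmosphere continuum, which this sketch does not attempt; all parameter values are hypothetical.

```python
# A toy daily soil-water "bucket" balance, illustrative only: storage gains rain,
# loses evapotranspiration, and irrigation tops up whatever would go negative.
from typing import List, Tuple


def bucket_balance(rain_mm: List[float], et_mm: List[float],
                   capacity_mm: float = 120.0, initial_mm: float = 60.0
                   ) -> Tuple[List[float], List[float]]:
    """Return daily soil storage and the irrigation needed to avoid depletion."""
    storage, irrigation = [], []
    s = initial_mm
    for rain, et in zip(rain_mm, et_mm):
        s = s + rain - et                      # gains from rain, losses to evapotranspiration
        irr = max(0.0, -s)                     # top up whatever would have gone negative
        s = min(capacity_mm, s + irr)          # excess above capacity is assumed to drain
        storage.append(s)
        irrigation.append(irr)
    return storage, irrigation


store, irr = bucket_balance(rain_mm=[0, 12, 0, 0, 5], et_mm=[4, 4, 5, 5, 4])
print(store)  # daily storage trajectory (mm)
print(irr)    # daily irrigation requirement (mm)
```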

  9. Online Resources to Support Professional Development for Managing and Preserving Geospatial Data

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2013-12-01

    Improved capabilities of information and communication technologies (ICT) enable the development of new systems and applications for collecting, managing, disseminating, and using scientific data. New knowledge, skills, and techniques are also being developed to leverage these new ICT capabilities and improve scientific data management practices throughout the entire data lifecycle. In light of these developments and in response to increasing recognition of the wider value of scientific data for society, government agencies are requiring plans for the management, stewardship, and public dissemination of data and research products that are created by government-funded studies. Recognizing that data management and dissemination have not been part of traditional science education programs, new educational programs and learning resources are being developed to prepare new and practicing scientists, data scientists, data managers, and other data professionals with skills in data science and data management. Professional development and training programs also are being developed to address the need for scientists and professionals to improve their expertise in using the tools and techniques for managing and preserving scientific data. The Geospatial Data Preservation Resource Center offers an online catalog of various open access publications, open source tools, and freely available information for the management and stewardship of geospatial data and related resources, such as maps, GIS, and remote sensing data. Containing over 500 resources that can be found by type, topic, or search query, the geopreservation.org website enables discovery of various types of resources to improve capabilities for managing and preserving geospatial data. Applications and software tools can be found for use online or for download. Online journal articles, presentations, reports, blogs, and forums are also available through the website. Available education and training materials include tutorials, primers, guides, and online learning modules. The site enables users to find and access standards, real-world examples, and websites of other resources about geospatial data management. Quick links to lists of resources are available for data managers, system developers, and researchers. New resources are featured regularly to highlight current developments in practice and research. A user-centered approach was taken to design and develop the site iteratively, based on a survey of the expectations and needs of community members who have an interest in the management and preservation of geospatial data. Formative and summative evaluation activities have informed design, content, and feature enhancements to enable users to use the website efficiently and effectively. Continuing management and evaluation of the website keeps the content and the infrastructure current with evolving research, practices, and technology. The design, development, evaluation, and use of the website are described along with selected resources and activities that support education and professional development for the management, preservation, and stewardship of geospatial data.

  10. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is important for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacks real-time data retrieval and sharing/interoperation capability. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. A real-time GIS (Geographic Information System) data model and a Sensor Web service platform to realize environmental data management under the Geospatial Service Web framework are proposed in this study. The real-time GIS data model manages real-time data. The Sensor Web service platform is applied to support the realization of the real-time GIS data model based on Sensor Web technologies. To support the realization of the proposed real-time GIS data model, a Sensor Web service platform was implemented. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform were realized and demonstrated. The total processing times for the two experiments were 3.7 s and 9.2 s, respectively. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.

  11. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies such as NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using image server technology, are becoming widely used in earth science and other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster function capabilities can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.
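
    As a generic illustration of the "on-the-fly raster function" idea, the sketch below chains array-to-array functions lazily and evaluates them only when a tile is requested, so no intermediate rasters are persisted. It is not ArcGIS code, and the example functions and tile values are made up.

```python
# Illustrative lazy raster-function chaining (assumed design, not an ArcGIS API):
# functions are registered up front and applied only when a tile is rendered.
import numpy as np


class RasterFunctionChain:
    def __init__(self):
        self.functions = []

    def add(self, fn):
        """Register a numpy-array -> numpy-array function; nothing runs yet."""
        self.functions.append(fn)
        return self

    def render(self, tile: np.ndarray) -> np.ndarray:
        """Apply the whole chain only when a tile is actually requested."""
        out = tile
        for fn in self.functions:
            out = fn(out)
        return out


# Example chain: mask weak values, normalize, stretch to 0-255 for display.
chain = (RasterFunctionChain()
         .add(lambda a: np.where(a < 10, np.nan, a))                            # mask weak returns
         .add(lambda a: (a - np.nanmin(a)) / (np.nanmax(a) - np.nanmin(a) + 1e-9))
         .add(lambda a: np.nan_to_num(a * 255).astype(np.uint8)))               # display stretch

tile = np.random.uniform(0, 60, size=(256, 256))   # stand-in for a gridded forecast tile
print(chain.render(tile).dtype, chain.render(tile).shape)
```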

  12. ClimatePipes: User-Friendly Data Access, Manipulation, Analysis & Visualization of Community Climate Models

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; DeMarle, D.; Burnett, B.; Harris, C.; Silva, W.; Osmari, D.; Geveci, B.; Silva, C.; Doutriaux, C.; Williams, D. N.

    2013-12-01

    The impact of climate change will resonate through a broad range of fields, including public health, infrastructure, water resources, and many others. Long-term coordinated planning, funding, and action are required for climate change adaptation and mitigation. Unfortunately, widespread use of climate data (simulated and observed) in non-climate-science communities is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. We present ClimatePipes to address many of these challenges by creating an open source platform that provides state-of-the-art, user-friendly data access, analysis, and visualization for climate and other relevant geospatial datasets, making climate data available to non-researchers, decision-makers, and other stakeholders. The overarching goals of ClimatePipes are: - Enable users to explore real-world questions related to climate change. - Provide tools for data access, analysis, and visualization. - Facilitate collaboration by enabling users to share datasets, workflows, and visualizations. ClimatePipes uses a web-based application platform for its widespread support on mainstream operating systems, ease of use, and inherent collaboration support. The front-end of ClimatePipes uses HTML5 (WebGL, Canvas2D, CSS3) to deliver state-of-the-art visualization and provide a best-in-class user experience. The back-end of ClimatePipes is built around Python using the Visualization Toolkit (VTK, http://vtk.org), Climate Data Analysis Tools (CDAT, http://uv-cdat.llnl.gov), and other climate and geospatial data processing tools such as GDAL and PROJ4. The ClimatePipes web interface queries and accesses data from remote sources (such as ESGF); a typical view overlays a climate data layer from ESGF on a map layer from OpenStreetMap. The ClimatePipes workflow editor provides flexibility and fine-grained control, and uses the VisTrails (http://www.vistrails.org) workflow engine in the backend.

  13. Planning and Management of Real-Time Geospatial UAS Missions Within a Virtual Globe Environment

    NASA Astrophysics Data System (ADS)

    Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.

    2011-09-01

    This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the virtual globe technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.

  14. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies and approval processes for migrating federal geospatial data, information, and applications into the cloud, and cost estimation for cloud operations, are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists to consider in the adoption of cloud computing.

  15. A resource-oriented architecture for a Geospatial Web

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Nativi, Stefano

    2010-05-01

    In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful in enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web, it is necessary that its architecture satisfy all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all geospatial resources must be accessed through the same interface; moreover, according to the REST style, this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proved flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions which are meaningful for all of them. For example, for geospatial resources, subsetting, resampling, interpolation and coordinate reference system transformation functionalities are candidates for a uniform interface.
However, an investigation is needed to clarify the semantics of those actions for different resources, and consequently whether they can really take on the role of generic interface operations. Concerning point a) (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This allows citation and re-use of resources to be implemented simply by providing the URI. OPeNDAP and KVP encodings of OGC data access service specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured, with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and formats. This is what the Web designers actually did in choosing to define a common format for hypermedia (HTML), even though the underlying protocol is generic. Concerning point c) (self-descriptive messages), the exchanged messages should describe themselves and their content. This would not be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point d), hypermedia as the engine of application state, is actually where the Geospatial Web would differ most from existing geospatial information sharing systems. Existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would allow defining not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would allow building an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier to accessing geospatial applications for non-specialists (cf. the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated/replicated in the Geospatial Web. The main drawbacks would be the following: • The Uniform Interface simplifies the overall system architecture (e.g. no service registry or service descriptors are required), but moves the complexity to the data representation. Moreover, since the interface must stay generic, it ends up being very simple, and therefore complex interactions would require several transfers. • In the geospatial domain, some of the most valuable resources are processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use cases not covering all possible applications.
The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
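
    To make the uniform-interface idea tangible, here is a minimal sketch in which every geospatial resource is addressed by an identifier and manipulated through the same small set of generic operations, with bounding-box subsetting added as one of the candidate geospatial generics mentioned above. The store class, URN scheme, and point-set representation are hypothetical illustrations, not a complete design.

```python
# Minimal sketch of a uniform CRUD + subset interface over identified geospatial
# resources (illustrative only; names and representations are assumptions).
from typing import Dict, List, Tuple

BBox = Tuple[float, float, float, float]          # (min_lon, min_lat, max_lon, max_lat)
Point = Tuple[float, float, float]                # (lon, lat, value)


class GeospatialResourceStore:
    """Every resource gets an identifier and the same generic operations."""

    def __init__(self):
        self._resources: Dict[str, List[Point]] = {}

    def create(self, uri: str, representation: List[Point]) -> None:
        self._resources[uri] = list(representation)

    def retrieve(self, uri: str) -> List[Point]:
        return list(self._resources[uri])

    def update(self, uri: str, representation: List[Point]) -> None:
        self._resources[uri] = list(representation)

    def delete(self, uri: str) -> None:
        del self._resources[uri]

    def subset(self, uri: str, bbox: BBox) -> List[Point]:
        """Candidate geospatial generic: retrieve only the part inside a bounding box."""
        xmin, ymin, xmax, ymax = bbox
        return [(x, y, v) for (x, y, v) in self._resources[uri]
                if xmin <= x <= xmax and ymin <= y <= ymax]


store = GeospatialResourceStore()
store.create("urn:example:sst:2006-07-01", [(10.0, 43.5, 24.1), (-30.0, 40.0, 22.7)])
print(store.subset("urn:example:sst:2006-07-01", (0.0, 35.0, 20.0, 50.0)))
```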

  16. Modeling and formal representation of geospatial knowledge for the Geospatial Semantic Web

    NASA Astrophysics Data System (ADS)

    Huang, Hong; Gong, Jianya

    2008-12-01

    GML can only achieve geospatial interoperation at the syntactic level. However, in most cases it is necessary to resolve differences in spatial cognition first, so ontologies were introduced to describe geospatial information and services. But it is obviously difficult and inappropriate to ask users to find, match and compose services themselves, especially when complicated business logic is involved. Currently, with the gradual introduction of Semantic Web technology (e.g., OWL, SWRL), the focus of the interoperation of geospatial information has shifted from the syntactic level to the semantic and even automatic, intelligent level. In this way, the Geospatial Semantic Web (GSM) can be put forward as an augmentation to the Semantic Web that additionally includes geospatial abstractions as well as related reasoning, representation and query mechanisms. To advance the implementation of the GSM, we first attempt to construct mechanisms for modeling and formally representing geospatial knowledge, which are also two of the most foundational phases in knowledge engineering (KE). Our attitude in this paper is quite pragmatic: we argue that geospatial context is a formal model of the distinguishing environmental characteristics of geospatial knowledge, and that the derivation, understanding and use of geospatial knowledge are situated in geospatial context. Therefore, first, we put forward a primitive hierarchy of geospatial knowledge referencing first-order logic, formal ontologies, rules and GML. Second, a metamodel of geospatial context is proposed, and we use the modeling methods and representation languages of formal ontologies to process geospatial context. Third, we extend the Web Processing Service (WPS) to be compatible with local DLLs for geoprocessing and to possess inference capability based on OWL.

  17. Spatial information semantic query based on SPARQL

    NASA Astrophysics Data System (ADS)

    Xiao, Zhifeng; Huang, Lei; Zhai, Xiaofang

    2009-10-01

    How can the efficiency of spatial information queries be enhanced in today's fast-growing information age? We are rich in geospatial data but poor in up-to-date geospatial information and knowledge that are ready to be accessed by public users. This paper adopts an approach for querying spatial semantics by building an ontology in Web Ontology Language (OWL) format and introducing the SPARQL Protocol and RDF Query Language (SPARQL) to search spatial semantic relations. It is important to establish spatial semantics that support effective spatial reasoning for performing semantic queries. Compared to earlier keyword-based and information retrieval techniques that rely on syntax, we use semantic approaches in our spatial query system. Semantic approaches need to be supported by an ontology, so we use OWL to describe spatial information extracted from the large-scale map of Wuhan. Spatial information expressed by an ontology with formal semantics is available to machines for processing and to people for understanding. The approach is illustrated by a case study using SPARQL to query geospatial ontology instances of Wuhan. The paper shows that using SPARQL to search OWL ontology instances can ensure the accuracy and applicability of the results. The results also indicate that constructing a geospatial semantic query system has a positive effect on spatial query and retrieval.
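
    The following is a minimal, self-contained sketch of querying spatial semantic relations with SPARQL via the rdflib Python library. The tiny ontology, the example namespace, and the "adjacentTo" relation are hypothetical stand-ins for the Wuhan ontology described above.

```python
# Minimal sketch of a SPARQL query over a toy spatial ontology (hypothetical
# namespace and relation; not the Wuhan ontology itself).
import rdflib

ttl = """
@prefix ex: <http://example.org/geo#> .
ex:WuchangDistrict  a ex:District ; ex:adjacentTo ex:HongshanDistrict .
ex:HongshanDistrict a ex:District .
"""

g = rdflib.Graph()
g.parse(data=ttl, format="turtle")   # load the toy ontology instances

query = """
PREFIX ex: <http://example.org/geo#>
SELECT ?neighbour WHERE {
    ex:WuchangDistrict ex:adjacentTo ?neighbour .
}
"""

for row in g.query(query):
    print(row.neighbour)   # -> http://example.org/geo#HongshanDistrict
```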

  18. Application of the AMBUR R package for spatio-temporal analysis of shoreline change: Jekyll Island, Georgia, USA

    NASA Astrophysics Data System (ADS)

    Jackson, Chester W.; Alexander, Clark R.; Bush, David M.

    2012-04-01

    The AMBUR (Analyzing Moving Boundaries Using R) package for the R software environment provides a collection of functions for assisting with analyzing and visualizing historical shoreline change. The package allows import and export of geospatial data in ESRI shapefile format, which is compatible with most commercial and open-source GIS software. The "baseline and transect" method is the primary technique used to quantify distances and rates of shoreline movement, and to detect classification changes across time. Along with the traditional "perpendicular" transect method, two new transect methods, "near" and "filtered," assist with quantifying changes along curved shorelines that are problematic for perpendicular transect methods. Output from the analyses includes data tables, graphics, and geospatial data, which are useful in rapidly assessing trends and potential errors in the dataset. A forecasting function also allows the user to estimate the future location of the shoreline and store the results in a shapefile. Other utilities and tools provided in the package assist with preparing and manipulating geospatial data, error checking, and generating supporting graphics and shapefiles. The package can be customized to perform additional statistical, graphical, and geospatial functions, and it is capable of analyzing the movement of any boundary (e.g., shorelines, glacier termini, fire edges, and marine and terrestrial ecozones).
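
    The rate calculation that underlies the baseline-and-transect method can be illustrated with a simplified end-point-rate sketch: sample stations along a baseline and compare distances to two dated shorelines. AMBUR's actual transect casting ("perpendicular", "near", "filtered") is more sophisticated than the nearest-distance shortcut used here, and the geometries and dates below are made up.

```python
# Simplified end-point-rate (EPR) sketch using shapely; illustrative only, not
# AMBUR's transect-casting algorithm. Geometries and dates are hypothetical.
from shapely.geometry import LineString

baseline = LineString([(0, 0), (1000, 0)])
shoreline_1974 = LineString([(0, 80), (1000, 95)])
shoreline_2004 = LineString([(0, 60), (1000, 90)])
years = 2004 - 1974

stations = [baseline.interpolate(d) for d in range(0, 1001, 250)]   # a station every 250 m
for station in stations:
    d_old = station.distance(shoreline_1974)
    d_new = station.distance(shoreline_2004)
    epr = (d_new - d_old) / years            # negative -> shoreline moved toward baseline (erosion)
    print(f"station at {station.x:6.1f} m: end-point rate = {epr:+.2f} m/yr")
```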

  19. A spatial information crawler for OpenGIS WFS

    NASA Astrophysics Data System (ADS)

    Jiang, Jun; Yang, Chong-jun; Ren, Ying-chao

    2008-10-01

    The growth of the internet makes it non-trivial to search for accurate information efficiently. Topical crawlers, which target a particular domain, are attracting more and more attention because they can help people find what they need. Furthermore, with the OpenGIS WFS (Web Feature Service) Specification developed by the OGC (Open GIS Consortium), many more geospatial data providers have adopted this protocol to publish their data on the internet. In this case, a crawler aimed at WFS servers can help people find geospatial data on those servers. In this paper, we propose a prototype system for a WFS crawler based on the OpenGIS WFS Specification. The crawler architecture, working principles, and the detailed function of each component are introduced. This crawler is capable of discovering WFS servers dynamically and of saving and updating the service contents of the servers. The data collected by the crawler can be supplied to a geospatial data search engine as its data source.
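
    A minimal version of the discovery step such a crawler performs is to probe a candidate URL with a standard WFS GetCapabilities request and record it as a WFS endpoint if the response looks like a capabilities document. The candidate URLs below are placeholders, and the check is intentionally crude compared to a full crawler.

```python
# Minimal sketch of WFS endpoint discovery via GetCapabilities probing
# (placeholder URLs; a real crawler would also parse the capabilities document).
import requests

CANDIDATES = [
    "http://example.org/geoserver/ows",       # placeholder endpoints
    "http://example.org/not-a-wfs",
]


def probe_wfs(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers a WFS GetCapabilities request."""
    params = {"service": "WFS", "request": "GetCapabilities"}
    try:
        resp = requests.get(url, params=params, timeout=timeout)
    except requests.RequestException:
        return False
    return resp.ok and "WFS_Capabilities" in resp.text


discovered = [u for u in CANDIDATES if probe_wfs(u)]
print(discovered)   # feed these, plus their parsed feature types, to the search index
```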

  20. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming a norm in geoscience domains. A platform that can efficiently manage, access, analyze, mine, and learn from big data to produce new information and knowledge is desired. This paper introduces our latest effort to develop such a platform, based on our past years' experience with cloud and high-performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with appropriate network, computer, and storage systems; b) the second layer is a cloud computing layer, based on virtualization, that provides on-demand computing services to the upper layers; c) the third layer consists of big data containers customized for dealing with different types of data and functionalities; d) the fourth layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.

  1. A geospatial assessment of mini/small hydropower potential in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Korkovelos, Alexandros; Mentis, Dimitrios; Hussain Siyal, Shahid; Arderne, Christopher; Beck, Hylke; de Roo, Ad; Howells, Mark

    2017-04-01

    Sub-Saharan Africa has been the epicenter of ongoing global dialogues around energy poverty and justifiably so. More than half of the world's unserved population lives there. At the same time, a big part of the continent is privileged with plentiful renewable energy resources. Hydropower is one of them and to a large extent it remains untapped. This study focuses on the technical assessment of small-scale hydropower (0.01-10 MW) in Sub-Saharan Africa. The underlying methodology was based on open source geospatial datasets, whose combination allowed a consistent evaluation of 712,615 km of river network spanning over 44 countries. Environmental, topological and social constraints were included in the form of geospatial restrictions to help preserve the natural wealth and promote sustainable development. The results revealed that small-scale hydropower could cover 8.5-12.5% of the estimated electricity demand in 2030, thus making it a viable option to support electrification efforts in the region.
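
    The core arithmetic behind screening a river reach for hydropower potential is the standard relation P = rho * g * Q * H * eta. The sketch below applies it with hypothetical discharge, head, and efficiency values; the study itself derives these inputs from geospatial runoff and elevation datasets rather than fixed numbers.

```python
# Hydropower screening arithmetic: P = rho * g * Q * H * eta.
# Example inputs are hypothetical; the study derives them from geospatial data.

RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2


def hydropower_potential_mw(discharge_m3_s: float, head_m: float, efficiency: float = 0.6) -> float:
    """Potential capacity of a river reach in megawatts."""
    watts = RHO_WATER * G * discharge_m3_s * head_m * efficiency
    return watts / 1e6


# Example reach: 4 m^3/s of usable flow over a 25 m head at 60% overall efficiency.
p = hydropower_potential_mw(4.0, 25.0)
print(f"{p:.2f} MW")   # ~0.59 MW -> within the study's 0.01-10 MW small-scale band
```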

  2. EPA Geospatial Quality Council Promoting Quality Assurance in the Geospatial Community

    EPA Science Inventory

    After establishing a foundation for the EPA National Geospatial Program, the EPA Geospatial Quality Council (GQC) is, in part, focusing on improving administrative efficiency in the geospatial community. To realize this goal, the GQC is developing Standard Operating Procedures (S...

  3. DECATASTROPHIZE - Use of SDSS and MCDA to prepare for disasters or plan for multiple hazards

    NASA Astrophysics Data System (ADS)

    Damalas, Andreas; Mettas, Christodoulos; Evagorou, Evagoras; Hadjimitsis, Diofantos

    2017-04-01

    This project presents effective early-warning and alert systems to safeguard lives and protect citizens, property and the environment from natural and man-made disasters. Civil protection can benefit from well-developed analysis tools for managing the resources available at all levels of the organization. The use of Geo-Spatial Early-warning Decision Support Systems (GE-DSS), combined with integrated Geographic Information System (GIS) solutions and multi-criteria decision analysis (MCDA), fuses textual and geographic information into a single view. DECAT's purpose is to use GE-DSS to build a rapid and sustainable capability to assess and respond to multiple natural and man-made hazards, disasters and environmental conditions. This will be achieved by bringing existing models and systems together into one distributed and integrated multi-platform framework, known as DECAT. The project is expected to create better preconditions for, and to improve the preparedness and awareness of, civil protection, natural hazard and marine pollution professionals and volunteers. It intends to support and balance the efforts of the participating states to protect citizens, the environment and property from natural and man-made disasters. Moreover, the project underlines the importance of exchanging information and experience in order to improve the operations of all parties involved in civil protection (private and public professionals and volunteers). DECATASTROPHIZE also aims to support EU coordination with countries that do not participate in the Union Civil Protection Mechanism, including European Neighbourhood Policy countries, in the context of disaster preparedness; enhancing their cooperation within the Union Civil Protection Mechanism is likewise of high importance.

  4. Using Google Earth in Marine Research and Operational Decision Support

    NASA Astrophysics Data System (ADS)

    Blower, J. D.; Bretherton, D.; Haines, K.; Liu, C.; Rawlings, C.; Santokhee, A.; Smith, I.

    2006-12-01

    A key advantage of Virtual Globes ("geobrowsers") such as Google Earth is that they can display many different geospatial data types at a huge range of spatial scales. In this demonstration and poster display we shall show how marine data from disparate sources can be brought together in a geobrowser in order to support both scientific research and operational search and rescue activities. We have developed the Godiva2 interactive website for browsing and exploring marine data, mainly output from supercomputer analyses and predictions of ocean circulation. The user chooses a number of parameters (e.g. sea temperature at 100m depth on 1st July 2006) and can load an image of the resulting data in Google Earth. Through the use of an automatically-refreshing NetworkLink the user can explore the whole globe at a very large range of spatial scales: the displayed data will automatically be refreshed to show data at increasingly fine resolution as the user zooms in. This is a valuable research tool for exploring these terabyte-scale datasets. Many coastguard organizations around the world use SARIS, a software application produced by BMT Cordah Ltd., to predict the drift pattern of objects in the sea in order to support search and rescue operations. Different drifting objects have different trajectories depending on factors such as their buoyancy and windage and so a computer model, supported by meteorological and oceanographic data, is needed to help rescuers locate their targets. We shall demonstrate how Google Earth is used to display output from the SARIS model (including the search target location and associated error polygon) alongside meteorological data (wind vectors) and oceanographic data (sea temperature, surface currents) from Godiva2 in order to support decision-making. We shall also discuss the limitations of using Google Earth in this context: these include the difficulties of working with time-dependent data and the need to access data securely. essc.ac.uk:8080/Godiva2
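
    The self-refreshing NetworkLink mechanism mentioned above can be sketched with a small Python helper that emits the KML; the data-service URL and refresh settings are illustrative assumptions, not the Godiva2 or SARIS configuration.

      def network_link_kml(name, data_url, refresh_seconds=60):
          """Return a KML document whose NetworkLink periodically re-fetches
          data_url, so the displayed layer stays current as the user pans and zooms."""
          return f"""<?xml version="1.0" encoding="UTF-8"?>
      <kml xmlns="http://www.opengis.net/kml/2.2">
        <NetworkLink>
          <name>{name}</name>
          <Link>
            <href>{data_url}</href>
            <refreshMode>onInterval</refreshMode>
            <refreshInterval>{refresh_seconds}</refreshInterval>
            <viewRefreshMode>onStop</viewRefreshMode>
            <viewRefreshTime>2</viewRefreshTime>
          </Link>
        </NetworkLink>
      </kml>"""

      # Hypothetical usage: a launch file that keeps sea-temperature imagery fresh.
      with open("sst_link.kml", "w") as f:
          f.write(network_link_kml("Sea temperature at 100 m",
                                   "http://example.org/godiva2/wms?layer=sst_100m"))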

  5. Predicting breeding habitat for amphibians: a spatiotemporal analysis across Yellowstone National Park

    USGS Publications Warehouse

    Bartelt, Paul E.; Gallant, Alisa L.; Klaver, Robert W.; Wright, Christopher K.; Patla, Debra A.; Peterson, Charles R.

    2011-01-01

    The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from 1) field survey site data only, 2) field data combined with data from geospatial models, and 3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 - 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders; 971-3017 ha for western toads; 4732-16 696 ha for boreal chorus frogs; 4940-19 690 ha for Columbia spotted frogs.
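
    Polytomous (multinomial) logistic regression with ROC-based evaluation, as used above, can be sketched with scikit-learn; the predictors and data below are random placeholders, not the study's survey data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # Placeholder table: rows = survey sites, columns = geospatial predictors,
      # y = breeding-evidence class (e.g. 0 = absent, 1 = present, 2 = breeding).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(497, 5))        # e.g. elevation, CTI, landform, wetland prob., cover
      y = rng.integers(0, 3, size=497)

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      model = LogisticRegression(multi_class="multinomial", max_iter=1000)
      model.fit(X_train, y_train)

      # One-vs-rest ROC AUC summarizes how well the class probabilities rank the sites
      # (random placeholder data will hover around 0.5).
      auc = roc_auc_score(y_test, model.predict_proba(X_test), multi_class="ovr")
      print(f"ROC AUC: {auc:.2f}")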

  6. Predicting breeding habitat for amphibians: A spatiotemporal analysis across Yellowstone National Park

    USGS Publications Warehouse

    Bartelt, Paul E.; Gallant, Alisa L.; Klaver, Robert W.; Wright, C.K.; Patla, Debra A.; Peterson, Charles R.

    2011-01-01

    The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from (1) field survey site data only, (2) field data combined with data from geospatial models, and (3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 to 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders, 971-3017 ha for western toads, 4732-16 696 ha for boreal chorus frogs, and 4940-19 690 ha for Columbia spotted frogs. © 2011 by the Ecological Society of America.

  7. A Geospatial Database for Wind and Solar Energy Applications: The Kingdom of Bahrain Study Case

    NASA Astrophysics Data System (ADS)

    Al-Joburi, Khalil; Dahman, Nidal

    2017-11-01

    This research is aimed at designing, implementing, and testing a geospatial database for wind and solar energy applications in the Kingdom of Bahrain. All decision making needed to determine economic feasibility and establish site locations for wind turbines or solar panels depends primarily on geospatial feature theme information and non-spatial (attribute) data for wind, solar, rainfall, temperature and weather characteristics of a particular region. Spatial data include, but are not limited to, digital elevation, slopes, land use, zoning, parks, population density, road and utility maps, and other related information. Digital elevations for over 450,000 spots at 50 m horizontal resolution, plus field surveying and GPS measurements at selected locations, were obtained from the Surveying and Land Registration Bureau (SLRB). Road, utility, and population density data were obtained from the Central Information Organization (CIO). Land use zoning, recreational parks, and other data were obtained from the Ministry of Municipalities and Agricultural Affairs. Wind, solar, humidity, rainfall, and temperature data were obtained from the Ministry of Transportation, Civil Aviation Section. Landsat satellite and other images were obtained from NASA and online sources. The collected geospatial data were geo-referenced to Ain el-Abd UTM Zone 39 North. A 3D Digital Elevation Model (DEM) at 50 m spatial resolution was created using the SLRB spot elevations. Slope and aspect maps were generated from the DEM. Supervised image classification to identify open spaces was performed using the satellite images. Other geospatial data were converted to raster format with the same cell resolution. Non-spatial data were entered as attributes of the spatial features. To eliminate ambiguous solutions, a multi-criteria GIS model was developed based on both vector data (discrete point, line, and polygon representations) and a raster model (continuous representation). The model was tested at the proposed Al-Areen project, a relatively small area (15 km2). Optimum spatial locations for wind turbines and solar panels were determined, and initial results indicate that the combination of wind and solar energy would be sufficient for the project to meet the energy demand at the present per capita consumption rate.
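
    Deriving slope and aspect rasters from the DEM, as described above, is a one-call operation in the GDAL Python bindings; a minimal sketch, with hypothetical file names rather than the project's actual data.

      from osgeo import gdal

      gdal.UseExceptions()

      dem_path = "bahrain_dem_50m.tif"  # hypothetical 50 m DEM in GeoTIFF form

      # gdal.DEMProcessing wraps the gdaldem utility: the "slope" and "aspect" modes
      # produce per-cell terrain derivatives suitable for multi-criteria GIS models.
      gdal.DEMProcessing("slope.tif", dem_path, "slope", slopeFormat="degree")
      gdal.DEMProcessing("aspect.tif", dem_path, "aspect")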

  8. Predicting breeding habitat for amphibians: a spatiotemporal analysis across Yellowstone National Park.

    PubMed

    Bartelt, Paul E; Gallant, Alisa L; Klaver, Robert W; Wright, Chris K; Patla, Debra A; Peterson, Charles R

    2011-10-01

    The ability to predict amphibian breeding across landscapes is important for informing land management decisions and helping biologists better understand and remediate factors contributing to declines in amphibian populations. We built geospatial models of likely breeding habitats for each of four amphibian species that breed in Yellowstone National Park (YNP). We used field data collected in 2000-2002 from 497 sites among 16 basins and predictor variables from geospatial models produced from remotely sensed data (e.g., digital elevation model, complex topographic index, landform data, wetland probability, and vegetative cover). Except for 31 sites in one basin that were surveyed in both 2000 and 2002, all sites were surveyed once. We used polytomous regression to build statistical models for each species of amphibian from (1) field survey site data only, (2) field data combined with data from geospatial models, and (3) data from geospatial models only. Based on measures of receiver operating characteristic (ROC) scores, models of the second type best explained likely breeding habitat because they contained the most information (ROC values ranged from 0.70 to 0.88). However, models of the third type could be applied to the entire YNP landscape and produced maps that could be verified with reserve field data. Accuracy rates for models built for single years were highly variable, ranging from 0.30 to 0.78. Accuracy rates for models built with data combined from multiple years were higher and less variable, ranging from 0.60 to 0.80. Combining results from the geospatial multiyear models yielded maps of "core" breeding areas (areas with high probability values for all three years) surrounded by areas that scored high for only one or two years, providing an estimate of variability among years. Such information can highlight landscape options for amphibian conservation. For example, our models identify alternative areas that could be protected for each species, including 6828-10 764 ha for tiger salamanders, 971-3017 ha for western toads, 4732-16 696 ha for boreal chorus frogs, and 4940-19 690 ha for Columbia spotted frogs.

  9. An on-demand provision model for geospatial multisource information with active self-adaption services

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Li, Huan

    2015-12-01

    Location-related data are playing an increasingly irreplaceable role in business, government and scientific research. At the same time, the amount and the variety of data are rapidly increasing. It is a challenge to quickly find the required information in this rapidly growing volume of data, and to efficiently provide different levels of geospatial data to users. This paper puts forward a data-oriented access model for geographic information science data. First, we analyze the features of GIS data, including traditional types such as vector and raster data and new types such as Volunteered Geographic Information (VGI). Based on these analyses, a classification scheme for geographic data is proposed and TRAFIE is introduced to describe the establishment of a multi-level model for geographic data. Based on this model, a multi-level, scalable access system for geospatial information is put forward. Users can select different levels of data according to their concrete application needs. Pull-based and push-based data access mechanisms based on this model are presented. A Service Oriented Architecture (SOA) was chosen for the data processing. The model is demonstrated through a simulated fire-disaster data-collection scenario supporting the decision-making processes of government departments. The use case shows that the data model and the data provision system are flexible and have good adaptability.

  10. The Value of Information and Geospatial Technologies for the analysis of tidal current patterns in the Guanabara Bay (Rio de Janeiro)

    NASA Astrophysics Data System (ADS)

    Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime

    2016-04-01

    The study and validation of tidal current patterns rely on the combination of several data sources such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. Assessing the accuracy and reliability of the produced patterns and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide availability of geospatial equipment such as GPS, current drifters and aerial photogrammetry makes it possible to collect field data using mobile and portable devices with relatively limited effort in terms of time and economic resources. These real-time measurements are essential for validating the models and specifically for assessing the skill of the model during critical environmental conditions. Moreover, the considerable development of remote sensing technologies, cartographic services and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This contribution of information and geospatial technologies can benefit many decision-makers, including high-level athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not accustomed to numerical representations of the data. Therefore, the integration of data collected in the field into a GIS allows an immediate visualization of the performed analyses on geographic maps. This visualization is a particularly effective way to communicate current pattern assessment results and the uncertainty in the information, leading to increased confidence in the forecast. The aim of this paper is to present the methodology set up in collaboration with the Austrian Sailing Federation for the study of tidal current patterns in Guanabara Bay, venue of the sailing competitions of the Rio 2016 Olympic Games. The methodology relies on the integration into a GIS of a substantial amount of data collected in the field, hydrodynamic model output, cartography and "key signs" visible on the water, proving to be particularly useful for simplifying the final information, supporting the learning process and improving decision-making.

  11. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium's (OGC) Catalogue Service - Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, as well as transparent access to replica data and related services in a Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and draws on geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services that provide functions such as subsetting, reformatting and reprojection. This work facilitates the sharing and interoperation of geospatial resources in a Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science rather than on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  12. 78 FR 57455 - Pipeline Safety: Information Collection Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... ``. . . system-specific information, including pipe diameter, operating pressure, product transported, and...) must provide contact information and geospatial data on their pipeline system. This information should... Mapping System (NPMS) to support various regulatory programs, pipeline inspections, and authorized...

  13. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application is continually issuing data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets, has been developed. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level of detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset. The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics, and yields significant improvements in user-interactive geospatial client and data server interaction and in the associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data is structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
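
    The quadtree level-of-detail behaviour described above can be sketched in a few lines of Python; the region representation below is an illustrative assumption, not the method's actual data structure.

      from dataclasses import dataclass

      @dataclass
      class Region:
          """A geographic bounding box at a given level of detail (LOD)."""
          west: float
          south: float
          east: float
          north: float
          lod: int

      def subdivide(r: Region):
          """Split a region into its four child quadrants at the next LOD,
          mirroring how a zoom-in replaces one tile with four finer tiles."""
          mid_lon = (r.west + r.east) / 2.0
          mid_lat = (r.south + r.north) / 2.0
          return [
              Region(r.west, mid_lat, mid_lon, r.north, r.lod + 1),  # NW
              Region(mid_lon, mid_lat, r.east, r.north, r.lod + 1),  # NE
              Region(r.west, r.south, mid_lon, mid_lat, r.lod + 1),  # SW
              Region(mid_lon, r.south, r.east, mid_lat, r.lod + 1),  # SE
          ]

      # Hypothetical usage: the whole globe at LOD 0 becomes four LOD-1 quadrants,
      # each of which would be advertised to the client through generated KML.
      children = subdivide(Region(-180.0, -90.0, 180.0, 90.0, 0))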

  14. Merging climate and multi-sensor time-series data in real-time drought monitoring across the U.S.A.

    USGS Publications Warehouse

    Brown, Jesslyn F.; Miura, T.; Wardlow, B.; Gu, Yingxin

    2011-01-01

    Droughts occur repeatedly in the United States, resulting in billions of dollars of damage. Monitoring and reporting on drought conditions is a necessary function of government agencies at multiple levels. A team of Federal and university partners developed a drought decision-support tool with higher spatial resolution relative to traditional climate-based drought maps. The Vegetation Drought Response Index (VegDRI) indicates general canopy vegetation condition through the assimilation of climate, satellite, and biophysical data via geospatial modeling. In VegDRI, complementary drought-related data are merged to provide a comprehensive, detailed representation of drought stress on vegetation. Time-series data from daily polar-orbiting earth observing systems [Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS)] providing global measurements of land surface conditions are ingested into VegDRI. Inter-sensor compatibility is required to extend multi-sensor data records; thus, translations were developed using overlapping observations to create consistent, long-term data time series.

  15. A conceptual prototype for the next-generation national elevation dataset

    USGS Publications Warehouse

    Stoker, Jason M.; Heidemann, Hans Karl; Evans, Gayla A.; Greenlee, Susan K.

    2013-01-01

    In 2012 the U.S. Geological Survey's (USGS) National Geospatial Program (NGP) funded a study to develop a conceptual prototype for a new National Elevation Dataset (NED) design with expanded capabilities to generate and deliver a suite of bare earth and above ground feature information over the United States. This report details the research on identifying operational requirements based on prior research, evaluation of what is needed for the USGS to meet these requirements, and development of a possible conceptual framework that could potentially deliver the kinds of information that are needed to support NGP's partners and constituents. This report provides an initial proof-of-concept demonstration using an existing dataset, and recommendations for the future, to inform NGP's ongoing and future elevation program planning and management decisions. The demonstration shows that this type of functional process can robustly create derivatives from lidar point cloud data; however, more research needs to be done to see how well it extends to multiple datasets.

  16. Optimization of Land Use Suitability for Agriculture Using Integrated Geospatial Model and Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Mansor, S. B.; Pormanafi, S.; Mahmud, A. R. B.; Pirasteh, S.

    2012-08-01

    In this study, a geospatial model for land use allocation was developed from the perspective of simulating biological autonomous adaptability to the environment and infrastructural preference. The model was developed based on a multi-agent genetic algorithm and was customized to accommodate the constraints set for the study area, namely resource saving and environmental friendliness. The model was then applied to solve practical multi-objective spatial optimization problems of land use allocation in the core region of the Menderjan Basin in Iran. The first task was to study the dominant crops and the economic suitability evaluation of the land. The second task was to determine the fitness function for the genetic algorithm. The third objective was to optimize the land use map with respect to economic benefits. The results indicate that the proposed model performs much better in solving complex multi-objective spatial optimization allocation problems and is a promising method for generating land use alternatives for further consideration in spatial decision-making.
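
    As a hedged illustration of the genetic-algorithm machinery (not the paper's multi-agent formulation), the following minimal sketch evolves a grid-based land use allocation against a placeholder fitness function; the grid size, land use codes, benefits and constraint are all illustrative assumptions.

      import random

      N_CELLS = 100          # hypothetical grid flattened to a 1-D chromosome
      LAND_USES = [0, 1, 2]  # e.g. 0 = crop A, 1 = crop B, 2 = conservation
      BENEFIT = {0: 3.0, 1: 5.0, 2: 1.0}   # placeholder per-cell economic benefits

      def fitness(chromosome):
          """Placeholder objective: total economic benefit, penalizing plans
          that reserve less than 20% of cells for conservation."""
          benefit = sum(BENEFIT[u] for u in chromosome)
          conservation_share = chromosome.count(2) / len(chromosome)
          return benefit - (100.0 if conservation_share < 0.2 else 0.0)

      def crossover(a, b):
          cut = random.randrange(1, len(a))
          return a[:cut] + b[cut:]

      def mutate(chromosome, rate=0.02):
          return [random.choice(LAND_USES) if random.random() < rate else u
                  for u in chromosome]

      population = [[random.choice(LAND_USES) for _ in range(N_CELLS)]
                    for _ in range(50)]
      for generation in range(200):
          population.sort(key=fitness, reverse=True)
          parents = population[:10]   # elitist selection
          offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                       for _ in range(len(population) - len(parents))]
          population = parents + offspring

      best = max(population, key=fitness)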

  17. Comparing children's GPS tracks with geospatial proxies for exposure to junk food.

    PubMed

    Sadler, Richard C; Gilliland, Jason A

    2015-01-01

    Various geospatial techniques have been employed to estimate children's exposure to environmental cardiometabolic risk factors, including junk food. But many studies uncritically rely on exposure proxies which differ greatly from actual exposure. Misrepresentation of exposure by researchers could lead to poor decisions and ineffective policymaking. This study conducts a GIS-based analysis of GPS tracks--'activity spaces'--and 21 proxies for activity spaces (e.g. buffers, container approaches) for a sample of 526 children (ages 9-14) in London, Ontario, Canada. These measures are combined with a validated food environment database (including fast food and convenience stores) to create a series of junk food exposure estimates and quantify the errors resulting from use of different proxy methods. Results indicate that exposure proxies consistently underestimate exposure to junk foods by as much as 68%. This underestimation is important to policy development because children are exposed to more junk food than estimated using typical methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
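
    The gap between a simple home buffer and a GPS-derived activity space can be illustrated with a few lines of Shapely; the coordinates, buffer radii, and outlet locations are hypothetical, and a real analysis would work in a projected coordinate system with the validated food environment database.

      from shapely.geometry import LineString, Point

      # Hypothetical projected coordinates (metres): a child's home and GPS track.
      home = Point(0, 0)
      gps_track = LineString([(0, 0), (400, 250), (900, 300), (1400, -100)])

      # Proxy exposure: junk-food outlets within an 800 m circular buffer of home.
      home_buffer = home.buffer(800)
      # Activity-space exposure: outlets within 200 m of the actually travelled route.
      activity_space = gps_track.buffer(200)

      outlets = [Point(300, 200), Point(1200, -50), Point(-600, 700)]
      proxy_count = sum(home_buffer.contains(p) for p in outlets)
      actual_count = sum(activity_space.contains(p) for p in outlets)
      print(proxy_count, actual_count)  # the home buffer misses the outlet along the route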

  18. a Kml-Based Approach for Distributed Collaborative Interpretation of Remote Sensing Images in the Geo-Browser

    NASA Astrophysics Data System (ADS)

    Huang, L.; Zhu, X.; Guo, W.; Xiang, L.; Chen, X.; Mei, Y.

    2012-07-01

    Existing implementations of collaborative image interpretation have many limitations for very large satellite images, such as inefficient browsing, slow transmission, etc. This article presents a KML-based approach to support distributed, real-time, synchronous collaborative interpretation of remote sensing images in a geo-browser. As an OGC standard, KML (Keyhole Markup Language) has the advantage of organizing various types of geospatial data (including imagery, annotations, geometry, etc.) in the geo-browser. Existing KML elements can be used to describe simple interpretation results indicated by vector symbols. To enlarge its application, this article extends KML elements to describe some complex image processing operations, including band combination, grey-level transformation, geometric correction, etc. The improved KML is employed to describe and share interpretation operations and results among interpreters. Further, this article develops several collaboration-related services: a collaboration launch service, a perceiving service and a communication service. The launch service creates a collaborative interpretation task and provides a unified interface for all participants. The perceiving service supports interpreters in sharing collaboration awareness. The communication service provides interpreters with written text communication. Finally, the GeoGlobe geo-browser (an extensible and flexible geospatial platform developed at LIESMARS) is selected to perform experiments on collaborative image interpretation. The geo-browser, which manages and visualizes massive geospatial information, can provide distributed users with quick browsing and transmission. Meanwhile, GIS data (for example DEM, DTM and thematic maps) can be integrated in the geo-browser to help improve the accuracy of interpretation. Results show that the proposed method can support distributed collaborative interpretation of remote sensing images.

  19. Parallel Processing of Numerical Tsunami Simulations on a High Performance Cluster based on the GDAL Library

    NASA Astrophysics Data System (ADS)

    Schroeder, Matthias; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2014-05-01

    Thousands of numerical tsunami simulations allow the computation of inundation and run-up along the coast for vulnerable areas over time. A so-called Matching Scenario Database (MSDB) [1] contains this large number of simulations in text file format. In order to visualize these wave propagations, the scenarios have to be reprocessed automatically. In the TRIDEC project, funded by the Seventh Framework Programme of the European Union, a Virtual Scenario Database (VSDB) and a Matching Scenario Database (MSDB) were established, amongst others, by the working group of the University of Bologna (UniBo) [1]. One part of TRIDEC was the development of a new generation of Decision Support System (DSS) for tsunami Early Warning Systems (TEWS) [2]. A working group of the GFZ German Research Centre for Geosciences was responsible for developing the Command and Control User Interface (CCUI) as the central software application supporting operator activities, incident management and message dissemination. For integration and visualization in the CCUI, the numerical tsunami simulations from the MSDB must be converted into shapefile format. The use of shapefiles enables much easier integration into standard Geographic Information Systems (GIS); the CCUI itself is based on two widely used open-source products (the GeoTools library and uDig), which provide shapefile support out of the box. In this case, several thousand tsunami variations were processed for an example area around the Western Iberian margin. Given the volume of data, only an automated, program-controlled process was feasible. To optimize computing effort and processing time, an existing GFZ High Performance Computing (HPC) cluster was used. Thus, geospatial software capable of parallel processing was required. The FOSS tool Geospatial Data Abstraction Library (GDAL/OGR) was used to match the coordinates with the wave heights and to generate the different shapefiles for certain time steps. The resulting shapefiles contain lines for visualizing the isochrones of the wave propagation and, moreover, data about the maximum wave height and the Estimated Time of Arrival (ETA) at the coast. Our contribution shows the entire workflow and the visualization results of the processing for the example region of the Western Iberian ocean margin. [1] Armigliato A., Pagnoni G., Zaniboni F., Tinti S. (2013), Database of tsunami scenario simulations for Western Iberia: a tool for the TRIDEC Project Decision Support System for tsunami early warning, Vol. 15, EGU2013-5567, EGU General Assembly 2013, Vienna (Austria). [2] Löwe, P., Wächter, J., Hammitzsch, M., Lendholt, M., Häner, R. (2013): The Evolution of Service-oriented Disaster Early Warning Systems in the TRIDEC Project, 23rd International Ocean and Polar Engineering Conference - ISOPE-2013, Anchorage (USA).
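
    Writing an isochrone line with its attributes into a shapefile via the GDAL/OGR Python bindings, as in the workflow above, can be sketched as follows; the layer name, field names, and coordinates are illustrative assumptions, not values from the MSDB scenarios.

      from osgeo import ogr, osr

      driver = ogr.GetDriverByName("ESRI Shapefile")
      ds = driver.CreateDataSource("isochrones.shp")

      srs = osr.SpatialReference()
      srs.ImportFromEPSG(4326)  # WGS84 lon/lat

      layer = ds.CreateLayer("isochrones", srs, ogr.wkbLineString)
      layer.CreateField(ogr.FieldDefn("eta_min", ogr.OFTReal))   # arrival time (minutes)
      layer.CreateField(ogr.FieldDefn("max_h_m", ogr.OFTReal))   # max wave height (metres)

      # Hypothetical isochrone for the 30-minute time step.
      line = ogr.Geometry(ogr.wkbLineString)
      for lon, lat in [(-10.5, 38.0), (-10.2, 38.4), (-9.9, 38.9)]:
          line.AddPoint(lon, lat)

      feature = ogr.Feature(layer.GetLayerDefn())
      feature.SetGeometry(line)
      feature.SetField("eta_min", 30.0)
      feature.SetField("max_h_m", 1.8)
      layer.CreateFeature(feature)

      feature = None
      ds = None  # flush and close the data source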

  20. Geospatial considerations for a multiorganizational, landscape-scale program

    USGS Publications Warehouse

    O'Donnell, Michael S.; Assal, Timothy J.; Anderson, Patrick J.; Bowen, Zachary H.

    2013-01-01

    Geospatial data play an increasingly important role in natural resources management, conservation, and science-based projects. The management and effective use of spatial data becomes significantly more complex when the efforts involve a myriad of landscape-scale projects combined with a multiorganizational collaboration. There is sparse literature to guide users on this daunting subject; therefore, we present a framework of considerations for working with geospatial data that will provide direction to data stewards, scientists, collaborators, and managers for developing geospatial management plans. The concepts we present apply to a variety of geospatial programs or projects, which we describe as a “scalable framework” of processes for integrating geospatial efforts with management, science, and conservation initiatives. Our framework includes five tenets of geospatial data management: (1) the importance of investing in data management and standardization, (2) the scalability of content/efforts addressed in geospatial management plans, (3) the lifecycle of a geospatial effort, (4) a framework for the integration of geographic information systems (GIS) in a landscape-scale conservation or management program, and (5) the major geospatial considerations prior to data acquisition. We conclude with a discussion of future considerations and challenges.

  1. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  2. Global polar geospatial information service retrieval based on search engine and ontology reasoning

    USGS Publications Warehouse

    Chen, Nengcheng; E, Dongcheng; Di, Liping; Gong, Jianya; Chen, Zeqiang

    2007-01-01

    In order to improve the access precision of polar geospatial information services on the web, a new methodology for retrieving global spatial information services based on geospatial service search and ontology reasoning is proposed: the geospatial service search finds coarse candidate services on the web, and the ontology reasoning refines the coarse results. The proposed framework includes standardized distributed geospatial web services, a geospatial service search engine, an extended UDDI registry, and a multi-protocol geospatial information service client. Key technologies addressed include service discovery based on a search engine, and service ontology modeling and reasoning in the Antarctic geospatial context. Finally, an Antarctic multi-protocol OWS portal prototype based on the proposed methodology is introduced.

  3. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. Interrogating and analyzing these large datasets in real time is a task that pushes the boundaries of computing hardware and software. Moreover, integrating climate datasets with geospatial data requires considerable effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap: it allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued building of an open-source 4D virtual globe application, using NASA World Wind technology, that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and the THREDDS Data Server for NetCDF data. For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.

  4. Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India

    NASA Astrophysics Data System (ADS)

    Mohan, M.

    2016-06-01

    In the recent past, there has been a large emphasis on the extraction of geospatial information from satellite imagery. This geospatial information is processed with geospatial technologies, which are playing important roles in the development of smart cities, particularly in developing countries such as India. The study is based on the latest geospatial satellite imagery available, which is multi-date, multi-stage, multi-sensor, and multi-resolution. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, and the latest geographic information systems for 3-D geovisualisation, geospatial digital mapping and geospatial analysis for developing smart cities in India. The geospatial information obtained from RS and GPS systems has a complex structure involving space, time and presentation. Such information helps in 3-dimensional digital modelling for smart cities, which involves the integration of spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed to geospatial analysis, digital mapping, geovisualisation, monitoring and the development of smart cities using geospatial technologies. The present research attempts to support the development of cities in a real-world scenario, particularly to help local, regional and state-level planners and policy makers better understand and address issues attributed to cities, using geospatial information from satellite imagery for geovisualisation of smart cities in an emerging and developing country, India.

  5. The impact of geographic information systems on emergency management decision making at the U.S. Department of Homeland Security

    NASA Astrophysics Data System (ADS)

    King, Steven Gray

    Geographic information systems (GIS) reveal relationships and patterns from large quantities of diverse data in the form of maps and reports. The United States spends billions of dollars to use GIS to improve decisions made during responses to natural disasters and terrorist attacks, but precisely how GIS improves or impairs decision making is not known. This research examined how GIS affect decision making during natural disasters, and how GIS can be more effectively used to improve decision making for emergency management. Using a qualitative case study methodology, this research examined decision making at the U.S. Department of Homeland Security (DHS) during a large full-scale disaster exercise. This study indicates that GIS provided decision makers at DHS with an outstanding context for information that would otherwise be challenging to understand, especially through the integration of multiple data sources and dynamic three-dimensional interactive maps. Decision making was hampered by outdated information, a reliance on predictive models based on hypothetical data rather than actual event data, and a lack of understanding of the capabilities of GIS beyond cartography. Geospatial analysts, emergency managers, and other decision makers who use GIS should take specific steps to improve decision making based on GIS for disaster response and emergency management.

  6. Statistical Validation of a Web-Based GIS Application and Its Applicability to Cardiovascular-Related Studies.

    PubMed

    Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad

    2015-12-22

    There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for the CVD GEOSPATIAL variables was 95.5%, implying successful internal consistency. The walk score was significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile from homes. It was also significantly associated with the diversity index (r = 0.138, p = 0.0023), median household income (r = -0.181; p < 0.0001), and owner-occupied rates (r = -0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, or population growth rate. Our data demonstrate that the geospatial data generated by the web-based application were internally consistent and showed satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on diseases and/or the long-term effects of clinical trials.
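
    Cronbach's alpha, used above to check the internal consistency of the geospatial variables, has a short closed form; a minimal NumPy sketch with simulated data (not the study's 500 addresses):

      import numpy as np

      def cronbach_alpha(items):
          """items: array of shape (n_observations, n_items).
          alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      # Simulated stand-ins for five correlated geospatial measures at 500 addresses
      # (walk score, hospitals, fast-food outlets, parks, sidewalks within a mile).
      rng = np.random.default_rng(42)
      latent = rng.normal(size=(500, 1))
      items = latent + 0.3 * rng.normal(size=(500, 5))  # correlated by construction
      print(round(cronbach_alpha(items), 3))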

  7. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    PubMed Central

    2017-01-01

    Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available within the emerging "Big Data" era. Users expect that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that enrich the completeness of OpenStreetMap data with footway-crossing points of interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting the required information from existing datasets. PMID:29088125
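
    A rule that derives an implicit footway-crossing point from explicit geometries can be sketched with Shapely; the rule and coordinates are illustrative assumptions, not the paper's broker logic or its OpenStreetMap mappings.

      from shapely.geometry import LineString

      # Hypothetical explicit OSM geometries in projected coordinates.
      footway = LineString([(0, 0), (50, 50), (100, 100)])
      road = LineString([(0, 100), (100, 0)])

      # Rule: if a footway crosses a road, infer a crossing point of interest there.
      if footway.crosses(road):
          crossing = footway.intersection(road)
          print("inferred crossing POI at", (crossing.x, crossing.y))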

  8. A natural language processing and geospatial clustering framework for harvesting local place names from geotagged housing advertisements

    DOE PAGES

    Hu, Yingjie; Mao, Huina; Mckenzie, Grant

    2018-04-13

    We report that local place names are frequently used by residents living in a geographic region. Such place names may not be recorded in existing gazetteers, due to their vernacular nature, relative insignificance to a gazetteer covering a large area (e.g. the entire world), recent establishment (e.g. the name of a newly-opened shopping center) or other reasons. While not always recorded, local place names play important roles in many applications, from supporting public participation in urban planning to locating victims in disaster response. In this paper, we propose a computational framework for harvesting local place names from geotagged housing advertisements. We make use of those advertisements posted on local-oriented websites, such as Craigslist, where local place names are often mentioned. The proposed framework consists of two stages: natural language processing (NLP) and geospatial clustering. The NLP stage examines the textual content of housing advertisements and extracts place name candidates. The geospatial stage focuses on the coordinates associated with the extracted place name candidates and performs multiscale geospatial clustering to filter out the non-place names. We evaluate our framework by comparing its performance with those of six baselines. Finally, we also compare our result with four existing gazetteers to demonstrate the not-yet-recorded local place names discovered by our framework.
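
    The geospatial clustering stage described above can be approximated with DBSCAN from scikit-learn; the coordinates, eps, and min_samples values are hypothetical, and the actual framework runs the clustering at multiple scales.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Hypothetical (lon, lat) coordinates of ads mentioning the same candidate place name.
      coords = np.array([
          [-83.69, 43.01], [-83.70, 43.02], [-83.69, 43.02],  # tight cluster: likely a real place
          [-83.20, 42.70], [-84.10, 43.50],                    # scattered mentions: likely noise
      ])

      # eps is in degrees purely for illustration; a real pipeline would use projected
      # or haversine distances and sweep several spatial scales.
      labels = DBSCAN(eps=0.02, min_samples=3).fit_predict(coords)
      print(labels)  # cluster members get labels >= 0, noise points get -1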

  9. Investigating Climate Change Issues With Web-Based Geospatial Inquiry Activities

    NASA Astrophysics Data System (ADS)

    Dempsey, C.; Bodzin, A. M.; Sahagian, D. L.; Anastasio, D. J.; Peffer, T.; Cirucci, L.

    2011-12-01

    In the Environmental Literacy and Inquiry middle school Climate Change curriculum we focus on essential climate literacy principles with an emphasis on weather and climate, Earth system energy balance, greenhouse gases, paleoclimatology, and how human activities influence climate change (http://www.ei.lehigh.edu/eli/cc/). It incorporates a related set of a framework and design principles to provide guidance for the development of the geospatial technology-integrated Earth and environmental science curriculum materials. Students use virtual globes, Web-based tools including an interactive carbon calculator and geologic timeline, and inquiry-based lab activities to investigate climate change topics. The curriculum includes educative curriculum materials that are designed to promote and support teachers' learning of important climate change content and issues, geospatial pedagogical content knowledge, and geographic spatial thinking. The curriculum includes baseline instructional guidance for teachers and provides implementation and adaptation guidance for teaching with diverse learners including low-level readers, English language learners and students with disabilities. In the curriculum, students use geospatial technology tools including Google Earth with embedded spatial data to investigate global temperature changes, areas affected by climate change, evidence of climate change, and the effects of sea level rise on the existing landscape. We conducted a designed-based research implementation study with urban middle school students. Findings showed that the use of the Climate Change curriculum showed significant improvement in urban middle school students' understanding of climate change concepts.

  10. A natural language processing and geospatial clustering framework for harvesting local place names from geotagged housing advertisements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Yingjie; Mao, Huina; Mckenzie, Grant

    We report that local place names are frequently used by residents living in a geographic region. Such place names may not be recorded in existing gazetteers, due to their vernacular nature, relative insignificance to a gazetteer covering a large area (e.g. the entire world), recent establishment (e.g. the name of a newly-opened shopping center) or other reasons. While not always recorded, local place names play important roles in many applications, from supporting public participation in urban planning to locating victims in disaster response. In this paper, we propose a computational framework for harvesting local place names from geotagged housing advertisements. We make use of those advertisements posted on local-oriented websites, such as Craigslist, where local place names are often mentioned. The proposed framework consists of two stages: natural language processing (NLP) and geospatial clustering. The NLP stage examines the textual content of housing advertisements and extracts place name candidates. The geospatial stage focuses on the coordinates associated with the extracted place name candidates and performs multiscale geospatial clustering to filter out the non-place names. We evaluate our framework by comparing its performance with those of six baselines. Finally, we also compare our result with four existing gazetteers to demonstrate the not-yet-recorded local place names discovered by our framework.

  11. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
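
    A minimal sketch of the Earth Engine Python API workflow described above; the collection ID, region, dates, and reducer settings are illustrative assumptions, and credentials are assumed to be configured already.

      import ee

      ee.Initialize()  # assumes Earth Engine authentication has been set up

      # Combine data from two separate sources in one deferred computation:
      # a Landsat 8 surface-reflectance composite and the SRTM elevation model.
      region = ee.Geometry.Rectangle([-122.6, 37.2, -121.8, 37.9])  # hypothetical AOI

      composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                   .filterBounds(region)
                   .filterDate("2020-06-01", "2020-09-01")
                   .median())

      elevation = ee.Image("USGS/SRTMGL1_003")

      # Mean elevation over the AOI; nothing is evaluated until getInfo() is called,
      # so the heavy lifting happens on the server-side parallel infrastructure.
      stats = elevation.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=90)
      print(stats.getInfo())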

  12. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field and regional scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots, which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in the delivery and adoption of such tools by stakeholders. PMID:28804490
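
    The envisioned geospatial framework, which overlays gridded soil and climate inputs with model outputs to expose production-environment tradeoffs, can be sketched in a few lines; the arrays, weights, and threshold below are hypothetical placeholders, not outputs of any crop model.

        # Hypothetical sketch of a gridded production/environment tradeoff surface.
        # In the envisioned framework these grids would come from crop-model runs
        # driven by gridded soil and climate data; here they are random placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        yield_gain = rng.uniform(0.0, 1.0, size=(100, 100))      # normalized yield response to N
        n_leaching = rng.uniform(0.0, 1.0, size=(100, 100))      # normalized nitrate leaching risk
        n2o_emission = rng.uniform(0.0, 1.0, size=(100, 100))    # normalized N2O emission risk

        # Simple weighted tradeoff index: production benefit minus environmental cost.
        w_yield, w_leach, w_n2o = 1.0, 0.6, 0.4
        tradeoff = w_yield * yield_gain - (w_leach * n_leaching + w_n2o * n2o_emission)

        # Grid cells with strongly negative scores would be flagged as N-loss hotspots.
        hotspots = tradeoff < -0.5
        print(f"{hotspots.mean():.1%} of cells flagged as potential N-loss hotspots")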

  13. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production.

    PubMed

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D; Pittelkow, Cameron M

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field and regional scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots, which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in the delivery and adoption of such tools by stakeholders.

  14. Hazards Data Distribution System (HDDS)

    USGS Publications Warehouse

    Jones, Brenda; Lamb, Rynn M.

    2015-07-09

    When emergencies occur, first responders and disaster response teams often need rapid access to aerial photography and satellite imagery that is acquired before and after the event. The U.S. Geological Survey (USGS) Hazards Data Distribution System (HDDS) provides quick and easy access to pre- and post-event imagery and geospatial datasets that support emergency response and recovery operations. The HDDS provides a single, consolidated point-of-entry and distribution system for USGS-hosted remotely sensed imagery and other geospatial datasets related to an event response. The data delivery services are provided through an interactive map-based interface that allows emergency response personnel to rapidly select and download pre-event ("baseline") and post-event emergency response imagery.

  15. Challenges in sharing of geospatial data by data custodians in South Africa

    NASA Astrophysics Data System (ADS)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, whose key objective is the sharing of geospatial data. The collection and maintenance of geospatial data is expensive and time-consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and, if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act places too onerous a burden on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  16. Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, T.; Armstrong, E. M.; Moroni, D. F.; Liu, K.; Gui, Z.

    2013-12-01

    We present a Semantically optiMize the dAta seRvice operaTion (SMART) system for better data discovery and access across the NASA data systems, the Global Earth Observation System of Systems (GEOSS) Clearinghouse, and Data.gov, to help scientists select Earth observation data that better fit their needs. The work addresses the following aspects: 1. Integrating and interfacing the SMART system to include the functionality of a) semantic reasoning based on Jena, an open source semantic reasoning engine, b) semantic similarity calculation, c) recommendation based on spatiotemporal, semantic, and user workflow patterns, and d) ranking results based on similarity between search terms and data ontology. 2. Collaborating with data user communities to a) capture science data ontology and record relevant ontology triple stores, b) analyze and mine user search and download patterns, c) integrate SMART into metadata-centric discovery systems for community-wide usage and feedback, and d) customize the data discovery, search, and access user interface to include the ranked results, recommendation components, and semantics-based navigation. 3. Laying the groundwork to interface the SMART system with other data search and discovery systems as an open source data search and discovery solution. The SMART system leverages NASA, GEO, and FGDC data discovery, search, and access for the Earth science community by enabling scientists to readily discover and access data appropriate to their endeavors, increasing the efficiency of data exploration and decreasing the time that scientists must spend on searching, downloading, and processing the datasets most applicable to their research. By incorporating the SMART system, the time devoted to discovering the most applicable dataset is expected to be substantially reduced, thereby reducing the number of user inquiries and, in turn, the time and resources expended by a data center in addressing them. Keywords: EarthCube, ECHO, DAACs, GeoPlatform, Geospatial Cyberinfrastructure. References: 1. Yang, P., Evans, J., Cole, M., Alameh, N., Marley, S., & Bambacus, M. (2007). The Emerging Concepts and Applications of the Spatial Web Portal. Photogrammetric Engineering & Remote Sensing, 73(6):691-698. 2. Zhang, C., Zhao, T., & Li, W. (2010). The Framework of a Geospatial Semantic Web based Spatial Decision Support System for Digital Earth. International Journal of Digital Earth, 3(2):111-134. 3. Yang, C., Raskin, R., Goodchild, M.F., & Gahegan, M. (2010). Geospatial Cyberinfrastructure: Past, Present and Future. Computers, Environment, and Urban Systems, 34(4):264-277. 4. Liu, K., Yang, C., Li, W., Gui, Z., Xu, C., & Xia, J. (2013). Using ontology and similarity calculations to rank Earth science data searching results. International Journal of Geospatial Information Applications (in press).
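
    The ranking step, matching search terms against dataset metadata or ontology, can be illustrated with a simple term-vector similarity; the actual system uses Jena-based semantic reasoning, so the TF-IDF approach below is only a stand-in, and the metadata records and query are invented examples.

        # Stand-in illustration of similarity-based ranking of dataset metadata
        # against a user query (the SMART system itself uses ontology-based
        # reasoning with Jena; records and query here are invented examples).
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        records = {
            "ds1": "sea surface temperature daily global satellite observations",
            "ds2": "soil moisture passive microwave land surface",
            "ds3": "ocean temperature profiles in situ Argo floats",
        }
        query = "global ocean temperature observations"

        vectorizer = TfidfVectorizer()
        matrix = vectorizer.fit_transform(list(records.values()) + [query])
        scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

        # Rank datasets by decreasing similarity to the query.
        for ds_id, score in sorted(zip(records, scores), key=lambda t: -t[1]):
            print(f"{ds_id}: {score:.3f}")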

  17. Enhancing Earth Observation and Modeling for Tsunami Disaster Response and Management

    NASA Astrophysics Data System (ADS)

    Koshimura, Shunichi; Post, Joachim

    2017-04-01

    In the aftermath of catastrophic natural disasters, such as earthquakes and tsunamis, our society has experienced significant difficulties in assessing disaster impact within a limited amount of time. In recent years, the quality of satellite sensors and access to and use of satellite imagery and services has greatly improved. More and more space agencies have embraced data-sharing policies that facilitate access to archived and up-to-date imagery. Tremendous progress has been achieved through the continuous development of powerful algorithms and software packages to manage and process geospatial data and to disseminate imagery and geospatial datasets in near-real time via geo-web services, which can be used in disaster-risk management and emergency response efforts. Satellite Earth observations now offer consistent coverage and scope to provide a synoptic overview of large areas, repeated regularly. These can be used to compare risk across different countries, day and night, in all weather conditions, and in trans-boundary areas. At the same time, with the use of modern computing power and advanced sensor networks, great advances in real-time simulation have been achieved. The data and information derived from satellite Earth observations, integrated with in situ information and simulation modeling, provide unique value and the necessary complement to socio-economic data. Emphasis also needs to be placed on ensuring space-based data and information are used in existing and planned national and local disaster risk management systems, together with other data and information sources, as a way to strengthen the resilience of communities. Through case studies of the 2011 Great East Japan earthquake and tsunami disaster, we aim to discuss how Earth observations and modeling, in combination with local, in situ data and information sources, can support the decision-making process before, during and after a disaster strikes.

  18. Graduate Ethics Curricula for Future Geospatial Technology Professionals (Invited)

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Dibiase, D.; Harvey, F.; Solem, M.

    2009-12-01

    Professionalism in today's rapidly growing, multidisciplinary geographic information science field (e.g., geographic information systems or GIS, remote sensing, cartography, quantitative spatial analysis) now involves a commitment to ethical practice as informed by a more sophisticated understanding of the ethical implications of geographic technologies. The lack of privacy introduced by mobile mapping devices, the use of GIS for military and surveillance purposes, the appropriate use of data collected using these technologies for policy decisions (especially for conservation and sustainability), and the general consequences of inequities that arise through biased access to geospatial tools and derived data all continue to be challenging issues and topics of deep concern for many. Students and professionals working with GIS and related technologies should develop a sound grasp of these issues and a thorough comprehension of the concerns impacting their use and development in today's world. However, while most people agree that ethics matters for GIS, we often have difficulty putting ethical issues into practice. An ongoing project supported by NSF seeks to bridge this gap by providing a sound basis for future ethical consideration of a variety of issues. A model seminar curriculum is under development by a team of geographic information science and technology (GIS&T) researchers and professional ethicists, along with protocols for course evaluations. In the curricula, students first investigate the nature of professions in general and the characteristics of a GIS&T profession in particular. They hone moral reasoning skills through methodical analyses of case studies in relation to various GIS codes of ethics and rules of conduct. They learn to unveil the "moral ecologies" of a profession through interviews with practitioners in the field. Assignments thus far include readings, class discussions, practitioner interviews, and preparation of original case studies. Curricula thus far are freely available via gisprofessionalethics.org.

  19. GSKY: A scalable distributed geospatial data server on the cloud

    NASA Astrophysics Data System (ADS)

    Rozas Larraondo, Pablo; Pringle, Sean; Antony, Joseph; Evans, Ben

    2017-04-01

    Earth systems, environmental and geophysical datasets are extremely valuable sources of information about the state and evolution of the Earth. Being able to combine information coming from different geospatial collections is in increasing demand by the scientific community, and requires managing and manipulating data with different formats and performing operations such as map reprojections, resampling and other transformations. Due to the large data volume inherent in these collections, storing multiple copies of them is unfeasible, and so such data manipulation must be performed on-the-fly using efficient, high performance techniques. Ideally this should be performed using a trusted data service and common system libraries to ensure wide use and reproducibility. Recent developments in distributed computing based on dynamic access to significant cloud infrastructure open the door for such new ways of processing geospatial data on demand. The National Computational Infrastructure (NCI), hosted at the Australian National University (ANU), has over 10 Petabytes of nationally significant research data collections. Some of these collections, which comprise a variety of observed and modelled geospatial data, are now made available via a highly distributed geospatial data server, called GSKY (pronounced [jee-skee]). GSKY supports on-demand processing of large geospatial data products such as satellite earth observation data as well as numerical weather products, allowing interactive exploration and analysis of the data. It dynamically and efficiently distributes the required computations among cloud nodes, providing a scalable analysis framework that can adapt to serve a large number of concurrent users. Typical geospatial workflows handling different file formats and data types, or blending data in different coordinate projections and spatio-temporal resolutions, are handled transparently by GSKY. This is achieved by decoupling the data ingestion and indexing process as an independent service. An indexing service crawls data collections either locally or remotely by extracting, storing and indexing all spatio-temporal metadata associated with each individual record. GSKY provides the user with the ability to specify how ingested data should be aggregated, transformed and presented. It presents an OGC standards-compliant interface, allowing ready accessibility for users of the data via Web Map Services (WMS), Web Processing Services (WPS), or raw data arrays using Web Coverage Services (WCS). The presentation will show some cases where we have used this new capability to provide a significant improvement over previous approaches.
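
    Because GSKY exposes OGC-standard interfaces, a client can request on-the-fly rendered maps with any WMS library; the sketch below uses OWSLib, and the endpoint URL, layer name, and bounding box are placeholders rather than actual NCI service identifiers.

        # Sketch of requesting an on-demand rendered map from an OGC-compliant
        # server such as GSKY via OWSLib. The endpoint URL, layer name, and
        # bounding box are placeholder assumptions, not real NCI identifiers.
        from owslib.wms import WebMapService

        WMS_URL = "https://example.org/ows"  # placeholder GSKY-style endpoint

        wms = WebMapService(WMS_URL, version="1.3.0")
        print(list(wms.contents))  # layers advertised by the server

        response = wms.getmap(
            layers=["landsat8_nbar"],          # placeholder layer name
            srs="EPSG:4326",
            bbox=(110.0, -45.0, 155.0, -10.0), # roughly Australia
            size=(1024, 768),
            format="image/png",
            time="2017-01-01",
            transparent=True,
        )
        with open("composite.png", "wb") as f:
            f.write(response.read())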

  20. NASA's Agricultural Program: A USDA/Grower Partnership

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Thomas, Michael

    2002-01-01

    Ag20/20 is a partnership between USDA, NASA, and four national commodity associations. It is driven by the information needs of U.S. farmers. Ag20/20 is focused on utilization of earth science and remote sensing for decision-making and oriented toward economically viable operational solutions. Its purpose is to accelerate the use of remote sensing and other geospatial technologies on the farm to: 1) Increase the production efficiency of the American farmer; 2) Reduce crop production risks; 3) Improve environmental stewardship tools for agricultural production.

  1. Geospatial Data Science Research Staff | Geospatial Data Science | NREL

    Science.gov Websites

    Directory of the NREL Geospatial Data Science research staff, listing researchers, titles, and contact information (for example, Ricardo Oliveira, Researcher II-Geospatial Science, Ricardo.Oliveira@nrel.gov; Nicholas Grue, Researcher III-Geospatial Science).

  2. PLANNING QUALITY IN GEOSPATIAL PROJECTS

    EPA Science Inventory

    This presentation will briefly review some legal drivers and present a structure for the writing of geospatial Quality Assurance Project Plans. In addition, the Geospatial Quality Council's geospatial information life-cycle and sources-of-error flowchart will be reviewed.

  3. Automatic geospatial information Web service composition based on ontology interface matching

    NASA Astrophysics Data System (ADS)

    Xu, Xianbin; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    With Web services technology, the functions of WebGIS can be presented as geospatial information services, helping to overcome the information-isolated situation in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer value-added services, therefore plays a key role in fully taking advantage of geospatial information services. This paper proposes an automatic geospatial information Web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meanings of pairs of service interfaces, a geospatial information Web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented, and its results show the feasibility of the algorithm and its great promise for the emerging demand for geospatial information Web service composition.
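
    A minimal sketch of the interface-matching idea: the output parameter name of one service is compared with the input parameter name of a candidate successor using WordNet-based similarity, and the pair is chained when the semantic distance is small enough. The NLTK WordNet corpus stands in for the paper's ontology dictionary, and the parameter names and threshold are illustrative.

        # Illustrative interface matching for service chaining: compare the output
        # parameter of one service with the input parameter of another via WordNet
        # similarity (NLTK's WordNet corpus stands in for the ontology dictionary;
        # parameter names and the 0.3 threshold are assumptions).
        from nltk.corpus import wordnet as wn  # requires: nltk.download("wordnet")

        def best_similarity(term_a, term_b):
            """Maximum path similarity over all noun senses of the two terms."""
            scores = [s1.path_similarity(s2)
                      for s1 in wn.synsets(term_a, pos=wn.NOUN)
                      for s2 in wn.synsets(term_b, pos=wn.NOUN)]
            scores = [s for s in scores if s is not None]
            return max(scores, default=0.0)

        def can_chain(output_param, input_param, threshold=0.3):
            return best_similarity(output_param, input_param) >= threshold

        # Example: chain a terrain-analysis service producing "elevation" into a
        # flood-mapping service expecting "altitude".
        print(can_chain("elevation", "altitude"))   # semantically close terms
        print(can_chain("elevation", "population")) # unrelated terms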

  4. A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate pieces of this system progress, they will be combined and generalized to form a sort of social network for nationally consistent hydrologic modeling.

  5. The National Map 2.0 Tactical Plan: "Toward the (Integrated) National Map"

    USGS Publications Warehouse

    Zulick, Carl A.

    2008-01-01

    The National Map's 2-year goal, as described in this plan, is to provide a range of geospatial products and services that meet the basic goals of the original vision for The National Map while furthering the National Spatial Data Infrastructure that underpins U.S. Geological Survey (USGS) science. To accomplish this goal, the National Geospatial Program (NGP) will acquire, store, maintain, and distribute base map data. The management team for the NGP sets priorities for The National Map in three areas: Data and Products, Services, and Management. Priorities for fiscal years 2008 and 2009 (October 1, 2007 through September 30, 2009), involving the current data inventory, data acquisition, and the integration of data, are (1) incorporating current data from Federal, State, and local organizations into The National Map to the degree possible, given data availability and program resources; (2) collaborating with other USGS programs to incorporate data that support the USGS Science Strategy; (3) supporting the Department of the Interior (DOI) high-priority geospatial information needs; (4) supporting emergency response; (5) supporting homeland security and natural hazards response; and (6) delivering graphics products. The management team identified known constraints, enablers, and drivers for the acquisition and integration of data. The NGP management team also identified customer-focused products and services of The National Map. Ongoing planning and management activities direct the development and delivery of these products and services. Work flow processes to support The National Map priorities are identified and established through a business-driven prioritization process. This tactical plan is primarily for use as a document to guide The National Map program for the next two fiscal years. The document is available to the public because of widespread interest in The National Map. The USGS collaborates with a broad range of customers and partners who are essential to the success of The National Map, including the science community, State and Federal agencies involved in homeland security, planners and emergency responders at the local level, and private companies. Partner contributions and data remain a primary input and foundation of The National Map. Partnership strategies for each of The National Map's component data themes are outlined in this plan. Because of the importance of The National Map customers, a reassessment of customer needs will be completed during 2008. Results of the assessment will be incorporated into future decisions and priorities. A performance milestone matrix has been developed that contains the full list of milestones, major deliverables, and major tasks. The matrix forms the basis for reporting on accomplishments and issues. However, a number of risks, dependencies, and issues have been identified that could affect meeting milestones in the matrix, such as the USGS not being the Circular A-16 lead for boundaries, transportation, and structures; the availability of sufficient and sustainable funding; the availability of a Federal workforce and contractors with the necessary skills; and numerous competing customer and stakeholder requirements.

  6. 75 FR 6056 - National Geospatial Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY: Office of the Secretary, Interior. ACTION: Notice of renewal of National Geospatial Advisory Committee... renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations...

  7. 77 FR 37004 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-20

    ...-Intelligence Agency (NGA), ATTN: Security Specialist, Mission Support, MSRS P-12, 7500 GEOINT Drive..., Alternate OSD Federal Register Liaison Officer, Department of Defense. NGA-005 System name: National... maintained at National Geospatial-Intelligence Agency (NGA) Headquarters in Washington, DC metro area...

  8. Geospatial Service Platform for Education and Research

    NASA Astrophysics Data System (ADS)

    Gong, J.; Wu, H.; Jiang, W.; Guo, W.; Zhai, X.; Yue, P.

    2014-04-01

    We propose to advance the scientific understanding through applications of geospatial service platforms, which can help students and researchers investigate various scientific problems in a Web-based environment with online tools and services. The platform also offers capabilities for sharing data, algorithms, and problem-solving knowledge. To fulfil this goal, the paper introduces a new course, named "Geospatial Service Platform for Education and Research", to be held at the ISPRS summer school in May 2014 at Wuhan University, China. The course will share cutting-edge achievements of a geospatial service platform with students from different countries, and train them with online tools from the platform for geospatial data processing and scientific research. The content of the course includes the basic concepts of geospatial Web services, service-oriented architecture, geoprocessing modelling and chaining, and problem-solving using geospatial services. In particular, the course will offer a geospatial service platform for hands-on practice. There will be three kinds of exercises in the course: geoprocessing algorithm sharing through service development, geoprocessing modelling through service chaining, and online geospatial analysis using geospatial services. Students can choose one of them, depending on their interests and background. Existing geoprocessing services from OpenRS and GeoPW will be introduced. The summer course offers two service chaining tools, GeoChaining and GeoJModelBuilder, as examples to explain the methods for building service chains to meet different demands. After this course, students will know how to use online service platforms for geospatial resource sharing and problem-solving.
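
    The kind of online geoprocessing exercised in the course is typically exposed through OGC Web Processing Services; a minimal sketch of discovering and describing processes with OWSLib follows, where the endpoint URL and process identifier are placeholders, not actual OpenRS or GeoPW identifiers.

        # Sketch of discovering geoprocessing services on an OGC WPS endpoint with
        # OWSLib. The endpoint URL and process identifier below are placeholder
        # assumptions, not actual OpenRS or GeoPW identifiers.
        from owslib.wps import WebProcessingService

        WPS_URL = "https://example.org/wps"  # placeholder geoprocessing endpoint

        wps = WebProcessingService(WPS_URL)
        wps.getcapabilities()

        # List the processes the service advertises.
        for process in wps.processes:
            print(process.identifier, "-", process.title)

        # Describe the inputs and outputs of one process before chaining it.
        description = wps.describeprocess("buffer")  # placeholder identifier
        for data_input in description.dataInputs:
            print("input:", data_input.identifier)
        for output in description.processOutputs:
            print("output:", output.identifier)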

  9. Geospatial Modelling for Micro Zonation of Groundwater Regime in Western Assam, India

    NASA Astrophysics Data System (ADS)

    Singh, R. P.

    2016-12-01

    Water, the most precious natural resource on Earth, is vital to sustaining natural systems and human civilisation. The state of Assam, located in the north-eastern part of India, has relatively abundant groundwater owing to its geographic and physiographic setting, but deterioration of groundwater quality is causing major health problems in the area. In this study, an integrated approach combining remote sensing, GIS, and chemical analysis of groundwater samples was used to characterise the groundwater regime and to provide information for decision makers working toward sustainable water resource management. The geospatial modelling was performed by integrating hydrogeomorphic features. Geomorphology, lineament, drainage, and land use/land cover (LULC) layers were generated through visual interpretation of satellite imagery (LISS III) based on the tone, texture, shape, size, and arrangement of features. The slope layer was prepared using the SRTM DEM data set. The LULC of the area was categorised into six classes: agricultural field, forest area, river, settlement, tree-clad area, and wetland. The geospatial modelling was performed through a weightage-and-rank method in GIS, depending on the influence of each feature on the groundwater regime. To assess groundwater quality, 45 groundwater samples were collected in the field and chemically analysed using standard laboratory methods. The overall groundwater quality was evaluated through a Water Quality Index (WQI), which showed that about 70% of the samples are not potable due to elevated concentrations of arsenic, fluoride, and iron. It appears that these pollutants are geologically and geomorphologically derived. Together, the interpolated Water Quality Index layer and the geospatially modelled groundwater potential layer provide a holistic view of the groundwater scenario and direction for better planning and groundwater resource management. The study will be discussed in detail during the conference.
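
    The weightage-and-rank overlay used to derive the groundwater potential layer can be sketched as a weighted sum of reclassified thematic rasters; the weights, ranks, and array contents below are illustrative, not the values used in the study.

        # Illustrative weightage-and-rank overlay for a groundwater potential map.
        # Each thematic layer is assumed to be reclassified to ranks 1-5; the layer
        # weights and the random rasters are placeholders, not the study's values.
        import numpy as np

        rng = np.random.default_rng(42)
        shape = (200, 200)
        layers = {
            "geomorphology": rng.integers(1, 6, size=shape),
            "lineament":     rng.integers(1, 6, size=shape),
            "drainage":      rng.integers(1, 6, size=shape),
            "lulc":          rng.integers(1, 6, size=shape),
            "slope":         rng.integers(1, 6, size=shape),
        }
        weights = {  # relative influence on groundwater occurrence (illustrative)
            "geomorphology": 0.30, "lineament": 0.20, "drainage": 0.20,
            "lulc": 0.15, "slope": 0.15,
        }

        potential = sum(weights[name] * layers[name].astype(float) for name in layers)

        # Classify the continuous potential score into three zones.
        zones = np.digitize(potential, bins=[2.5, 3.5])  # 0=low, 1=moderate, 2=high
        for label, code in (("low", 0), ("moderate", 1), ("high", 2)):
            print(f"{label:9s}: {(zones == code).mean():.1%} of the area")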

  10. Web catalog of oceanographic data using GeoNetwork

    NASA Astrophysics Data System (ADS)

    Marinova, Veselka; Stefanov, Asen

    2017-04-01

    Most of the data collected, analyzed and used by the Bulgarian Oceanographic Data Centre (BgODC) from scientific cruises, Argo floats, FerryBoxes and real-time operating systems are spatially oriented and need to be displayed on a map. The challenge is to make spatial information more accessible to users, decision makers and scientists. In order to meet this challenge, BgODC concentrates its efforts on improving dynamic and standardized access to its geospatial data as well as those from various related organizations and institutions. BgODC is currently implementing a project to create a geospatial portal for distributing metadata and for searching, exchanging and harvesting spatial data. There are many open source software solutions able to create such a spatial data infrastructure (SDI). GeoNetwork opensource was ultimately chosen, as it is already widespread. This software is a free, effective and inexpensive solution for implementing an SDI at the organization level. It is platform independent and runs under many operating systems. Filling the catalog goes through these practical steps: • managing and storing data reliably within an MS SQL spatial database; • registering maps and data of various formats and sources in GeoServer (the most popular open source geospatial server, bundled with GeoNetwork); • adding metadata and publishing geospatial data through the GeoNetwork desktop. GeoServer and GeoNetwork are based on Java, so they require a servlet engine such as Tomcat. The experience gained from the use of GeoNetwork opensource confirms that the catalog meets the requirements for data management and is flexible enough to customize. Building the catalog facilitates sustainable data exchange between end users. The catalog is a big step towards implementation of the INSPIRE directive due to the availability of many features necessary for producing INSPIRE-compliant metadata records. The catalog now contains all available GIS data provided by BgODC for Internet access. Searching data within the catalog is based upon geographic extent, theme type and free-text search.
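
    GeoNetwork exposes its catalogue through the OGC Catalogue Service for the Web (CSW), so searching or harvesting records can be scripted; the sketch below uses OWSLib against a placeholder endpoint, and the URL and search term are assumptions.

        # Sketch of searching a GeoNetwork catalogue via its OGC CSW endpoint with
        # OWSLib. The endpoint URL and search term are placeholder assumptions.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        CSW_URL = "https://example.org/geonetwork/srv/eng/csw"  # placeholder

        csw = CatalogueServiceWeb(CSW_URL)

        # Full-text search for records mentioning "temperature".
        query = PropertyIsLike("csw:AnyText", "%temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

        for record_id, record in csw.records.items():
            print(record_id, "-", record.title)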

  11. EPA GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    The EPA Geospatial Quality Council (previously known as the EPA GIS-QA Team; EPA/600/R-00/009) was created to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. All EPA Offices and Regions were invited to participate. Currently, the EPA Geospatial Q...

  12. A Framework for Achieving Situational Awareness during Crisis based on Twitter Analysis

    NASA Astrophysics Data System (ADS)

    Zielinski, Andrea; Tokarchuk, Laurissa; Middleton, Stuart; Chaves, Fernando

    2013-04-01

    Decision Support Systems for Natural Crisis Management increasingly employ Web 2.0 and 3.0 technologies for future collaborative decision making, including the use of social networks like Twitter. However, human sensor data is not readily accessible and interpretable, since the texts are unstructured, noisy and available in various languages. The present work focusses on the detection of crisis events in a multilingual setting as part of the FP7-funded EU project TRIDEC and is motivated by the goal to establish a Tsunami warning system for the Mediterranean. It is integrated into a dynamic spatial-temporal decision-making component with a command and control unit's graphical user interface that presents all relevant information to the human operator to support critical decision making. To this end, a tool for the interactive visualization of geospatial data is implemented: all tweets with an exact timestamp or geo-location are monitored on the map in real time so that the operator on duty can get an overall picture of the situation. Apart from the human sensor data, the seismic sensor data also appears on the same screen. Signs of abnormal activity from Twitter usage as well as from sensor network devices can then be used to trigger official warning alerts according to the CAP message standard. Whenever a certain threshold of relevant tweets in a HASC region (Hierarchical Administrative Subdivision Code) is exceeded, the Twitter activity in this administrative region will be shown on a map. We believe that the following functionalities are crucial for monitoring crises, making use of text mining and network analysis techniques: focussed crawling, trustworthiness analysis, geo-parsing, and multilingual tweet classification. In the first step, the Twitter Streaming API accesses the social data, using an adaptive keyword list (focussed crawling). Then, tweets are filtered and aggregated to form counts for a certain time-span (e.g., an interval of 1-2 minutes). In particular, we investigate the following novel techniques that help to fulfill this task: trustworthiness analysis (linkage analysis and user network analysis), geo-parsing (locating the event in space), and multilingual tweet classification (filtering out noisy tweets for various Mediterranean languages). Lastly, an aberration algorithm looks for spikes in the temporal stream of Twitter data.
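
    The threshold-based alerting step, counting relevant tweets per HASC region in short time windows and flagging spikes, can be sketched as follows; the region code, window length, and threshold rule are illustrative assumptions, not the project's actual aberration algorithm.

        # Illustrative spike detection for crisis monitoring: count relevant tweets
        # per HASC region in fixed time windows and flag regions whose counts exceed
        # a baseline-derived threshold. Region codes, window length, and the
        # threshold rule are assumptions for the sketch.
        from collections import defaultdict
        from datetime import datetime, timedelta
        from statistics import mean, pstdev

        WINDOW = timedelta(minutes=2)

        def bucket(ts, start):
            return int((ts - start) // WINDOW)

        def detect_spikes(tweets, start, k=3.0, min_count=5):
            """tweets: iterable of (timestamp, hasc_region) pairs already filtered
            for crisis relevance. Returns regions whose latest window is anomalous."""
            counts = defaultdict(lambda: defaultdict(int))  # region -> window -> count
            for ts, region in tweets:
                counts[region][bucket(ts, start)] += 1

            alerts = []
            for region, windows in counts.items():
                latest = max(windows)
                history = [windows.get(w, 0) for w in range(latest)] or [0]
                threshold = mean(history) + k * (pstdev(history) if len(history) > 1 else 1)
                current = windows[latest]
                if current >= max(threshold, min_count):
                    alerts.append((region, current))
            return alerts

        start = datetime(2013, 4, 1, 12, 0)
        stream = [(start + timedelta(minutes=2 * i), "GR.AT") for i in range(10)]   # quiet baseline
        stream += [(start + timedelta(minutes=20, seconds=s), "GR.AT") for s in range(0, 60, 5)]  # burst
        print(detect_spikes(stream, start))  # the burst in one region triggers an alert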

  13. Distributed Research Center for Analysis of Regional Climatic Changes and Their Impacts on Environment

    NASA Astrophysics Data System (ADS)

    Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.

    2016-12-01

    Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on technical platforms of both institutions and applied in research of climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of huge heterogeneous climate and environmental geospatial datasets used in the project, their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, data processing results will be exported through WMS and WFS services to provide interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.

  14. EPA Geospatial Quality Council Strategic and Implementation Plan 2010 to 2015

    EPA Science Inventory

    The EPA Geospatial Quality Council (GQC) was created to promote and provide Quality Assurance guidance for the development, use, and products of geospatial science. The GQC was created when the gap between the EPA Quality Assurance (QA) and Geospatial communities was recognized. ...

  15. US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY GEOSPATIAL SOLUTIONS

    EPA Science Inventory

    This presentation will discuss the history, strategy, products, and future plans of the EPA Geospatial Quality Council (GQC). A topical review of GQC products will be presented, including:

    o Guidance for Geospatial Data Quality Assurance Project Plans.

    o GPS - Tec...

  16. A Dynamic Information Framework (DIF): A Portal for the Changing Biogeochemistry of Aquatic Systems

    NASA Astrophysics Data System (ADS)

    Richey, J. E.; Fernandes, E. C. M.

    2014-12-01

    The ability of societies to adapt to climate and land use change in aquatic systems is functionally and practically expressed by how regional stakeholders are able to address complex management issues. These targets represent a very complex set of intersecting issues of scale, cross-sector science and technology, education, politics, and economics. Implications transcend individual projects and ministries. An immediate challenge is to incorporate the realities of changing environmental conditions in these sectors into the policies and projects of the Ministries nominally responsible. Ideally this would be done on the basis of the best possible understanding of the issues involved, and done in a way that optimizes a multi-stakeholder return. Central to a response is "actionable information": the synthesis and "bringing to life" of the key information that integrates the end-to-end knowledge required to provide the high-level decision support needed to make the most informed decisions. But, in practice, the necessary information, and even the necessary perspectives, are virtually absent in much of the developing world. To meet this challenge, we have been developing a Dynamic Information Framework (DIF), primarily through collaborations with the World Bank in Asia, Africa, and Brazil. The DIF is essentially a decision support structure built around "earth system" models. The environment is built on progressive information layers that are fed through hydrological and geospatial landscape models to produce outputs that address specific science questions related to water resources management of the region. Information layers from diverse sources are assembled, according to the principles of how the landscape is organized, and computer models are used to bring the information "to life." A fundamental aspect of a DIF is not only the convergence of multi-sector information, but also how that information can be conveyed in the most compelling and visual manner. Deployment of the environment in the Cloud facilitates access for stakeholders.

  17. Searching and exploitation of distributed geospatial data sources via the Naval Research Lab's Geospatial Information Database (GIDB) Portal System

    NASA Astrophysics Data System (ADS)

    McCreedy, Frank P.; Sample, John T.; Ladd, William P.; Thomas, Michael L.; Shaw, Kevin B.

    2005-05-01

    The Naval Research Laboratory's Geospatial Information Database (GIDB™) Portal System has been extended to now include an extensive geospatial search functionality. The GIDB Portal System interconnects over 600 distributed geospatial data sources via the Internet with a thick client, thin client and a PDA client. As the GIDB Portal System has rapidly grown over the last two years (adding hundreds of geospatial sources), the obvious requirement has arisen to more effectively mine the interconnected sources in near real-time. How the GIDB Search addresses this issue is the prime focus of this paper.

  18. High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals

    NASA Astrophysics Data System (ADS)

    WANG, X.; Huang, G.

    2017-12-01

    Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult due to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, how to understand and interpret the large-volume climate data and how to make use of the data to drive impact assessment and adaptation studies are still challenging for both impact researchers and decision makers. Such difficulties have become major barriers preventing informed climate change adaptation planning at regional scales. Therefore, this research will explore new GIS- and web-based technologies to help visualize the large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named the Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers functions of visual representation through geospatial maps and data downloading for a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25-50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.

  19. Improved hydrological modeling using AGWA; incorporation of different management practices in hydrological modeling.

    NASA Astrophysics Data System (ADS)

    Vithanage, J.; Miller, S. N.; Paige, G. B.; Liu, T.

    2017-12-01

    We present a novel way to simulate the effects of rangeland management decisions in a GIS-based hydrologic modeling toolkit. We have implemented updates to the Automated Geospatial Watershed Assessment tool (AGWA) in which a landscape can be broken into management units (e.g., high intensity grazing, low intensity grazing, fire management, and unmanaged), each of which is assigned a different hydraulic conductivity (Ks) parameter in the KINEmatic Runoff and EROSion model (KINEROS2). These updates are designed to provide modeling support to land managers tasked with rangeland watershed management planning and/or monitoring, and evaluation of water resources management. Changes to hydrologic processes and the resulting hydrographs and sedigraphs are simulated within the AGWA framework. Case studies are presented in which a user selects various management scenarios and design storms, and the model identifies areas that become susceptible to change as a consequence of management decisions. The baseline (unmanaged) scenario is built using commonly available GIS data, after which the watershed is subdivided into management units. We used an array of design storms with various return periods and frequencies to evaluate the impact of management practices while changing the scale of the watershed. Watershed parameters governing interception, infiltration, and surface runoff were determined with the aid of literature published on research studies carried out in the Walnut Gulch Experimental Watershed in southeast Arizona. We observed varied but significant changes in hydrological responses (runoff) with different management practices as well as with varied watershed scales. Results show that the toolkit can be used to quantify potential hydrologic change as a result of unitized land use decision-making.
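
    A simplified sketch of the parameterization idea: each management unit is mapped to its own saturated hydraulic conductivity (Ks), and an infiltration-excess runoff estimate responds to the management choice. KINEROS2 itself uses a full infiltration and routing scheme; the constant-rate Hortonian approximation and the Ks values below are illustrative assumptions only.

        # Simplified illustration of assigning hydraulic conductivity (Ks) by
        # rangeland management unit and estimating infiltration-excess runoff.
        # KINEROS2 uses a full infiltration/routing scheme; the constant-rate
        # approximation and the Ks values below are illustrative assumptions.
        KS_MM_PER_HR = {          # hypothetical effective Ks per management unit
            "unmanaged":            10.0,
            "low_intensity_graze":   8.0,
            "high_intensity_graze":  4.0,
            "post_fire":             2.5,
        }

        def runoff_depth(rain_intensity_mm_hr, duration_hr, management):
            """Infiltration-excess (Hortonian) runoff depth for a uniform storm."""
            ks = KS_MM_PER_HR[management]
            excess_rate = max(0.0, rain_intensity_mm_hr - ks)
            return excess_rate * duration_hr  # mm over the management unit

        # Compare a 25 mm/hr, 1-hour design storm across management scenarios.
        storm = (25.0, 1.0)
        for unit in KS_MM_PER_HR:
            print(f"{unit:22s} -> {runoff_depth(*storm, unit):5.1f} mm runoff")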

  20. Pamela Gray-Hann | NREL

    Science.gov Websites

    Pamela Gray-Hann is a Project Support Specialist and a member of the Geospatial Data Science team at NREL (Pamela.Gray.hann@nrel.gov, 303-275-4626).

  1. Integrated web system of geospatial data services for climate research

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets based on a combination of web and GIS technologies within the spatial data infrastructure paradigm is presented. According to this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  2. Providing R-Tree Support for Mongodb

    NASA Astrophysics Data System (ADS)

    Xiang, Longgang; Shao, Xiaotian; Wang, Dehao

    2016-06-01

    Supporting large amounts of spatial data is a significant characteristic of modern databases. However, unlike some mature relational databases, such as Oracle and PostgreSQL, most current burgeoning NoSQL databases are not well designed for storing geospatial data, which is becoming increasingly important in various fields. In this paper, we propose a novel method to provide an R-tree index, as well as corresponding spatial range query and nearest neighbour query functions, for MongoDB, one of the most prevalent NoSQL databases. First, after in-depth analysis of MongoDB's features, we devise an efficient tabular document structure which flattens the R-tree index into MongoDB collections. Further, relevant mechanisms for R-tree operations are presented, and we discuss in detail how to integrate the R-tree into MongoDB. Finally, we present experimental results which show that our proposed method outperforms the built-in spatial index of MongoDB. Our research will greatly facilitate big data management with MongoDB in a variety of geospatial information applications.
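
    One way to picture the flattened structure the authors describe is to store each R-tree node as a document that records its bounding box and child references; the document schema, field names, and query routine below are assumptions for illustration, not the paper's actual design.

        # Illustrative flattening of R-tree nodes into a MongoDB collection with
        # pymongo. The schema and field names are assumptions for illustration,
        # not the structure proposed in the paper.
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")  # assumes a local server
        nodes = client["rtree_demo"]["nodes"]
        nodes.drop()

        # Each document is one R-tree node: a minimum bounding rectangle (MBR),
        # a level (0 = leaf), and either child node ids or leaf entry ids.
        nodes.insert_many([
            {"_id": "root", "level": 1, "mbr": [0, 0, 100, 100],
             "children": ["n1", "n2"]},
            {"_id": "n1", "level": 0, "mbr": [0, 0, 50, 60],
             "entries": ["parcel_17", "parcel_23"]},
            {"_id": "n2", "level": 0, "mbr": [40, 30, 100, 100],
             "entries": ["parcel_41"]},
        ])

        def intersects(a, b):
            return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

        def range_query(node_id, window):
            """Descend the flattened tree, following only nodes whose MBR overlaps."""
            node = nodes.find_one({"_id": node_id})
            if not intersects(node["mbr"], window):
                return []
            if node["level"] == 0:
                return node["entries"]
            results = []
            for child in node["children"]:
                results.extend(range_query(child, window))
            return results

        print(range_query("root", (45, 35, 55, 65)))  # window overlaps both leaves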

  3. Science framework for the conservation and restoration strategy of DOI secretarial order 3336: Utilizing resilience and resistance concepts to assess threats to sagebrush ecosystems and greater sage-grouse, prioritize conservation and restoration actions, and inform management strategies

    USGS Publications Warehouse

    Chambers, Jeanne C.; Campbell, Steve; Carlson, John; Beck, Jeffrey L.; Clause, Karen J.; Dinkins, Jonathan B.; Doherty, Kevin E.; Espinosa, Shawn; Griffin, Kathleen A.; Christiansen, Thomas J.; Crist, Michele R.; Hanser, Steven E.; Havlina, Douglas W.; Henke, Kenneth F.; Hennig, Jacob D.; Kurth, Laurie L.; Maestas, Jeremy D.; Mayer, Kenneth E.; Manning, Mary E.; Mealor, Brian A.; McCarthy, Clinton; Pellant, Mike; Prentice, Karen L.; Perea, Marco A.; Pyke, David A.; Wiechman , Lief A.; Wuenschel, Amarina

    2016-01-01

    The Science Framework for the Conservation and Restoration Strategy of the Department of the Interior, Secretarial Order 3336 (SO 3336), Rangeland Fire Prevention, Management and Restoration, provides a strategic, multiscale approach for prioritizing areas for management and determining effective management strategies across the sagebrush biome. The emphasis of this version is on sagebrush ecosystems and greater sage-grouse. The Science Framework uses a six-step process in which sagebrush ecosystem resilience to disturbance and resistance to nonnative, invasive annual grasses is linked to species habitat information based on the distribution and abundance of focal species. The predominant ecosystem and anthropogenic threats are assessed, and a habitat matrix is developed that helps decision makers evaluate risks and determine appropriate management strategies at regional and local scales. Areas are prioritized for management action using a geospatial approach that overlays resilience and resistance, species habitat information, and predominant threats. Decision tools are discussed for determining the suitability of priority areas for management and the most appropriate management actions at regional to local scales. The Science Framework and geospatial crosscut are intended to complement the mitigation strategies associated with the Greater Sage-Grouse Land Use Plan amendments for the Department of the Interior Bureaus, such as the Bureau of Land Management and the U.S. Forest Service.

  4. Geospatial strategy for sustainable management of municipal solid waste for growing urban environment.

    PubMed

    Pandey, Prem Chandra; Sharma, Laxmi Kant; Nathawat, Mahendra Singh

    2012-04-01

    This paper presents the implementation of a geospatial approach for improving the assessment of Municipal Solid Waste (MSW) disposal site suitability in a growing urban environment. Population growth and the absolute amounts of waste disposed of worldwide have increased substantially, reflecting changes in consumption patterns; consequently, MSW is now a bigger problem than ever. Despite an increase in alternative techniques for disposing of waste, land-filling remains the primary means. In this context, the pressures and requirements placed on decision makers dealing with land-filling by government and society have increased, as they now have to make decisions taking into consideration environmental safety and economic practicality. The waste disposed of by the municipal corporation in Bhagalpur City (India) differs from typical landfill waste, and clear scientific criteria for locating suitable disposal sites do not appear to exist. The current locations of disposal sites in Bhagalpur City reflect a lack of awareness of the environmental and public health hazards arising from disposing of waste in improper locations. With respect to the urban environment and public health, sound waste management methods and appropriate technologies are needed for the urban area of Bhagalpur City; this study improves on current practice by using multi-criteria analysis with Geographical Information Systems and Remote Sensing for the selection of suitable disposal sites. GIS was used to screen out restricted areas and narrow the analysis to highly suitable land according to the chosen criteria. GIS modeling with overlay operations was used to identify suitable sites for MSW disposal.
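
    A minimal sketch of the multi-criteria GIS screening described above: restricted areas (e.g., near rivers or settlements) are masked out first, and the remaining cells are scored by a weighted overlay of criteria. The criteria rasters, exclusion distances, and weights are invented for illustration.

        # Illustrative multi-criteria screening for landfill siting: mask restricted
        # cells, then score the rest with a weighted overlay. Criteria rasters,
        # exclusion rules, and weights are invented for illustration.
        import numpy as np

        rng = np.random.default_rng(7)
        shape = (150, 150)

        dist_to_river = rng.uniform(0, 5000, shape)       # metres (placeholder)
        dist_to_settlement = rng.uniform(0, 5000, shape)  # metres (placeholder)
        slope_pct = rng.uniform(0, 30, shape)             # percent (placeholder)
        soil_suitability = rng.integers(1, 6, shape)      # rank 1-5 (placeholder)

        # Hard constraints: exclude cells too close to rivers or settlements.
        allowed = (dist_to_river > 500) & (dist_to_settlement > 1000)

        # Weighted overlay on the remaining cells (criteria scaled to 0-1).
        score = (0.4 * soil_suitability / 5.0
                 + 0.3 * np.clip(dist_to_settlement / 5000.0, 0, 1)
                 + 0.3 * (1.0 - np.clip(slope_pct / 30.0, 0, 1)))
        score = np.where(allowed, score, np.nan)

        suitable = score > 0.7
        print(f"{suitable.sum() / allowed.sum():.1%} of allowed cells are highly suitable")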

  5. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    NASA Astrophysics Data System (ADS)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made big progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is tied to the professional department's projects, whose cycles are usually too long for end-users; moving data from collection to publication costs the professional department too much time and energy; and the geospatial details often lack sufficient attributes. Thus, finding effective ways to deal with these problems has become important. Emerging Internet technologies, 3S techniques, and the spread of geographic information knowledge among the public have promoted the booming development of volunteered geospatial information. Volunteered geospatial information is a current "hotspot", attracting many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, applications, and so on. In addition, a few scholars also pay attention to the value of VGI in supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
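
    The matching step between VGI and SDI vector data can be sketched as a combination of spatial proximity and name similarity; the distance threshold, similarity threshold, and sample features below are illustrative assumptions.

        # Illustrative matching of homonymous elements between VGI and SDI vector
        # data: a VGI feature matches an SDI feature when it lies within a distance
        # threshold and its name is sufficiently similar. Thresholds and sample
        # features are assumptions for the sketch.
        from difflib import SequenceMatcher
        from shapely.geometry import Point

        def name_similarity(a, b):
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def match_feature(vgi_feat, sdi_feats, max_dist=50.0, min_name_sim=0.6):
            """Return the best-matching SDI feature for one VGI feature, or None."""
            best, best_score = None, 0.0
            for sdi_feat in sdi_feats:
                if vgi_feat["geom"].distance(sdi_feat["geom"]) > max_dist:
                    continue  # too far apart in projected (metre) coordinates
                score = name_similarity(vgi_feat["name"], sdi_feat["name"])
                if score >= min_name_sim and score > best_score:
                    best, best_score = sdi_feat, score
            return best

        sdi = [{"id": "S1", "name": "Central Market", "geom": Point(500100, 4649800)},
               {"id": "S2", "name": "City Library",   "geom": Point(500900, 4649100)}]
        vgi = {"name": "central market hall", "geom": Point(500120, 4649790)}

        hit = match_feature(vgi, sdi)
        print(hit["id"] if hit else "unmatched: candidate for change detection")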

  6. Geomatics Education: Need Assessment

    NASA Astrophysics Data System (ADS)

    Vyas, A.

    2014-11-01

    The education system is divided into two classes: formal and informal. Formal education establishes the basis of theory and practical learning, whereas informal education is largely self-learning, learning from real-world projects. Generally, science and technology streams require formal methods of education. The social and related aspects can be taught through the other methods. Education is a medium through which the foundation of knowledge and skill is built. Statistics reveal an increasing trend in the literate population. This may be attributed to urbanization and migration to cities in search of "white-collar" jobs. As a result, a shift in the employment structure is observed from the primary sector to the secondary and tertiary sectors. Thomas Friedman, in his book 'The World is Flat', describes the impact of globalization on the adoption of science and technology: the world has shrunk from large to tiny. One of the technologies to mention here is geospatial technology. With advancements in satellite remote sensing, geographical information systems, and global positioning systems, database management systems have become important subject areas. Countries are allocating huge budgets to space technology, which includes education, training and research. Today many developing countries do not have base maps; they lack systematic data and record keeping, which are essential for governance, decision making and other development purposes. No trained manpower is available, and no standard hardware and software have been identified. An imbalance is observed: while the government is promoting the use of geospatial technology, there is neither trained manpower nor experts available to review the accuracy of the spatial data developed. Very few universities impart degree-level education, very few trained faculty members provide standard education, and a standard syllabus is lacking. On the other hand, the industry requires highly skilled and highly experienced manpower. This is a low-equilibrium situation. Since the need is growing day by day and the shortage of skilled manpower is increasing, the need for geomatics education emerges. This paper researches the need assessment of education in geospatial specialization. It emphasises the challenges and issues prevailing in geospatial education and in the specialized fields of remote sensing and GIS. The paper analyses the need assessment through all three actors: government, the geospatial industry and educational institutions.

  7. Computing quality scores and uncertainty for approximate pattern matching in geospatial semantic graphs

    DOE PAGES

    Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; ...

    2015-09-26

    Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. We present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.
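
    As a toy illustration of the kind of quality scoring discussed (not the authors' actual method), the sketch below scores a matched subgraph by the mean confidence of its required edges and attaches a bootstrap uncertainty interval.

```python
# Toy illustration (not the authors' method): score how well a candidate
# subgraph matches a query pattern when edge observations are uncertain,
# and attach a simple uncertainty bound via bootstrap resampling.
import random

def match_quality(edge_probs):
    """Quality = mean probability that each required edge is really present."""
    return sum(edge_probs) / len(edge_probs)

def bootstrap_bounds(edge_probs, n_samples=2000, alpha=0.05, seed=0):
    """Approximate (lower, upper) bounds on the quality score."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_samples):
        resampled = [rng.choice(edge_probs) for _ in edge_probs]
        scores.append(match_quality(resampled))
    scores.sort()
    lo = scores[int(alpha / 2 * n_samples)]
    hi = scores[int((1 - alpha / 2) * n_samples) - 1]
    return lo, hi

edge_probs = [0.95, 0.80, 0.60, 0.90]   # detector confidences for the matched edges
print(match_quality(edge_probs), bootstrap_bounds(edge_probs))
```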

  8. The Geospatial Web and Local Geographical Education

    ERIC Educational Resources Information Center

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  9. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European Geospatial Infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. The project B3D therefore strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard, which provides the interface for requesting geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is based solely on Free and Open Source Software and relies on the WebGL JavaScript API, which allows the interactive rendering of 2D and 3D graphics by means of GPU-accelerated physics and image processing as part of the web-page canvas, without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will ease spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. They will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in the decision processes (e-Governance). Yet the interoperability of the systems still has to be strongly propelled forward through agreements on standards that need to be decided upon in the responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
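
    The record describes standard OGC interfaces; the hedged Python sketch below shows the generic WMS GetMap and WFS GetFeature requests such a GeoServer node would answer. The endpoint URL, layer names, and bounding box are placeholders, not the project's actual configuration.

```python
# Generic OGC requests of the kind the B3D node serves through GeoServer.
# The endpoint and layer names below are placeholders, not the project's URLs.
import requests

BASE = "https://example.org/geoserver/ows"   # hypothetical endpoint

# WMS 1.3.0 GetMap: a geo-referenced map image of a 2D geological layer
wms_params = {
    "service": "WMS", "version": "1.3.0", "request": "GetMap",
    "layers": "geology:map_units", "styles": "",
    "crs": "EPSG:25833", "bbox": "250000,5700000,450000,5900000",
    "width": 800, "height": 600, "format": "image/png",
}
png = requests.get(BASE, params=wms_params, timeout=30).content

# WFS 2.0.0 GetFeature: well features as GeoJSON for client-side rendering
wfs_params = {
    "service": "WFS", "version": "2.0.0", "request": "GetFeature",
    "typeNames": "geology:wells", "outputFormat": "application/json",
}
wells = requests.get(BASE, params=wfs_params, timeout=30).json()
print(len(png), len(wells.get("features", [])))
```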

  10. Data Model for Multi Hazard Risk Assessment Spatial Support Decision System

    NASA Astrophysics Data System (ADS)

    Andrejchenko, Vera; Bakker, Wim; van Westen, Cees

    2014-05-01

    The goal of the CHANGES Spatial Decision Support System is to support end users in making decisions related to risk-reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The data model is implemented entirely with an open-source database management system with a spatial extension. The web application is implemented using open-source geospatial technologies, with PostGIS as the database, Python for scripting, and GeoServer and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units; projects accessible by different types of users; user-defined hazard types (floods, snow avalanches, debris flows, etc.); hazard intensity maps of different return periods; spatial probability maps; elements-at-risk maps (buildings, land parcels, linear features, etc.); and economic and population vulnerability information dependent on the hazard type and the type of the element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of combinations of different scenarios (e.g., related to climate change, land-use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregating the loss and exposure using the administrative unit maps, and finally producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and spatial multi-criteria evaluation (SMCE), and the data model includes data structures for both. The model is at the stage where risk and cost-benefit calculations can be stored, while the remaining parts are under development; multi-criteria information, user management, and their relation to the rest of the model are our next step. A carefully designed data model plays a crucial role in the development of the whole system: it enables rapid development, keeps the data consistent, and in the end supports end users in making good decisions on risk-reduction measures related to multiple natural hazards. This work is part of the EU FP7 Marie Curie ITN "CHANGES" project (www.changes-itn.edu).
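
    To illustrate the loss calculation whose results the data model stores, here is a minimal Python sketch, assuming a piecewise-linear vulnerability curve relating hazard intensity to damage fraction; the curve values, exposed value, and water depth are invented for the example.

```python
# Sketch of the loss calculation that the data model stores: interpolate a
# vulnerability curve at the hazard intensity affecting an element at risk,
# then multiply by the exposed value. All numbers are illustrative only.
from bisect import bisect_left

def vulnerability(intensity, curve):
    """Piecewise-linear interpolation of an (intensity, damage-fraction) curve."""
    xs, ys = zip(*curve)
    if intensity <= xs[0]:
        return ys[0]
    if intensity >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, intensity)
    t = (intensity - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

flood_curve = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.45), (2.0, 0.8), (3.0, 1.0)]  # depth (m) -> damage
building_value = 250_000          # exposed value of one element at risk (EUR)
water_depth = 1.4                 # hazard intensity for a given return period
loss = vulnerability(water_depth, flood_curve) * building_value
print(f"expected loss: {loss:,.0f} EUR")
```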

  11. Smartkadaster: Observing Beyond Traditional Cadastre Capabilities for Malaysia

    NASA Astrophysics Data System (ADS)

    Isa, M. N. Bin; Hua, T. C.; Halim, N. Z. Binti Abdul

    2015-10-01

    The digital age for cadastral surveying started in stages more than 20 years ago in Malaysia, and JUPEM played a vital role in its successful nationwide implementation. One of the key products of cadastral survey is the cadastral map, which provides useful information for any land information system. However, as technology has evolved and users have become accustomed to simplicity, better services are anticipated, and this has affected how cadastral survey information is perceived. A paradigm shift is necessary in which enriched cadastral information is required for multiple uses and enables truly cadastral-information-based services for users. On that note, JUPEM has set out to develop a system in which the National Digital Cadastral Database is value-added with other geospatial information for a smart, multipurpose environment and can clearly serve as a decision-making tool with the aid of realistic 3D spatial data, namely SmartKADASTER. SmartKADASTER is an ongoing project developed by JUPEM with the aim of establishing a realistic, SMART cadastral-based spatial analysis platform for effective planning and decision making, enabling efficiencies and enhancing communication and management to support SMART services towards SMART City enablement in Malaysia. It is being developed in phases, with the Federal Territories of Putrajaya and Kuala Lumpur as the initial project implementation area. This paper provides awareness and insights into the ongoing development of the project and how it could benefit potential users and stakeholders.

  12. Developing an educational curriculum for EnviroAtlas ...

    EPA Pesticide Factsheets

    EnviroAtlas is a web-based tool developed by the EPA and its partners, which provides interactive tools and resources for users to explore the benefits that people receive from nature, often referred to as ecosystem goods and services. Ecosystem goods and services are important to human health and well-being. Using EnviroAtlas, users can access, view, and analyze diverse information to better understand the potential impacts of decisions. EnviroAtlas provides two primary tools, the Interactive Map and the Eco-Health Relationship Browser. EnviroAtlas integrates geospatial data from a variety of sources so that users can visualize the impacts of decision-making on ecosystems. The Interactive Map allows users to investigate various ecosystem elements (i.e., land cover, pollution, and community development) and compare them across localities in the United States. The best part of the Interactive Map is that it does not require specialized mapping software; rather, it requires only a computer and an internet connection. As such, it can be used as a powerful educational tool. The Eco-Health Relationship Browser is also a web-based, highly interactive tool that uses existing scientific literature to visually demonstrate the connections between the environment and human health. As an ASPPH/EPA Fellow with a background in environmental science and secondary science education, I am currently developing an educational curriculum to support the EnviroAtlas.

  13. On the Science-Policy Bridge: Do Spatial Heat Vulnerability Assessment Studies Influence Policy?

    PubMed

    Wolf, Tanja; Chuang, Wen-Ching; McGregor, Glenn

    2015-10-23

    Human vulnerability to heat varies at a range of spatial scales, especially within cities, where there can be noticeable intra-urban differences in heat risk factors. Mapping and visualizing intra-urban heat vulnerability offers opportunities for presenting information to support decision-making. For example, the visualization of the spatial variation of heat vulnerability has the potential to enable local governments to identify hot spots of vulnerability and to allocate resources and increase assistance to people in areas of greatest need. Recently there has been a proliferation of heat vulnerability mapping studies, all of which, to varying degrees, justify the process of vulnerability mapping in a policy context. However, to date, there has not been a systematic review of the extent to which the results of vulnerability mapping studies have been applied in decision-making. Accordingly, we undertook a comprehensive review of 37 recently published papers that use geospatial techniques for assessing human vulnerability to heat. In addition, we conducted an anonymous survey of the lead authors of the 37 papers in order to establish the level of interaction between the researchers as science information producers and local authorities as information users. Both the paper review and the author survey results show that heat vulnerability mapping has been used in an attempt to communicate policy recommendations, raise awareness, and induce institutional networking and learning, but has not yet had a substantive influence on policymaking or preventive action.

  14. Making the Most of MASINT and Advanced Geospatial Intelligence

    DTIC Science & Technology

    2012-04-10

    mining contracts and new job creation that ultimately supports economic development. This type of forensic-level analysis can make MASINT an...

  15. Chapter 1 - Executive summary

    Treesearch

    Matthew G. Rollins; Robert E. Keane; Zhiliang Zhu

    2006-01-01

    Geospatial data describing wildland fuel and current as well as historical vegetation conditions are essential for planning, implementing, and monitoring projects supported by the National Fire Plan and the Healthy Forests Restoration Act. Scientifically credible, consistent, and standardized spatial data allow fire and land managers to accurately identify the amount...

  16. Automated Geospatial Watershed Assessment (AGWA)Documentation Version 2.0

    EPA Science Inventory

    What are the human impacts of environmental change? How might land be used and what would be the potential benefits or consequences? Numerous questions arise as the world we know becomes smaller in our perception and the human population it supports becomes more dependent on the ...

  17. Data management for geospatial vulnerability assessment of interdependencies in US power generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shih, C.Y.; Scown, C.D.; Soibelman, L.

    2009-09-15

    Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.
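
    A much-simplified Python sketch of the interdependency logic described above is given below: given an impacted-area polygon, it flags mines inside the area and propagates the disruption to power plants that depend on them. All coordinates and dependencies are fictitious.

```python
# Simplified sketch of the interdependency analysis: flag mines inside an
# impacted area and propagate the disruption to dependent power plants.
# Locations and supply relationships are fictitious placeholders.
from shapely.geometry import Point, Polygon

impacted_area = Polygon([(-90.5, 35.5), (-88.5, 35.5), (-88.5, 37.5), (-90.5, 37.5)])  # quake footprint

mines = {"mine_A": Point(-89.6, 36.4), "mine_B": Point(-84.2, 38.1)}
plants = {  # plant -> (location, supplying mines)
    "plant_1": (Point(-90.1, 36.9), {"mine_A"}),
    "plant_2": (Point(-85.0, 38.5), {"mine_A", "mine_B"}),
}

disrupted_mines = {m for m, pt in mines.items() if impacted_area.contains(pt)}
for name, (loc, suppliers) in plants.items():
    direct = impacted_area.contains(loc)
    upstream = suppliers & disrupted_mines
    if direct or upstream:
        print(f"{name}: direct hit={direct}, disrupted suppliers={sorted(upstream)}")
```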

  18. Geospatial Perspective: Toward a Visual Political Literacy Project in Education, Health, and Human Services

    ERIC Educational Resources Information Center

    Hogrebe, Mark C.; Tate, William F., IV

    2012-01-01

    In this chapter, "geospatial" refers to geographic space that includes location, distance, and the relative position of things on the earth's surface. Geospatial perspective calls for the addition of a geographic lens that focuses on place and space as important contextual variables. A geospatial view increases one's understanding of…

  19. Geospatial Data Curation at the University of Idaho

    ERIC Educational Resources Information Center

    Kenyon, Jeremy; Godfrey, Bruce; Eckwright, Gail Z.

    2012-01-01

    The management and curation of digital geospatial data has become a central concern for many academic libraries. Geospatial data is a complex type of data critical to many different disciplines, and its use has become more expansive in the past decade. The University of Idaho Library maintains a geospatial data repository called the Interactive…

  20. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. Implementing rapid plotting, real-time sharing, and mapping of surrounding affairs has taken on great significance for government policy makers and diplomatic staff. However, existing boundary information systems suffer from several problems: updating geospatial data requires a heavy workload, plotting tools are seriously lacking, and geographic events are difficult to share, which has seriously hampered the smooth execution of border tasks. The development of geographic information system technology, and of Web GIS in particular, offers the possibility of solving these problems. This paper adopts a four-layer B/S (browser/server) architecture and, with the support of the Google Maps service, uses the free API offered by Google Maps, together with its openness, ease of use, sharing features, and high-resolution imagery, to design and implement a surrounding transaction plotting and management system based on the web development technologies ASP.NET, C#, and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has proved that the system has good usability and strong real-time performance.

  1. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and interoperate geospatial resources by using Grid technology and extends Grid technology into the geoscience communities.
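
    As a hedged illustration of how a client would query such a standards-compliant catalogue, the Python sketch below issues a CSW 2.0.2 GetRecords request with a CQL keyword constraint and lists the returned record titles; the endpoint URL is a placeholder, not the GMU/CSISS service.

```python
# Sketch of querying an OGC CSW catalogue endpoint (hypothetical URL) for
# records matching a keyword, using the standard CSW 2.0.2 KVP binding.
import requests
import xml.etree.ElementTree as ET

CSW_URL = "https://example.org/csw"   # placeholder endpoint

params = {
    "service": "CSW", "version": "2.0.2", "request": "GetRecords",
    "typeNames": "csw:Record", "resultType": "results",
    "elementSetName": "brief", "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%land cover%'",
}
resp = requests.get(CSW_URL, params=params, timeout=30)
root = ET.fromstring(resp.content)

ns = {"csw": "http://www.opengis.net/cat/csw/2.0.2",
      "dc": "http://purl.org/dc/elements/1.1/"}
for rec in root.findall(".//csw:BriefRecord", ns):
    print(rec.findtext("dc:title", default="(no title)", namespaces=ns))
```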

  2. Ontology for Transforming Geo-Spatial Data for Discovery and Integration of Scientific Data

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.

    2013-12-01

    Discovery of and access to geo-spatial scientific data across heterogeneous repositories and multi-discipline datasets can present challenges for scientists. We propose to build a workflow for transforming geo-spatial datasets into a semantic environment by using relationships to describe each resource with the OWL Web Ontology Language, RDF, and a proposed geo-spatial vocabulary. We will present methods for transforming traditional scientific datasets, the use of a semantic repository, and querying using SPARQL to integrate and access datasets. This unique repository will enable discovery of scientific data by geospatial bounds or other criteria.
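
    The sketch below illustrates the proposed query step with rdflib: a few dataset descriptions are loaded as RDF and a SPARQL query retrieves those whose bounding box overlaps a region of interest. The vocabulary URIs and property names are invented placeholders, not the authors' ontology.

```python
# Sketch of the query step: load RDF triples describing geo-spatial datasets
# and retrieve those whose bounding box intersects a region of interest.
# The ex: vocabulary below is an invented placeholder.
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/geo#> .
ex:dsA ex:title "Cloud properties 2010" ; ex:minLat 30 ; ex:maxLat 50 ;
       ex:minLon -110 ; ex:maxLon -80 .
ex:dsB ex:title "Aerosol optical depth" ; ex:minLat -10 ; ex:maxLat 10 ;
       ex:minLon 0 ; ex:maxLon 40 .
""", format="turtle")

query = """
PREFIX ex: <http://example.org/geo#>
SELECT ?title WHERE {
  ?ds ex:title ?title ; ex:minLat ?minLat ; ex:maxLat ?maxLat ;
      ex:minLon ?minLon ; ex:maxLon ?maxLon .
  FILTER (?maxLat >= 35 && ?minLat <= 45 && ?maxLon >= -100 && ?minLon <= -90)
}
"""
for row in g.query(query):       # returns datasets overlapping the ROI
    print(row.title)
```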

  3. Surface Water and Flood Extent Mapping, Monitoring, and Modeling Products and Services for the SERVIR Regions

    NASA Technical Reports Server (NTRS)

    Anderson, Eric

    2016-01-01

    SERVIR is a joint NASA - US Agency for International Development (USAID) project to improve environmental decision-making using Earth observations and geospatial technologies. A common need identified among the SERVIR regions has been improved information for disaster risk reduction, specifically surface water and flood extent mapping, monitoring, and forecasting. Of the 70 SERVIR products (active, complete, and in development), 4 are related to surface water and flood extent mapping, monitoring, or forecasting. Visit http://www.servircatalog.net for more product details.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.; Johnson, Gary E.; Borde, Amy B.

    Pacific Northwest National Laboratory (PNNL) conducted this project for the U.S. Army Corps of Engineers, Portland District (Corps). The purpose of the project is to develop a geospatial, web-accessible database (called “Oncor”) for action effectiveness and related data from monitoring and research efforts for the Columbia Estuary Ecosystem Restoration Program (CEERP). The intent is for the Oncor database to enable synthesis and evaluation, the results of which can then be applied in subsequent CEERP decision-making. This is the first annual report in what is expected to be a 3- to 4-year project, which commenced on February 14, 2012.

  5. Cloud-based federation and fusion of distributed data sources for supporting hurricane response : requirements, challenges, and opportunities.

    DOT National Transportation Integrated Search

    2016-12-01

    Geospatial data have been playing an increasingly important role in disaster response and recovery. For large-scale natural disasters such as Hurricanes which often have the capacity to topple a large region within a span of a few days, disaster prep...

  6. BEYOND REGULATION TO PROTECTION. THE APPLICATION OF NATIONAL RECONNAISSANCE SYSTEMS IN THE SCIENCE MISSION OF THE ENVIRONMENTAL PROTECTION AGENCY

    EPA Science Inventory

    The use of National Technical Means (NTM) data and advanced geospatial technologies has an important role in supporting the mission of the Environmental Protection Agency (EPA). EPA's responsibilities have grown beyond pollution compliance monitoring and enforcement to include t...

  7. NREL: International Activities - Philippines Wind Resource Maps and Data

    Science.gov Websites

    Philippines Wind Resource Maps and Data In 2014, under the Enhancing Capacity for Low Emission National Wind Technology Center and Geospatial Data Science Team applied modern approaches to update previous estimates to support the development of wind energy potential in the Philippines. The new

  8. Maps | Geospatial Data Science | NREL

    Science.gov Websites

    Maps Maps NREL develops an array of maps to support renewable energy development and generation resource in the United States by county Geothermal Maps of geothermal power plants, resources for enhanced geothermal systems, and hydrothermal sites in the United States Hydrogen Maps of hydrogen production

  9. Methods and Tools to Align Curriculum to the Skills and Competencies Needed by the Workforce - an Example from Geospatial Science and Technology

    NASA Astrophysics Data System (ADS)

    Johnson, A. B.

    2012-12-01

    Geospatial science and technology (GST), including geographic information systems, remote sensing, global positioning systems and mobile applications, are valuable tools for geoscientists and students learning to become geoscientists. GST allows the user to analyze data spatially and temporally and then visualize the data and outcomes in multiple formats (digital, web and paper). GST has evolved rapidly, and it has been difficult to create effective curriculum as few guidelines existed to help educators. In 2010, the US Department of Labor (DoL), in collaboration with the National Geospatial Center of Excellence (GeoTech Center), a National Science Foundation supported grant, approved the Geospatial Technology Competency Model (GTCM). The GTCM was developed and vetted with industry experts and provided the structure and example competencies needed across the industry. While the GTCM was helpful, a more detailed list of skills and competencies needed to be identified in order to build appropriate curriculum. The GeoTech Center carried out multiple DACUM events to identify the skills and competencies needed by entry-level workers. DACUM (Developing a Curriculum) is a job analysis process whereby expert workers are convened to describe what they do for a specific occupation. The outcomes from multiple DACUMs were combined into a MetaDACUM and reviewed by hundreds of GST professionals. This provided a list of more than 320 skills and competencies needed by the workforce. The GeoTech Center then held multiple workshops across the U.S. where more than 100 educators knowledgeable in teaching GST parsed the list into Model Courses and a Model Certificate Program. During this process, tools were developed that help educators define which competencies should be included in a specific course and the depth of instruction for each competency. This presentation will provide details about the process, methodology and tools used to create the Models and suggest how they can be used to create customized curriculum integrating geospatial science and technology into geoscience programs.

  10. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed due to administrative, political, but mostly technical barriers. Recent efforts to provide unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observation Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near real time. OGC SWE proposes a revolutionary concept towards web-connected and controllable sensor networks. However, these efforts have not provided the capability for dynamic data integration/fusion among heterogeneous sources, data filtering, or support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open-source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth). This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.

  11. NCI's Distributed Geospatial Data Server

    NASA Astrophysics Data System (ADS)

    Larraondo, P. R.; Evans, B. J. K.; Antony, J.

    2016-12-01

    Earth systems, environmental and geophysics datasets are an extremely valuable source of information about the state and evolution of the Earth. However, different disciplines and applications require this data to be post-processed in different ways before it can be used. For researchers experimenting with algorithms across large datasets or combining multiple data sets, the traditional approach to batch data processing and storing all the output for later analysis rapidly becomes unfeasible, and often requires additional work to publish for others to use. Recent developments on distributed computing using interactive access to significant cloud infrastructure opens the door for new ways of processing data on demand, hence alleviating the need for storage space for each individual copy of each product. The Australian National Computational Infrastructure (NCI) has developed a highly distributed geospatial data server which supports interactive processing of large geospatial data products, including satellite Earth Observation data and global model data, using flexible user-defined functions. This system dynamically and efficiently distributes the required computations among cloud nodes and thus provides a scalable analysis capability. In many cases this completely alleviates the need to preprocess and store the data as products. This system presents a standards-compliant interface, allowing ready accessibility for users of the data. Typical data wrangling problems such as handling different file formats and data types, or harmonising the coordinate projections or temporal and spatial resolutions, can now be handled automatically by this service. The geospatial data server exposes functionality for specifying how the data should be aggregated and transformed. The resulting products can be served using several standards such as the Open Geospatial Consortium's (OGC) Web Map Service (WMS) or Web Feature Service (WFS), Open Street Map tiles, or raw binary arrays under different conventions. We will show some cases where we have used this new capability to provide a significant improvement over previous approaches.

  12. NASA World Wind Near Real Time Data for Earth

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2013-12-01

    Innovation requires open standards for data exchange, not to mention access to data, so that value-added information intelligence can be continually created and advanced by the larger community. Likewise, innovation by academia and entrepreneurial enterprise alike is greatly benefited by an open platform that provides the basic technology for access to and visualization of that data. NASA World Wind Java, and now NASA World Wind iOS for the iPhone and iPad, provides that technology. Whether the interest is weather science or climate science, emergency response or supply chain, seeing spatial data in its native context of Earth accelerates understanding and improves decision-making. NASA World Wind open source technology provides the basic elements for 4D visualization, using Open Geospatial Consortium (OGC) protocols, while allowing for customized access to any data, big or small, including support for NetCDF. NASA World Wind includes access to a suite of US Government WMS servers with near real-time data. The larger community can readily capitalize on this technology, building their own value-added applications, either open or proprietary. (Example visualizations: night-lights heat map; Glacier National Park.)

  13. Community Needs Assessment and Portal Prototype Development for an Arctic Spatial Data Infrastructure (ASDI)

    NASA Astrophysics Data System (ADS)

    Wiggins, H. V.; Warnick, W. K.; Hempel, L. C.; Henk, J.; Sorensen, M.; Tweedie, C. E.; Gaylord, A. G.

    2007-12-01

    As the creation and use of geospatial data in research, management, logistics, and education applications has proliferated, there is now a tremendous potential for advancing science through a variety of cyber-infrastructure applications, including Spatial Data Infrastructure (SDI) and related technologies. SDIs provide a necessary and common framework of standards, securities, policies, procedures, and technology to support the effective acquisition, coordination, dissemination and use of geospatial data by multiple and distributed stakeholder and user groups. Despite the numerous research activities in the Arctic, there is no established SDI and, because of this lack of a coordinated infrastructure, there is inefficiency, duplication of effort, and reduced data quality and search ability of arctic geospatial data. The urgency for establishing this framework is significant considering the myriad of data that is being collected in celebration of the International Polar Year (IPY) in 2007-2008 and the current international momentum for an improved and integrated circum-arctic terrestrial-marine-atmospheric environmental observatories network. The key objective of this project is to lay the foundation for full implementation of an Arctic Spatial Data Infrastructure (ASDI) through an assessment of community needs, readiness, and resources and through the development of a prototype web-mapping portal.

  14. A geospatial search engine for discovering multi-format geospatial data across the web

    Treesearch

    Christopher Bone; Alan Ager; Ken Bunzel; Lauren Tierney

    2014-01-01

    The volume of publically available geospatial data on the web is rapidly increasing due to advances in server-based technologies and the ease at which data can now be created. However, challenges remain with connecting individuals searching for geospatial data with servers and websites where such data exist. The objective of this paper is to present a publically...

  15. The Use of Geospatial Technologies Instruction within a Student/Teacher/Scientist Partnership: Increasing Students' Geospatial Skills and Atmospheric Concept Knowledge

    ERIC Educational Resources Information Center

    Hedley, Mikell Lynne; Templin, Mark A.; Czaljkowski, Kevin; Czerniak, Charlene

    2013-01-01

    Many 21st century careers rely on geospatial skills; yet, curricula and professional development lag behind in incorporating these skills. As a result, many teachers have limited experience or preparation for teaching geospatial skills. One strategy for overcoming such problems is the creation of a student/teacher/scientist (STS) partnership…

  16. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
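
    A toy Python sketch of the underlying idea, rule-based geodata quality checks guarding a service chain with a conditional corrective branch, is given below; the rule names, thresholds, and service names are illustrative, not from the paper.

```python
# Toy sketch of rule-based quality annotations guarding a service chain: each
# step declares the geodata quality it requires, and a conditional plan inserts
# a corrective step when an input does not conform. All names are illustrative.
dataset = {"positional_accuracy_m": 12.0, "cloud_cover_pct": 35.0, "crs": "EPSG:4326"}

quality_rules = {
    "classification": lambda d: d["cloud_cover_pct"] <= 20.0,
    "overlay":        lambda d: d["positional_accuracy_m"] <= 15.0 and d["crs"] == "EPSG:4326",
}

corrective_steps = {"classification": "cloud_masking_service"}  # branch taken on nonconformity

def plan(workflow, data):
    """Build an execution order, inserting corrective branches where rules fail."""
    steps = []
    for service in workflow:
        if not quality_rules[service](data):
            fix = corrective_steps.get(service)
            if fix is None:
                raise ValueError(f"no corrective branch for {service}")
            steps.append(fix)          # conditional branch: repair the input first
        steps.append(service)
    return steps

print(plan(["classification", "overlay"], dataset))
# -> ['cloud_masking_service', 'classification', 'overlay']
```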

  17. EPA National Geospatial Data Policy

    EPA Pesticide Factsheets

    National Geospatial Data Policy (NGDP) establishes principles, responsibilities, and requirements for collecting and managing geospatial data used by Federal environmental programs and projects within the jurisdiction of the U.S. EPA

  18. Towards the Geospatial Web: Media Platforms for Managing Geotagged Knowledge Repositories

    NASA Astrophysics Data System (ADS)

    Scharl, Arno

    International media have recognized the visual appeal of geo-browsers such as NASA World Wind and Google Earth, for example, when Web and television coverage on Hurricane Katrina used interactive geospatial projections to illustrate its path and the scale of destruction in August 2005. Yet these early applications only hint at the true potential of geospatial technology to build and maintain virtual communities and to revolutionize the production, distribution and consumption of media products. This chapter investigates this potential by reviewing the literature and discussing the integration of geospatial and semantic reference systems, with an emphasis on extracting geospatial context from unstructured text. A content analysis of news coverage based on a suite of text mining tools (webLyzard) sheds light on the popularity and adoption of geospatial platforms.

  19. The Inter-American Geospatial Data Network— developing a Western Hemisphere geospatial data clearinghouse

    USGS Publications Warehouse

    Anthony, Michelle L.; Klaver, Jacqueline M.; Quenzer, Robert

    1998-01-01

    The US Geological Survey and US Agency for International Development are enhancing the geographic information infrastructure of the Western Hemisphere by establishing the Inter-American Geospatial Data Network (IGDN). In its efforts to strengthen the Western Hemisphere's information infrastructure, the IGDN is consistent with the goals of the Plan of Action that emerged from the 1994 Summit of the Americas. The IGDN is an on-line cooperative, or clearinghouse, of geospatial data. Internet technology is used to facilitate the discovery and access of Western Hemisphere geospatial data. It was established by using the standards and guidelines of the Federal Geographic Data Committee to provide a consistent data discovery mechanism that will help minimize geospatial data duplication, promote data availability, and coordinate data collection and research activities.

  20. EPA Geospatial Applications

    EPA Pesticide Factsheets

    EPA has developed many applications that allow users to explore and interact with geospatial data. This page highlights some of the flagship geospatial web applications but these represent only a fraction of the total.

  1. Technology Transfer Opportunities: On-Demand Printing in Support of National Geospatial Data

    USGS Publications Warehouse

    ,

    1997-01-01

    The U.S. Geological Survey (USGS) and the 3M Company of St. Paul, Minnesota, have entered into a cooperative research and development agreement (CRADA) to investigate maps-on-demand technology to support the production of USGS mapping products. The CRADA will potentially help the USGS to develop on-demand alternatives to lithographic maps and help 3M to develop a series of commercial instant map-printing systems.

  2. Common Web Mapping and Mobile Device Framework for Display of NASA Real-time Data

    NASA Astrophysics Data System (ADS)

    Burks, J. E.

    2013-12-01

    Scientists have strategic goals to deliver their unique datasets and research to both collaborative partners and, more broadly, to the public. These datasets can have a significant impact locally and globally, as has been shown by the success of the NASA Short-term Prediction Research and Transition (SPoRT) Center and SERVIR programs at Marshall Space Flight Center. Each of these organizations provides near real-time data at the best resolution possible to address concerns of the operational weather forecasting community (SPoRT) and to support environmental monitoring and disaster assessment (SERVIR). However, one of the biggest struggles in delivering the data to these and other Earth science community partners is formatting the product to fit into an end user's Decision Support System (DSS). The problem of delivering the data to the end user's DSS can be a significant impediment to transitioning research to operational environments, especially for disaster response, where delivery time is critical. The decision makers, in addition to the DSS, need seamless access to these same datasets from a web browser or a mobile phone when they are away from their DSS or for personnel out in the field. A framework has been developed for the MSFC Earth Science program that can be used to easily enable seamless delivery of scientific data to end users in multiple formats. The first format is an open geospatial format, Web Map Service (WMS), which is easily integrated into most DSSs. The second format is a web browser display, which can be embedded within any MSFC Science web page with just a few lines of web page coding. The third format is accessible in the form of iOS and Android native mobile applications that can be downloaded from an 'app store'. The framework developed has reduced the level of effort needed to bring new and existing NASA datasets to each of these end-user platforms and helps extend the reach of science data.

  3. Minerva: An Integrated Geospatial/Temporal Toolset for Real-time Science Decision Making and Data Collection

    NASA Astrophysics Data System (ADS)

    Lees, D. S.; Cohen, T.; Deans, M. C.; Lim, D. S. S.; Marquez, J.; Heldmann, J. L.; Hoffman, J.; Norheim, J.; Vadhavk, N.

    2016-12-01

    Minerva integrates three capabilities that are critical to the success of NASA analogs. It combines NASA's Exploration Ground Data Systems (xGDS) and Playbook software, and MIT's Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT). Together, they help to plan, optimize, and monitor traverses; schedule and track activity; assist with science decision-making; and document sample and data collection. Pre-mission, Minerva supports planning with a priori map data (e.g., UAV and satellite imagery) and activity scheduling. During missions, xGDS records and broadcasts live data to a distributed team who take geolocated notes and catalogue samples. Playbook provides live schedule updates and multi-media chat. Post-mission, xGDS supports data search and visualization for replanning and analysis. NASA's BASALT (Biologic Analog Science Associated with Lava Terrains) and FINESSE (Field Investigations to Enable Solar System Science and Exploration) projects use Minerva to conduct field science under simulated Mars mission conditions, including 5- and 15-minute one-way communication delays. During the recent BASALT-FINESSE mission, two field scientists (the EVA team) executed traverses across volcanic terrain to characterize and sample basalts. They wore backpacks with communications and imaging capabilities, and carried field-portable spectrometers. The Science Team was 40 km away in a simulated mission control center. The Science Team monitored imaging (video and still), spectral, voice, location and physiological data from the EVA team via the network from the field, under communication delays. Minerva provided the Science Team with a unified context of operations at the field site, so they could make meaningful remote contributions to the collection of tens of geotagged samples. Minerva's mission architecture will be presented with technical details and capabilities. Through the development, testing and application of Minerva, we are defining requirements for the design of future capabilities to support human and human-robotic missions to deep space and Mars.

  4. Operational data products to support phenological research and applications at local to continental scales

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.

    2017-12-01

    Phenological data from a variety of platforms - across a range of spatial and temporal scales - are required to support research, natural resource management, and policy- and decision-making in a changing world. Observational and modeled phenological data, especially when integrated with associated biophysical data (e.g., climate, land-use/land-cover, hydrology), have great potential to provide multi-faceted information critical to decision support systems, vulnerability and risk assessments, change detection applications, and early-warning and forecasting systems for natural and modified ecosystems. The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on understanding the drivers and feedback effects of phenological variation in changing environments. The Network maintains a centralized database of over 10M ground-based observations of plants and animals for 1954-present, and leverages these data to produce operational data products for use by a variety of audiences, including researchers and resource managers. This presentation highlights our operational data products, including the tools, maps, and services that facilitate discovery, accessibility and usability of integrated phenological information. We describe (1) the data download tool, a customizable GUI that provides geospatially referenced raw, bounded or summarized organismal and climatological data and associated metadata (including calendars, time-series curves, and XY graphs), (2) the visualization tool, which provides opportunities to explore, visualize and export or download both organismal and modeled (gridded) products at daily time-steps and relatively fine spatial resolutions (approximately 2.5 km to 4 km) for the period 1980 to 6 days into the future, and (3) web services that enable custom query and download of map, feature and cover services in a variety of standard formats. These operational products facilitate scaling of integrated phenological and associated data to landscapes and regions, and enable novel investigations of biophysical interactions at unprecedented scales, e.g., continental-scale migrations.

  5. Common Web Mapping and Mobile Device Framework for Display of NASA Real-time Data

    NASA Technical Reports Server (NTRS)

    Burks, Jason

    2013-01-01

    Scientists have strategic goals to deliver their unique datasets and research to both collaborative partners and, more broadly, to the public. These datasets can have a significant impact locally and globally, as has been shown by the success of the NASA Short-term Prediction Research and Transition (SPoRT) Center and SERVIR programs at Marshall Space Flight Center. Each of these organizations provides near real-time data at the best resolution possible to address concerns of the operational weather forecasting community (SPoRT) and to support environmental monitoring and disaster assessment (SERVIR). However, one of the biggest struggles in delivering the data to these and other Earth science community partners is formatting the product to fit into an end user's Decision Support System (DSS). The problem of delivering the data to the end user's DSS can be a significant impediment to transitioning research to operational environments, especially for disaster response, where delivery time is critical. The decision makers, in addition to the DSS, need seamless access to these same datasets from a web browser or a mobile phone when they are away from their DSS or for personnel out in the field. A framework has been developed for the MSFC Earth Science program that can be used to easily enable seamless delivery of scientific data to end users in multiple formats. The first format is an open geospatial format, Web Map Service (WMS), which is easily integrated into most DSSs. The second format is a web browser display, which can be embedded within any MSFC Science web page with just a few lines of web page coding. The third format is accessible in the form of iOS and Android native mobile applications that can be downloaded from an "app store". The framework developed has reduced the level of effort needed to bring new and existing NASA datasets to each of these end-user platforms and helps extend the reach of science data.

  6. In-field Access to Geoscientific Metadata through GPS-enabled Mobile Phones

    NASA Astrophysics Data System (ADS)

    Hobona, Gobe; Jackson, Mike; Jordan, Colm; Butchart, Ben

    2010-05-01

    Fieldwork is an integral part of much geosciences research. But whilst geoscientists have physical or online access to data collections whilst in the laboratory or at base stations, equivalent in-field access is not standard or straightforward. The increasing availability of mobile internet and GPS-supported mobile phones, however, now provides the basis for addressing this issue. The SPACER project was commissioned by the Rapid Innovation initiative of the UK Joint Information Systems Committee (JISC) to explore the potential for GPS-enabled mobile phones to access geoscientific metadata collections. Metadata collections within the geosciences and the wider geospatial domain can be disseminated through web services based on the Catalogue Service for the Web (CSW) standard of the Open Geospatial Consortium (OGC) - a global grouping of over 380 private, public and academic organisations aiming to improve interoperability between geospatial technologies. CSW offers an XML-over-HTTP interface for querying and retrieval of geospatial metadata. By default, the metadata returned by CSW is based on the ISO19115 standard and encoded in XML conformant to ISO19139. The SPACER project has created a prototype application that enables mobile phones to send queries to CSW containing user-defined keywords and coordinates acquired from GPS devices built into the phones. The prototype has been developed using the free and open source Google Android platform. The mobile application offers views for listing titles, presenting multiple metadata elements and a Google Map with an overlay of bounding coordinates of datasets. The presentation will describe the architecture and approach applied in the development of the prototype.
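
    To show what the phone-side handling of a CSW response might involve, the hedged Python sketch below extracts the title and geographic bounding box from a trimmed, hand-made ISO 19139 (gmd) fragment; it is not an actual SPACER response or the project's code.

```python
# Minimal sketch of pulling the title and bounding box out of an ISO 19139
# (gmd) metadata record of the kind a CSW returns; the snippet is hand-made.
import xml.etree.ElementTree as ET

ISO_SNIPPET = """
<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                 xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:identificationInfo><gmd:MD_DataIdentification>
    <gmd:citation><gmd:CI_Citation>
      <gmd:title><gco:CharacterString>Bedrock geology 1:50 000</gco:CharacterString></gmd:title>
    </gmd:CI_Citation></gmd:citation>
    <gmd:extent><gmd:EX_Extent><gmd:geographicElement>
      <gmd:EX_GeographicBoundingBox>
        <gmd:westBoundLongitude><gco:Decimal>-1.9</gco:Decimal></gmd:westBoundLongitude>
        <gmd:eastBoundLongitude><gco:Decimal>-1.1</gco:Decimal></gmd:eastBoundLongitude>
        <gmd:southBoundLatitude><gco:Decimal>52.6</gco:Decimal></gmd:southBoundLatitude>
        <gmd:northBoundLatitude><gco:Decimal>53.1</gco:Decimal></gmd:northBoundLatitude>
      </gmd:EX_GeographicBoundingBox>
    </gmd:geographicElement></gmd:EX_Extent></gmd:extent>
  </gmd:MD_DataIdentification></gmd:identificationInfo>
</gmd:MD_Metadata>
"""

ns = {"gmd": "http://www.isotc211.org/2005/gmd", "gco": "http://www.isotc211.org/2005/gco"}
root = ET.fromstring(ISO_SNIPPET)
title = root.findtext(".//gmd:title/gco:CharacterString", namespaces=ns)
bbox = [float(root.findtext(f".//gmd:{tag}/gco:Decimal", namespaces=ns))
        for tag in ("westBoundLongitude", "eastBoundLongitude",
                    "southBoundLatitude", "northBoundLatitude")]
print(title, bbox)   # title plus [west, east, south, north]
```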

  7. PolarHub: A Global Hub for Polar Data Discovery

    NASA Astrophysics Data System (ADS)

    Li, W.

    2014-12-01

    This paper reports the outcome of an NSF project developing PolarHub, a large-scale web crawler that automatically discovers distributed polar datasets exposed as OGC web services (OWS) in cyberspace. PolarHub is a machine robot; its goal is to visit as many webpages as possible to find those containing information about polar OWS, extract this information, and store it in the backend data repository. This is a very challenging task given the huge volume of webpages on the Web. Four unique features were introduced in PolarHub to make it distinctive from earlier crawler solutions: (1) multi-task, multi-user, multi-thread support for crawling tasks; (2) extensive use of the thread pool and Data Access Object (DAO) design patterns to separate persistent data storage from business logic and achieve high extensibility of the crawler tool; (3) a pattern-matching-based, customizable crawling algorithm to support discovery of multiple types of geospatial web services; and (4) a universal and portable client-server communication mechanism combining server-push and client-pull strategies for enhanced asynchronous processing. A series of experiments was conducted to identify the impact of crawling parameters on overall system performance. The geographical distribution pattern of all PolarHub-identified services is also demonstrated. We expect this work to make a major contribution to the field of geospatial information retrieval and geospatial interoperability, to bridge the gap between data providers and data consumers, and to accelerate polar science by enhancing the accessibility and reusability of adequate polar data.
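
    A condensed Python sketch of the pattern-matching discovery step is given below: candidate OGC endpoints are extracted from fetched page text with a regular expression and probed with GetCapabilities requests. The pattern, page text, and URLs are illustrative, not PolarHub's implementation.

```python
# Condensed sketch of a pattern-matching discovery step: scan fetched page
# text for likely OGC web-service endpoints and probe them with a
# GetCapabilities request. Pattern and page text are illustrative only.
import re
import requests

OWS_PATTERN = re.compile(
    r"""https?://[^\s"'<>]+?(?:wms|wfs|wcs|sos|csw)[^\s"'<>]*""", re.IGNORECASE)

def probe(url, service):
    """Return True if the URL answers a GetCapabilities request for the service."""
    try:
        r = requests.get(url, params={"service": service, "request": "GetCapabilities"},
                         timeout=15)
        return r.ok and b"Capabilities" in r.content
    except requests.RequestException:
        return False

page_text = '... <a href="https://example.org/geoserver/wms?service=WMS">sea-ice WMS</a> ...'
for candidate in set(OWS_PATTERN.findall(page_text)):
    base = candidate.split("?", 1)[0]            # strip query string before probing
    for service in ("WMS", "WFS", "CSW"):
        if probe(base, service):
            print(f"discovered {service} endpoint: {base}")
```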

  8. Challenges of Remote Sensing and Spatial Information Education and Technology Transfer in a Fast Developing Industry

    NASA Astrophysics Data System (ADS)

    Tsai, F.; Chen, L.-C.

    2014-04-01

    During the past decade, Taiwan has experienced unusually fast growth in the mapping, remote sensing, and spatial information industry and its related markets. A successful space program and dozens of advanced airborne and ground-based remote sensing instruments, as well as mobile mapping systems, have been implemented and put into operation to support the vast demand for geospatial data acquisition. Moreover, in addition to government agencies and research institutes, there are tens of companies in the private sector providing geo-spatial data and services. However, the fast-developing industry also poses a great challenge to the education sector in Taiwan, especially higher education for geo-spatial information. Facing this fast-developing industry, the demand for skilled professionals and new technologies to address diversified needs is indubitably high. Consequently, while the remote sensing and spatial information disciplines in Taiwan's higher education institutes benefit from the expansion and prosperity of the fast-growing industry, how to fulfill these demands has become a challenge for them. This paper provides a brief insight into the status of the remote sensing and spatial information industry in Taiwan as well as the challenges of education and technology transfer to support the increasing demands and to ensure the continuous development of the industry. In addition to a report on the current status of remote sensing and spatial information related courses and programs in colleges and universities, current and potential threats and possible resolutions are also discussed from different points of view.

  9. Geospatial Data Science Analysis | Geospatial Data Science | NREL

    Science.gov Websites

    different levels of technology maturity. Photo of a man taking field measurements. Geospatial analysis energy for different technologies across the nation? Featured Analysis Products Renewable Energy

  10. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools, exploit the full capabilities of the third dimension, and support visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has started to construct a 3D geospatial data and service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people interested in this industry; we introduce not only the present status of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, data for six metropolitan cities and Dokdo (an island belonging to Korea) were constructed in 2012 at level of detail (LOD) 4, i.e., photo-realistic textured 3D models including corresponding ortho photographs. In this paper, we describe the composition and infrastructure of the web-based 3D map service system and compare the V-World service with Google Earth. We also present Open-API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent invasions of privacy, we apply image blurring, elimination, and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea; thus, progress in Korea's spatial information industry is expected in the near future.
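
    The image-blurring step mentioned for privacy protection could look like the minimal Pillow-based sketch below; the file names and region coordinates are placeholders, and this is not the project's actual processing chain.

```python
# Minimal sketch of a privacy-protection step: blur a sensitive region of a
# facade texture before it is published. File names and coordinates are
# placeholders, not part of the described system.
from PIL import Image, ImageFilter

def blur_region(path_in, path_out, box, radius=12):
    """Blur the pixel rectangle `box` = (left, upper, right, lower) in an image."""
    img = Image.open(path_in)
    region = img.crop(box).filter(ImageFilter.GaussianBlur(radius))
    img.paste(region, box)
    img.save(path_out)

# e.g. obscure a window area of a building texture before LOD4 publication
blur_region("facade_texture.png", "facade_texture_blurred.png", box=(120, 80, 260, 200))
```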

  11. a Public Platform for Geospatial Data Sharing for Disaster Risk Management

    NASA Astrophysics Data System (ADS)

    Balbo, S.; Boccardo, P.; Dalmasso, S.; Pasquali, P.

    2013-01-01

    Several studies have been conducted in Africa to assist local governments in addressing the risks related to natural hazards. Geospatial data containing information on vulnerability, impacts, climate change, and disaster risk reduction is usually part of the output of such studies and is valuable to national and international organizations seeking to reduce risks and mitigate the impacts of disasters. Nevertheless, these data are not widely and efficiently distributed and often reside in remote storage solutions that are hardly reachable. Spatial Data Infrastructures are technical solutions capable of solving this issue by storing geospatial data and making them widely available through the internet. Among these solutions, GeoNode, an open source online platform for geospatial data sharing, has been developed in recent years. GeoNode is a platform for the management and publication of geospatial data. It brings together mature and stable open-source software projects under a consistent and easy-to-use interface, allowing users, with little training, to quickly and easily share data and create interactive maps. GeoNode data management tools allow for integrated creation of data, metadata, and map visualizations. Each dataset in the system can be shared publicly or restricted to specific users. Social features like user profiles and commenting and rating systems allow for the development of communities around each platform to facilitate the use, management, and quality control of the data the GeoNode instance contains (http://geonode.org/). This paper presents a case study of setting up a web platform based on GeoNode. It is a public platform called MASDAP, promoted by the Government of Malawi in order to support development of the country and build resilience against natural disasters. A substantial amount of geospatial data has already been collected about hydrogeological risk, as well as several other disaster-related datasets. Moreover, this platform will help ensure that the data created by a number of past or ongoing projects is maintained and that this information remains accessible and useful. An Integrated Flood Risk Management Plan for a river basin has already been included in the platform, and data from future disaster risk management projects will be added as well.
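
    A record like this describes a platform that publishes layers through standard OGC web services (GeoNode is typically backed by GeoServer and its WMS/WFS endpoints). As a minimal, hedged sketch of how a client might pull a rendered layer from such a platform, the snippet below issues a plain WMS 1.1.1 GetMap request with the requests library; the endpoint URL and layer name are placeholders, not actual MASDAP resources.

```python
"""Minimal sketch: fetch a rendered map image from an OGC WMS endpoint.

Assumptions (hypothetical, for illustration only):
  - WMS_URL points at a GeoNode/GeoServer-style WMS endpoint.
  - LAYER is the workspace-qualified name of a published layer.
"""
import requests

WMS_URL = "https://example-geonode.org/geoserver/wms"  # placeholder endpoint
LAYER = "geonode:flood_risk_zones"                     # placeholder layer name

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": LAYER,
    "styles": "",
    "srs": "EPSG:4326",
    "bbox": "32.0,-17.5,36.5,-9.0",  # rough Malawi extent (lon/lat order in 1.1.1)
    "width": 512,
    "height": 512,
    "format": "image/png",
    "transparent": "true",
}

response = requests.get(WMS_URL, params=params, timeout=60)
response.raise_for_status()

with open("flood_risk_zones.png", "wb") as fh:
    fh.write(response.content)
print("Saved", len(response.content), "bytes")
```

    The same GetMap pattern works against any WMS-compliant server, which is what makes SDI-style platforms interoperable regardless of the software behind them.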

  12. Raster Data Partitioning for Supporting Distributed GIS Processing

    NASA Astrophysics Data System (ADS)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data, while other studies consider geospatial data to have always been big data (Lee and Kang, 2015). In any case, data acquisition methods have improved substantially, not only in the amount of raw data but also in its spectral, spatial, and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). From the increasing volume of raw data, produced in different formats and representations and for different purposes, only the information derived from these data sets represents valuable results. However, computing capability and processing speed face limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). In recent years, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing, even more so, requires appropriate processing algorithms that can be distributed and can handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capability to process non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms to the Map-Reduce programming model, and GIS data cannot be partitioned like text-based data, by line or by bytes. Hence, we would like to find an alternative solution for data partitioning, data distribution, and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as GIS (raster) data partitioning, distribution, and distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution, and processing. The first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area; therefore, data partitioning may be considered a preprocessing step before applying processing services on the data. As a proof of concept, we have implemented a simple tile-based partitioning method, splitting an image into smaller grids (NxM tiles) and comparing the processing time against existing methods using an NDVI calculation (see the sketch below). The concept is demonstrated using our own open source processing framework.
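
    The proof of concept described above splits an image into an N x M grid of tiles and runs an NDVI calculation per tile. The snippet below is a minimal, generic sketch of that idea in NumPy (it is not the authors' framework and uses synthetic bands in place of real imagery); each tile is an independent unit of work that could be shipped to a separate worker in a distributed run.

```python
"""Illustrative tile-based partitioning and per-tile NDVI with NumPy (synthetic data)."""
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - RED) / (NIR + RED), guarding against division by zero."""
    red = red.astype(np.float32)
    nir = nir.astype(np.float32)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Synthetic "red" and "NIR" bands standing in for a satellite scene.
rng = np.random.default_rng(42)
red_band = rng.integers(1, 255, size=(1000, 1000), dtype=np.uint16)
nir_band = rng.integers(1, 255, size=(1000, 1000), dtype=np.uint16)

n_rows, n_cols = 4, 4  # N x M tile grid
h, w = red_band.shape
row_edges = np.linspace(0, h, n_rows + 1, dtype=int)
col_edges = np.linspace(0, w, n_cols + 1, dtype=int)

result = np.empty((h, w), dtype=np.float32)
for i in range(n_rows):
    for j in range(n_cols):
        rs, re = row_edges[i], row_edges[i + 1]
        cs, ce = col_edges[j], col_edges[j + 1]
        # Each tile is an independent task; in a distributed run it could be
        # dispatched to a separate worker and written back into the mosaic.
        result[rs:re, cs:ce] = ndvi(red_band[rs:re, cs:ce], nir_band[rs:re, cs:ce])

print("NDVI range:", float(result.min()), float(result.max()))
```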

  13. The geospatial data quality REST API for primary biodiversity data

    PubMed Central

    Otegui, Javier; Guralnick, Robert P.

    2016-01-01

    Summary: We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. Availability and implementation: The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26833340

  14. The geospatial data quality REST API for primary biodiversity data.

    PubMed

    Otegui, Javier; Guralnick, Robert P

    2016-06-01

    We present a REST web service to assess the geospatial quality of primary biodiversity data. It enables access to basic and advanced functions to detect completeness and consistency issues as well as general errors in the provided record or set of records. The API uses JSON for data interchange and efficient parallelization techniques for fast assessments of large datasets. The Geospatial Data Quality API is part of the VertNet set of APIs. It can be accessed at http://api-geospatial.vertnet-portal.appspot.com/geospatial and is already implemented in the VertNet data portal for quality reporting. Source code is freely available under GPL license from http://www.github.com/vertnet/api-geospatial. Contact: javier.otegui@gmail.com or rguralnick@flmnh.ufl.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
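
    Both records above describe a JSON-over-HTTP quality API. The sketch below shows how a client might query such a service with the requests library; the base URL comes from the record, but the query parameter names (Darwin Core style terms) are assumptions for illustration only and should be checked against the project repository before use.

```python
"""Hypothetical client sketch for a JSON geospatial-quality REST API.
The base URL is taken from the record; the parameter names below are
assumed Darwin Core style terms, NOT confirmed from the API docs."""
import requests

BASE_URL = "http://api-geospatial.vertnet-portal.appspot.com/geospatial"

# Assumed parameter names for a single-record check (illustrative only).
record = {
    "decimallatitude": 42.65,
    "decimallongitude": -71.31,
    "countrycode": "US",
    "scientificname": "Puma concolor",
}

resp = requests.get(BASE_URL, params=record, timeout=30)
resp.raise_for_status()
report = resp.json()  # the service is described as exchanging JSON

# Inspect whatever quality flags the service returns; key names will differ in practice.
for key, value in report.items():
    print(f"{key}: {value}")
```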

  15. Suitable Site Selection of Small Dams Using Geo-Spatial Technique: a Case Study of Dadu Tehsil, Sindh

    NASA Astrophysics Data System (ADS)

    Khalil, Zahid

    2016-07-01

    Decision making about identifying suitable sites for any project, considering multiple parameters, is difficult. Using GIS and Multi-Criteria Analysis (MCA) can make such projects easier; these technologies have proved efficient and adequate for acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. GIS software was used to create all the spatial parameters for the analysis. The derived parameters are slope, drainage density, rainfall, land use / land cover, soil groups, Curve Number (CN), and runoff index, with a spatial resolution of 30 m. The data used for deriving these layers include the 30-meter resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP), and soil data from the World Harmonized Soil Data (WHSD). The land use/land cover map is derived from Landsat 8 using supervised classification. Slope, the drainage network, and watersheds are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate surface runoff from rainfall; prior to this, the SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, together with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method, also known as the Analytical Hierarchy Process (AHP), is used as the MCA technique for assigning weights to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in a GIS environment to produce suitable sites for the dams. The resultant layer is then classified into four classes: best suitable, suitable, moderate, and less suitable. This study contributes to decision making about suitable site analysis for small dams using geospatial data with a minimal amount of ground data. These suitability maps can be helpful for water resource management organizations in determining feasible rainwater harvesting (RWH) structures.
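
    Two of the calculations named above are easy to illustrate generically. The sketch below (synthetic judgments and data, not the study's) derives AHP weights from a pairwise comparison matrix via the principal eigenvector, checks Saaty's consistency ratio, and computes SCS-CN direct runoff with the standard relation Q = (P - 0.2S)^2 / (P + 0.8S), where S = 1000/CN - 10 and depths are in inches.

```python
"""Illustrative AHP weighting and SCS-CN runoff sketch (synthetic judgments/data)."""
import numpy as np

# --- AHP: weights from a pairwise comparison matrix (Saaty 1-9 scale) ---
criteria = ["slope", "drainage_density", "rainfall", "runoff_index"]
# Hypothetical judgments: A[i, j] = importance of criterion i relative to j.
A = np.array([
    [1.0, 1/2, 1/3, 1/4],
    [2.0, 1.0, 1/2, 1/3],
    [3.0, 2.0, 1.0, 1/2],
    [4.0, 3.0, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # principal eigenvector, normalized

lam_max = eigvals.real[k]
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                  # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's commonly cited random index
cr = ci / ri                                  # consistency ratio; < 0.10 is acceptable

for name, wgt in zip(criteria, weights):
    print(f"{name:17s} weight = {wgt:.3f}")
print(f"consistency ratio = {cr:.3f}")

# --- SCS-CN runoff: Q = (P - 0.2*S)^2 / (P + 0.8*S), S = 1000/CN - 10 (inches) ---
def scs_runoff(p_in: float, cn: float) -> float:
    """Direct runoff depth (inches) for rainfall p_in and curve number cn."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s                              # initial abstraction
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in - ia + s)

print("Runoff for 3 in of rain, CN=85:", round(scs_runoff(3.0, 85.0), 2), "in")
```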

  16. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general purpose web services based upon the REST service model, and is capable of data discovery, access, and publication functions, metadata delivery functions, data transformation, and auto-generated OGC services for those data products that can support those services. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g. title, abstract, publication date, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance, or a trace of the data sources and analytical methods used in a scientific analysis, for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery-format-independent data structure, and the delivery of PML, ISO, and FGDC documents to clients requesting those products.
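
    The core pattern here is rendering one internal metadata record into several standard serializations. The snippet below is a deliberately simplified, non-schema-valid sketch of that pattern (not Gstore code): a single Python dictionary is serialized both as a bare-bones ISO 19139-style document and as a minimal FGDC CSDGM-style document, reduced to just a title and abstract.

```python
"""Minimal sketch: render one metadata record into ISO 19139-like and FGDC-like XML.
Deliberately simplified and NOT schema-valid; for illustration of the pattern only."""
import xml.etree.ElementTree as ET

record = {
    "title": "Example geospatial dataset",
    "abstract": "Synthetic record used to illustrate multi-format metadata output.",
}

GMD = "http://www.isotc211.org/2005/gmd"
GCO = "http://www.isotc211.org/2005/gco"
ET.register_namespace("gmd", GMD)
ET.register_namespace("gco", GCO)

def iso_19139(rec):
    """Skeletal ISO 19139-style serialization (title and abstract only)."""
    root = ET.Element(f"{{{GMD}}}MD_Metadata")
    ident = ET.SubElement(
        ET.SubElement(root, f"{{{GMD}}}identificationInfo"),
        f"{{{GMD}}}MD_DataIdentification")
    cit = ET.SubElement(ET.SubElement(ident, f"{{{GMD}}}citation"),
                        f"{{{GMD}}}CI_Citation")
    ET.SubElement(ET.SubElement(cit, f"{{{GMD}}}title"),
                  f"{{{GCO}}}CharacterString").text = rec["title"]
    ET.SubElement(ET.SubElement(ident, f"{{{GMD}}}abstract"),
                  f"{{{GCO}}}CharacterString").text = rec["abstract"]
    return ET.tostring(root, encoding="utf-8")

def fgdc_csdgm(rec):
    """Skeletal FGDC CSDGM-style serialization (title and abstract only)."""
    root = ET.Element("metadata")
    idinfo = ET.SubElement(root, "idinfo")
    citeinfo = ET.SubElement(ET.SubElement(idinfo, "citation"), "citeinfo")
    ET.SubElement(citeinfo, "title").text = rec["title"]
    ET.SubElement(ET.SubElement(idinfo, "descript"), "abstract").text = rec["abstract"]
    return ET.tostring(root, encoding="utf-8")

print(iso_19139(record).decode())
print(fgdc_csdgm(record).decode())
```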

  17. Remote distinction of a noxious weed (musk thistle: Carduus nutans) using airborne hyperspectral imagery and the support vector machine classifier

    USDA-ARS?s Scientific Manuscript database

    Remote detection of invasive plant species using geospatial imagery may significantly improve monitoring, planning, and management practices by eliminating shortfalls such as observer bias and accessibility involved in ground-based surveys. The use of remote sensing for accurate mapping invasion ex...
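
    The record points to support vector machine classification of hyperspectral pixels. The sketch below is a generic illustration of that workflow with scikit-learn on synthetic "spectral" data (not the study's imagery or band set): pixels are split into training and test sets and classified as target weed versus background.

```python
"""Generic SVM pixel-classification sketch on synthetic spectral data."""
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels, n_bands = 2000, 20           # stand-in for a hyperspectral band subset

# Two synthetic classes (0 = background vegetation, 1 = target weed) whose
# mean spectra differ slightly, mimicking separable spectral signatures.
background = rng.normal(0.30, 0.05, size=(n_pixels // 2, n_bands))
weed = rng.normal(0.36, 0.05, size=(n_pixels // 2, n_bands))
X = np.vstack([background, weed])
y = np.concatenate([np.zeros(n_pixels // 2, dtype=int),
                    np.ones(n_pixels // 2, dtype=int)])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(scaler.transform(X_train), y_train)

pred = clf.predict(scaler.transform(X_test))
print("Overall accuracy:", round(accuracy_score(y_test, pred), 3))
```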

  18. Supporting Latinx/a/o Community College Leaders: A Geo-Spatial Approach

    ERIC Educational Resources Information Center

    Hernández, Ignacio, Jr.

    2017-01-01

    Community colleges play a significant role in guiding pathways to postsecondary degrees for Latinx/a/o students. To gain a greater understanding of the ways Latinx/a/o students utilize community colleges as pathways to degrees, this article focused on institutional leaders, members of one community college professional association. An original survey…

  19. LANDFIRE

    USGS Publications Warehouse

    Zahn, Stephen G.

    2015-07-13

    LANDFIRE data products are primarily designed and developed to be used at the landscape level to facilitate national and regional strategic planning and reporting of wildland fire and other natural resource management activities. However, LANDFIRE’s spatially comprehensive dataset can also be adapted to support a variety of local management applications that need current and comprehensive geospatial data.

  20. USGS Emergency Response Resources

    USGS Publications Warehouse

    Bewley, Robert D.

    2011-01-01

    Every day, emergency responders are confronted with worldwide natural and manmade disasters, including earthquakes, floods, hurricanes, landslides, tsunami, volcanoes, wildfires, terrorist attacks, and accidental oil spills. The U.S. Geological Survey (USGS) is ready to coordinate the provisioning and deployment of USGS staff, equipment, geospatial data, products, and services in support of national emergency response requirements.

  1. 78 FR 46362 - Information Collection Sent to the Office of Management and Budget (OMB) for Approval; National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-31

    ..., nongovernmental organizations; and academic institutions to advance the development of The National Map and other national geospatial databases. This effort will support our need to supplement ongoing data collection.... Description of Respondents: State, local, and tribal governments; private and non-profit firms; and academic...

  2. Creating Hybrid Learning Experiences in Robotics: Implications for Supporting Teaching and Learning

    ERIC Educational Resources Information Center

    Frerichs, Saundra Wever; Barker, Bradley; Morgan, Kathy; Patent-Nygren, Megan; Rezac, Micaela

    2012-01-01

    Geospatial and Robotics Technologies for the 21st Century (GEAR-Tech-21), teaches science, technology, engineering and mathematics (STEM) through robotics, global positioning systems (GPS), and geographic information systems (GIS) activities for youth in grades 5-8. Participants use a robotics kit, handheld GPS devices, and GIS technology to…

  3. 71 FR 66315 - Notice of Availability of Invention for Licensing; Government-Owned Invention

    Federal Register 2010, 2011, 2012, 2013, 2014

    2006-11-14

    ... Coating and Method of Formulator.//Navy Case No. 97,486: Processing Semantic Markups in Web Ontology... Rotating Clip.//Navy Case No. 97,886: Adding Semantic Support to Existing UDDI Infrastructure.//Navy Case..., Binding, and Integration of Non-Registered Geospatial Web Services.//Navy Case No. 98,094: Novel, Single...

  4. The California Seafloor and Coastal Mapping Program – Providing science and geospatial data for California's State Waters

    USGS Publications Warehouse

    Johnson, Samuel Y.; Cochrane, Guy R.; Golden, Nadine; Dartnell, Peter; Hartwell, Stephen; Cochran, Susan; Watt, Janet

    2017-01-01

    The California Seafloor and Coastal Mapping Program (CSCMP) is a collaborative effort to develop comprehensive bathymetric, geologic, and habitat maps and data for California's State Waters. CSCMP began in 2007 when the California Ocean Protection Council (OPC) and the National Oceanic and Atmospheric Administration (NOAA) allocated funding for high-resolution bathymetric mapping, largely to support the California Marine Life Protection Act and to update nautical charts. Collaboration and support from the U.S. Geological Survey and other partners has led to development and dissemination of one of the world's largest seafloor-mapping datasets. CSCMP provides essential science and data for ocean and coastal management, stimulates and enables research, and raises public education and awareness of coastal and ocean issues. Specific applications include: delineation and designation of marine protected areas; characterization and modeling of benthic habitats and ecosystems; updating nautical charts; earthquake hazard assessments; tsunami hazard assessments; planning offshore infrastructure; providing baselines for monitoring change; input to models of sediment transport, coastal erosion, and coastal flooding; regional sediment management; understanding coastal aquifers; and providing geospatial data for emergency response.

  5. 3D Geospatial Models for Visualization and Analysis of Groundwater Contamination at a Nuclear Materials Processing Facility

    NASA Astrophysics Data System (ADS)

    Stirewalt, G. L.; Shepherd, J. C.

    2003-12-01

    Analysis of hydrostratigraphy and of uranium and nitrate contamination in groundwater at a former nuclear materials processing facility in Oklahoma was undertaken employing 3-dimensional (3D) geospatial modeling software. The models constructed played an important role in the regulatory decision process of the U.S. Nuclear Regulatory Commission (NRC) because they enabled visualization of temporal variations in contaminant concentrations and plume geometry. Three aquifer systems occur at the site, comprised of water-bearing fractured shales separated by indurated sandstone aquitards. The uppermost terrace groundwater system (TGWS) aquifer is composed of terrace and alluvial deposits and a basal shale. The shallow groundwater system (SGWS) aquifer is made up of three shale units and two sandstones. It is separated from the overlying TGWS and underlying deep groundwater system (DGWS) aquifer by sandstone aquitards. Spills of nitric acid solutions containing uranium and radioactive decay products around the main processing building (MPB), leakage from storage ponds west of the MPB, and leaching of radioactive materials from discarded equipment and waste containers contaminated both the TGWS and SGWS aquifers during facility operation between 1970 and 1993. Constructing 3D geospatial property models for analysis of groundwater contamination at the site involved use of EarthVision (EV), a 3D geospatial modeling software package developed by Dynamic Graphics, Inc. of Alameda, CA. A viable 3D geohydrologic framework model was initially constructed so property data could be spatially located relative to subsurface geohydrologic units. The framework model contained three hydrostratigraphic zones, equivalent to the TGWS, SGWS, and DGWS aquifers in which groundwater samples were collected, separated by two sandstone aquitards. Groundwater data collected in the three aquifer systems since 1991 indicated high concentrations of uranium (>10,000 micrograms/liter) and nitrate (>500 milligrams/liter) around the MPB and elevated nitrate (>2,000 milligrams/liter) around storage ponds. Vertical connectivity was suggested between the TGWS and SGWS, while the DGWS appeared relatively isolated from the overlying aquifers. Lateral movement of uranium was also suggested over time. For example, lateral migration in the TGWS is suggested along a shallow depression in the bedrock surface trending south-southwest from the southwest corner of the MPB. Another pathway atop the buried bedrock surface, trending west-northwest from the MPB and partially reflected by current surface topography, suggested lateral migration of nitrate in the SGWS. Lateral movement of nitrate in the SGWS was also indicated north, south, and west of the largest storage pond. Definition of contaminant plume movement over time is particularly important for assessing the direction and rate of migration and the potential need for preventive measures to control contamination of groundwater outside facility property lines. The 3D geospatial property models proved invaluable for visualizing and analyzing variations in subsurface uranium and nitrate contamination in space and time within and between the three aquifers at the site. The models were an exceptional visualization tool for illustrating the extent, volume, and quantitative amounts of uranium and nitrate contamination in the subsurface to regulatory decision-makers in regard to site decommissioning issues, including remediation concerns, providing a perspective not possible to achieve with traditional 2D maps. The geohydrologic framework model also provides a conceptual model for consideration in flow and transport analyses.

  6. Geospatial data infrastructure: The development of metadata for geo-information in China

    NASA Astrophysics Data System (ADS)

    Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong

    2014-03-01

    Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is restrained by human thought and by the technology for handling information. Conventional methods strive, with limited success, to maintain geoscience records in a form that is readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data have been collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical, remote sensing and important local geological survey and research reports. Numerous geospatial databases have been formed and stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcINFO, Metafile, Raster, SQL Server, Access and JPEG. But there is no effective way to guarantee that the quality of the information is adequate, in theory and practice, for decision making. The need of the Geographic Information System (GIS) communities for fast, reliable, accurate and up-to-date information is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects have been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) integration, update and maintenance of geoinformation databases; (2) standards research on clusterization and industrialization of information services; (3) platform construction for geological data sharing; (4) construction of key borehole databases; and (5) product development for information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster service, convergence, database, product, policy, technology, standard and infrastructure systems. The development of geoinformation stores and services puts forward a need for a Geospatial Data Infrastructure (GDI) in China. In this paper, some of the ideas envisaged for the development of metadata in China are discussed.

  7. Experiences with Acquiring Highly Redundant Spatial Data to Support Driverless Vehicle Technologies

    NASA Astrophysics Data System (ADS)

    Koppanyi, Z.; Toth, C. K.

    2018-05-01

    As vehicle technology moves towards higher autonomy, the demand for highly accurate geospatial data is rapidly increasing, as accurate maps have huge potential for increasing safety. In particular, high definition 3D maps, including road topography and infrastructure, as well as city models along transportation corridors, represent the necessary support for driverless vehicles. In this effort, a vehicle equipped with high-, medium- and low-resolution active and passive cameras acquired data in a typical traffic environment, represented here by the OSU campus, where GPS/GNSS data are available along with other navigation sensor data streams. The data streams can be used for two purposes. First, high-definition 3D maps can be created by integrating all the sensory data, and Data Analytics/Big Data methods can be tested for automatic object space reconstruction. Second, the data streams can support algorithmic research for driverless vehicle technologies, including object avoidance, navigation/positioning, detecting pedestrians and bicyclists, etc. Crucial cross-performance analyses of map database resolution and accuracy with respect to sensor performance metrics can be derived to achieve an economical solution for accurate driverless vehicle positioning. These, in turn, could provide essential information for optimizing the choice of geospatial map databases and sensor quality to support driverless vehicle technologies. The paper reviews the data acquisition and primary data processing challenges and presents performance results.

  8. Enhancing The National Map Through Tactical Planning and Performance Monitoring

    USGS Publications Warehouse

    ,

    2008-01-01

    Tactical planning and performance monitoring are initial steps toward improving 'the way The National Map works' and supporting the U.S. Geological Survey (USGS) Science Strategy. This Tactical Performance Planning Summary for The National Map combines information from The National Map 2.0 Tactical Plan and The National Map Performance Milestone Matrix. The National Map 2.0 Tactical Plan is primarily a working document to guide The National Map program's execution, production, and metrics monitoring for fiscal years (FY) 2008 and 2009. The Tactical Plan addresses data, products, and services, as well as supporting and enabling activities. The National Map's 2-year goal for FY 2008 and FY 2009 is to provide a range of geospatial products and services that further the National Spatial Data Infrastructure and underpin USGS science. To do this, the National Geospatial Program will develop a renewed understanding during FY 2008 of key customer needs and requirements, develop the infrastructure to support The National Map business model, modernize its business processes, and reengineer its workforce. Priorities for The National Map will be adjusted if necessary to respond to changes to the project that may impact resources, constrain timeframes, or change customer needs. The supporting and enabling activities that make it possible to produce the products and services of The National Map will include partnership activities, improved compatibility of systems, outreach, and integration of data themes.

  9. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as VM images through Amazon's AWS Marketplace and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  10. Assessing Embedded Geospatial Student Learning Outcomes

    ERIC Educational Resources Information Center

    Carr, John David

    2012-01-01

    Geospatial tools and technologies have become core competencies for natural resource professionals due to the monitoring, modeling, and mapping capabilities they provide. To prepare students with needed background, geospatial instructional activities were integrated across Forest Management; Natural Resources; Fisheries, Wildlife, &…

  11. UASs for geospatial data

    USDA-ARS?s Scientific Manuscript database

    Increasingly, consumer organizations, businesses, and academic researchers are using UAS to gather geospatial, environmental data on natural and man-made phenomena. These data may be either remotely sensed or measured directly (e. g., sampling of atmospheric constituents). The term geospatial data r...

  12. Distribution of sea anemones (Cnidaria, Actiniaria) in Korea analyzed by environmental clustering

    USGS Publications Warehouse

    Cha, H.-R.; Buddemeier, R.W.; Fautin, D.G.; Sandhei, P.

    2004-01-01

    Using environmental data and the geospatial clustering tools LOICZView and DISCO, we empirically tested the postulated existence and boundaries of four biogeographic regions in the southern part of the Korean peninsula. Environmental variables used included wind speed, sea surface temperature (SST), salinity, tidal amplitude, and the chlorophyll spectral signal. Our analysis confirmed the existence of four biogeographic regions, but the details of the borders between them differ from those previously postulated. Specimen-level distribution records of intertidal sea anemones were mapped; their distribution relative to the environmental data supported the importance of the environmental parameters we selected in defining suitable habitats. From the geographic coincidence between anemone distribution and the clusters based on environmental variables, we infer that geospatial clustering has the power to delimit ranges for marine organisms within relatively small geographical areas.
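
    LOICZView and DISCO are specialized geospatial clustering tools; as a generic stand-in for the idea of clustering coastal cells by environmental variables, the sketch below applies k-means (scikit-learn) to a small synthetic table of the same kinds of variables named in the record (SST, salinity, wind speed, tidal amplitude, chlorophyll). In practice the resulting cluster labels would be joined back to each cell's coordinates to delineate candidate biogeographic regions.

```python
"""Generic clustering sketch on synthetic coastal environmental variables."""
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_cells = 200  # synthetic coastal grid cells

# Columns: SST (deg C), salinity (PSU), wind speed (m/s), tidal amplitude (m),
# chlorophyll (mg/m^3). Values are synthetic, in loosely plausible ranges.
env = np.column_stack([
    rng.normal(16, 3, n_cells),
    rng.normal(33, 1, n_cells),
    rng.normal(6, 2, n_cells),
    rng.normal(2.5, 1.0, n_cells),
    rng.lognormal(0.0, 0.5, n_cells),
])

X = StandardScaler().fit_transform(env)          # put variables on a common scale
km = KMeans(n_clusters=4, n_init=10, random_state=7).fit(X)

# Each cell now carries a cluster label; joining labels back to cell coordinates
# (not shown) yields candidate environmentally defined regions on the map.
labels, counts = np.unique(km.labels_, return_counts=True)
for lab, cnt in zip(labels, counts):
    print(f"cluster {lab}: {cnt} cells")
```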

  13. Quality Assessment and Accessibility Applications of Crowdsourced Geospatial Data: A Report on the Development and Extension of the George Mason University Geocrowdsourcing Testbed

    DTIC Science & Technology

    2014-09-01

    Approved for public release; distribution is unlimited. Prepared for the Geospatial Research Laboratory, U.S. Army Engineer Research and Development Center (ERDC), U.S. Army Corps of Engineers, under Data Level Enterprise Tools. Monitored by the Geospatial Research Laboratory, 7701 Telegraph Road, Alexandria, VA.

  14. An Institutional Community-Driven effort to Curate and Preserve Geospatial Data using GeoBlacklight

    NASA Astrophysics Data System (ADS)

    Petters, J.; Coleman, S.; Andrea, O.

    2016-12-01

    A variety of geospatial data is produced or collected by both academic researchers and non-academic groups in the Virginia Tech community. In an effort to preserve, curate and make this geospatial data discoverable, the University Libraries have been building a local implementation of GeoBlacklight, a multi-institutional open-source collaborative project to improve the discoverability and sharing of geospatial data. We will discuss the local implementation of GeoBlacklight at Virginia Tech, focusing on the efforts necessary to make it a sustainable resource for the institution and local community going forward. This includes technical challenges such as the development of uniform workflows for geospatial data produced within and outside the course of research, but organizational and economic barriers must be overcome as well. In spearheading this GeoBlacklight effort the Libraries have partnered with University Facilities and University IT. The IT group manages the storage and backup of geospatial data, allowing our group to focus on geospatial data collection and curation. Both IT and University Facilities are in possession of localized geospatial data of interest to Virginia Tech researchers that all parties agreed should be made discoverable and accessible. The interest and involvement of these and other university stakeholders is key to establishing the sustainability of the infrastructure and the capabilities it can provide to the Virginia Tech community and beyond.

  15. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    BlueSky systems—a set of decision support tools including SmartFire and the BlueSky Framework—aid public policy decision makers and scientific researchers in evaluating the air quality impacts of fires. Smoke and fire managers use BlueSky systems in decisions about prescribed burns and wildland firefighting. Air quality agencies use BlueSky systems to support decisions related to air quality regulations. We will discuss a range of recent improvements to the BlueSky systems, as well as examples of applications and future plans. BlueSky systems have the flexibility to accept basic fire information from virtually any source and can reconcile multiple information sources so that duplication of fire records is eliminated. BlueSky systems currently apply information from (1) the National Oceanic and Atmospheric Administration's (NOAA) Hazard Mapping System (HMS), which represents remotely sensed data from the Moderate Resolution Imaging Spectroradiometer (MODIS), Advanced Very High Resolution Radiometer (AVHRR), and Geostationary Operational Environmental Satellites (GOES); (2) the Monitoring Trends in Burn Severity (MTBS) interagency project, which derives fire perimeters from Landsat 30-meter burn scars; (3) the Geospatial Multi-Agency Coordination Group (GeoMAC), which produces helicopter-flown burn perimeters; and (4) ground-based fire reports, such as the ICS-209 reports managed by the National Wildfire Coordinating Group. Efforts are currently underway to streamline the use of additional ground-based systems, such as states' prescribed burn databases. BlueSky systems were recently modified to address known uncertainties in smoke modeling associated with (1) estimates of biomass consumption derived from sparse fuel moisture data, and (2) models of plume injection heights. Additional sources of remotely sensed data are being applied to address these issues as follows: - The National Aeronautics and Space Administration's (NASA) Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates. - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates. - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights. Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G)], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  16. A cross-sectional ecological analysis of international and sub-national health inequalities in commercial geospatial resource availability.

    PubMed

    Dotse-Gborgbortsi, Winfred; Wardrop, Nicola; Adewole, Ademola; Thomas, Mair L H; Wright, Jim

    2018-05-23

    Commercial geospatial data resources are frequently used to understand healthcare utilisation. Although there is widespread evidence of a digital divide for other digital resources and infrastructure, it is unclear how commercial geospatial data resources are distributed relative to health need. To examine the distribution of commercial geospatial data resources relative to health needs, we assembled coverage and quality metrics for commercial geocoding, neighbourhood characterisation, and travel time calculation resources for 183 countries. We developed a country-level, composite index of commercial geospatial data quality/availability and examined its distribution relative to age-standardised all-cause and cause-specific (for three main causes of death) mortality using two inequality metrics, the slope index of inequality and relative concentration index. In two sub-national case studies, we also examined geocoding success rates versus area deprivation by district in Eastern Region, Ghana and Lagos State, Nigeria. Internationally, commercial geospatial data resources were inversely related to all-cause mortality. This relationship was more pronounced when examining mortality due to communicable diseases. Commercial geospatial data resources for calculating patient travel times were more equitably distributed relative to health need than resources for characterising neighbourhoods or geocoding patient addresses. Countries such as South Africa have comparatively high commercial geospatial data availability despite high mortality, whilst countries such as South Korea have comparatively low data availability and low mortality. Sub-nationally, evidence was mixed as to whether geocoding success was lowest in more deprived districts. To our knowledge, this is the first global analysis of commercial geospatial data resources in relation to health outcomes. In countries such as South Africa where there is high mortality but also comparatively rich commercial geospatial data, these data resources are a potential resource for examining healthcare utilisation that requires further evaluation. In countries such as Sierra Leone where there is high mortality but minimal commercial geospatial data, alternative approaches such as open data use are needed in quantifying patient travel times, geocoding patient addresses, and characterising patients' neighbourhoods.
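
    The study relies on two standard inequality metrics. As a generic illustration on synthetic data (not the paper's 183-country dataset), the sketch below computes the slope index of inequality as the slope of a regression of the outcome on the fractional rank of units ordered by the data-availability index, and the relative concentration index as twice the covariance between the outcome and that fractional rank, divided by the mean outcome.

```python
"""Illustrative inequality metrics on synthetic country-level data."""
import numpy as np

rng = np.random.default_rng(1)
n = 50                                            # synthetic "countries"

# Synthetic geospatial data-availability index and a mortality rate that tends
# to fall as availability rises (mirroring the inverse relation described above).
availability = rng.uniform(0, 1, n)
mortality = 900 - 400 * availability + rng.normal(0, 60, n)

# Fractional rank (ridit) of each unit when ordered by the availability index.
order = np.argsort(availability)
frac_rank = np.empty(n)
frac_rank[order] = (np.arange(n) + 0.5) / n

# Slope index of inequality: slope of an OLS regression of outcome on fractional rank.
slope, intercept = np.polyfit(frac_rank, mortality, 1)
sii = slope

# Relative concentration index: 2 * cov(outcome, fractional rank) / mean(outcome).
rci = 2 * np.cov(mortality, frac_rank, bias=True)[0, 1] / mortality.mean()

print(f"Slope index of inequality (SII): {sii:.1f}")
print(f"Relative concentration index (RCI): {rci:.3f}")
```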

  17. Geo-Spatial Social Network Analysis of Social Media to Mitigate Disasters

    NASA Astrophysics Data System (ADS)

    Carley, K. M.

    2017-12-01

    Understanding the spatial layout of human activity can afford a better understanding of many phenomena, such as local culture, the spread of ideas, and the scope of a disaster. Today, social media is one of the key sensors for acquiring information on socio-cultural activity, some of it with cues as to geo-location. We ask: what can be learned by putting such data on maps? For example, are people who chat online more likely to be near each other? Can Twitter data support disaster planning or early warning? In this talk, such issues are examined using data collected via Twitter and analyzed using ORA. ORA is a network analysis and visualization system. It supports not just social networks (who is interacting with whom), but also high dimensional networks with many types of nodes (e.g. people, organizations, resources, activities, …) and relations, geo-spatial network analysis, dynamic network analysis, and geo-temporal analysis. Using ORA, lessons learned from five case studies are considered: the Arab Spring, tsunami warning in Padang, Indonesia, Twitter around Fukushima in Japan, Typhoon Haiyan (Yolanda), and regional conflict. Using the Padang, Indonesia data, we characterize the strengths and limitations of social media data to support disaster planning and early warning, identify at-risk areas and issues of concern, and estimate where people are and which areas are impacted. Using the Fukushima, Japan data, social media is used to estimate geo-spatial regularities in movement and communication that can inform disaster response and risk estimation. Using the Arab Spring data, we find that the spread of bots and extremists varies by country and time, to the extent that using Twitter to understand who is important or what ideas are critical can be compromised. Bots and extremists can exploit disaster messaging to create havoc and facilitate criminal activity, e.g., human trafficking. Event discovery mechanisms that support isolating geo-epicenters for key events become crucial. Spatial inference enables improved country and city identification. Geo-network analytics with and without these inferences reveal that explicitly geo-tagged data may not be representative and that improved location estimation provides better insight into the social condition. These results demonstrate the value of these techniques in mitigating the social impact of disasters.
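
    ORA itself is a proprietary network-analysis environment, so the sketch below uses the open source networkx library as a generic stand-in: it builds a small user-to-user interaction graph from synthetic geo-tagged messages, computes degree centrality as a rough marker of prominent accounts, and tallies message locations per sender as a crude geo-epicenter summary. All handles and coordinates are made up.

```python
"""Generic geo-tagged interaction network sketch with networkx (synthetic data)."""
from collections import Counter
import networkx as nx

# Synthetic messages: (sender, mentioned_user, (lat, lon)) -- all made up.
messages = [
    ("user_a", "user_b", (-0.95, 100.35)),
    ("user_a", "user_c", (-0.94, 100.36)),
    ("user_b", "user_c", (-0.96, 100.34)),
    ("user_d", "user_a", (-0.95, 100.37)),
    ("user_d", "user_b", (-0.93, 100.35)),
]

G = nx.DiGraph()
locations = {}
for sender, target, coord in messages:
    # Accumulate interaction weight on the sender -> mentioned-user edge.
    if G.has_edge(sender, target):
        G[sender][target]["weight"] += 1
    else:
        G.add_edge(sender, target, weight=1)
    locations.setdefault(sender, []).append(coord)

# Degree centrality as a simple proxy for who is structurally prominent.
centrality = nx.degree_centrality(G)
for user, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{user}: degree centrality {score:.2f}")

# Crude geo summary: the most frequent rounded location cell per active sender.
for user, coords in locations.items():
    rounded = Counter((round(lat, 1), round(lon, 1)) for lat, lon in coords)
    print(user, "most frequent cell:", rounded.most_common(1)[0])
```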

  18. On the Science-Policy Bridge: Do Spatial Heat Vulnerability Assessment Studies Influence Policy?

    PubMed Central

    Wolf, Tanja; Chuang, Wen-Ching; McGregor, Glenn

    2015-01-01

    Human vulnerability to heat varies at a range of spatial scales, especially within cities where there can be noticeable intra-urban differences in heat risk factors. Mapping and visualizing intra-urban heat vulnerability offers opportunities for presenting information to support decision-making. For example the visualization of the spatial variation of heat vulnerability has the potential to enable local governments to identify hot spots of vulnerability and allocate resources and increase assistance to people in areas of greatest need. Recently there has been a proliferation of heat vulnerability mapping studies, all of which, to varying degrees, justify the process of vulnerability mapping in a policy context. However, to date, there has not been a systematic review of the extent to which the results of vulnerability mapping studies have been applied in decision-making. Accordingly we undertook a comprehensive review of 37 recently published papers that use geospatial techniques for assessing human vulnerability to heat. In addition, we conducted an anonymous survey of the lead authors of the 37 papers in order to establish the level of interaction between the researchers as science information producers and local authorities as information users. Both paper review and author survey results show that heat vulnerability mapping has been used in an attempt to communicate policy recommendations, raise awareness and induce institutional networking and learning, but has not as yet had a substantive influence on policymaking or preventive action. PMID:26512681

  19. 78 FR 69393 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-19

    .... FOR FURTHER INFORMATION CONTACT: National Geospatial-Intelligence Agency (NGA), ATTN: Human...: Delete entry and replace with ``Human Development Directorate, National Geospatial-Intelligence Agency...; System of Records AGENCY: National Geospatial-Intelligence Agency, DoD. ACTION: Notice to alter a System...

  20. 77 FR 5820 - National Geospatial Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... DEPARTMENT OF THE INTERIOR Office of the Secretary National Geospatial Advisory Committee AGENCY... that the Secretary of the Interior has renewed the National Geospatial Advisory Committee. The Committee will provide advice and recommendations to the Federal Geographic Data Committee (FGDC), through...

  1. THE NEVADA GEOSPATIAL DATA BROWSER

    EPA Science Inventory

    The Landscape Ecology Branch of the U.S. Environmental Protection Agency (Las Vegas, NV) has developed the Nevada Geospatial Data Browser, a spatial data archive to centralize and distribute the geospatial data used to create the land cover, vertebrate habitat models, and land o...

  2. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    of automated processing . 2. Requirements for Geospatial Information Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of

  3. Nick Grue | NREL

    Science.gov Websites

    Geospatial data analysis using parallel processing; high performance computing; renewable resource technical potential and supply curve analysis; spatial database utilization; rapid analysis of large geospatial datasets; energy and geospatial analysis products. Research interests: rapid, web-based renewable resource analysis.

  4. Geospatial Information Best Practices

    DTIC Science & Technology

    2012-01-01

    Spring 2012. By MAJ Christopher Blais, CW2 Joshua Stratton and MSG Moise Danjoint. The fact that Geospatial information can be codified and...Operation Iraqi Freedom V (2007-2008), and Operation New Dawn (2011). MSG Moise Danjoint is the noncommissioned officer in charge, Geospatial

  5. Addressing Climate Change Adaptation in Regional Transportation Plans in California: A Guide and Online Visualization Tool for Planners to Incorporate Risks of Climate Change Impacts in Policy and Decision-Making

    NASA Astrophysics Data System (ADS)

    Tao, W.; Tucker, K.; DeFlorio, J.

    2012-12-01

    The reality of a changing climate means that transportation and planning agencies need to understand the potential effects of changes in storm activity, sea levels, temperature, and precipitation patterns, and develop strategies to ensure the continuing robustness and resilience of transportation infrastructure and services. This is a relatively new challenge for California's regional planning agencies, adding yet one more consideration to an already complex and multifaceted planning process. In that light, the California Department of Transportation (Caltrans) is developing a strategy framework using a module-based process that planning agencies can undertake to incorporate the risks of climate change impacts into their decision-making and long-range transportation plans. The module-based approach was developed using a best-practices survey of existing work nationally, along with a set of structured interviews with metropolitan planning organizations (MPOs) and regional transportation planning agencies (RTPAs) within California. Findings led to the development of a process, as well as a package of foundational geospatial layers (i.e. the Statewide Transportation Asset Geodatabase - STAG), primarily comprising state and Federal transportation assets. These assets are intersected with a set of geospatial layers for the climate stressors of relevance in the state, which are placed in the same reference layers as the STAG, thus providing a full set of GIS layers that can be a starting point for MPOs/RTPAs that want to follow the step-by-step module-based approach in its entirety. The fast-paced changes in science and climate change knowledge require a flexible platform to display continuously evolving information. To this end, the development of the modules is accompanied by a set of geospatial analyses disseminated through an online web portal. In this way, the information can be relayed to MPOs/RTPAs in an easy-to-use fashion that can help them follow the modules of the strategy framework. The strategy framework for MPOs and RTPAs is used to: 1) assess the relative risks to their transportation system infrastructure and services from different climate stressors (sea level rise, temperature changes, snow melt, precipitation changes, flooding, extreme weather events); 2) conduct an asset inventory and vulnerability assessment of existing infrastructure; 3) prioritize segments and facilities for adaptation action; 4) identify appropriate and cost-effective adaptation strategies; and 5) incorporate climate impact considerations into future long-range transportation planning and investment decisions. This framework complements the broader planning and investment processes that MPOs and RTPAs already manage. It recognizes the varying capacities and resources among MPOs and RTPAs and provides methods that can be used by organizations seeking to conduct an in-depth analysis or a more sketch-level assessment.
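
    The central GIS step described above is intersecting transportation asset layers with climate stressor layers. The sketch below illustrates that operation with geopandas on two tiny hand-made layers (a buffered "highway" line and an inundation polygon); the geometries, attribute names, and CRS are placeholders, not Caltrans STAG data.

```python
"""Illustrative asset/stressor intersection with geopandas (synthetic geometries)."""
import geopandas as gpd
from shapely.geometry import LineString, Polygon

# Synthetic transportation assets: two highway segments, buffered to polygons so
# the overlay yields areas (units here are arbitrary planar units).
assets = gpd.GeoDataFrame(
    {"asset_id": ["HWY-1", "HWY-2"]},
    geometry=[
        LineString([(0, 0), (10, 0)]).buffer(0.2),
        LineString([(0, 5), (10, 5)]).buffer(0.2),
    ],
    crs="EPSG:3857",
)

# Synthetic climate stressor layer: one inundation polygon covering the south.
stressors = gpd.GeoDataFrame(
    {"stressor": ["sea_level_rise_1m"]},
    geometry=[Polygon([(-1, -2), (11, -2), (11, 2), (-1, 2)])],
    crs="EPSG:3857",
)

# Intersection keeps only the portions of assets exposed to the stressor.
exposed = gpd.overlay(assets, stressors, how="intersection")
exposed["exposed_area"] = exposed.geometry.area

print(exposed[["asset_id", "stressor", "exposed_area"]])
```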

  6. Development of Distributed Research Center for analysis of regional climatic and environmental changes

    NASA Astrophysics Data System (ADS)

    Gordov, E.; Shiklomanov, A.; Okladnikov, I.; Prusevich, A.; Titov, A.

    2016-11-01

    We present an approach and first results of a collaborative project being carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia and Earth Systems Research Center UNH, USA. Its main objective is development of a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting of regional climatic and environmental changes in the Northern extratropical areas. The DRC should provide the specialists working in climate related sciences and decision-makers with accurate and detailed climatic characteristics for the selected area and reliable and affordable tools for their in-depth statistical analysis and studies of the effects of climate change. Within the framework of the project, new approaches to cloud processing and analysis of large geospatial datasets (big geospatial data) inherent to climate change studies are developed and deployed on technical platforms of both institutions. We discuss here the state of the art in this domain, describe web based information-computational systems developed by the partners, justify the methods chosen to reach the project goal, and briefly list the results obtained so far.

  7. Geospatial analysis identifies critical mineral-resource potential in Alaska

    USGS Publications Warehouse

    Karl, Susan M.; Labay, Keith A.; Jacques, Katherine; Landowski, Claire

    2017-03-03

    Alaska consists of more than 663,000 square miles (1,717,000 square kilometers) of land—more than a sixth of the total area of the United States—and large tracts of it have not been systematically studied or sampled for mineral-resource potential. Many regions of the State are known to have significant mineral-resource potential, and there are currently six operating mines in the State along with numerous active mineral exploration projects. The U.S. Geological Survey and the Alaska Division of Geological & Geophysical Surveys have developed a new geospatial tool that integrates and analyzes publicly available databases of geologic information and estimates the mineral-resource potential for critical minerals, which was recently used to evaluate Alaska. The results of the analyses highlight areas that have known mineral deposits and also reveal areas that were not previously considered to be prospective for these deposit types. These results will inform land management decisions by Federal, State, and private landholders, and will also help guide future exploration activities and scientific investigations in Alaska.

  8. Interoperable cross-domain semantic and geospatial framework for automatic change detection

    NASA Astrophysics Data System (ADS)

    Kuo, Chiao-Ling; Hong, Jung-Hong

    2016-01-01

    With the increasingly diverse types of geospatial data established over the last few decades, semantic interoperability in integrated applications has attracted much interest in the field of Geographic Information Systems (GIS). This paper proposes a new strategy and framework to process cross-domain geodata at the semantic level. This framework leverages the semantic equivalence of concepts between domains through a bridge ontology and facilitates the integrated use of data from different domains, which has long been considered an essential strength of GIS but is impeded by the lack of understanding of the semantics implicitly hidden in the data. We choose the task of change detection to demonstrate how the introduction of ontology concepts can effectively make such integration possible. We analyze the common properties of geodata and change detection factors, then construct rules and summarize possible change scenarios for making final decisions. The use of topographic map data to detect changes in land use shows promising success as far as the improvement of efficiency and level of automation is concerned. We believe the ontology-oriented approach will enable a new way of integrating data across different domains from the perspective of semantic interoperability, and even open a new dimension for future GIS.
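
    The key idea is a bridge ontology that maps domain-specific class names onto shared concepts before change rules are applied. The snippet below is a toy, dictionary-based stand-in for that approach (no OWL reasoning, and not the authors' rule set): land-cover codes from two hypothetical domain vocabularies are first mapped to common concepts, and a change is flagged only when the shared concepts differ.

```python
"""Toy bridge-ontology change detection: map domain terms to shared concepts,
then compare at the concept level. Vocabularies and parcels are hypothetical."""

# Hypothetical bridge ontology: domain-specific class -> shared concept.
TOPO_MAP_TO_CONCEPT = {
    "building_block": "built_up",
    "paddy_field": "agriculture",
    "broadleaf_forest": "forest",
}
LANDUSE_SURVEY_TO_CONCEPT = {
    "residential": "built_up",
    "cropland": "agriculture",
    "woodland": "forest",
}

def detect_change(topo_class, survey_class):
    """Flag a semantic change only when the two shared concepts differ."""
    c_old = TOPO_MAP_TO_CONCEPT.get(topo_class, "unknown")
    c_new = LANDUSE_SURVEY_TO_CONCEPT.get(survey_class, "unknown")
    return c_old != c_new

# Parcels described by an older topographic map class and a newer survey class.
parcels = [
    ("P-001", "paddy_field", "cropland"),       # same concept -> no change
    ("P-002", "paddy_field", "residential"),    # agriculture -> built_up: change
    ("P-003", "broadleaf_forest", "woodland"),  # same concept -> no change
]

for pid, old_cls, new_cls in parcels:
    status = "CHANGED" if detect_change(old_cls, new_cls) else "unchanged"
    print(f"{pid}: {old_cls} -> {new_cls}: {status}")
```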

  9. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage capabilities of end-to-end application service and virtualized computing framework in HUBzero. Funded by NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  10. Goddard Space Flight Center's Partnership with Florida International University

    NASA Astrophysics Data System (ADS)

    Rishe, N. D.; Graham, S. C.; Gutierrez, M. E.

    2004-12-01

    NASA's Goddard Space Flight Center (GSFC) has been collaborating with Florida International University's High Performance Database Research Center (FIU HPDRC) for nearly ten years. Much of this collaboration was funded through a NASA Institutional Research Award (IRA). That award involved research in the Internet dissemination of geospatial data, and in recruiting and training student researchers. FIU's TerraFly web service presently serves more than 10,000 unique users per day by providing an easy-to-use mechanism for exploring geospatial data and imagery. IRA-supported students have received 47 Bachelor's degrees, 20 Master's degrees, and 2 Doctoral degrees at FIU. FIU leveraged IRA funding into over $19 million in other funding and donations for their research and training activities and has published nearly 150 scientific papers acknowledging the NASA IRA award. GSFC has worked closely with FIU HPDRC in the development of their geospatial data storage and dissemination research. TerraFly presents many NASA datasets, such as the nationwide mosaic of Landsat 5, the PRISM precipitation model, and the TRMM accumulated rainfall worldwide, as well as USGS aerial photography nationwide at 30 cm to 1 m resolutions, demographic data, Ikonos satellite imagery, and many more. Our presentation will discuss the lessons learned during the collaboration between GSFC and FIU as well as our current research projects.

  11. Using Participatory Approach to Improve Availability of Spatial Data for Local Government

    NASA Astrophysics Data System (ADS)

    Kliment, T.; Cetl, V.; Tomič, H.; Lisiak, J.; Kliment, M.

    2016-09-01

    Nowadays, authoritative geospatial features covering various data themes are becoming more widely available at global, regional, and national levels. The reasons are the existence of legislative frameworks for public-sector information and the related spatial data infrastructure implementations, together with the emergence of initiatives such as open data and big data, which ensure that online geospatial information is made available to the digital single market, entrepreneurs, and public bodies at both national and local levels. However, authoritative reference spatial data linking the geographic representation of properties to their owners are still missing at an appropriate level of quantity and quality, even though these data represent a fundamental input for local governments for the register of buildings used in property tax calculations, the identification of illegal buildings, and similar tasks. We propose a methodology to improve this situation by applying the principles of participatory GIS and volunteered geographic information (VGI) to collect observations, update authoritative datasets, and verify newly developed datasets of building areas used to calculate the property tax rates issued to their owners. The case study was performed within the district of the City of Požega in eastern Croatia in the summer of 2015 and resulted in a total of 16072 updated or newly identified objects made available online for quality verification by citizens using open-source geospatial technologies.
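
    As a minimal, hypothetical illustration of one step such a workflow might include, computing footprint areas for verified building polygons before they feed a tax register, the Python sketch below uses Shapely; the geometries, identifiers, and tax rate are invented, and the paper does not describe its processing at this level of detail.

```python
# Minimal sketch (not from the paper): compute building footprint areas from
# citizen-digitized polygons, as one input to a property-tax register.
# Coordinates are assumed to be in a projected CRS with metre units; the
# polygons and the tax rate below are invented for illustration.
from shapely.geometry import Polygon

buildings = {
    "object_001": Polygon([(0, 0), (12, 0), (12, 8), (0, 8)]),      # 96 m^2
    "object_002": Polygon([(20, 5), (29, 5), (29, 15), (20, 15)]),  # 90 m^2
}

TAX_RATE_PER_M2 = 0.75  # hypothetical rate, currency units per square metre

for object_id, footprint in buildings.items():
    area_m2 = footprint.area
    print(f"{object_id}: {area_m2:.1f} m^2 -> tax {area_m2 * TAX_RATE_PER_M2:.2f}")
```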

  12. I Want It, You've Got It - Effectively Connect Users to Geospatial Resources

    NASA Astrophysics Data System (ADS)

    White, C. E.

    2012-12-01

    How do users of scientific data find what they need? How do they know where to look, what to look for, how to evaluate it, and, if they find the right resource, how to get it? When the data is geospatial, other factors also come into play: is the data in a format and projection compatible with the other data being used, does the user have access to tools that can analyze and display the data well enough to evaluate it, and does the user know how to manage that access, especially if the data is exposed through web services? Supporting users and connecting them to geospatial data in a continually evolving technological climate is a challenge that reaches deeply into all levels of data management. In this talk, we will discuss specific challenges in how users discover and access resources, and how Esri has evolved solutions over time to more effectively connect users to what they need. Some of the challenges, and current solutions, that will be discussed are: balancing a straightforward user experience with rich functionality, providing simple descriptions while maintaining complete metadata, enabling data access that works with an organization's content while remaining compatible with other organizations' access mechanisms, and the ability to publish data once yet share it in many venues.

  13. Data collected to support monitoring of constructed emergent sandbar habitat on the Missouri River downstream from Gavins Point Dam, South Dakota and Nebraska, 2004-06

    USGS Publications Warehouse

    Thompson, Ryan F.; Johnson, Michaela R.; Andersen, Michael J.

    2007-01-01

    The U.S. Army Corps of Engineers has constructed emergent sandbar habitat on sections of the Missouri River bordering South Dakota and Nebraska downstream from Gavins Point Dam to create and enhance habitat for threatened and endangered bird species. Two areas near river miles 761.3 and 769.8 were selected for construction of emergent sandbar habitat. Pre- and postconstruction data were collected by the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, to evaluate the success of the habitat management techniques. Data collected include pre- and postconstruction channel-geometry data (bathymetric and topographic) for areas upstream from, downstream from, and within each construction site. Water-velocity data were collected for selected parts of the site near river mile 769.8. Instruments and methods used in data collection, as well as quality-assurance and quality-control measures, are described. Geospatial channel-geometry data are presented for transects of the river channel as cross sections and as geographical information system shapefiles. Geospatial land-surface elevation data are provided for part of each site in the form of a color-shaded relief map. Geospatial water-velocity data also are provided as color-shaded maps and geographical information system shapefiles.

  14. Multi-class geospatial object detection and geographic image classification based on collection of part detectors

    NASA Astrophysics Data System (ADS)

    Cheng, Gong; Han, Junwei; Zhou, Peicheng; Guo, Lei

    2014-12-01

    The rapid development of remote sensing technology has made it possible to acquire remote sensing images with ever higher spatial resolution, but automatically understanding image content remains a major challenge. In this paper, we develop a practical and rotation-invariant framework for multi-class geospatial object detection and geographic image classification based on a collection of part detectors (COPD). The COPD is composed of a set of representative and discriminative part detectors, where each part detector is a linear support vector machine (SVM) classifier used to detect objects or recurring spatial patterns within a certain range of orientations. Specifically, when performing multi-class geospatial object detection, we learn a set of seed-based part detectors in which each part detector corresponds to a particular viewpoint of an object class, so that the collection as a whole provides rotation-invariant detection of multi-class objects. When performing geographic image classification, we utilize a large number of pre-trained part detectors to discover distinctive visual parts in images and use them as attributes to represent the images. Comprehensive evaluations on two remote sensing image databases and comparisons with state-of-the-art approaches demonstrate the effectiveness and superiority of the developed framework.
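
    As a rough sketch of the scoring idea, not the authors' code or features, the Python snippet below evaluates an image window against a hypothetical bank of orientation-specific linear part detectors and takes the maximum response per class, which is one simple way to obtain rotation-tolerant detection; the detector weights, feature extractor, and threshold are all invented placeholders.

```python
# Rough sketch (not the authors' code): score an image window with a bank of
# linear "part detectors" and take the maximum response per object class, so
# that a detection fires if any orientation-specific detector matches.
# Detector weights and feature extraction are stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)
FEATURE_DIM = 128

# Hypothetical bank: each class has several linear detectors (w, b), one per
# viewpoint/orientation range, as in a COPD-style collection.
detector_bank = {
    "airplane": [(rng.normal(size=FEATURE_DIM), -0.5) for _ in range(4)],
    "vehicle":  [(rng.normal(size=FEATURE_DIM), -0.5) for _ in range(4)],
}

def extract_features(window: np.ndarray) -> np.ndarray:
    """Placeholder feature extractor (e.g., HOG descriptors in a real system)."""
    flat = window.ravel().astype(float)
    resampled = np.interp(np.linspace(0, 1, FEATURE_DIM),
                          np.linspace(0, 1, flat.size), flat)
    return resampled / (np.linalg.norm(resampled) + 1e-9)

def classify_window(window: np.ndarray, threshold: float = 0.0):
    """Return (best_class, score) if any class's max detector response exceeds the threshold."""
    x = extract_features(window)
    scores = {cls: max(float(w @ x + b) for w, b in detectors)
              for cls, detectors in detector_bank.items()}
    best_cls, best_score = max(scores.items(), key=lambda kv: kv[1])
    return (best_cls, best_score) if best_score > threshold else (None, best_score)

window = rng.random((32, 32))  # stand-in image patch
print(classify_window(window))
```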

  15. Geospatial techniques for developing a sampling frame of watersheds across a region

    USGS Publications Warehouse

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing the watershed drainage areas of streams in the general population with those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was of the overall population. Geospatial tools provided a relatively inexpensive means of generating the information necessary to develop a statistically robust, probability-based sampling design.
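
    As a simplified, hypothetical sketch of the sampling design described above (proportional allocation of a fixed sample across strata), the Python snippet below draws a stratified random sample of 60 watersheds from an invented sampling frame; the strata, frame, and rounding rule are illustrative assumptions rather than the study's actual procedure.

```python
# Simplified sketch (not the USGS workflow): draw a stratified random sample of
# watersheds with proportional allocation across strata. The strata here are
# invented combinations of ecoregion and dominant lithology; real inputs would
# come from the GIS-derived sampling frame described in the paper.
import random
from collections import defaultdict

random.seed(42)

# Hypothetical sampling frame: watershed_id -> (ecoregion, lithology)
frame = {f"ws_{i:04d}": (random.choice(["Coast Range", "Klamath"]),
                         random.choice(["sedimentary", "igneous"]))
         for i in range(500)}

TOTAL_SAMPLE = 60

# Group watersheds by stratum.
strata = defaultdict(list)
for ws_id, stratum in frame.items():
    strata[stratum].append(ws_id)

# Proportional allocation: each stratum's share of the sample matches its share
# of the frame (rounded; a real design would reconcile rounding to hit exactly 60).
sample = []
for stratum, members in strata.items():
    n_stratum = round(TOTAL_SAMPLE * len(members) / len(frame))
    sample.extend(random.sample(members, min(n_stratum, len(members))))

print(f"{len(sample)} watersheds sampled from {len(strata)} strata")
```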

  16. Model My Watershed and BiG CZ Data Portal: Interactive geospatial analysis and hydrological modeling web applications that leverage the Amazon cloud for scientists, resource managers and students

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.

    2016-12-01

    The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application that gives scientists intuitive, high-performance discovery, visualization, access, and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for a chosen area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), a high-performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis), and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can perform on-the-fly rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling scientists, resource managers, and students alike to share data, analyses, and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system that enables citizen scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems so that the data can be re-shared using WaterOneFlow web services.
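
    The on-the-fly watershed delineation mentioned above can be illustrated, in a highly simplified form, by the toy Python sketch below: it is not the GeoTrellis-based engine these applications use, and the tiny flow-direction grid, outlet, and breadth-first traversal are assumptions chosen only to show the idea of collecting every cell that drains to a selected outlet.

```python
# Toy sketch of "on-the-fly" watershed delineation (not the GeoTrellis engine
# behind these apps): each cell of a tiny invented grid stores the offset of
# its downstream neighbor (a D8-style flow direction), and a breadth-first
# search from the chosen outlet collects every cell that drains to it.
from collections import deque

# Hypothetical 4x4 flow grid: (row, col) -> downstream (row, col).
# Here every cell simply drains toward the bottom-right corner.
ROWS, COLS = 4, 4
flow = {(r, c): (min(r + 1, ROWS - 1), min(c + 1, COLS - 1))
        for r in range(ROWS) for c in range(COLS)}

def delineate_watershed(outlet):
    """Return the set of cells whose flow path reaches the outlet cell."""
    # Invert the flow mapping: downstream cell -> cells draining directly into it.
    upstream = {}
    for cell, down in flow.items():
        if down != cell:                 # the outlet drains to itself in this toy grid
            upstream.setdefault(down, []).append(cell)
    watershed, queue = {outlet}, deque([outlet])
    while queue:
        current = queue.popleft()
        for contributor in upstream.get(current, []):
            if contributor not in watershed:
                watershed.add(contributor)
                queue.append(contributor)
    return watershed

print(len(delineate_watershed((3, 3))))  # all 16 cells drain to the corner
```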

  17. Community Needs Assessment and Portal Prototype Development for an Arctic Spatial Data Infrastructure (ASDI): A Contribution to an IPY Data Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Wiggins, H. V.; Warnick, W. K.; Hempel, L. C.; Henk, J.; Sorensen, M.; Tweedie, C. E.; Gaylord, A.; Behr, S.

    2006-12-01

    As the creation and use of geospatial data in research, management, logistics, and education applications have proliferated, there is now tremendous potential for advancing the IPY initiative through a variety of cyberinfrastructure applications, including Spatial Data Infrastructure (SDI) and related technologies. SDIs provide a necessary and common framework of standards, security, policies, procedures, and technology to support the effective acquisition, coordination, dissemination, and use of geospatial data by multiple, distributed stakeholder and user groups. Despite the numerous research activities in the Arctic, there is no established SDI, and because of this lack of a coordinated infrastructure there is inefficiency, duplication of effort, and reduced data quality and searchability of arctic geospatial data. The urgency of establishing this framework is significant considering the myriad data likely to be collected during the International Polar Year (IPY) in 2007-2008 and the current international momentum for an improved and integrated circumarctic terrestrial-marine-atmospheric environmental observatories network. The key objective of this project is to lay the foundation for full implementation of an Arctic Spatial Data Infrastructure (ASDI) through two related activities: (1) an assessment, via interviews, questionnaires, a workshop, and other means, of community needs, readiness, and resources; and (2) the development of a prototype web mapping portal to demonstrate the purpose and function of an arctic geospatial one-stop portal technology and to solicit community input on design and function. The results of this project will be compiled into a comprehensive report guiding the research community and funding agencies in the design and implementation of an ASDI that contributes to a robust IPY data cyberinfrastructure.

  18. Geospatial Analysis Using Remote Sensing Images: Case Studies of Zonguldak Test Field

    NASA Astrophysics Data System (ADS)

    Bayık, Çağlar; Topan, Hüseyin; Özendi, Mustafa; Oruç, Murat; Cam, Ali; Abdikan, Saygın

    2016-06-01

    Inclined topography is one of the most challenging problems for the geospatial analysis of airborne and spaceborne imagery, whereas results obtained over flat areas can be misleading about real performance. For this reason, researchers generally require a study area that includes mountainous topography and various land cover and land use types. Zonguldak and its vicinity is a very suitable test site for investigating the performance of remote sensing systems because it contains different land use types such as dense forest, river, sea, and urban area; different structures such as open-pit mining operations and a thermal power plant; and mountainous terrain. In this paper, we reviewed more than 120 proceedings papers and journal articles about geospatial analyses performed on the Zonguldak test field and its surroundings. The geospatial analyses performed with this imagery include elimination of systematic geometric errors, 2D/3D georeferencing accuracy assessment, DEM and DSM generation and validation, ortho-image production, evaluation of information content, image classification, automatic feature extraction and object recognition, pan-sharpening, land use and land cover change analysis, and deformation monitoring. These applications use many optical satellite images, e.g. ASTER, Bilsat-1, IKONOS, IRS-1C, KOMPSAT-1, KVR-1000, Landsat-3-5-7, Orbview-3, QuickBird, Pleiades, SPOT-5, TK-350, and WorldView-1-2, as well as radar data, e.g. RADARSAT-1, JERS-1, Envisat ASAR, TerraSAR-X, ALOS PALSAR, and SRTM. These studies were performed by the Departments of Geomatics Engineering at Bülent Ecevit University, İstanbul Technical University, and Yıldız Technical University, and by the Institute of Photogrammetry and GeoInformation at Leibniz University Hannover. They were financially supported by TÜBİTAK (Turkey), the universities, ESA, Airbus DS, ERSDAC (Japan), and the Jülich Research Centre (Germany).

  19. US EPA GEOSPATIAL QUALITY COUNCIL: ENSURING QUALITY IN GEOSPATIAL SOLUTIONS

    EPA Science Inventory

    In 1999, the U.S. Environmental Protection Agency (EPA), Office of Research and Development, Environmental Sciences Division, created the EPA Geospatial Quality Council (GQC) to fill the gap between the EPA Quality Assurance (QA) and Geospatial communities. GQC participants inclu...

  20. Searches over graphs representing geospatial-temporal remote sensing data

    DOEpatents

    Brost, Randolph; Perkins, David Nikolaus

    2018-03-06

    Various technologies pertaining to identifying objects of interest in remote sensing images by searching over geospatial-temporal graph representations are described herein. Graphs are constructed by representing objects in remote sensing images as nodes and connecting the nodes with undirected edges, which represent distance or adjacency relationships between objects, and with directed edges, which represent changes over time. Geospatial-temporal graph searches are made computationally efficient by exploiting characteristics of the geospatial-temporal data in remote sensing images through the application of various graph search techniques.
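
    As an illustrative sketch only, and not the patented method, the Python snippet below builds a tiny geospatial-temporal graph with networkx, storing an undirected spatial relation as a pair of directed edges and a temporal relation as a single directed edge, and then runs a toy query over it; all node labels, relations, and the query itself are invented for demonstration.

```python
# Illustrative sketch only (not the patented method): build a small
# geospatial-temporal graph in which image objects are nodes, undirected
# spatial relations are stored as paired directed edges tagged "adjacent",
# and directed "next" edges link an object to its state in a later image.
# A toy query then looks for cleared areas that appear next to a road.
import networkx as nx

G = nx.MultiDiGraph()

# Nodes: objects extracted from two hypothetical image dates.
G.add_node("forest_t1", label="forest", time=1)
G.add_node("cleared_t2", label="cleared_area", time=2)
G.add_node("road_t2", label="road", time=2)

# Spatial relation (undirected in concept, stored as a pair of directed edges).
G.add_edge("cleared_t2", "road_t2", relation="adjacent")
G.add_edge("road_t2", "cleared_t2", relation="adjacent")

# Temporal relation: the forest object becomes a cleared area in the later image.
G.add_edge("forest_t1", "cleared_t2", relation="next")

def find_new_clearings_near_roads(graph):
    """Nodes labeled 'cleared_area' that follow a 'forest' node in time and touch a road."""
    hits = []
    for node, attrs in graph.nodes(data=True):
        if attrs.get("label") != "cleared_area":
            continue
        was_forest = any(graph.nodes[u].get("label") == "forest"
                         for u, _, d in graph.in_edges(node, data=True)
                         if d.get("relation") == "next")
        near_road = any(graph.nodes[v].get("label") == "road"
                        for _, v, d in graph.out_edges(node, data=True)
                        if d.get("relation") == "adjacent")
        if was_forest and near_road:
            hits.append(node)
    return hits

print(find_new_clearings_near_roads(G))  # ['cleared_t2']
```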
