Sample records for google earth api

  1. Web GIS in practice III: creating a simple interactive map of England's Strategic Health Authorities using Google Maps API, Google Earth KML, and MSN Virtual Earth Map Control

    PubMed Central

    Boulos, Maged N Kamel

    2005-01-01

    This eye-opener article aims at introducing the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities. These can be browsed online at – Google Maps API (Application Programming Interface) version, – Google Earth KML (Keyhole Markup Language) version, and – MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyrights issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577
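The Google Earth KML version described above amounts to serving placemark markup that Google Earth can render. The following is a minimal, hypothetical sketch of generating such a file in Python; the authority name and coordinates are illustrative, not taken from the article.

```python
# Sketch: build a tiny KML document with one point placemark.
# The name and lon/lat below are made-up examples.

def kml_placemark(name, lon, lat):
    """Return a KML <Placemark> element for a single point."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

def kml_document(placemarks):
    """Wrap placemarks in the standard KML envelope."""
    body = "".join(placemarks)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        f"<Document>{body}</Document></kml>"
    )

doc = kml_document([kml_placemark("Example SHA centroid", -3.5, 50.7)])
```

A file produced this way can be opened directly in Google Earth or fetched by the browser plug-in.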

  2. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full-scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]

  3. Interactive Computing and Processing of NASA Land Surface Observations Using Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Molthan, Andrew; Burks, Jason; Bell, Jordan

    2016-01-01

    Google's Earth Engine offers a "big data" approach to processing large volumes of NASA and other remote sensing products (https://earthengine.google.com/). Interfaces include a JavaScript- or Python-based API, useful for accessing and processing long periods of record of Landsat and MODIS observations. Other data sets are frequently added, including weather and climate model data sets. Demonstrations here focus on exploratory efforts to perform land surface change detection related to severe weather and other disaster events.
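As a hedged illustration of the land surface change detection mentioned here, the sketch below differences a vegetation index (NDVI) between two dates and flags large drops. The tiny (NIR, red) pixel tuples are synthetic stand-ins; a real workflow would run equivalent logic server-side over Landsat or MODIS collections via the Earth Engine API.

```python
# Sketch: flag pixels whose NDVI dropped sharply between two dates,
# a simple proxy for storm or disaster damage. Values are synthetic.

def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def change_mask(before, after, threshold=0.2):
    """True where NDVI decreased by more than `threshold`."""
    mask = []
    for (nir0, red0), (nir1, red1) in zip(before, after):
        mask.append(ndvi(nir0, red0) - ndvi(nir1, red1) > threshold)
    return mask

before = [(0.5, 0.1), (0.5, 0.1), (0.3, 0.3)]   # (NIR, red) per pixel
after  = [(0.5, 0.1), (0.1, 0.4), (0.3, 0.3)]   # middle pixel disturbed
print(change_mask(before, after))  # → [False, True, False]
```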

  4. Low-cost Tools for Aerial Video Geolocation and Air Traffic Analysis for Delay Reduction Using Google Earth

    NASA Astrophysics Data System (ADS)

    Zetterlind, V.; Pledgie, S.

    2009-12-01

    Low-cost, low-latency, robust geolocation and display of aerial video is a common need for a wide range of earth observing as well as emergency response and security applications. While hardware costs for aerial video collection systems, GPS, and inertial sensors have been decreasing, software costs for geolocation algorithms and reference imagery/DTED remain expensive and highly proprietary. As part of a Federal Small Business Innovative Research project, MosaicATM and EarthNC, Inc have developed a simple geolocation system based on the Google Earth API and Google's 'built-in' DTED and reference imagery libraries. This system geolocates aerial video based on platform and camera position, attitude, and field-of-view metadata using geometric photogrammetric principles of ray-intersection with DTED. Geolocated video can be directly rectified and viewed in the Google Earth API during processing. Work is underway to extend our geolocation code to NASA World Wind for additional flexibility and a fully open-source platform. In addition to our airborne remote sensing work, MosaicATM has developed the Surface Operations Data Analysis and Adaptation (SODAA) tool, funded by NASA Ames, which supports analysis of airport surface operations to optimize aircraft movements and reduce fuel burn and delays. As part of SODAA, MosaicATM and EarthNC, Inc have developed powerful tools to display national airspace data and time-animated 3D flight tracks in Google Earth for 4D analysis. The SODAA tool can convert raw format flight track data, FAA National Flight Data (NFD), and FAA 'Adaptation' airport surface data to a spatial database representation and then to Google Earth KML. The SODAA client provides users with a simple graphical interface through which to generate queries with a wide range of predefined and custom filters, plot results, and export for playback in Google Earth in conjunction with NFD and Adaptation overlays.
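The ray-intersection principle described in this abstract can be sketched simply: march a view ray from the platform position until it passes below the terrain surface. The code below is a toy version under stated assumptions (a flat terrain function standing in for the DTED lookup, a local Cartesian frame in metres); the real system derives the ray from camera attitude and field-of-view metadata.

```python
# Sketch: geolocate a ground point by marching a camera ray against a
# terrain model. `terrain_height` is a placeholder for a DTED lookup.

def terrain_height(x, y):
    return 100.0   # flat terrain at 100 m, for the demo only

def geolocate(origin, direction, step=1.0, max_range=20000.0):
    """Return the first point on the ray at/below the terrain, or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    t = 0.0
    while t < max_range:
        x, y, z = ox + t * dx, oy + t * dy, oz + t * dz
        if z <= terrain_height(x, y):
            return (x, y, z)
        t += step
    return None

# Camera at 1100 m looking straight down hits the 100 m surface.
hit = geolocate((0.0, 0.0, 1100.0), (0.0, 0.0, -1.0))
```

A production implementation would use the actual elevation grid and a finer root-finding step near the surface, but the intersection idea is the same.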

  5. MaRGEE: Move and Rotate Google Earth Elements

    NASA Astrophysics Data System (ADS)

    Dordevic, Mladen M.; Whitmeyer, Steven J.

    2015-12-01

    Google Earth is recognized as a highly effective visualization tool for geospatial information. However, there remain serious limitations that have hindered its acceptance as a tool for research and education in the geosciences. One significant limitation is the inability to translate or rotate geometrical elements on the Google Earth virtual globe. Here we present a new JavaScript web application to "Move and Rotate Google Earth Elements" (MaRGEE). MaRGEE includes tools to simplify, translate, and rotate elements, add intermediate steps to a transposition, and batch process multiple transpositions. The transposition algorithm uses spherical geometry calculations, such as the haversine formula, to accurately reposition groups of points, paths, and polygons on the Google Earth globe without distortion. Due to the imminent deprecation of the Google Earth API and browser plugin, MaRGEE uses a Google Maps interface to facilitate and illustrate the transpositions. However, the inherent spatial distortions that result from the Google Maps Web Mercator projection are not apparent once the transposed elements are saved as a KML file and opened in Google Earth. Potential applications of the MaRGEE toolkit include tectonic reconstructions, the movements of glaciers or thrust sheets, and time-based animations of other large- and small-scale geologic processes.
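The spherical-geometry step at the core of a translation like MaRGEE's can be sketched as a great-circle "destination point" calculation: move each vertex of a path or polygon the same distance along the same bearing, so shapes are repositioned without distortion. This is a minimal sketch assuming a spherical Earth of mean radius; it is not MaRGEE's actual code.

```python
import math

# Sketch: great-circle destination point on a sphere (mean Earth radius).
# Translating a whole element = applying this to every vertex.

R = 6371.0088  # mean Earth radius, km

def destination(lat, lon, bearing_deg, dist_km):
    """Point reached from (lat, lon) after dist_km along bearing_deg."""
    phi1 = math.radians(lat)
    lam1 = math.radians(lon)
    theta = math.radians(bearing_deg)
    delta = dist_km / R  # angular distance
    phi2 = math.asin(math.sin(phi1) * math.cos(delta)
                     + math.cos(phi1) * math.sin(delta) * math.cos(theta))
    lam2 = lam1 + math.atan2(
        math.sin(theta) * math.sin(delta) * math.cos(phi1),
        math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lam2)

# ~111.195 km due east at the equator is about one degree of longitude.
lat2, lon2 = destination(0.0, 0.0, 90.0, 111.195)
```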

  6. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. 
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
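The "best-pixel" compositing the abstract mentions can be illustrated with a per-pixel median across a stack of scenes: bright cloud outliers and data gaps fall away. The sketch below uses tiny synthetic reflectance lists, not real Landsat data; Earth Engine performs the equivalent reduction server-side over full image collections.

```python
from statistics import median

# Sketch: median composite across scenes; None marks a gap (e.g. SLC-off
# or missing coverage), large values stand in for cloudy pixels.

def best_pixel_composite(stack):
    """stack: list of scenes, each a list of pixel values (None = gap)."""
    composite = []
    for pixel_values in zip(*stack):
        valid = [v for v in pixel_values if v is not None]
        composite.append(median(valid) if valid else None)
    return composite

scenes = [
    [0.10, 0.95, 0.12],   # second pixel cloudy in this scene
    [0.11, 0.20, None],   # third pixel missing in this scene
    [0.09, 0.22, 0.14],
]
print(best_pixel_composite(scenes))
```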

  7. A Web-Based Interactive Mapping System of State Wide School Performance: Integrating Google Maps API Technology into Educational Achievement Data

    ERIC Educational Resources Information Center

    Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis

    2008-01-01

    Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…

  8. Learning to Map the Earth and Planets using a Google Earth - based Multi-student Game

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Wild, S. C.; Dordevic, M.

    2011-12-01

    We report on progress in developing an interactive geological and geophysical mapping game employing the Google Earth, Google Moon, and Google Mars virtual globes. Working in groups of four, students represent themselves on the Google Earth surface by selecting an avatar. One of the group drives to each field stop in a model vehicle using game-like controls. When they arrive at a field stop and get out of their field vehicle, students can control their own avatars' movements independently and can communicate with one another by text message. They are geo-fenced and receive automatic messages if they wander off target. Individual movements are logged and stored in a MySQL database for later analysis. Students collaborate on mapping decisions and submit a report to their instructor through a JavaScript interface to the Google Earth API. Unlike real mapping, students are not restricted by geographic access and can engage in comparative mapping on different planets. Using newly developed techniques, they can also explore and map the sub-surface down to the core-mantle boundary. Virtual specimens created with a 3D scanner, Gigapan images of outcrops, and COLLADA models of mantle structures such as subducted lithospheric slabs all contribute to an engaging learning experience.
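A geo-fence check of the kind described can be as simple as testing each avatar position against a lat/lon box around the current field stop and queuing an automatic message for anyone outside it. The box bounds, names, and message text below are illustrative, not from the project.

```python
# Sketch: geo-fencing avatars against a lat/lon bounding box around a
# field stop. Coordinates and messages are made-up examples.

def inside_fence(lat, lon, fence):
    """fence = (min_lat, max_lat, min_lon, max_lon)."""
    min_lat, max_lat, min_lon, max_lon = fence
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def check_positions(positions, fence):
    """positions: {avatar: (lat, lon)} -> automatic warning messages."""
    return [f"{name}: please return to the field stop"
            for name, (lat, lon) in positions.items()
            if not inside_fence(lat, lon, fence)]

fence = (37.0, 37.01, -113.01, -113.0)
positions = {"avatar1": (37.005, -113.005),   # at the field stop
             "avatar2": (37.05, -113.005)}    # wandered off target
warnings = check_positions(positions, fence)
```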

  9. Interacting with Petabytes of Earth Science Data using Jupyter Notebooks, IPython Widgets and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.

    2017-12-01

    The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way are needed. Here we describe a system for developing, interacting with, and sharing well-documented Earth science algorithms that combines existing software components:

    Jupyter Notebook: An open-source, web-based environment that supports documents combining code and computational results with text narrative, mathematics, images, and other media. These notebooks provide an environment for interactive exploration of data and development of well-documented algorithms.

    Jupyter Widgets / ipyleaflet: An architecture for creating interactive user interface controls (such as sliders, text boxes, etc.) in Jupyter Notebooks that communicate with Python code. This architecture includes a default set of UI controls as well as APIs for building custom UI controls. The ipyleaflet project is one example that offers a custom interactive map control, allowing a user to display and manipulate geographic data within the Jupyter Notebook.

    Google Earth Engine: A cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API.

    The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing.
Using these environments can make Earth science analyses easier to understand and reproducible, which may increase the rate of scientific discoveries and the transition of discoveries into real-world impacts.

  10. Development, Deployment, and Assessment of Dynamic Geological and Geophysical Models Using the Google Earth APP and API: Implications for Undergraduate Education in the Earth and Planetary Sciences

    NASA Astrophysics Data System (ADS)

    de Paor, D. G.; Whitmeyer, S. J.; Gobert, J.

    2009-12-01

    We previously reported on innovative techniques for presenting data on virtual globes such as Google Earth using emergent Collada models that reveal subsurface geology and geophysics. We here present several new and enhanced models and linked lesson plans to aid deployment in undergraduate geoscience courses, along with preliminary results from our assessment of their effectiveness. The new Collada models are created with Google SketchUp, Bonzai3D, and MeshLab software, and are grouped to cover (i) small scale field mapping areas; (ii) regional scale studies of the North Atlantic Ocean Basin, the Appalachian Orogen, and the Pacific Ring of Fire; and (iii) global scale studies of terrestrial planets, moons, and asteroids. Enhancements include emergent block models with three-dimensional surface topography; models that conserve structural orientation data; interactive virtual specimens; models that animate plate movements on the virtual globe; exploded 3-D views of planetary mantles and cores; and server-generated dynamic KML. We tested volunteer students and professors using Silverback monitoring software, think-aloud verbalizations, and questionnaires designed to assess their understanding of the underlying geo-scientific phenomena. With the aid of a cohort of instructors across the U.S., we are continuing to assess areas in which users encounter difficulties with both the software and geoscientific concepts. Preliminary results suggest that it is easy to overestimate the computer expertise of novice users even when they are content knowledge experts (i.e., instructors), and that a detailed introduction to virtual globe manipulation is essential before moving on to geoscience applications. Tasks that seem trivial to developers may present barriers to non-technical users, and technicalities that challenge instructors may block adoption in the classroom.
    We have developed new models using the Google Earth API, which permits enhanced interaction and dynamic feedback, and are assessing their relative merits versus the Google Earth APP. Overall, test students and professors value the models very highly. There are clear pedagogical opportunities for using materials such as these to create engaging in-course research opportunities for undergraduates.

  11. Detecting Runtime Anomalies in AJAX Applications through Trace Analysis

    DTIC Science & Technology

    2011-08-10

    statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12...Google web toolkit.” http://code.google.com/webtoolkit/. [12] “Form generation – drupal api.” http://api.drupal.org/api/group/form_api/6.

  12. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, Noel

    2013-04-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geo-spatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing.
    Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
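The lazy, graph-based evaluation model described above can be sketched in a few lines: operations build an expression tree, and nothing executes until a result is explicitly requested. This is a toy model of the idea, not Earth Engine's implementation; the `Lazy`, `const`, `add`, and `scale` names are invented for illustration.

```python
# Sketch: deferred computation graph. Building the graph does no work;
# evaluation happens only when .compute() is called on a node.

class Lazy:
    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps

    def compute(self):
        # Recursively evaluate dependencies, then apply this node's op.
        return self.fn(*(d.compute() for d in self.deps))

def const(v):
    return Lazy(lambda: v)

def add(a, b):
    return Lazy(lambda x, y: [p + q for p, q in zip(x, y)], a, b)

def scale(a, k):
    return Lazy(lambda x: [v * k for v in x], a)

# Building the graph touches no pixels...
graph = scale(add(const([1, 2]), const([3, 4])), 10)
# ...work happens only on demand:
result = graph.compute()  # → [40, 60]
```

In Earth Engine the same property is what makes interactive preview cheap: only the tiles actually displayed are ever computed.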

  13. Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Gorelick, N.

    2012-12-01

    The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geo-spatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing.
    Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.

  14. 3D Visualization of near real-time remote-sensing observation for hurricanes field campaign using Google Earth API

    NASA Astrophysics Data System (ADS)

    Li, P.; Turk, J.; Vu, Q.; Knosp, B.; Hristova-Veleva, S. M.; Lambrigtsen, B.; Poulsen, W. L.; Licata, S.

    2009-12-01

    NASA is planning a new field experiment, the Genesis and Rapid Intensification Processes (GRIP), in the summer of 2010 to better understand how tropical storms form and develop into major hurricanes. The DC-8 aircraft and the Global Hawk Unmanned Airborne System (UAS) will be deployed loaded with instruments for measurements including lightning, temperature, 3D wind, precipitation, liquid and ice water contents, aerosol and cloud profiles. During the field campaign, both the spaceborne and the airborne observations will be collected in real-time and integrated with the hurricane forecast models. This observation-model integration will help the campaign achieve its science goals by allowing team members to effectively plan the mission with current forecasts. To support the GRIP experiment, JPL developed a website for interactive visualization of all related remote-sensing observations in GRIP's geographical domain using the new Google Earth API. All the observations are collected in near real-time (NRT) with 2- to 5-hour latency. The observations include a 1-km blended Sea Surface Temperature (SST) map from GHRSST L2P products; 6-hour composite images of GOES IR; stability indices, temperature and vapor profiles from AIRS and AMSU-B; microwave brightness temperature and rain index maps from AMSR-E, SSMI and TRMM-TMI; ocean surface wind vectors, vorticity and divergence of the wind from QuikSCAT; the 3D precipitation structure from TRMM-PR and vertical profiles of cloud and precipitation from CloudSAT. All the NRT observations are collected from the data centers and science facilities at NASA and NOAA, subsetted, re-projected, and composited into hourly or daily data products depending on the frequency of the observation. The data products are then displayed on the 3D Google Earth plug-in at the JPL Tropical Cyclone Information System (TCIS) website.
    The data products offered by the TCIS in the Google Earth display include image overlays, wind vectors, clickable placemarks with vertical profiles of temperature and water vapor, and curtain plots along the satellite tracks. Multiple products can be overlaid, each with individually adjustable opacity. Time-sequence visualization is supported through the calendar control and Google Earth time animation. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
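The hourly compositing step the abstract describes, i.e., bucketing incoming near-real-time observations by hour and reducing each bucket into one product, can be sketched as follows. Timestamps are epoch seconds and the values are illustrative brightness temperatures; the reduction here is a simple mean, which may differ from the actual product logic.

```python
from collections import defaultdict

# Sketch: bucket NRT observations by hour and average each bucket into
# one hourly product. Input values are made-up brightness temperatures.

def hourly_composite(observations):
    """observations: list of (epoch_seconds, value) -> {hour_start: mean}."""
    buckets = defaultdict(list)
    for t, v in observations:
        buckets[t - t % 3600].append(v)   # floor timestamp to the hour
    return {hour: sum(vs) / len(vs) for hour, vs in sorted(buckets.items())}

obs = [(0, 270.0), (1800, 272.0), (3600, 280.0), (5400, 282.0)]
products = hourly_composite(obs)  # → {0: 271.0, 3600: 281.0}
```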

  15. Fast segmentation of satellite images using SLIC, WebGL and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Gorelick, Noel; Eisemann, Elmar; van de Giesen, Nick

    2017-04-01

    Google Earth Engine (GEE) is a parallel geospatial processing platform, which harmonizes access to petabytes of freely available satellite images. It provides a very rich API, allowing development of dedicated algorithms to extract useful geospatial information from these images. At the same time, modern GPUs provide thousands of computing cores, which are mostly not utilized in this context. In recent years, WebGL has become a popular and well-supported API, allowing fast image processing directly in web browsers. In this work, we will evaluate the applicability of WebGL to enable fast segmentation of satellite images. A new implementation of the Simple Linear Iterative Clustering (SLIC) algorithm using GPU shaders will be presented. SLIC is a simple and efficient method to decompose an image into visually homogeneous regions. It adapts a k-means clustering approach to generate superpixels efficiently. While this approach will be hard to scale, due to the significant amount of data to be transferred to the client, it should significantly improve exploratory possibilities and simplify development of dedicated algorithms for geoscience applications. Our prototype implementation will be used to improve surface water detection of reservoirs using multispectral satellite imagery.
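The k-means step at the heart of SLIC can be illustrated on a toy scale: assign each pixel to the nearest cluster centre in a combined (x, y, intensity) feature space, then recompute the centres. This sketch omits what makes SLIC fast (restricting each search to a local window, GPU execution) and assumes no cluster ever empties; it only shows the clustering idea.

```python
# Toy sketch of the k-means clustering behind SLIC superpixels.
# Pixels are (x, y, intensity) tuples; distances are plain Euclidean.

def assign(pixels, centers):
    """Label each pixel with the index of its nearest centre."""
    def dist2(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(centers)), key=lambda k: dist2(p, centers[k]))
            for p in pixels]

def update(pixels, labels, k):
    """Recompute each centre as the mean of its member pixels."""
    centers = []
    for j in range(k):
        members = [p for p, l in zip(pixels, labels) if l == j]
        centers.append(tuple(sum(v) / len(members) for v in zip(*members)))
    return centers

# Two compact patches: a dark one near the origin, a bright one at (8, 8).
pixels = [(0, 0, 0.1), (1, 0, 0.2), (0, 1, 0.1),
          (8, 8, 0.9), (9, 8, 0.8), (8, 9, 0.9)]
centers = [(0, 0, 0.0), (9, 9, 1.0)]   # seeds on a coarse grid, as in SLIC
for _ in range(3):                     # a few assignment/update iterations
    labels = assign(pixels, centers)
    centers = update(pixels, labels, len(centers))
```

Each resulting label group corresponds to one superpixel.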

  16. Reaching the Next Generation of College Students via Their Digital Devices.

    NASA Astrophysics Data System (ADS)

    Whitmeyer, S. J.; De Paor, D. G.; Bentley, C.

    2015-12-01

    Current college students attended school during a decade in which many school districts banned cellphones from the classroom or even from school grounds. These students are used to being told to put away their mobile devices and concentrate on traditional classroom activities such as watching PowerPoint presentations or calculating with pencil and paper. However, due to a combination of parental security concerns and recent education research, schools are rapidly changing policy and embracing mobile devices for ubiquitous learning opportunities inside and outside of the classroom. Consequently, many of the next generation of college students will have expectations of learning via mobile technology. We have developed a range of digital geology resources to aid mobile-based geoscience education at college level, including mapping on iPads and other tablets, "crowd-sourced" field projects, augmented reality-supported asynchronous field classes, 3D and 4D split-screen virtual reality tours, macroscopic and microscopic gigapixel imagery, 360° panoramas, assistive devices for inclusive field education, and game-style educational challenges. Class testing of virtual planetary tours shows modest short-term learning gains, but more work is needed to ensure long-term retention. Many of our resources rely on the Google Earth browser plug-in and application program interface (API). Because of security concerns, browser plug-ins in general are being phased out and the Google Earth API will not be supported in future browsers. However, a new plug-in-free API is promised by Google and an alternative open-source virtual globe called Cesium is undergoing rapid development. It already supports the main aspects of Keyhole Markup Language and has features of significant benefit to geoscience, including full support on mobile devices and sub-surface viewing and touring. 
The research team includes: Heather Almquist, Stephen Burgin, Cinzia Cervato, Filis Coba, Chloe Constants, Gene Cooper, Mladen Dordevic, Marissa Dudek, Brandon Fitzwater, Bridget Gomez, Tyler Hansen, Paul Karabinos, Terry Pavlis, Jen Piatek, Alan Pitts, Robin Rohrback, Bill Richards, Caroline Robinson, Jeff Rollins, Jeff Ryan, Ron Schott, Kristen St. John, and Barb Tewksbury. Supported by NSF DUE 1323419 and by Google Geo Curriculum Awards.

  17. Sally Ride EarthKAM - Automated Image Geo-Referencing Using Google Earth Web Plug-In

    NASA Technical Reports Server (NTRS)

    Andres, Paul M.; Lazar, Dennis K.; Thames, Robert Q.

    2013-01-01

    Sally Ride EarthKAM is an educational program funded by NASA that aims to provide the public the ability to picture Earth from the perspective of the International Space Station (ISS). A computer-controlled camera is mounted on the ISS in a nadir-pointing window; however, timing limitations in the system cause inaccurate positional metadata. Manually correcting images within an orbit allows the positional metadata to be improved using mathematical regressions. The manual correction process is time-consuming and thus infeasible for a large number of images. The standard Google Earth program allows previously created KML (Keyhole Markup Language) files to be imported. These KML-based overlays could then be manually manipulated as image overlays, saved, and then uploaded to the project server, where they are parsed and the metadata in the database is updated. The new interface eliminates the need to save, download, open, re-save, and upload the KML files. Everything is processed on the Web, and all manipulations go directly into the database. Administrators also have the control to discard any single correction that was made and to validate a correction. This program streamlines a process that previously required several critical steps and was probably too complex for the average user to complete successfully. The new process is theoretically simple enough for members of the public to make use of and contribute to the success of the Sally Ride EarthKAM project. Using the Google Earth Web plug-in, EarthKAM images, and associated metadata, this software allows users to interactively manipulate an EarthKAM image overlay, and update and improve the associated metadata. The Web interface uses the Google Earth JavaScript API along with PHP-PostgreSQL to present the user the same interface capabilities without leaving the Web. The simpler graphical user interface will allow the public to participate directly and meaningfully with EarthKAM.
The use of similar techniques is being investigated to place ground-based observations in a Google Mars environment, allowing the MSL (Mars Science Laboratory) Science Team a means to visualize the rover and its environment.
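The regression idea in the abstract, using a handful of manually corrected frames to predict the positional error of every other frame in the orbit, can be sketched as an ordinary least-squares fit of error against timestamp. All numbers below are illustrative; EarthKAM's actual regression model is not specified here.

```python
# Sketch: fit err ≈ a * t + b from manually corrected frames, then
# apply the predicted correction to uncorrected frames. Values made up.

def fit_line(ts, errs):
    """Ordinary least squares slope/intercept for err ≈ a * t + b."""
    n = len(ts)
    mt = sum(ts) / n
    me = sum(errs) / n
    a = (sum((t - mt) * (e - me) for t, e in zip(ts, errs))
         / sum((t - mt) ** 2 for t in ts))
    return a, me - a * mt

# Three manually corrected frames: timing drift grows along the orbit.
t_corrected = [0.0, 10.0, 20.0]           # seconds into the orbit
err_corrected = [1.0, 2.0, 3.0]           # along-track error, degrees
a, b = fit_line(t_corrected, err_corrected)

def corrected_position(t, recorded_lon):
    """Subtract the predicted error from a frame's recorded longitude."""
    return recorded_lon - (a * t + b)

lon = corrected_position(15.0, 100.0)
```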

  18. Finding, Weighting and Describing Venues: CSIRO at the 2012 TREC Contextual Suggestion Track

    DTIC Science & Technology

    2012-11-01

    commercial system (namely the Google Places API), and whether the current experimental setup encourages diversity. The remaining two submissions...baseline systems that rely on the Google Places API and the user reviews it provides, and two more complex systems that incorporate information...from the Foursquare API, and are sensitive to personal preference and time. The remainder of this paper is structured as follows. The next section

  19. Generating and Visualizing Climate Indices using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Guentchev, G.; Rood, R. B.

    2017-12-01

    Climate change is expected to have its largest impacts on regional and local scales. Relevant and credible climate information is needed to support the planning and adaptation efforts in our communities. The volume of climate projections of temperature and precipitation is steadily increasing, as datasets are generated on finer spatial and temporal grids with an increasing number of ensemble members to characterize uncertainty. Despite advancements in tools for querying and retrieving subsets of these large, multi-dimensional datasets, ease of access remains a barrier for many existing and potential users who want to derive useful information from these data, particularly those outside the climate modelling research community. Climate indices that can be derived from daily temperature and precipitation data, such as the annual number of frost days or the growing season length, can provide useful information to practitioners and stakeholders. For this work the NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) dataset was loaded into Google Earth Engine, a cloud-based geospatial processing platform, and algorithms that use the Earth Engine API to generate several climate indices were written. The indices were chosen from the set developed by the joint CCl/CLIVAR/JCOMM Expert Team on Climate Change Detection and Indices (ETCCDI). Simple user interfaces were created that allow users to query the indices, produce maps and graphs, and download results for additional analyses. These browser-based interfaces could allow users in low-bandwidth environments to access climate information. This research shows that calculating climate indices from global downscaled climate projection datasets and sharing them widely using cloud computing technologies is feasible.
Further development will focus on exposing the climate indices to existing applications via the Earth Engine API, and building custom user interfaces for presenting climate indices to a diverse set of user groups.
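Several of the ETCCDI indices named here reduce to simple counts over a daily series. A minimal pure-Python illustration of two of them, independent of the Earth Engine implementation described above (the sample temperatures are invented):

```python
def frost_days(tmin_c):
    """ETCCDI FD index: count of days with daily minimum temperature below 0 degC."""
    return sum(t < 0.0 for t in tmin_c)

def summer_days(tmax_c):
    """ETCCDI SU index: count of days with daily maximum temperature above 25 degC."""
    return sum(t > 25.0 for t in tmax_c)

# A (very short) illustrative daily series for one location.
year_tmin = [-3.1, -0.5, 0.2, 4.0, 10.5]
year_tmax = [2.0, 6.1, 12.4, 26.3, 30.0]

fd = frost_days(year_tmin)
su = summer_days(year_tmax)
```

In Earth Engine the same thresholding is expressed as image operations mapped over the NEX-GDDP daily collection, but the index definitions are identical.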

  20. Jupyter meets Earth: Creating Comprehensible and Reproducible Scientific Workflows with Jupyter Notebooks and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2016-12-01

    Deriving actionable information from Earth observation data obtained from sensors or models can be quite complicated, and sharing those insights with others in a form that they can understand, reproduce, and improve upon is equally difficult. Journal articles, even if digital, commonly present just a summary of an analysis that cannot be understood in depth or reproduced without major effort on the part of the reader. Here we show a method of improving scientific literacy by pairing a recently developed scientific presentation technology (Jupyter Notebooks) with a petabyte-scale platform for accessing and analyzing Earth observation and model data (Google Earth Engine). Jupyter Notebooks are interactive web documents that mix live code with annotations such as rich-text markup, equations, images, videos, hyperlinks and dynamic output. Notebooks were first introduced as part of the IPython project in 2011, and have since gained wide acceptance in the scientific programming community, initially among Python programmers but later across a wide range of scientific programming languages. While Jupyter Notebooks have been widely adopted for general data analysis, data visualization, and machine learning, to date there have been relatively few examples of using Jupyter Notebooks to analyze geospatial datasets. Google Earth Engine is a cloud-based platform for analyzing geospatial data, such as satellite remote sensing imagery or Earth system model output. Through its Python API, Earth Engine makes petabytes of Earth observation data accessible, and provides hundreds of algorithmic building blocks that can be chained together to produce high-level algorithms and outputs in real time. We anticipate that this technology pairing will facilitate a better way of creating, documenting, and sharing complex analyses that derive information about our Earth that can be used to promote broader understanding of the complex issues it faces.
http://jupyter.org https://earthengine.google.com

  1. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

    Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  2. Feature Positioning on Google Street View Panoramas

    NASA Astrophysics Data System (ADS)

    Tsai, V. J. D.; Chang, C.-T.

    2012-07-01

    Location-based services (LBS) on web-based maps and images have operated in real time since Google launched its Street View imaging service in 2007. This research employs the Google Maps API and Web Service, GAE for JAVA, AJAX, Proj4js, CSS and HTML in developing an Internet platform for accessing the orientation parameters of Google Street View (GSV) panoramas, in order to determine the three-dimensional position of interest features that appear on two overlapping panoramas by geometric intersection. A pair of GSV panoramas was examined using known points located on the Library Building of National Chung Hsing University (NCHU), with root-mean-square errors of ±0.522 m, ±1.230 m, and ±5.779 m for intersection and ±0.142 m, ±1.558 m, and ±5.733 m for resection in X, Y, and h (elevation), respectively. Potential error sources in GSV positioning were analyzed, showing that the errors in the Google-provided GSV positional parameters dominate the errors in geometric intersection. The developed system is suitable for data collection in establishing LBS applications integrated with Google Maps and Google Earth for traffic sign and infrastructure inventory, by adding automatic extraction and matching techniques for points of interest (POI) from GSV panoramas.
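In the horizontal plane, the forward intersection described above amounts to intersecting two azimuth rays from the two panorama stations. A minimal planar sketch (our own parameter names and coordinates, not the paper's data):

```python
import math

def intersect(p1, az1_deg, p2, az2_deg):
    """Two-ray forward intersection in planar (E, N) coordinates.

    Each station is given as (easting, northing) with an azimuth to the
    feature measured clockwise from grid north, in degrees. Returns the
    (E, N) coordinates of the intersected feature.
    """
    a1, a2 = math.radians(az1_deg), math.radians(az2_deg)
    d1 = (math.sin(a1), math.cos(a1))  # unit direction of ray 1
    d2 = (math.sin(a2), math.cos(a2))  # unit direction of ray 2
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 by Cramer's rule.
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (-rx * d2[1] + d2[0] * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

With real GSV data the station positions and azimuths come from the panorama orientation parameters, which is why errors in those parameters dominate the intersection error.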

  3. How Accurately Can the Google Web Speech API Recognize and Transcribe Japanese L2 English Learners' Oral Production?

    ERIC Educational Resources Information Center

    Ashwell, Tim; Elam, Jesse R.

    2017-01-01

    The ultimate aim of our research project was to use the Google Web Speech API to automate scoring of elicited imitation (EI) tests. However, in order to achieve this goal, we had to take a number of preparatory steps. We needed to assess how accurate this speech recognition tool is in recognizing native speakers' production of the test items; we…

  4. Leveraging Open Standards and Technologies to Search and Display Planetary Image Data

    NASA Astrophysics Data System (ADS)

    Rose, M.; Schauer, C.; Quinol, M.; Trimble, J.

    2011-12-01

    Mars and the Moon have both been visited by multiple NASA spacecraft. A large number of images and other data have been gathered by the spacecraft and are publicly available in NASA's Planetary Data System. Through a collaboration with Google, Inc., the User Centered Technologies group at NASA Ames Research Center has developed a tool for searching and browsing among images from multiple Mars and Moon missions. Development of this tool was facilitated by the use of several open technologies and standards. First, an open-source full-text search engine is used both to search place names on the target and to find images matching a geographic region. Second, the published API of the Google Earth browser plugin is used to geolocate the images on a virtual globe and allow the user to navigate on the globe to see related images. The structure of the application also employs standard protocols and services. The back-end is exposed as RESTful APIs, which could be reused by other client systems in the future. Further, the communication between the front- and back-end portions of the system utilizes open data standards including XML and KML (Keyhole Markup Language) for representation of textual and geographic data. The creation of the search index was facilitated by reuse of existing, publicly available metadata, including the Gazetteer of Planetary Nomenclature from the USGS, available in KML format. And the image metadata was reused from standards-compliant archives in the Planetary Data System. The system also supports collaboration with other tools by allowing export of search results in KML, and the ability to display those results in the Google Earth desktop application. We will demonstrate the search and visualization capabilities of the system, with emphasis on how the system facilitates reuse of data and services through the adoption of open standards.
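Exporting search results as KML, as the system supports, amounts to serializing each hit as a Placemark. A minimal sketch (the feature name and coordinates are illustrative, not taken from the Gazetteer):

```python
import xml.etree.ElementTree as ET

def results_to_kml(results):
    """Serialize search hits (name, lon, lat) as KML Placemarks so they
    can be opened in the Google Earth desktop application."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, lon, lat in results:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = name
        point = ET.SubElement(pm, "Point")
        # KML coordinate order is lon,lat[,alt].
        ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

kml_text = results_to_kml([("Olympus Mons", -133.8, 18.65)])
```

Because the payload is plain KML, any KML-aware client, not just Google Earth, can consume the exported results.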

  5. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    NASA Astrophysics Data System (ADS)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.

  6. ANTP Protocol Suite Software Implementation Architecture in Python

    DTIC Science & Technology

    2011-06-03

    a popular platform of networking programming, an area in which C has traditionally dominated. 2 NetController AeroRP AeroNP AeroNP API AeroTP...visualisation of the running system. For example using the Google Maps API, the main logging web page can show all the running nodes in the system. By...communication between AeroNP and AeroRP and runs on the operating system as daemon. Furthermore, it creates an API interface to manage the communication between

  7. Google Earth and Geo Applications: A Toolset for Viewing Earth's Geospatial Information

    NASA Astrophysics Data System (ADS)

    Tuxen-Bettman, K.

    2016-12-01

    Earth scientists measure and derive fundamental data that can be of broad general interest to the public and policy makers. Yet, one of the challenges that has always faced the Earth science community is how to present their data and findings in an easy-to-use and compelling manner. Google's Geo Tools offer an efficient and dynamic way for scientists, educators, journalists and others to both access data and view or tell stories in a dynamic three-dimensional geospatial context. Google Earth in particular provides a dense canvas of satellite imagery on which rich vector and raster datasets can be viewed using the medium of Keyhole Markup Language (KML). Through KML, Google Earth can combine the analytical capabilities of Earth Engine, the collaborative mapping of My Maps, the storytelling of Tour Builder, and more to make Google's Geo Applications a coherent suite of tools for exploring our planet. https://earth.google.com/ https://earthengine.google.com/ https://mymaps.google.com/ https://tourbuilder.withgoogle.com/ https://www.google.com/streetview/

  8. Google Earth: A Virtual Globe for Elementary Geography

    ERIC Educational Resources Information Center

    Britt, Judy; LaFontaine, Gus

    2009-01-01

    Originally called Earth Viewer in 2004, Google Earth was the first virtual globe easily available to the ordinary user of the Internet. Google Earth, at earth.google.com, is a free, 3-dimensional computer model of Earth, but that means more than just a large collection of pretty pictures. It allows the viewer to "fly" anywhere on Earth "to view…

  9. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    NASA Astrophysics Data System (ADS)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX that allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets of the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which are then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures of the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors.
Although this project is in its early stages, high school and college teachers, as well as researchers have expressed interest in using and extending these tools for visualizing and interacting with data on Earth and other planetary bodies.
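The coordinate conversion described above, from latitude/longitude into the Mercator projection used by the Google Maps API, follows the standard spherical Mercator formulas. A minimal sketch:

```python
import math

R = 6378137.0  # spherical (Web) Mercator Earth radius, metres

def to_mercator(lon_deg, lat_deg):
    """Forward spherical Mercator: lon/lat in degrees to (x, y) in metres."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y
```

For a planet other than Earth the same formula applies with that body's radius substituted, which is the essence of the utilities described for general cylindrical coordinate systems.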

  10. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Michael Weiss-Malik, Product Manager for Moon in Google Earth, Google, Inc., speaks during a press conference announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  11. Building a Dashboard of the Planet with Google Earth and Earth Engine

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hancher, M.

    2016-12-01

    In 2005 Google Earth, a popular 3-D virtual globe, was first released. Scientists immediately recognized how it could be used to tell stories about the Earth. From 2006 to 2009, the "Virtual Globes" sessions of AGU included innovative examples of scientists and educators using Google Earth, and since that time it has become a commonplace tool for communicating scientific results. In 2009 Google Earth Engine, a cloud-based platform for planetary-scale geospatial analysis, was first announced. Earth Engine was initially used to extract information about the world's forests from raw Landsat data. Since then, the platform has proven highly effective for general analysis of georeferenced data, and users have expanded the list of use cases to include high-impact societal issues such as conservation, drought, disease, food security, water management, climate change and environmental monitoring. To support these use cases, the platform has continuously evolved with new datasets, analysis functions, and user interface tools. This talk will give an overview of the latest Google Earth and Earth Engine functionality that allows partners to understand, monitor and tell stories about our living, breathing Earth. https://earth.google.com https://earthengine.google.com

  12. Recent Advances in Geospatial Visualization with the New Google Earth

    NASA Astrophysics Data System (ADS)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
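Multi-resolution tile pyramids of the kind described are commonly addressed by XYZ tile indices. A minimal sketch of the widely used slippy-map tiling math (an illustration of the general scheme, not Google Earth's internal implementation):

```python
import math

def tile_for(lon_deg, lat_deg, zoom):
    """XYZ tile indices containing a lon/lat point at a given zoom level,
    using the standard Web Mercator slippy-map scheme (2^zoom tiles per axis)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y
```

A tile production pipeline emits one image per (zoom, x, y) triple, and a client such as Google Earth fetches only the tiles covering the current view at the current level of detail.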

  13. Google earth as a source of ancillary material in a history of psychology class.

    PubMed

    Stevison, Blake K; Biggs, Patrick T; Abramson, Charles I

    2010-06-01

    This article discusses the use of Google Earth to visit significant geographical locations associated with events in the history of psychology. The process of opening files, viewing content, adding placemarks, and saving customized virtual tours on Google Earth are explained. Suggestions for incorporating Google Earth into a history of psychology course are also described.

  14. Enhancing Geographic and Digital Literacy with a Student-Generated Course Portfolio in Google Earth

    ERIC Educational Resources Information Center

    Guertin, Laura; Stubbs, Christopher; Millet, Christopher; Lee, Tsan-Kuang; Bodek, Matthew

    2012-01-01

    Google Earth can serve as a platform for students to construct a course ePortfolio. By having students construct their own placemarks in a customized Google Earth file, students document their learning in a geospatial context, learn an innovative use of Google Earth, and have the opportunity for creativity and flexibility with disseminating their…

  15. Supporting our scientists with Google Earth-based UIs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Janine

    2010-10-01

    Google Earth and Google Maps are incredibly useful for researchers looking for easily-digestible displays of data. This presentation will provide a step-by-step tutorial on how to begin using Google Earth to create tools that further the mission of the DOE national lab complex.

  16. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain to develop operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data, is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field-scales.
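At its core, SSEBop scales a reference ET by a temperature-based ET fraction derived from land-surface temperature. The sketch below is a simplified illustration of that relation with our own parameter names and a clamping assumption, not the published operational model:

```python
def ssebop_et(ts_k, t_cold_k, dt_k, eto_mm, k_factor=1.0):
    """Simplified SSEBop-style ET estimate for one pixel.

    ts_k     : observed land-surface temperature (K), e.g. from Landsat
    t_cold_k : cold reference temperature (K), a fully transpiring surface
    dt_k     : predefined temperature difference between bare/dry and
               cold reference conditions (K)
    eto_mm   : reference ET (mm) from gridded weather data
    k_factor : scaling coefficient (assumption: default 1.0)

    The ET fraction is 1 when Ts equals the cold reference and 0 when Ts
    reaches the hot reference (Tc + dT); clamping to [0, 1.05] is an
    assumption of this sketch.
    """
    etf = (t_cold_k + dt_k - ts_k) / dt_k
    etf = max(0.0, min(etf, 1.05))
    return etf * k_factor * eto_mm
```

In the Earth Engine implementation the same arithmetic runs as per-pixel image algebra over the Landsat archive rather than on scalars.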

  17. NASA Earth Observations (NEO): Data Imagery for Education and Visualization

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2008-12-01

    NASA Earth Observations (NEO) has dramatically simplified public access to georeferenced imagery of NASA remote sensing data. NEO targets the non-traditional data users who are currently underserved by functionality and formats available from the existing data ordering systems. These users include formal and informal educators, museum and science center personnel, professional communicators, and citizen scientists. NEO currently serves imagery from 45 different datasets with daily, weekly, and/or monthly temporal resolutions, with more datasets currently under development. The imagery from these datasets is produced in coordination with several data partners who are affiliated either with the instrument science teams or with the respective data processing center. NEO is a system of three components -- website, WMS (Web Mapping Service), and ftp archive -- which together are able to meet the wide-ranging needs of our users. Some of these needs include the ability to: view and manipulate imagery using the NEO website -- e.g., applying color palettes, resizing, exporting to a variety of formats including PNG, JPEG, KMZ (Google Earth), GeoTIFF; access the NEO collection via a standards-based API (WMS); and create customized exports for select users (ftp archive) such as Science on a Sphere, NASA's Earth Observatory, and others.
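Access through the WMS component follows the standard OGC GetMap request pattern. A sketch of building such a request, using a placeholder endpoint and layer name rather than NEO's actual ones:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(1024, 512), fmt="image/png"):
    """Build a WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326.
    """
    params = {
        "service": "WMS", "version": "1.1.1", "request": "GetMap",
        "layers": layer, "srs": "EPSG:4326",
        "bbox": ",".join(str(v) for v in bbox),
        "width": size[0], "height": size[1], "format": fmt,
    }
    return f"{base}?{urlencode(params)}"

# Hypothetical endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.org/wms", "sea_surface_temp",
                     (-180, -90, 180, 90))
```

Because the request is standards-based, the same URL pattern works against any compliant WMS server once the real endpoint and layer names are substituted.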

  18. Positional Accuracy Assessment of Google Earth in Riyadh

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf; Algarni, Dafer

    2014-06-01

    Google Earth is a virtual globe, map and geographical information program operated by Google. It maps the Earth by superimposing images obtained from satellite imagery and aerial photography onto a GIS 3D globe. With millions of users all around the globe, GoogleEarth® has become the ultimate source of spatial data and information for private and public decision-support systems, besides many types and forms of social interaction. Many users, mostly in developing countries, also use it for surveying applications, which raises questions about the positional accuracy of the Google Earth program. This research presents a small-scale assessment study of the positional accuracy of GoogleEarth® imagery in Riyadh, capital of the Kingdom of Saudi Arabia (KSA). The results show that the RMSE of the GoogleEarth imagery is 2.18 m and 1.51 m for the horizontal and height coordinates, respectively.
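The reported accuracy figures are root-mean-square errors over check-point residuals. For reference, the computation is simply (the sample residuals below are invented for illustration):

```python
import math

def rmse(residuals):
    """Root-mean-square error of a sequence of coordinate residuals."""
    return math.sqrt(sum(e * e for e in residuals) / len(residuals))

# Hypothetical horizontal residuals (metres) between GoogleEarth-derived
# and surveyed coordinates at a handful of check points.
horizontal_residuals = [1.9, -2.4, 2.1, -2.3]
horizontal_rmse = rmse(horizontal_residuals)
```

The study applies this separately to the horizontal and height components, yielding the 2.18 m and 1.51 m figures quoted above.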

  19. The jmzQuantML programming interface and validator for the mzQuantML data standard.

    PubMed

    Qi, Da; Krishna, Ritesh; Jones, Andrew R

    2014-03-01

    The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  1. Flying across Galaxy Clusters with Google Earth: additional imagery from SDSS co-added data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, Jiangang; Annis, James (Fermilab)

    2010-10-01

    Galaxy clusters are spectacular. We provide Google Earth compatible imagery for the deep co-added images from the Sloan Digital Sky Survey and make it a tool for examining galaxy clusters. Google Earth (in sky mode) provides a highly interactive environment for visualizing the sky. By encoding the galaxy cluster information into a kml/kmz file, one can use Google Earth as a tool for examining galaxy clusters and flying across them freely. However, the resolution of the images provided by Google Earth is not very high. This is partially because the major imagery Google Earth uses is from the Sloan Digital Sky Survey (SDSS) (SDSS collaboration 2000) and the resolution has been reduced to speed up web transfer. To obtain higher resolution images, you need to add your own images in a way that Google Earth can understand. The SDSS co-added data are the co-addition of ~100 scans of images from SDSS stripe 82 (Annis et al. 2010). They provide the deepest images based on SDSS and reach as deep as about redshift 1.0. Based on the co-added images, we created color images as described by Lupton et al. (2004) and converted them to Google Earth compatible images using wcs2kml (Brewer et al. 2007). The images are stored on a public server at Fermi National Accelerator Laboratory and can be accessed by the public. To view these images in Google Earth, you need to download a kmz file, which contains the links to the color images, and then open the kmz file with Google Earth. To meet different needs for resolution, we provide three kmz files corresponding to low, medium and high resolution images. We recommend the high resolution one as long as you have a broadband Internet connection, though you may choose to download any of them, depending on your own needs and Internet speed.
After you open the downloaded kmz file with Google Earth (in sky mode), it takes about 5 minutes (depending on your Internet connection and the resolution of the images you want) to get some initial images loaded. Then, additional images corresponding to the region you are browsing will be loaded automatically. So far, you have access to all the co-added images, but you still do not have the galaxy cluster position information. In order to see the galaxy clusters, you need to download another kmz file that tells Google Earth where to find the galaxy clusters in the co-added data region. We provide a kmz file for a few galaxy clusters in the stripe 82 region, which you can download and open with Google Earth. In the SDSS co-added region (stripe 82 region), the imagery from Google Earth itself is from the Digitized Sky Survey (2007), which is of very poor quality. In Figure 1 and Figure 2, we show screenshots of a cluster with and without the new co-added imagery in Google Earth. Much more detail is revealed with the deep images.
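A KMZ file of the kind distributed here is simply a zip archive whose main entry is a KML document (conventionally named doc.kml). A minimal sketch of producing one in memory:

```python
import io
import zipfile

def make_kmz(kml_text):
    """Package a KML document as KMZ: a zip archive with a doc.kml entry."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("doc.kml", kml_text)
    return buf.getvalue()

# Minimal (empty) KML document, for illustration only.
kmz_bytes = make_kmz('<kml xmlns="http://www.opengis.net/kml/2.2"></kml>')
```

The distributed kmz files additionally embed NetworkLink references to the color-image tiles on the public server, so Google Earth can stream the region being browsed.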

  2. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Tiffany Montague, Technical Program Manager for NASA and Google Lunar X PRIZE, Google, Inc., speaks during a press conference announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  3. Google Earth as a method for connecting scientific research with the World

    NASA Astrophysics Data System (ADS)

    Graham, J. R.

    2012-12-01

    Google Earth has proven itself to be an exceptionally successful and ambitious application: fully capable as a scientific tool, yet able to also satisfy the intellectual and virtual touristic needs of students, educators and the general public. It is difficult to overstate Google Earth's impact on our understanding of the World we inhabit, and yet there is also considerable potential that remains unexplored. This paper will discuss Google Earth's potential as a social network for the science community - connecting the general public with scientists, and scientists with their research. This paper will look at the University of Lethbridge's RAVE (Reaching Audiences through Virtual Entryways) project as a model for how this social network can function within the Google Earth environment.

  4. [Who Hits the Mark? A Comparative Study of the Free Geocoding Services of Google and OpenStreetMap].

    PubMed

    Lemke, D; Mattauch, V; Heidinger, O; Hense, H W

    2015-09-01

    Geocoding, the process of converting textual information (addresses) into geographic coordinates, is increasingly used in public health/epidemiological research and practice. To date, little attention has been paid to geocoding quality and its impact on different types of spatially-related health studies. The primary aim of this study was to compare two freely available geocoding services (Google and OpenStreetMap) with regard to matching rate (the percentage of address records capable of being geocoded) and positional accuracy (the distance between geocodes and the ground-truth locations). Residential addresses were geocoded by the NRW state office for information and technology and were considered as reference data (gold standard). The gold standard included the coordinates, the quality of the addresses (4 categories), and a binary urbanity indicator based on the CORINE land cover data. 2 500 addresses were randomly sampled after stratification by address quality and urbanity indicator (approximately 20 000 addresses). These address samples were geocoded using the geocoding services from Google and OSM. In general, both geocoding services showed a decrease in the matching rate with decreasing address quality and urbanity. Google consistently showed higher completeness than OSM (>93 vs. >82%). Also, the cartographic confounding between urban and rural regions was less distinct with Google's geocoding API. Regarding the positional accuracy of the geo-coordinates, Google also showed the smallest deviations from the reference coordinates, with a median of <9 vs. <175.8 m. The cumulative density function of positional accuracy showed that nearly 95% of addresses for Google, and 50% for OSM, were geocoded within <50 m of their reference coordinates. The geocoding API from Google is superior to OSM regarding the completeness and positional accuracy of the geocoded addresses. 
On the other hand, Google has several restrictions, such as the limitation of requests to 2 500 addresses per 24 h and the requirement that results be presented on Google Maps, which may complicate use for scientific purposes. © Georg Thieme Verlag KG Stuttgart · New York.
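The positional-accuracy measure in the study above reduces to a great-circle distance between each service's geocode and the gold-standard coordinate. A minimal sketch (the coordinate pairs are invented for illustration; the study's actual reference data are not reproduced here):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Invented example: gold-standard coordinates vs. a service's returned geocode.
reference = (51.9607, 7.6261)
candidate = (51.9611, 7.6259)
error_m = haversine_m(*reference, *candidate)  # roughly 46 m
```

Computing this error for every sampled address and taking the median (or the empirical CDF at 50 m) reproduces the comparison metrics reported above.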

  5. Overcoming Assessment Problems in Google Earth-Based Assignments

    ERIC Educational Resources Information Center

    Johnson, Nicholas D.; Lang, Nicholas P.; Zophy, Kelley T.

    2011-01-01

    Educational technologies such as Google Earth have the potential to increase student learning and participation in geoscience classrooms. However, little has been written about tying the use of such software with effective assessment. To maximize Google Earth's learning potential for students, educators need to craft appropriate, research-based…

  6. A Google Earth Grand Tour of the Terrestrial Planets

    ERIC Educational Resources Information Center

    De Paor, Declan; Coba, Filis; Burgin, Stephen

    2016-01-01

    Google Earth is a powerful instructional resource for geoscience education. We have extended the virtual globe to include all terrestrial planets. Downloadable Keyhole Markup Language (KML) files (Google Earth's scripting language) associated with this paper include lessons about Mercury, Venus, the Moon, and Mars. We created "grand…

  7. Teaching Waves with Google Earth

    ERIC Educational Resources Information Center

    Logiurato, Fabrizio

    2012-01-01

    Google Earth is a huge source of interesting illustrations of various natural phenomena. It can be a valuable tool for science education, not only for teaching geography and geology but also physics. Here we suggest that Google Earth can be used to introduce the physics of waves in an attractive way. (Contains 9 figures.)

  8. Google Earth for Landowners: Insights from Hands-on Workshops

    ERIC Educational Resources Information Center

    Huff, Tristan

    2014-01-01

    Google Earth is an accessible, user-friendly GIS that can help landowners in their management planning. I offered hands-on Google Earth workshops to landowners to teach skills, including mapmaking, length and area measurement, and database management. Workshop participants were surveyed at least 6 months following workshop completion, and learning…

  9. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Brian McLendon, VP of Engineering, Google, Inc., speaks during a press conference announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  10. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Alan Eustace, Senior VP of Engineering and Research, Google, Inc., speaks during a press conference announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, Monday, July 20, 2009, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  11. Using Google Earth as an innovative tool for community mapping.

    PubMed

    Lefer, Theodore B; Anderson, Matthew R; Fornari, Alice; Lambert, Anastasia; Fletcher, Jason; Baquero, Maria

    2008-01-01

    Maps are used to track diseases and illustrate the social context of health problems. However, commercial mapping software requires special training. This article illustrates how nonspecialists used Google Earth, a free program, to create community maps. The Bronx, New York, is characterized by high levels of obesity and diabetes. Residents and medical students measured the variety and quality of food and exercise sources around a residency training clinic and a student-run free clinic, using Google Earth to create maps with minimal assistance. Locations were identified using street addresses or simply by pointing to them on a map. Maps can be shared via e-mail or viewed online with Google Earth or Google Maps, and the data can be incorporated into other mapping software.

  12. What Google Maps can do for biomedical data dissemination: examples and a design study.

    PubMed

    Jianu, Radu; Laidlaw, David H

    2013-05-04

    Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed too time-consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations that have both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. 
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
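The pre-rendered approach described above relies on cutting a large visualization image into the 256x256-pixel tile pyramid the Google Maps API expects, indexed by zoom/x/y. The standard Web Mercator tile arithmetic can be sketched as follows (this is the general slippy-map convention, not code from the paper):

```python
from math import cos, floor, log, pi, radians, tan

def latlng_to_tile(lat, lng, zoom):
    """Return the Web Mercator (x, y) tile indices containing a point at a zoom level."""
    n = 2 ** zoom  # number of tiles along each axis at this zoom
    x = floor((lng + 180.0) / 360.0 * n)
    lat_r = radians(lat)
    y = floor((1.0 - log(tan(lat_r) + 1.0 / cos(lat_r)) / pi) / 2.0 * n)
    return x, y
```

For biological "maps" the latitude/longitude are synthetic coordinates assigned to the rendered image rather than geographic positions; the same tile indexing lets the Maps API stream only the visible portion at each scale.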

  13. What google maps can do for biomedical data dissemination: examples and a design study

    PubMed Central

    2013-01-01

    Background Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed too time-consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and discarding of potentially useful data. Results We present an exploration of design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations that have both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, by using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps API. Our five implementations introduce design elements that can benefit visualization developers. Conclusions We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. 
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations. PMID:23642009

  14. Presenting Big Data in Google Earth with KML

    NASA Astrophysics Data System (ADS)

    Hagemark, B.

    2006-12-01

    KML 2.1 and Google Earth 4 provide support for streaming very large datasets, with "smart" loading of data at multiple levels of resolution and incremental updates to previously loaded data. This presentation demonstrates this technology with the Google Earth KML geometry and image primitives and shows some techniques and tools for creating this KML.
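The "smart" loading mechanism referred to above is KML 2.1's Region element combined with a NetworkLink: a child KML file is fetched only when its region becomes visible at a sufficient on-screen size, so a huge dataset streams in level-of-detail pieces. A minimal illustrative fragment (bounding box and file name are invented):

```xml
<NetworkLink>
  <name>High-resolution patch</name>
  <Region>
    <LatLonAltBox>
      <north>37.45</north><south>37.40</south>
      <east>-122.05</east><west>-122.10</west>
    </LatLonAltBox>
    <!-- Load only once the region covers at least 128 px on screen -->
    <Lod><minLodPixels>128</minLodPixels><maxLodPixels>-1</maxLodPixels></Lod>
  </Region>
  <Link>
    <href>patch_37N_122W.kml</href>
    <viewRefreshMode>onRegion</viewRefreshMode>
  </Link>
</NetworkLink>
```

Nesting such NetworkLinks hierarchically, with progressively finer regions and larger minLodPixels thresholds, yields the multi-resolution pyramid with incremental updates that the presentation demonstrates.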

  15. Cultural Adventures for the Google[TM] Generation

    ERIC Educational Resources Information Center

    Dann, Tammy

    2010-01-01

    Google Earth is a computer program that allows users to view the Earth through satellite imagery and maps, to see cities from above and through street views, and to search for addresses and browse locations. Many famous buildings and structures from around the world have detailed 3D views accessible on Google Earth. It is possible to explore the…

  16. Google Mercury: The Launch of a New Planet

    NASA Astrophysics Data System (ADS)

    Hirshon, B.; Chapman, C. R.; Edmonds, J.; Goldstein, J.; Hallau, K. G.; Solomon, S. C.; Vanhala, H.; Weir, H. M.; Messenger Education; Public Outreach Epo Team

    2010-12-01

    The NASA MESSENGER mission’s Education and Public Outreach (EPO) Team, in cooperation with Google, Inc., has launched Google Mercury, an immersive new environment on the Google Earth platform. Google Mercury includes hundreds of surface features, most of them newly revealed by the MESSENGER spacecraft's three flybys of the innermost planet. As with Google Earth, Google Mercury is available online at no cost. This presentation will demonstrate how our team worked with Google staff, which features we incorporated, how games can be developed within the Google Earth platform, and how others can add tours, games, and other educational features. Finally, we will detail new enhancements to be added once MESSENGER enters orbit about Mercury in March 2011 and begins sending back compelling images and other global data sets on a daily basis. The MESSENGER EPO Team comprises individuals from the American Association for the Advancement of Science (AAAS); Carnegie Academy for Science Education (CASE); Center for Educational Resources (CERES) at Montana State University (MSU) - Bozeman; National Center for Earth and Space Science Education (NCESSE); Johns Hopkins University Applied Physics Laboratory (JHU/APL); National Air and Space Museum (NASM); Science Systems and Applications, Inc. (SSAI); and Southwest Research Institute (SwRI). [Image: Screen shot of Google Mercury as a work in progress.]

  17. Global Analysis of River Planform Change using the Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Bryk, A.; Dietrich, W. E.; Gorelick, N.; Sargent, R.; Braudrick, C. A.

    2014-12-01

    Geomorphologists have historically tracked river dynamics using a combination of maps, aerial photographs, and the stratigraphic record. Although stratigraphic records can extend into deep time, maps and aerial photographs often confine our record of change to sparse measurements over the last ~80 years, and in some cases much less. For the first time, Google's Earth Engine (GEE) cloud-based platform gives researchers the means to quantitatively analyze the pattern and pace of river channel change over the last 30 years, with high temporal resolution, across the entire planet. GEE provides an application programming interface (API) that enables quantitative analysis of various data sets, including the entire Landsat L1T archive. This allows change detection for channels wider than about 150 m over 30 years of successive, georeferenced imagery. Qualitatively, it becomes immediately evident that the pace of channel morphodynamics for similar planforms varies by orders of magnitude across the planet and downstream along individual rivers. To quantify these rates of change and to explore their controls, we have developed methods for differentiating channels from floodplain along large alluvial rivers. We introduce a new metric of morphodynamics: the ratio of eroded area to channel area per unit time, referred to as "M". We also track depositional areas resulting from channel shifting. To date our quantitative analysis has focused on rivers in the Andean foreland. Our analysis shows that channel bank erosion rates, M, vary by orders of magnitude for these rivers, from 0 to ~0.25 yr-1, yet the rivers have essentially identical curvature and sinuosity and are visually indistinguishable. By tracking both bank paths in time, we find that, for some meandering rivers, a significant fraction of new floodplain is produced through outer-bank accretion rather than point bar deposition. 
This process is perhaps more important in generating floodplain stratigraphy than previously recognized. These initial findings indicate that a new set of quantitative observations will emerge to further test and advance morphodynamic theory. The Google Earth Engine offers the opportunity to explore river morphodynamics on an unprecedented scale and provides a powerful tool for addressing fundamental questions in river morphodynamics.
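The metric M introduced above can be illustrated with a toy computation. Here "eroded area" is taken as pixels that switch from floodplain to channel between two classified masks; the tiny masks and this exact operational definition are assumptions for illustration, not the authors' implementation:

```python
# Two binary channel masks (1 = channel pixel) classified dt years apart.
def erosion_metric(mask_t0, mask_t1, dt_years):
    """Ratio of eroded area to initial channel area per unit time (the metric M)."""
    flat0, flat1 = sum(mask_t0, []), sum(mask_t1, [])  # flatten row lists
    # Pixels newly occupied by the channel: bank erosion into former floodplain.
    eroded = sum(1 for a, b in zip(flat0, flat1) if b and not a)
    channel_area = sum(flat0)
    return eroded / (channel_area * dt_years)

# Invented 2x4 masks: the channel shifts one pixel to the right over 30 years.
mask_1985 = [[0, 1, 1, 0],
             [0, 1, 1, 0]]
mask_2015 = [[0, 0, 1, 1],
             [0, 1, 1, 1]]
M = erosion_metric(mask_1985, mask_2015, dt_years=30)  # 2 eroded px / (4 px * 30 yr)
```

On real data the masks would come from classified Landsat scenes, and the same pixel bookkeeping (run per epoch pair) yields the 0 to ~0.25 yr-1 range reported above.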

  18. Assessing Coupled Social Ecological Flood Vulnerability from Uttarakhand, India, to the State of New York with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Schwarz, B.

    2014-12-01

    This talk describes the development of a web application to predict and communicate flood vulnerability using publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals a 97% accuracy of the model in predicting flooded pixels. Testing on multiple storms is required to further validate this initially promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high-resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time. 
As the data-driven model presented relies on globally available data, the only real-time input required would be typical data from a weather service, e.g. precipitation or coarse-resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
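The data-driven core of the approach above is a logistic regression of flooded vs. dry pixels in which each training sample can carry a spatial weight, as in geographically weighted regression. A hedged sketch (the single elevation feature, the toy data, and the training scheme are invented for illustration, not the authors' model):

```python
from math import exp

def fit_weighted_logistic(X, y, w, lr=0.1, epochs=2000):
    """Stochastic gradient ascent on the spatially weighted log-likelihood."""
    beta = [0.0] * (len(X[0]) + 1)  # intercept + one coefficient per feature
    for _ in range(epochs):
        for xi, yi, wi in zip(X, y, w):
            z = beta[0] + sum(b * v for b, v in zip(beta[1:], xi))
            p = 1.0 / (1.0 + exp(-z))
            g = wi * (yi - p)  # sample weight scales the gradient contribution
            beta[0] += lr * g
            for j, v in enumerate(xi):
                beta[j + 1] += lr * g * v
    return beta

# Toy feature: elevation above the river (m); low pixels flooded (1) in this example.
X = [[0.5], [1.0], [1.5], [4.0], [5.0], [6.0]]
y = [1, 1, 1, 0, 0, 0]
w = [1.0] * 6  # uniform here; GWR would use a distance-decay kernel per location
beta = fit_weighted_logistic(X, y, w)

def predict(xi):
    z = beta[0] + sum(b * v for b, v in zip(beta[1:], xi))
    return 1.0 / (1.0 + exp(-z))
```

Refitting with weights concentrated around each calibration point gives the "geographically weighted" family of local models; accuracy against observed flood masks (as in the 97% Irene result) is then a straightforward pixel comparison.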

  19. Application based on ArcObject inquiry and Google maps demonstration to real estate database

    NASA Astrophysics Data System (ADS)

    Hwang, JinTsong

    2007-06-01

    Real estate industry in Taiwan has been flourishing in recent years. To acquire various and abundant information of real estate for sale is the same goal for the consumers and the brokerages. Therefore, before looking at the property, it is important to get all pertinent information possible. Not only this beneficial for the real estate agent as they can provide the sellers with the most information, thereby solidifying the interest of the buyer, but may also save time and the cost of manpower were something out of place. Most of the brokerage sites are aware of utilizes Internet as form of media for publicity however; the contents are limited to specific property itself and the functions of query are mostly just provided searching by condition. This paper proposes a query interface on website which gives function of zone query by spatial analysis for non-GIS users, developing a user-friendly interface with ArcObject in VB6, and query by condition. The inquiry results can show on the web page which is embedded functions of Google Maps and the UrMap API on it. In addition, the demonstration of inquiry results will give the multimedia present way which includes hyperlink to Google Earth with surrounding of the property, the Virtual Reality scene of house, panorama of interior of building and so on. Therefore, the website provides extra spatial solution for query and demonstration abundant information of real estate in two-dimensional and three-dimensional types of view.

  20. Identifying Severe Weather Impacts and Damage with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Molthan, A.; Burks, J. E.; Bell, J. R.

    2015-12-01

    Hazards associated with severe convective storms can lead to rapid changes in land surface vegetation. Depending upon the type of vegetation impacted, the effects can be relatively short-lived, such as damage to seasonal crops that are eventually removed by harvest, or longer-lived, such as damage to a stand of trees or an expanse of forest that requires several years to recover. Since many remote sensing imagers provide their highest spatial resolution bands in the red and near-infrared to support monitoring of vegetation, these impacts can be readily identified as short-term, marked decreases in common vegetation indices such as NDVI, along with increases in land surface temperature observed at a reduced spatial resolution. The ability to identify an area of vegetation change is improved by understanding the conditions that are normal for a given time of year and location, along with the typical range of variability in a given parameter. This analysis requires a period of record well beyond the availability of near real-time data. These activities would typically require an analyst to download large volumes of data from sensors such as NASA's MODIS (aboard Terra and Aqua) or higher resolution imagers from the Landsat series of satellites. Google's Earth Engine offers a "big data" solution to these challenges by providing a streamlined API and the option to process the full period of record of NASA MODIS and Landsat products through relatively simple JavaScript coding. This presentation will highlight efforts to date in using Earth Engine holdings to produce vegetation and land surface temperature anomalies associated with damage to agricultural and other vegetation caused by severe thunderstorms across the Central and Southeastern United States. 
Earth Engine applications will show how large data holdings can be used to map severe weather damage and ascertain longer-term impacts, and we will share best practices and challenges in applying Earth Engine holdings to the analysis of severe weather damage. Other applications are also demonstrated, such as the use of Earth Engine to prepare pre-event composites that can be used to subjectively identify other severe weather impacts. Future extension to flooding and wildfires is also proposed.
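The anomaly logic described above amounts to comparing a post-storm NDVI value against a per-pixel climatology built from the historical record. A small sketch (the reflectance and historical values are invented; in practice Earth Engine would supply them from MODIS or Landsat bands):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def zscore(value, history):
    """Standardized anomaly of a value against a historical sample."""
    mean = sum(history) / len(history)
    var = sum((h - mean) ** 2 for h in history) / (len(history) - 1)
    return (value - mean) / var ** 0.5

# NDVI for this pixel and calendar week over previous years (healthy crops ~0.7).
history = [0.68, 0.71, 0.70, 0.73, 0.69, 0.72]
post_storm = ndvi(nir=0.30, red=0.20)  # = 0.2, a sharp post-hail drop
z = zscore(post_storm, history)        # strongly negative: flagged as damage
```

Earth Engine performs the same per-pixel arithmetic across the full MODIS/Landsat archive in parallel; thresholding the standardized anomaly (e.g. z below some cutoff) yields the damage-swath maps discussed above.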

  1. Google earth mapping of damage from the Nigata-Ken-Chuetsu M6.6 earthquake of 16 July 2007

    USGS Publications Warehouse

    Kayen, Robert E.; Steele, WM. Clint; Collins, Brian; Walker, Kevin

    2008-01-01

    We describe the use of Google Earth during and after a large damaging earthquake that struck the central Japan coast on 16 July 2007 to collect and organize damage information and guide reconnaissance activities. This software enabled greater real-time collaboration among scientists and engineers. After the field investigation, the Google Earth map was used as a final reporting product directly linked to the more traditional research report document. Finally, we analyze the use of the software within the context of a post-disaster reconnaissance investigation and link it to student use of Google Earth in field situations.

  2. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits over a traditional data center approach: scalability, built-in redundancy, and reduced total cost of ownership. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided by all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to serve earth science data. The system described is not only cost-effective but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
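The object-storage design sketched above maps naturally onto chunked reads: each dataset chunk lives under its own object key, so an HDF5/NetCDF4-compatible client fetches only the chunks a read touches instead of one monolithic file. A toy sketch with an in-memory dict standing in for the object store (the key layout and chunk size are assumptions, not the system's actual schema):

```python
CHUNK = 4  # elements per stored chunk (illustrative)

# Fake "bucket": a 32-element 1-D dataset split across 8 chunk objects.
bucket = {f"dataset/temp/chunk_{i}": bytes(range(i * CHUNK, (i + 1) * CHUNK))
          for i in range(8)}

def read_range(store, start, stop):
    """Read dataset elements [start, stop) by fetching only the chunks needed."""
    out = bytearray()
    for ci in range(start // CHUNK, (stop - 1) // CHUNK + 1):
        data = store[f"dataset/temp/chunk_{ci}"]  # one GET per touched chunk
        lo = max(start, ci * CHUNK) - ci * CHUNK
        hi = min(stop, (ci + 1) * CHUNK) - ci * CHUNK
        out += data[lo:hi]
    return bytes(out)
```

Because independent chunk GETs parallelize well, this layout is also what lets many analytics tasks read the same dataset concurrently without contending on a single file.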

  3. Engaging Middle School Students with Google Earth Technology to Analyze Ocean Cores as Evidence for Sea Floor Spreading

    NASA Astrophysics Data System (ADS)

    Prouhet, T.; Cook, J.

    2006-12-01

    Google Earth's ability to captivate students' attention, its ease of use, and its high-quality images give it the potential to be an extremely effective tool for earth science educators. The unique properties of Google Earth satisfy a growing demand to incorporate technology in science instruction. Google Earth is free and relatively easy to use, unlike some other visualization software. Students often have difficulty conceptualizing and visualizing earth systems, such as deep-ocean basins, because of the complexity and dynamic nature of the processes associated with them (e.g. plate tectonics). Google Earth's combination of aerial photography, satellite images and remote sensing data brings a sense of realism to science concepts. The unobstructed view of the ocean floor provided by this technology illustrates three-dimensional subsurface features such as rift valleys, subduction zones, and seamounts, enabling students to better understand the seafloor's dynamic nature. Students will use Google Earth to navigate the sea floor and examine Deep Sea Drilling Project (DSDP) core locations from the Glomar Challenger Leg 3 expedition. The lesson to be implemented was expanded upon and derived from the Joint Oceanographic Institutions (JOI) Learning exercise, Nannofossils Reveal Seafloor Spreading. In addition, students take on the role of scientists as they graph and analyze paleontological data against the distance from the Mid-Ocean Ridge. The integration of ocean core data in this three-dimensional view aids students' ability to draw and communicate valid conclusions about their scientific observations. A pre- and post-survey will be given to examine attitudes, self-efficacy, achievement and content mastery in a sample of approximately 300 eighth-grade science students. The hypothesis is that the integration of Google Earth will significantly improve all areas of focus as mentioned above.
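The core calculation behind the exercise above: nannofossil ages from cores at increasing distance from the ridge axis give a half spreading rate as the slope of distance against age. A sketch with invented numbers (not the actual DSDP Leg 3 data):

```python
cores = [  # (distance from ridge axis in km, basal sediment age in Myr) -- invented
    (220.0, 11.0),
    (420.0, 21.0),
    (740.0, 37.0),
    (1010.0, 50.5),
]

# Least-squares slope through the origin: crust at the ridge axis has age zero.
rate_km_per_myr = (sum(d * a for d, a in cores)
                   / sum(a * a for _, a in cores))  # 1 km/Myr = 1 mm/yr
```

Students plotting the same data by hand recover the slope graphically; the linear trend itself is the evidence for steady seafloor spreading away from the ridge.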

  4. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding the spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and the Open Geospatial Consortium KML standard show great promise for collaborative exploration of such events using visual analytical approaches. However, there are currently two barriers to wider usage of such approaches. First, there is no easy-to-use open source tool to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web-service-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available; users typically share their files in a portal but have no means to explore them visually without leaving the portal environment they are familiar with. We have developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet 2.0 standard (JSR-286), and it is currently deployable in one of the most popular open source portal frameworks, Liferay. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool with example cases in which drought and storm events can be explored in both time and space in this web-based KML animation portlet. The tool provides an easy-to-use, browser-based portal environment in which multiple users can collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
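Time-aware KML of the kind the portlet animates simply attaches a TimeSpan to each feature, which the Google Earth time slider then steps through. A minimal generation sketch (the storm-track coordinates and dates are invented, and this is not the kml-soc-ncsa toolkit's code):

```python
snapshots = [  # (begin date, end date, lon, lat) -- invented storm track
    ("2008-06-01", "2008-06-02", -91.0, 40.1),
    ("2008-06-02", "2008-06-03", -90.4, 40.6),
]

placemarks = "\n".join(
    f"""  <Placemark>
    <TimeSpan><begin>{b}</begin><end>{e}</end></TimeSpan>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""
    for b, e, lon, lat in snapshots
)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
<Document>
{placemarks}
</Document>
</kml>"""
```

Converting a shapefile or GeoTIFF time series amounts to emitting one such timestamped feature (or ground overlay) per time step, which is the gap the kml-soc-ncsa toolkit fills.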

  5. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allow developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons"—a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. 
We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  6. Using Mobile App Development Tools to Build a GIS Application

    NASA Astrophysics Data System (ADS)

    Mital, A.; Catchen, M.; Mital, K.

    2014-12-01

    Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not load properly on Android, the group focused its efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one that also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView on Android, a built-in component that allows a web app to be ported to Android without rewriting its code. Upon completing this, the group found that Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map with a simpler UI. This web app was completed fairly quickly relative to the Google Maps version; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps-based app supporting API levels 14-19 and an OpenLayers-based app supporting API levels 8-19, as well as a Google Maps-based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.

  7. A Map Mash-Up Application: Investigating the Temporal Effects of Climate Change on Salt Lake Basin

    NASA Astrophysics Data System (ADS)

    Kirtiloglu, O. S.; Orhan, O.; Ekercin, S.

    2016-06-01

    The main purpose of this paper is to investigate climate change effects that have occurred since the beginning of the twenty-first century in the Konya Closed Basin (KCB), located in the semi-arid central Anatolian region of Turkey, and particularly in the Salt Lake region, where many of the KCB's major wetlands are situated, and to share the analysis results online in a Web Geographical Information System (GIS) environment. 71 Landsat 5-TM, 7-ETM+ and 8-OLI images and meteorological data obtained from 10 meteorological stations have been used in this work. 56 of the Landsat images have been used to extract the Salt Lake surface area from multi-temporal imagery collected from 2000 to 2014 in the Salt Lake basin. 15 of the Landsat images have been used to make thematic maps of the Normalised Difference Vegetation Index (NDVI) in the KCB, and data from the 10 meteorological stations have been used to generate the Standardized Precipitation Index (SPI), which is used in drought studies. For the purpose of visualizing and sharing the results, a Web GIS-like environment has been established using Google Maps and its data storage and manipulation product Fusion Tables, all free-of-charge Google Web services. The infrastructure of the web application includes HTML5, CSS3, JavaScript, Google Maps API V3 and Google Fusion Tables API technologies. These technologies make it possible to build effective "map mash-ups" that embed a Google Map in a Web page, store spatial or tabular data in Fusion Tables, and add these data as a map layer on the embedded map. The analysis process and the map mash-up application are discussed in detail in the main sections of this paper.
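
The NDVI maps described above come from a standard band-ratio computation. A minimal sketch (reflectance values are made up, and numpy is the only dependency; Landsat 5 TM would supply band 4 as NIR and band 3 as red):

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from NIR and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero on empty / masked pixels.
    return np.where(denom == 0.0, 0.0,
                    (nir - red) / np.where(denom == 0.0, 1.0, denom))

# Toy 2x2 "scene" with invented reflectances.
nir = np.array([[0.5, 0.4], [0.3, 0.0]])
red = np.array([[0.1, 0.2], [0.3, 0.0]])
print(ndvi(nir, red))
```

Values near +1 indicate dense vegetation, values near 0 bare soil or water, which is what makes NDVI useful for tracking wetland change over time.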

  8. [Establishment of Oncomelania hupensis snail database based on smartphone and Google Earth].

    PubMed

    Wang, Wei-chun; Zhan, Ti; Zhu, Ying-fu

    2015-02-01

    To establish an Oncomelania hupensis snail database based on a smartphone and Google Earth. The HEAD GPS software was first loaded on the smartphone, and the GPS data of the snails were collected with it. The original data were exported to a computer in KML/KMZ format, then converted into Excel format with conversion software. Finally, the laboratory results were entered and the digital snail database was established. The data were converted back into KML and displayed visually in Google Earth. The snail data of a 5-hm² beach along the Yangtze River were collected and the distribution of the snails was displayed in Google Earth. The database of the snails was built, with query functions for the total snails, the living snails, and the schistosome-infected snails of each survey frame. The digital management of snail data is realized by using a smartphone and Google Earth.
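
The Excel-to-KML step can be sketched with only the Python standard library; the record fields (frame id, counts, coordinates) are illustrative assumptions, not the study's actual schema:

```python
# Minimal sketch: turn surveyed snail records into a KML document that
# Google Earth can display. Field names below are invented for illustration.
from xml.sax.saxutils import escape

def records_to_kml(records):
    placemarks = []
    for r in records:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(r['frame_id'])}</name>\n"
            f"    <description>living snails: {r['living']}, "
            f"infected: {r['infected']}</description>\n"
            f"    <Point><coordinates>{r['lon']},{r['lat']},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        + "\n".join(placemarks)
        + "\n</Document>\n</kml>\n"
    )

sample = [{"frame_id": "F001", "lon": 118.76, "lat": 32.06,
           "living": 12, "infected": 1}]
print(records_to_kml(sample))
```

Saving the output with a .kml extension and opening it in Google Earth places one labeled marker per survey frame.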

  9. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    NASA Deputy Administrator Lori Garver speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  10. Visualizing Cross-sectional Data in a Real-World Context

    NASA Astrophysics Data System (ADS)

    Van Noten, K.; Lecocq, T.

    2016-12-01

    If you could fly around your research results in three dimensions, wouldn't you like to do it? Visualizing research results properly during scientific presentations already does half the job of informing the public about the geographic framework of your research. Many scientists use the Google Earth™ mapping service (V7.1.2.2041) because it is a great interactive mapping tool for assigning geographic coordinates to individual data points, localizing a research area, and draping maps of results over Earth's surface for 3D visualization. However, research results in vertical cross-sections are often not shown simultaneously with the maps in Google Earth. A few tutorials and programs to display cross-sectional data in Google Earth do exist, and the workflow is rather simple. By importing a cross-sectional figure into the free software SketchUp Make [Trimble Navigation Limited, 2016], any spatial model can be exported as a vertical figure in Google Earth. This presentation gives a clear workflow/tutorial for displaying cross-sections manually in Google Earth. No software skills or programming are required. It is very easy to use, offers great possibilities for teaching, and allows fast figure manipulation in Google Earth. The full workflow can be found in "Van Noten, K. 2016. Visualizing Cross-Sectional Data in a Real-World Context. EOS, Transactions AGU, 97, 16-19". The video tutorial can be found here: https://www.youtube.com/watch?v=Tr8LwFJ4RYU. Figure: Cross-sectional Research Examples Illustrated in Google Earth

  11. USGS Coastal and Marine Geology Survey Data in Google Earth

    NASA Astrophysics Data System (ADS)

    Reiss, C.; Steele, C.; Ma, A.; Chin, J.

    2006-12-01

    The U.S. Geological Survey (USGS) Coastal and Marine Geology (CMG) program has a rich data catalog of geologic field activities and metadata called InfoBank, which has been a standard tool for researchers within and outside of the agency. Along with traditional web maps, the data are now accessible in Google Earth, which greatly expands the possible user audience. The Google Earth interface provides geographic orientation and panning/zooming capabilities to locate data relative to topography, bathymetry, and coastal areas. Viewing navigation tracks against Google Earth's background imagery answers queries such as why certain areas were not surveyed (answer: the presence of islands, shorelines, cliffs, etc.). Detailed box core subsample photos from selected sampling activities, published geotechnical data, and sample descriptions are now viewable in Google Earth (for example, the M-1-95-MB, P-2-95-MB, and P-1-97-MB box core samples). One example of the use of Google Earth is CMG's surveys of San Francisco's Ocean Beach since 2004. The surveys are conducted with an all-terrain vehicle (ATV) and shallow-water personal watercraft (PWC) equipped with Global Positioning System (GPS), elevation, and echo sounder data collectors. 3D topographic models with centimeter accuracy have been produced from these surveys to monitor beach and nearshore processes, including sand transport, sedimentation patterns, and seasonal trends. Using Google Earth, multiple track line data (examples: OB-1-05-CA and OB-2-05-CA) can be overlaid on beach imagery. The images also help explain the shape of track lines as objects are encountered.

  12. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Andrew Chaikin, author of "A Man on the Moon," speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  13. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Buzz Aldrin, the second man to walk on the Moon, speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  14. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. How to implement rapid plotting, real-time sharing, and mapping of surrounding affairs has taken on great significance for government policy makers and diplomatic staff. At present, however, existing boundary information systems suffer from labor-intensive geospatial data updates, a serious lack of plotting tools, and difficulty in sharing geographic events, which has seriously hampered the smooth development of border tasks. The development of geographic information system technology, especially Web GIS, offers the possibility of solving these problems. This paper adopts a four-layer B/S architecture and, with the support of the Google Maps service, uses the free API offered by Google Maps, with its openness, ease of use, sharing features, and high-resolution imagery, to design and implement a surrounding transaction plotting and management system based on the web development technologies ASP.NET, C#, and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has proved that the system has good usability and strong real-time performance.

  15. Factors Affecting Student Success with a Google Earth-Based Earth Science Curriculum

    ERIC Educational Resources Information Center

    Blank, Lisa M.; Almquist, Heather; Estrada, Jen; Crews, Jeff

    2016-01-01

    This study investigated to what extent the implementation of a Google Earth (GE)-based earth science curriculum increased students' understanding of volcanoes, earthquakes, plate tectonics, scientific reasoning abilities, and science identity. Nine science classrooms participated in the study. In eight of the classrooms, pre- and post-assessments…

  16. Integration of Geophysical and Geochemical Data

    NASA Astrophysics Data System (ADS)

    Yamagishi, Y.; Suzuki, K.; Tamura, H.; Nagao, H.; Yanaka, H.; Tsuboi, S.

    2006-12-01

    Integration of geochemical and geophysical data would give us new insight into the nature of the Earth and should advance our understanding of the dynamics of the Earth's interior and surface processes. Today various geochemical and geophysical data are available on the Internet, stored in various database systems. Each system is isolated and provides data in its own format. The goal of this study is to display both the geochemical and geophysical data obtained from such databases together visually. We adopt Google Earth as the presentation tool. Google Earth is virtual globe software provided free of charge by Google, Inc.; it displays the Earth's surface using satellite images with a mean resolution of ~15 m. Any graphical feature can be displayed on Google Earth via a KML-format file. We have developed software to convert geochemical and geophysical data to KML files. As a first step, we overlaid data from Georoc and PetDB and seismic tomography data on Google Earth. Georoc and PetDB are both online database systems for geochemical data; the data format of Georoc is CSV, that of PetDB is Microsoft Excel, and the tomography data we used are plain text. The conversion software can process all of these file formats. Geochemical data (e.g., compositional abundance) are displayed as three-dimensional columns on the Earth's surface; the shape and color of a column indicate the element type, and the size and color tone vary with the element's abundance. The tomography data can be converted into a KML file for each depth. This overlay of geochemical and tomography data should help us correlate internal temperature anomalies with the geochemical anomalies observed at the surface of the Earth. Our tool can convert any geophysical or geochemical data to KML as long as the data are associated with longitude and latitude. We are going to support more geophysical data formats.
In addition, we are currently trying to obtain scientific insights for the Earth's interior based on the view of both geophysical and geochemical data on Google Earth.
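
The three-dimensional columns described above map naturally onto KML's extruded points: a Point with an altitude and extrusion enabled renders as a column from the ground to that height. A minimal sketch (the function name and the metres-per-unit scale factor are assumptions for illustration):

```python
# Sketch of the "3-D column" idea: a KML Placemark whose Point is extruded
# from the ground to an altitude proportional to an element's abundance.
def abundance_column(lon, lat, abundance, metres_per_unit=1000.0):
    height = abundance * metres_per_unit  # illustrative scaling
    return (
        "<Placemark>\n"
        "  <Point>\n"
        "    <extrude>1</extrude>\n"
        "    <altitudeMode>relativeToGround</altitudeMode>\n"
        f"    <coordinates>{lon},{lat},{height}</coordinates>\n"
        "  </Point>\n"
        "</Placemark>"
    )

print(abundance_column(140.0, 35.0, 2.5))
```

Wrapping a list of such placemarks in a KML Document, with per-element Style entries for color, reproduces the column plot described in the abstract.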

  17. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Apollo 11 astronaut Buzz Aldrin, the second man to walk on the Moon, speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  18. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Miles O'Brien, former chief science and tech correspondent for CNN, speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  19. Google Moon Press Conference

    NASA Image and Video Library

    2009-07-19

    Yoshinori Yoshimura, a representative of the Japan Aerospace Exploration Agency (JAXA), speaks during a press conference, Monday, July 20, 2009, announcing the launch of Moon in Google Earth, an immersive 3D atlas of the Moon accessible within Google Earth 5.0, at the Newseum in Washington. Photo Credit: (NASA/Bill Ingalls)

  20. Solar Eclipse Computer API: Planning Ahead for August 2017

    NASA Astrophysics Data System (ADS)

    Bartlett, Jennifer L.; Chizek Frouard, Malynda; Lesniak, Michael V.; Bell, Steve

    2016-01-01

    With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an application programming interface (API). This flexible interface returns local circumstances for any solar eclipse in JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or applications. For a given year, it can also return a list of solar eclipses that can be used to build a more specific request for local circumstances. Over the course of a particular eclipse as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse Computer reports the time, the Sun's altitude and azimuth, and the event's position and vertex angles. The computer also reports the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. On-line documentation for using the API-enabled Solar Eclipse Computer, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The same Web page also describes how to reach the Complete Sun and Moon Data for One Day, Phases of the Moon, Day and Night Across the Earth, and Apparent Disk of a Solar System Object services using API calls. For those who prefer using a traditional data input form, local circumstances can still be requested that way at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO).
Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
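
A third-party client for a JSON service of this kind can be sketched as follows. The endpoint path, parameter names, and reply shape below are assumptions for illustration only, not the documented USNO interface; the API documentation page cited in the abstract has the real ones.

```python
# Hedged sketch: build a request URL for an eclipse-circumstances service
# and pick the contact events out of a JSON reply. All names are invented.
import json
from urllib.parse import urlencode

def build_eclipse_url(date, lat, lon, height=0):
    base = "http://aa.usno.navy.mil/api/eclipses/solar"  # assumed endpoint
    query = urlencode({"date": date, "coords": f"{lat},{lon}", "height": height})
    return f"{base}?{query}"

def summarize(reply_text):
    """Extract (phenomenon, time) pairs from a JSON reply (shape assumed)."""
    reply = json.loads(reply_text)
    return [(p["phenomenon"], p["time"]) for p in reply.get("local_data", [])]

url = build_eclipse_url("2017-08-21", 38.92, -77.07)
sample_reply = '{"local_data": [{"phenomenon": "Eclipse Begins", "time": "17:17:50.1"}]}'
print(url)
print(summarize(sample_reply))
```

Because the service returns plain JSON over HTTP, the same two steps (build URL, parse reply) work from any language or web page, which is the point of exposing the computer as an API.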

  1. Visualizing Dynamic Weather and Ocean Data in Google Earth

    NASA Astrophysics Data System (ADS)

    Castello, C.; Giencke, P.

    2008-12-01

    Katrina. Climate change. Rising sea levels. Low lake levels. These headlines, and countless others like them, underscore the need to better understand our changing oceans and lakes. Over the past decade, efforts such as the Global Ocean Observing System (GOOS) have added to this understanding through the creation of interoperable ocean observing systems. These systems, including buoy networks, gliders, UAVs, etc., have resulted in a dramatic increase in the amount of Earth observation data available to the public. Unfortunately, these data tend to resist mass consumption, owing to large file sizes, incompatible formats, and/or a dearth of user-friendly visualization software. Google Earth offers a flexible way to visualize Earth observation data. Marrying high-resolution orthoimagery, user-friendly query and navigation tools, and the power of OGC's KML standard, Google Earth can make observation data universally understandable and accessible. This presentation will feature examples of meteorological and oceanographic data visualized using KML and Google Earth, along with tools and tips for integrating other such environmental datasets.

  2. Information Technology Infusion Case Study: Integrating Google Earth(Trademark) into the A-Train Data Depot

    NASA Technical Reports Server (NTRS)

    Smith, Peter; Kempler, Steven; Leptoukh, Gregory; Chen, Aijun

    2010-01-01

    This poster paper represents a NASA-funded project to employ the latest three-dimensional visualization technology to explore and provide direct data access to heterogeneous A-Train datasets. Google Earth™ provides the foundation for organizing, visualizing, publishing and synergizing Earth science data.

  3. Maintaining the momentum of Open Search in Earth Science Data discovery

    NASA Astrophysics Data System (ADS)

    Newman, D. J.; Lynnes, C.

    2013-12-01

    Federated search for Earth Observation data has been a hallmark of EOSDIS (Earth Observing System Data and Information System) for two decades. Originally, the EOSDIS Version 0 system provided both data-collection-level and granule/file-level search in the mid 1990s with EOSDIS-specific socket protocols and message formats. Since that time, the advent of several standards has helped to simplify EOSDIS federated search, beginning with HTTP as the transfer protocol. Most recently, OpenSearch (www.opensearch.org) was employed for the EOS Clearinghouse (ECHO), based on a set of conventions that had been developed within the Earth Science Information Partners (ESIP) Federation. The ECHO OpenSearch API has evolved to encompass the ESIP RFC and the Open Geospatial Consortium (OGC) OpenSearch standard. Uptake of the ECHO OpenSearch API has been significant and has made ECHO accessible to client developers who found the previous ECHO SOAP API and current REST API too complex. Client adoption of the OpenSearch API appears to be largely driven by the simplicity of the OpenSearch convention. This simplicity is thus important to retain as the standard and convention evolve. For example, ECHO metrics indicate that the vast majority of ECHO users favor the following search criteria when using the REST API:
    - Spatial: bounding box, polygon, line and point
    - Temporal: start and end time
    - Keywords: free text
    Fewer than 10% of searches use additional constraints, particularly those requiring a controlled vocabulary, such as instrument, sensor, etc. This suggests that ongoing standardization efforts around OpenSearch usage for Earth Observation data may be more productive if oriented toward improving support for the spatial, temporal and keyword search aspects.
    Areas still requiring improvement include support for:
    - Concrete requirements for keyword constraints
    - Phrasal search for keyword constraints
    - Temporal constraint relations
    - Terminological symmetry between search URLs and response documents for both temporal and spatial terms
    - Best practices for both servers and clients
    Over the past year we have seen several ongoing efforts to further standardize OpenSearch in the Earth science domain, such as:
    - Federation of Earth Science Information Partners (ESIP)
    - Open Geospatial Consortium (OGC)
    - Committee on Earth Observation Satellites (CEOS)
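
The simplicity driving OpenSearch adoption comes from its URL templates: a client fetches the service's description document, then substitutes values into placeholders. The sketch below invents the template URL itself, but the placeholder names follow the OpenSearch Geo and Time extensions:

```python
# An OpenSearch description document supplies a URL template like this one
# (the host and query-parameter names here are made up for illustration).
template = ("https://example.gov/opensearch/granules.atom?"
            "q={searchTerms}&bbox={geo:box}&start={time:start}&end={time:end}")

def fill(template, values):
    """Naive template substitution; a real client would also URL-encode."""
    out = template
    for key, val in values.items():
        out = out.replace("{" + key + "}", str(val))
    return out

url = fill(template, {
    "searchTerms": "MODIS",
    "geo:box": "-180,-90,180,90",          # west,south,east,north
    "time:start": "2013-01-01T00:00:00Z",
    "time:end": "2013-12-31T23:59:59Z",
})
print(url)
```

Note that the three filled-in constraints are exactly the spatial, temporal, and keyword criteria that the ECHO metrics above show most users relying on.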

  4. Student-Teachers' Use of "Google Earth" in Problem-Based Geology Learning

    ERIC Educational Resources Information Center

    Ratinen, Ilkka; Keinonen, Tuula

    2011-01-01

    Geographical Information Systems (GIS) are adequate for analyzing complex scientific and spatial phenomena in geography education. "Google Earth" is a geographic information tool for GIS-based learning. It allows students to engage in the lesson, explore the Earth, explain what they identify and evaluate the implications of what they are…

  5. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…

  6. Secure and Resilient Cloud Computing for the Department of Defense

    DTIC Science & Technology

    2015-11-16

    platform as a service (PaaS), and software as a service (SaaS)—that target system administrators, developers, and end-users respectively (see Table 2...):
    PaaS | application programming interfaces (API) and services | Medium | Amazon Elastic MapReduce, MathWorks Cloud, Red Hat OpenShift
    SaaS | full-fledged applications | Low | Google gMail

  7. Getting Your GIS Data into Google Earth: Data Conversion Tools and Tips

    NASA Astrophysics Data System (ADS)

    Nurik, R.; Marks, M.

    2009-12-01

    Google Earth is a powerful platform for displaying your data. You can easily visualize content using the Keyhole Markup Language (KML). But what if you don't have your data in KML format? GIS data comes in a wide variety of formats, including .shp files, CSV, and many others. What can you do? This session will walk you through some of the tools for converting data to KML format. We will explore a variety of tools, including: Google Earth Pro, GDAL/OGR, KML2KML, etc. This session will be paced so that you can follow along on your laptop if you wish. Should you want to follow along, bring a laptop, and install the trial versions of Google Earth Pro and KML2KML. It is also recommended that you download GDAL from gdal.org and install it on your system.

  8. Next-generation Digital Earth.

    PubMed

    Goodchild, Michael F; Guo, Huadong; Annoni, Alessandro; Bian, Ling; de Bie, Kees; Campbell, Frederick; Craglia, Max; Ehlers, Manfred; van Genderen, John; Jackson, Davina; Lewis, Anthony J; Pesaresi, Martino; Remetey-Fülöpp, Gábor; Simpson, Richard; Skidmore, Andrew; Wang, Changlin; Woodgate, Peter

    2012-07-10

    A speech of then-Vice President Al Gore in 1998 created a vision for a Digital Earth, and played a role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many but not all the elements of this vision. The technical achievements of Google Earth, and the functionality of this first generation of virtual globes, are reviewed against the Gore vision. Meanwhile, developments in technology continue, the era of "big data" has arrived, the general public is more and more engaged with technology through citizen science and crowd-sourcing, and advances have been made in our scientific understanding of the Earth system. However, although Google Earth stimulated progress in communicating the results of science, there continue to be substantial barriers in the public's access to science. All these factors prompt a reexamination of the initial vision of Digital Earth, and a discussion of the major elements that should be part of a next generation.

  9. The Earth Observatory Natural Event Tracker (EONET): An API for Matching Natural Events to GIBS Imagery

    NASA Astrophysics Data System (ADS)

    Ward, K.

    2015-12-01

    Hidden within the terabytes of imagery in NASA's Global Imagery Browse Services (GIBS) collection are hundreds of daily natural events. Some events are newsworthy, devastating, and visibly obvious at a global scale, others are merely regional curiosities. Regardless of the scope and significance of any one event, it is likely that multiple GIBS layers can be viewed to provide a multispectral, dataset-based view of the event. To facilitate linking between the discrete event and the representative dataset imagery, NASA's Earth Observatory Group has developed a prototype application programming interface (API): the Earth Observatory Natural Event Tracker (EONET). EONET supports an API model that allows users to retrieve event-specific metadata--date/time, location, and type (wildfire, storm, etc.)--and web service layer-specific metadata which can be used to link to event-relevant dataset imagery in GIBS. GIBS' ability to ingest many near real time datasets, combined with its growing archive of past imagery, means that API users will be able to develop client applications that not only show ongoing events but can also look at imagery from before and after. In our poster, we will present the API and show examples of its use.
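
A client of an event API of this kind might pair each event's metadata with its most recent location before requesting matching GIBS imagery. The sketch below runs against a made-up feed; the JSON field names follow EONET's general pattern but are abbreviated assumptions here:

```python
# Hedged sketch: pull title, categories, and the latest geometry out of an
# EONET-style events feed. The sample feed is invented for illustration.
import json

def latest_positions(feed_text):
    feed = json.loads(feed_text)
    rows = []
    for event in feed.get("events", []):
        geom = event["geometries"][-1]          # most recent entry
        cats = [c["title"] for c in event.get("categories", [])]
        rows.append((event["title"], cats, geom["date"], geom["coordinates"]))
    return rows

sample = json.dumps({"events": [{
    "title": "Example Wildfire",
    "categories": [{"title": "Wildfires"}],
    "geometries": [{"date": "2015-08-01T00:00:00Z",
                    "type": "Point", "coordinates": [-120.1, 39.5]}],
}]})
print(latest_positions(sample))
```

With date, location, and category in hand, a client can request GIBS tiles for the matching layer and time, including imagery from before and after the event as the abstract describes.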

  10. Visualizing Geographic Data in Google Earth for Education and Outreach

    NASA Astrophysics Data System (ADS)

    Martin, D. J.; Treves, R.

    2008-12-01

    Google Earth is an excellent tool to help students and the public visualize scientific data, since with little technical skill scientific content can be shown in three dimensions against a background of remotely sensed imagery. It therefore has a variety of uses in university education and as a tool for public outreach. However, in both situations it is of limited value if it is only used to attract attention with flashy three-dimensional animations. In this poster we illustrate several applications that represent what we believe is good educational practice. The first example shows how the combination of a floor map and a projection of Google Earth on a screen can be used to promote active learning: students are asked to imagine where they would build a house on the Big Island of Hawaii in order to avoid volcanic hazards. In the second example, Google Earth is used to illustrate evidence over a range of scales in a description of the Lake Agassiz flood events, which would be more difficult to comprehend in a traditional paper-based format. In the final example, a simple text manipulation application, "TMapper", is used to change the color palette of a thematic map generated by the students in Google Earth, to teach them about the use of color in map design.
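
KML encodes colors as 8-digit aabbggrr hex strings, so a palette change of the kind TMapper performs can be sketched as a text substitution over the KML document (this is an illustrative re-implementation, not TMapper's actual code):

```python
# Swap colors in a KML document according to a palette map. KML color
# strings are alpha-blue-green-red (aabbggrr), not the web's rrggbb order.
import re

def recolour(kml_text, palette_map):
    def swap(match):
        colour = match.group(1).lower()
        return "<color>" + palette_map.get(colour, colour) + "</color>"
    return re.sub(r"<color>([0-9a-fA-F]{8})</color>", swap, kml_text)

doc = "<Style><PolyStyle><color>ff0000ff</color></PolyStyle></Style>"
print(recolour(doc, {"ff0000ff": "ff00ff00"}))  # opaque red to opaque green
```

Because only the Style elements change, the recolored map keeps its geometry and can be reloaded in Google Earth immediately, which is what makes this a quick classroom exercise in color choice.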

  11. Visualizing Moon Data and Imagery with Google Earth

    NASA Astrophysics Data System (ADS)

    Weiss-Malik, M.; Scharff, T.; Nefian, A.; Moratto, Z.; Kolb, E.; Lundy, M.; Hancher, M.; Gorelick, N.; Broxton, M.; Beyer, R. A.

    2009-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Moon Web site was a limited series of maps and Apollo content. The new Moon in Google Earth feature provides a similar virtual planet experience for the Moon as we have for the Earth and Mars. We incorporated existing Clementine and Lunar Orbiter imagery for the basemaps and a combination of Kaguya LALT topography and some terrain created from Apollo Metric and Panoramic images. We also have information about the Apollo landings and other robotic landers on the surface, as well as historic maps and charts, and guided tours. Some of the first-released LROC imagery of the Apollo landing sites has been put in place, and we look forward to incorporating more data as it is released from LRO, Chandrayaan-1, and Kaguya. These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections — in geological context and within a single user interface — are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.
Screen shot of Moon in Google Earth, a freely downloadable application for visualizing Moon imagery and data.

  12. Google Earth locations of USA and seafloor hydrothermal vents with associated rare earth element data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrew Fowler

    Google Earth .kmz files that contain the locations of geothermal wells and thermal springs in the USA, and of seafloor hydrothermal vents, that have associated rare earth element data. The file does not contain the actual data; the data are available through the GDR website in two tier 3 data sets entitled "Compilation of Rare Earth Element Analyses from US Geothermal Fields and Mid Ocean Ridge (MOR) Hydrothermal Vents" and "Rare earth element content of thermal fluids from Surprise Valley, California".

  13. The Snow Data System at NASA JPL

    NASA Astrophysics Data System (ADS)

    Laidlaw, R.; Painter, T. H.; Mattmann, C. A.; Ramirez, P.; Bormann, K.; Brodzik, M. J.; Burgess, A. B.; Rittger, K.; Goodale, C. E.; Joyce, M.; McGibbney, L. J.; Zimdars, P.

    2014-12-01

    NASA JPL's Snow Data System has a data-processing pipeline powered by Apache OODT, an open source software tool. The pipeline has been running for several years and has successfully generated a significant amount of cryosphere data, including MODIS-based products such as MODSCAG, MODDRFS, and MODICE, with historical and near-real-time windows, covering regions such as the Arctic, Western US, Alaska, Central Europe, Asia, South America, Australia, and New Zealand. The team continues to improve the pipeline, using monitoring tools such as Ganglia to give an overview of operations, and improving fault-tolerance with automated recovery scripts. Several alternative adaptations of the Snow Covered Area and Grain size (SCAG) algorithm are being investigated. These include using VIIRS and Landsat TM/ETM+ satellite data as inputs. Parallel computing techniques are being considered for core SCAG processing, such as using the PyCUDA Python API to utilize many-core GPU architectures. An experimental version of MODSCAG is also being developed for the Google Earth Engine platform, a cloud-based service.

  14. Effects of Thinking Style and Spatial Ability on Anchoring Behavior in Geographic Information Systems

    ERIC Educational Resources Information Center

    Wang, Dai-Yi; Lee, Mei-Hsuan; Sun, Chuen-Tsai

    2013-01-01

    The authors propose an instructional use for Google Earth (a GIS application) as an anchoring tool for knowledge integration. Google Earth can be used to support student explorations of world geography based on Wikipedia articles on earth science and history topics. We asked 66 Taiwanese high-school freshmen to make place marks with explanatory…

  15. Google Earth Mapping Exercises for Structural Geology Students--A Promising Intervention for Improving Penetrative Visualization Ability

    ERIC Educational Resources Information Center

    Giorgis, Scott

    2015-01-01

    Three-dimensional thinking skills are extremely useful for geoscientists, and at the undergraduate level, these skills are often emphasized in structural geology courses. Google Earth is a powerful tool for visualizing the three-dimensional nature of data collected on the surface of Earth. The results of a 5 y pre- and posttest study of the…

  16. PhyloGeoViz: a web-based program that visualizes genetic data on maps.

    PubMed

    Tsai, Yi-Hsin E

    2011-05-01

    The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.
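The core computation behind such pie-chart overlays, turning per-site allele counts into wedge angles, can be sketched as follows (an illustrative reconstruction, not PhyloGeoViz's actual code):

```python
def pie_slices(counts):
    """Convert per-allele counts into (start_deg, end_deg) wedges.

    Wedges begin at 0 degrees and proceed in insertion order;
    together they span the full 360 degrees of the pie.
    """
    total = sum(counts.values())
    if total <= 0:
        raise ValueError("counts must sum to a positive number")
    slices, start = {}, 0.0
    for allele, n in counts.items():
        sweep = 360.0 * n / total
        slices[allele] = (start, start + sweep)
        start += sweep
    return slices

# e.g. three alleles sampled at one collection site
print(pie_slices({"A": 5, "B": 3, "C": 2}))
```

Each wedge would then be rendered as a polygon overlay at the site's coordinates, which is essentially what the exported Google Earth pie polygons encode.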

  17. Next-generation Digital Earth

    PubMed Central

    Goodchild, Michael F.; Guo, Huadong; Annoni, Alessandro; Bian, Ling; de Bie, Kees; Campbell, Frederick; Craglia, Max; Ehlers, Manfred; van Genderen, John; Jackson, Davina; Lewis, Anthony J.; Pesaresi, Martino; Remetey-Fülöpp, Gábor; Simpson, Richard; Skidmore, Andrew; Wang, Changlin; Woodgate, Peter

    2012-01-01

    A speech of then-Vice President Al Gore in 1998 created a vision for a Digital Earth, and played a role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many but not all the elements of this vision. The technical achievements of Google Earth, and the functionality of this first generation of virtual globes, are reviewed against the Gore vision. Meanwhile, developments in technology continue, the era of “big data” has arrived, the general public is more and more engaged with technology through citizen science and crowd-sourcing, and advances have been made in our scientific understanding of the Earth system. However, although Google Earth stimulated progress in communicating the results of science, there continue to be substantial barriers in the public’s access to science. All these factors prompt a reexamination of the initial vision of Digital Earth, and a discussion of the major elements that should be part of a next generation. PMID:22723346

  18. Google Earth as a (Not Just) Geography Education Tool

    ERIC Educational Resources Information Center

    Patterson, Todd C.

    2007-01-01

    The implementation of Geographic Information Science (GIScience) applications and discussion of GIScience-related themes are useful for teaching fundamental geographic and technological concepts. As one of the newest geographic information tools available on the World Wide Web, Google Earth has considerable potential to enhance methods for…

  19. Google Earth-Based Grand Tours of the World's Ocean Basins and Marine Sediments

    NASA Astrophysics Data System (ADS)

    St John, K. K.; De Paor, D. G.; Suranovic, B.; Robinson, C.; Firth, J. V.; Rand, C.

    2016-12-01

    The GEODE project has produced a collection of Google Earth-based marine geology teaching resources that offer grand tours of the world's ocean basins and marine sediments. We use a map of oceanic crustal ages from Müller et al (2008; doi:10.1029/2007GC001743), and a set of emergent COLLADA models of IODP drill core data as a basis for a Google Earth tour introducing students to the world's ocean basins. Most students are familiar with basic seafloor spreading patterns, but teaching experience suggests that few students have an appreciation of the number of abandoned ocean basins on Earth. Students also lack a valid visualization of the west Pacific, where the oldest crust forms an isolated triangular patch and the ocean floor becomes younger towards the subduction zones. Our tour links geographic locations to mechanical models of rifting, seafloor spreading, subduction, and transform faulting. Google Earth's built-in earthquake and volcano data are related to ocean floor patterns. Marine sediments are explored in a Google Earth tour that draws on exemplary IODP core samples of a range of sediment types (e.g., turbidites, diatom ooze). Information and links are used to connect location to sediment type. This tour complements a physical core kit of core catcher sections that can be employed for classroom instruction (geode.net/marine-core-kit/). At a larger scale, we use data from the IMLGS to explore the distribution of marine sediment types in the modern global ocean. More than 2,500 sites are plotted with access to original data. Students are guided to compare modern "type sections" of primary marine sediment lithologies, as well as examine site transects to address questions of bathymetric setting, ocean circulation, chemistry (e.g., CCD), and bioproductivity as influences on modern seafloor sedimentation. KMZ files, student exercises, and tips for instructors are available at geode.net/exploring-marine-sediments-using-google-earth.

  20. KML Tours: A New Platform for Exploring and Sharing Geospatial Data

    NASA Astrophysics Data System (ADS)

    Barcay, D. P.; Weiss-Malik, M.

    2009-12-01

    Google Earth and other virtual globes have allowed millions of people to explore the world from their own home. This technology has also raised the bar for professional visualizations: enabling interactive 3D visualizations to be created from massive datasets and shared using the KML language. For academics and professionals alike, an engaging presentation of your geospatial data is generally expected and can be the most effective form of advertisement. To that end, we released 'Touring' in Google Earth 5.0: a new medium for cinematic expression, visualized in Google Earth and written as extensions to the KML language. In a KML tour, the author has fine-grained control over the entire visual experience: precisely moving the virtual camera through the world while dynamically modifying the content, style, position, and visibility of the displayed data. An author can synchronize audio to this experience, bringing further immersion to a visualization. KML tours can help engage a broad user-base and convey subtle concepts that aren't immediately apparent in traditional geospatial content. Unlike a pre-rendered video, a KML tour maintains the rich interactivity of Google Earth, allowing users to continue exploring your content, and to mash-up other content with your visualization. This session will include conceptual explanations of the Touring feature in Google Earth, the structure of the touring KML extensions, as well as examples of compelling tours.
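A minimal gx:Tour skeleton can be generated programmatically. The sketch below (in Python, with invented tour name and coordinates) emits the touring extension elements described above — a gx:Tour containing a gx:Playlist of gx:FlyTo steps:

```python
from xml.etree import ElementTree as ET

GX = "http://www.google.com/kml/ext/2.2"   # Google's KML extension namespace
KML = "http://www.opengis.net/kml/2.2"

def make_tour(name, stops, duration=3.0):
    """Build a minimal KML document containing a gx:Tour.

    stops is a list of (lon, lat, range_m) camera targets; each becomes
    a gx:FlyTo with a LookAt, played back in sequence.
    """
    ET.register_namespace("", KML)
    ET.register_namespace("gx", GX)
    kml = ET.Element("{%s}kml" % KML)
    tour = ET.SubElement(kml, "{%s}Tour" % GX)
    ET.SubElement(tour, "{%s}name" % KML).text = name
    playlist = ET.SubElement(tour, "{%s}Playlist" % GX)
    for lon, lat, rng in stops:
        flyto = ET.SubElement(playlist, "{%s}FlyTo" % GX)
        ET.SubElement(flyto, "{%s}duration" % GX).text = str(duration)
        look = ET.SubElement(flyto, "{%s}LookAt" % KML)
        ET.SubElement(look, "{%s}longitude" % KML).text = str(lon)
        ET.SubElement(look, "{%s}latitude" % KML).text = str(lat)
        ET.SubElement(look, "{%s}range" % KML).text = str(rng)
    return ET.tostring(kml, encoding="unicode")

doc = make_tour("Ocean basins", [(-30.0, 0.0, 2e6), (150.0, 20.0, 2e6)])
print("FlyTo" in doc)
```

A real tour would also interleave gx:Wait and gx:AnimatedUpdate elements to change the visibility and style of features mid-flight, which is what gives tours their narrative quality.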

  1. A Land-Use-Planning Simulation Using Google Earth

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Cirucci, Lori

    2009-01-01

    Google Earth (GE) is proving to be a valuable tool in the science classroom for understanding the environment and making responsible environmental decisions (Bodzin 2008). GE provides learners with a dynamic mapping experience using a simple interface with a limited range of functions. This interface makes geospatial analysis accessible and…

  2. Assessing Place Location Knowledge Using a Virtual Globe

    ERIC Educational Resources Information Center

    Zhu, Liangfeng; Pan, Xin; Gao, Gongcheng

    2016-01-01

    Advances in the Google Earth virtual globe and the concomitant Keyhole Markup Language (KML) are providing educators with a convenient platform to cultivate and assess one's place location knowledge (PLK). This article presents a general framework and associated implementation methods for the online testing of PLK using Google Earth. The proposed…

  3. Teachable Moment: Google Earth Takes Us There

    ERIC Educational Resources Information Center

    Williams, Ann; Davinroy, Thomas C.

    2015-01-01

    In the current educational climate, where clearly articulated learning objectives are required, it is clear that the spontaneous teachable moment still has its place. Authors Ann Williams and Thomas Davinroy think that instructors from almost any discipline can employ Google Earth as a tool to take advantage of teachable moments through the…

  4. Automated protocols for spaceborne sub-meter resolution "Big Data" products for Earth Science

    NASA Astrophysics Data System (ADS)

    Neigh, C. S. R.; Carroll, M.; Montesano, P.; Slayback, D. A.; Wooten, M.; Lyapustin, A.; Shean, D. E.; Alexandrov, O.; Macander, M. J.; Tucker, C. J.

    2017-12-01

    The volume of available remotely sensed data has grown to exceed petabytes per year, while the costs of data, storage systems, and compute power have dropped exponentially. This has opened the door for "Big Data" processing systems with high-end computing (HEC) such as the Google Earth Engine, NASA Earth Exchange (NEX), and NASA Center for Climate Simulation (NCCS). At the same time, commercial very high-resolution (VHR) satellites have grown into a constellation with global repeat coverage that can support existing NASA Earth observing missions with stereo and super-spectral capabilities. Through agreements with the National Geospatial-Intelligence Agency, NASA Goddard Space Flight Center is acquiring petabytes of global sub-meter to 4 meter resolution imagery from the WorldView-1/2/3, QuickBird-2, GeoEye-1, and IKONOS-2 satellites. These data are available at no direct cost and are valuable for enhancing Earth observation research that supports US government interests. We are currently developing automated protocols for generating VHR products to support NASA's Earth observing missions. These include two primary foci: 1) on-demand VHR 1/2° ortho mosaics - process VHR to surface reflectance, orthorectify and co-register multi-temporal 2 m multispectral imagery compiled as user-defined regional mosaics. This will provide an easy-access dataset to investigate biodiversity, tree canopy closure, surface water fraction, and cropped area for smallholder agriculture; and 2) on-demand VHR digital elevation models (DEMs) - process stereo VHR to extract VHR DEMs with the NASA Ames Stereo Pipeline. This will benefit Earth surface studies of the cryosphere (glacier mass balance, flow rates, and snow depth), hydrology (lake/water-body levels, landslides, subsidence), and biosphere (forest structure, canopy height/cover), among others. Recent examples of products used in NASA Earth Science projects will be provided.
This HEC API could help overcome prior spatial-temporal limitations while providing broad benefits to Earth science.

  5. Population resizing on fitness improvement genetic algorithm to optimize promotion visit route based on android and google maps API

    NASA Astrophysics Data System (ADS)

    Listyorini, Tri; Muzid, Syafiul

    2017-06-01

    The promotion team of Muria Kudus University (UMK) makes annual promotion visits to several senior high schools in Indonesia. The visits cover schools in Kudus, Jepara, Demak, Rembang, and Purwodadi. To simplify the visits, each round is limited to 15 (fifteen) schools. However, the team frequently faces obstacles during the visits, particularly in determining the route to take toward a targeted school; long distances and difficult routes lead to elongated travel duration and inefficient fuel cost. To solve these problems, an application was developed using a heuristic genetic algorithm based on dynamic population size, the Population Resizing on Fitness Improvement Genetic Algorithm (PRoFIGA). This Android-based application was developed to make the visits easier and to determine a shorter route for the team, so that the visiting period will be effective and efficient. The result of this research is an Android-based application that determines the shortest route by combining the heuristic method with the Google Maps Application Programming Interface (API) and displays the route options for the team.
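The shortest-route search underlying such an application can be sketched as a plain genetic algorithm over visit orders. This is a generic GA with a fixed population size, not the PRoFIGA variant with dynamic resizing described in the paper, and the distance matrix (schools spaced along a line) is purely illustrative:

```python
import random

def route_length(route, dist):
    """Total length of a closed tour over the given distance matrix."""
    return sum(dist[route[i]][route[(i + 1) % len(route)]]
               for i in range(len(route)))

def evolve_route(dist, pop_size=60, generations=200, seed=0):
    """Evolve a short visiting order with elitism, ordered crossover,
    and swap mutation."""
    rng = random.Random(seed)
    n = len(dist)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: route_length(r, dist))
        nxt = pop[:10]                      # elitism: keep the 10 best
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:30], 2)  # parents from the better half
            i, j = sorted(rng.sample(range(n), 2))
            # ordered crossover: copy a slice of parent a, fill from b
            child = a[i:j] + [c for c in b if c not in a[i:j]]
            if rng.random() < 0.2:          # swap mutation
                x, y = rng.sample(range(n), 2)
                child[x], child[y] = child[y], child[x]
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda r: route_length(r, dist))

# Five hypothetical schools on a line; the optimal closed tour length is 8.
dist = [[abs(i - j) for j in range(5)] for i in range(5)]
best = evolve_route(dist)
print(best, route_length(best, dist))
```

In a real deployment the distance matrix would come from the Google Maps API's travel distances rather than straight-line geometry.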

  6. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided through all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), which can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable by in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend Earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.

  7. The Future of Risk Analysis: Operationalizing Living Vulnerability Assessments from the Cloud to the Street (and Back)

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Schwarz, B.; Kuhn, C.; Pandey, B.; Schank, C.; Sullivan, J.; Mahtta, R.; Hammet, L.

    2016-12-01

    21 million people are exposed to flooding every year, and that number is expected to more than double by 2030 due to climate, land use, and demographic change. Cloud to Street, a mission-driven science organization, is working to make big and real-time data more meaningful for understanding both biophysical and social vulnerability to flooding in this changing world. This talk will showcase the science and practice we have built of integrated social and biophysical flood vulnerability assessments, based on our work in Uttarakhand, India and Senegal, in conjunction with nonprofits and development banks. We will show developments of our global historical flood database, detected from MODIS and Landsat satellites, used to power machine learning flood exposure models in Google Earth Engine's API. Demonstrating the approach, we will also showcase new approaches in social vulnerability science, from developing data-driven social vulnerability indices in India, to deriving predictive models that explain the social conditions that lead to disproportionate flood damage and fatality in the US. While this talk will draw on examples of completed vulnerability assessments, we will also discuss the possible future for place-based "living" flood vulnerability assessments that are updated each time satellites circle the Earth or people add crowd-sourced observations about flood events and social conditions.

  8. Three-dimensional slum urban reconstruction in Envisat and Google Earth Egypt

    NASA Astrophysics Data System (ADS)

    Marghany, M.; Genderen, J. v.

    2014-02-01

    This study investigates the capability of ENVISAT ASAR satellite and Google Earth data for three-dimensional (3-D) slum urban reconstruction in a developing country such as Egypt. The main objective of this work is to apply an automatic 3-D detection algorithm to urban slums in ENVISAT ASAR and Google Earth images acquired over Cairo, Egypt, using the fuzzy B-spline algorithm. The results show that the fuzzy algorithm is the best indicator for chaotic urban slums, as it can discriminate them from the surrounding environment. The combination of fuzzy and B-spline algorithms was then used to reconstruct 3-D views of the urban slums. The results show that urban slums, road networks, and infrastructure are perfectly discriminated. It can therefore be concluded that the fuzzy algorithm is an appropriate algorithm for automatic detection of chaotic urban slums in ENVISAT ASAR and Google Earth data.

  9. Efficiently Communicating Rich Heterogeneous Geospatial Data from the FeMO2008 Dive Cruise with FlashMap on EarthRef.org

    NASA Astrophysics Data System (ADS)

    Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.

    2008-12-01

    EarthRef.org is a comprehensive and convenient resource for Earth science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available, and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data, ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have, therefore, been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written for reliability and speed in Adobe Flash. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map which can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser to a separate application, and Google Maps, written in JavaScript, does not scale up reliably to large datasets.
FlashMap remedies these problems as a web-based application that allows for seamless integration of the real-time display power of Google Earth and the flexibility of the web without losing scalability and control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2 compatible web-service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth the ability to load and view geospatial data content within a browser from any computer with an internet connection.

  10. Google Earth Science

    ERIC Educational Resources Information Center

    Baird, William H.; Padgett, Clifford W.; Secrest, Jeffery A.

    2015-01-01

    Google Earth has made a wealth of aerial imagery available online at no cost to users. We examine some of the potential uses of that data in illustrating basic physics and astronomy, such as finding the local magnetic declination, using landmarks such as the Washington Monument and Luxor Obelisk as gnomons, and showing how airport runways get…

  11. Using Geo-Spatial Technologies for Field Applications in Higher Geography Education

    ERIC Educational Resources Information Center

    Karatepe, Akif

    2012-01-01

    Today's important geo-spatial technologies, GIS (Geographic Information Systems), GPS (Global Positioning Systems) and Google Earth have been widely used in geography education. Transferring spatially oriented data taken by GPS to the GIS and Google Earth has provided great benefits in terms of showing the usage of spatial technologies for field…

  12. Got the World on a Screen

    ERIC Educational Resources Information Center

    Adam, Anna; Mowers, Helen

    2007-01-01

    In this article, the authors discuss how Google Earth provides more than a geography lesson. For starters, Google Earth is perfect for teaching geography. Subscribe to Where in the World, for example, and have their students listen to podcast clues in a find-the-location game created by students worldwide. Clues relate to math (the population of…

  13. A Virtual Tour of Plate Tectonics: Using Google Earth for Inquiry Investigations

    ERIC Educational Resources Information Center

    Mulvey, Bridget; Bell, Randy

    2012-01-01

    Google Earth is an exciting way to engage students in scientific inquiry--the foundation of science education standards and reforms. The National Science Education Standards identify inquiry as an active process that incorporates questioning, gathering and analyzing data, and thinking critically about the interplay of evidence and explanations.…

  14. Teaching Topographic Map Skills and Geomorphology Concepts with Google Earth in a One-Computer Classroom

    ERIC Educational Resources Information Center

    Hsu, Hsiao-Ping; Tsai, Bor-Wen; Chen, Che-Ming

    2018-01-01

    Teaching high-school geomorphological concepts and topographic map reading entails many challenges. This research reports the applicability and effectiveness of Google Earth in teaching topographic map skills and geomorphological concepts, by a single teacher, in a one-computer classroom. Compared to learning via a conventional instructional…

  15. TouchTerrain: A simple web-tool for creating 3D-printable topographic models

    NASA Astrophysics Data System (ADS)

    Hasiuk, Franciszek J.; Harding, Chris; Renner, Alex Raymond; Winer, Eliot

    2017-12-01

    An open-source web-application, TouchTerrain, was developed to simplify the production of 3D-printable terrain models. Direct Digital Manufacturing (DDM) using 3D printers can change how geoscientists, students, and stakeholders interact with 3D data, with the potential to improve geoscience communication and environmental literacy. No other manufacturing technology can convert digital data into tangible objects quickly at relatively low cost; however, the expertise necessary to produce a 3D-printed terrain model can be a substantial burden: knowledge of geographical information systems, computer aided design (CAD) software, and 3D printers may all be required. Furthermore, printing models larger than the build volume of a 3D printer can pose further technical hurdles. The TouchTerrain web-application simplifies DDM for elevation data by generating digital 3D models customized for a specific 3D printer's capabilities. The only required user input is the selection of a region-of-interest using the provided web-application with a Google Maps-style interface. Publicly available digital elevation data is processed via the Google Earth Engine API. To allow the manufacture of 3D terrain models larger than a 3D printer's build volume, the selected area can be split into multiple tiles without third-party software. This application significantly reduces the time and effort required for a non-expert like an educator to obtain 3D terrain models for use in class. The web application is deployed at http://touchterrain.geol.iastate.edu/.
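The tile-splitting step mentioned above reduces to dividing the model footprint by the printer's build-plate size. A sketch follows; the function name and plate dimensions are illustrative, not TouchTerrain's actual code:

```python
import math

def split_into_tiles(region_w_mm, region_h_mm, plate_w_mm, plate_h_mm):
    """Split a model footprint into equal tiles that each fit the
    printer's build plate; returns (cols, rows, tile_w, tile_h)."""
    cols = math.ceil(region_w_mm / plate_w_mm)
    rows = math.ceil(region_h_mm / plate_h_mm)
    return cols, rows, region_w_mm / cols, region_h_mm / rows

# A 400 x 300 mm terrain model on a 220 x 220 mm build plate:
print(split_into_tiles(400, 300, 220, 220))  # → (2, 2, 200.0, 150.0)
```

Each tile would then be extruded from its DEM subgrid into a separate STL file for printing, and the pieces assembled after printing.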

  16. A Java-based tool for creating KML files from GPS waypoints

    NASA Astrophysics Data System (ADS)

    Kinnicutt, P. G.; Rivard, C.; Rimer, S.

    2008-12-01

    Google Earth provides a free tool with powerful capabilities for visualizing geoscience images and data. Commercial software tools exist for doing sophisticated digitizing and spatial modeling, but for the purposes of presentation, visualization, and overlaying aerial images with data, Google Earth provides much of the functionality. Likewise, with current technologies in GPS (Global Positioning System) systems and with Google Earth Plus, it is possible to upload GPS waypoints, tracks, and routes directly into Google Earth for visualization. However, older technology GPS units and even low-cost GPS units found today may lack the necessary communications interface to a computer (e.g. no Bluetooth, no WiFi, no USB, no Serial, etc.) or may have an incompatible interface, such as a Serial port but no USB adapter available. In such cases, any waypoints, tracks, and routes saved in the GPS unit or recorded in a field notebook must be manually transferred to a computer for use in a GIS system or other program. This presentation describes a Java-based tool developed by the author which enables users to enter GPS coordinates in a user-friendly manner, then save these coordinates in a Keyhole Markup Language (KML) file format for visualization in Google Earth. This tool either accepts user-interactive input or accepts input from a CSV (Comma Separated Value) file, which can be generated from any spreadsheet program. This tool accepts input in the form of lat/long or UTM (Universal Transverse Mercator) coordinates. This presentation describes the system's applicability through several small case studies. This free and lightweight tool simplifies the task of manually inputting GPS data into Google Earth for people working in the field without an automated mechanism for uploading the data; for instance, the user may not have internet connectivity or may not have the proper hardware or software.
Since it is a Java application and not a web-based tool, it can be installed on one's field laptop and the GPS data can be manually entered without the need for internet connectivity. This tool provides a table view of the GPS data, but lacks a KML viewer to view the data overlain on top of an aerial view, as this viewer functionality is provided in Google Earth. The tool's primary contribution lies in its more convenient method for entering the GPS data manually when automated technologies are not available.
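The conversion such a tool performs, CSV waypoint rows to KML placemarks, can be sketched in a few lines (shown here in Python rather than the tool's Java, with assumed column names; note that KML expects lon,lat coordinate order, the reverse of the lat/long order usually recorded from a GPS unit):

```python
import csv, io

def csv_to_kml(csv_text):
    """Convert rows of name,lat,lon into a KML document of Placemarks."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
    for row in csv.DictReader(io.StringIO(csv_text)):
        out.append(
            "<Placemark><name>{name}</name><Point>"
            "<coordinates>{lon},{lat},0</coordinates>"  # lon,lat[,alt] order
            "</Point></Placemark>".format(**row))
    out.append("</Document></kml>")
    return "\n".join(out)

# Hypothetical field waypoints recorded in a notebook:
waypoints = "name,lat,lon\nOutcrop A,44.47,-73.21\nOutcrop B,44.48,-73.19\n"
print(csv_to_kml(waypoints))
```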

  17. Mars @ ASDC

    NASA Astrophysics Data System (ADS)

    Carraro, Francesco

    "Mars @ ASDC" is a project born with the goal of using new web technologies to assist researchers involved in the study of Mars. The project employs the Mars map and JavaScript APIs provided by Google to visualize data acquired by space missions on the planet. So far, visualization of tracks acquired by MARSIS and of regions observed by VIRTIS-Rosetta has been implemented. The main reason for the creation of this kind of tool is the difficulty of handling hundreds or thousands of acquisitions, like the ones from MARSIS, and the consequent difficulty of finding observations related to a particular region. This led to the development of a tool which allows users to search for acquisitions either by defining the region of interest through a set of geometrical parameters or by manually selecting the region on the map with a few mouse clicks. The system allows the visualization of tracks (acquired by MARSIS) or regions (acquired by VIRTIS-Rosetta) which intersect the user-defined region. MARSIS tracks can be visualized in both Mercator and polar projections, while the regions observed by VIRTIS can presently be visualized only in Mercator projection. The Mercator projection is the standard map provided by Google; the polar projections are provided by NASA and have been developed to be used in combination with the APIs provided by Google. The whole project has been developed following the "open source" philosophy: the client-side code which handles the functioning of the web page is written in JavaScript; the server-side code which executes the searches for tracks or regions is written in PHP; and the database underlying the system is MySQL.
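In its simplest form, the region-of-interest search described here is a per-footprint bounding-box intersection test. A sketch follows; the track IDs and footprints are invented, real MARSIS tracks are ground-track polylines rather than boxes, and longitude wrap-around is ignored:

```python
def boxes_intersect(a, b):
    """Axis-aligned intersection test for boxes given as
    (lon_min, lat_min, lon_max, lat_max)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def find_tracks(region, tracks):
    """Return the IDs of all track footprints intersecting the region."""
    return [tid for tid, box in tracks.items()
            if boxes_intersect(region, box)]

# Hypothetical track footprints (lon/lat degrees) and a user-drawn region:
tracks = {"orbit_2896": (-10, 30, -5, 60), "orbit_3110": (40, -20, 45, 10)}
print(find_tracks((-8, 40, 0, 50), tracks))  # → ['orbit_2896']
```

Server-side, the same filter would typically be expressed as a SQL WHERE clause over stored footprint extents, with a finer geometric test applied to the candidates it returns.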

  18. OpenSearch (ECHO-ESIP) & REST API for Earth Science Data Access

    NASA Astrophysics Data System (ADS)

    Mitchell, A.; Cechini, M.; Pilone, D.

    2010-12-01

    This presentation will provide a brief technical overview of OpenSearch, the Earth Science Information Partners (ESIP) Federated Search framework, and the REST architecture; discuss NASA’s Earth Observing System (EOS) ClearingHOuse’s (ECHO) implementation lessons learned; and demonstrate the simplified usage of these technologies. SOAP, as a framework for web service communication has numerous advantages for Enterprise applications and Java/C# type programming languages. As a technical solution, SOAP has been a reliable framework on top of which many applications have been successfully developed and deployed. However, as interest grows for quick development cycles and more intriguing “mashups,” the SOAP API loses its appeal. Lightweight and simple are the vogue characteristics that are sought after. Enter the REST API architecture and OpenSearch format. Both of these items provide a new path for application development addressing some of the issues unresolved by SOAP. ECHO has made available all of its discovery, order submission, and data management services through a publicly accessible SOAP API. This interface is utilized by a variety of ECHO client and data partners to provide valuable capabilities to end users. As ECHO interacted with current and potential partners looking to develop Earth Science tools utilizing ECHO, it became apparent that the development overhead required to interact with the SOAP API was a growing barrier to entry. ECHO acknowledged the technical issues that were being uncovered by its partner community and chose to provide two new interfaces for interacting with the ECHO metadata catalog. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. Leveraging these two items, a client (ECHO-ESIP) was developed with a focus on simplified searching and results presentation. The second interface is built upon the Representational State Transfer (REST) architecture. 
    Leveraging the REST architecture, a new API has been made available that provides access to the entire SOAP API suite of services. The results of these development activities have not only positioned ECHO to engage in the thriving world of mashup applications, but also provided an excellent real-world case study of how to successfully leverage these emerging technologies.
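
    The OpenSearch pattern described above can be illustrated with a small sketch. This is not ECHO's actual interface; the endpoint, template, and parameter names below are hypothetical, but the substitution of `{searchTerms}`-style placeholders follows the OpenSearch URL-template convention:

```python
import re
from urllib.parse import quote

def fill_opensearch_template(template, **params):
    """Substitute {name} / {name?} placeholders in an OpenSearch URL template.
    Unfilled optional parameters (the '?'-suffixed ones) become empty strings,
    which the OpenSearch spec permits; missing required ones raise KeyError."""
    def repl(match):
        name, optional = match.group(1), match.group(2) == "?"
        if name in params:
            return quote(str(params[name]), safe="")
        if optional:
            return ""
        raise KeyError(f"required OpenSearch parameter missing: {name}")
    return re.sub(r"\{(\w+)(\??)\}", repl, template)

# Hypothetical endpoint and template, for illustration only:
template = ("https://echo.example.gov/opensearch/granules"
            "?keyword={searchTerms}&page={startPage?}&hits={count?}")
url = fill_opensearch_template(template, searchTerms="MODIS snow cover", count=10)
```

    A client needs only this template substitution plus an HTTP GET, which is the low barrier to entry the abstract contrasts with SOAP tooling.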

  19. Creating a Geo-Referenced Bibliography with Google Earth and Geocommons: The Coos Bay Bibliography

    ERIC Educational Resources Information Center

    Schmitt, Jenni; Butler, Barb

    2012-01-01

    We compiled a geo-referenced bibliography of research including theses, peer-reviewed articles, agency literature, and books having sample collection sites in and around Coos Bay, Oregon. Using Google Earth and GeoCommons we created a map that allows users such as visiting researchers, faculty, students, and local agencies to identify previous…

  20. Visualizing Mars data and imagery with Google Earth

    NASA Astrophysics Data System (ADS)

    Beyer, R. A.; Broxton, M.; Gorelick, N.; Hancher, M.; Lundy, M.; Kolb, E.; Moratto, Z.; Nefian, A.; Scharff, T.; Weiss-Malik, M.

    2009-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Mars Web site allowed users to view base maps of Mars via the Web, but it did not have the full features of the 3D Google Earth client. We have previously demonstrated the use of Google Earth to display Mars imagery, but now with the launch of Mars in Google Earth, there is a base set of Mars data available for anyone to work from and add to. There are a variety of global maps to choose from and display. The Terrain layer has the MOLA gridded data topography, and where available, HRSC terrain models are mosaicked into the topography. In some locations there is also meter-scale terrain derived from HiRISE stereo imagery. There is rich information in the form of the IAU nomenclature database, data for the rovers and landers on the surface, and a Spacecraft Imagery layer which contains the image outlines for all HiRISE, CTX, CRISM, HRSC, and MOC image data released to the PDS and links back to their science data. There are also features like the Traveler's Guide to Mars, Historic Maps, Guided Tours, as well as the 'Live from Mars' feature, which shows the orbital tracks of both the Mars Odyssey and Mars Reconnaissance Orbiter for a few days in the recent past. It shows where they have acquired imagery, and also some preview image data. 
These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections—in geological context and within a single user interface—are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.

  1. Global positioning system & Google Earth in the investigation of an outbreak of cholera in a village of Bengaluru Urban district, Karnataka.

    PubMed

    Masthi, N R Ramesh; Madhusudan, M; Puthussery, Yannick P

    2015-11-01

    The global positioning system (GPS) technology along with Google Earth is used to measure (spatially map) the accurate distribution of morbidity and mortality and to plan interventions in the community. We used this technology to find out its role in the investigation of a cholera outbreak, and also to identify the cause of the outbreak. This study was conducted in a village near Bengaluru, Karnataka, in June 2013 during a cholera outbreak. A house-to-house survey was done to identify acute watery diarrhoea cases. A hand-held GPS receiver was used to record the north and east coordinates of the households of cases, and these values were subsequently plotted on a Google Earth map. Water samples were collected from suspected sources for microbiological analysis. A total of 27 cases of acute watery diarrhoea were reported. Fifty per cent of cases were in the age group of 14-44 yr, and one death was reported. GPS technology and Google Earth described the accurate location of the households of cases, and the spot map generated showed clustering of cases around the suspected water sources. The attack rate was 6.92 per cent and the case fatality rate was 3.7 per cent. Water samples collected from suspected sources showed the presence of Vibrio cholerae O1 Ogawa. GPS technology and Google Earth were easy to use and helpful in accurately pinpointing the locations of case households, constructing the spot map, and following up cases. The outbreak was found to be due to contamination of drinking water sources.
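
    The workflow described here — recording coordinates with a handheld GPS receiver and plotting them in Google Earth — amounts to writing KML Placemarks. A minimal sketch (the helper name and the coordinates are illustrative, not taken from the study):

```python
def households_to_kml(cases):
    """Render (label, lat, lon) GPS fixes as KML Placemarks. KML expects
    'lon,lat' coordinate order, the reverse of how a handheld GPS
    receiver typically reports a fix."""
    placemarks = "\n".join(
        f"  <Placemark><name>{label}</name>"
        f"<Point><coordinates>{lon},{lat}</coordinates></Point></Placemark>"
        for label, lat, lon in cases
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            f"{placemarks}\n</Document>\n</kml>\n")

# Hypothetical coordinates near Bengaluru, not the study's case data:
kml = households_to_kml([("case-01", 13.10, 77.59), ("case-02", 13.11, 77.60)])
```

    Opening the resulting file in Google Earth yields the kind of spot map used in the outbreak investigation.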

  2. Utilization of Google Earth for Distribution Mapping of Cholangiocarcinoma: a Case Study in Satuek District, Buriram, Thailand.

    PubMed

    Rattanasing, Wannaporn; Kaewpitoon, Soraya J; Loyd, Ryan A; Rujirakul, Ratana; Yodkaw, Eakachai; Kaewpitoon, Natthawut

    2015-01-01

    Cholangiocarcinoma (CCA) is a serious public health problem in the Northeast of Thailand. CCA is considered an incurable and rapidly lethal disease. Knowledge of the distribution of CCA patients is necessary for management strategies. This study aimed to utilize a geographic information system and Google Earth™ for distribution mapping of cholangiocarcinoma in Satuek District, Buriram, Thailand, during a 5-year period (2008-2012). In this retrospective study, data were collected and reviewed from the OPD cards; definitive cases of CCA were patients who were treated in Satuek hospital and diagnosed with CCA (ICD-10 code C22.1). CCA cases were analyzed with ArcGIS 9.2, and all of the data were imported into Google Earth using the online web page www.earthpoint.us. Data were displayed at village points. A total of 53 cases were diagnosed and identified as CCA. The incidence was 53.57 per 100,000 population (65.5 for males and 30.8 for females), and the majority of CCA cases were in stages IV and IIA. The average age was 67 years. The highest attack rate was observed in Thung Wang sub-district (161.4 per 100,000 population). The map display at village points for CCA patients based on Google Earth gave a clear visual distribution. CCA is still a major problem in Satuek district, Buriram province of Thailand. The Google Earth production process is very simple and easy to learn. It is suitable for use in further development of CCA management strategies.

  3. Accuracy comparison in mapping water bodies using Landsat images and Google Earth Images

    NASA Astrophysics Data System (ADS)

    Zhou, Z.; Zhou, X.

    2016-12-01

    Much research has been done on the extraction of water bodies from satellite images. Water indices computed from multi-spectral images are the most widely used methods for water body extraction. When extracting the area of water bodies from satellite images, accuracy may depend on the spatial resolution of the images and the relative size of the water bodies. To quantify the impact of spatial resolution and size (major and minor lengths) of the water bodies on the accuracy of water area extraction, we use Georgetown Lake, Montana and coalbed methane (CBM) water retention ponds in the Montana Powder River Basin as test sites. Data sources used include Landsat images and Google Earth images covering both large water bodies and small ponds. First, we used water indices to extract water coverage from Landsat images for both the large lake and the small ponds. Second, we used a newly developed visible-index method to extract water coverage from Google Earth images covering both. Third, we used an image fusion method in which the Google Earth images are fused with multi-spectral Landsat images to obtain multi-spectral images at the same high spatial resolution as the Google Earth images. The actual areas of the lake and ponds were measured using GPS surveys. Results will be compared and the optimal method will be selected for water body extraction.
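
    As a concrete illustration of the water-index approach, the widely used McFeeters NDWI can be computed from the green and near-infrared bands and thresholded to produce a water mask. This is a generic sketch, not the authors' code; the toy reflectance values are invented:

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """McFeeters NDWI = (green - NIR) / (green + NIR); pixels above the
    threshold are flagged as water. For Landsat 8, bands 3 (green) and
    5 (NIR) would be typical inputs."""
    green, nir = green.astype(float), nir.astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        ndwi = np.where(green + nir == 0, 0.0, (green - nir) / (green + nir))
    return ndwi > threshold

def water_area_m2(mask, pixel_size_m=30.0):
    """Boolean water mask -> area, assuming square pixels (30 m for Landsat)."""
    return float(mask.sum()) * pixel_size_m ** 2

# Toy 2x2 reflectance scene: left column water-like (green >> NIR), right land.
green = np.array([[0.30, 0.10], [0.28, 0.12]])
nir   = np.array([[0.05, 0.40], [0.06, 0.35]])
mask = ndwi_water_mask(green, nir)
```

    The area step makes the abstract's point tangible: at 30 m pixels, a pond only a few pixels across is quantized coarsely, which is why higher-resolution Google Earth imagery can improve the estimate.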

  4. The Effect of Google Earth and Wiki Models on Oral Presentation Skills of University EFL Learners

    ERIC Educational Resources Information Center

    Awada, Ghada; Diab, Hassan B.

    2018-01-01

    This article reports the results of an experimental study that investigated the effectiveness of Google Earth and Wiki tools in improving the oral presentation skills of English as a Foreign Language (EFL) learners and boosting their motivation for learning. The participants (n =81) are enrolled in writing classes at two English-medium…

  5. Utilizing Google Earth to Teach Students about Global Oil Spill Disasters

    ERIC Educational Resources Information Center

    Guertin, Laura; Neville, Sara

    2011-01-01

    The United States is currently experiencing its worst man-made environmental disaster, the BP Deepwater Horizon oil leak. The Gulf of Mexico oil spill is severe in its impact, but it is only one of several global oil spill disasters in history. Students can utilize the technology of Google Earth to explore the spatial and temporal distribution of…

  6. KML-based teaching lessons developed by Google in partnership with the University of Alaska.

    NASA Astrophysics Data System (ADS)

    Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.

    2008-12-01

    The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In this poster and demonstration, we will show our KML-based science lessons, developed in partnership with the University of Alaska and used by our team in classroom teaching with Alaskan high-school students.

  7. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as, Centaurus , Evidence Data Base (EDB), etc., others were fabricated, such as INS and FED, while others contain data from the open...Google Web Report Unlimited SOAP API News BBC News Unlimited WEB RSS 1.0 Centaurus Person Demographics 204,402 people from 240 countries...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  8. 3D Orbit Visualization for Earth-Observing Missions

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Plesea, Lucian; Chafin, Brian G.; Weiss, Barry H.

    2011-01-01

    This software visualizes orbit paths for the Orbiting Carbon Observatory (OCO), but was designed to be general and applicable to any Earth-observing mission. The software uses the Google Earth user interface to provide a visual mechanism to explore spacecraft orbit paths, ground footprint locations, and local cloud cover conditions. In addition, a drill-down capability allows users to point and click on a particular observation frame to pop up ancillary information such as data product filenames and directory paths, latitude, longitude, time stamp, column-averaged dry air mole fraction of carbon dioxide, and solar zenith angle. This software can be integrated with the ground data system for any Earth-observing mission to automatically generate daily orbit path data products in Google Earth KML format. These KML data products can be directly loaded into the Google Earth application for interactive 3D visualization of the orbit paths for each mission day. Each time the application runs, the daily orbit paths are encapsulated in a KML file for each mission day since the last time the application ran. Alternatively, the daily KML for a specified mission day may be generated. The application automatically extracts the spacecraft position and ground footprint geometry as a function of time from a daily Level 1B data product created and archived by the mission's ground data system software. In addition, ancillary data, such as the column-averaged dry air mole fraction of carbon dioxide and solar zenith angle, are automatically extracted from a Level 2 mission data product. Zoom, pan, and rotate capabilities are provided through the standard Google Earth interface. Cloud cover is indicated with an image layer from the MODIS (Moderate Resolution Imaging Spectroradiometer) instrument aboard the Aqua satellite, which is automatically retrieved from JPL's OnEarth Web service.
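
    The daily orbit-path KML products described above can be sketched as a LineString with absolute altitude. The sample positions below are hypothetical, not real OCO ephemeris, and the helper is illustrative rather than the mission software:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def orbit_track_kml(samples):
    """Build a KML LineString from (lon_deg, lat_deg, alt_m) spacecraft
    positions, with absolute altitude so the path is drawn at orbit
    height rather than draped onto the terrain."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = "daily orbit path"
    line = ET.SubElement(pm, f"{{{KML_NS}}}LineString")
    ET.SubElement(line, f"{{{KML_NS}}}altitudeMode").text = "absolute"
    ET.SubElement(line, f"{{{KML_NS}}}coordinates").text = " ".join(
        f"{lon},{lat},{alt}" for lon, lat, alt in samples
    )
    return ET.tostring(kml, encoding="unicode")

# Hypothetical positions, not real OCO ephemeris:
kml_text = orbit_track_kml([(-120.0, 34.0, 705000), (-119.5, 37.5, 705000)])
```

    The per-frame drill-down described in the abstract would hang a description balloon off each Placemark in the same document.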

  9. GEOG 342: Exploring the Virtual Earth

    NASA Astrophysics Data System (ADS)

    Bailey, J. E.; Sfraga, M.

    2007-12-01

    First attributed to Eratosthenes around 200 BC, the word "geography" is derived from Greek words meaning "Earth" and "to describe". It describes the study of our planet, its features, inhabitants, and phenomena. The term "neogeography" is, put simply, new geography, where "new" refers to more than just practices that are new in usage. Methodologies of neogeography tend toward the intuitive, personal, artistic or even absurd, and generally don't conform to traditional protocols and boundaries. Mapping and spatial technologies such as geobrowsers are typical of the tools used by neogeographers. Much of the success of geobrowsers can be attributed to the fact that they use the methods and technologies of neogeography to provide a better understanding of traditional topics of geography. The Geography program at the University of Alaska Fairbanks is embracing these new methodologies by offering a new class that explores the world around us through the use of geobrowsers and other Web 2.0 technologies. Students will learn to use Keyhole Markup Language (KML), the Google Maps API, SketchUp and a range of virtual globe programs, primarily through geospatial datasets from the Earth sciences. A special focus will be given to datasets that look at the environments and natural hazards that make Alaska such a unique landscape. The role of forums, wikis and blogs in the expansion of the Geoweb will be explored, and students will be encouraged to be active on these websites. Students will also explore Second Life, the concept of which will be introduced through the class text, Neal Stephenson's "Snow Crash". The primary goal of the class is to encourage students to undertake their own explorations of virtual Earths, in order to better understand the physical and social structure of the real world.

  10. Mean composite fire severity metrics computed with Google Earth engine offer improved accuracy and expanded mapping potential

    Treesearch

    Sean A. Parks; Lisa M. Holsinger; Morgan A. Voss; Rachel A. Loehman; Nathaniel P. Robinson

    2018-01-01

    Landsat-based fire severity datasets are an invaluable resource for monitoring and research purposes. These gridded fire severity datasets are generally produced with pre- and post-fire imagery to estimate the degree of fire-induced ecological change. Here, we introduce methods to produce three Landsat-based fire severity metrics using the Google Earth Engine (GEE)...

  11. Google Sky: A Digital View of the Night Sky

    NASA Astrophysics Data System (ADS)

    Connolly, A.; Scranton, R.; Ornduff, T.

    2008-11-01

    From its inception, astronomy has been a visual science: from careful naked-eye observations of the sky, to the use of telescopes and photographs to map the distribution of stars and galaxies, to the current era of digital cameras that can image the sky over many decades of the electromagnetic spectrum. Sky in Google Earth (http://earth.google.com) and Google Sky (http://www.google.com/sky) continue this tradition, providing an intuitive visual interface to some of the largest astronomical imaging surveys of the sky. By streaming multi-color imagery, catalogs, and time-domain data, and by annotating interesting astronomical sources and events with placemarks, podcasts and videos, Sky provides a panchromatic view of the universe accessible to anyone with a computer. Beyond simple exploration of the sky, Google Sky enables users to create and share content with others around the world. With an open interface available on Linux, Mac OS X and Windows, and translations of the content into over 20 different languages, we present Sky as the embodiment of a virtual telescope for discovering and sharing the excitement of astronomy and science as a whole.

  12. Elevation data fitting and precision analysis of Google Earth in road survey

    NASA Astrophysics Data System (ADS)

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

    Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, the paper focuses on finding several different fitting or interpolation methods to improve the data precision, in order to meet the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of public points, the elevation difference at any other point can be fitted or interpolated. Thus, a precise elevation can be obtained by subtracting the elevation difference from the Google Earth data. The quadratic polynomial surface fitting method, the cubic polynomial surface fitting method, the V4 interpolation method in MATLAB, and a neural network method are used to process the Google Earth elevation data. Internal conformity, external conformity and the cross-correlation coefficient are used as evaluation indexes of the data processing effect. Results: There is no fitting difference at the fitting points when the V4 interpolation method is used, but its external conformity is the largest and its accuracy improvement is the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to that of the cubic polynomial surface fitting method, but it fits better where the elevation differences are larger. Because the neural network method is a less manageable fitting model, the cubic polynomial surface fitting method should be the main method, with the neural network method as an auxiliary in cases of larger elevation differences. 
    Conclusions: The cubic polynomial surface fitting method can markedly improve the precision of Google Earth elevation data. After the precision improvement, the error of the data in hilly terrain areas meets the requirements of the specifications, and the data can be used in the feasibility study stage of road survey and design.
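
    The surface-fitting idea — fit the elevation differences at known control points, then subtract the fitted difference from the Google Earth elevation elsewhere — can be sketched with an ordinary least-squares cubic surface. This is a generic reconstruction, not the authors' MATLAB code, and the control-point data are synthetic:

```python
import numpy as np

def cubic_design(x, y):
    """Design matrix for a full bivariate cubic: terms x**i * y**j, i + j <= 3."""
    cols = [x**i * y**j for i in range(4) for j in range(4 - i)]
    return np.column_stack(cols)

def fit_elevation_correction(x, y, dh):
    """Least-squares cubic surface through elevation differences dh
    (surveyed minus Google Earth) at control points (x, y). Returns a
    callable giving the fitted correction at query points."""
    coeffs, *_ = np.linalg.lstsq(cubic_design(x, y), dh, rcond=None)
    return lambda xq, yq: cubic_design(np.atleast_1d(xq), np.atleast_1d(yq)) @ coeffs

# Synthetic control points whose difference surface is exactly cubic-representable:
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 1, 40), rng.uniform(0, 1, 40)
dh = 2.0 + 0.5 * x - 0.3 * y * y + 0.1 * x * y
correct = fit_elevation_correction(x, y, dh)
```

    Subtracting `correct(xq, yq)` from a Google Earth elevation at `(xq, yq)` would then give the corrected value; internal/external conformity correspond to residuals at the control points and at held-out checkpoints, respectively.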

  13. How Would You Move Mount Fuji - And Why Would You Want To?

    NASA Astrophysics Data System (ADS)

    de Paor, D. G.

    2008-12-01

    According to author William Poundstone, "How Would You Move Mt Fuji?" typifies the kind of question that corporations such as Microsoft are wont to ask job applicants in order to test their lateral thinking skills. One answer (albeit not one that would necessarily secure a job at Microsoft) is: "With Google Earth and a Macintosh or PC." The answer to the more profound follow-up question "Why Would You Want To?" is hinted at by one of the great quotations of earth science, namely Charles Lyell's proposition that "The Present Is Key to the Past." Google Earth is a phenomenally powerful tool for visualizing today's earth, ocean, and atmosphere. With the aid of Google SketchUp, that visualization can be extended to reconstruct the past, using relocated samples of present-day landscapes and environments as models of paleo-DEM and paleogeography. Volcanoes are particularly useful models because their self-similar growth can be simulated by changing KML altitude tags within a timespan, but numerous other landforms and geologic structures serve as useful keys to the past. Examples range in scale from glaciers and fault scarps to island arcs and mountain ranges. The ability to generate a paleo-terrain model in Google Earth brings us one step closer to a truly four-dimensional, interactive geological map of the world throughout time.
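
    The KML trick mentioned above — stepping a feature's altitude inside successive TimeSpans so Google Earth's time slider animates growth — might look like the following sketch. The function name, step count, and year scheme are illustrative, not taken from the abstract:

```python
def volcano_growth_kml(lon, lat, summit_alt_m, steps, start_year=2000):
    """Emit one KML Placemark per growth step, each visible only during
    its TimeSpan and with the summit altitude scaled up linearly, so
    playing the time slider animates the edifice 'growing'."""
    placemarks = []
    for k in range(1, steps + 1):
        alt = summit_alt_m * k / steps
        placemarks.append(
            "  <Placemark>\n"
            f"    <TimeSpan><begin>{start_year + k - 1}</begin>"
            f"<end>{start_year + k}</end></TimeSpan>\n"
            f"    <Point><altitudeMode>absolute</altitudeMode>"
            f"<coordinates>{lon},{lat},{alt:.0f}</coordinates></Point>\n"
            "  </Placemark>")
    return ('<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
            + "\n".join(placemarks) + "\n</Document></kml>")

# Hypothetical: a 3-step growth sequence for a Fuji-like 3776 m summit.
kml = volcano_growth_kml(138.73, 35.36, 3776, steps=3)
```

    A full reconstruction would swap the Point for a SketchUp model or polygon at each step; the TimeSpan/altitude mechanics are the same.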

  14. Everglades Ecological Forecasting II: Utilizing NASA Earth Observations to Enhance the Capabilities of Everglades National Park to Monitor & Predict Mangrove Extent to Aid Current Restoration Efforts

    NASA Technical Reports Server (NTRS)

    Kirk, Donnie; Wolfe, Amy; Ba, Adama; Nyquist, Mckenzie; Rhodes, Tyler; Toner, Caitlin; Cabosky, Rachel; Gotschalk, Emily; Gregory, Brad; Kendall, Candace

    2016-01-01

    Mangroves act as a transition zone between fresh- and salt-water habitats by filtering and indicating salinity levels along the coast of the Florida Everglades. However, dredging and canals built in the early 1900s depleted the Everglades of much of its freshwater resources. In an attempt to assist in maintaining the health of threatened habitats, efforts have been made within Everglades National Park to rebalance the ecosystem and sustainably manage mangrove forests. The Everglades Ecological Forecasting II team utilized the Google Earth Engine API and satellite imagery from Landsat 5, 7, and 8 to create land-change maps over a 25-year period and to allow park officials to continue producing maps in the future. To make the process replicable for project partners at Everglades National Park, the team used a supervised classification approach to map mangrove regions in 1995, 2000, 2005, 2010 and 2015. As freshwater was depleted, mangroves encroached further inland and freshwater marshes declined. The current extent map, along with the transition maps, helped create forecasting models that show mangrove encroachment continuing further inland in the year 2030. This project highlights the changes to Everglades habitats in relation to a changing climate and hydrological changes throughout the park.
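
    The land-change mapping step can be illustrated independently of Earth Engine: given two classified rasters, a per-pixel transition code identifies where one class replaced another. The class codes and arrays below are toy values, not the team's actual Earth Engine script:

```python
import numpy as np

MARSH, MANGROVE = 1, 2  # toy class codes, not the team's legend

def transition_map(before, after):
    """Encode per-pixel class change between two classified epochs as
    10*before + after, so e.g. code 12 means marsh -> mangrove."""
    return 10 * before + after

before = np.array([[MARSH, MARSH], [MANGROVE, MARSH]])
after  = np.array([[MANGROVE, MARSH], [MANGROVE, MANGROVE]])
trans = transition_map(before, after)
# Pixels where mangrove encroached on freshwater marsh between epochs:
encroached = int((trans == 10 * MARSH + MANGROVE).sum())
```

    Summing such encroachment counts per epoch pair is what feeds a simple trend-based forecast of future mangrove extent.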

  15. Towards a geospatial wikipedia

    NASA Astrophysics Data System (ADS)

    Fritz, S.; McCallum, I.; Schill, C.; Perger, C.; Kraxner, F.; Obersteiner, M.

    2009-04-01

    Based on the Google Earth (http://earth.google.com) platform we have developed a geospatial wikipedia (geo-wiki.org). The tool allows anybody in the world to contribute to spatial validation and is made available to the internet community interested in that task. We illustrate how this tool can be used for different applications. In our first application we combine uncertainty hotspot information from three global land cover datasets (GLC, MODIS, GlobCover). With an ever increasing amount of high resolution imagery available on Google Earth, it is becoming increasingly possible to distinguish land cover features with a high degree of accuracy. We first direct the land cover validation community to certain hotspots of land cover uncertainty and then ask them to fill in a small popup menu on the type of land cover, optionally a picture at that location facing the different cardinal points, as well as the date and the type of validation chosen (Google Earth imagery/Panoramio, or the person's own ground-truth data). We have implemented the tool via a land cover validation community on Facebook, based on a snowball system that allows the tracking of individuals and the ability to ignore users who misuse the system. In a second application we illustrate how the tool could be used for mapping malaria occurrence and small water bodies, as well as overall malaria risk. For this application we have implemented polygon and attribute functions using Google Maps along with Virtual Earth, via OpenLayers. The third application deals with illegal logging and how an alert system for detecting illegal logging within a certain land tenure system could be implemented. Here we show how the tool can be used to document illegal logging via a YouTube video.

  16. Predicting the performance of local seismic networks using Matlab and Google Earth.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul

    2009-11-01

    We have used Matlab and Google Earth to construct a prototype application for modeling the performance of local seismic networks for monitoring small, contained explosions. Published equations based on refraction experiments provide estimates of peak ground velocities as a function of event distance and charge weight. Matlab routines implement these relations to calculate the amplitudes across a network of stations from sources distributed over a geographic grid. The amplitudes are then compared to ambient noise levels at the stations, and scaled to determine the smallest yield that could be detected at each source location by a specified minimum number of stations. We use Google Earth as the primary user interface, both for positioning the stations of a hypothetical local network and for displaying the resulting detection threshold contours.
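
    The thresholding logic described — scale amplitudes to station noise, then take the smallest yield seen by a minimum number of stations — can be sketched as follows. The amplitude-distance relation and its constants (k, a, b, snr) are placeholders, not the published refraction-experiment values the abstract refers to:

```python
def min_detectable_yield(dists_km, noise, n_required=3,
                         k=1.0, a=0.8, b=1.6, snr=2.0):
    """Smallest charge weight W detectable by at least n_required stations,
    assuming a peak-velocity scaling law v = k * W**a / r**b and a
    detection criterion v >= snr * noise at each station. The n-th
    smallest per-station threshold is the network-wide threshold."""
    per_station = sorted(
        (snr * nz * r ** b / k) ** (1.0 / a)
        for r, nz in zip(dists_km, noise)
    )
    return per_station[n_required - 1]

# Hypothetical 4-station network evaluated at one source grid point:
w = min_detectable_yield([5.0, 8.0, 12.0, 20.0],
                         [1e-3, 1e-3, 5e-4, 1e-3], n_required=3)
```

    Evaluating this at every grid point and contouring the results is what the prototype hands to Google Earth for display.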

  17. BioMon: A Google Earth Based Continuous Biomass Monitoring System (Demo Paper)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju

    2009-01-01

    We demonstrate a novel Google Earth-based visualization system for continuous monitoring of biomass at regional and global scales. This system is integrated with a back-end spatiotemporal data mining system that continuously detects changes using high-temporal-resolution MODIS images. In addition to the visualization, we demonstrate novel query features of the system that provide insights into the current conditions of the landscape.

  18. Finding the Shape of Space

    DTIC Science & Technology

    2011-07-01

    currently valid OMB control number. 1. REPORT DATE JUL 2011 2. REPORT TYPE 3. DATES COVERED 00-00-2011 to 00-00-2011 4. TITLE AND SUBTITLE...1 Lt Col Christopher C. Shannon Maj Tosha N. Meredith 2 GOOGLE EARTH TUBE: PROSPECTS FOR FULL MOTION VIDEO FROM SPACE . . . . . . . 5...Google Earth Tube,” a virtual environment that provides an extraordinary amount of information to whoever accesses it, sets the stage for improved

  19. Optimum Antenna Configuration for Maximizing Access Point Range of an IEEE 802.11 Wireless Mesh Network in Support of Multi-Mission Operations Relative to Hastily Formed Scalable Deployments

    DTIC Science & Technology

    2007-09-01

    Configuration Consideration ...........................54 C. MAE NGAT DAM, CHIANG MAI , THAILAND, FIELD EXPERIMENT...2006 802.11 Network Topology Mae Ngat Dam, Chiang Mai , Thailand.......................39 Figure 31. View of COASTS 2006 802.11 Topology...Requirements (Background From Google Earth).....62 Figure 44. Mae Ngat Dam, Chiang Mai , Thailand (From Google Earth

  20. Naive (commonsense) geography and geobrowser usability after ten years of Google Earth

    NASA Astrophysics Data System (ADS)

    Hamerlinck, J. D.

    2016-04-01

    In 1995, the concept of ‘naive geography’ was formally introduced as an area of cognitive geographic information science representing ‘the body of knowledge that people have about the surrounding geographic world’ and reflecting ‘the way people think and reason about geographic space and time, both consciously and subconsciously’. The need to incorporate such commonsense knowledge and reasoning into the design of geospatial technologies was identified but faced challenges in formalizing these relationships and processes in software implementations. Ten years later, the Google Earth geobrowser was released, marking the beginning of a new era of open access to, and application of, geographic data and information in society. Fast-forward to today, and the opportunity presents itself to take stock of twenty years of naive geography and a decade of the ubiquitous virtual globe. This paper introduces an ongoing research effort to explore the integration of naive (or commonsense) geography concepts in the Google Earth geobrowser virtual globe and their possible impact on Google Earth's usability, utility, and usefulness. A multi-phase methodology is described, combining usability reviews and usability testing with use-case scenarios involving the U.S.-Canadian Yellowstone to Yukon Initiative. Initial progress on a usability review combining cognitive walkthroughs and heuristic evaluation is presented.

  1. Predicting plant attractiveness to pollinators with passive crowdsourcing.

    PubMed

    Bahlai, Christie A; Landis, Douglas A

    2016-06-01

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of the plants most likely to attract the desired pollinator taxa. People frequently photograph plants in bloom, and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinators and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet with the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs in which the plant was successfully identified, we recorded the number of Apis bees (managed honeybees), non-Apis bees (exclusively wild bees) and bee-mimicking syrphid flies. We used these observations from search hits, as well as bloom period (BP), as predictor variables in generalized linear models (GLMs) for the field-observed abundances of each of these groups. We found that non-Apis bee abundance observed in controlled field trials was positively associated with observations of these taxa in Google Image searches (pseudo-R² of 0.668). Syrphid fly observations in the field were also associated with the frequency with which they were observed in images, but this relationship was weak. 
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.
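
    The underlying screening idea — image-derived counts tracking field-observed abundance — can be illustrated with a plain Pearson correlation, a simpler stand-in for the paper's GLMs. The counts below are invented, not the study's data:

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xd, yd = x - x.mean(), y - y.mean()
    return float((xd @ yd) / np.sqrt((xd @ xd) * (yd @ yd)))

# Invented counts for six plant species: wild bees seen in the first 30
# image hits versus mean abundance in a common-garden field trial.
image_counts = [2, 5, 9, 1, 7, 4]
field_counts = [3, 6, 11, 2, 9, 5]
r = pearson(image_counts, field_counts)
```

    A GLM with a count-appropriate error family, as used in the paper, is the more defensible model for real visitation counts; the correlation sketch only conveys why the image counts are informative at all.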

  2. jmzIdentML API: A Java interface to the mzIdentML standard for peptide and protein identification data.

    PubMed

    Reisinger, Florian; Krishna, Ritesh; Ghali, Fawaz; Ríos, Daniel; Hermjakob, Henning; Vizcaíno, Juan Antonio; Jones, Andrew R

    2012-03-01

    We present a Java application programming interface (API), jmzIdentML, for the Human Proteome Organisation (HUPO) Proteomics Standards Initiative (PSI) mzIdentML standard for peptide and protein identification data. The API combines the power of Java Architecture for XML Binding (JAXB) and an XPath-based random-access indexer to allow fast and efficient mapping of extensible markup language (XML) elements to Java objects. The internal references in the mzIdentML files are resolved in an on-demand manner, where the whole file is accessed as a random-access swap file, and only the relevant piece of XML is selected for mapping to its corresponding Java object. The API is highly efficient in its memory usage and can handle files of arbitrary sizes. The API follows the official release of the mzIdentML (version 1.1) specifications and is available in the public domain under a permissive licence at http://www.code.google.com/p/jmzidentml/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
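The indexing approach jmzIdentML takes can be sketched in miniature: scan the file once to record the byte span of each element of interest, then parse only the requested slice into an object when it is first needed. The sketch below is a simplified Python analogue of that Java/JAXB design, using an invented two-peptide document; it is not the jmzIdentML API itself.

```python
import re
import xml.etree.ElementTree as ET

# Invented minimal document standing in for a large mzIdentML file.
SAMPLE = (
    b"<MzIdentML>"
    b"<Peptide id='p1'><Seq>MKV</Seq></Peptide>"
    b"<Peptide id='p2'><Seq>AGLR</Seq></Peptide>"
    b"</MzIdentML>"
)

def build_index(data, tag):
    """Record the byte offsets of each <tag>...</tag> element by id."""
    spans = {}
    pattern = rb"<%s id='([^']+)'.*?</%s>" % (tag.encode(), tag.encode())
    for m in re.finditer(pattern, data):
        spans[m.group(1).decode()] = (m.start(), m.end())
    return spans

def load_on_demand(data, index, item_id):
    """Parse only the byte slice for one element into an object tree."""
    start, end = index[item_id]
    return ET.fromstring(data[start:end])

index = build_index(SAMPLE, "Peptide")
pep = load_on_demand(SAMPLE, index, "p2")
print(pep.findtext("Seq"))  # -> AGLR
```

Because only the requested slice is ever materialized as objects, memory use stays flat regardless of file size, which is the property the abstract describes.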

  3. Results of new petrologic and remote sensing studies in the Big Bend region

    NASA Astrophysics Data System (ADS)

    Benker, Stevan Christian

    The initial section of this manuscript involves the South Rim Formation, a series of 32.2-32 Ma comenditic quartz trachytic-rhyolitic volcanics and associated intrusives that erupted and were emplaced in Big Bend National Park, Texas. Magmatic parameters have only been interpreted for one of the two diverse petrogenetic suites comprising this formation. Here, new mineralogic data for the South Rim Formation rocks are presented. Magmatic parameters interpreted from these data assist in deciphering lithospheric characteristics during the mid-Tertiary. Results indicate low temperatures (< 750 °C), reduced conditions (generally below the FMQ buffer), and low pressures (≤ 100 MPa) associated with South Rim Formation magmatism, with slight conditional differences between the two suites. Newly discovered fayalite microphenocrysts allowed determination of oxygen fugacity values (between -0.14 and -0.25 DeltaFMQ over temperature ranges of 680-700 °C) for the Emory Peak Suite via mineral-equilibria-based QUILF95 calculations. Petrologic information is correlated with structural evidence from Trans-Pecos Texas and adjacent regions to evaluate the debated timing of tectonic transition (Laramide compression to Basin and Range extension) and the onset of the southern Rio Grande Rift during the mid-Tertiary. The A-type and peralkaline characteristics of the South Rim Formation and other pre-31 Ma magmatism in Trans-Pecos Texas, in addition to evidence implying earlier Rio Grande Rift onset in Colorado and New Mexico, promote a near-neutral to transtensional setting in Trans-Pecos Texas by 32 Ma. This idea sharply contrasts with interpretations of tectonic compression and arc-related magmatism until 31 Ma as suggested by some authors. However, the evidence discussed cannot preclude a pre-36 Ma onset proposed by other authors. The later section of this manuscript involves research in the Big Bend area using Google Earth.
At present there is high interest in using Google Earth in a variety of scientific investigations. However, program developers have disclosed limited information concerning the program and its accuracy. While some authors have attempted to independently constrain the accuracy of Google Earth, their results have potentially lost validity through time due to technological advances and updates to imagery archives. For this reason we attempt to constrain more current horizontal and vertical position accuracies for the Big Bend region of West Texas. In Google Earth, a series of 268 data points were virtually traced along various early Tertiary unconformities in Big Bend National Park and Big Bend Ranch State Park. These data points were compared with high-precision GPS measurements collected in the field and yielded a horizontal position accuracy of 2.64 meters RMSE. Complications arose in determining vertical position accuracy for Google Earth because default Keyhole Markup Language (.kml) files currently do not export elevation data. This drawback forces users to hand-record and manually input elevation values listed on screen, a significant handicap that renders Google Earth data impractical for larger datasets. However, in a workaround solution, the omitted elevation values can be supplied from other data sources based on Google Earth horizontal positioning. We used Fledermaus three-dimensional visualization software to drape Google Earth horizontal positions over a National Elevation Dataset (NED) digital elevation model (DEM) in order to adopt a large set of elevation data. A vertical position accuracy of 1.63 meters RMSE was determined between the 268 Google Earth data points and the NED. Since the determined accuracies were considerably lower than those reported in previous investigations, we devoted a later portion of this investigation to testing Google Earth-NED data in paleo-surface modeling of the Big Bend region. 
An 18 x 30 kilometer area in easternmost Big Bend Ranch State Park was selected to create a post-Laramide paleo-surface model via interpolation of approximately 2900 Google Earth-NED data points representing sections of an early Tertiary unconformity. The area proved difficult to model, as unconformity tracing and interpolation were often hindered by surface inflation due to regional magmatism, burial of Laramide topography by subsequent volcanism and sedimentation, and overprinting of Basin and Range extensional features masking Laramide compressional features. Despite these difficulties, a model was created illustrating paleo-topographic highs in the southeastern Bofecillos Mountains and at Lajitas Mesa. Based on the amount of surface relief depicted, inconsistency with subsequent normal faulting, and distance from magmatic features capable of surface doming or inflation, we believe the paleo-topographic highs modeled legitimately reflect the post-Laramide surface. We interpret the paleo-surface in this area as a post-Laramide surface that has experienced significant erosion, and we attribute the paleo-topographic highs to more resistant Laramide topography. The model also implies a southern paleo-drainage direction for the area and suggests the present-day topographic low through which the Rio Grande flows may have formed very soon after the Laramide Orogeny. Based on the newly calculated horizontal and vertical position accuracies for the Big Bend region and the results of modeled Google Earth-NED data in easternmost Big Bend Ranch State Park, it seems Google Earth can be effectively utilized for remote sensing and geologic studies; however, we urge caution, as developers remain reluctant to disclose detailed program information to the public.
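The 2.64 m and 1.63 m figures above are root-mean-square errors between Google Earth-derived positions and the reference data (field GPS and the NED, respectively). A minimal sketch of that computation, with invented offset values rather than the study's data:

```python
import math

# Hypothetical paired offsets (metres) between Google Earth-derived
# positions and reference measurements (GPS or the NED DEM).
east_err = [1.2, -2.0, 0.5, 3.1]    # easting differences
north_err = [-0.8, 1.5, -2.2, 0.4]  # northing differences

def horizontal_rmse(de, dn):
    """Root mean square of the 2-D horizontal offsets."""
    sq = [e * e + n * n for e, n in zip(de, dn)]
    return math.sqrt(sum(sq) / len(sq))

def vertical_rmse(dz):
    """Root mean square of the elevation differences."""
    return math.sqrt(sum(z * z for z in dz) / len(dz))

print(round(horizontal_rmse(east_err, north_err), 2))  # -> 2.41
print(round(vertical_rmse([0.9, -1.4, 2.0, -0.6]), 2))  # -> 1.34
```

With the study's 268 traced points in place of the toy lists, the same two formulas yield the reported horizontal and vertical RMSE values.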

  4. What does it take to build a medium scale scientific cloud to process significant amounts of Earth observation data?

    NASA Astrophysics Data System (ADS)

    Hollstein, André; Diedrich, Hannes; Spengler, Daniel

    2017-04-01

    The deployment of the operational fleet of Sentinels by Copernicus offers an unprecedented influx of freely available Earth observation data, with Sentinel-2 being a prime example. It offers a broad range of land applications due to its high spatial sampling from 10 m to 20 m and its multi-spectral imaging capabilities with 13 spectral bands. The open access policy allows unrestricted use by everybody and provides data downloads on the respective sites. For a small area of interest and shorter time series, data processing and exploitation can easily be done manually. However, for multi-temporal analysis of larger areas, the data size can quickly grow beyond what is manageable in practice on a personal computer, which leads to increasing interest in central data exploitation platforms. Prominent examples are Google Earth Engine, NASA Earth Exchange (NEX), and current developments such as CODE-DE in Germany. Open standards are still evolving, and the choice of a platform may create lock-in scenarios and a situation where scientists are no longer in full control of all aspects of their analysis. Securing the intellectual property of researchers can become a major issue in the future. Partnering with a startup company dedicated to providing tools for farm management and precision farming, GFZ is building a small-scale science cloud named GTS2 for processing and distribution of Sentinel-2 data. The service includes a sophisticated atmospheric correction algorithm, spatial co-registration of time series data, as well as a web API for data distribution. This approach runs counter to the drift toward centralized research on infrastructures controlled by others. By keeping full licensing rights, it allows developing new business models independent of the initially chosen processing provider. Currently, data is held for the greater German area but is extendable to larger areas on short notice thanks to a scalable distributed network file system. 
For a given area of interest, band selection, and time range, the API quickly returns only the data that was requested, thereby saving storage space on the user's machine. A JupyterHub instance is the main tool our users employ for data exploitation. Nearly all of the software used is open source and based on open standards, and it can be transferred to other infrastructures. In the talk, we give an overview of the current status of the project and the service, but also want to share our experience with its development.
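From the client side, such a subset request reduces to a parameterized HTTP call. The endpoint URL and parameter names below are invented placeholders, since the actual GTS2 interface is not documented in this abstract:

```python
from urllib.parse import urlencode

# Sketch of a subset request to a GTS2-style web API. The base URL and
# parameter names are illustrative assumptions, not the real interface.
BASE = "https://example.gfz-potsdam.de/gts2/api/data"

params = {
    "bbox": "13.0,52.3,13.8,52.7",  # area of interest (lon/lat bounds)
    "bands": "B04,B08",             # e.g. red and near-infrared
    "start": "2016-06-01",
    "end": "2016-08-31",
    "format": "geotiff",
}

url = BASE + "?" + urlencode(params)
print(url)
# The server would return only the requested bands/area/time range,
# which is what saves storage on the user's machine.
```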

  5. Visualization of High-Resolution LiDAR Topography in Google Earth

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Arrowsmith, R.; Blair, J. L.

    2009-12-01

    The growing availability of high-resolution LiDAR (Light Detection And Ranging) topographic data has proven to be revolutionary for Earth science research. These data allow scientists to study the processes acting on the Earth's surface at resolutions not previously possible, yet essential for their appropriate representation. In addition to their utility for research, the data have also been recognized as powerful tools for communicating earth science concepts for education and outreach purposes. Unfortunately, the massive volume of data produced by LiDAR mapping technology can be a barrier to their use. To facilitate access to these powerful data for research and educational purposes, we have been exploring the use of Keyhole Markup Language (KML) and Google Earth to deliver LiDAR-derived visualizations. The OpenTopography Portal (http://www.opentopography.org/) is a National Science Foundation-funded facility designed to provide access to Earth science-oriented LiDAR data. OpenTopography hosts a growing collection of LiDAR data for a variety of geologic domains, including many of the active faults in the western United States. We have found that the wide spectrum of LiDAR users have variable scientific applications, computing resources, and technical experience, and thus require a data distribution system that provides various levels of access to the data. For users seeking a synoptic view of the data, and for education and outreach purposes, delivering full-resolution images derived from LiDAR topography into the Google Earth virtual globe is powerful. The virtual globe environment provides a freely available and easily navigated viewer and enables quick integration of the LiDAR visualizations with imagery, geographic layers, and other relevant data available in KML format. Through region-dependent network-linked KML, OpenTopography currently delivers over 20 GB of LiDAR-derived imagery to users via simple, easily downloaded KMZ files hosted at the Portal. 
This method provides seamless access to hillshaded imagery for both bare-earth and first-return terrain models with various angles of illumination. Seamless access to LiDAR-derived imagery in Google Earth has proven to be the most popular product available in the OpenTopography Portal. The hillshade KMZ files have been downloaded over 3000 times by users ranging from earthquake scientists to K-12 educators who wish to introduce cutting-edge real-world data into their earth science lessons. OpenTopography also provides dynamically generated KMZ visualizations of LiDAR data products produced when users choose to use the OpenTopography point cloud access and processing system. These Google Earth-compatible products allow users to quickly visualize the custom terrain products they have generated without the burden of loading the data into a GIS environment. For users who have installed the Google Earth browser plug-in, these visualizations can be launched directly from the OpenTopography results page and viewed directly in the browser.
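The region-dependent network-linked KML mentioned above relies on KML's Region element: Google Earth fetches a NetworkLink's target only once its bounding box occupies enough screen pixels (minLodPixels). A minimal Python sketch that generates such a link (the tile URL and coordinates are placeholders, not OpenTopography's actual tiling):

```python
import xml.etree.ElementTree as ET

def region_link(name, href, west, south, east, north, min_lod=128):
    """Build a KML NetworkLink that only loads once its Region is visible."""
    link = ET.Element("NetworkLink")
    ET.SubElement(link, "name").text = name
    region = ET.SubElement(link, "Region")
    box = ET.SubElement(region, "LatLonAltBox")
    for tag, val in (("north", north), ("south", south),
                     ("east", east), ("west", west)):
        ET.SubElement(box, tag).text = str(val)
    lod = ET.SubElement(region, "Lod")
    ET.SubElement(lod, "minLodPixels").text = str(min_lod)
    ET.SubElement(ET.SubElement(link, "Link"), "href").text = href
    return link

kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
doc = ET.SubElement(kml, "Document")
doc.append(region_link("tile_0_0", "https://example.org/tiles/0_0.kmz",
                       -117.0, 33.0, -116.5, 33.5))
print(ET.tostring(kml, encoding="unicode"))
```

A whole pyramid of such tiles lets the client stream only the imagery for the area currently in view, which is how tens of gigabytes stay browsable.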

  6. 3D Online Visualization and Synergy of NASA A-Train Data Using Google Earth

    NASA Technical Reports Server (NTRS)

    Chen, Aijun; Kempler, Steven; Leptoukh, Gregory; Smith, Peter

    2010-01-01

    This poster presentation reviews the use of Google Earth to assist in three-dimensional online visualization of NASA Earth science and geospatial data. The NASA A-Train satellite constellation is a succession of seven sun-synchronous-orbit satellites: (1) OCO-2 (Orbiting Carbon Observatory) (will launch in Feb. 2013), (2) GCOM-W1 (Global Change Observation Mission), (3) Aqua, (4) CloudSat, (5) CALIPSO (Cloud-Aerosol Lidar & Infrared Pathfinder Satellite Observations), (6) Glory, (7) Aura. The A-Train makes possible a synergy of information from multiple resources, so that more information about Earth's condition is obtained from the combined observations than would be possible from the sum of the observations taken independently.

  7. GIS tool to locate major Sikh temples in USA

    NASA Astrophysics Data System (ADS)

    Sharma, Saumya

    This is a GIS-based, interactive, graphical user interface tool that locates the major Sikh temples of the USA on a map. The tool uses the Java programming language along with MOJO (Map Object Java Object) provided by ESRI, the organization that produces the GIS software. It also integrates with Google APIs such as the Google Translate API. The application tells users about the origin of Sikhism in India and the USA and, for each state of the USA, gives the major Sikh temples' locations, names, and detailed information through their websites. The primary purpose of this application is to make people aware of this religion and culture. The tool can also measure the distance between two temple points on the map and display the result in miles and kilometers. In addition, there is added support to convert each temple's website from English to Punjabi or any other language using a language converter tool so that people of different nationalities can understand the culture. Clicking on a point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to the website of that particular temple. It also contains links to their dance, music, and history, as well as a help menu to guide users in using the software efficiently.
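The point-to-point distance feature described above is typically implemented with the haversine great-circle formula. A small sketch in Python (the example coordinates are placeholders, not actual temple locations):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Example with two placeholder points in California.
km = haversine_km(37.3688, -122.0363, 34.0522, -118.2437)
print(f"{km:.1f} km / {km * 0.621371:.1f} miles")
```

Reporting both units is then just the fixed conversion factor of 0.621371 miles per kilometre.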

  8. Web GIS in practice V: 3-D interactive and real-time mapping in Second Life

    PubMed Central

    Boulos, Maged N Kamel; Burden, David

    2007-01-01

    This paper describes technologies from Daden Limited for geographically mapping and accessing live news stories/feeds, as well as other real-time, real-world data feeds (e.g., Google Earth KML feeds and GeoRSS feeds) in the 3-D virtual world of Second Life, by plotting and updating the corresponding Earth location points on a globe or some other suitable form (in-world), and further linking those points to relevant information and resources. This approach enables users to visualise, interact with, and even walk or fly through, the plotted data in 3-D. Users can also do the reverse: put pins on a map in the virtual world, and then view the data points on the Web in Google Maps or Google Earth. The technologies presented thus serve as a bridge between mirror worlds like Google Earth and virtual worlds like Second Life. We explore the geo-data display potential of virtual worlds and their likely convergence with mirror worlds in the context of the future 3-D Internet or Metaverse, and reflect on the potential of such technologies and their future possibilities, e.g. their use to develop emergency/public health virtual situation rooms to effectively manage emergencies and disasters in real time. The paper also covers some of the issues associated with these technologies, namely user interface accessibility and individual privacy. PMID:18042275

  9. Conceptual Learning Outcomes of Virtual Experiential Learning: Results of Google Earth Exploration in Introductory Geoscience Courses

    NASA Astrophysics Data System (ADS)

    Bitting, Kelsey S.; McCartney, Marsha J.; Denning, Kathy R.; Roberts, Jennifer A.

    2018-06-01

    Virtual globe programs such as Google Earth replicate real-world experiential learning of spatial and geographic concepts by allowing students to navigate across our planet without ever leaving campus. However, empirical evidence for the learning value of these technological tools and the experience students gain by exploration assignments framed within them remains to be quantified and compared by student demographics. This study examines the impact of a Google Earth-based exploration assignment on conceptual understanding in introductory geoscience courses at a research university in the US Midwest using predominantly traditional college-age students from a range of majors. Using repeated-measures ANOVA and paired-samples t tests, we test the significance of the activity using pretest and posttest scores on a subset of items from the Geoscience Concept Inventory, and the interactive effects of student gender and ethnicity on student score improvement. Analyses show that learning from the Google Earth exploration activity is highly significant overall and for all but one of the concept inventory items. Furthermore, we find no significant interactive effects of class format, student gender, or student ethnicity on the magnitude of the score increases. These results provide strong support for the use of experiential learning in virtual globe environments for students in introductory geoscience and perhaps other disciplines for which direct observation of our planet's surface is conceptually relevant.

  10. Real-time bus location monitoring using Arduino

    NASA Astrophysics Data System (ADS)

    Ibrahim, Mohammad Y. M.; Audah, Lukman

    2017-09-01

    The Internet of Things (IoT) is the network of objects, such as vehicles, mobile devices, and buildings, that have electronic components, software, and network connectivity enabling them to collect data, run commands, and be controlled through the Internet. Controlling physical items over the Internet increases efficiency and saves time, and the growing number of devices used by people increases the practicality of having IoT devices on the market. The IoT is also an opportunity to develop products that save money and time and increase work efficiency. In particular, there is a need for more efficient real-time bus location systems, especially on university campuses. Such a system can easily find the accurate location of each bus, the distance between bus stops, and the estimated time to reach a new location. The system is separated into two parts: hardware and software. The hardware parts are the Arduino Uno and the Global Positioning System (GPS) receiver, while Google Earth and GpsGate are the software parts. The GPS receiver continuously takes input data from the satellites and stores the latitude and longitude values in the Arduino Uno. To track the vehicle, the longitude and latitude are sent as a message to the Google Earth software, which converts them into maps for navigation. Once the Arduino Uno is activated, it takes the last received latitude and longitude values from GpsGate and sends a message to Google Earth. Once the message has been sent, the current location is shown and navigation is activated automatically. The result is then broadcast using ManyCam, Google+ Hangouts, and YouTube, as well as Facebook, making it visible to users. As an additional feature, Google Forms is used to determine problems faced by students, who can then raise them with the responsible department for immediate action. After several successful simulations, the results are shown in real time on a map.
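The GPS module streams NMEA sentences, and converting the ddmm.mmmm fields of a $GPGGA sentence into the decimal degrees a mapping tool expects is the core parsing step. A minimal Python sketch of that conversion (on the Arduino itself this would be done in C, and checksum handling is omitted here):

```python
def parse_gpgga(sentence):
    """Extract decimal-degree lat/lon from a $GPGGA NMEA sentence."""
    f = sentence.split(",")

    def to_deg(value, hemi, deg_digits):
        # NMEA packs degrees and minutes together, e.g. 4807.038 = 48° 7.038'
        deg = float(value[:deg_digits])
        minutes = float(value[deg_digits:])
        dec = deg + minutes / 60.0
        return -dec if hemi in ("S", "W") else dec

    lat = to_deg(f[2], f[3], 2)   # ddmm.mmmm
    lon = to_deg(f[4], f[5], 3)   # dddmm.mmmm
    return lat, lon

# Standard example sentence from NMEA documentation.
line = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gpgga(line)
print(lat, lon)  # lat ≈ 48.1173, lon ≈ 11.5167
```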

  11. Multi-source Geospatial Data Analysis with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Erickson, T.

    2014-12-01

    The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality.

    https://earthengine.google.org
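The just-in-time model described above can be pictured as deferred evaluation: API calls build a description of the computation, and values are only computed when a preview tile or batch export actually requests results. The toy pure-Python sketch below illustrates that idea only; it is not the Earth Engine API:

```python
class Deferred:
    """Toy deferred computation: build a graph now, evaluate on demand."""

    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps

    def compute(self):
        # Evaluation happens only when a result is actually requested,
        # e.g. when a preview tile is drawn or a batch job runs.
        return self.fn(*[d.compute() for d in self.deps])

def const(v):
    return Deferred(lambda: v)

# Describe ndvi = (nir - red) / (nir + red) without computing anything yet.
nir, red = const(0.6), const(0.1)
ndvi = Deferred(lambda a, b: (a - b) / (a + b), nir, red)
print(ndvi.compute())  # ≈ 0.714
```

In the real platform the leaves are image tiles rather than scalars, so the same expression graph can drive both an interactive preview and a planet-scale batch run.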

  12. Moving beyond a Google Search: Google Earth, SketchUp, Spreadsheet, and More

    ERIC Educational Resources Information Center

    Siegle, Del

    2007-01-01

    Google has been the search engine of choice for most Web surfers for the past half decade. More recently, the creative founders of the popular search engine have been busily creating and testing a variety of useful products that will appeal to gifted learners of varying ages. The purpose of this paper is to share information about three of these…

  13. Sedimentation and erosion in Lake Diefenbaker, Canada: solutions for shoreline retreat monitoring.

    PubMed

    Sadeghian, Amir; de Boer, Dirk; Lindenschmidt, Karl-Erich

    2017-09-15

    This study looks into sedimentation and erosion rates in Lake Diefenbaker, a prairie reservoir in Saskatchewan, Canada, which has been in operation since 1968. First, we examined the historical data in all its different formats over the last 70 years, which include data from more than 20 years before the formation of the lake. The field observations indicate high rates of shoreline erosion, especially in the upstream portion, marking it as a potential region for shoreline retreat. Because of the great importance of this waterbody to the province, monitoring sedimentation and erosion rates is necessary for maintaining water quality, especially after severe floods, which are becoming more common due to climate change. Second, we used the Google Maps Elevation API, a new tool from Google that provides elevation data along a path between two points, to draw 24 cross sections in the upstream area extending 250 m from each bank. This feature from Google can be used as an easy and fast monitoring tool, is free of charge, and provides excellent control capabilities for monitoring changes in cross-sectional profiles.
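A cross section of the kind described uses the Elevation API's path request, which samples elevations at regular intervals between two endpoints. A sketch of building such a request (the coordinates are placeholders, and a real API key would be required to actually fetch it):

```python
from urllib.parse import urlencode

def elevation_path_url(p1, p2, samples, key="YOUR_API_KEY"):
    """Build a Google Maps Elevation API 'path' request URL."""
    base = "https://maps.googleapis.com/maps/api/elevation/json"
    path = f"{p1[0]},{p1[1]}|{p2[0]},{p2[1]}"  # lat,lng|lat,lng
    return base + "?" + urlencode({"path": path,
                                   "samples": samples,
                                   "key": key})

# Placeholder endpoints for one 250 m cross section on each bank.
url = elevation_path_url((51.03, -106.95), (51.03, -106.94), samples=50)
print(url)
# Fetching this URL (e.g. with urllib.request) returns JSON containing
# one elevation value per sample point along the cross section, which
# can be compared between survey dates to detect profile change.
```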

  14. Using Google Earth in Marine Research and Operational Decision Support

    NASA Astrophysics Data System (ADS)

    Blower, J. D.; Bretherton, D.; Haines, K.; Liu, C.; Rawlings, C.; Santokhee, A.; Smith, I.

    2006-12-01

    A key advantage of Virtual Globes ("geobrowsers") such as Google Earth is that they can display many different geospatial data types at a huge range of spatial scales. In this demonstration and poster display we shall show how marine data from disparate sources can be brought together in a geobrowser in order to support both scientific research and operational search and rescue activities. We have developed the Godiva2 interactive website for browsing and exploring marine data, mainly output from supercomputer analyses and predictions of ocean circulation. The user chooses a number of parameters (e.g. sea temperature at 100m depth on 1st July 2006) and can load an image of the resulting data in Google Earth. Through the use of an automatically-refreshing NetworkLink the user can explore the whole globe at a very large range of spatial scales: the displayed data will automatically be refreshed to show data at increasingly fine resolution as the user zooms in. This is a valuable research tool for exploring these terabyte-scale datasets. Many coastguard organizations around the world use SARIS, a software application produced by BMT Cordah Ltd., to predict the drift pattern of objects in the sea in order to support search and rescue operations. Different drifting objects have different trajectories depending on factors such as their buoyancy and windage, and so a computer model, supported by meteorological and oceanographic data, is needed to help rescuers locate their targets. We shall demonstrate how Google Earth is used to display output from the SARIS model (including the search target location and associated error polygon) alongside meteorological data (wind vectors) and oceanographic data (sea temperature, surface currents) from Godiva2 in order to support decision-making. We shall also discuss the limitations of using Google Earth in this context: these include the difficulties of working with time-dependent data and the need to access data securely. 
essc.ac.uk:8080/Godiva2
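The automatically-refreshing NetworkLink mechanism described above is plain KML; a fragment of the following shape (the href is a placeholder) makes Google Earth re-fetch the layer periodically and after each camera move, which is how the displayed data refines as the user zooms:

```xml
<NetworkLink>
  <name>Godiva2 layer (illustrative)</name>
  <Link>
    <href>https://example.org/godiva2/layer.kml</href>
    <refreshMode>onInterval</refreshMode>
    <refreshInterval>60</refreshInterval>
    <viewRefreshMode>onStop</viewRefreshMode>
    <viewRefreshTime>2</viewRefreshTime>
  </Link>
</NetworkLink>
```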

  15. Navigation API Route Fuel Saving Opportunity Assessment on Large-Scale Real-World Travel Data for Conventional Vehicles and Hybrid Electric Vehicles: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Lei; Holden, Jacob; Gonder, Jeffrey D

    The green routing strategy, instructing a vehicle to select a fuel-efficient route, benefits the current transportation system with fuel-saving opportunities. This paper introduces a navigation API route fuel-saving evaluation framework for estimating the fuel advantages of alternative API routes based on large-scale, real-world travel data for conventional vehicles (CVs) and hybrid electric vehicles (HEVs). Navigation APIs, such as the Google Directions API, integrate traffic conditions and provide feasible alternative routes for origin-destination pairs. This paper develops two link-based fuel-consumption models stratified by link-level speed, road grade, and functional class (local/non-local), one for CVs and the other for HEVs. The link-based fuel-consumption models are built by assigning travel from a large number of GPS driving traces to the links in TomTom MultiNet as the underlying road network layer, with road grade data from a U.S. Geological Survey elevation data set. Fuel consumption on a link is calculated by the proposed fuel-consumption model. This paper envisions two kinds of applications: 1) identifying alternate routes that save fuel, and 2) quantifying the potential fuel savings for large amounts of travel. An experiment based on a large-scale California Household Travel Survey GPS trajectory data set is conducted, and the fuel consumption and savings of CVs and HEVs are investigated. At the same time, the trade-off between fuel saving and time saving when choosing different routes is also examined for both powertrains.
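The link-based model described above can be sketched as a lookup keyed by speed bin, grade bin, and functional class, summed over a route's links. All rates and links below are invented placeholders, not the paper's fitted model:

```python
# (speed_bin, grade_bin, is_local) -> litres per km; placeholder values.
RATE_L_PER_KM = {
    ("low", "flat", True): 0.10,
    ("high", "flat", False): 0.07,
    ("high", "uphill", False): 0.11,
}

def route_fuel(links):
    """Sum per-link fuel consumption over a candidate route."""
    return sum(
        RATE_L_PER_KM[(l["speed"], l["grade"], l["local"])] * l["km"]
        for l in links
    )

# Two invented alternative routes for the same origin-destination pair.
route_a = [{"speed": "high", "grade": "flat", "local": False, "km": 12.0},
           {"speed": "low", "grade": "flat", "local": True, "km": 2.0}]
route_b = [{"speed": "high", "grade": "uphill", "local": False, "km": 9.0}]

print(round(route_fuel(route_a), 2), round(route_fuel(route_b), 2))
# -> 1.04 0.99 : here the shorter uphill route b would be the greener pick.
```

In the paper's framework, a separate rate table is fitted for CVs and for HEVs, and the same per-link summation scores every alternative route returned by the navigation API.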

  16. Landsat Imagery Enables Global Studies of Surface Trends

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Landsat 8 is the latest in the NASA-developed series of satellites that have provided a continuous picture of Earth for more than 40 years. Mountain View, California-based Google has incorporated Landsat data into several products, most recently generating a cloud-free view of Earth. Google has also teamed up with researchers at the University of Maryland and Goddard Space Flight Center to create a global survey showing changes in forest cover over many years, the first of its kind.

  17. Vocabulary services to support scientific data interoperability

    NASA Astrophysics Data System (ADS)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc.). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary, but hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way. 
A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their collaborators using SISSvoc3, including: * geologic timescale (multiple versions) * soils classification * definitions from OGC standards * GeoSciML vocabularies * mining commodities * hyperspectral scalars. Several other agencies in Australia have adopted SISSvoc3 for their vocabularies. SISSvoc3 differs from other SKOS-based vocabulary-access APIs such as GEMET [3] and NVS [4] in that (a) the service is decoupled from the content store, and (b) the service URI is independent of the content URIs. This means that a SISSvoc3 interface can be deployed over any SKOS vocabulary which is available at a SPARQL endpoint. As an example, a SISSvoc3 query and presentation interface has been deployed over the NERC vocabulary service hosted by the BODC, providing a search interface which is not available natively. We use vocabulary services to populate menus in user interfaces, to support data validation, and to configure data conversion routines. Related services built on LDA have also been used as a generic registry interface, and extended for serving gazetteer information. ACKNOWLEDGEMENTS The CSIRO SISSvoc3 implementation is built using the Epimorphics ELDA platform http://code.google.com/p/elda/. We thank Jacqui Githaiga and Terry Rankine for their contributions to SISSvoc design and implementation. REFERENCES 1. SISSvoc3 Specification https://www.seegrid.csiro.au/wiki/Siss/SISSvoc30Specification 2. Linked Data API http://code.google.com/p/linked-data-api/wiki/Specification 3. GEMET https://svn.eionet.europa.eu/projects/Zope/wiki/GEMETWebServiceAPI 4. NVS 2.0 http://vocab.nerc.ac.uk/
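A SISSvoc-style front end ultimately issues SKOS-aware SPARQL against the endpoint behind it. The sketch below shows the general shape of such a label query; the endpoint URL and exact query text are illustrative assumptions, not the SISSvoc3 implementation:

```python
from urllib.parse import urlencode

def label_search_query(text):
    """Build a SKOS prefLabel search query of the kind a vocabulary
    front end might send to its SPARQL endpoint (illustrative only)."""
    return f"""
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {{
  ?concept skos:prefLabel ?label .
  FILTER regex(str(?label), "{text}", "i")
}}
""".strip()

# A GET request to a (placeholder) endpoint carrying the query.
endpoint = "https://example.org/sparql"
request_url = endpoint + "?" + urlencode(
    {"query": label_search_query("Jurassic")})
print(request_url[:80])
```

This is exactly the complexity the SISSvoc3 URI templates hide from the client: the template URI is translated into a query like this server-side.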

  18. Software Applications to Access Earth Science Data: Building an ECHO Client

    NASA Astrophysics Data System (ADS)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

    Historically, developing an ECHO (NASA's Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for enterprise applications and Java/C#-style programming languages. However, as interest has grown in quick development cycles and more intriguing "mashups," ECHO has seen the SOAP API lose its appeal. To address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and the ESIP Federated Search framework. The second is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages far more feasible and simpler, so client developers can spend more of their time on the advanced functionality they present to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on exercise in which they develop an ECHO client that performs the following actions:
    + Login
    + Provider discovery
    + Provider-based dataset discovery
    + Dataset, temporal, and spatial constraint-based granule discovery
    + Online data access
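    The appeal of the OpenSearch-style interface is that a granule search collapses to a URL with a handful of query parameters. A minimal sketch, assuming illustrative parameter names and a hypothetical endpoint (the real names come from the template advertised in the service's OpenSearch description document):

```python
from urllib.parse import urlencode

def build_search_url(base, keyword=None, start=None, end=None, bbox=None):
    """Assemble an OpenSearch-style granule query URL.

    Parameter names here are illustrative, loosely following ESIP
    Federated Search conventions; consult the service's OpenSearch
    description document for the actual template.
    """
    params = {}
    if keyword:
        params["keyword"] = keyword
    if start and end:
        params["startTime"], params["endTime"] = start, end
    if bbox:  # (west, south, east, north) in degrees
        params["boundingBox"] = ",".join(str(v) for v in bbox)
    return base + "?" + urlencode(params)

url = build_search_url(
    "https://example.gov/opensearch/granules",  # hypothetical endpoint
    keyword="MODIS", start="2010-01-01", end="2010-12-31",
    bbox=(-10, 35, 5, 45))
```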

  19. Making a report of a short trip in an ophiolitic complex with Google Earth

    NASA Astrophysics Data System (ADS)

    Aubret, Marianne

    2017-04-01

    Plate tectonics is taught in French secondary school (lower and upper sixth). According to the curriculum, the comprehension of plate-tectonic processes and concepts should be based on field data. For example, the history of the Alpine ocean is studied to understand how mountain ranges are formed. In this context, Corsica is a great open-air laboratory, but unfortunately, traffic conditions on the island are very difficult and, despite the short distances, it is almost impossible for teachers to take their students to the remarkable geologic sites. The «défilé de l'Inzecca» is one of them: there you can see part of the Alpine ophiolitic complex. The aim of this activity is to elaborate a «KMZ folder» in Google Earth as a report of a short field trip based on the students' field data; it is also an occasion to enrich the Google Earth KMZ folder already available for our teaching.

  20. Earth Adventure: Virtual Globe-based Suborbital Atmospheric Greenhouse Gases Exploration

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Landolt, K.; Boyer, A.; Santhana Vannan, S. K.; Wei, Z.; Wang, E.

    2016-12-01

    The Earth Venture Suborbital (EVS) mission is an important component of NASA's Earth System Science Pathfinder program that aims at making substantial advances in Earth system science through measurements from suborbital platforms and modeling research. For example, the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) project of EVS-1 collected measurements of greenhouse gases (GHG) on local to regional scales in the Alaskan Arctic. The Atmospheric Carbon and Transport - America (ACT-America) project of EVS-2 will provide advanced, high-resolution measurements of atmospheric profiles and horizontal gradients of CO2 and CH4. As the long-term archival center for CARVE and the future ACT-America data, the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) has been developing a versatile data management system for CARVE data to maximize their usability. One of these efforts is the virtual globe-based Suborbital Atmospheric GHG Exploration application. It leverages Google Earth to simulate the 185 flights flown by the C-23 Sherpa aircraft in 2012-2015 for the CARVE project. Using Google Earth's 3D modeling capability and the precise coordinates, altitude, pitch, roll, and heading of the aircraft recorded every second of each flight, the application gives users an accurate and vivid simulation of the flight experience, with an active 3D visualization of a C-23 Sherpa aircraft in view. The application also dynamically visualizes the GHG measurements captured during the flights, including CO2, CO, H2O, and CH4, at the same pace as the flight simulation in Google Earth. Photos taken during those flights are displayed along the flight paths. In the future, this application will be extended to incorporate more complex GHG measurements (e.g., vertical profiles) from the ACT-America project.
    This application leverages virtual globe technology to provide users with an integrated framework for interactively exploring GHG measurements and for linking scientific measurements to the rich virtual planet environment provided by Google Earth. Positive feedback has been received from users. The application is a good example of extending basic data visualization into a knowledge discovery experience and maximizing the usability of Earth science observations.
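    The delivery format underlying such a Google Earth application is KML. As a hedged sketch only (not the CARVE application's actual output), the following Python emits a minimal flight-path Placemark, using the absolute altitude mode so the path stays at recorded altitude rather than being clamped to the terrain:

```python
def flight_track_kml(name, points):
    """Serialize a flight path as a minimal KML Placemark.

    `points` is a sequence of (lon, lat, alt_m) tuples. Element names
    follow the OGC KML 2.2 schema; the document structure here is a
    bare illustration, not the CARVE application's output format.
    """
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in points)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coords}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

doc = flight_track_kml("CARVE example leg",
                       [(-147.86, 64.82, 500), (-148.10, 65.00, 1200)])
```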

  1. An overview of the web-based Google Earth coincident imaging tool

    USGS Publications Warehouse

    Chander, Gyanesh; Kilough, B.; Gowda, S.

    2010-01-01

    The Committee on Earth Observing Satellites (CEOS) Visualization Environment (COVE) tool is a browser-based application that leverages Google Earth to display satellite sensor coverage areas. The tool can also be used to identify near-simultaneous surface observation locations for two or more satellites. The National Aeronautics and Space Administration (NASA) CEOS System Engineering Office (SEO) worked with the CEOS Working Group on Calibration and Validation (WGCV) to develop the COVE tool. The CEOS member organizations are currently operating and planning hundreds of Earth Observation (EO) satellites. Standard cross-comparison exercises between multiple sensors to compare near-simultaneous surface observations and to identify corresponding image pairs are time-consuming and labor-intensive. COVE is a suite of tools developed to make such tasks easier.
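    The core cross-comparison task COVE automates can be stated simply: given observation times for two sensors, find the pairs that fall within a coincidence window. A minimal sketch of that matching step (COVE itself works from orbit models and coverage geometry, not bare time lists):

```python
from bisect import bisect_left

def coincident_pairs(times_a, times_b, window):
    """Find (a, b) observation-time pairs within `window` seconds.

    `times_a` and `times_b` are sorted sequences of epoch seconds for
    two satellites; a binary search keeps matching near O(n log m).
    """
    pairs = []
    for t in times_a:
        # Jump to the first candidate in B no earlier than t - window...
        i = bisect_left(times_b, t - window)
        # ...then sweep forward while candidates stay within the window.
        while i < len(times_b) and times_b[i] <= t + window:
            pairs.append((t, times_b[i]))
            i += 1
    return pairs
```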

  2. Measuring the Carolina Bays Using Archetype Template Overlays on the Google Earth Virtual Globe; Planform Metrics for 25,000 Bays Extracted from LiDAR and Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Davias, M. E.; Gilbride, J. L.

    2011-12-01

    Aerial photographs of Carolina bays taken in the 1930s sparked the initial research into their geomorphology. The satellite imagery available today through the Google Earth virtual globe expands the regions available for interrogation, but reveals only part of their unique planforms. Digital Elevation Maps (DEMs) derived from Light Detection And Ranging (LiDAR) remote sensing data accentuate the visual presentation of these aligned, ovoid, shallow basins by emphasizing their robust circumferential rims. To support a geospatial survey of Carolina bay landforms in the continental USA, 400,000 km2 of HSV-shaded DEMs were created as KML-JPEG tile sets, the majority generated from LiDAR-derived data. We demonstrate the tile generation process and the integration of the tiles into Google Earth, where the DEMs augment the available photographic imagery for visualizing bay planforms. While the generic Carolina bay planform is considered oval, we document subtle regional variations. Using a small set of empirically derived planform shapes, we created corresponding Google Earth overlay templates. We demonstrate the analysis of an individual Carolina bay by placing an appropriate overlay onto the virtual globe, then orienting, sizing, and rotating it with edit handles until it satisfactorily represents the bay's rim. The resulting overlay data element is extracted from Google Earth's object directory and programmatically processed to generate metrics such as geographic location, elevation, major and minor axes, and inferred orientation. Capturing data on a virtual globe may yield higher-quality data than methods that reference flat maps, in which the geospatial shape and orientation of the bays can be skewed and distorted by the orthographic projection process. Using the methodology described, we have measured over 25,000 distinct Carolina bays.
    We discuss the Google Fusion Tables geospatial data repository, through which these data have been assembled and made web-accessible to other researchers. Preliminary findings from the survey are discussed, such as how bay surface area, eccentricity, and orientation vary across ~800 1/4° × 1/4° grid elements. Future work includes measuring 25,000 additional bays, as well as interrogating the orientation data to identify any systematic geospatial relationships.
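    For the idealized case in which a bay rim is treated as a plain ellipse, the planform metrics reported above follow directly from the fitted axis lengths. A small sketch, noting that the survey's archetype templates are not simple ellipses, so this is the textbook approximation only:

```python
from math import pi, sqrt

def bay_metrics(major_m, minor_m):
    """Planform metrics for an oval bay fitted by a sized overlay.

    `major_m` and `minor_m` are full axis lengths in metres; treating
    the rim as an ellipse yields area and eccentricity directly.
    """
    a, b = major_m / 2.0, minor_m / 2.0   # semi-axes
    return {
        "area_m2": pi * a * b,                       # ellipse area
        "eccentricity": sqrt(1.0 - (b / a) ** 2),    # 0 = circle
    }

m = bay_metrics(1000.0, 600.0)
```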

  3. Predicting plant attractiveness to pollinators with passive crowdsourcing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bahlai, Christie A.; Landis, Douglas A.

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs, which successfully identified the plant, we recorded the number of Apis (managed honeybees), non-Apis (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak. 
    Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.
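    The record above reports a deviance-based pseudo-R2 for its GLMs. One common definition is 1 minus the residual deviance over the null deviance, shown here for a Poisson model; the abstract does not state which variant was used, so this is an illustrative choice, not a reproduction of the paper's method:

```python
import numpy as np

def poisson_pseudo_r2(y, mu):
    """Deviance-based pseudo-R-squared for a Poisson GLM.

    Computes 1 - (residual deviance / null deviance), where the null
    model is intercept-only (fitted mean for every observation).
    """
    y, mu = np.asarray(y, float), np.asarray(mu, float)

    def deviance(y, mu):
        # y * log(y / mu) has limit 0 as y -> 0, so mask that case.
        with np.errstate(divide="ignore", invalid="ignore"):
            term = np.where(y > 0, y * np.log(y / mu), 0.0)
        return 2.0 * np.sum(term - (y - mu))

    null_mu = np.full_like(y, y.mean())   # intercept-only model
    return 1.0 - deviance(y, mu) / deviance(y, null_mu)
```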

  4. Predicting plant attractiveness to pollinators with passive crowdsourcing

    DOE PAGES

    Bahlai, Christie A.; Landis, Douglas A.

    2016-06-01

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs, which successfully identified the plant, we recorded the number of Apis (managed honeybees), non-Apis (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak. 
    Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes.

  5. Predicting plant attractiveness to pollinators with passive crowdsourcing

    PubMed Central

    Bahlai, Christie A.; Landis, Douglas A.

    2016-01-01

    Global concern regarding pollinator decline has intensified interest in enhancing pollinator resources in managed landscapes. These efforts frequently emphasize restoration or planting of flowering plants to provide pollen and nectar resources that are highly attractive to the desired pollinators. However, determining exactly which plant species should be used to enhance a landscape is difficult. Empirical screening of plants for such purposes is logistically daunting, but could be streamlined by crowdsourcing data to create lists of plants most probable to attract the desired pollinator taxa. People frequently photograph plants in bloom and the Internet has become a vast repository of such images. A proportion of these images also capture floral visitation by arthropods. Here, we test the hypothesis that the abundance of floral images containing identifiable pollinator and other beneficial insects is positively associated with the observed attractiveness of the same species in controlled field trials from previously published studies. We used Google Image searches to determine the correlation of pollinator visitation captured by photographs on the Internet relative to the attractiveness of the same species in common-garden field trials for 43 plant species. From the first 30 photographs, which successfully identified the plant, we recorded the number of Apis (managed honeybees), non-Apis (exclusively wild bees) and the number of bee-mimicking syrphid flies. We used these observations from search hits as well as bloom period (BP) as predictor variables in Generalized Linear Models (GLMs) for field-observed abundances of each of these groups. We found that non-Apis bees observed in controlled field trials were positively associated with observations of these taxa in Google Image searches (pseudo-R2 of 0.668). Syrphid fly observations in the field were also associated with the frequency they were observed in images, but this relationship was weak. 
Apis bee observations were not associated with Internet images, but were slightly associated with BP. Our results suggest that passively crowdsourced image data can potentially be a useful screening tool to identify candidate plants for pollinator habitat restoration efforts directed at wild bee conservation. Increasing our understanding of the attractiveness of a greater diversity of plants increases the potential for more rapid and efficient research in creating pollinator-supportive landscapes. PMID:27429762

  6. A Web-Based Earth-Systems Knowledge Portal and Collaboration Platform

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.; Turner, A. K.

    2010-12-01

    In support of complex water-resource sustainability projects in the Great Basin region of the United States, Earth Knowledge, Inc. has developed several web-based data management and analysis platforms that have been used by its scientists, its clients, and the public to facilitate information exchange, collaboration, and decision making. These platforms support accurate water-resource decision making by combining second-generation Internet (Web 2.0) technologies with traditional 2D GIS and with web-based 2D and 3D mapping systems such as Google Maps and Google Earth. Most data management and analysis systems use traditional software systems to address the data needs and usage behavior of the scientific community. In contrast, these platforms employ more accessible open-source and "off-the-shelf" consumer-oriented hosted web services. They exploit familiar software tools, using industry-standard protocols, formats, and APIs to discover, process, fuse, and visualize earth, engineering, and social science datasets. Thus, they respond to the information needs and web-interface expectations of both subject-matter experts and the public. Because the platforms retain all the contributions of their broad spectrum of users, each new assessment leverages the data, information, and expertise derived from previous investigations. In the last year, Earth Knowledge completed a conceptual system design and feasibility study for a platform comprising a Knowledge Portal, which provides access for users wishing to retrieve information or knowledge developed by the science enterprise, and a Collaboration Environment Module, a framework that links these user-access functions to a Technical Core supporting technical and scientific analyses (Data Management, Analysis and Modeling, and Decision Management) and to essential system administrative functions within an Administrative Module.
    The overriding technical challenge is the design and development of a single technical platform that is accessed through a flexible series of knowledge portal and collaboration environment styles reflecting the information needs and user expectations of a diverse community of users. Recent investigations have defined the information needs and expectations of the major end users and have also reviewed and assessed a wide variety of modern web-based technologies. Combining these efforts produced design specifications and recommendations for the selection and integration of web- and client-based tools. When fully developed, the resulting platform will:
    - Support new, advanced information systems and decision environments that take full advantage of multiple data sources and platforms;
    - Provide a distribution network tailored to the timely delivery of products to the broad range of users needed to support applications in disaster management, resource management, energy, and urban sustainability;
    - Establish new integrated multiple-user requirements and knowledge databases that support researchers and promote infusion of successful technologies into existing processes; and
    - Develop new decision support strategies and presentation methodologies for applied earth science applications that reduce risk, cost, and time.

  7. Google's Geo Education Outreach: Results and Discussion of Outreach Trip to Alaskan High Schools.

    NASA Astrophysics Data System (ADS)

    Kolb, E. J.; Bailey, J.; Bishop, A.; Cain, J.; Goddard, M.; Hurowitz, K.; Kennedy, K.; Ornduff, T.; Sfraga, M.; Wernecke, J.

    2008-12-01

    The focus of Google's Geo Education outreach efforts (http://www.google.com/educators/geo.html) is on helping primary, secondary, and post-secondary educators incorporate Google Earth and Sky, Google Maps, and SketchUp into their classroom lessons. In partnership with the University of Alaska, our Geo Education team members visited several remote Alaskan high schools during a one-week period in September. At each school, we led several 40-minute hands-on learning sessions in which students used Google products to investigate local geologic and environmental processes. For the teachers, we provided several resources, including follow-on lesson plans, example KML-based lessons, useful URLs, and website resources to which multiple users can contribute. This talk will highlight results of the trip and discuss how educators can access and use Google's Geo Education resources.

  8. Known unknowns, Google Earth, plate tectonics and Mt Bellenden Ker: some thoughts on locality data.

    PubMed

    Mesibov, Robert

    2012-01-01

    Latitude/longitude data in locality records should be published with spatial uncertainties, datum(s) used and indications of how the data were obtained. Google Earth can be used to locate sampling sites, but the underlying georegistration of the satellite image should be checked. The little-known relabelling of a set of landmarks on Mt Bellenden Ker, a scientifically important collecting locality in tropical north Queensland, Australia, is documented as an example of the importance of checking records not accompanied by appropriately accurate latitude/longitude data.

  9. HTTP-based Search and Ordering Using ECHO's REST-based and OpenSearch APIs

    NASA Astrophysics Data System (ADS)

    Baynes, K.; Newman, D. J.; Pilone, D.

    2012-12-01

    Metadata is an important entity in the process of cataloging, discovering, and describing Earth science data. NASA's Earth Observing System (EOS) ClearingHOuse (ECHO) acts as the core metadata repository for EOSDIS data centers, providing a centralized mechanism for metadata and data discovery and retrieval. By supporting both ESIP's Federated Search API and its own search and ordering interfaces, ECHO provides multiple capabilities that facilitate ease of discovery of, and access to, its ever-increasing holdings. Users are able to search and export metadata in a variety of formats, including ISO 19115, JSON, and ECHO10. This presentation aims to inform technically savvy clients interested in automating search and ordering against ECHO's metadata catalog. The audience will be introduced to practical, applicable examples of end-to-end workflows that demonstrate finding, sub-setting, and ordering data bound by keyword, temporal, and spatial constraints. Interaction with the ESIP OpenSearch interface will be highlighted, as will ECHO's own REST-based API.
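    Whichever interface is used, the temporal and spatial constraints behave the same way: a granule matches when its bounding box overlaps the query box and its time range intersects the query range. A client-side sketch of that test, with illustrative field names rather than ECHO's actual metadata schema (antimeridian-crossing boxes are ignored for simplicity):

```python
def in_constraints(granule, bbox=None, start=None, end=None):
    """Client-side filter mirroring spatial/temporal search semantics.

    `granule` is a dict with 'bbox' = (west, south, east, north) and
    ISO-8601 'start'/'end' strings; ISO-8601 strings compare correctly
    as plain strings. Field names are illustrative only.
    """
    if bbox is not None:
        w, s, e, n = granule["bbox"]
        qw, qs, qe, qn = bbox
        if e < qw or w > qe or n < qs or s > qn:   # boxes do not overlap
            return False
    if start is not None and granule["end"] < start:   # ends too early
        return False
    if end is not None and granule["start"] > end:     # starts too late
        return False
    return True
```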

  10. A JavaScript API for the Ice Sheet System Model (ISSM) 4.11: towards an online interactive model for the cryosphere community

    NASA Astrophysics Data System (ADS)

    Larour, Eric; Cheng, Daniel; Perez, Gilberto; Quinn, Justin; Morlighem, Mathieu; Duong, Bao; Nguyen, Lan; Petrie, Kit; Harounian, Silva; Halkides, Daria; Hayes, Wayne

    2017-12-01

    Earth system models (ESMs) are becoming increasingly complex, requiring extensive knowledge and experience to deploy and use in an efficient manner. They run on high-performance architectures that are significantly different from the everyday environments that scientists use to pre- and post-process results (i.e., MATLAB, Python). This results in models that are hard to use for non-specialists and are increasingly specific in their application. It also makes them relatively inaccessible to the wider science community, not to mention to the general public. Here, we present a new software/model paradigm that attempts to bridge the gap between the science community and the complexity of ESMs by developing a new JavaScript application programming interface (API) for the Ice Sheet System Model (ISSM). This API allows cryosphere scientists to run ISSM on the client side of a web page within the JavaScript environment. When combined with a web server running ISSM (using a Python API), it enables the serving of ISSM computations in an easy and straightforward way. The deep integration and similarities between all the APIs in ISSM (MATLAB, Python, and now JavaScript) significantly shorten and simplify the turnaround of state-of-the-art science runs and their use by the larger community. We demonstrate our approach via a new Virtual Earth System Laboratory (VESL) website (http://vesl.jpl.nasa.gov; VESL, 2017).
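    On the wire, the client/server split described above reduces to serializing a model setup in the browser and submitting it to the server for solution. A deliberately generic sketch of that pattern; the payload shape, field names, and action name are hypothetical illustrations, not ISSM's actual wire format:

```python
import json

def make_run_request(model):
    """Serialize a model setup for submission to a compute server.

    Illustrates the browser/server contract the record describes: the
    client builds a declarative request, the server runs the model.
    All keys here are invented for the example.
    """
    return json.dumps({"action": "solve", "model": model}, sort_keys=True)

payload = make_run_request({"mesh_resolution_km": 5,
                            "solver": "stressbalance"})
request = json.loads(payload)   # what the server would decode
```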

  11. Evaluation of physical and chemical changes in pharmaceuticals flown on space missions.

    PubMed

    Du, Brian; Daniels, Vernie R; Vaksman, Zalman; Boyd, Jason L; Crady, Camille; Putcha, Lakshmi

    2011-06-01

    Efficacy and safety of medications used for the treatment of astronauts in space may be compromised by altered stability in space. We compared physical and chemical changes with time in 35 formulations contained in identical pharmaceutical kits stowed on the International Space Station (ISS) and on Earth. Active pharmaceutical ingredient (API) content was determined by ultra- and high-performance liquid chromatography after return to Earth. After stowage for 28 months in space, six medications aboard the ISS and two of the matching ground controls exhibited changes in physical variables; nine medications from the ISS and 17 from the ground met the United States Pharmacopeia (USP) acceptance criteria for API content after 28 months of storage. A higher percentage of medications from each flight kit had lower API content than the respective ground controls. The number of medications failing the API requirement increased as a function of time in space, independent of expiration date. The rate of degradation was faster in space than on the ground for many of the medications, and most solid dosage forms met the USP standard for dissolution after storage in space. Cumulative radiation dose was higher and increased with time in space, whereas temperature and humidity remained similar to those on the ground. Exposure to the chronic low dose of ionizing radiation aboard the spacecraft, as well as repackaging of solid dosage forms in flight-specific dispensers, may adversely affect the stability of pharmaceuticals. Characterization of degradation profiles of unstable formulations and identification of chemical attributes of stability in space-analog environments on Earth will facilitate development of space-hardy medications.

  12. Leveraging Earth and Planetary Datasets to Support Student Investigations in an Introductory Geoscience Course

    NASA Astrophysics Data System (ADS)

    Ryan, Jeffrey; De Paor, Declan

    2016-04-01

    Engaging undergraduates in discovery-based research during their first two years of college was a listed priority in the 2012 report of the U.S. President's Council of Advisors on Science and Technology (PCAST), and has been the focus of events and publications sponsored by the National Academies (NAS, 2015). Challenges faced in moving undergraduate courses and curricula in this direction include the paired questions of how to effectively provide such experiences to large numbers of students, and how to do so in ways that are cost- and time-efficient for institutions and instructional faculty. In the geosciences, free access to a growing number of global earth and planetary data resources and associated visualization tools permits building straightforward data interrogation and analysis activities into introductory-level courses, giving students valuable experience with the compilation and critical investigation of earth and planetary data. Google Earth provides global Earth and planetary imagery databases that span large ranges in resolution and in time, permitting easy examination of surface features on Earth, Mars, or the Moon. As well, "community" data sources (i.e., Gigapan photographic collections and 3D visualizations of geologic features, as supported by the NSF GEODE project) allow for intensive interrogation of specific geologic phenomena. Google Earth Engine provides access to rich satellite-based earth observation data, supporting student studies of weather and related phenomena. GeoMapApp, the freely available visualization tool of the Interdisciplinary Earth Data Alliance (IEDA), permits examination of the seafloor and the integration of a range of third-party data. 
    The "Earth" meteorological website (earth.nullschool.net) provides near real-time visualization of global weather and oceanic conditions, which in combination with the weather layers in Google Earth permits a deeper interrogation of atmospheric conditions. In combination, these freely accessible data resources permit one to transform general-audience geoscience courses into extended investigations in which students discover key information about the workings of our planet.

  13. Google Earth in the middle school geography classroom: Its impact on spatial literacy and place geography understanding of students

    NASA Astrophysics Data System (ADS)

    Westgard, Kerri S. W.

    Success in today's globalized, multi-dimensional, and connected world requires individuals to have a variety of skill sets -- i.e. oracy, numeracy, and literacy, as well as the ability to think spatially. Various national and international assessment results indicate that even though U.S. scores have gained over the past decade, overall student performance, including performance specific to spatial skills, is still below proficiency. Existing studies have focused on the potential of virtual learning environment technology to reach students in a variety of academic areas, but a need still exists to study specifically whether Google Earth is a more useful pedagogical tool for developing spatial literacy than currently employed methods. The purpose of this study was to determine the extent to which graphicacy achievement scores of students who were immersed in a Google Earth environment differed from those of students who were provided with only two-dimensional instruction for developing spatial skills. Situated learning theory and Piaget and Inhelder's The Child's Conception of Space provided the theoretical grounding from which this study evolved. The National Research Council's call to develop spatial literacy, as seen in Learning to Think Spatially, provided the impetus to begin the research. The target population (N = 84) for this study consisted of eighth-grade geography students at an upper-Midwest junior high school during the 2009-2010 academic year. Students were assigned to the control or experimental group based on when they had geography class. Control-group students (n = 44) used two-dimensional PowerPoint images to complete activities, while experimental-group students (n = 40) were immersed in the three-dimensional Google Earth world for activity completion. Research data were then compiled and statistically analyzed to answer the five research questions developed for this study. 
    One-way ANOVAs were run on the collected data, and no statistically significant difference was found between the control and experimental groups. However, two of the five research questions yielded practically significant data indicating that students who used Google Earth outperformed their counterparts who used PowerPoint on pattern prediction and spatial-relationship understanding.

  14. Public Use of Online Hydrology Information for Harris County and Houston, Texas, during Hurricane Harvey and Suggested Improvement for Future Flood Events

    NASA Astrophysics Data System (ADS)

    Lilly, M. R.; Feditova, A.; Levine, K.; Giardino, J. R.

    2017-12-01

    The Harris County Flood Control District has an impressive amount of information available to the public related to flood management and response. During Hurricane Harvey, this information was used by the authors to help address daily questions from family and friends living in the Houston area. Common near-real-time reporting data included precipitation and water levels. Maps included locations of data stations, stream or bayou conditions (in bank, out of bank), and watershed or drainage boundaries. In general, data station reporting and online information updated well throughout the hurricane and post-flooding period; only a few of the reporting stations had problems with water-level sensor measurements. The overall information was helpful to hydrologists and floodplain managers, but it could not easily answer all of the common questions residents may have during a flood event. Some of the more common questions were how to use the water-level information to gauge the potential extent of flooding, and the location of flooding relative to residents' homes. To help address these questions, we used Google Earth to obtain lot and intersection locations and to show the relative differences between nearby water-level stations and residences of interest. The reported resolution of the Google Earth elevation data is 1 foot. To help confirm the use of this data, we compared Google Earth approximate elevations with individual Harris County Floodplain Reference Mark reports; this check verified that we could use the Google Earth information for approximate comparisons. We also faced questions on what routes to take if evacuation was needed, and where to go to reach higher ground.
Google Earth again provided a helpful and easy-to-use interface for examining road and intersection elevations and developing suggested routes for family and friends to avoid low areas that may be subject to flooding. These and other recommendations that helped answer common questions from residents reacting to the hurricane and subsequent flooding are summarized with examples.
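    The elevation cross-check described above can be sketched in a few lines; the reference-mark names and elevation values below are invented for illustration and are not Harris County records:

```python
# Sketch: compare approximate Google Earth elevations (feet) with surveyed
# reference-mark elevations to judge whether the 1-foot data are usable.
# All values are hypothetical illustrations, not Harris County records.

ge_elevations = {"RM-1": 48.0, "RM-2": 52.0, "RM-3": 45.0}   # Google Earth reads
rm_elevations = {"RM-1": 47.6, "RM-2": 52.9, "RM-3": 44.5}   # reference-mark reports

def mean_abs_error(approx, surveyed):
    """Mean absolute difference (feet) across common reference marks."""
    diffs = [abs(approx[k] - surveyed[k]) for k in approx if k in surveyed]
    return sum(diffs) / len(diffs)

mae = mean_abs_error(ge_elevations, rm_elevations)
usable = mae <= 1.0   # within the reported 1-foot resolution
```

    If the mean absolute error stays within the stated 1-foot resolution, the Google Earth values are adequate for the approximate, relative comparisons the authors describe.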

  15. Using Google Earth to Assess Shade for Sun Protection in Urban Recreation Spaces: Methods and Results.

    PubMed

    Gage, R; Wilson, N; Signal, L; Barr, M; Mackay, C; Reeder, A; Thomson, G

    2018-05-16

    Shade in public spaces can lower the risk of sunburn and skin cancer. However, existing methods of auditing shade require travel between sites and sunny weather conditions. This study aimed to evaluate the feasibility of free computer software, Google Earth, for assessing shade in urban open spaces. A shade projection method was developed that uses Google Earth street view and aerial images to estimate shade at solar noon on the summer solstice, irrespective of the date of image capture. Three researchers used the method to separately estimate shade cover over pre-defined activity areas in a sample of 45 New Zealand urban open spaces, including 24 playgrounds, 12 beaches and 9 outdoor pools. Outcome measures included method accuracy (assessed by comparison with field observations of a subsample of 10 settings) and inter-rater reliability. Of the 164 activity areas identified in the 45 settings, most (83%) had no shade cover. The method identified most activity areas in playgrounds (85%) and beaches (93%) and was accurate for assessing shade over these areas (predictive values of 100%). Only 8% of activity areas at outdoor pools were identified, due to a lack of street view images. Reliability of the shade cover estimates was excellent (intraclass correlation coefficient of 0.97, 95% CI 0.97-0.98). Google Earth appears to be a reasonably accurate and reliable shade audit tool for playgrounds and beaches. The findings are relevant for programmes focused on supporting the development of healthy urban open spaces.
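    The geometry behind such a shade projection can be illustrated with a short, self-contained sketch; the latitude, object height, and simplified solstice declination below are illustrative assumptions, not the study's actual calculations:

```python
import math

# At solar noon the sun's elevation is approximately 90 - |latitude - declination|.
# New Zealand's summer solstice is the December solstice (declination ~ -23.44 deg).
SUMMER_SOLSTICE_DECLINATION = -23.44  # degrees; southern-hemisphere summer

def noon_elevation(latitude_deg):
    """Approximate solar elevation angle (degrees) at solar noon on the solstice."""
    return 90.0 - abs(latitude_deg - SUMMER_SOLSTICE_DECLINATION)

def shadow_length(object_height_m, latitude_deg):
    """Horizontal shadow length (m) cast at solar noon on the solstice."""
    elev = math.radians(noon_elevation(latitude_deg))
    return object_height_m / math.tan(elev)

# Hypothetical 8 m shade structure at a Wellington playground (lat ~ -41.3 deg)
length = shadow_length(8.0, -41.3)
```

    Projecting that shadow length from each shade structure onto the activity area, using the aerial image for footprints and street view for heights, gives a date-independent noon shade estimate of the kind the method describes.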

  16. New Techniques for Deep Learning with Geospatial Data using TensorFlow, Earth Engine, and Google Cloud Platform

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2017-12-01

    Recent years have seen promising results from many research teams applying deep learning techniques to geospatial data processing. In that same timeframe, TensorFlow has emerged as the most popular framework for deep learning in general, and Google has assembled petabytes of Earth observation data from a wide variety of sources and made them available in analysis-ready form in the cloud through Google Earth Engine. Nevertheless, developing and applying deep learning to geospatial data at scale has been somewhat cumbersome to date. We present a new set of tools and techniques that simplify this process. Our approach combines the strengths of several underlying tools: TensorFlow for its expressive deep learning framework; Earth Engine for data management, preprocessing, postprocessing, and visualization; and other tools in Google Cloud Platform to train TensorFlow models at scale, perform additional custom parallel data processing, and drive the entire process from a single familiar Python development environment. These tools can be used to easily apply standard deep neural networks, convolutional neural networks, and other custom model architectures to a variety of geospatial data structures. We discuss our experiences applying these and related tools to a range of machine learning problems, including classic problems like cloud detection, building detection, and land cover classification, as well as more novel problems like illegal fishing detection. Our improved tools will make it easier for geospatial data scientists to apply modern deep learning techniques to their own problems, and will also make it easier for machine learning researchers to advance the state of the art of those techniques.
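    As a toy illustration of the core operation such convolutional models apply to raster bands (not the authors' TensorFlow pipeline), a stride-1, valid-mode 2D convolution over a single band might look like:

```python
# Toy illustration of the convolution at the heart of a CNN applied to a
# single-band raster. Real pipelines (TensorFlow + Earth Engine) run many
# learned kernels over many bands; all values here are made up.

def conv2d(raster, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) over nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(raster) - kh + 1):
        row = []
        for j in range(len(raster[0]) - kw + 1):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += raster[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A 3x3 vertical-edge kernel over a 4x4 grid of reflectance values
raster = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 0, 1, 1]]
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]
edges = conv2d(raster, kernel)
```

    A trained network stacks many such filters with nonlinearities, which is why frameworks that parallelize the operation across bands, tiles, and machines matter at Earth Engine scale.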

  17. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, in fine, to create and share interactive maps for further geosciences' investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software, written in Python, a high-level, cross-platform programming language. GeolOkit software is accessible through a graphical user interface, designed to run in parallel with Google Earth. It is a user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software, except Google Earth itself. GeolOkit comes with a large number of geosciences' labels, symbols, colours and placemarks and may process: (i) multi-points data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D supporting a large range of structure input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, (vii) field pictures using either geo-tracking metadata from a camera's built-in GPS module, or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussion of your needs, ideas and contributions to the GeolOkit project.
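    The KML plotting step described above can be sketched as follows; the template, sample names, and coordinates are hypothetical stand-ins, not GeolOkit's actual code:

```python
# Minimal sketch of the kind of KML a tool like GeolOkit emits for
# georeferenced sample points; names and coordinates are illustrative only.

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
{placemarks}
  </Document>
</kml>"""

def placemark(name, lon, lat, elevation=0.0):
    """One KML <Placemark> for a point sample (KML order is lon,lat,alt)."""
    return ("    <Placemark>\n"
            f"      <name>{name}</name>\n"
            f"      <Point><coordinates>{lon},{lat},{elevation}</coordinates></Point>\n"
            "    </Placemark>")

samples = [("GPS-001", -2.35, 46.70), ("GPS-002", -2.33, 46.71)]
kml = KML_TEMPLATE.format(placemarks="\n".join(placemark(*s) for s in samples))
```

    Writing the resulting string to a `.kml` file and opening it is all that is needed for Google Earth to display the points, which is why KML works well as the exchange format for such toolboxes.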

  18. Development of a Carbon Sequestration Visualization Tool using Google Earth Pro

    NASA Astrophysics Data System (ADS)

    Keating, G. N.; Greene, M. K.

    2008-12-01

    The Big Sky Carbon Sequestration Partnership seeks to prepare organizations throughout the western United States for a possible carbon-constrained economy. Through the development of CO2 capture and subsurface sequestration technology, the Partnership is working to enable the region to cleanly utilize its abundant fossil energy resources. The intent of the Los Alamos National Laboratory Big Sky Visualization tool is to allow geochemists, geologists, geophysicists, project managers, and other project members to view, identify, and query the data collected from CO2 injection tests using a single data source platform, a task to which Google Earth Pro is well suited. The visualization framework enables fusion of data from disparate sources and allows investigators to fully explore spatial and temporal trends in CO2 fate and transport within a reservoir. 3-D subsurface wells are projected above ground in Google Earth as the KML anchor points for the presentation of various surface and subsurface data. This solution is an integrative and cost-effective one for the variety of users in the Big Sky community.

  19. International Ocean Discovery Program U.S. Implementing Organization

    Science.gov Websites

    Coordinates seagoing expeditions to study the history of the Earth recorded in sediments and rocks beneath the ocean floor. Drill sites can be viewed in Google Earth (joidesresolution.org).

  20. State of the Oceans: A Satellite Data Processing System for Visualizing Near Real-Time Imagery on Google Earth

    NASA Astrophysics Data System (ADS)

    Thompson, C. K.; Bingham, A. W.; Hall, J. R.; Alarcon, C.; Plesea, L.; Henderson, M. L.; Levoe, S.

    2011-12-01

    The State of the Oceans (SOTO) web tool was developed at NASA's Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory (JPL) as an interactive means for users to visually explore and assess ocean-based geophysical parameters extracted from the latest archived data products. The SOTO system consists of four extensible modules: a data polling tool, a preparation and imaging package, image server software, and the graphical user interface. Together, these components support multi-resolution visualization of swath (Level 2) and gridded (Level 3/4) data products as either raster- or vector-based KML layers on Google Earth. These layers are automatically updated periodically throughout the day. Current parameters include sea surface temperature, chlorophyll concentration, ocean winds, sea surface height anomaly, and sea surface temperature anomaly. SOTO also supports mash-ups, allowing KML feeds from other sources, such as hurricane tracks and buoy data, to be overlaid directly onto Google Earth. A version of the SOTO software has also been installed at Goddard Space Flight Center (GSFC) to support the Land Atmosphere Near real-time Capability for EOS (LANCE). The State of the Earth (SOTE) has similar functionality to SOTO but supports different data sets, among them the MODIS 250 m data product.

  1. Do Interactive Globes and Games Help Students Learn Planetary Science?

    NASA Astrophysics Data System (ADS)

    Coba, Filis; Burgin, Stephen; De Paor, Declan; Georgen, Jennifer

    2016-01-01

    The popularity of animations and interactive visualizations in undergraduate science education might lead one to assume that these teaching aids enhance student learning. We tested this assumption for the case of the Google Earth virtual globe with a comparison of control and treatment student groups in a general education class of over 370 students at a large public university. Earth and Planetary Science course content was developed in two formats: Keyhole Markup Language (KML), used to create interactive tours in Google Earth (the treatment group), and Portable Document Format (PDF) for on-screen reading (the control group). The PDF documents contained text and images identical to the placemark balloons or "tour stops" in the Google Earth version. Some significant differences were noted between the two groups on the immediate post-questionnaire, with the KML students outperforming the PDF students, but not on the delayed measure. In a separate but related project, we undertake preliminary investigations into methods of teaching basic concepts in planetary mantle convection using numerical simulations. The goal of this project is to develop an interface to a two-dimensional finite element model that will allow students to vary parameters such as the temperatures assigned to the boundaries of the model domain, helping them actively explore important variables that control convection.

  2. A Java API for working with PubChem datasets.

    PubMed

    Southern, Mark R; Griffin, Patrick R

    2011-03-01

    PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts, and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and the generation and display of curve fits from stored dose-response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end-user tool but as a platform for further automation and tools development. http://code.google.com/p/pubchemdb.

  3. CMR Catalog Service for the Web

    NASA Technical Reports Server (NTRS)

    Newman, Doug; Mitchell, Andrew

    2016-01-01

    With the impending retirement of Global Change Master Directory (GCMD) Application Programming Interfaces (APIs) the Common Metadata Repository (CMR) was charged with providing a collection-level Catalog Service for the Web (CSW) that provided the same level of functionality as GCMD. This talk describes the capabilities of the CMR CSW API with particular reference to the support of the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) Integrated Catalog (CWIC).

  4. Google Haul Out: Earth Observation Imagery and Digital Aerial Surveys in Coastal Wildlife Management and Abundance Estimation

    PubMed Central

    Moxley, Jerry H.; Bogomolni, Andrea; Hammill, Mike O.; Moore, Kathleen M. T.; Polito, Michael J.; Sette, Lisa; Sharp, W. Brian; Waring, Gordon T.; Gilbert, James R.; Halpin, Patrick N.; Johnston, David W.

    2017-01-01

    As the sampling frequency and resolution of Earth observation imagery increase, there are growing opportunities for novel applications in population monitoring. New methods are required to apply established analytical approaches to data collected from new observation platforms (e.g., satellites and unmanned aerial vehicles). Here, we present a method that estimates regional seasonal abundances for an understudied and growing population of gray seals (Halichoerus grypus) in southeastern Massachusetts, using opportunistic observations in Google Earth imagery. Abundance estimates are derived from digital aerial survey counts by adapting established correction-based analyses with telemetry behavioral observation to quantify survey biases. The result is a first regional understanding of gray seal abundance in the northeast US through opportunistic Earth observation imagery and repurposed animal telemetry data. As species observation data from Earth observation imagery become more ubiquitous, such methods provide a robust, adaptable, and cost-effective solution to monitoring animal colonies and understanding species abundances. PMID:29599542

  5. Google Haul Out: Earth Observation Imagery and Digital Aerial Surveys in Coastal Wildlife Management and Abundance Estimation.

    PubMed

    Moxley, Jerry H; Bogomolni, Andrea; Hammill, Mike O; Moore, Kathleen M T; Polito, Michael J; Sette, Lisa; Sharp, W Brian; Waring, Gordon T; Gilbert, James R; Halpin, Patrick N; Johnston, David W

    2017-08-01

    As the sampling frequency and resolution of Earth observation imagery increase, there are growing opportunities for novel applications in population monitoring. New methods are required to apply established analytical approaches to data collected from new observation platforms (e.g., satellites and unmanned aerial vehicles). Here, we present a method that estimates regional seasonal abundances for an understudied and growing population of gray seals (Halichoerus grypus) in southeastern Massachusetts, using opportunistic observations in Google Earth imagery. Abundance estimates are derived from digital aerial survey counts by adapting established correction-based analyses with telemetry behavioral observation to quantify survey biases. The result is a first regional understanding of gray seal abundance in the northeast US through opportunistic Earth observation imagery and repurposed animal telemetry data. As species observation data from Earth observation imagery become more ubiquitous, such methods provide a robust, adaptable, and cost-effective solution to monitoring animal colonies and understanding species abundances.

  6. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capability. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
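    A request against such a WPS endpoint might be constructed as below; the node URL, operation identifier, and datainputs encoding are hypothetical stand-ins, sketched from the generic WPS key-value convention rather than the official CWT specification:

```python
from urllib.parse import urlencode

# Sketch of a WPS Execute request for a server-side average, in the spirit
# of the ESGF CWT API. The endpoint and identifier are hypothetical; consult
# the actual CWT documentation for the real parameter encodings.

def build_wps_execute(base_url, identifier, datainputs):
    """Build a key-value-pair WPS 1.0.0 Execute request URL."""
    params = {
        "service": "WPS",
        "request": "Execute",
        "version": "1.0.0",
        "identifier": identifier,
        "datainputs": datainputs,
    }
    return base_url + "?" + urlencode(params)

url = build_wps_execute(
    "https://esgf-node.example.org/wps",        # hypothetical data node
    "CDAT.average",                             # hypothetical operation id
    '[variable=[{"uri":"tas.nc","id":"tas"}]]', # hypothetical input spec
)
```

    The point of the design is visible in the URL itself: the request names an operation and a dataset already resident at the data node, so only the small result, not the input data, ever travels to the user.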

  7. Geolokit: An interactive tool for visualising and exploring geoscientific data in Google Earth

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Watlet, Arnaud; Bastin, Christophe

    2017-10-01

    Virtual globes have been developed to showcase different types of data, combining a digital elevation model and basemaps of high-resolution satellite imagery. Hence, they became a standard for sharing spatial data and information, although they suffer from a lack of toolboxes dedicated to the formatting of large geoscientific datasets. From this perspective, we developed Geolokit: a free and lightweight software that allows geoscientists - and every scientist working with spatial data - to import their data (e.g., sample collections, structural geology, cross-sections, field pictures, georeferenced maps), to handle them, and to transcribe them to Keyhole Markup Language (KML) files. KML files are then automatically opened in the Google Earth virtual globe and the spatial data accessed and shared. Geolokit comes with a large number of dedicated tools that can process and display: (i) multi-points data, (ii) scattered data interpolations, (iii) structural geology features in 2D and 3D, (iv) rose diagrams, stereonets and dip-plunge polar histograms, (v) cross-sections and oriented rasters, (vi) georeferenced field pictures, (vii) georeferenced maps and projected gridding. Therefore, together with Geolokit, Google Earth becomes not only a powerful georeferenced data viewer but also a stand-alone work platform. The toolbox (available online at http://www.geolokit.org) is written in Python, a high-level, cross-platform programming language, and is accessible through a graphical user interface designed to run in parallel with Google Earth, through a workflow that requires no additional third-party software. Geolokit features are demonstrated in this paper using typical datasets gathered from two case studies illustrating its applicability at multiple scales of investigation: a petro-structural investigation of the Ile d'Yeu orthogneissic unit (Western France) and data collection from the Mariana oceanic subduction zone (Western Pacific).
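    The counting behind one of these displays, a rose diagram, can be sketched as a simple azimuth histogram; the strike values and sector width below are invented, and real strike data would normally also be mirrored by 180 degrees:

```python
# Sketch of the binning behind a rose diagram: count strike azimuths into
# fixed angular sectors. Sample azimuths are invented for illustration.

def rose_bins(azimuths_deg, sector_width=30):
    """Histogram of azimuths (degrees, 0-360) into sectors of sector_width."""
    n_sectors = 360 // sector_width
    counts = [0] * n_sectors
    for az in azimuths_deg:
        counts[int(az % 360) // sector_width] += 1
    return counts

# Hypothetical strike measurements from a field campaign
strikes = [10, 15, 22, 95, 100, 185, 190, 200, 355]
counts = rose_bins(strikes)
```

    Each sector count then becomes the radius of a petal; a toolbox like Geolokit would render the petals as KML polygons centred on the measurement site.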

  8. Multi-temporal Land Use Mapping of Coastal Wetlands Area using Machine Learning in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Farda, N. M.

    2017-12-01

    Coastal wetlands provide ecosystem services essential to people and the environment. Changes in coastal wetlands, especially in land use, are important to monitor by utilizing multi-temporal imagery. Google Earth Engine (GEE) provides many machine learning algorithms (10 algorithms) that are very useful for extracting land use from imagery. The research objective is to explore machine learning in Google Earth Engine and its accuracy for multi-temporal land use mapping of a coastal wetland area. Landsat 3 MSS (1978), Landsat 5 TM (1991), Landsat 7 ETM+ (2001), and Landsat 8 OLI (2014) images located in the Segara Anakan lagoon were selected to represent multi-temporal images. The inputs for machine learning are visible and near-infrared bands, PCA bands, inverse PCA bands, bare soil index, vegetation index, wetness index, elevation from ASTER GDEM, and GLCM (Haralick) texture, along with polygon samples in 140 locations. Ten machine learning algorithms were applied to extract coastal wetland land use from the Landsat imagery: Fast Naive Bayes, CART (Classification and Regression Tree), Random Forests, GMO Max Entropy, Perceptron (Multi Class Perceptron), Winnow, Voting SVM, Margin SVM, Pegasos (Primal Estimated sub-GrAdient SOlver for SVM), and IKPamir (Intersection Kernel Passive Aggressive Method for Information Retrieval, SVM). Machine learning in Google Earth Engine is very helpful for multi-temporal land use mapping; the highest accuracy for land use mapping of the coastal wetland was achieved by CART, with 96.98% overall accuracy using K-Fold Cross Validation (K = 10). GEE is particularly useful for multi-temporal land use mapping, with ready-to-use imagery and classification algorithms, and offers scope for many other applications.
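    The overall accuracy figure reported above is conventionally computed from a confusion matrix; a minimal sketch with toy class counts (not the study's actual CART results) is:

```python
# Overall accuracy as used in classifier comparisons: the trace of a
# confusion matrix divided by its total. The matrix below is a toy
# example, not the study's actual CART output.

def overall_accuracy(confusion):
    """confusion[i][j] = samples of true class i predicted as class j."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# rows/cols: hypothetical classes, e.g. water, mangrove, dryland farming
conf = [[50, 2, 0],
        [1, 40, 3],
        [0, 2, 30]]
acc = overall_accuracy(conf)
```

    Under K-fold cross validation the same statistic is computed on each held-out fold and averaged, which is how a single figure like 96.98% summarizes a classifier's performance.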

  9. Optimizing Travel Time to Outpatient Interventional Radiology Procedures in a Multi-Site Hospital System Using a Google Maps Application.

    PubMed

    Mandel, Jacob E; Morel-Ovalle, Louis; Boas, Franz E; Ziv, Etay; Yarmohammadi, Hooman; Deipolyi, Amy; Mohabir, Heeralall R; Erinjeri, Joseph P

    2018-02-20

    The purpose of this study is to determine whether a custom Google Maps application can optimize site selection when scheduling outpatient interventional radiology (IR) procedures within a multi-site hospital system. The Google Maps for Business Application Programming Interface (API) was used to develop an internal web application that uses real-time traffic data to determine the estimated travel time (ETT; minutes) and estimated travel distance (ETD; miles) from a patient's home to each nearby IR facility in our hospital system. Hypothetical patient home addresses based on the 33 cities comprising our institution's catchment area were used to determine the optimal IR site for hypothetical patients traveling from each city under real-time traffic conditions. For 10/33 (30%) cities, there was discordance between the optimal IR site based on ETT and the optimal IR site based on ETD at non-rush hour or rush hour times. By choosing to travel to an IR site based on ETT rather than ETD, patients from discordant cities were predicted to save an average of 7.29 min during non-rush hour (p = 0.03) and 28.80 min during rush hour (p < 0.001). Using a custom Google Maps application to schedule outpatients for IR procedures can effectively reduce patient travel time when more than one location providing IR procedures is available within the same hospital system.
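    The scheduling decision at the heart of the application, choosing a site by travel time rather than distance, can be sketched with invented travel figures (the real tool queried the Google Maps API for live traffic):

```python
# Sketch of the site-selection rule in the study: pick the IR site with
# the lowest estimated travel time (ETT) rather than the shortest distance
# (ETD). Site names and travel figures are invented for illustration.

def best_site(sites, metric):
    """Return the site name minimizing the given metric ('ett' or 'etd')."""
    return min(sites, key=lambda name: sites[name][metric])

# site -> {"ett": minutes under current traffic, "etd": miles}
sites = {
    "Site A": {"ett": 55, "etd": 12},   # closer, but congested
    "Site B": {"ett": 35, "etd": 21},   # farther, but faster
}
by_time = best_site(sites, "ett")
by_distance = best_site(sites, "etd")
discordant = by_time != by_distance
```

    Whenever the two choices disagree, as in this toy pair, routing by ETT rather than ETD yields the kind of time savings the study reports for its discordant cities.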

  10. Visualize Your Data with Google Fusion Tables

    NASA Astrophysics Data System (ADS)

    Brisbin, K. E.

    2011-12-01

    Google Fusion Tables is a modern data management platform that makes it easy to host, manage, collaborate on, visualize, and publish tabular data online. Fusion Tables allows users to upload their own data to the Google cloud, which they can then use to create compelling and interactive visualizations. Users can view data on a Google Map, plot data in a line chart, or display data along a timeline. Users can share these visualizations with others to explore and discover interesting trends about various types of data, including scientific data such as invasive species or global trends in disease. Fusion Tables has been used by many organizations to visualize a variety of scientific data. One example is the California Redistricting Map created by the LA Times (http://goo.gl/gwZt5). The Pacific Institute and Circle of Blue have used Fusion Tables to map the quality of water around the world (http://goo.gl/T4SX8), and the World Resources Institute mapped the threat level of coral reefs (http://goo.gl/cdqe8). This session will cover all the steps necessary to use Fusion Tables to create a variety of interactive visualizations. Attendees will begin by learning about the various options for uploading data into Fusion Tables, including Shapefile, KML file, and CSV file import. Attendees will then learn how to use Fusion Tables to manage their data by merging it with other data and controlling the permissions of the data. Finally, the session will cover how to create a customized visualization from the data and share that visualization with others using both Fusion Tables and the Google Maps API.

  11. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity: to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for Earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging, as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
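    One idiom the platform makes easy, the per-pixel reduction behind a cloud-free composite, can be sketched in plain Python with tiny invented rasters (Earth Engine performs the same reduction in parallel across its datacenters):

```python
from statistics import median

# Sketch of the per-pixel reduction behind a cloud-free composite: take the
# median across a stack of co-registered scenes so transient bright clouds
# drop out. The 2x2 reflectance "scenes" below are invented.

def median_composite(scenes):
    """Per-pixel median across a list of equally-sized 2D rasters."""
    rows, cols = len(scenes[0]), len(scenes[0][0])
    return [[median(scene[i][j] for scene in scenes)
             for j in range(cols)] for i in range(rows)]

scenes = [
    [[0.10, 0.12], [0.11, 0.95]],   # cloud in the lower-right pixel
    [[0.10, 0.90], [0.11, 0.13]],   # cloud in the upper-right pixel
    [[0.11, 0.12], [0.10, 0.14]],   # clear scene
]
composite = median_composite(scenes)
```

    Because each output pixel depends only on its own stack of inputs, the reduction parallelizes trivially, which is exactly the property that lets a platform like Earth Engine build a global composite from millions of scenes.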

  12. NASA Radar Images Show Continued Deformation from Mexico Quake

    NASA Image and Video Library

    2010-08-04

    This image shows a UAVSAR interferogram swath overlaid atop a Google Earth image. The new NASA airborne radar images show the continuing deformation of Earth's surface resulting from the magnitude 7.2 temblor in Baja California on April 4, 2010.

  13. Monitoring Urban Heat Island Through Google Earth Engine: Potentialities and Difficulties in Different Cities of the United States

    NASA Astrophysics Data System (ADS)

    Ravanelli, R.; Nascetti, A.; Cirigliano, R. V.; Di Rico, C.; Monti, P.; Crespi, M.

    2018-04-01

    The aim of this work is to exploit the large-scale analysis capabilities of the innovative Google Earth Engine platform in order to investigate the temporal variations of the Urban Heat Island (UHI) phenomenon as a whole. An intuitive methodology implementing a large-scale correlation analysis between Land Surface Temperature (LST) and Land Cover alterations was thus developed. The results obtained for the Phoenix metropolitan area are promising and show how urbanization heavily affects the magnitude of the UHI effects, with significant increases in LST. The proposed methodology is therefore able to efficiently monitor the UHI phenomenon.
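    The correlation step of such a methodology can be sketched with the standard library; the impervious-surface fractions and LST values below are invented for illustration, not the study's Phoenix data:

```python
import math

# Sketch of the study's core step: correlate Land Surface Temperature with
# a land-cover metric (here, an invented impervious-surface fraction per
# city block) using Pearson's r, computed with the stdlib only.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

impervious = [0.1, 0.3, 0.5, 0.7, 0.9]       # fraction built-up per block
lst_c      = [31.0, 33.5, 35.0, 37.5, 39.0]  # degrees Celsius
r = pearson_r(impervious, lst_c)
```

    A strongly positive r across many blocks and dates is the signal the methodology looks for: land-cover alteration toward built-up surfaces tracking rising surface temperature.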

  14. Changes of Earthquake Vulnerability of Marunouchi and Ginza Area in Tokyo and Urban Recovery Digital Archives on Google Earth

    NASA Astrophysics Data System (ADS)

    Igarashi, Masayasu; Murao, Osamu

    In this paper, the authors develop a multiple regression model which estimates urban earthquake vulnerability (building collapse risk and conflagration risk) for different eras, and clarify the historical changes of urban risk in Marunouchi and Ginza Districts in Tokyo, Japan using old maps and contemporary geographic information data. Also, we compare the change of urban vulnerability of the districts with the significant historical events in Tokyo. Finally, the results are loaded onto Google Earth with timescale extension to consider the possibility of urban recovery digital archives in the era of the recent geoinformatic technologies.

  15. Google-Earth Based Visualizations for Environmental Flows and Pollutant Dispersion in Urban Areas

    PubMed Central

    Liu, Daoming; Kenjeres, Sasa

    2017-01-01

    In the present study, we address the development and application of an efficient tool for the conversion of results obtained by an integrated computational fluid dynamics (CFD) and computational reaction dynamics (CRD) approach and their visualization in Google Earth. We focus on results typical of environmental fluid mechanics studies at a city scale, which include characteristic wind flow patterns and the dispersion of reactive scalars. This is achieved by developing a Java-based code which converts the typical four-dimensional structure (spatial and temporal dependency) of the results into the Keyhole Markup Language (KML) format. The visualization techniques most often used are revisited and implemented in the conversion tool. The potential of the tool is demonstrated in a case study of smog formation due to intense traffic emissions in Rotterdam (The Netherlands). It is shown that Google Earth can provide a computationally efficient and user-friendly means of data representation. This feature can be very useful for the visualization of pollution at street level, which is of great importance to city residents. Various meteorological conditions and traffic emission scenarios can be easily visualized and analyzed, providing a powerful, user-friendly tool for traffic regulation and urban climate adaptation. PMID:28257078
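    The essence of such a conversion is wrapping each time step's values in KML elements carrying a TimeSpan. A minimal Python sketch (standing in for the authors' Java code; coordinates, times and concentrations are hypothetical) shows the idea:

```python
def kml_timestep(lon, lat, value, t_begin, t_end):
    """Return a KML Placemark tagging one scalar sample with a time span."""
    return f"""  <Placemark>
    <TimeSpan><begin>{t_begin}</begin><end>{t_end}</end></TimeSpan>
    <description>concentration: {value} ug/m3</description>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>"""

def to_kml(samples):
    """samples: iterable of (lon, lat, value, t_begin, t_end) tuples."""
    body = "\n".join(kml_timestep(*s) for s in samples)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            f"{body}\n</Document>\n</kml>")

# Two hypothetical hourly samples near Rotterdam.
doc = to_kml([
    (4.47, 51.92, 41.2, "2017-01-01T08:00Z", "2017-01-01T09:00Z"),
    (4.48, 51.93, 38.7, "2017-01-01T09:00Z", "2017-01-01T10:00Z"),
])
print(doc)
```

Loaded into Google Earth, placemarks tagged this way appear and disappear as the time slider moves, giving the animated effect described above.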

  16. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    PubMed

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth and exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel and loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes were initially mapped, and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. Of the homes visited, 16.4% were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field-locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent, and only rarely was local knowledge required to identify and locate households.
This method provides an important technique that can be applied to other developing countries where a randomized study design is needed but infrastructure is lacking to implement more traditional participant selection methods.
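    The selection step itself reduces to drawing a fixed-size random subset of the mapped homes. A minimal Python sketch (standing in for the Excel workflow described above; the coordinates are hypothetical) draws 96 candidate sites from 537 mapped homes:

```python
import random

# Hypothetical homes digitized from Google Earth imagery: (id, lat, lon).
homes = [(i, 18.99 + i * 1e-4, -72.54 - i * 1e-4) for i in range(537)]

random.seed(42)                          # reproducible draw for the survey team
survey_sites = random.sample(homes, 96)  # 96 candidate survey locations

# Emit waypoints in a simple "id lat lon" form a GPS loader might accept.
for hid, lat, lon in survey_sites[:3]:
    print(f"WPT{hid:03d} {lat:.5f} {lon:.5f}")
```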

  17. An integrated WebGIS framework for volunteered geographic information and social media in soil and water conservation.

    PubMed

    Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, the Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and to explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  18. An Integrated WebGIS Framework for Volunteered Geographic Information and Social Media in Soil and Water Conservation

    NASA Astrophysics Data System (ADS)

    Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS have the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) to develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, the Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and to explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  19. compuGUT: An in silico platform for simulating intestinal fermentation

    NASA Astrophysics Data System (ADS)

    Moorthy, Arun S.; Eberl, Hermann J.

    The microbiota inhabiting the colon and its effect on health is a topic of significant interest. In this paper, we describe the compuGUT, a simulation tool developed to assist in exploring interactions between intestinal microbiota and their environment. The primary numerical machinery is implemented in C, and the accessory scripts for loading and visualization are prepared in Bash (Linux) and R. The SUNDIALS libraries are employed for numerical integration, and the googleVis API for interactive visualization. Supplementary material includes a concise description of the underlying mathematical model and a detailed characterization of the numerical errors and computing times associated with implementation parameters.

  20. Cartographic analyses of geographic information available on Google Earth Images

    NASA Astrophysics Data System (ADS)

    Oliveira, J. C.; Ramos, J. R.; Epiphanio, J. C.

    2011-12-01

    The purpose was to evaluate the planimetric accuracy of satellite images available in the Google Earth database. These images cover the vicinity of the Federal University of Viçosa, Minas Gerais, Brazil. The methodology evaluated the geographic information of three groups of images, defined according to the level of detail presented on screen (zoom). These groups were labeled Zoom 1000 (a single image for the entire study area), Zoom 100 (a mosaic of 73 images), and Zoom 100 with geometric correction (the same mosaic after a geometric correction using control points). For each group, cartographic accuracy was measured based on statistical analyses and the parameters of Brazilian law for planimetric mapping. For this evaluation, 22 points were identified in each group of images, and the coordinates of each point were compared to field coordinates obtained by GPS (Global Positioning System). Table 1 shows results for accuracy (based on a threshold equal to 0.5 mm × mapping scale) and tendency (abscissa and ordinate) between the image coordinates and the field coordinates. The geometric correction applied to the Zoom 100 group reduced the trends identified earlier, and the statistical tests indicated the usefulness of the data for mapping at a scale of 1/5,000 with error smaller than 0.5 mm × scale. The analyses demonstrated the quality of the cartographic data provided by Google, as well as the possibility of reducing the positional divergences present in the data. It can be concluded that it is possible to obtain geographic information from the database available on Google Earth; however, the level of detail (zoom) used at the time of viewing and capturing information on the screen influences the cartographic quality of the mapping. 
Despite the cartographic and thematic potential of the database, it is important to note that both the software and the data distributed by Google Earth are subject to use and distribution policies.
    Table 1 - PLANIMETRIC ANALYSIS
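    The accuracy test described above is a simple computation. A hedged sketch (the check-point coordinates are hypothetical, in metres) evaluates the 0.5 mm × scale criterion for a 1/5,000 map:

```python
from math import sqrt, hypot

# Hypothetical check points: (E_image, N_image, E_gps, N_gps) in metres.
points = [
    (733010.2, 7702500.5, 733011.0, 7702501.1),
    (733450.7, 7702880.3, 733449.9, 7702879.6),
    (733900.1, 7703210.8, 733901.3, 7703211.5),
]

scale = 5000                  # 1/5,000 mapping scale
threshold_m = 0.5e-3 * scale  # 0.5 mm at map scale -> 2.5 m on the ground

errors = [hypot(ei - eg, ni - ng) for ei, ni, eg, ng in points]
rmse = sqrt(sum(e * e for e in errors) / len(errors))

print(f"RMSE = {rmse:.2f} m, threshold = {threshold_m:.2f} m")
print("meets 1/5,000 accuracy" if rmse <= threshold_m else "fails")
```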

  1. Multi-Instrument Tools and Services to Access NASA Earth Science Data from the GSFC Earth Sciences Data and Information Services Center

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Leptoukh, Greg; Lynnes, Chris

    2010-01-01

    The purpose of this presentation is to describe multi-instrument tools and services that facilitate access and usability of NASA Earth science data at Goddard Space Flight Center (GSFC). NASA's Earth observing system includes 14 satellites. Topics include EOSDIS facilities and system architecture, an overview of the GSFC Earth Science Data and Information Services Center (GES DISC) mission, Mirador data search, Giovanni, multi-instrument data exploration, Google Earth[TM], data merging, and applications.

  2. A Web-based Google-Earth Coincident Imaging Tool for Satellite Calibration and Validation

    NASA Astrophysics Data System (ADS)

    Killough, B. D.; Chander, G.; Gowda, S.

    2009-12-01

    The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS) to meet the needs of its nine “Societal Benefit Areas”, of which the most demanding, in terms of accuracy, is climate. To accomplish this vision, satellite on-orbit and ground-based data calibration and validation (Cal/Val) of Earth observation measurements are critical to our scientific understanding of the Earth system. Existing tools supporting space mission Cal/Val are often developed for specific campaigns or events with little view toward broad application. This paper describes a web-based Google Earth tool for the calculation of coincident satellite observations, with the intention of supporting a diverse international group of satellite missions to improve data continuity, interoperability, and data fusion. The Committee on Earth Observing Satellites (CEOS), which includes 28 space agencies and 20 other national and international organizations, is currently operating and planning over 240 Earth observation satellites for the next 15 years. The technology described here will better enable the use of multiple sensors to promote increased coordination toward a GEOSS. The CEOS Systems Engineering Office (SEO) and the Working Group on Calibration and Validation (WGCV) support the development of the CEOS Visualization Environment (COVE) tool to enhance international coordination of data exchange, mission planning, and Cal/Val events. The objective is to develop a simple and intuitive application that leverages the capabilities of Google Earth on the web to display satellite sensor coverage areas and identify coincident scene locations, along with dynamic menus for flexibility and content display. Key features and capabilities include user-defined evaluation periods (start and end dates), regions of interest (rectangular areas), and multi-user collaboration. 
Users can select two or more CEOS missions from a database including Satellite Tool Kit (STK)-generated orbit information and perform rapid calculations to identify coincident scenes where the ground tracks of the CEOS mission instrument fields of view intersect. Calculated results are displayed on a customized Google Earth web interface to view location and time information, along with optional output to Excel table format. In addition, multiple viewports can be used for comparisons. COVE was first introduced to the CEOS WGCV community in May 2009. Since that time, the development of a prototype version has progressed. It is anticipated that the capabilities and applications of COVE can support a variety of international Cal/Val activities as well as provide general information on Earth observation coverage for education and societal benefit. This project demonstrates the utility of a systems engineering tool with broad international appeal for enhanced communication and data evaluation opportunities among international CEOS agencies. The COVE tool is publicly accessible via NASA servers.
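    The coincidence calculation at the heart of such a tool can be sketched simply. Assuming two precomputed ground tracks as (time, lat, lon) samples (all values hypothetical, not STK output), a pair of observations is coincident when the points fall within distance and time tolerances:

```python
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    a = (sin((p2 - p1) / 2) ** 2
         + cos(p1) * cos(p2) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def coincident(track_a, track_b, max_km=50.0, max_dt=timedelta(hours=1)):
    """Return (time_a, time_b) pairs whose ground-track points nearly coincide."""
    return [(ta, tb)
            for ta, lat_a, lon_a in track_a
            for tb, lat_b, lon_b in track_b
            if abs(ta - tb) <= max_dt
            and haversine_km(lat_a, lon_a, lat_b, lon_b) <= max_km]

# Hypothetical ground-track samples for two missions.
t0 = datetime(2009, 6, 1, 10, 0)
track_a = [(t0, 36.0, -115.0), (t0 + timedelta(minutes=50), 40.0, -100.0)]
track_b = [(t0 + timedelta(minutes=10), 36.2, -114.8),
           (t0 + timedelta(hours=3), 10.0, 20.0)]

pairs = coincident(track_a, track_b)
print(pairs)
```

A production tool would of course interpolate orbits and intersect instrument swath polygons rather than compare sample points, but the windowed time/distance test is the same idea.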

  3. Using Google Streetview Panoramic Imagery for Geoscience Education

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Dordevic, M. M.

    2014-12-01

    Google Streetview is a feature of Google Maps and Google Earth that allows viewers to switch from map or satellite view to 360° panoramic imagery recorded close to the ground. Most panoramas are recorded by Google engineers using special cameras mounted on the roofs of cars. Bicycles, snowmobiles, and boats have also been used, and sometimes the camera has been mounted on a backpack for off-road use by hikers and skiers or attached to scuba-diving gear for "Underwater Streetview (sic)." Streetview panoramas are linked together so that the viewer can change viewpoint by clicking forward and reverse buttons, creating a 4-D touring effect. As part of the GEODE project ("Google Earth for Onsite and Distance Education"), we are experimenting with the use of Streetview imagery for geoscience education. Our web-based test application allows instructors to select locations for students to study. Students are presented with a set of questions or tasks that they must address by studying the panoramic imagery. Questions include identification of rock types, structures such as faults, and the general geological setting. The student view is locked into Streetview mode until they submit their answers, whereupon the map and satellite views become available, allowing students to zoom out and verify their location on Earth. Student learning is scaffolded by automatic computerized feedback. Many existing Streetview panoramas have rich geological content. Additionally, instructors and members of the general public can create panoramas, including 360° Photo Spheres, by stitching images taken with their mobile devices and submitting them to Google for evaluation and hosting. A multi-thousand-dollar, multi-directional camera and mount can be purchased from DIY-streetview.com, allowing power users to generate their own high-resolution panoramas. A cheaper 360° video camera is soon to be released, according to geonaute.com. 
Thus there are opportunities for geoscience educators both to use existing Streetview imagery and to generate new imagery for specific locations of geological interest. The GEODE team includes the authors and: H. Almquist, C. Bentley, S. Burgin, C. Cervato, G. Cooper, P. Karabinos, T. Pavlis, J. Piatek, B. Richards, J. Ryan, R. Schott, K. St. John, B. Tewksbury, and S. Whitmeyer.

  4. Exploring Research Contributions of the North American Carbon Program using Google Earth and Google Map

    NASA Astrophysics Data System (ADS)

    Griffith, P. C.; Wilcox, L. E.; Morrell, A.

    2009-12-01

    The central objective of the North American Carbon Program (NACP), a core element of the US Global Change Research Program, is to quantify the sources and sinks of carbon dioxide, carbon monoxide, and methane in North America and adjacent ocean regions. The NACP consists of a wide range of investigators at universities and federal research centers. Although many of these investigators have worked together in the past, many have had few prior interactions and may not know of similar work within their knowledge domains, much less across the diversity of environments and scientific approaches in the Program. Coordinating interactions and sharing data are major challenges in conducting the NACP. The Google Earth and Google Map Collections on the NACP website (www.nacarbon.org) provide a geographical view of the research products contributed by each core and affiliated NACP project. Other relevant data sources (e.g. AERONET, LVIS) can also be browsed in spatial context with NACP contributions. Each contribution links to project-oriented metadata, or “project profiles”, that provide a greater understanding of the scientific and social context of each dataset and are an important means of communicating within the NACP and to the larger carbon cycle science community. Project profiles store information such as a project's title, leaders, participants, an abstract, keywords, funding agencies, associated intensive campaigns, expected data products, data needs, publications, and URLs to associated data centers, datasets, and metadata. Data products are research contributions that include biometric inventories, flux tower estimates, remote sensing land cover products, tools, services, and model inputs / outputs. Project leaders have been asked to identify these contributions to the site level whenever possible, either through a simple latitude/longitude pair, or by uploading a KML, KMZ, or shapefile. 
Project leaders may select custom icons to graphically categorize their contributions; for example, a ship for oceanographic samples, a tower for tower measurements. After post-processing, research contributions are added to the NACP Google Earth and Google Map Collection to facilitate discovery and use in synthesis activities of the Program.
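    A site-level contribution with a custom icon amounts to a few lines of KML. A minimal Python sketch (the site name, coordinates, and icon URL are hypothetical) generates a placemark of the kind described:

```python
def placemark(name, lat, lon, icon_url):
    """Return a KML Placemark with a custom icon for one research site."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Style><IconStyle><Icon><href>{icon_url}</href></Icon></IconStyle></Style>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

# Hypothetical flux-tower site with a tower icon.
kml = placemark("Example flux tower", 45.204, -68.74,
                "http://example.org/icons/tower.png")
print(kml)
```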

  5. A Java API for working with PubChem datasets

    PubMed Central

    Southern, Mark R.; Griffin, Patrick R.

    2011-01-01

    Summary: PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and generation and display of curve fits from stored dose–response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end-user tool but as a platform for further automation and tools development. Availability: http://code.google.com/p/pubchemdb Contact: southern@scripps.edu PMID:21216779
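    The wrapper idea can be illustrated without the Java library itself. A hedged Python sketch builds an NCBI eUtilities esearch URL against the PubChem BioAssay Entrez database ("pcassay"); the query term is hypothetical:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(db, term, retmax=20):
    """Build an eUtilities esearch URL returning record IDs for a query term."""
    return f"{EUTILS}?{urlencode({'db': db, 'term': term, 'retmax': retmax})}"

# Search the BioAssay database for assays mentioning a target of interest.
url = esearch_url("pcassay", "protease inhibitor")
print(url)
```

Fetching this URL returns an XML (or, with `retmode=json`, JSON) list of BioAssay IDs that a wrapper can then resolve into full records.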

  6. The global blue-sky albedo change between 2000 - 2015 seen from MODIS

    NASA Astrophysics Data System (ADS)

    Chrysoulakis, N.; Mitraka, Z.; Gorelick, N.

    2016-12-01

    The land surface albedo is a critical physical variable, which influences the Earth's climate by affecting the energy budget and distribution in the Earth-atmosphere system. Blue-sky albedo estimates provide a quantitative means for better constraining global and regional scale climate models. The Moderate Resolution Imaging Spectroradiometer (MODIS) albedo product includes parameters for the estimation of both the directional-hemispherical surface reflectance (black-sky albedo) and the bi-hemispherical surface reflectance (white-sky albedo). This dataset was used here for the blue-sky albedo estimation over the globe on an 8-day basis at 0.5 km spatial resolution for the whole time period covered by MODIS acquisitions (i.e. 2000 until today). To estimate the blue-sky albedo, the fraction of diffuse radiation, a function of the Aerosol Optical Thickness (AOT), is needed. Required AOT information was acquired from the MODIS AOT product at 1° × 1° spatial resolution. Since the blue-sky albedo depends on the solar zenith angle (SZA), the 8-day mean blue-sky albedo values were computed as averages of the corresponding values for the representative SZAs covering the 24-hour day. The estimated blue-sky albedo time series was analyzed to capture changes over the 15-year period. All computations were performed using the Google Earth Engine (GEE). GEE provided access to all the MODIS products needed for the analysis without the need for searching or downloading. Moreover, the combination of MODIS products in both temporal and spatial terms was fast and effective using the GEE API (Application Program Interface). All the products covering the globe and the time period of 15 years were processed via a single collection. Most importantly, GEE allowed for including the calculation of SZAs covering the 24-hour day, which improves the quality of the overall product. The 8-day global products of land surface albedo are available through http://www.rslab.gr/downloads.html
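    The blue-sky estimate described above is commonly computed as a diffuse-fraction-weighted blend of the two MODIS albedo quantities. A minimal sketch (the input values are illustrative, not MODIS data):

```python
def blue_sky_albedo(black_sky, white_sky, diffuse_fraction):
    """Blend directional- and bi-hemispherical albedo by the diffuse fraction.

    blue = (1 - f) * black_sky + f * white_sky, with f in [0, 1]
    derived from aerosol optical thickness and solar zenith angle.
    """
    if not 0.0 <= diffuse_fraction <= 1.0:
        raise ValueError("diffuse fraction must be within [0, 1]")
    return (1.0 - diffuse_fraction) * black_sky + diffuse_fraction * white_sky

# Illustrative mid-visible values over vegetation.
print(blue_sky_albedo(black_sky=0.13, white_sky=0.15, diffuse_fraction=0.2))
```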

  7. There's An App For That: Planning Ahead for the Solar Eclipse in August 2017

    NASA Astrophysics Data System (ADS)

    Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bell, Steve

    2017-01-01

    With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an Android application, available on Google Play. Over the course of the eclipse, as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse 2017 Android application reports the time, Sun's altitude and azimuth, and the event's position and vertex angles. The app also lists the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. All of the data available in the app comes from the flexible USNO Solar Eclipse Computer Application Programming Interface (API), which produces JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or custom applications. Additional information is available in the on-line documentation (http://aa.usno.navy.mil/data/docs/api.php). For those who prefer using a traditional data input form, the local circumstances can still be requested at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO). Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
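    Consuming such a JSON API typically means mapping contact events out of the response. A sketch with a hypothetical, abbreviated payload (not the actual USNO response schema):

```python
import json

# Hypothetical, abbreviated payload; the real USNO schema differs.
payload = """{
  "local_data": [
    {"phenomenon": "Eclipse Begins",  "time": "10:15:48.1", "altitude": "38.9"},
    {"phenomenon": "Totality Begins", "time": "11:33:03.6", "altitude": "50.2"},
    {"phenomenon": "Eclipse Ends",    "time": "13:00:50.0", "altitude": "61.5"}
  ]
}"""

events = json.loads(payload)["local_data"]
for ev in events:
    print(f"{ev['phenomenon']:>16}: {ev['time']} (alt {ev['altitude']} deg)")
```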

  8. Playing with Satellite Data

    NASA Astrophysics Data System (ADS)

    Beitler, J.; Truex, S.

    2008-12-01

    Would you like to see your science on the evening news? On everyone's mobile device? How hard is it to make one of those cool Google Earth files so people can explore your world? Do you need to be a programmer, or could most any person with a little motivation and a few inexpensive tools do it? Find out what it takes to get started with these technologies--it may be easier than you think--and how they can give your data more legs. I will demonstrate some of the ways that the National Snow and Ice Data Center has been successful in reaching the public and educators with visualized and animated data about the Earth's frozen regions, and talk about some of the how-to. In particular, see what we have done with QuickTime, Google Earth, YouTube, and the iPhone. I'll also talk about how we've assessed the reach and success of these efforts.

  9. Spatiotemporal Visualization of Tsunami Waves Using Kml on Google Earth

    NASA Astrophysics Data System (ADS)

    Mohammadi, H.; Delavar, M. R.; Sharifi, M. A.; Pirooz, M. D.

    2017-09-01

    Disaster risk is a function of hazard and vulnerability. Risk is defined as the expected losses, including lives, personal injuries, property damages, and economic disruptions, due to a particular hazard for a given area and time period. Risk assessment is one of the key elements of a natural disaster management strategy, as it allows for better disaster mitigation and preparation. It provides input for informed decision making and increases risk awareness among decision makers and other stakeholders. Virtual globes such as Google Earth can be used as visualization tools. A proper spatiotemporal graphical representation of the concerned risk significantly reduces the effort needed to visualize its impact and improves the efficiency of the decision-making process to mitigate it. The spatiotemporal visualization of tsunami waves for the disaster management process is an attractive topic in the geosciences, assisting the investigation of areas at tsunami risk. In this paper, a method for coupling virtual globes with tsunami wave arrival time models is presented. In this process we have shown the 2D+time propagation and inundation of tsunami waves, the coastline deformation, and the flooded areas. In addition, the worst-case scenario of a tsunami at Chabahar port derived from tsunami modelling is also presented using KML on Google Earth.

  10. Virtual Field Trips: Using Google Maps to Support Online Learning and Teaching of the History of Astronomy

    ERIC Educational Resources Information Center

    Fluke, Christopher J.

    2009-01-01

    I report on a pilot study on the use of Google Maps to provide virtual field trips as a component of a wholly online graduate course on the history of astronomy. The Astronomical Tourist Web site (http://astronomy.swin.edu.au/sao/tourist), themed around the role that specific locations on Earth have contributed to the development of astronomical…

  11. Using Google Earth to Explore Strain Rate Models of Southern California

    NASA Astrophysics Data System (ADS)

    Richard, G. A.; Bell, E. A.; Holt, W. E.

    2007-12-01

    A series of strain rate models for the Transverse Ranges of southern California were developed based on Quaternary fault slip data and geodetic data from high precision GPS stations in southern California. Pacific-North America velocity boundary conditions are applied for all models. Topography changes are calculated using the model dilatation rates, which predict crustal thickness changes under the assumption of Airy isostasy and a specified rate of crustal volume loss through erosion. The models were designed to produce graphical and numerical output representing the configuration of the region from 3 million years ago to 3 million years into the future at intervals of 50 thousand years. Using a North American reference frame, graphical output for the topography and faults and numerical output for the locations of faults and of points on the crust marked by the locations of cities were used to create data in KML format that can be used in Google Earth to represent time intervals of 50 thousand years. As markers familiar to students, the cities provide a geographic context that can be used to quantify crustal movement, using the Google Earth ruler tool. By comparing distances that markers for selected cities have moved in various parts of the region, students discover that the greatest amount of crustal deformation has occurred in the vicinity of the boundary between the North American and Pacific plates. Students can also identify areas of compression or extension by finding pairs of city markers that have converged or diverged, respectively, over time. The Google Earth layers also reveal that faults that are not parallel to the plate boundary have tended to rotate clockwise due to the right-lateral motion along the plate boundary zone. KML TimeSpan markup was added to two versions of the model, enabling the layers to be displayed in an automatic sequenced loop for a movie effect. 
The data is also available as QuickTime (.mov) and Graphics Interchange Format (.gif) animations and in ESRI Shapefile format.

  12. Near real-time qualitative monitoring of lake water chlorophyll globally using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Zlinszky, András; Supan, Peter; Koma, Zsófia

    2017-04-01

    Monitoring ocean chlorophyll and suspended sediment has been made possible using optical satellite imaging, and has contributed immensely to our understanding of the Earth and its climate. However, lake water quality monitoring has limitations due to the optical complexity of shallow, sediment- and organic matter-laden waters. Meanwhile, timely and detailed information on basic lake water quality parameters would be essential for the sustainable management of inland waters. Satellite-based remote sensing can deliver area-covering, high-resolution maps of basic lake water quality parameters, but scientific application of these datasets for lake monitoring has been hindered by limitations to calibration and accuracy evaluation, and therefore access to such data has been the privilege of scientific users. Nevertheless, since for many inland waters satellite imaging is the only source of monitoring data, we believe it is urgent to make map products of chlorophyll and suspended sediment concentrations available to a wide range of users. Even if absolute accuracy cannot be validated, the patterns, processes and qualitative information delivered by such datasets in near-real time can act as an early warning system, raise awareness of water quality processes and serve education, in addition to complementing local monitoring activities. By making these datasets openly available on the internet through an easy-to-use framework, dialogue between stakeholders, management and governance authorities can be facilitated. We use Google Earth Engine to access and process archive and current satellite data. Google Earth Engine is a development and visualization framework that provides access to satellite datasets and processing capacity for analysis at the petabyte scale. Based on earlier investigations, we chose the fluorescence line height (FLH) index to represent water chlorophyll concentration. 
This index relies on the chlorophyll fluorescence peak at 680 nm, and has been tested in open-ocean as well as inland lake situations for MODIS and MERIS satellite sensor data. In addition to being relatively robust and less sensitive to atmospheric influence, this algorithm is also very simple, being based on the height of the 680 nm peak above the linear interpolation of the two neighbouring bands. However, not all satellite datasets suitable for FLH are catalogued in Google Earth Engine. In the current testing phase, Landsat 7, Landsat 8 (30 m resolution), and Sentinel 2 (20 m) are being tested. Landsat 7 has a suitable band configuration, but suffers from a striping error due to a sensor problem. Landsat 8 and Sentinel 2 lack a spectral band optimal for FLH. Sentinel 3 would be an optimal data source and has shown good performance during small-scale initial tests, but is not distributed globally in Google Earth Engine. In addition to FLH data from these satellites, our system delivers cloud and ice masking, qualitative suspended sediment data (based on the band closest to 600 nm) and true colour images, all on an easy-to-use Google Maps background. This allows on-demand understanding and interpretation of water quality patterns and processes in near real time. While the system is still under development, we believe it could significantly contribute to lake water quality management and monitoring worldwide.
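    The FLH index described above can be stated in a few lines: the height of the fluorescence band above a straight baseline through its two neighbours. A sketch using MERIS-like band centres (665, 681 and 709 nm); the radiance values are hypothetical:

```python
def flh(l_left, l_peak, l_right, w_left=665.0, w_peak=681.0, w_right=709.0):
    """Fluorescence line height: peak radiance above the linear baseline
    interpolated between the two neighbouring bands (wavelengths in nm)."""
    t = (w_peak - w_left) / (w_right - w_left)
    baseline = l_left + t * (l_right - l_left)
    return l_peak - baseline

# Hypothetical water-leaving radiances at the three bands.
print(flh(l_left=5.2, l_peak=5.9, l_right=4.8))
```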

  13. NARSTO EPA SS PITTSBURGH GAS PM PROPERTY DATA

    Atmospheric Science Data Center

    2018-04-09

    ... Sizer Nephelometer Aerosol Collector SMPS - Scanning Mobility Particle Sizer Fluorescence Spectroscopy ... Get Google Earth Related Data:  Environmental Protection Agency Supersites Pittsburgh, Pennsylvania ...

  14. Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS

    NASA Astrophysics Data System (ADS)

    Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.

    In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.

  15. [Establishment of malaria early warning system in Jiangsu Province II application of digital earth system in malaria epidemic management and surveillance].

    PubMed

    Wang, Wei-Ming; Zhou, Hua-Yun; Liu, Yao-Bao; Li, Ju-Lin; Cao, Yuan-Yuan; Cao, Jun

    2013-04-01

    To explore a new mode of malaria elimination, we applied a digital earth system to malaria epidemic management and surveillance. While investigating malaria cases and handling epidemic areas in Jiangsu Province in 2011, we used a JISIBAO UniStrong G330 GIS data acquisition unit (GPS) to record the latitude and longitude of each case location, and then built a landmark library of early-warning areas and an image management system using Google Earth Free 6.2 and its image processing software. A total of 374 malaria cases were reported in Jiangsu Province in 2011: 13 local vivax malaria cases, 11 vivax malaria cases imported from other provinces, and, imported from abroad, 20 vivax, 309 falciparum, 7 quartan (Plasmodium malariae infection) and 14 ovale (P. ovale infection) malaria cases. Analysis with the Google Earth mapping system showed that these malaria cases exhibited a certain degree of aggregation, except for the abroad-imported quartan malaria cases, which were highly sporadic. The local vivax malaria cases were concentrated in Sihong County, the vivax cases imported from other provinces in Suzhou City and Wuxi City, the abroad-imported vivax cases in Nanjing City, the abroad-imported falciparum cases in the middle part of Jiangsu Province, and the abroad-imported ovale cases in Liyang City. Google Earth Free 6.2 is simple, convenient and quick to operate, and could help public health authorities make malaria prevention and control decisions, including the allocation of funds and other health resources.
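A landmark library like the one described is essentially a set of KML placemarks. A minimal sketch of generating one (the case names and coordinates below are invented for illustration):

```python
def cases_to_kml(cases):
    """Build a minimal KML document with one Placemark per case.
    `cases` is a list of (name, lon, lat) tuples; KML coordinates are
    ordered lon,lat,alt."""
    placemarks = "\n".join(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in cases
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        f"{placemarks}\n</Document>\n</kml>"
    )
```

The resulting file opens directly in Google Earth, which is what makes this workflow convenient for non-GIS users.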

  16. NASA GSFC Space Weather Center - Innovative Space Weather Dissemination: Web-Interfaces, Mobile Applications, and More

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo; Zheng, Yihua; Rastaetter, Lutz; Taktakishvili, A.; Mays, M. L.; Kuznetsova, M.; Lee, Hyesook; Chulaki, Anna; Hesse, Michael; Mullinix, Richard; hide

    2012-01-01

    The NASA GSFC Space Weather Center (http://swc.gsfc.nasa.gov) is committed to providing forecasts, alerts, research, and educational support to address NASA's space weather needs - in addition to the needs of the general space weather community. We provide a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, custom space weather alerts and products, weekly summaries and reports, and most recently - video casts. There are many challenges in providing accurate descriptions of past, present, and expected space weather events - and the Space Weather Center at NASA GSFC employs several innovative solutions to provide access to a comprehensive collection of both observational data and space weather model/simulation data. We'll describe the challenges we've faced with managing hundreds of data streams, running models in real time, data storage, and data dissemination. We'll also highlight several systems and tools that are utilized by the Space Weather Center in our daily operations, all of which are available to the general community as well. These systems and services include a web-based application called the Integrated Space Weather Analysis System (iSWA http://iswa.gsfc.nasa.gov), two mobile space weather applications for iOS and Android devices, an external API for web-service-style access to data, Google Earth-compatible data products, and a downloadable client-based visualization tool.

  17. Wind Wake Watcher v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Shawn

    This software enables the user to produce Google Earth visualizations of turbine wake effects for wind farms. The visualizations are based on computations of statistical quantities that vary with wind direction and help quantify the effects of upwind turbines on the power production of turbines in their wakes. The software outputs plot images and KML files that can be loaded into Google Earth. The statistics computed are described in greater detail in the paper: S. Martin, C. H. Westergaard, and J. White (2016), Visualizing Wind Farm Wakes Using SCADA Data, in Whither Turbulence and Big Data in the 21st Century? Eds. A. Pollard, L. Castillo, L. Danaila, and M. Glauser. Springer, pp. 231-254.

  18. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  19. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cinquini, Luca; Crichton, Daniel; Miller, Neill

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  20. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    NASA Technical Reports Server (NTRS)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark; hide

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  1. Generating Southern Africa Precipitation Forecast Using the FEWS Engine, a New Application for the Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Landsfeld, M. F.; Hegewisch, K.; Daudert, B.; Morton, C.; Husak, G. J.; Friedrichs, M.; Funk, C. C.; Huntington, J. L.; Abatzoglou, J. T.; Verdin, J. P.

    2016-12-01

    The Famine Early Warning Systems Network (FEWS NET) focuses on food insecurity in developing nations and provides objective, evidence-based analysis to help government decision-makers and relief agencies plan for and respond to humanitarian emergencies. The network of FEWS NET analysts and scientists requires flexible, interactive tools to aid in their monitoring and research efforts. Because they often work in bandwidth-limited regions, lightweight Internet tools and services that bypass the need for downloading massive datasets are preferred. To support food security analysis, FEWS NET developed a custom interface for Google Earth Engine (GEE). GEE is a platform developed by Google to support scientific analysis of environmental data in its cloud computing environment. This platform allows scientists and independent researchers to mine massive collections of environmental data, leveraging Google's vast computational resources for purposes of detecting changes and monitoring the Earth's surface and climate. GEE hosts an enormous amount of satellite imagery and climate archives, one of which is the Climate Hazards Group Infrared Precipitation with Stations (CHIRPS) dataset. The CHIRPS precipitation dataset is a key input for FEWS NET monitoring and forecasting efforts. In this talk we introduce the FEWS Engine interface. We present an application that highlights the utility of FEWS Engine for forecasting the upcoming seasonal precipitation of southern Africa. Specifically, the current state of ENSO is assessed and used to identify similar historical seasons. The FEWS Engine compositing tool is used to examine rainfall and other environmental data for these analog seasons. The application illustrates the unique benefits of using FEWS Engine for on-the-fly food security scenario development.
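The analog-season compositing step can be illustrated with a drastically simplified, scalar stand-in for the gridded compositing FEWS Engine performs on CHIRPS data in Earth Engine (the numbers and year labels are invented):

```python
def analog_composite(rainfall_by_year, analog_years):
    """Composite seasonal rainfall over ENSO-analog years, expressed as an
    anomaly relative to the full climatology. In the real system this is
    done per pixel over gridded CHIRPS fields; here each season is a
    single number for clarity."""
    climatology = sum(rainfall_by_year.values()) / len(rainfall_by_year)
    composite = sum(rainfall_by_year[y] for y in analog_years) / len(analog_years)
    return composite - climatology
```

A positive result suggests the analog seasons were wetter than the long-term mean, which is the kind of signal the compositing tool surfaces for scenario development.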

  2. How to Display Hazards and other Scientific Data Using Google Maps

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Fee, J. M.

    2007-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) is launching a map-based interface to display hazards information using the Google Maps API (Application Programming Interface). Map-based interfaces provide a synoptic view of data, making patterns easier to detect and allowing users to quickly ascertain where hazards are in relation to major population and infrastructure centers. Several map-based interfaces are now simple to run on a web server, providing ideal platforms for sharing information with colleagues, emergency managers, and the public. There are three main steps to making data accessible on a map-based interface: formatting the input data, plotting the data on the map, and customizing the user interface. The presentation, "Creating Geospatial RSS and ATOM feeds for Map-based Interfaces" (Fee and Venezky, this session), reviews key features for map input data. Join us for this presentation on how to plot data in a geographic context and then format the display with images, custom markers, and links to external data. Examples will show how the VHP Volcano Status Map was created and how to plot a field trip with driving directions.

  3. JBioWH: an open-source Java framework for bioinformatics data integration

    PubMed Central

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database schema and includes Java API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU-intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595

  4. JBioWH: an open-source Java framework for bioinformatics data integration.

    PubMed

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database schema and includes Java API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU-intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.

  5. GeneOnEarth: fitting genetic PC plots on the globe.

    PubMed

    Torres-Sánchez, Sergio; Medina-Medina, Nuria; Gignoux, Chris; Abad-Grau, María M; González-Burchard, Esteban

    2013-01-01

    Principal component (PC) plots have become widely used to summarize the genetic variation of individuals in a sample. The similarity between genetic distance in PC plots and geographical distance can be quite impressive. However, in most situations, individual ancestral origins are not precisely known or they are heterogeneously distributed; hence, they are hardly linked to a geographical area. We have developed GeneOnEarth, a user-friendly web-based tool to help geneticists understand whether a linear isolation-by-distance model may apply to a genetic data set, that is, whether genetic distances among a set of individuals resemble the geographical distances among their origins. Its main goal is to let users first apply a by-view Procrustes method to visually assess whether this model holds. To do that, the user can choose the exact geographical area from an online 2D or 3D world map using, respectively, Google Maps or Google Earth, and rotate, flip, and resize the images. GeneOnEarth can also compute the optimal rotation angle using Procrustes analysis and assess statistical evidence of similarity when a different rotation angle has been chosen by the user. An online version of GeneOnEarth is available for testing and use at http://bios.ugr.es/GeneOnEarth.
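The optimal-rotation step of a 2D Procrustes analysis has a closed form. A sketch, assuming both point sets are already centred on their means (GeneOnEarth's actual implementation is not published in this abstract, so this is only the standard textbook formula):

```python
import math

def optimal_rotation_angle(genetic_pts, geo_pts):
    """Least-squares rotation angle (radians) that best aligns the genetic
    PC coordinates with the geographical coordinates; both inputs are
    equal-length lists of centred (x, y) points."""
    # Cross and dot terms of the standard 2D orthogonal Procrustes solution.
    num = sum(gx * hy - gy * hx for (gx, gy), (hx, hy) in zip(genetic_pts, geo_pts))
    den = sum(gx * hx + gy * hy for (gx, gy), (hx, hy) in zip(genetic_pts, geo_pts))
    return math.atan2(num, den)
```

For a point set that is an exact 90-degree rotation of the other, the function returns pi/2, matching the intuition behind the tool's rotate-to-fit interaction.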

  6. Using Google Earth to Study the Basic Characteristics of Volcanoes

    ERIC Educational Resources Information Center

    Schipper, Stacia; Mattox, Stephen

    2010-01-01

    Landforms, natural hazards, and the change in the Earth over time are common material in state and national standards. Volcanoes exemplify these standards and readily capture the interest and imagination of students. With a minimum of training, students can recognize erupted materials and types of volcanoes; in turn, students can relate these…

  7. Active Fire Mapping Program

    MedlinePlus

    Active Fire Mapping Program Current Large Incidents (Home) New Large Incidents Fire Detection Maps MODIS Satellite Imagery VIIRS Satellite Imagery Fire Detection GIS Data Fire Data in Google Earth ...

  8. NARSTO EPA SS LOS ANGELES SMPS DATA

    Atmospheric Science Data Center

    2018-04-09

    ... Ground Station Instrument:  SMPS - Scanning Mobility Particle Sizer Location:  Los Angeles, ... Get Google Earth Related Data:  Environmental Protection Agency Supersites Los Angeles, California ...

  9. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing.

    PubMed

    Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, high spatial resolution (30 m) land cover data limits our ability to understand the spatial extent and temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping with these data was not feasible until now because of the high-performance computing needed to store, process, and analyze such a large volume of high-resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high-resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.
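The per-pixel comparison underlying land cover change quantification can be sketched on toy data; this is a scalar stand-in for what Earth Engine does per pixel at continental scale, with invented class labels:

```python
def change_matrix(year_a, year_b, classes):
    """Tally class transitions between two classified maps, given as
    equal-length lists of per-pixel labels. The (a, b) entry counts
    pixels labelled `a` in the first epoch and `b` in the second."""
    counts = {(a, b): 0 for a in classes for b in classes}
    for a, b in zip(year_a, year_b):
        counts[(a, b)] += 1
    return counts
```

Off-diagonal entries (e.g. vegetated-to-impervious) are the change signal; diagonal entries measure stability.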

  10. Relevant Links

    Atmospheric Science Data Center

    2018-06-15

    ... Theoretical Basis Document (ATBD) ADAM-M ADAM-M Information AirMISR AirMISR Home Page MISR Home Page Feature Article: Fiery Temperament KONVEX Information SAFARI Home Page AirMSPI Get Google Earth ...

  11. Google Earth Engine derived areal extents to infer elevation variation of lakes and reservoirs

    NASA Astrophysics Data System (ADS)

    Nguy-Robertson, Anthony; May, Jack; Dartevelle, Sebastien; Griffin, Sean; Miller, Justin; Tetrault, Robert; Birkett, Charon; Lucero, Eileen; Russo, Tess; Zentner, Matthew

    2017-04-01

    Monitoring water supplies is important for identifying potential national security issues before they begin. As a means to estimate lake and reservoir storage for sites without reliable water stage data, this study defines correlations between water body levels from hypsometry curves, based on in situ gauge station and altimeter data (i.e. TOPEX/Poseidon, Jason series), and areal extents observed in historic multispectral (i.e. MODIS and Landsat TM/ETM+/OLI) imagery. Water levels measured using in situ observations and altimeters, when in situ data were unavailable, were used to estimate the relationship between water elevation and surface area for 18 sites globally. Altimeters were generally more accurate (RMSE: 0.40 - 0.49 m) for estimating in situ lake elevations from Iraq and Afghanistan than the elevations modeled from multispectral sensor areal extents: Landsat (RMSE: 0.25 - 1.5 m) and MODIS (RMSE: 0.53 - 3.0 m). Correlations between altimeter data and Landsat imagery processed with Google Earth Engine confirmed that similar relationships exist for a broader range of lakes without reported in situ data across the globe (RMSE: 0.24 - 1.6 m). Thus, while altimetry is still preferred to an areal extent model, lake surface area derived with Google Earth Engine can be used as a reasonable proxy for lake storage, expanding the number of observable lakes beyond the current constellation of altimeters and in situ gauges.
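The area-elevation (hypsometric) relationship at the core of this approach can be illustrated with a simple least-squares fit; the numbers below are toy values, not data from the study, and a real hypsometry curve may well be nonlinear:

```python
def fit_hypsometry(areas, elevations):
    """Ordinary least-squares fit elevation = m * area + c, mimicking the
    use of an area-elevation relationship to infer lake levels from
    satellite-derived surface extents."""
    n = len(areas)
    mean_a = sum(areas) / n
    mean_e = sum(elevations) / n
    cov = sum((a - mean_a) * (e - mean_e) for a, e in zip(areas, elevations))
    var = sum((a - mean_a) ** 2 for a in areas)
    m = cov / var
    c = mean_e - m * mean_a
    return m, c
```

Once fitted against gauge or altimeter records, the line converts any new satellite-derived areal extent into an elevation estimate for lakes the altimeters never overfly.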

  12. Visualizing spatio-temporal war casualty data in Google Earth - A case study of Map the Fallen (Invited)

    NASA Astrophysics Data System (ADS)

    Askay, S.

    2009-12-01

    Published on Memorial Day 2009, Map the Fallen is a Google Earth visualization of the 5500+ US and international soldiers who have died in Iraq and Afghanistan since 2001. In addition to providing photos, stories, and links for each soldier, the time-animated map visually connects hometowns to places of death. This novel way of representing casualty data brings the geographic reach and magnitude of the issue into focus together with the very personal nature of individual stories. Innovative visualization techniques illustrate the spatio-temporal nature of this information and show the global reach and interconnectivity of this issue. Several of the advanced KML techniques employed to create this engaging and performance-conscious map will be discussed during this session. These include: 1) the use of HTML iframes and JavaScript to minimize KML size, with extensive cross-linking throughout the content; 2) the creation of a time-animated, on-screen casualty counter; 3) the use of parabolic arcs to connect each hometown to the place of death; 4) the use of concentric spirals to represent chronological data; and 5) numerous performance optimizations to ensure the 23K placemarks, 2500 screen overlays, and nearly 250K line vertices performed well in Google Earth. This session will include a demonstration of the map, conceptual discussions of the techniques used, and some in-depth technical explanation of the KML code.
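The parabolic-arc technique can be sketched by sampling altitudes along the straight line between two points; the linear interpolation of lon/lat below is a simplification of whatever ground track the actual map uses:

```python
def parabolic_arc(start, end, peak_alt, n=10):
    """Sample a parabolic arc between two (lon, lat) points, returning
    (lon, lat, alt) tuples. Altitude follows 4*t*(1-t): zero at both
    endpoints, peak_alt at the midpoint -- the shape used to visually
    connect hometowns to places of death."""
    pts = []
    for i in range(n + 1):
        t = i / n
        lon = start[0] + t * (end[0] - start[0])
        lat = start[1] + t * (end[1] - start[1])
        alt = peak_alt * 4 * t * (1 - t)
        pts.append((lon, lat, alt))
    return pts
```

Joining the sampled points as an altitude-extruded KML LineString produces the floating arc effect in Google Earth.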

  13. Improved discrimination among similar agricultural plots using red-and-green-based pseudo-colour imaging

    NASA Astrophysics Data System (ADS)

    Doi, Ryoichi

    2016-04-01

    The effects of a pseudo-colour imaging method were investigated by discriminating among similar agricultural plots in remote sensing images acquired using the Airborne Visible/Infrared Imaging Spectrometer (Indiana, USA) and the Landsat 7 satellite (Fergana, Uzbekistan), and in imagery provided by Google Earth (Toyama, Japan). From each dataset, red (R)-green (G)-R-G-blue yellow (RGrgbyB) and RGrgby-1B pseudo-colour images were prepared. From each, cyan, magenta, yellow, key black, L*, a*, and b* derivative grayscale images were generated. In the Airborne Visible/Infrared Imaging Spectrometer image, pixels were selected for corn no-tillage (29 pixels), corn minimum-tillage (27), and soybean (34) plots. Likewise, in the Landsat 7 image, pixels representing corn (73 pixels), cotton (110), and wheat (112) plots were selected, and in the Google Earth image, those representing soybean (118 pixels) and rice (151) were selected. When the 14 derivative grayscale images were used together with an RGB yellow grayscale image, the overall classification accuracy improved from 74 to 94% (Airborne Visible/Infrared Imaging Spectrometer), 64 to 83% (Landsat), or 77 to 90% (Google Earth). As an indicator of discriminatory power, the kappa significance improved 10^18-fold (Airborne Visible/Infrared Imaging Spectrometer) or greater. The derivative grayscale images were found to increase the dimensionality and quantity of data. Herein, the details of the increases in dimensionality and quantity are further analysed and discussed.
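Four of the derivative channels mentioned (cyan, magenta, yellow, key black) come from the standard RGB-to-CMYK conversion; a minimal per-pixel sketch (the paper's exact colour pipeline may differ):

```python
def rgb_to_cmyk(r, g, b):
    """Convert one RGB pixel (channels as 0-1 floats) to (C, M, Y, K).
    Each output channel can be written out as its own grayscale image,
    expanding the feature space available to a classifier."""
    k = 1 - max(r, g, b)
    if k == 1:
        # Pure black: chromatic channels are undefined, conventionally zero.
        return 0.0, 0.0, 0.0, 1.0
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k
```

Applying this (plus an L*a*b* transform) per pixel is what turns one three-band image into the larger stack of derivative grayscale images used for classification.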

  14. Streets? Where We're Going, We Don't Need Streets

    NASA Astrophysics Data System (ADS)

    Bailey, J.

    2017-12-01

    In 2007 Google Street View started as a project to provide 360-degree imagery along streets, but in the decade since it has evolved into a platform through which to explore everywhere from the slopes of Everest, to the middle of the Amazon rainforest, to under the ocean. As camera technology has evolved, it has also become a tool for ground-truthing maps, and has provided scientific observations, storytelling, and education. The Google Street View "special collects" team has undertaken increasingly challenging projects across 80+ countries and every continent, culminating in possibly the most ambitious collection yet: the capture of Street View on board the International Space Station. Learn about the preparation and obstacles behind this and other special collects. Explore these datasets through both Google Earth and Google Expeditions VR, an educational tool that takes students on virtual field trips using 360-degree imagery.

  15. Accelerating North American rangeland conservation with earth observation data and user driven web applications.

    NASA Astrophysics Data System (ADS)

    Allred, B. W.; Naugle, D.; Donnelly, P.; Tack, J.; Jones, M. O.

    2016-12-01

    In 2010, the USDA Natural Resources Conservation Service (NRCS) launched the Sage Grouse Initiative (SGI) to voluntarily reduce threats facing sage-grouse and rangelands on private lands. Over the past five years, SGI has matured into a primary catalyst for rangeland and wildlife conservation across the North American west, focusing on the shared vision of wildlife conservation through sustainable working landscapes and providing win-win solutions for producers, sage grouse, and 350 other sagebrush obligate species. SGI and its partners have invested a total of $750 million into rangeland and wildlife conservation. Moving forward, SGI continues to focus on rangeland conservation. Partnering with Google Earth Engine, SGI has developed outcome monitoring and conservation planning tools at continental scales. The SGI science team is currently developing assessment and monitoring algorithms of key conservation indicators. The SGI web application utilizes Google Earth Engine for user defined analysis and planning, putting the appropriate information directly into the hands of managers and conservationists.

  16. Visualization of Arctic Landscapes in a Geoinformation System

    NASA Astrophysics Data System (ADS)

    Panidi, E. A.; Tsepelev, V. Yu.; Bobkov, A. A.

    2010-12-01

    In order to investigate the long-term dynamics of ice cover, the authors suggest using a geoinformation system (GIS) that supports operative and historical analysis of the variability of Polar Region water-ice landscapes. Such a GIS should include long-term monthly average fields of sea ice, hydrological, and atmospheric characteristics. All collected data and the results of their processing have been structured in ArcGIS. For presentation on the Internet, all datasets were transformed into the open KML format for use in the virtual globe of Google Earth. The two-component system, built on ArcGIS and Google Earth, allows accumulation, processing, and joint synchronous and asynchronous analysis of the data, and gives a wide circle of remote users access to visual analysis of the datasets.

  17. Oceanographic data at your fingertips: the SOCIB App for smartphones

    NASA Astrophysics Data System (ADS)

    Lora, Sebastian; Sebastian, Kristian; Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Tintoré, Joaquín

    2015-04-01

    The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es) is a multi-platform marine research infrastructure that generates data from the nearshore to the open sea in the Western Mediterranean. In line with SOCIB's principles of discoverable, freely available, and standardized data, an application (app) for smartphones has been designed with the objective of providing easy access to all the data managed by SOCIB in real time: underwater gliders, drifters, profiling buoys, research vessel, HF radar, and numerical model outputs (hydrodynamics and waves). The Data Centre, responsible for the acquisition, processing, and visualisation of all SOCIB data, developed a REpresentational State Transfer (REST) application programming interface (API) called "DataDiscovery" (http://apps.socib.es/DataDiscovery/). This API is made up of RESTful web services that provide information on platforms, instruments, and deployments of instruments, as well as the data themselves. In this way, SOCIB data can be integrated into third-party applications, developed either by the Data Centre or externally. A single point of data distribution not only allows efficient management but also simplifies access for external developers who are not necessarily familiar with the concepts and tools of oceanographic or atmospheric data. The SOCIB App for Android (https://play.google.com/store/apps/details?id=com.socib) uses the API as a data backend, so that the information shown by the application can be managed without modifying and re-uploading the app. The only pieces of information that do not depend on the services are the App "Sections" and "Screens"; the content displayed in each of them is obtained through requests to the web services.
The API is not used only for the smartphone app: most SOCIB applications for data visualisation and access now rely on it, for instance the corporate web site, the deployment application (Dapp, http://apps.socib.es/dapp/), and the Sea Boards (http://seaboard.socib.es/).
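Consuming such a REST API typically comes down to parsing JSON listings. A sketch using an invented payload (the real DataDiscovery responses and field names may differ; this only illustrates the pattern):

```python
import json

# Invented example of a deployments listing; the structure and field
# names here are assumptions, not the actual SOCIB API schema.
SAMPLE = '''[
  {"platform": "glider-sdeep01", "instrument": "CTD", "active": true},
  {"platform": "buoy-bahia", "instrument": "thermistor", "active": false}
]'''

def active_platforms(payload):
    """Return the platform names of currently active deployments from a
    JSON listing like the sample above."""
    return [d["platform"] for d in json.loads(payload) if d["active"]]
```

In a real client the payload would come from an HTTP GET against the DataDiscovery endpoint rather than a literal string.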

  18. Visualization of seismic tomography on Google Earth: Improvement of KML generator and its web application to accept the data file in European standard format

    NASA Astrophysics Data System (ADS)

    Yamagishi, Y.; Yanaka, H.; Tsuboi, S.

    2009-12-01

    We have developed a conversion tool, called the KML generator, that converts seismic tomography data into KML, and made it available on the web (http://www.jamstec.go.jp/pacific21/google_earth). The KML generator enables us to display vertical and horizontal cross sections of a model on Google Earth in a three-dimensional manner, which is useful for understanding the Earth's interior. The previous generator accepts text files of grid-point data containing longitude, latitude, and seismic velocity anomaly, with one data file per depth. Metadata, such as the bibliographic reference, grid-point interval, and depth, are described in a separate information file. We did not allow users to upload their own tomographic models to the web application, because there was no standard format for representing tomographic models. Recently, the European seismology research project NERIES (Network of Research Infrastructures for European Seismology) has advocated that seismic tomography data should be standardized. It proposes a new format based on JSON (JavaScript Object Notation), a lightweight data-interchange format, as a standard for tomography. This format consists of two parts: metadata and grid-point data values. The JSON format is well suited to handling and analyzing tomographic models, because its structure is fully defined by JavaScript objects, so the elements are directly accessible from a script. In addition, JSON libraries exist for several programming languages. The International Federation of Digital Seismograph Networks (FDSN) adopted this format as an FDSN standard for seismic tomographic models. This format may come to be accepted not only by European seismologists but also as a world standard. We have therefore improved our KML generator for seismic tomography to also accept data files in JSON format.
We have also improved the web application of the generator so that JSON-formatted data files can be uploaded, allowing users to convert any tomographic model data to KML. The KML obtained through the new generator should provide an arena for comparing various tomographic models and other geophysical observations on Google Earth, which may act as a common geoscience browser platform.
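    The two-part layout the record describes (metadata plus grid-point values) maps naturally onto a small converter. The JSON schema below is a simplified assumption inspired by that description, not the actual FDSN format, and the KML output is reduced to point placemarks for brevity.

```python
# Sketch: convert a JSON tomographic model (metadata + grid-point values) to KML
# placemarks. The JSON schema is a simplified assumption, not the FDSN standard.
import json
import xml.etree.ElementTree as ET

def tomo_json_to_kml(text):
    model = json.loads(text)
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    ET.SubElement(doc, "name").text = model["metadata"]["reference"]
    for lon, lat, dvs in model["grid"]:  # velocity anomaly at each grid point
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = f"dVs={dvs}%"
        point = ET.SubElement(pm, "Point")
        # KML coordinates are longitude,latitude,altitude
        ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

# A hypothetical one-depth-slice model, following the assumed schema.
sample = json.dumps({
    "metadata": {"reference": "Example model", "depth_km": 100},
    "grid": [[135.0, 35.0, -1.2], [136.0, 35.0, 0.8]],
})
```

A real generator would colour placemarks (or build polygon overlays) from the anomaly value and emit one folder per depth slice; the direct script access to JSON elements shown here is exactly the convenience the record attributes to the format.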

  19. The ethics of Google Earth: crossing thresholds from spatial data to landscape visualisation.

    PubMed

    Sheppard, Stephen R J; Cizek, Petr

    2009-05-01

    'Virtual globe' software systems such as Google Earth are growing rapidly in popularity as a way to visualise and share 3D environmental data. Scientists and environmental professionals, many of whom are new to 3D modeling and visual communications, are beginning routinely to use such techniques in their work. While the appeal of these techniques is evident, with unprecedented opportunities for public access to data and collaborative engagement over the web, are there nonetheless risks in their widespread usage when applied in areas of the public interest such as planning and policy-making? This paper argues that the Google Earth phenomenon, which features realistic imagery of places, cannot be dealt with only as a question of spatial data and geographic information science. The virtual globe type of visualisation crosses several key thresholds in communicating scientific and environmental information, taking it well beyond the realm of conventional spatial data and geographic information science, and engaging more complex dimensions of human perception and aesthetic preference. The realism, perspective views, and social meanings of the landscape visualisations embedded in virtual globes invoke not only cognition but also emotional and intuitive responses, with associated issues of uncertainty, credibility, and bias in interpreting the imagery. This paper considers the types of risks as well as benefits that may exist with participatory uses of virtual globes by experts and lay-people. It is illustrated with early examples from practice and relevant themes from the literature in landscape visualisation and related disciplines such as environmental psychology and landscape planning. 
Existing frameworks and principles for the appropriate use of environmental visualisation methods are applied to the special case of widely accessible, realistic 3D and 4D visualisation systems such as Google Earth, in the context of public awareness-building and agency decision-making on environmental issues. Relevant principles are suggested which lend themselves to much-needed evaluation of risks and benefits of virtual globe systems. Possible approaches for balancing these benefits and risks include codes of ethics, software design, and metadata templates.

  20. Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth

    NASA Astrophysics Data System (ADS)

    Watlet, A.; Triantafyllou, A.; Bastin, C.

    2016-12-01

    GIS software packages are today's essential tools for gathering and visualizing geological data, applying spatial and temporal analysis and, finally, creating and sharing interactive maps for further investigation in the geosciences. Such skills are especially important for students engaged in field trips, sample collection or field experiments. However, time is generally lacking to teach all aspects of visualizing geolocated geoscientific data in detail. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization, written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run alongside Google Earth, benefitting from its numerous interactive capabilities. It is a very user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS tracks, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to visualize the results in the Google Earth environment via KML, with no third-party software required except Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks and can display several types of geolocated data, including:
- multi-point datasets;
- automatically computed contours of multi-point datasets via several interpolation methods;
- discrete planar and linear structural geology data in 2D or 3D, supporting a large range of structural input formats;
- clustered stereonets and rose diagrams;
- 2D cross-sections as vertical sections;
- georeferenced maps and grids with user-defined coordinates;
- field pictures, using either the geo-tagging metadata from a camera's built-in GPS module or the same-day track of an external GPS.
In the end, Geolokit is helpful for quickly visualizing and exploring data without losing too much time in the numerous capabilities of full GIS software suites.
We invite students and teachers to discover all the functionalities of Geolokit. As the project is under development and planned to be open source, we welcome discussions regarding particular needs or ideas, as well as contributions to the Geolokit project.
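    One step behind the stereonet plotting mentioned above can be illustrated: converting a planar measurement (strike/dip, right-hand rule) to the trend and plunge of its pole, and then to direction cosines. This is a standard structural-geology computation, given here as a generic sketch rather than Geolokit's actual code.

```python
# Sketch: convert a planar structural measurement (strike/dip, right-hand rule)
# to the orientation of its pole, a common first step before plotting stereonets.
# Generic textbook math, not Geolokit's implementation.
import math

def pole_of_plane(strike_deg, dip_deg):
    """Return (trend, plunge) in degrees of the pole (normal) to the plane."""
    trend = (strike_deg - 90.0) % 360.0  # pole points opposite the dip direction
    plunge = 90.0 - dip_deg
    return trend, plunge

def direction_cosines(trend_deg, plunge_deg):
    """North, east, down components of a unit line with the given trend/plunge."""
    t, p = math.radians(trend_deg), math.radians(plunge_deg)
    return (math.cos(p) * math.cos(t), math.cos(p) * math.sin(t), math.sin(p))
```

For example, a plane striking due north (000) and dipping 30 degrees east has its pole trending 270 with a plunge of 60; clustering many such poles is what produces the density stereonets a toolbox like this draws.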

  1. A New Perspective on Increasing Activity of Extratropical Disturbances: Spatial and Temporal Trends of Wave Activity

    NASA Astrophysics Data System (ADS)

    Hsu, P. C.; Hsu, H. H.

    2016-12-01

    Changes in extratropical disturbance behavior could play an important role in climate dynamics and be responsible for part of climate-related damage. However, robust observational evidence for long-term trends in this activity is still lacking, and understanding of how it is linked with climate phenomena is limited. In this study, we define an accumulated perturbation index (API) to quantify the variation in scalar quantities of atmospheric disturbances. API measures the area (e.g., % of the total surface area of Earth) where a certain perturbation quantity exceeds the long-term mean plus 0.5 standard deviations. This index more realistically reflects the ensemble impact of a climate perturbation and/or trend (such as global warming or ENSO) on extratropical disturbances, even though the impact on individual regions may vary from year to year due to stochastic processes. API represents the integrated activity of extratropical disturbances at a given time relative to a long time span. API is calculated for the 5-day running mean and 10-30-day stream-function fluctuations during DJF and JJA. The analysis reveals an increasing trend in API and in the variance of the stream function, especially in the Southern Hemisphere. The findings suggest that atmospheric extratropical disturbances have strengthened over widening areas during the past six decades, even though there may be no robust trends in wave activity at regional scales. Whether the observed trends in API are associated with certain climate patterns is under investigation. The impact of global warming is likely one of the major sources of the increasing activity. The future change in API under global warming scenarios will be studied further by analyzing projections from the CMIP5 models.
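    The index definition above is concrete enough to sketch: the fraction of area-weighted grid cells whose perturbation value exceeds the long-term mean plus 0.5 standard deviations. The grid layout below (one row per latitude, cosine area weighting) is a simplifying assumption for illustration.

```python
# Sketch of the accumulated perturbation index (API): percent of total area
# where a perturbation quantity exceeds mean + 0.5 * std. Grid layout and
# weighting are simplified assumptions, not the authors' exact procedure.
import math

def accumulated_perturbation_index(field, lats, mean, std):
    """field[i][j]: perturbation value at latitude lats[i]; mean/std: long-term stats."""
    threshold = mean + 0.5 * std
    exceed_area = total_area = 0.0
    for i, lat in enumerate(lats):
        w = math.cos(math.radians(lat))  # grid-cell area shrinks with cos(latitude)
        for value in field[i]:
            total_area += w
            if value > threshold:
                exceed_area += w
    return 100.0 * exceed_area / total_area  # percent of total surface area
```

Computing this one number per season-year turns a noisy map of regional fluctuations into a single time series whose trend can be tested, which is the design rationale the record describes.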

  2. Components for Maintaining and Publishing Earth Science Vocabularies

    NASA Astrophysics Data System (ADS)

    Cox, S. J. D.; Yu, J.

    2014-12-01

    Shared vocabularies are an important aid to geoscience data interoperability. Many organizations maintain useful vocabularies, with geological surveys having a particularly long history of vocabulary and lexicon development. However, the mode of publication is heterogeneous, ranging from PDFs and HTML web pages, spreadsheets and CSV, through various user interfaces and APIs. Update and maintenance ranges from tightly governed and externally opaque, through various community processes, all the way to crowd-sourcing ('folksonomies'). A general expectation, however, is for greater harmonization and vocabulary re-use. To be successful, this requires (a) standardized content formalization and APIs and (b) transparent content maintenance and versioning. We have been trialling a combination of software dealing with registration, search and linking. SKOS is designed for formalizing multilingual, hierarchical vocabularies, and has been widely adopted in the earth and environmental sciences. SKOS is an RDF vocabulary, for which SPARQL is the standard low-level API. However, for interoperability between SKOS vocabulary sources, a SKOS-based API (i.e. based on the SKOS predicates prefLabel, broader, narrower, etc.) is required. We have developed SISSvoc for this purpose, and used it to deploy a number of vocabularies on behalf of the IUGS, ICS, NERC, OGC, the Australian Government, and CSIRO projects. SISSvoc Search provides a simple search UI on top of one or more SISSvoc sources. Content maintenance is composed of many elements, including content formalization, definition updates, and mappings to related vocabularies. Typically a degree of expert judgement is required. To give users confidence, two requirements are paramount: (i) once published, a URI that denotes a vocabulary item must remain dereferenceable; (ii) the history and status of the content denoted by a URI must be available.
These requirements match the standard 'registration' paradigm which is implemented in the Linked Data Registry, which is currently used by WMO and the UK Environment Agency for publication of vocabularies. Together, these components provide a powerful and flexible system for providing earth science vocabularies for the community, consistent with semantic web and linked-data principles.
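    The SKOS-based API idea, querying by prefLabel/broader/narrower rather than raw SPARQL, can be illustrated with a tiny in-memory vocabulary. The concepts and URI-style identifiers below are invented for the example; a SISSvoc-style service would answer the same questions over HTTP.

```python
# Illustration of a SKOS-style API (prefLabel / broader / narrower) over a tiny
# in-memory vocabulary. Concepts and identifiers are made up for the example.
PREF_LABEL = {
    "ex:rock": "rock",
    "ex:igneous": "igneous rock",
    "ex:basalt": "basalt",
}
BROADER = {  # each concept -> its skos:broader concept
    "ex:igneous": "ex:rock",
    "ex:basalt": "ex:igneous",
}

def narrower(uri):
    """Invert the broader relation, as a 'narrower concepts of X' query would."""
    return sorted(c for c, b in BROADER.items() if b == uri)

def ancestors(uri):
    """Walk skos:broader links up to the top concept."""
    chain = []
    while uri in BROADER:
        uri = BROADER[uri]
        chain.append(uri)
    return chain
```

Exposing exactly these operations, rather than a generic triple store, is what lets clients treat every vocabulary source uniformly regardless of how each survey maintains its content.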

  3. jmzReader: A Java parser library to process and visualize multiple text and XML-based mass spectrometry data formats.

    PubMed

    Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2012-03-01

    We here present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO Proteomics Standards Initiative (PSI). mzIdentML files do not contain spectral data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
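    The design insight, one interface over many file formats, keyed by the two ways mzIdentML references spectra (by position or by native ID), can be sketched in Python. jmzReader itself is Java, and the class and method names below are illustrative analogues, not the library's actual API.

```python
# Sketch of the "common parser interface" pattern described above, in Python.
# Names are illustrative; the real jmzReader interface is Java.
from abc import ABC, abstractmethod

class SpectrumSource(ABC):
    """One interface over many peak-list formats, as an mzIdentML consumer needs."""

    @abstractmethod
    def spectrum_by_index(self, index):
        """Look up a spectrum by position, e.g. for formats referenced by index."""

    @abstractmethod
    def spectrum_by_id(self, spectrum_id):
        """Look up a spectrum by its native identifier, e.g. an mzML-style ID."""

class InMemorySource(SpectrumSource):
    """A trivial concrete parser standing in for a per-format implementation."""
    def __init__(self, spectra):
        self._spectra = list(spectra)

    def spectrum_by_index(self, index):
        return self._spectra[index]

    def spectrum_by_id(self, spectrum_id):
        return next(s for s in self._spectra if s["id"] == spectrum_id)
```

Code written against `SpectrumSource` works unchanged whichever format backs it, which is precisely the benefit claimed for the common interface.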

  4. The Use of Virtual Globes as a Spatial Teaching Tool with Suggestions for Metadata Standards

    ERIC Educational Resources Information Center

    Schultz, Richard B.; Kerski, Joseph J.; Patterson, Todd C.

    2008-01-01

    Virtual Globe software has become extremely popular both inside and outside of educational settings. This software allows users to explore the Earth in three dimensions while streaming satellite imagery, elevation, and other data from the Internet. Virtual Globes, such as Google Earth, NASA World Wind, and ESRI's ArcGIS Explorer can be effectively…

  5. Interfaces Visualize Data for Airline Safety, Efficiency

    NASA Technical Reports Server (NTRS)

    2014-01-01

    As the A-Train Constellation orbits Earth to gather data, NASA scientists and partners visualize, analyze, and communicate the information. To this end, Langley Research Center awarded SBIR funding to Fairfax, Virginia-based WxAnalyst Ltd. to refine the company's existing user interface for Google Earth to visualize data. Hawaiian Airlines is now using the technology to help manage its flights.

  6. Interdisciplinary Collaboration amongst Colleagues and between Initiatives with the Magnetics Information Consortium (MagIC) Database

    NASA Astrophysics Data System (ADS)

    Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.; Jonestrask, L.; Shaar, R.

    2014-12-01

    Earth science grand challenges often require interdisciplinary and geographically distributed scientific collaboration to make significant progress. However, this organic collaboration between researchers, educators, and students only flourishes with the reduction or elimination of technological barriers. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the geo-, paleo-, and rock magnetic scientific community to archive their wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples. MagIC is dedicated to facilitating scientific progress towards several highly multidisciplinary grand challenges and the MagIC Database team is currently beta testing a new MagIC Search Interface and API designed to be flexible enough for the incorporation of large heterogeneous datasets and for horizontal scalability to tens of millions of records and hundreds of requests per second. In an effort to reduce the barriers to effective collaboration, the search interface includes a simplified data model and upload procedure, support for online editing of datasets amongst team members, commenting by reviewers and colleagues, and automated contribution workflows and data retrieval through the API. This web application has been designed to generalize to other databases in MagIC's umbrella website (EarthRef.org) so the Geochemical Earth Reference Model (http://earthref.org/GERM/) portal, Seamount Biogeosciences Network (http://earthref.org/SBN/), EarthRef Digital Archive (http://earthref.org/ERDA/) and EarthRef Reference Database (http://earthref.org/ERR/) will benefit from its development.

  7. Pict'Earth: A new Method of Virtual Globe Data Acquisition

    NASA Astrophysics Data System (ADS)

    Johnson, J.; Long, S.; Riallant, D.; Hronusov, V.

    2007-12-01

    Georeferenced aerial imagery facilitates and enhances Earth science investigations. The value of imagery as a tool depends on its spatial, temporal and radiometric resolution. Currently, there is a need for a system that facilitates the rapid acquisition and distribution of high-resolution aerial Earth images of localized areas. The Pict'Earth group has developed an apparatus and software algorithms that facilitate such tasks. Hardware includes a small radio-controlled model airplane (RC UAV); lightweight smartphones with high-resolution cameras (Nokia N-Series devices); and a GPS connected to the smartphone via the Bluetooth protocol, or a GPS-equipped phone. Software includes Python code that controls the functions of the smartphone and GPS to acquire data in-flight; online virtual globe applications including Google Earth; AJAX/Web 2.0 technologies and services; and APIs and libraries for developers, all based on open XML-based GIS data standards. This new process for acquiring and distributing high-resolution aerial Earth images includes the following stages. A survey is performed over the area of interest (AOI) with the RC UAV (mobile live processing): in real time, our software collects images from the smartphone camera and positional data (latitude, longitude, altitude and heading) from the GPS. The software then calculates the Earth footprint (geoprint) of each image and creates KML files that incorporate the georeferenced images and the tracks of the UAV. Optionally, the data can be sent in-flight via SMS/MMS (text and multimedia messages) or over cellular internet networks via FTP. In post-processing, the images are filtered, transformed, and assembled into an orthorectified image mosaic. The final mosaic is then cut into tiles and uploaded as a user-ready product to web servers in KML format for use in virtual globes and other GIS applications.
The obtained images and resultant data have high spatial resolution, can be updated in near-real time (high temporal resolution), and provide current radiance values (which is important for seasonal work). The final mosaics can also be assembled into time-lapse sequences and presented temporally. The suggested solution is cost effective when compared to the alternative methods of acquiring similar imagery. The systems are compact, mobile, and do not require a substantial amount of auxiliary equipment. Ongoing development of the software makes it possible to adapt the technology to different platforms, smartphones, sensors, and types of data. The range of application of this technology potentially covers a large part of the spectrum of Earth sciences including the calibration and validation of high-resolution satellite-derived products. These systems are currently being used for monitoring of dynamic land and water surface processes, and can be used for reconnaissance when locating and establishing field measurement sites.
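    The geoprint computation mentioned above can be sketched under simplifying assumptions: a nadir-pointing camera, a flat-earth approximation valid for small footprints, and illustrative field-of-view angles (the record does not give the actual camera model or geometry).

```python
# Sketch of a "geoprint": estimate the ground footprint corners of a
# nadir-pointing camera from UAV position, altitude and heading.
# Flat-earth, small-footprint approximation; FOV angles are assumptions.
import math

EARTH_RADIUS_M = 6371000.0

def geoprint(lat, lon, alt_m, heading_deg, hfov_deg=60.0, vfov_deg=45.0):
    """Return four (lat, lon) corners of the image footprint on the ground."""
    half_w = alt_m * math.tan(math.radians(hfov_deg / 2))  # metres across-track
    half_h = alt_m * math.tan(math.radians(vfov_deg / 2))  # metres along-track
    h = math.radians(heading_deg)
    corners = []
    for dx, dy in [(-half_w, half_h), (half_w, half_h),
                   (half_w, -half_h), (-half_w, -half_h)]:
        # rotate the camera-frame offset by the heading, then convert to degrees
        east = dx * math.cos(h) + dy * math.sin(h)
        north = -dx * math.sin(h) + dy * math.cos(h)
        dlat = math.degrees(north / EARTH_RADIUS_M)
        dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
        corners.append((lat + dlat, lon + dlon))
    return corners
```

The four corners are exactly what a KML GroundOverlay or polygon needs to drape the photo onto the virtual globe; a production pipeline would additionally correct for camera tilt and lens distortion.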

  8. Judicious use of custom development in an open source component architecture

    NASA Astrophysics Data System (ADS)

    Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.

    2014-12-01

    Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). 
Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.

  9. Spatio-temporal Change Patterns of Tropical Forests from 2000 to 2014 Using MOD09A1 Dataset

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Xiao, X.; Dong, J.

    2016-12-01

    Large-scale deforestation and forest degradation in the tropics have resulted in extensive carbon emissions and biodiversity loss. However, restricted by the availability of good-quality observations, large uncertainty remains in mapping the spatial distribution of forests and their spatio-temporal changes. In this study, we propose a pixel- and phenology-based algorithm to identify and map annual tropical forests from 2000 to 2014, using the 8-day, 500-m MOD09A1 (v005) product, supported by Google cloud computing (Google Earth Engine). A temporal filter was applied to reduce random noise and to identify the spatio-temporal changes of forests. We then built a confusion matrix and assessed the accuracy of the annual forest maps against ground reference data interpreted from high-spatial-resolution images in Google Earth. The resultant forest maps show consistent forest/non-forest, forest loss, and forest gain across the pan-tropical zone during 2000-2014. The proposed algorithm shows potential for tropical forest mapping, and the resultant forest maps are important for estimating carbon emissions and biodiversity loss.
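    A temporal filter of the kind mentioned above can be sketched for one pixel's annual label series: a single-year label that disagrees with both of its neighbours is treated as classification noise rather than a real land-cover change. The exact rule the authors used is not given in the record, so this is a plausible, simplified version.

```python
# Sketch of a temporal filter on a per-pixel annual forest/non-forest series:
# an isolated one-year flip is treated as noise. The rule is a simplified
# assumption, not the authors' exact filter.
def temporal_filter(series):
    """series: per-year labels, 1 = forest, 0 = non-forest."""
    out = list(series)
    for t in range(1, len(series) - 1):
        if series[t - 1] == series[t + 1] != series[t]:
            out[t] = series[t - 1]  # isolated one-year flip -> neighbour value
    return out
```

Runs of two or more years are preserved, so genuine forest loss or gain (a persistent label change) survives the filter while one-off cloud or sensor artifacts do not.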

  10. Local Air Quality Conditions and Forecasts

    MedlinePlus


  11. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation

    PubMed Central

    2011-01-01

    This paper covers the use of depth sensors such as Microsoft Kinect and ASUS Xtion to provide a natural user interface (NUI) for controlling 3-D (three-dimensional) virtual globes such as Google Earth (including its Street View mode), Bing Maps 3D, and NASA World Wind. The paper introduces the Microsoft Kinect device, briefly describing how it works (the underlying technology by PrimeSense), as well as its market uptake and application potential beyond its original intended purpose as a home entertainment and video game controller. The different software drivers available for connecting the Kinect device to a PC (Personal Computer) are also covered, and their comparative pros and cons briefly discussed. We survey a number of approaches and application examples for controlling 3-D virtual globes using the Kinect sensor, then describe Kinoogle, a Kinect interface for natural interaction with Google Earth, developed by students at Texas A&M University. Readers interested in trying out the application on their own hardware can download a Zip archive (included with the manuscript as additional files 1, 2 and 3) that contains a 'Kinoogle installation package for Windows PCs'. Finally, we discuss some usability aspects of Kinoogle and similar NUIs for controlling 3-D virtual globes (including possible future improvements), and propose a number of unique, practical 'use scenarios' where such NUIs could prove useful in navigating a 3-D virtual globe, compared to conventional mouse/3-D mouse and keyboard-based interfaces. PMID:21791054

  12. What's New in the Ocean in Google Earth and Maps

    NASA Astrophysics Data System (ADS)

    Austin, J.; Sandwell, D. T.

    2014-12-01

    Jenifer Austin(1), Jamie Adams(1), Kurt Schwehr(1), Brian Sullivan(1), David Sandwell(2), Walter Smith(3), Vicki Ferrini(4), and Barry Eakins(5); (1) Google Inc., 1600 Amphitheatre Parkway, Mountain View, California, USA; (2) Scripps Institution of Oceanography, University of California-San Diego, La Jolla, California, USA; (3) NOAA Laboratory for Satellite Altimetry, College Park, Maryland, USA; (4) Lamont-Doherty Earth Observatory, Columbia University; (5) NOAA. More than two-thirds of Earth is covered by oceans. On the almost six-year anniversary of launching an explorable ocean seafloor in Google Earth and Maps, we updated our global underwater terrain dataset in partnership with Lamont-Doherty at Columbia, the Scripps Institution of Oceanography, and NOAA. With this update to our ocean map, we reveal an additional 2% of the ocean in high resolution, representing two years of work by Columbia and pulling in data from numerous institutions, including the Campeche Escarpment in the Gulf of Mexico in partnership with Charlie Paull at MBARI and the Schmidt Ocean Institute. The Scripps Institution of Oceanography at UCSD has curated 30 years of data from more than 8,000 ship cruises and 135 different institutions to reveal 15 percent of the seafloor at 1 km resolution. In addition, explore new data from an automated pipeline built to make updates to our ocean map more scalable, in partnership with NOAA's National Geophysical Data Center (http://www.ngdc.noaa.gov/mgg/bathymetry/) and the University of Colorado CIRES program (http://cires.colorado.edu/index.html).

  13. Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation.

    PubMed

    Boulos, Maged N Kamel; Blanchard, Bryan J; Walker, Cory; Montero, Julio; Tripathy, Aalap; Gutierrez-Osuna, Ricardo

    2011-07-26

    This paper covers the use of depth sensors such as Microsoft Kinect and ASUS Xtion to provide a natural user interface (NUI) for controlling 3-D (three-dimensional) virtual globes such as Google Earth (including its Street View mode), Bing Maps 3D, and NASA World Wind. The paper introduces the Microsoft Kinect device, briefly describing how it works (the underlying technology by PrimeSense), as well as its market uptake and application potential beyond its original intended purpose as a home entertainment and video game controller. The different software drivers available for connecting the Kinect device to a PC (Personal Computer) are also covered, and their comparative pros and cons briefly discussed. We survey a number of approaches and application examples for controlling 3-D virtual globes using the Kinect sensor, then describe Kinoogle, a Kinect interface for natural interaction with Google Earth, developed by students at Texas A&M University. Readers interested in trying out the application on their own hardware can download a Zip archive (included with the manuscript as additional files 1, 2 and 3) that contains a 'Kinoogle installation package for Windows PCs'. Finally, we discuss some usability aspects of Kinoogle and similar NUIs for controlling 3-D virtual globes (including possible future improvements), and propose a number of unique, practical 'use scenarios' where such NUIs could prove useful in navigating a 3-D virtual globe, compared to conventional mouse/3-D mouse and keyboard-based interfaces.

  14. Vulnerability of the Nigerian coast: An insight into sea level rise owing to climate change and anthropogenic activities

    NASA Astrophysics Data System (ADS)

    Danladi, Iliya Bauchi; Kore, Basiru Mohammed; Gül, Murat

    2017-10-01

    Coastal areas are important regions of the world, as they host huge populations, diverse ecosystems and natural resources. However, owing to their settings, elevations and proximity to the sea, climate change (global warming) and human activities are threatening issues. Herein, we report the coastline changes and possible future threats related to sea level rise owing to global warming and human activities in the coastal region of Nigeria. Google Earth images, a Digital Elevation Model (DEM) and geological maps were used. Using Google Earth images, coastal changes over the past 43 years, and over the 3 years before and after the construction of breakwaters along Goshen Beach Estate (Lekki), were examined. Additionally, coastline changes along Lekki Phase I from 2013 to 2016 were evaluated. The DEM map was used to delineate elevations of 0-2 m, 2-5 m and 5-10 m asl, which correspond to lithologies ranging from undifferentiated sands and gravels to clays on the geological map. The Google Earth images revealed remarkable erosion along both Lekki and Lekki Phase I, including the destruction of a lagoon in Lekki Phase I. Based on the DEM and the geology, elevations of 0-2 m, 2-5 m and 5-10 m asl were interpreted as highly risky, moderately risky and risky, respectively. Considering the factors threatening coastal regions, the erosion and the destruction of the lagoon along the Nigerian coast may be ascribed to sea level rise resulting from global warming and to intense human activities, respectively.

  15. An Android based location service using GSMCellID and GPS to obtain a graphical guide to the nearest cash machine

    NASA Astrophysics Data System (ADS)

    Jacobsen, Jurma; Edlich, Stefan

    2009-02-01

    There is a broad range of potentially useful mobile location-based applications. One crucial point is making them available to the public at large. This case illuminates the ability of Android, the operating system for mobile devices, to fulfill this demand in the mashup style, using several geocoding web services and one integrated web service for retrieving data on the nearest cash machines. It shows an exemplary approach for building mobile location-based mashups for everyone: 1. As a basis for reaching as many people as possible, the open-source Android OS is assumed to spread widely. 2. 'Everyone' also means that the handset does not have to be an expensive GPS device. This is realized by re-using the existing GSM infrastructure with the Cell of Origin (COO) method, which looks up the CellID in one of the growing number of CellID databases available on the web. Some of these databases are still undocumented and not yet published. Furthermore, the Google Maps API for Mobile (GMM) and the open-source counterpart OpenCellID are used. Localizing the user's current position by looking up the closest cell to which the handset is currently connected (COO) is not as precise as GPS, but appears to be sufficient for many applications. For this reason the GPS user is best served: for this user the system is fully automated. By contrast, users without a GPS-enabled handset should refine their location with one click on the map inside the determined circular region. Users are then shown, and guided along, a path to the nearest cash machine by integrating the Google Maps API with an overlay. Additionally, the GPS user can keep track of him- or herself via a frequently updated view driven by constantly requested precise GPS positions.
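    The final step of the workflow, picking the nearest cash machine once the user's position is known from GPS or COO, reduces to a great-circle distance comparison. The sketch below uses the standard haversine formula; the cash-machine coordinates are invented for the example.

```python
# Sketch of the nearest-cash-machine lookup: compare great-circle distances
# from the user's position (GPS or COO fix). The machine list is made up.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula, spherical Earth)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest(user, machines):
    """machines: list of (name, lat, lon); return the entry closest to the user."""
    return min(machines, key=lambda m: haversine_m(user[0], user[1], m[1], m[2]))

# Hypothetical cash-machine locations for illustration.
atms = [("Alexanderplatz", 52.5219, 13.4132), ("Zoo", 52.5074, 13.3324)]
```

With a COO fix, the same comparison runs against the centre of the determined circular region; the chosen machine's coordinates then feed the map overlay that draws the guidance path.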

  16. An Agro-Climatological Early Warning Tool Based on the Google Earth Engine to Support Regional Food Security Analysis

    NASA Astrophysics Data System (ADS)

    Landsfeld, M. F.; Daudert, B.; Friedrichs, M.; Morton, C.; Hegewisch, K.; Husak, G. J.; Funk, C. C.; Peterson, P.; Huntington, J. L.; Abatzoglou, J. T.; Verdin, J. P.; Williams, E. L.

    2015-12-01

    The Famine Early Warning Systems Network (FEWS NET) focuses on food insecurity in developing nations and provides objective, evidence-based analysis to help government decision-makers and relief agencies plan for and respond to humanitarian emergencies. The Google Earth Engine (GEE) is a platform provided by Google Inc. to support scientific research and analysis of environmental data in their cloud environment. The intent is to allow scientists and independent researchers to mine massive collections of environmental data and leverage Google's vast computational resources to detect changes and monitor the Earth's surface and climate. GEE hosts an enormous amount of satellite imagery and climate archives, one of which is the Climate Hazards Group Infrared Precipitation with Stations dataset (CHIRPS). The CHIRPS dataset is land-based, quasi-global (latitude 50N-50S), 0.05-degree resolution, and has a relatively long period of record (1981-present). CHIRPS is fed continuously into GEE as new data fields are generated each month. This precipitation dataset is a key input for FEWS NET monitoring and forecasting efforts. FEWS NET intends to leverage GEE to provide analysts and scientists with flexible, interactive tools to aid their monitoring and research efforts. These scientists often work in bandwidth-limited regions, so lightweight Internet tools and services that bypass the need to download massive datasets before analyzing them are preferred for their work. GEE provides just this type of service. We present a tool designed specifically for FEWS NET scientists to use interactively for investigating and monitoring agro-climatological conditions. We are able to harness GEE's enormous computing power to generate on-the-fly statistics - precipitation anomalies, z-scores, percentiles and band ratios - and to let the user interactively select custom areas for statistical time-series comparisons and predictions.
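
    The statistics named above are computed server-side by GEE; the stand-alone sketch below only illustrates what an anomaly, z-score, and percent-of-normal are for a single pixel, using invented monthly rainfall values rather than CHIRPS data.

```python
import statistics

# Hypothetical monthly rainfall totals (mm) for one pixel, playing the role
# of a long-term climatology; the values are invented for illustration.
climatology = [52.0, 61.0, 48.0, 55.0, 47.0, 58.0, 50.0, 49.0]
current = 31.0  # this month's observed total (mm)

mean = statistics.mean(climatology)
sd = statistics.stdev(climatology)        # sample standard deviation

anomaly = current - mean                  # departure from the long-term mean
z_score = anomaly / sd                    # anomaly in standard-deviation units
percent_of_normal = 100.0 * current / mean
```

    In GEE the same reductions run per pixel over the whole CHIRPS archive; the arithmetic is identical.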

  17. Using GeoRSS feeds to distribute house renting and selling information based on Google map

    NASA Astrophysics Data System (ADS)

    Nong, Yu; Wang, Kun; Miao, Lei; Chen, Fei

    2007-06-01

    Geographically Encoded Objects RSS (GeoRSS) is a way to encode location in RSS feeds. RSS is a widely supported format for syndicating news and weblogs, and can be extended to publish any sort of itemized data. As weblogs exploded and RSS feeds became new portals, geo-tagged feeds became necessary to show the locations the stories describe. GeoRSS adopts the core RSS framework, expressing map annotations in the RSS XML format. The case studied shows that GeoRSS can be maximally concise in representation and conception: feeds are simple to generate and then mash up with Google Maps through its API, displaying real-estate information together with other attributes in the information window. After subscribing to feeds on subjects of interest, users can easily check for new bulletins shown on the map through syndication. The primary design goal of GeoRSS is to make spatial data creation as easy as regular web content development. Its simplicity and effectiveness, however, achieve more: they bridge the gap between traditional GIS professionals and the amateurs, web-map hackers, and numerous services that enable location-based content.
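
    A GeoRSS-Simple point is just one extra element inside an ordinary RSS `<item>`, holding "latitude longitude" as text. The sketch below builds such an item for a hypothetical rental listing; the title and coordinates are invented.

```python
import xml.etree.ElementTree as ET

GEORSS_NS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS_NS)

def make_listing_item(title, lat, lon):
    """Build one RSS <item> carrying a GeoRSS-Simple point ("lat lon")."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    point = ET.SubElement(item, f"{{{GEORSS_NS}}}point")
    point.text = f"{lat} {lon}"
    return item

item = make_listing_item("2-bedroom flat for rent", 31.2304, 121.4737)
xml_text = ET.tostring(item, encoding="unicode")
```

    A consumer (e.g. a Google Maps mashup) parses the point element back into a marker position.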

  18. Tracing Forest Change through 40 Years on Two Continents with the BULC Algorithm and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Cardille, J. A.; Crowley, M.; Fortin, J. A.; Lee, J.; Perez, E.; Sleeter, B. M.; Thau, D.

    2016-12-01

    With the opening of the Landsat archive, researchers have a vast new data source teeming with imagery and potential. Beyond Landsat, data from other sensors are newly available as well: these include ALOS/PALSAR, Sentinel-1 and -2, MERIS, and many more. Google Earth Engine, developed to organize and provide analysis tools for these immense data sets, is an ideal platform for researchers trying to sift through huge image stacks. It offers nearly unlimited processing power and storage with a straightforward programming interface. Yet labeling land-cover change through time remains challenging given the current state of the art for interpreting remote sensing image sequences. Moreover, combining data from very different image platforms remains quite difficult. To address these challenges, we developed the BULC algorithm (Bayesian Updating of Land Cover), designed for the continuous updating of land-cover classifications through time in large data sets. The algorithm ingests data from any of the wide variety of earth-resources sensors; it maintains a running estimate of land-cover probabilities and the most probable class at all time points along a sequence of events. Here we compare BULC results from two study sites that witnessed considerable forest change in the last 40 years: the Pacific Northwest of the United States and the Mato Grosso region of Brazil. In Brazil, we incorporated rough classifications from more than 100 images of varying quality, mixing imagery from more than 10 different sensors. In the Pacific Northwest, we used BULC to identify forest changes due to logging and urbanization from 1973 to the present. In both regions the resulting classification sequences were better than many of their component classifications, effectively ignoring clouds and other unwanted noise while fusing the information contributed by several platforms.
As we leave remote sensing's data-poor era and enter a period with multiple looks at Earth's surface from multiple sensors over a short period of time, the BULC algorithm can help to sift through images of varying quality in Google Earth Engine to extract the most useful information for mapping the state and history of Earth's land cover.
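
    The abstract does not spell out BULC's update rule, so the following is only a generic Bayesian-updating sketch in the same spirit, not the published algorithm: each noisy classification updates a pixel's class probabilities through an assumed classifier accuracy, with errors spread evenly over the other classes.

```python
def bayes_update(prior, observed_class, accuracy):
    """One Bayesian update of land-cover class probabilities for a pixel.

    prior          : dict mapping class -> probability (sums to 1)
    observed_class : label reported by today's (noisy) classification
    accuracy       : assumed P(classifier says c | truth is c); errors are
                     spread evenly over the remaining classes.
    """
    n = len(prior)
    posterior = {}
    for c, p in prior.items():
        likelihood = accuracy if c == observed_class else (1 - accuracy) / (n - 1)
        posterior[c] = likelihood * p
    total = sum(posterior.values())
    return {c: v / total for c, v in posterior.items()}

# Two consecutive noisy "forest" observations sharpen an uninformative prior.
p = {"forest": 1 / 3, "cleared": 1 / 3, "water": 1 / 3}
for _ in range(2):
    p = bayes_update(p, "forest", accuracy=0.8)
```

    Because the posterior is renormalized after each image, a single cloudy misclassification nudges the running estimate only slightly, which is how a sequence can out-perform its individual components.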

  19. Garver Google+ Hangout

    NASA Image and Video Library

    2013-05-31

    NASA Deputy Administrator Lori Garver participates in a live "We The Geeks" Google+ Hangout hosted by the White House to talk about asteroids, Friday, May 31, 2013 at NASA Headquarters in Washington. An asteroid nearly three kilometers wide will pass by Earth today at a distance of 3.6 million miles. Garver is joined in the conversation by Bill Nye, Executive Director, Planetary Society; Ed Lu, former astronaut and CEO, B612 Foundation; Peter Diamandis, Co-Founder and Co-Chairman, Planetary Resources; and Jose Luis Galache, Astronomer at the International Astronomical Union's Minor Planet Center. Photo Credit: (NASA/Carla Cioffi)

  20. 3D Viewer Platform of Cloud Clustering Management System: Google Map 3D

    NASA Astrophysics Data System (ADS)

    Choi, Sung-Ja; Lee, Gang-Soo

    A new framework management system is needed for cloud environments as platforms converge across changing computing environments. Independent software vendors (ISVs) and small businesses find it hard to adopt management-system platforms offered by large enterprises. This article proposes a clustering management system for cloud computing environments aimed at ISVs and small-business enterprises. It adopts a 3D viewer based on Google Maps 3D and Google Earth, and is called 3DV_CCMS, an extension of the CCMS [1].

  1. When Will It Be... USNO Seasons and Apsides Calculator

    NASA Astrophysics Data System (ADS)

    Chizek Frouard, Malynda; Bartlett, Jennifer Lynn

    2018-01-01

    The turning of the Earth's seasons (solstices and equinoxes) and apsides (perihelion and aphelion) are times often used in observational astronomy and also of interest to the public. To avoid tedious calculations, the U.S. Naval Observatory (USNO) has developed an on-line interactive calculator, Earth's Seasons and Apsides, to provide information about events between 1600 and 2200. The new data service uses an Application Programming Interface (API), which returns values in JavaScript Object Notation (JSON) that can be incorporated into third-party websites or applications. For a requested year, the Earth's Seasons and Apsides API provides the Gregorian calendar date and time of the Vernal Equinox, Summer Solstice, Autumnal Equinox, Winter Solstice, Aphelion, and Perihelion. The user may specify the time zone for the results, including the optional addition of U.S. daylight saving time for years after 1966. On-line documentation for using the API-enabled Earth's Seasons and Apsides service is available, including sample calls (http://aa.usno.navy.mil/data/docs/api.php). A traditional forms-based interface is available as well (http://aa.usno.navy.mil/data/docs/EarthSeasons.php). This data service replaces the popular Earth's Seasons: Equinoxes, Solstices, Perihelion, and Aphelion page, which provided a static list of events for 2000–2025.
The USNO also provides API-enabled data services for Complete Sun and Moon Data for One Day (http://aa.usno.navy.mil/data/docs/RS_OneDay.php), Dates of the Primary Phases of the Moon (http://aa.usno.navy.mil/data/docs/MoonPhase.php), Selected Christian Observances (http://aa.usno.navy.mil/data/docs/easter.php), Selected Islamic Observances (http://aa.usno.navy.mil/data/docs/islamic.php), Selected Jewish Observances (http://aa.usno.navy.mil/data/docs/passover.php), Julian Date Conversion (http://aa.usno.navy.mil/data/docs/JulianDate.php), and Sidereal Time (http://aa.usno.navy.mil/data/docs/siderealtime.php) as well as its Solar Eclipse Computer (http://aa.usno.navy.mil/data/docs/SolarEclipses.php).
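
    A client would consume such a service by parsing the returned JSON. The response below is hypothetical - the field names are assumptions chosen for illustration, not the documented USNO schema.

```python
import json

# Hypothetical response shaped like the JSON such an API might return;
# the keys ("phenom", "month", "day", "time") are invented for this sketch.
sample_response = """{
  "year": 2018,
  "tz": 0,
  "data": [
    {"phenom": "Equinox",    "month": 3, "day": 20, "time": "16:15"},
    {"phenom": "Solstice",   "month": 6, "day": 21, "time": "10:07"},
    {"phenom": "Perihelion", "month": 1, "day": 3,  "time": "05:35"}
  ]
}"""

events = json.loads(sample_response)["data"]
equinoxes = [e for e in events if e["phenom"] == "Equinox"]
```

    Because the payload is plain JSON, the same parsing works in any language a third-party website or application is written in.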

  2. Climate Engine - Monitoring Drought with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Hegewisch, K.; Daudert, B.; Morton, C.; McEvoy, D.; Huntington, J. L.; Abatzoglou, J. T.

    2016-12-01

    Drought has adverse effects on society through reduced water availability and agricultural production and increased wildfire risk. An abundance of remotely sensed imagery and climate data are being collected in near-real time that can provide place-based monitoring and early warning of drought and related hazards. However, in an era of increasing wealth of earth observations, tools that quickly access, compute, and visualize archives, and provide answers at relevant scales to better inform decision-making are lacking. We have developed ClimateEngine.org, a web application that uses Google's Earth Engine platform to enable users to quickly compute and visualize real-time observations. A suite of drought indices allow us to monitor and track drought from local (30-meters) to regional scales and contextualize current droughts within the historical record. Climate Engine is currently being used by U.S. federal agencies and researchers to develop baseline conditions and impact assessments related to agricultural, ecological, and hydrological drought. Climate Engine is also working with the Famine Early Warning Systems Network (FEWS NET) to expedite monitoring agricultural drought over broad areas at risk of food insecurity globally.

  3. Mapping land cover change over continental Africa using Landsat and Google Earth Engine cloud computing

    PubMed Central

    Holl, Felix; Savory, David J.; Andrade-Pacheco, Ricardo; Gething, Peter W.; Bennett, Adam; Sturrock, Hugh J. W.

    2017-01-01

    Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth’s land surface processes. However, the lack of regularly updated, continental-scale, high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping with these data was not previously feasible because of the high-performance computing needed to store, process, and analyze such a large volume of imagery. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here, which overcomes the computational challenges of handling big earth observation data by using cloud computing, can help scientists and practitioners who lack high-performance computational resources. PMID:28953943

  4. Graphical overview and navigation of electronic health records in a prototyping environment using Google Earth and openEHR archetypes.

    PubMed

    Sundvall, Erik; Nyström, Mikael; Forss, Mattias; Chen, Rong; Petersson, Håkan; Ahlfeldt, Hans

    2007-01-01

    This paper describes selected earlier approaches to graphically relating events to each other and to time; some new combinations are also suggested. These are then combined into a unified prototyping environment for visualization and navigation of electronic health records. Google Earth (GE) is used for handling display and interaction of clinical information stored using openEHR data structures and 'archetypes'. The strength of the approach comes from GE's sophisticated handling of detail levels, from coarse overviews to fine-grained details, which has been combined with linear, polar and region-based views of clinical events related to time. The system should be easy to learn since all the visualization styles can use the same navigation. The structured and multifaceted approach to handling time that is possible with archetyped openEHR data lends itself well to visualization, and integration with openEHR components is provided in the environment.

  5. Wallace Creek Virtual Field Trip: Teaching Geoscience Concepts with LiDAR

    NASA Astrophysics Data System (ADS)

    Robinson, S. E.; Arrowsmith, R.; Crosby, C. J.

    2009-12-01

    Recently available data such as LiDAR (Light Detection and Ranging) high-resolution topography can help students better visualize and understand geoscience concepts. It is important to bring these data into geoscience curricula as teaching aids while ensuring that the visualization tools, virtual environments, etc. do not serve as barriers to student learning. As a Southern California Earthquake Center ACCESS-G intern, I am creating a “virtual field trip” to Wallace Creek along the San Andreas Fault (SAF) using Google Earth as a platform and the B4 project LiDAR data. Wallace Creek is an excellent site for understanding the centennial-to-millennial record of SAF slip because of its dramatic stream offsets. Using the LiDAR data instead of, or alongside, traditional visualizations and teaching methods enhances a student’s ability to understand plate tectonics, the earthquake cycle, strike-slip faults, and geomorphology. Viewing a high-resolution representation of the topography in Google Earth allows students to analyze the landscape and answer questions about the behavior of the San Andreas Fault. The activity guides students along the fault, allowing them to measure channel offsets using the Google Earth measuring tool. Knowing the ages of channels, they calculate slip rate. They look for the smallest channel offsets around Wallace Creek in order to determine the slip per event. At both a “LiDAR and Education” workshop and the Cyberinfrastructure Summer Institute for Geoscientists (CSIG), I presented the Wallace Creek activity to high school and college earth science teachers. The teachers were positive in their responses and had numerous important suggestions, including the need for a teacher’s manual covering instruction and scientific background, and that the student goals and science topics should be specific and well-articulated for the sake of both the teacher and the student. The teachers also noted that the technology in classrooms varies significantly. Some do not have computers available for students or do not have access to the internet or certain software licenses. For this reason, I am also creating a paper-based version of the same exercise. After a usable activity is developed, I plan to make it available online through the OpenTopography portal (www.opentopography.com) using a format similar to the online teaching boxes seen at DLESE (www.dlese.org). The final version will facilitate visual student learning through the popular Google Earth platform along with student guides and a teacher’s manual.
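
    The slip-rate calculation the activity walks students through is a single division: offset distance divided by channel age. The values below are approximate figures commonly cited for Wallace Creek, used here only as an illustration of the arithmetic.

```python
# Slip rate = channel offset / channel age, converted to mm per year.
offset_m = 128.0   # offset of the beheaded channel along the fault (approx.)
age_yr = 3700.0    # radiocarbon age of the offset channel (approx.)

slip_rate_mm_per_yr = offset_m / age_yr * 1000.0  # roughly mid-30s mm/yr
```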

  6. Interactive Visualization of Near Real-Time and Production Global Precipitation Mission Data Online Using CesiumJS

    NASA Astrophysics Data System (ADS)

    Lammers, M.

    2016-12-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.
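
    A CZML file is a JSON array that opens with a document packet and continues with entity packets. The sketch below builds a minimal, simplified example of the kind of point-above-the-surface packet a viewer like GPMNRTView animates; the identifier and coordinates are invented, and real products carry many more properties (time intervals, colors, per-channel values).

```python
import json

def precipitation_czml(lon, lat, height_m):
    """Minimal CZML: a document packet plus one point packet above the surface."""
    packets = [
        {"id": "document", "version": "1.0"},
        {
            "id": "precip-obs-1",
            # CZML positions are lon, lat (degrees), height (metres).
            "position": {"cartographicDegrees": [lon, lat, height_m]},
            "point": {"pixelSize": 6},
        },
    ]
    return json.dumps(packets)

czml_text = precipitation_czml(-80.0, 25.0, 4000.0)
```

    Keeping these post-processed files small is what lets the browser, rather than a heavyweight geoserver, do the rendering work.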

  7. Interactive Visualization of Near Real Time and Production Global Precipitation Measurement (GPM) Mission Data Online Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matthew

    2016-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology make online visualization of large geospatial datasets viable. Commonly this is done using static image overlays, pre-rendered animations, or cumbersome geoservers. These methods can limit interactivity and/or place a large burden on server-side post-processing and storage of data. Geospatial data, and satellite data specifically, benefit from being visualized both on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS, developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. It has entered the void left by the abandonment of the Google Earth Web API, and it serves as a capable and well-maintained platform upon which data can be displayed. This paper will describe the technology behind the two primary products developed as part of the NASA Precipitation Processing System STORM website: GPM Near Real Time Viewer (GPMNRTView) and STORM Virtual Globe (STORM VG). GPMNRTView reads small post-processed CZML files derived from various Level 1 through 3 near real-time products. For swath-based products, several brightness temperature channels or precipitation-related variables are available for animating in virtual real-time as the satellite observed them on and above the Earth's surface. With grid-based products, only precipitation rates are available, but the grid points are visualized in such a way that they can be interactively examined to explore raw values. STORM VG reads values directly off the HDF5 files, converting the information into JSON on the fly. All data points both on and above the surface can be examined here as well. Both the raw values and, if relevant, elevations are displayed. Surface and above-ground precipitation rates from select Level 2 and 3 products are shown. Examples from both products will be shown, including visuals from high impact events observed by GPM constellation satellites.

  8. VIP Data Explorer: A Tool for Exploring 30 years of Vegetation Index and Phenology Observations

    NASA Astrophysics Data System (ADS)

    Barreto-munoz, A.; Didan, K.; Rivera-Camacho, J.; Yitayew, M.; Miura, T.; Tsend-Ayush, J.

    2011-12-01

    Continuous acquisition of global satellite imagery over the years has contributed to the creation of long-term data records from AVHRR, MODIS, TM, SPOT-VGT and other sensors. These records now span 30+ years; as the archives grow, they become invaluable tools for environmental, resource-management, and climate studies dealing with trends and changes at local, regional and global scales. In this project, the Vegetation Index and Phenology Lab (VIPLab) is processing 30 years of daily global surface reflectance data into an Earth Science Data Record (ESDR) of Vegetation Index and Phenology metrics. Data from AVHRR (N07, N09, N11 and N14) and MODIS (AQUA and TERRA collection 5) for the periods 1981-1999 and 2000-2010, at CMG resolution, were processed into one seamless, sensor-independent data record using various filtering, continuity and gap-filling techniques (Tsend-Ayush et al., AGU 2011; Rivera-Camacho et al., AGU 2011). An interactive online tool, the VIP Data Explorer, was developed to support the visualization, qualitative and quantitative exploration, distribution, and documentation of these records using a simple Web 2.0 interface. The VIP Data Explorer (http://vip.arizona.edu/viplab_data_explorer) can display any combination of multi-temporal and multi-source data, enabling quick exploration and cross-comparison of the various processing levels of the data. It uses the Google Earth (GE) model and was developed with the GE API for image rendering, manipulation and geolocation. The ESDR records can be quickly animated in this environment and explored for visual trends and anomaly detection. Additionally, the tool enables extracting and visualizing the time series of any land pixel while showing the different levels of processing it went through. Users can explore the ESDR database within the data explorer's GUI environment, and any desired data can be placed into a dynamic "cart" to be ordered and downloaded later. More functionality is planned and will be added to the data explorer as the project progresses.

  9. Virtual Globes, where we were, are and will be

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P. W.; Worden, A. K.

    2016-12-01

    Ten years ago, Google Earth was new, and the first "Virtual Globes" session was held at AGU. Only a few of us realized the potential of this technology at the time, but the idea quickly caught on. At that time a virtual globe came in two flavors: a complex GIS system that was utterly impenetrable for the public, or a more accessible version with limited functionality and layers that was available on a desktop computer with a good internet connection. Google Earth's use of the Keyhole Markup Language opened the door for scientists and the public to share data and visualizations across disciplines and revolutionized how everyone uses geographic data. In the following 10 years, KML became more advanced, virtual globes moved to mobile and handheld platforms, and Google Earth Engine allowed for more complex data sharing among scientists. Virtual globe images went from a rare commodity to being everywhere in our lives - in weather forecasts, in our cars, on our smart-phones - shaping how we receive and process data. This is a fantastic tool for education and, with newer technologies, can reach the remote corners of the world and developing countries. New and emerging technologies allow for augmented reality to be merged with the globes, and for real-time data integration with sensors built into mobile devices or add-ons. This presentation will follow the history of virtual globes in the geosciences, show how robust technologies can be used in the field and classroom today, and make some suggestions for the future.

  10. Use of Google Earth to strengthen public health capacity and facilitate management of vector-borne diseases in resource-poor environments.

    PubMed

    Lozano-Fuentes, Saul; Elizondo-Quiroga, Darwin; Farfan-Ale, Jose Arturo; Loroño-Pino, Maria Alba; Garcia-Rejon, Julian; Gomez-Carro, Salvador; Lira-Zumbardo, Victor; Najera-Vazquez, Rosario; Fernandez-Salas, Ildefonso; Calderon-Martinez, Joaquin; Dominguez-Galera, Marco; Mis-Avila, Pedro; Morris, Natashia; Coleman, Michael; Moore, Chester G; Beaty, Barry J; Eisen, Lars

    2008-09-01

    Novel, inexpensive solutions are needed for improved management of vector-borne and other diseases in resource-poor environments. Emerging free software providing access to satellite imagery and simple editing tools (e.g. Google Earth) complement existing geographic information system (GIS) software and provide new opportunities for: (i) strengthening overall public health capacity through development of information for city infrastructures; and (ii) display of public health data directly on an image of the physical environment. We used freely accessible satellite imagery and a set of feature-making tools included in the software (allowing for production of polygons, lines and points) to generate information for city infrastructure and to display disease data in a dengue decision support system (DDSS) framework. Two cities in Mexico (Chetumal and Merida) were used to demonstrate that a basic representation of city infrastructure useful as a spatial backbone in a DDSS can be rapidly developed at minimal cost. Data layers generated included labelled polygons representing city blocks, lines representing streets, and points showing the locations of schools and health clinics. City blocks were colour-coded to show presence of dengue cases. The data layers were successfully imported in a format known as shapefile into a GIS software. The combination of Google Earth and free GIS software (e.g. HealthMapper, developed by WHO, and SIGEpi, developed by PAHO) has tremendous potential to strengthen overall public health capacity and facilitate decision support system approaches to prevention and control of vector-borne diseases in resource-poor environments.
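
    The polygon, line, and point layers described above can equally be produced programmatically as KML, the format Google Earth reads natively. The sketch below builds one Placemark outlining a hypothetical city block; the name and corner coordinates are invented, and a real layer would add a Style element to colour-code blocks with dengue cases.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)

def block_placemark(name, corners):
    """A KML Placemark whose Polygon outlines one city block.

    corners: list of (lon, lat) pairs; KML coordinates are lon,lat[,alt]
    and the ring must close on its first vertex.
    """
    ring = corners + [corners[0]]
    placemark = ET.Element(f"{{{KML_NS}}}Placemark")
    ET.SubElement(placemark, f"{{{KML_NS}}}name").text = name
    polygon = ET.SubElement(placemark, f"{{{KML_NS}}}Polygon")
    outer = ET.SubElement(polygon, f"{{{KML_NS}}}outerBoundaryIs")
    linear = ET.SubElement(outer, f"{{{KML_NS}}}LinearRing")
    coords = ET.SubElement(linear, f"{{{KML_NS}}}coordinates")
    coords.text = " ".join(f"{lon},{lat},0" for lon, lat in ring)
    return placemark

# Invented corner coordinates for one block with a reported dengue case.
pm = block_placemark("Block 12 - dengue case present",
                     [(-89.62, 20.97), (-89.619, 20.97),
                      (-89.619, 20.971), (-89.62, 20.971)])
kml_text = ET.tostring(pm, encoding="unicode")
```

    Files built this way can also be converted to shapefiles for import into GIS software, as the study did.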

  11. Combining Google Earth and GIS mapping technologies in a dengue surveillance system for developing countries

    PubMed Central

    Chang, Aileen Y; Parrales, Maria E; Jimenez, Javier; Sobieszczyk, Magdalena E; Hammer, Scott M; Copenhaver, David J; Kulkarni, Rajan P

    2009-01-01

    Background Dengue fever is a mosquito-borne illness that places significant burden on tropical developing countries with unplanned urbanization. A surveillance system using Google Earth and GIS mapping technologies was developed in Nicaragua as a management tool. Methods and Results Satellite imagery of the town of Bluefields, Nicaragua captured from Google Earth was used to create a base-map in ArcGIS 9. Indices of larval infestation, locations of tire dumps, cemeteries, large areas of standing water, etc. that may act as larval development sites, and locations of the homes of dengue cases collected during routine epidemiologic surveying were overlaid onto this map. Visual imagery of the location of dengue cases, larval infestation, and locations of potential larval development sites were used by dengue control specialists to prioritize specific neighborhoods for targeted control interventions. Conclusion This dengue surveillance program allows public health workers in resource-limited settings to accurately identify areas with high indices of mosquito infestation and interpret the spatial relationship of these areas with potential larval development sites such as garbage piles and large pools of standing water. As a result, it is possible to prioritize control strategies and to target interventions to highest risk areas in order to eliminate the likely origin of the mosquito vector. This program is well-suited for resource-limited settings since it utilizes readily available technologies that do not rely on Internet access for daily use and can easily be implemented in many developing countries for very little cost. PMID:19627614

  12. Crop classification and mapping based on Sentinel missions data in cloud environment

    NASA Astrophysics Data System (ADS)

    Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.

    2017-12-01

    Availability of high resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping tasks at country and regional scale using time series of heterogeneous satellite imagery. But in this case we face the problem of Big Data: dealing with time series of high resolution (10 m) multispectral imagery, we would need to download huge volumes of data and then process them. The solution is to move the processing chain closer to the data itself, drastically shortening the time needed for data transfer. A further advantage of this approach is the possibility to parallelize the data processing workflow and efficiently implement machine learning algorithms. This can be done with a cloud platform where the Sentinel imagery is stored. In this study, we investigate the usability and efficiency of two different cloud platforms, Amazon and Google, for crop classification and crop mapping problems. Two pilot areas were investigated - Ukraine and England. Google provides the user-friendly Google Earth Engine environment for Earth observation applications, with many data processing and machine learning tools already deployed. With Amazon, on the other hand, one gets much more flexibility in implementing one's own workflow. A detailed analysis of the pros and cons will be given in the presentation.

  13. Assessment of the detectability of geo-hazards using Google Earth applied to the Three Parallel Rivers Area, Yunnan province of China

    NASA Astrophysics Data System (ADS)

    Voermans, Michiel; Mao, Zhun; Baartman, Jantiene EM; Stokes, Alexia

    2017-04-01

    Anthropogenic activities such as hydropower, mining and road construction in mountainous areas can induce and intensify mass-wasting geo-hazards (e.g. landslides, gullies, rockslides). This threatens local safety, hampers socio-economic development, and endangers biodiversity at larger scales. To date, the data and knowledge needed to construct geo-hazard databases for further assessment are lacking. This applies in particular to countries with recently emerged rapid economic growth, where there is no previous hazard documentation and where the means to gain data from, e.g., intensive fieldwork or VHR satellite imagery and DEM processing are lacking. Google Earth (GE, https://www.google.com/earth/) is a freely available and relatively simple virtual globe, map and geographical information program, which is potentially useful for detecting geo-hazards. This research aimed at (i) testing the capability of Google Earth to detect the locations of geo-hazards and (ii) identifying factors affecting the diagnostic quality of the detection, including the effects of geo-hazard dimensions, environmental setting, and the professional background and effort of GE users. This was tested on nine geo-hazard sites along road segments in the Three Parallel Rivers Area in the Yunnan province of China, where geo-hazards occur frequently. Along each road site, the position and size of each geo-hazard were measured in situ. Next, independent diagnosers with varying professional experience (students, researchers, engineers etc.) were invited to detect geo-hazard occurrence along each of the eight sites via GE. Finally, the inventory and diagnostic data were compared to validate the objectives. Rates of detected geo-hazards from 30 diagnosers ranged from 10% to 48%. No strong correlations were found between the type and size of the geo-hazards and their detection rates. The diagnosers' years of expertise also proved to make no difference, contrary to what might be expected. Meanwhile, the amount of time a diagnoser spent proved to positively influence detectability. GE proved to be a useful tool for detecting mainly larger geo-hazards when diligently applied, and is therefore applicable for identifying geo-hazard hotspots. Its usability for further assessments, such as sediment delivery estimation, is questionable, and further research should be carried out to give insight into its full potential.

  14. When Will It Be ...?: U.S. Naval Observatory Sidereal Time and Julian Date Calculators

    NASA Astrophysics Data System (ADS)

    Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bartlett, Jennifer L.

    2017-01-01

    Sidereal time and Julian date are two values often used in observational astronomy that can be tedious to calculate. Fortunately, the U.S. Naval Observatory (USNO) has redesigned its on-line Sidereal Time and Julian Date (JD) calculators to provide data through an Application Programming Interface (API). This flexible interface returns dates and times in JavaScript Object Notation (JSON) that can be incorporated into third-party websites or applications. Via the API, sidereal time can be obtained for any location on Earth for any date occurring in the current, previous, or subsequent year. Up to 9999 iterations of sidereal time data with intervals from 1 second to 1095 days can be generated, as long as the data do not extend past the date limits. The API provides the Gregorian calendar date and time (in UT1), Greenwich Mean Sidereal Time, Greenwich Apparent Sidereal Time, Local Mean Sidereal Time, Local Apparent Sidereal Time, and the Equation of the Equinoxes. Julian Date can be converted to calendar date, either Julian or Gregorian as appropriate, for any date between JD 0 (January 1, 4713 BCE proleptic Julian) and JD 5373484 (December 31, 9999 CE Gregorian); the reverse calendar-date-to-Julian-Date conversion is also available. The calendar date and Julian Date are returned for all API requests; the day of the week is also returned for Julian Date to calendar date conversions. On-line documentation for using all USNO API-enabled calculators, including sample calls, is available at http://aa.usno.navy.mil/data/docs/api.php. For those who prefer traditional data-input forms, Sidereal Time can still be accessed at http://aa.usno.navy.mil/data/docs/siderealtime.php, and the Julian Date Converter at http://aa.usno.navy.mil/data/docs/JulianDate.php.
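The Gregorian-to-JD arithmetic behind such a converter can be reproduced offline with the standard integer-arithmetic formula; a minimal sketch (function names are illustrative, and this is not the USNO implementation):

```python
def gregorian_to_jdn(year: int, month: int, day: int) -> int:
    """Julian Day Number at noon UT for a Gregorian calendar date,
    using the standard integer-arithmetic formula."""
    a = (14 - month) // 12          # 1 for Jan/Feb, else 0
    y = year + 4800 - a             # years since -4800, March-based
    m = month + 12 * a - 3          # month index with March = 0
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

def jd_at_midnight(year: int, month: int, day: int) -> float:
    """Julian Date at 0h UT (the JD rolls over at noon, hence -0.5)."""
    return gregorian_to_jdn(year, month, day) - 0.5

print(gregorian_to_jdn(2000, 1, 1))   # 2451545, the J2000.0 epoch at noon
```

The all-positive floor divisions make this safe in languages whose integer division floors (as Python's does); the classic Fliegel-Van Flandern form instead assumes truncating division.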

  15. Using Social Media and Mobile Devices to Discover and Share Disaster Data Products Derived From Satellites

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Cappelaere, Patrice; Frye, Stuart; Evans, John; Moe, Karen

    2014-01-01

    Data products derived from Earth-observing satellites are difficult to find and share without specialized software and, often, highly paid specialized staff. For our research effort, we prototyped a distributed architecture that depends on a standardized communication protocol and application programming interface (API) that makes it easy for anyone to discover and access disaster-related data. Providers can easily supply the public with their disaster-related products by building an adapter for our API. Users can use the API to browse and find products that relate to the disaster at hand (for example, floods) without a centralized catalogue, and can then share those data via social media. A longer-term goal for this architecture is to enable other users who see the shared disaster product to generate the same product for other areas of interest via simple point-and-click actions on the API on their mobile device. Furthermore, users will be able to edit the data with on-the-ground local observations and return the updated information to the original repository, if configured for this function. This architecture leverages SensorWeb functionality [1] presented at previous IGARSS conferences. The architecture is divided into two pieces: the front end, which is the GeoSocial API, and the back end, which is a standardized disaster node that knows how to talk to other disaster nodes and can also communicate with the GeoSocial API. The GeoSocial API, along with the disaster node's basic functionality, enables crowdsourcing and thus can leverage in situ observations by people external to a group to perform tasks such as improving water reference maps (maps of existing water before floods). This can lower the cost of generating precision water maps. Keywords: Data Discovery, Disaster Decision Support, Disaster Management, Interoperability, CEOS WGISS Disaster Architecture
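The catalogue-free discovery pattern described, querying peer disaster nodes through one uniform call and merging the answers, can be sketched abstractly; everything below (node names, product fields, the merge step) is illustrative, not the GeoSocial API itself:

```python
def discover_products(nodes, event_type, bbox):
    """Query every known disaster node with the same API call and
    merge the results; no central catalogue is consulted.
    `nodes` maps a node name to a callable standing in for its API."""
    found = []
    for name, query in nodes.items():
        for product in query(event_type, bbox):
            found.append(dict(product, source=name))
    return found

# Stand-in "nodes": plain callables returning product listings.
def flood_node(event_type, bbox):
    if event_type == "flood":
        return [{"id": "surface-water-map-42", "format": "GeoTIFF"}]
    return []

def empty_node(event_type, bbox):
    return []

hits = discover_products({"node-a": flood_node, "node-b": empty_node},
                         "flood", (-10, 10, 30, 50))
print(len(hits))   # 1
```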

  16. A method for creating a three-dimensional model from published geologic maps and cross sections

    USGS Publications Warehouse

    Walsh, Gregory J.

    2009-01-01

    This brief report presents a relatively inexpensive and rapid method for creating a 3D model of geology from published quadrangle-scale maps and cross sections using Google Earth and Google SketchUp software. An example from the Green Mountains of Vermont, USA, is used to illustrate the step-by-step methods used to create such a model. A second example is provided from the Jebel Saghro region of the Anti-Atlas Mountains of Morocco. The report was published to help enhance the public's ability to use and visualize geologic map data.
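For context, Google Earth ingests such models as KML; a minimal standard-library sketch that emits one extruded polygon placemark, the sort of element used to give a cross section apparent height (coordinates and names are invented for illustration):

```python
import xml.etree.ElementTree as ET

def polygon_kml(name, coords_lonlat_alt):
    """Build a KML document containing one extruded polygon placemark."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    pm = ET.SubElement(ET.SubElement(kml, "Document"), "Placemark")
    ET.SubElement(pm, "name").text = name
    poly = ET.SubElement(pm, "Polygon")
    ET.SubElement(poly, "extrude").text = "1"          # drop walls to ground
    ET.SubElement(poly, "altitudeMode").text = "absolute"
    ring = ET.SubElement(ET.SubElement(poly, "outerBoundaryIs"),
                         "LinearRing")
    ET.SubElement(ring, "coordinates").text = " ".join(
        f"{lon},{lat},{alt}" for lon, lat, alt in coords_lonlat_alt)
    return ET.tostring(kml, encoding="unicode")

doc = polygon_kml("Cross section A-A'", [(-72.9, 43.9, 1200),
                                         (-72.8, 43.9, 1200),
                                         (-72.8, 43.8, 1200),
                                         (-72.9, 43.9, 1200)])
print("<Polygon>" in doc)   # True
```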

  17. The impact of geo-tagging on the photo industry and creating revenue streams

    NASA Astrophysics Data System (ADS)

    Richter, Rolf; Böge, Henning; Weckmann, Christoph; Schloen, Malte

    2010-02-01

    Internet geo and mapping services like Google Maps, Google Earth and Microsoft Bing Maps have reinvented the use of geographical information and have reached enormous popularity. Besides that, location technologies like GPS have become affordable and are now being integrated into many camera phones. GPS is also available for standalone cameras, either as an add-on product or integrated into the camera. These developments are enablers of new products for the photo industry, or they enhance existing products. New commercial opportunities have been identified in the areas of photo hardware, internet/software and photo finishing.
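Geo-tagging stores latitude/longitude in a photo's EXIF GPS tags as degrees/minutes/seconds plus a hemisphere reference; the conversion itself is simple arithmetic (a sketch only; actually writing the tags into an image file requires an image-metadata library):

```python
def to_exif_gps(lat, lon):
    """Convert decimal degrees to the (deg, min, sec) + hemisphere-ref
    form used by the EXIF GPSLatitude/GPSLongitude tags."""
    def dms(value):
        d = int(abs(value))
        m = int((abs(value) - d) * 60)
        s = round(((abs(value) - d) * 60 - m) * 60, 2)
        return d, m, s
    lat_ref = "N" if lat >= 0 else "S"
    lon_ref = "E" if lon >= 0 else "W"
    return dms(lat), lat_ref, dms(lon), lon_ref

print(to_exif_gps(52.52, 13.405))
# ((52, 31, 12.0), 'N', (13, 24, 18.0), 'E')
```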

  18. Relevancy 101

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Newman, Doug

    2016-01-01

    We present an overview of why relevancy is a problem, how important it is, and how we can improve it. The topic of relevancy is becoming increasingly important in Earth data discovery as our audience is attuned to the accuracy of standard search engines like Google.

  19. Analyst Performance Measures. Volume 1: Persistent Surveillance Data Processing, Storage and Retrieval

    DTIC Science & Technology

    2011-09-01

    solutions to address these important challenges... The Air Force is seeking innovative architectures to process and store massive data sets in a flexible... Google Earth, the Video LAN Client (VLC) media player, and the Environmental Systems Research Institute Corporation's (ESRI) ArcGIS product — to... Earth, Quantum GIS, VLC Media Player, NASA WorldWind, ESRI ArcGIS and many others. Open source GIS and media visualization software can also be

  20. Found in Translation: Weather, your bees live or die

    USDA-ARS?s Scientific Manuscript database

    Honey bee colonies, along with humans and the rest of life on Earth, are strongly impacted by the weather. As a species, Apis mellifera has succeeded incredibly well from the tropics to the colder regions of Europe and Asia. With help from their human keepers, honey bees now live across most of the ...

  1. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    NASA Technical Reports Server (NTRS)

    Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica; hide

    2009-01-01

    The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational-awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. To improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow end-users to easily configure RTMM applications based on their mission-specific requirements and objectives. This second-generation RTMM is being redesigned to take advantage of the Google Earth plug-in capabilities to run multiple applications in a web browser, rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture so that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the user's flexibility to choose which science data sets and support applications to view and/or use. The improvements brought about by the second-generation RTMM will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more efficiently plan, prepare and execute missions, as well as to play back and review past mission data. To paraphrase the old television commercial: RTMM doesn't make the airborne science, it makes the airborne science easier.

  2. A New More Accurate Calibration for TIMED/GUVI

    NASA Astrophysics Data System (ADS)

    Schaefer, R. K.; Aiello, J.; Wolven, B. C.; Paxton, L. J.; Romeo, G.; Zhang, Y.

    2017-12-01

    The Global UltraViolet Imager (GUVI - http://guvi.jhuapl.edu) on NASA's TIMED spacecraft has the longest continuous set of observations of the Earth's ionosphere and thermosphere, spanning more than one solar cycle (2001-2017). As such, it represents an important dataset for understanding the dynamics of the ionosphere-thermosphere system. The entire dataset has been reprocessed and released as a new version (13) of GUVI data products. This is a complete re-examination of the calibration elements, including better-calibrated radiances, better geolocation, and better background subtraction. Details can be found on the GUVI website: http://guvitimed.jhuapl.edu/guvi-Calib_Prod The radiances (except for the LBH long band) in version 13 are within 10% of the original archival radiances, so most of the derived products are little changed from their original versions. The LBH long band was redefined in the on-board instrument color tables on Nov. 2, 2004 to better limit contamination from nitric oxide emission, but this was not updated in ground processing until now. Version 13 LBH long radiances are 19% smaller than the old calibrated products for data after 11/2/2004. GUVI auroral products are the only ones that use LBHL; the ratio (LBH long)/(LBH short) is used to gauge the amount of intervening oxygen absorption. We will show several examples of the difference between new and old auroral products. Overall, version 13 represents a big improvement in the calibration, geolocation, and background subtraction of the GUVI UV data products, allowing for the cleanest UV data for analysis of the ionosphere, thermosphere and aurora. An updated "Using GUVI Data Tutorial" will be available from the GUVI webpage to help users navigate to the data they need. Data products are displayed as daily summaries and Google Earth files that can be browsed through the Cesium tool on the GUVI website, or the image files can be downloaded and viewed in the Google Earth app. The image below shows gridded 135.6 nm radiances from March 27, 2003 displayed in Google Earth.

  3. Tracing Forest Change through 40 Years on Two Continents with the BULC Algorithm and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Cardille, J. A.

    2015-12-01

    With the opening of the Landsat archive, researchers have a vast new data source teeming with imagery and potential. Beyond Landsat, data from other sensors is newly available as well: these include ALOS/PALSAR, Sentinel-1 and -2, MERIS, and many more. Google Earth Engine, developed to organize and provide analysis tools for these immense data sets, is an ideal platform for researchers trying to sift through huge image stacks. It offers nearly unlimited processing power and storage with a straightforward programming interface. Yet labeling forest change through time remains challenging given the current state of the art for interpreting remote sensing image sequences. Moreover, combining data from very different image platforms remains quite difficult. To address these challenges, we developed the BULC algorithm (Bayesian Updating of Land Cover), designed for the continuous updating of land-cover classifications through time in large data sets. The algorithm ingests data from any of the wide variety of earth-resources sensors; it maintains a running estimate of land-cover probabilities and the most probable class at all time points along a sequence of events. Here we compare BULC results from two study sites that witnessed considerable forest change in the last 40 years: the Pacific Northwest of the United States and the Mato Grosso region of Brazil. In Brazil, we incorporated rough classifications from more than 100 images of varying quality, mixing imagery from more than 10 different sensors. In the Pacific Northwest, we used BULC to identify forest changes due to logging and urbanization from 1973 to the present. Both regions had classification sequences that were better than many of the component days, effectively ignoring clouds and other unwanted signal while fusing the information contained on several platforms. 
As we leave remote sensing's data-poor era and enter a period with multiple looks at Earth's surface from multiple sensors over a short period of time, this algorithm may help to sift through images of varying quality in Google Earth Engine to extract the most useful information for mapping.
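The Bayesian updating at the heart of such an algorithm can be illustrated with a minimal per-pixel sketch: a probability vector multiplied by the likelihood of each new, possibly noisy classification. This is a generic Bayesian land-cover update under a simple symmetric confusion model, not the authors' BULC code:

```python
def bulc_update(prior, observed_class, accuracy):
    """One Bayesian update of a pixel's per-class probabilities.

    `accuracy` is the assumed chance the incoming classification labels
    the pixel correctly; the remaining probability mass is spread evenly
    over the other classes (a simple symmetric confusion model)."""
    n = len(prior)
    wrong = (1.0 - accuracy) / (n - 1)
    post = [p * (accuracy if k == observed_class else wrong)
            for k, p in enumerate(prior)]
    total = sum(post)
    return [p / total for p in post]

# A pixel starts undecided among 3 classes; noisy looks mostly say class 1.
p = [1 / 3, 1 / 3, 1 / 3]
for obs in [1, 1, 0, 1]:
    p = bulc_update(p, obs, accuracy=0.8)
print(p.index(max(p)))   # 1: the running estimate settles on class 1
```

A single contradictory observation (the `0` above) perturbs but does not overturn the accumulated evidence, which is how such a scheme rides out clouds and other unwanted signal.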

  4. Large Scale Crop Mapping in Ukraine Using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Shelestov, A.; Lavreniuk, M. S.; Kussul, N.

    2016-12-01

    There are no globally available high-resolution satellite-derived crop-specific maps at present; only coarse-resolution imagery (> 250 m spatial resolution) has been utilized to derive global cropland extent. In 2016 we are going to carry out a country-level demonstration of Sentinel-2 use for crop classification in Ukraine within the ESA Sen2-Agri project. However, optical imagery can be contaminated by cloud cover, which makes it difficult to acquire imagery in an optimal time range to discriminate certain crops. Thanks to the Copernicus program, a large amount of Sentinel-1 SAR data at high spatial resolution has been freely available for Ukraine since 2015, allowing us to use time series of SAR data for crop classification. Our experiment for one administrative region in 2015 showed much higher crop classification accuracy with SAR data than with optical-only time series [1, 2]. Therefore, in 2016, within the Google Earth Engine Research Award, we use SAR data together with optical data for large-area crop mapping (the entire territory of Ukraine) using the cloud computing capabilities available in Google Earth Engine (GEE). This study compares different classification methods for crop mapping over the whole territory of Ukraine using data and algorithms from GEE. Classification performance is assessed using overall classification accuracy, Kappa coefficients, and user's and producer's accuracies. Crop areas from the derived classification maps are also compared to official statistics [3]. [1] S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, DOI: 10.1109/JSTARS.2015.2454297. [2] N. Kussul, S. Skakun, A. Shelestov, O. Kussul, "The use of satellite SAR imagery to crop classification in Ukraine within JECAM project," IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1497-1500, 13-18 July 2014, Quebec City, Canada. [3] F.J. Gallego, N. Kussul, S. Skakun, O. Kravchenko, A. Shelestov, O. Kussul, "Efficiency assessment of using satellite data for crop area estimation in Ukraine," International Journal of Applied Earth Observation and Geoinformation, vol. 29, pp. 22-30, 2014.
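The reported metrics, overall accuracy and Cohen's Kappa, are both derived from a confusion matrix; a minimal sketch with a hypothetical two-class assessment:

```python
def accuracy_metrics(confusion):
    """Overall accuracy and Cohen's Kappa from a square confusion
    matrix (rows = reference class, columns = predicted class)."""
    n = sum(sum(row) for row in confusion)
    diag = sum(confusion[i][i] for i in range(len(confusion)))
    overall = diag / n
    # Chance agreement from the row and column marginals.
    pe = sum(sum(confusion[i]) * sum(row[i] for row in confusion)
             for i in range(len(confusion))) / n ** 2
    kappa = (overall - pe) / (1 - pe)
    return overall, kappa

cm = [[40, 5], [10, 45]]   # hypothetical 2-class crop-map assessment
oa, k = accuracy_metrics(cm)
print(round(oa, 3), round(k, 3))   # 0.85 0.7
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy when class areas are unbalanced.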

  5. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Technical Reports Server (NTRS)

    Markert, Kel; Ashmall, William; Johnson, Gary; Saah, David; Mollicone, Danilo; Diaz, Alfonso Sanchez-Paus; Anderson, Eric; Flores, Africa; Griffin, Robert

    2017-01-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.
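A GeoDash-style widget ultimately reduces to a small structured description that the front end renders against a GEE collection; the sketch below uses illustrative field names, not the actual CEO/GeoDash schema:

```python
def make_widget(widget_type, gee_collection, start, end, band=None):
    """Build a widget description for a dashboard front end.
    Field names here are illustrative, not the CEO schema."""
    if widget_type not in {"imagery", "timeseries", "statistics"}:
        raise ValueError(f"unknown widget type: {widget_type}")
    return {
        "type": widget_type,
        "collection": gee_collection,   # e.g. a Landsat collection id
        "dateRange": [start, end],
        "band": band,
    }

w = make_widget("timeseries", "LANDSAT/LT05/C02/T1_L2",
                "1984-01-01", "2011-12-31", band="SR_B4")
print(w["type"])   # timeseries
```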

  6. GeoDash: Assisting Visual Image Interpretation in Collect Earth Online by Leveraging Big Data on Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Markert, K. N.; Ashmall, W.; Johnson, G.; Saah, D. S.; Anderson, E.; Flores Cordova, A. I.; Díaz, A. S. P.; Mollicone, D.; Griffin, R.

    2017-12-01

    Collect Earth Online (CEO) is a free and open online implementation of the FAO Collect Earth system for collaboratively collecting environmental data through the visual interpretation of Earth observation imagery. The primary collection mechanism in CEO is human interpretation of land surface characteristics in imagery served via Web Map Services (WMS). However, interpreters may not have enough contextual information to classify samples by only viewing the imagery served via WMS, be they high resolution or otherwise. To assist in the interpretation and collection processes in CEO, SERVIR, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries, developed the GeoDash system, an embedded and critical component of CEO. GeoDash leverages Google Earth Engine (GEE) by allowing users to set up custom browser-based widgets that pull from GEE's massive public data catalog. These widgets can be quick looks of other satellite imagery, time series graphs of environmental variables, and statistics panels of the same. Users can customize widgets with any of GEE's image collections, such as the historical Landsat collection with data available since the 1970s, select date ranges, image stretch parameters, graph characteristics, and create custom layouts, all on-the-fly to support plot interpretation in CEO. This presentation focuses on the implementation and potential applications, including the back-end links to GEE and the user interface with custom widget building. GeoDash takes large data volumes and condenses them into meaningful, relevant information for interpreters. While designed initially with national and global forest resource assessments in mind, the system will complement disaster assessments, agriculture management, project monitoring and evaluation, and more.

  7. Results of Prospecting of Impact Craters in Morocco

    NASA Astrophysics Data System (ADS)

    Chaabout, S.; Chennaoui Aoudjehane, H.; Reimold, W. U.; Baratoux, D.

    2014-09-01

    This work is based on the use of satellite images from Google Earth and Yahoo Maps scenes; we examined the surface of our country to locate structures with a circular morphology that could potentially be impact craters.

  8. Cloud-based calculators for fast and reliable access to NOAA's geomagnetic field models

    NASA Astrophysics Data System (ADS)

    Woods, A.; Nair, M. C.; Boneh, N.; Chulliat, A.

    2017-12-01

    While the Global Positioning System (GPS) provides accurate point locations, it does not provide pointing directions. Therefore, the absolute directional information provided by the Earth's magnetic field is of primary importance for navigation and for the pointing of technical devices such as aircraft, satellites and, lately, mobile phones. The major magnetic sources that affect compass-based navigation are the Earth's core, its magnetized crust, and the electric currents in the ionosphere and magnetosphere. The NOAA/CIRES Geomagnetism group (ngdc.noaa.gov/geomag/) develops and distributes models that describe all these important sources to aid navigation. Our geomagnetic models are used on a variety of platforms, including airplanes, ships, submarines and smartphones. While the magnetic field from the Earth's core can be described with relatively few parameters and is suitable for offline computation, the magnetic sources from the Earth's crust, ionosphere and magnetosphere require either significant computational resources or real-time capabilities, and are not suitable for offline calculation. This is especially important for small navigational devices or embedded systems, where computational resources are limited. Recognizing the need for fast and reliable access to our geomagnetic field models, we developed cloud-based application program interfaces (APIs) for NOAA's ionospheric and magnetospheric magnetic field models. In this paper we describe the need for reliable magnetic calculators, the challenges faced in running geomagnetic field models in the cloud in real time, and the feedback from our user community. We discuss lessons learned harvesting and validating the data which powers our cloud services, as well as our strategies for maintaining near-real-time service, including load balancing, real-time monitoring, and instance cloning.
We will also briefly talk about the progress we achieved on NOAA's Big Earth Data Initiative (BEDI) funded project to develop API interface to our Enhanced Magnetic Model (EMM).
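To illustrate why the core field is cheap to evaluate offline, a centered-dipole approximation (far cruder than NOAA's spherical-harmonic models, but needing only two constants) already captures the gross field strength:

```python
import math

def dipole_field_nT(geomag_lat_deg: float, altitude_km: float = 0.0) -> float:
    """Magnitude of Earth's field in a centered-dipole approximation:
    B = B0 * (Re / r)**3 * sqrt(1 + 3 * sin(lat)**2), with B0 the
    equatorial surface strength (~30,000 nT) and Re Earth's radius."""
    B0 = 3.0e4          # nT, approximate equatorial surface field
    Re = 6371.2         # km, mean Earth radius
    r = Re + altitude_km
    lat = math.radians(geomag_lat_deg)
    return B0 * (Re / r) ** 3 * math.sqrt(1 + 3 * math.sin(lat) ** 2)

print(round(dipole_field_nT(0.0)))    # 30000 nT at the magnetic equator
print(round(dipole_field_nT(90.0)))   # 60000 nT, twice that, at the pole
```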

  9. The Earth Data Analytic Services (EDAS) Framework

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
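A WPS-style Execute call of the kind such a service accepts is, at bottom, an operation, a dataset reference, and a domain packed into a request. The sketch below composes such a URL; the endpoint, operation id, and parameter layout are all illustrative assumptions, not the actual EDAS API:

```python
import json
from urllib.parse import urlencode

def build_execute_request(base_url, operation, dataset, variable, bbox):
    """Compose a GET-style Execute request for a hypothetical
    WPS-like analytics service; all names are illustrative only."""
    datainputs = json.dumps({
        "operation": operation,                     # e.g. a temporal average
        "input": {"uri": dataset, "name": variable},
        "domain": {"lat": bbox[0:2], "lon": bbox[2:4]},
    })
    query = urlencode({
        "service": "WPS",
        "request": "Execute",
        "datainputs": datainputs,
    })
    return f"{base_url}?{query}"

url = build_execute_request("https://example.org/wps", "ave",
                            "collection://merra2", "tas",
                            [-90, 90, -180, 180])
print(url.startswith("https://example.org/wps?service=WPS"))   # True
```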

  10. Enabling Tools and Methods for International, Inter-disciplinary and Educational Collaboration

    NASA Astrophysics Data System (ADS)

    Robinson, E. M.; Hoijarvi, K.; Falke, S.; Fialkowski, E.; Kieffer, M.; Husar, R. B.

    2008-05-01

    In the past, collaboration took place in tightly-knit workgroups whose members had direct connections to each other. Such collaboration was confined to small workgroups and person-to-person communication. Recent developments on the Internet foster virtual workgroups and organizations where dynamic, 'just-in-time' collaboration can take place at a much larger scale. The emergence of virtual workgroups has strongly influenced international, interdisciplinary and educational activities. In this paper we present an array of enabling tools and methods that incorporate new technologies, including web services, software mashups, tag-based structuring and searching, and wikis for collaborative writing and content organization. Large monolithic, 'do-it-all' software tools are giving way to web-service modules combined through service chaining. Application software can now be created using a Service Oriented Architecture (SOA). In the air quality community, data providers and users are distributed in space and time, creating barriers to data access. Exposing the data on the Internet lessens these space and time barriers. The federated data system DataFed, developed at Washington University, accesses data from autonomous, distributed providers. Through data "wrappers", DataFed provides uniform, standards-based access services to heterogeneous, distributed data. Service orientation not only lowers the entry resistance for service providers, but also allows the creation of user-defined applications and mashups. For example, Google Earth's open API has allowed many groups to mash their own content with Google Earth. Ad hoc tagging gives a rich description of Internet resources, but has the disadvantage of providing a fuzzy schema. The semantic uniformity of Internet resources can be improved by controlled tagging, which applies a consistent namespace and tag combinations to diverse objects. One example of this is the photo-sharing web application Flickr. Just like data, photos exposed through the Internet can be reused in ways unknown and unanticipated by the provider. For air quality applications, Flickr allowed a rich collection of images of forest-fire smoke, wind-blown dust and haze events to be tagged with controlled tags and used for evaluating subtle features of the events. Wikis, originally used just for collaboratively writing and discussing documents, now also serve as social-software workflow managers. For air quality data, wikis provide the means to collaboratively create rich metadata. A wiki becomes a virtual meeting place to discuss ideas before a workshop or conference, display tagged Internet resources, and collaboratively work on documents. Wikis are also useful in the classroom: in class projects, the wiki displays harvested resources, maintains collaborative documents and discussions, and serves as the organizational memory for the project.

  11. Be-safe travel, a web-based geographic application to explore safe-route in an area

    NASA Astrophysics Data System (ADS)

    Utamima, Amalia; Djunaidy, Arif

    2017-08-01

    In large cities in developing countries, various forms of criminality are often found. For instance, the most prominent crimes in Surabaya, Indonesia are the '3C' crimes: theft with violence (curas), aggravated theft (curat), and motor vehicle theft (curanmor). 3C cases most often occur on highways and in residential areas. Therefore, newcomers to an area should be aware of these kinds of crimes. Route-planning systems such as Google Maps consider only the shortest distance when calculating the optimal route. The selection of the optimal path in this study considers not only the shortest distance but also another factor, namely the security level. This research addresses the need for an application that recommends the safest roads for vehicle passengers driving in an area. We propose Be-Safe Travel, a web-based application using the Google API that can be accessed by people who drive in an area but lack knowledge of which routes are safe from crime. Be-Safe Travel is useful not only for newcomers, but also for couriers delivering valuable goods who want to take the safest streets.
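The central idea, penalizing road segments by a security score before running a shortest-path search, can be sketched with plain Dijkstra (the toy graph, risk scores, and cost formula are illustrative assumptions, not the authors' model):

```python
import heapq

def safest_route(graph, start, goal, risk_weight=1.0):
    """Dijkstra over edges (neighbor, distance_km, risk), where risk is
    a 0..1 crime score; cost = distance * (1 + risk_weight * risk)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, km, risk in graph.get(u, []):
            nd = d + km * (1 + risk_weight * risk)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Toy network: the direct road A-B is short but risky; the detour is safe.
g = {"A": [("B", 2.0, 0.9), ("C", 1.5, 0.1)],
     "C": [("B", 1.5, 0.1)],
     "B": []}
print(safest_route(g, "A", "B", risk_weight=2.0))   # ['A', 'C', 'B']
```

Setting `risk_weight=0` recovers an ordinary shortest-distance route, so one parameter spans the trade-off between speed and safety.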

  12. Northeast India Helminth Parasite Information Database (NEIHPID): Knowledge Base for Helminth Parasites

    PubMed Central

    Debnath, Manish; Kharumnuid, Graciously; Thongnibah, Welfrank; Tandon, Veena

    2016-01-01

    Most metazoan parasites that invade vertebrate hosts belong to three phyla: Platyhelminthes, Nematoda and Acanthocephala. Many of the parasitic members of these phyla are collectively known as helminths and are causative agents of many debilitating, deforming and lethal diseases of humans and animals. The North-East India Helminth Parasite Information Database (NEIHPID) project aimed to document and characterise the spectrum of helminth parasites in the north-eastern region of India, providing host, geographical distribution, diagnostic characters and image data. The morphology-based taxonomic data are supplemented with information on DNA sequences of nuclear, ribosomal and mitochondrial gene marker regions that aid in parasite identification. In addition, the database contains raw next generation sequencing (NGS) data for 3 foodborne trematode parasites, with more to follow. The database will also provide study material for students interested in parasite biology. Users can search the database at various taxonomic levels (phylum, class, order, superfamily, family, genus, and species), or by host, habitat and geographical location. Specimen collection locations are noted as co-ordinates in a MySQL database and can be viewed on Google Maps, using Google Maps JavaScript API v3. The NEIHPID database has been made freely available at http://nepiac.nehu.ac.in/index.php PMID:27285615

  13. Northeast India Helminth Parasite Information Database (NEIHPID): Knowledge Base for Helminth Parasites.

    PubMed

    Biswal, Devendra Kumar; Debnath, Manish; Kharumnuid, Graciously; Thongnibah, Welfrank; Tandon, Veena

    2016-01-01

    Most metazoan parasites that invade vertebrate hosts belong to three phyla: Platyhelminthes, Nematoda and Acanthocephala. Many of the parasitic members of these phyla are collectively known as helminths and are causative agents of many debilitating, deforming and lethal diseases of humans and animals. The North-East India Helminth Parasite Information Database (NEIHPID) project aimed to document and characterise the spectrum of helminth parasites in the north-eastern region of India, providing host, geographical distribution, diagnostic characters and image data. The morphology-based taxonomic data are supplemented with information on DNA sequences of nuclear, ribosomal and mitochondrial gene marker regions that aid in parasite identification. In addition, the database contains raw next generation sequencing (NGS) data for three foodborne trematode parasites, with more to follow. The database will also provide study material for students interested in parasite biology. Users can search the database at various taxonomic levels (phylum, class, order, superfamily, family, genus, and species), or by host, habitat and geographical location. Specimen collection locations are noted as co-ordinates in a MySQL database and can be viewed on Google Maps using the Google Maps JavaScript API v3. The NEIHPID database has been made freely available at http://nepiac.nehu.ac.in/index.php.

  14. An application of traveling salesman problem using the improved genetic algorithm on android google maps

    NASA Astrophysics Data System (ADS)

    Narwadi, Teguh; Subiyanto

    2017-03-01

    The Travelling Salesman Problem (TSP) is one of the best-known NP-hard problems, which means that no exact algorithm is known to solve it in polynomial time. This paper presents a new application of a genetic algorithm combined with a local search technique to solve the TSP. For the local search technique, an iterative hill climbing method is used. The system is implemented on the Android OS, because Android is a mobile platform now widely used around the world. It is also integrated with the Google API to obtain the geographical locations of the cities and the distances between them, and to display the route. We experimented to test the behavior of the application: the hybrid genetic algorithm (HGA) application is compared with a simple GA application on 5 samples of cities from Central Java, Indonesia, with different numbers of cities. The experimental results show that the average HGA solution is better than the simple GA's in 5 tests out of 5 (100%). The results show that the hybrid genetic algorithm outperforms the genetic algorithm, especially on problems of higher complexity.
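    The hybrid can be sketched as a standard genetic algorithm whose offspring are polished by iterative hill climbing; the distance matrix, parameters, and operators below are illustrative, not the paper's exact configuration:

```python
import itertools, random

def tour_len(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def hill_climb(tour, d):
    """Local search: keep applying improving pairwise swaps until none helps."""
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(1, len(tour)), 2):
            cand = tour[:]
            cand[i], cand[j] = cand[j], cand[i]
            if tour_len(cand, d) < tour_len(tour, d):
                tour, improved = cand, True
    return tour

def hga(d, pop_size=30, gens=50, seed=1):
    """Hybrid GA: selection + segment-reversal mutation + hill climbing."""
    random.seed(seed)
    n = len(d)
    pop = [[0] + random.sample(range(1, n), n - 1) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_len(t, d))
        parents = pop[:pop_size // 2]          # keep the fitter half
        children = []
        for p in parents:
            c = p[:]
            i, j = sorted(random.sample(range(1, n), 2))
            c[i:j] = reversed(c[i:j])          # mutation: segment reversal
            children.append(hill_climb(c, d))  # local search on offspring
        pop = parents + children
    return min(pop, key=lambda t: tour_len(t, d))

# Hypothetical symmetric distances between 5 cities
d = [[0, 2, 9, 10, 7],
     [2, 0, 6, 4, 3],
     [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6],
     [7, 3, 5, 6, 0]]
best = hga(d)
```

    In the deployed app the distance matrix would come from the Google API rather than being hard-coded, and the tour would be drawn on the map.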

  15. A Virtual Tour of the 1868 Hayward Earthquake in Google EarthTM

    NASA Astrophysics Data System (ADS)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google EarthTM software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google EarthTM layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  16. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a "Big Data" problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with potential to apply the platform at a larger scale (e.g. country level) and to multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km² and 1.0 M ha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing complex workflows of satellite data processing required for large scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark classifier using a neural network approach developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to the remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree and random forest classifiers available in GEE.
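    The per-classifier accuracy assessment reduces to error (confusion) matrices, which GEE exposes through its errorMatrix utilities. A pure-Python sketch of the usual summary statistics, with an invented three-class matrix:

```python
def overall_accuracy(m):
    """Fraction of correctly classified samples (diagonal / total)."""
    total = sum(sum(row) for row in m)
    return sum(m[i][i] for i in range(len(m))) / total

def kappa(m):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(m)
    total = sum(sum(row) for row in m)
    po = sum(m[i][i] for i in range(n)) / total
    pe = sum(sum(m[i]) * sum(m[j][i] for j in range(n))
             for i in range(n)) / total ** 2
    return (po - pe) / (1 - pe)

# Invented confusion matrix: rows = reference, cols = classified
# (classes: cropland, forest, water)
m = [[50, 5, 1],
     [4, 60, 2],
     [0, 3, 40]]
print(round(overall_accuracy(m), 3))  # 0.909
print(round(kappa(m), 3))             # 0.861
```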

  17. Assessing and Minimizing Adversarial Risk in a Nuclear Material Transportation Network

    DTIC Science & Technology

    2013-09-01

    Master's Thesis, report date 09-27-2013. ... U.S. as of July 2013 ... Figure A.1: Google Earth routing from Areva to Arkansas Nuclear... Uranium ore is mined or removed from the earth in a leaching process. Conversion: Triuranium octoxide (U3O8, "yellowcake") is converted into ura

  18. Using Google Earth for Submarine Operations at Pavilion Lake

    NASA Astrophysics Data System (ADS)

    Deans, M. C.; Lees, D. S.; Fong, T.; Lim, D. S.

    2009-12-01

    During the July 2009 Pavilion Lake field test, we supported submarine "flight" operations using Google Earth. The Intelligent Robotics Group at NASA Ames has experience with ground data systems for NASA missions, earth analog field tests, disaster response, and the Gigapan camera system. Leveraging this expertise and existing software, we put together a set of tools to support sub tracking and mapping, called the "Surface Data System." This system supports flight planning, real time flight operations, and post-flight analysis. For planning, we make overlays of the regional bedrock geology, sonar bathymetry, and sonar backscatter maps that show geology, depth, and structure of the bottom. Placemarks show the mooring locations for start and end points. Flight plans are shown as polylines with icons for waypoints. Flight tracks and imagery from previous field seasons are embedded in the map for planning follow-on activities. These data provide context for flight planning. During flights, sub position is updated every 5 seconds from the nav computer on the chase boat. We periodically update tracking KML files and refresh them with network links. A sub icon shows current location of the sub. A compass rose shows bearings to indicate heading to the next waypoint. A "Science Stenographer" listens on the voice loop and transcribes significant observations in real time. Observations called up to the surface immediately appear on the map as icons with date, time, position, and what was said. After each flight, the science back room immediately has the flight track and georeferenced notes from the pilots. We add additional information in post-processing. The submarines record video continuously, with "event" timestamps marked by the pilot. We cross-correlate the event timestamps with position logs to geolocate events and put a preview image and compressed video clip into the map. 
Animated flight tracks are also generated, showing timestamped position and providing timelapse playback of the flight. Neogeography tools are increasing in popularity and offer an excellent platform for geoinformatics. The scientists on the team are already familiar with Google Earth, eliminating up-front training on new tools. The flight maps and archived data are available immediately and in a usable format. Google Earth provides lots of measurement tools, annotation tools, and other built-in functions that we can use to create and analyze the map. All of this information is saved to a shared filesystem so that everyone on the team has access to all of the same map data. After the field season, the map data will be used by the team to analyse and correlate information from across the lake and across different flights to support their research, and to plan next year's activities.
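    The periodically refreshed tracking layer is a stock KML pattern: a small NetworkLink file that Google Earth reloads on an interval, pointing at a KML file the ground data system rewrites. A sketch with a placeholder URL and an invented sub position:

```python
import xml.etree.ElementTree as ET

# NetworkLink that Google Earth refetches every 5 seconds
NETWORK_LINK = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>Sub position</name>
    <Link>
      <href>http://example.com/tracking/sub_position.kml</href>
      <refreshMode>onInterval</refreshMode>
      <refreshInterval>5</refreshInterval>
    </Link>
  </NetworkLink>
</kml>"""

def placemark_kml(name, lat, lon, when):
    """The target file the link points at: one timestamped placemark.
    Note KML coordinates are ordered lon,lat,alt."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <TimeStamp><when>{when}</when></TimeStamp>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

doc = placemark_kml("Sub", 50.865, -121.742, "2009-07-20T18:30:00Z")
root = ET.fromstring(doc)      # both documents are well-formed XML
ET.fromstring(NETWORK_LINK)
```

    The ground data system only has to overwrite sub_position.kml from the 5-second nav feed; the refresh behaviour lives entirely in the NetworkLink.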

  19. Power Plants Likely Covered by the Mercury and Air Toxics Standards (MATS)

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) has proposed Mercury and Air Toxics Standards (MATS) for power plants to limit mercury, acid gases and other toxic pollution from power plants. Using Google Earth, this page locates power plants in your state.

  20. A Geospatial Scavenger Hunt

    ERIC Educational Resources Information Center

    Martinez, Adriana E.; Williams, Nikki A.; Metoyer, Sandra K.; Morris, Jennifer N.; Berhane, Stephen A.

    2009-01-01

    With the use of technology such as Global Positioning System (GPS) units and Google Earth for a simple-machine scavenger hunt, you will transform a standard identification activity into an exciting learning experience that motivates students, incorporates practical skills in technology, and enhances students' spatial-thinking skills. In the…

  1. Kinematics with the assistance of smartphones: Measuring data via GPS - Visualizing data with Google Earth

    NASA Astrophysics Data System (ADS)

    Gabriel, Patrik; Backhaus, Udo

    2013-04-01

    Nearly every smartphone is now GPS capable. The widespread use of GPS navigation has developed alongside less expensive hardware and user-friendly software interfaces, which may help to bring scientific research and teaching closer to real life.

  2. Teaching Genocide through GIS: A Transformative Approach

    ERIC Educational Resources Information Center

    Fitchett, Paul G.; Good, Amy J.

    2012-01-01

    The utilization of Geographical Information Systems (GIS) and geobrowsers (Google Earth) has become increasingly prevalent in the study of genocide. These applications offer teachers and students the opportunity to analyze historical and contemporary genocidal acts from a critical geographic perspective in which the confluence of historical…

  3. BErkeley Atmospheric CO2 Network (BEACON) - Bringing Measurements of CO2 Emissions to a School Near You

    NASA Astrophysics Data System (ADS)

    Teige, V. E.; Havel, E.; Patt, C.; Heber, E.; Cohen, R. C.

    2011-12-01

    The University of California at Berkeley, in collaboration with the Chabot Space and Science Center, describes a set of educational programs, workshops, and exhibits based on a multi-node greenhouse gas and air quality monitoring network being deployed over Oakland, California. Examining raw numerical data using highly engaging and effective geo-data visualization tools like Google Earth can make the science come alive for students, and provide a hook for drawing them into deeper investigations. The Climate Science Investigations teacher workshop at the Chabot Space and Science Center will make use of Google Earth, Excel, and other geo-data visualization tools to step students through the process from data acquisition to discovery. Using multiple data sources, including output from the BErkeley Atmospheric CO2 Network (BEACON) project, participants will be encouraged to explore a variety of different modes of data display toward producing a unique, and ideally insightful, illumination of the data.

  4. Fuzzy B-spline optimization for urban slum three-dimensional reconstruction using ENVISAT satellite data

    NASA Astrophysics Data System (ADS)

    Marghany, Maged

    2014-06-01

    A critical challenge in urban areas is slums. They are considered a source of crime and disease due to poor-quality housing, unsanitary conditions, poor infrastructure and a lack of occupancy security. The poor in dense urban slums are the most vulnerable to infection due to (i) inadequate and restricted access to safe drinking water and sufficient quantities of water for personal hygiene; (ii) the lack of removal and treatment of excreta; and (iii) the lack of removal of solid waste. This study investigates the capability of ENVISAT ASAR satellite and Google Earth data for three-dimensional (3-D) reconstruction of urban slums in developing countries such as Egypt. The main objective of this work is to apply an automatic 3-D detection algorithm for urban slums to ENVISAT ASAR and Google Earth images acquired over Cairo, Egypt, using a fuzzy B-spline algorithm. The results show that the fuzzy algorithm is the best indicator for chaotic urban slums, as it can discriminate them from their surrounding environment. The combination of fuzzy logic and B-splines was then used to reconstruct the urban slums in 3-D. The results show that urban slums, the road network, and infrastructure are perfectly discriminated. It can therefore be concluded that the fuzzy algorithm is an appropriate algorithm for automatic detection of chaotic urban slums in ENVISAT ASAR and Google Earth data.
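    The abstract does not spell out the fuzzy B-spline algorithm; as a sketch of the B-spline half only, the uniform cubic B-spline weights can smooth a noisy elevation profile (the fuzzy membership weighting is omitted, and the profile values are invented):

```python
def cubic_bspline_smooth(z):
    """Uniform cubic B-spline filter: each sample becomes a weighted
    average of its neighbourhood with the 1/6 * [1, 4, 1] basis weights
    of the cubic B-spline evaluated at the knots. Endpoints are clamped."""
    n = len(z)
    out = []
    for i in range(n):
        zm = z[max(i - 1, 0)]
        zp = z[min(i + 1, n - 1)]
        out.append((zm + 4 * z[i] + zp) / 6)
    return out

# Invented noisy elevation samples (metres) along an urban profile
profile = [10.0, 10.4, 9.8, 10.2, 10.0]
print([round(v, 2) for v in cubic_bspline_smooth(profile)])
```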

  5. Cloud-Based Computational Tools for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Arendt, A. A.; Fatland, R.; Howe, B.

    2015-12-01

    Earth scientists are increasingly required to think across disciplines and utilize a wide range of datasets in order to solve complex environmental challenges. Although significant progress has been made in distributing data, researchers must still invest heavily in developing computational tools to accommodate their specific domain. Here we document our development of lightweight computational data systems aimed at enabling rapid data distribution, analytics and problem solving tools for Earth science applications. Our goal is for these systems to be easily deployable, scalable and flexible to accommodate new research directions. As an example we describe "Ice2Ocean", a software system aimed at predicting runoff from snow and ice in the Gulf of Alaska region. Our backend components include relational database software to handle tabular and vector datasets, Python tools (NumPy, pandas and xray) for rapid querying of gridded climate data, and an energy and mass balance hydrological simulation model (SnowModel). These components are hosted in a cloud environment for direct access across research teams, and can also be accessed via API web services using a REST interface. This API is a vital component of our system architecture, as it enables quick integration of our analytical tools across disciplines, and can be accessed by any existing data distribution centers. We will showcase several data integration and visualization examples to illustrate how our system has expanded our ability to conduct cross-disciplinary research.
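    Access "via API web services using a REST interface" typically means parameterized HTTP GET requests returning JSON. The endpoint and parameters below are invented for illustration, not Ice2Ocean's actual interface:

```python
from urllib.parse import urlencode

BASE = "https://example.org/ice2ocean/api/v1"  # hypothetical host

def runoff_query_url(basin, start, end):
    """Build a REST query URL for modeled runoff over a date range."""
    params = {"basin": basin, "start": start, "end": end, "format": "json"}
    return f"{BASE}/runoff?{urlencode(params)}"

url = runoff_query_url("mendenhall", "2015-06-01", "2015-06-30")
# The JSON payload would then be fetched and parsed, e.g.:
# import json, urllib.request
# data = json.load(urllib.request.urlopen(url))
```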

  6. A detailed view of Earth across space and time: our changing planet through a 32-year global Landsat and Sentinel-2 timelapse video

    NASA Astrophysics Data System (ADS)

    Herwig, C.

    2017-12-01

    The Landsat program offers an unparalleled record of our changing planet, with satellites that have been observing the Earth from 1972 to the present day. However, clouds, seasonal variation, and technical challenges around access to large volumes of data make it difficult for researchers and the public to understand global and regional scale changes across time through the planetary dataset. Earth Timelapse is a global, zoomable video that has helped revolutionize how users - millions of whom had never been able to use Landsat data before - monitor and understand a changing planet. It is made from 33 cloud-free annual mosaics, one for each year from 1984 to 2016, which are made interactively explorable by Carnegie Mellon University CREATE Lab's Time Machine library, a technology for creating and viewing zoomable and pannable timelapses over space and time. Using Earth Engine, we combined over 5 million satellite images acquired over the past three decades by 5 different satellites. The majority of the images come from Landsat, a joint USGS/NASA Earth observation program that has observed the Earth since the 1970s. For 2015 and 2016, we combined Landsat 8 imagery with imagery from Sentinel-2A, part of the European Commission and European Space Agency's Copernicus Earth observation program. Along with the interactive desktop Timelapse application, we created a 200-video YouTube playlist highlighting areas across the world exhibiting change in the dataset. Earth Timelapse is an example that illustrates the power of Google Earth Engine's cloud-computing platform, which enables users such as scientists, researchers, and journalists to detect changes, map trends, and quantify differences on the Earth's surface using Google's computational infrastructure and the multi-petabyte Earth Engine data catalog. 
Earth Timelapse also highlights the value of data visualization in communicating with non-scientific audiences with varied technical skills and internet connectivity. Timelapse videos - as a global, zoomable and explorable web map across time, as well as curated locations hosted on YouTube - can be effective at conveying large and medium scale land surface changes over time to diverse audiences.

  7. Get Connected

    ERIC Educational Resources Information Center

    Horton, Jessica; Hagevik, Rita; Adkinson, Bennett; Parmly, Jilynn

    2013-01-01

    Technology can be both a blessing and a curse in the classroom. Although technology can provide greater access to information and increase student engagement, if screen time replaces time spent outside, then students stand to lose awareness and connectivity to the surrounding natural environment. This article describes how Google Earth can foster…

  8. Cornerstone: Foundational Models and Services for Integrated Battle Planning

    DTIC Science & Technology

    2012-06-01

    We close with a summary of future planned research. 3 Cross-Domain Knowledge Representation One of the primary reasons behind the...mission data using Google Earth to display the results of a Keyhole Markup Language (KML) mission data translator. Finally, we successfully ran Thread 1

  9. Using USNO's API to Obtain Data

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.; Pozniak, Daniel; Punnoose, Tarun

    2015-01-01

    The U.S. Naval Observatory (USNO) is in the process of modernizing its publicly available web services into APIs (Application Programming Interfaces). Services configured as APIs offer greater flexibility to the user and allow greater usage. Depending on the particular service, users who implement our APIs will receive either a PNG (Portable Network Graphics) image or data in JSON (JavaScript Object Notation) format. This raw data can then be embedded in third-party web sites or in apps. Part of the USNO's mission is to provide astronomical and timing data to government agencies and the general public. To this end, the USNO provides accurate computations of astronomical phenomena such as dates of lunar phases, rise and set times of the Moon and Sun, and lunar and solar eclipse times. Users who navigate to our web site and select one of our 18 services are prompted to complete a web form, specifying parameters such as date, time, location, and object. Many of our services work for years between 1700 and 2100, meaning that past, present, and future events can be computed. Upon form submission, our web server processes the request, computes the data, and outputs it to the user. Over recent years, the use of the web by the general public has vastly changed. In response to this, the USNO is modernizing its web-based data services. This includes making our computed data easier to embed within third-party web sites and easier to query from apps running on tablets and smart phones. To facilitate this, the USNO has begun converting its services into APIs. In addition to the existing web forms for the various services, users are able to make direct URL requests that return either an image or numerical data. To date, four of our web services have been configured to run with APIs. Two are image-producing services: "Apparent Disk of a Solar System Object" and "Day and Night Across the Earth." 
Two API data services are "Complete Sun and Moon Data for One Day" and "Dates of Primary Phases of the Moon." Instructions for how to use our API services as well as examples of their use can be found on one of our explanatory web pages and will be discussed here.
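    A direct URL request to the rise/set/transit data service can be sketched as below; the host and path follow USNO's published "rstt/oneday" pattern but should be checked against the current documentation, since the service has been reorganized over time:

```python
from urllib.parse import urlencode

def oneday_url(date, lat, lon, tz=0):
    """Request URL for the Sun/Moon rise-set-transit 'oneday' service.

    Host and path follow USNO's published examples and may change.
    """
    params = {"date": date, "coords": f"{lat},{lon}", "tz": tz}
    return "https://aa.usno.navy.mil/api/rstt/oneday?" + urlencode(params)

url = oneday_url("2015-01-08", 38.92, -77.07, tz=-5)
# The JSON response would then be fetched and parsed, e.g.:
# import json, urllib.request
# data = json.load(urllib.request.urlopen(url))  # network call omitted here
```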

  10. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. 
We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.

  11. Integrating Geospatial Technologies to Examine Urban Land Use Change: A Design Partnership

    ERIC Educational Resources Information Center

    Bodzin, Alec M.; Cirucci, Lori

    2009-01-01

    This article describes a design partnership that investigated how to integrate Google Earth, remotely sensed satellite and aerial imagery, with other instructional resources to investigate ground cover and land use in diverse middle school classrooms. Data analysis from the implementation study revealed that students acquired skills for…

  12. Learning Geomorphology Using Aerial Photography in a Web-Facilitated Class

    ERIC Educational Resources Information Center

    Palmer, R. Evan

    2013-01-01

    General education students taking freshman-level physical geography and geomorphology classes at Arizona State University completed an online laboratory whose main tool was Google Earth. Early in the semester, oblique and planimetric views introduced students to a few volcanic, tectonic, glacial, karst, and coastal landforms. Semi-quantitative…

  13. Understanding "Change" through Spatial Thinking Using Google Earth in Secondary Geography

    ERIC Educational Resources Information Center

    Xiang, X.; Liu, Y.

    2017-01-01

    Understanding geographic changes has become an indispensable element in geography education. Describing and analyzing changes in space require spatial thinking skills emphasized in geography curriculum but often pose challenges for secondary school students. This school-based research targets a specific strand of spatial thinking skills and…

  14. UNAVCO Software and Services for Visualization and Exploration of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.

    2007-12-01

    UNAVCO has been involved in visualization of geoscience data to support education and research for several years. An early and ongoing service is the Jules Verne Voyager, a web browser applet built on the GMT that displays any area on Earth, with many data set choices, including maps, satellite images, topography, geoid heights, sea-floor ages, strain rates, political boundaries, rivers and lakes, earthquake and volcano locations, focal mechanisms, stress axes, and observed and modeled plate motion and deformation velocity vectors from geodetic measurements around the world. As part of the GEON project, UNAVCO has developed the GEON IDV, a research-level, 4D (earth location, depth and/or altitude, and time), Java application for interactive display and analysis of geoscience data. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data anywhere on earth. The GEON IDV supports simultaneous displays of data sets from differing sources, with complete control over colors, time animation, map projection, map area, point of view, and vertical scale. The GEON IDV displays gridded and point data, images, GIS shape files, and several other types of data. The GEON IDV has symbols and displays for GPS velocity vectors, seismic tomography, earthquake focal mechanisms, earthquake locations with magnitude or depth, seismic ray paths in 3D, seismic anisotropy, convection model visualization, earth strain axes and strain field imagery, and high-resolution 3D topographic relief maps. Multiple data sources and display types may appear in one view. As an example of GEON IDV utility, it can display hypocenters under a volcano, a surface geology map of the volcano draped over 3D topographic relief, town locations and political boundaries, and real-time 3D weather radar clouds of volcanic ash in the atmosphere, with time animation. The GEON IDV can drive a GeoWall or other 3D stereo system. 
IDV output includes imagery, movies, and KML files for Google Earth use of IDV static images, where Google Earth can handle the display. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support a data web site. We plan to extend the power of the IDV by accepting new data types and data services, such as GeoSciML. An active program of online and video training in GEON IDV use is planned. UNAVCO will support users who need assistance converting their data to the standard formats used by the GEON IDV. The UNAVCO Facility provides web-accessible support for Google Earth and Google Maps display of any of more than 9500 GPS stations and survey points, including metadata for each installation. UNAVCO provides corresponding Open Geospatial Consortium (OGC) web services with the same data. UNAVCO's goal is to facilitate data access, interoperability, and efficient searches, exploration, and use of data by promoting web services, standards for GEON IDV data formats and metadata, and software able to simultaneously read and display multiple data sources, formats, and map locations or projections. Retention and propagation of semantics and metadata with observational and experimental values is essential for interoperability and understanding diverse data sources.

  15. Semantic Web Data Discovery of Earth Science Data at NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Hegde, Mahabaleshwara; Strub, Richard F.; Lynnes, Christopher S.; Fang, Hongliang; Teng, William

    2008-01-01

    Mirador is a web interface for searching Earth Science data archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Mirador provides keyword-based search and guided navigation for efficient search and access to Earth Science data. Mirador employs the power of Google's universal search technology for fast metadata keyword searches, augmented by additional capabilities such as event searches (e.g., hurricanes), searches based on a location gazetteer, and data services like format converters and data sub-setters. The objective of guided data navigation is to present users with multiple navigation paths to the data. The basis for guided navigation in Mirador is an ontology based on the Global Change Master Directory (GCMD) Directory Interchange Format (DIF). The current implementation includes a project ontology covering various instrument and model data. Additional capabilities in the pipeline include Earth Science parameter and applications ontologies.

  16. Geospatial Visualization of Scientific Data Through Keyhole Markup Language

    NASA Astrophysics Data System (ADS)

    Wernecke, J.; Bailey, J. E.

    2008-12-01

    The development of virtual globes has provided a fun and innovative tool for exploring the surface of the Earth. However, it has been the parallel maturation of Keyhole Markup Language (KML) that has created a new medium and perspective through which to visualize scientific datasets. Originally created by Keyhole Inc., which was acquired by Google in 2004, KML was handed over to the Open Geospatial Consortium (OGC) in 2007. It became an OGC international standard on 14 April 2008, and has subsequently been adopted by all major geobrowser developers (e.g., Google, Microsoft, ESRI, NASA) and many smaller ones (e.g., Earthbrowser). By making KML a standard at a relatively young stage in its evolution, developers of the language are seeking to avoid the issues that plagued the early World Wide Web and the development of Hypertext Markup Language (HTML). The popularity and utility of Google Earth, in particular, have been enhanced by KML features such as the Smithsonian volcano layer and the dynamic weather layers. Through KML, users can view real-time earthquake locations (USGS), view animations of polar sea-ice coverage (NSIDC), or read about the daily activities of chimpanzees (Jane Goodall Institute). Perhaps even more powerful is the fact that any user can create, edit, and share their own KML with little or no knowledge of computer code. We present an overview of the best current scientific uses of KML and a guide to how scientists can learn to use KML themselves.
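    A sense of how lightweight KML is: the sketch below (my own illustration; the feature name and coordinates are invented) emits a complete, loadable KML document containing a single Placemark using only the Python standard library.

```python
# Minimal KML: a single Placemark that Google Earth (or any geobrowser) can load.
# Note KML's lon,lat[,alt] coordinate order -- a common source of errors.
from xml.sax.saxutils import escape

def placemark_kml(name, lon, lat, description=""):
    """Return a complete KML document containing one Placemark."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{escape(name)}</name>
    <description>{escape(description)}</description>
    <Point>
      <coordinates>{lon},{lat},0</coordinates>
    </Point>
  </Placemark>
</kml>"""

if __name__ == "__main__":
    # Hypothetical example: a volcano location.
    print(placemark_kml("Mt. Erebus", 167.17, -77.53, "Active volcano, Antarctica"))
```

    Saving the output as a `.kml` file and double-clicking it is all that is needed to view the point in Google Earth.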

  17. Application of Deep Learning in GLOBELAND30-2010 Product Refinement

    NASA Astrophysics Data System (ADS)

    Liu, T.; Chen, X.

    2018-04-01

    GlobeLand30, one of the best Global Land Cover (GLC) products at 30-m resolution, has been widely used in many research fields. Due to the significant spectral confusion among different land cover types and the limited textural information of Landsat data, the overall accuracy of GlobeLand30 is about 80%. Although such accuracy is much higher than that of most other global land cover products, it cannot satisfy various applications, so there is still a great need for an effective method to improve the quality of GlobeLand30. The explosive growth of high-resolution satellite imagery and the remarkable performance of deep learning on image classification provide a new opportunity to refine GlobeLand30. However, the performance of deep learning depends on the quality and quantity of training samples as well as the model training strategy. Therefore, this paper 1) proposes an automatic training sample generation method via Google Earth to build a large training sample set; and 2) explores the best training strategy for land cover classification using GoogleNet (Inception V3), one of the most widely used deep learning networks. The results show that fine-tuning from the first layer of Inception V3 using the rough large sample set is the best strategy. The retrained network was then applied to one selected area of Xi'an city as a case study of GlobeLand30 refinement. The experimental results indicate that the proposed approach combining deep learning and Google Earth imagery is a promising solution for further improving the accuracy of GlobeLand30.

  18. The Dimensions of the Solar System

    ERIC Educational Resources Information Center

    Schneider, Stephen E.; Davis, Kathleen S.

    2007-01-01

    A few new wrinkles have been added to the popular activity of building a scale model of the solar system. Students can learn about maps and scaling using easily accessible online resources that include satellite images. This is accomplished by taking advantage of some of the special features of Google Earth. This activity gives students a much…

  19. Crowdfunding Astronomy Research with Google Sky

    ERIC Educational Resources Information Center

    Metcalfe, Travis S.

    2015-01-01

    For nearly four years, NASA's Kepler space telescope searched for planets like Earth around more than 150,000 stars similar to the Sun. In 2008 with in-kind support from several technology companies, our non-profit organization established the Pale Blue Dot Project, an adopt-a-star program that supports scientific research on the stars observed by…

  20. Using Google Earth to Teach Plate Tectonics and Science Explanations

    ERIC Educational Resources Information Center

    Blank, Lisa M.; Plautz, Mike; Almquist, Heather; Crews, Jeff; Estrada, Jen

    2012-01-01

    "A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" emphasizes that the practice of science is inherently a model-building activity focused on constructing explanations using evidence and reasoning (NRC 2012). Because building and refining is an iterative process, middle school students may view this practice…

  1. Drawing the Line with Google Earth: The Place of Digital Mapping outside of Geography

    ERIC Educational Resources Information Center

    Mercier, O. Ripeka; Rata, Arama

    2017-01-01

    The "Te Kawa a Maui Atlas" project explores how mapping activities support undergraduate student engagement and learning in Maori studies. This article describes two specific assignments, which used online mapping allowing students to engage with the work of their peers. By analysing student evaluations of these activities, we identify…

  2. Map Scale, Proportion, and Google[TM] Earth

    ERIC Educational Resources Information Center

    Roberge, Martin C.; Cooper, Linda L.

    2010-01-01

    Aerial imagery has a great capacity to engage and maintain student interest while providing a contextual setting to strengthen their ability to reason proportionally. Free, on-demand, high-resolution, large-scale aerial photography provides both a bird's eye view of the world and a new perspective on one's own community. This article presents an…

  3. Detecting Potential Water Quality Issues by Mapping Trophic Status Using Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Nguy-Robertson, A. L.; Harvey, K.; Huening, V.; Robinson, H.

    2017-12-01

    The identification, timing, and spatial distribution of recurrent algal blooms and aquatic vegetation can help water managers and policy makers make better water resource decisions. In many parts of the world there is little monitoring or reporting of water quality due to the cost and effort required to collect and process water samples. We propose to use Google Earth Engine to quickly identify the recurrence of trophic states in global inland water systems. Utilizing Landsat and Sentinel multispectral imagery, inland water quality parameters (i.e., chlorophyll a concentration) can be estimated and waters can be classified by trophic state: oligotrophic, mesotrophic, eutrophic, and hypereutrophic. The recurrence of eutrophic and hypereutrophic observations can highlight potentially problematic locations where algal blooms or aquatic vegetation occur routinely. Eutrophic and hypereutrophic waters commonly include harmful algal blooms and waters prone to fish die-offs from hypoxia. While these maps may be limited by the accuracy of the algorithms used to estimate chlorophyll a, relative comparisons at a local scale can help water managers focus limited resources.
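    The trophic classification step reduces to thresholding the estimated chlorophyll a concentration for each pixel and counting how often the pixel exceeds the eutrophic boundary. A minimal sketch of that logic; the threshold values are commonly used textbook boundaries and are my assumption, not taken from this abstract:

```python
# Classify water by trophic state from estimated chlorophyll-a (ug/L).
# Thresholds are illustrative; operational work would calibrate them locally.
def trophic_state(chl_a_ug_per_l: float) -> str:
    if chl_a_ug_per_l < 2.5:
        return "oligotrophic"
    elif chl_a_ug_per_l < 8:
        return "mesotrophic"
    elif chl_a_ug_per_l < 25:
        return "eutrophic"
    return "hypereutrophic"

def recurrence(series):
    """Fraction of a pixel's observations that are eutrophic or worse."""
    flagged = [c for c in series if trophic_state(c) in ("eutrophic", "hypereutrophic")]
    return len(flagged) / len(series)

if __name__ == "__main__":
    pixel_history = [1.2, 9.5, 30.1, 4.4, 27.8]  # hypothetical chlorophyll-a estimates
    print(recurrence(pixel_history))  # -> 0.6
```

    In Earth Engine the same thresholding would be mapped over an image collection, but the per-pixel decision rule is exactly this.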

  4. Undergraduate Course on Global Concerns

    NASA Astrophysics Data System (ADS)

    Richard, G. A.; Weidner, D. J.

    2008-12-01

    GEO 311: Geoscience and Global Concerns is an undergraduate course taught at Stony Brook University during each fall semester. The class meets twice per week, with one session consisting of a lecture and the other, an interactive activity in a computer laboratory that engages the students in exploring real world problems. A specific concern or issue serves as a focus during each session. The students are asked to develop answers to a series of questions that engage them in identifying causes of the problem, connections with the Earth system, relationships to other problems, and possible solutions on both a global and local scale. The questions are designed to facilitate an integrated view of the Earth system. Examples of topics that the students explore during the laboratory sessions are: 1) fossil fuel reserves and consumption rates and the effect of their use on climate, 2) alternative sources of energy and associated technologies, such as solar photovoltaics, nuclear energy, tidal power, geothermal energy, and wind power, 3) effects of tsunamis and earthquakes on human populations and infrastructure, 4) climate change, and 5) hurricanes and storms. The selection and scheduling of topics often takes advantage of the occurrence of media attention or events that can serve as case studies. Tools used during the computer sessions include Google Earth, ArcGIS, spreadsheets, and web sites that offer data and maps. The students use Google Earth or ArcGIS to map events such as earthquakes, storms, tsunamis, and changes in the extent of polar ice. Spreadsheets are employed to discern trends in fossil fuel supply and consumption, and to experiment with models that make predictions for the future. We present examples of several of these activities and discuss how they facilitate an understanding of interrelationships within the Earth system.
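    The spreadsheet trend exercises described above amount to fitting a line to a short time series and extrapolating. A sketch of that computation (the consumption figures are invented for illustration):

```python
# Fit y = a + b*x by ordinary least squares and extrapolate a trend,
# mirroring what the students do with a spreadsheet trendline.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

if __name__ == "__main__":
    years = [2000, 2002, 2004, 2006]
    consumption = [76.9, 78.4, 82.5, 84.9]  # hypothetical, e.g. million barrels/day
    a, b = fit_line(years, consumption)
    print(f"trend: {b:.2f} units/year; 2010 projection: {a + b * 2010:.1f}")
```

    The point of the exercise is less the arithmetic than the discussion of how far such a linear extrapolation can be trusted.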

  5. Improving satellite-based post-fire evapotranspiration estimates in semi-arid regions

    NASA Astrophysics Data System (ADS)

    Poon, P.; Kinoshita, A. M.

    2017-12-01

    Climate change and anthropogenic factors contribute to the increased frequency, duration, and size of wildfires, which can alter ecosystem and hydrological processes. The loss of vegetation canopy and ground cover reduces interception and alters evapotranspiration (ET) dynamics in riparian areas, which can impact rainfall-runoff partitioning. Previous research evaluated the spatial and temporal trends of ET based on burn severity and observed an annual decrease of 120 mm on average for three years after fire. Building upon these results, this research focuses on the Coyote Fire in San Diego, California (USA), which burned a total of 76 km2 in 2003, to calibrate and improve satellite-based ET estimates in semi-arid regions affected by wildfire. The current work utilizes satellite-based products and techniques such as the Google Earth Engine application programming interface (API). Various ET models (e.g., the Operational Simplified Surface Energy Balance model, SSEBop) are compared to the latent heat flux from two AmeriFlux eddy covariance towers, Sky Oaks Young (US-SO3) and Old Stand (US-SO2), from 2000 to 2015. The Old Stand tower has a low burn severity and the Young Stand tower has a moderate to high burn severity. Both towers are used to validate spatial ET estimates. Furthermore, variables and indices such as the Enhanced Vegetation Index (EVI), Normalized Difference Moisture Index (NDMI), and Normalized Burn Ratio (NBR) are utilized to evaluate satellite-based ET through a multivariate statistical analysis at both sites. This point-scale study will help improve ET estimates in spatially diverse regions. Results from this research will contribute to the development of a post-wildfire ET model for semi-arid regions. Accurate estimates of post-fire ET will provide a better representation of vegetation and hydrologic recovery, which can be used to improve hydrologic models and predictions.
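    The indices named above are simple reflectance ratios. A sketch of how they are computed from surface reflectance, following the standard Landsat band pairings (the sample pixel values are invented):

```python
# Standard Landsat spectral indices used to explain post-fire ET behaviour.
def ndmi(nir, swir1):
    """Normalized Difference Moisture Index."""
    return (nir - swir1) / (nir + swir1)

def nbr(nir, swir2):
    """Normalized Burn Ratio; dNBR (pre- minus post-fire) maps burn severity."""
    return (nir - swir2) / (nir + swir2)

def evi(nir, red, blue):
    """Enhanced Vegetation Index, standard coefficients."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

if __name__ == "__main__":
    # Hypothetical surface-reflectance values for one pixel.
    print(round(nbr(0.45, 0.15), 3), round(ndmi(0.45, 0.25), 3))  # -> 0.5 0.286
```

    In Earth Engine the same formulas are applied per pixel with `normalizedDifference` or band arithmetic over an image.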

  6. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To help users handle and visualise spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open source tools such as OpenLayers. The technology already exists within PostgreSQL databases to store searchable geometrical data in the form of the PostGIS extension. GeoServer is an existing open source map server; an instance deployed for the new PSA uses the OGC standards to allow, among other things, the sharing, processing and editing of spatial data through the Web Feature Service (WFS) standard, as well as serving georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of the datasets from ESA's planetary missions. This can be facilitated through the GIS framework, offering interfaces (both web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
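    A GIS client retrieves a georeferenced map image from GeoServer through a plain HTTP WMS GetMap request. The sketch below assembles such a request; the endpoint and layer name are hypothetical, while the query parameters follow the OGC WMS 1.3.0 standard:

```python
# Build an OGC WMS 1.3.0 GetMap request URL with the standard library.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512, crs="EPSG:4326"):
    """bbox is (minx, miny, maxx, maxy); note that WMS 1.3.0 with
    EPSG:4326 expects lat/lon axis order, unlike WMS 1.1.1."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base}?{urlencode(params)}"

if __name__ == "__main__":
    # Hypothetical GeoServer instance and layer name.
    print(getmap_url("https://psa.example/geoserver/wms", "psa:mex_hrsc", (-90, 0, 90, 360)))
```

    Any WMS-aware client (OpenLayers, Google Earth via a KML GroundOverlay, a browser) can consume the resulting PNG.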

  7. An Observation Knowledgebase for Hinode Data

    NASA Astrophysics Data System (ADS)

    Hurlburt, Neal E.; Freeland, S.; Green, S.; Schiff, D.; Seguin, R.; Slater, G.; Cirtain, J.

    2007-05-01

    We have developed a standards-based system for the Solar Optical and X-Ray Telescopes on the Hinode orbiting solar observatory which can serve as part of a developing Heliophysics informatics system. Our goal is to make the scientific data acquired by Hinode more accessible and useful to scientists by allowing them to do reasoning and flexible searches on observation metadata and to ask higher-level questions of the system than previously allowed. The Hinode Observation Knowledgebase relates the intentions and goals of the observation planners (as-planned metadata) with actual observational data (as-run metadata), along with connections to related models, data products and identified features (follow-up metadata) through a citation system. Summaries of the data (both as image thumbnails and short "film strips") serve to guide researchers to the observations appropriate for their research, and these are linked directly to the data catalog for easy extraction and delivery. The semantic information of the observation (field of view, wavelength, type of observable, average cadence, etc.) is captured through simple user interfaces and encoded using the VOEvent XML standard (with the addition of some solar-related extensions). These interfaces merge metadata acquired automatically during both the mission planning and data analysis (see Seguin et al. 2007 at this meeting) phases with that obtained directly from the planner/analyst, and send them to be incorporated into the knowledgebase. The resulting information is automatically rendered into standard categories based on planned and recent observations, as well as by popularity and recommendations by the science team. It is also directly searchable through both web-based searches and direct calls to the API. Observation details can also be rendered as RSS, iTunes and Google Earth interfaces. The resulting system provides a useful tool to researchers and can act as a demonstration for larger, more complex systems.

  8. On-line Geoscience Data Resources for Today's Undergraduates

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Ryan, W.; Carbotte, S.; Melkonian, A.; Coplan, J.; Arko, R.; O'Hara, S.; Ferrini, V.; Leung, A.; Bonckzowski, J.

    2008-12-01

    Broadening the experience of undergraduates can be achieved by enabling free, unrestricted and convenient access to real scientific data. With funding from the U.S. National Science Foundation, the Marine Geoscience Data System (MGDS) (http://www.marine-geo.org/) serves as the integrated data portal for various NSF-funded projects and provides free public access to, and preservation of, a wide variety of marine and terrestrial data, including rock, fluid, biology and sediment sample information, underway geophysical data, multibeam bathymetry, water column data and multi-channel seismic data. Users can easily view the locations of cruise tracks, samples and stations against a backdrop of a multi-resolution global digital elevation model. A Search For Data web page rapidly extracts data holdings from the database and can be filtered by data and device type, field program ID, investigator name, and geographical and date bounds. The data access experience is boosted by the MGDS use of standardised OGC-compliant Web Services to support uniform programmatic interfaces. GeoMapApp (http://www.geomapapp.org/), a free MGDS data visualization tool, supports map-based dynamic exploration of a broad suite of geoscience data. Built-in land and marine data sets include tectonic plate boundary compilations, DSDP/ODP core logs, earthquake events, seafloor photos, and submersible dive tracks. Seamless links take users to data held by external partner repositories including PetDB, UNAVCO, IRIS and NGDC. Users can generate custom maps and grids and import their own data sets and grids. A set of short, video-style on-line tutorials familiarises users step-by-step with GeoMapApp functionality (http://www.geomapapp.org/tutorials/). Virtual Ocean (http://www.virtualocean.org/) combines the functionality of GeoMapApp with a 3-D earth browser built using the NASA WorldWind API for a powerful new data resource. MGDS education involvement (http://www.marine-geo.org/, go to the Education tab) includes the searchable Media Bank of images and video; KML files for viewing several MGDS data sets in Google Earth (tm); and support in developing undergraduate-level teaching modules using NSF-MARGINS data. Examples of many of these data sets will be shown.

  9. Using Enabling Technologies to Facilitate the Comparison of Satellite Observations with the Model Forecasts for Hurricane Study

    NASA Astrophysics Data System (ADS)

    Li, P.; Knosp, B.; Hristova-Veleva, S. M.; Niamsuwan, N.; Johnson, M. P.; Shen, T. P. J.; Tanelli, S.; Turk, J.; Vu, Q. A.

    2014-12-01

    Due to their complexity and volume, satellite data are underutilized in today's hurricane research and operations. To better utilize these data, we developed the JPL Tropical Cyclone Information System (TCIS) - an interactive data portal providing fusion between near-real-time satellite observations and model forecasts to facilitate model evaluation and improvement. We have collected satellite observations and model forecasts in the Atlantic Basin and the East Pacific for the hurricane seasons since 2010 and supported the NASA airborne campaigns for hurricane study such as the Genesis and Rapid Intensification Processes (GRIP) campaign in 2010 and the Hurricane and Severe Storm Sentinel (HS3) from 2012 to 2014. To enable direct inter-comparisons of the satellite observations and the model forecasts, the TCIS was integrated with the NASA Earth Observing System Simulator Suite (NEOS3) to produce synthetic observations (e.g. simulated passive microwave brightness temperatures) from a number of operational hurricane forecast models (HWRF and GFS). An automated process was developed to trigger NEOS3 simulations via web services given the location and time of satellite observations, monitor the progress of the NEOS3 simulations, and display the synthetic observations and ingest them into the TCIS database when they are done. In addition, three analysis tools - the joint PDF analysis of the brightness temperatures, ARCHER for finding the storm center and the storm organization, and the Wave Number Analysis tool for storm asymmetry and morphology analysis - were integrated into TCIS to provide statistical and structural analysis on both observed and synthetic data. Interactive tools were built in the TCIS visualization system to allow the spatial and temporal selection of the datasets, the invocation of the tools with user-specified parameters, and the display and delivery of the results. 
In this presentation, we will describe the key enabling technologies behind the design of the TCIS interactive data portal and analysis tools, including the spatial database technology for the representation and query of the level 2 satellite data, the automatic process flow using web services, the interactive user interface using the Google Earth API, and a common and expandable Python wrapper to invoke the analysis tools.

  10. Authoring Tours of Geospatial Data With KML and Google Earth

    NASA Astrophysics Data System (ADS)

    Barcay, D. P.; Weiss-Malik, M.

    2008-12-01

    As virtual globes become widely adopted by the general public, the use of geospatial data has expanded greatly. With the popularization of Google Earth and other platforms, GIS tools have become virtual reality platforms. Using these platforms, a casual user can easily explore the world, browse massive datasets, create powerful 3D visualizations, and share those visualizations with millions of people using the KML language. This technology has raised the bar for professionals and academics alike: it is now expected that studies and projects will be accompanied by compelling, high-quality visualizations. In this new landscape, a presentation of geospatial data can be the most effective form of advertisement for a project, engaging both the general public and the scientific community in a unified interactive experience. On the other hand, merely dumping a dataset into a virtual globe can be a disorienting, alienating experience for many users. To create an effective, far-reaching presentation, an author must take care to make their data approachable to a wide variety of users with varying knowledge of the subject matter, expertise with virtual globes, and attention spans. To that end, we present techniques for creating self-guided interactive tours of data represented in KML and visualized in Google Earth. Using these methods, we provide the ability to move the camera through the world while dynamically varying the content, style, and visibility of the displayed data. Such tours can automatically guide users through massive, complex datasets: engaging a broad user base, and conveying subtle concepts that aren't immediately apparent when viewing the raw data. To the casual user these techniques result in an extremely compelling experience similar to watching video. 
Unlike video though, these techniques maintain the rich interactive environment provided by the virtual globe, allowing users to explore the data in detail and to add other data sources to the presentation.
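    Tours of this kind are expressed in KML's Google extension (gx) namespace as a gx:Tour holding a gx:Playlist of gx:FlyTo and gx:Wait elements. A minimal sketch that generates such a tour (the waypoints are invented):

```python
# A minimal KML gx:Tour: the camera flies between waypoints, pausing at each.
# Element names come from the KML 2.2 Google extension (gx) namespace.
def tour_kml(name, waypoints, fly_secs=4.0, wait_secs=2.0):
    """waypoints: iterable of (lon, lat, range_m) camera targets."""
    stops = ""
    for lon, lat, rng in waypoints:
        stops += f"""
      <gx:FlyTo>
        <gx:duration>{fly_secs}</gx:duration>
        <LookAt>
          <longitude>{lon}</longitude><latitude>{lat}</latitude>
          <range>{rng}</range>
        </LookAt>
      </gx:FlyTo>
      <gx:Wait><gx:duration>{wait_secs}</gx:duration></gx:Wait>"""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <gx:Tour>
    <name>{name}</name>
    <gx:Playlist>{stops}
    </gx:Playlist>
  </gx:Tour>
</kml>"""

if __name__ == "__main__":
    print(tour_kml("Rift tour", [(-21.9, 64.1, 50000), (36.8, -1.3, 80000)]))
```

    Interleaving AnimatedUpdate elements into the playlist is how a tour author varies the content, style, and visibility of data as the camera moves.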

  11. A virtual tour of geological heritage: Valourising geodiversity using Google Earth and QR code

    NASA Astrophysics Data System (ADS)

    Martínez-Graña, A. M.; Goy, J. L.; Cimarra, C. A.

    2013-12-01

    When making land-use plans, it is necessary to inventory and catalogue the geological heritage and geodiversity of a site to establish an apolitical conservation protection plan that meets the educational and social needs of society. New technologies make it possible to create virtual databases using virtual globes - e.g., Google Earth - and other personal-use geomatics applications (smartphones, tablets, PDAs) for accessing geological heritage information in "real time" for scientific, educational, and cultural purposes via a virtual geological itinerary. Seventeen geosites have been mapped and georeferenced in Keyhole Markup Language as layers for the stops of a geological itinerary in different applications. A virtual tour has been developed for Las Quilamas Natural Park, which is located in the Spanish Central System, using geological layers and topographic and digital terrain models that can be overlaid in a 3D model. The Google Earth application was used to import the geosite placemarks. For each geosite, a tab has been developed that shows a description of the geology with photographs and diagrams and that evaluates its scientific, educational, and tourism quality. Augmented reality allows the user to access these georeferenced thematic layers and overlay data, images, and graphics in real time on a mobile device. These virtual tours can be incorporated into subject guides designed by the public. Seven educational and interpretive panels describing some of the geosites were designed and tagged with a QR code that can be printed at each stop or in the printed itinerary. These QR codes can be scanned with the camera found on most mobile devices, and video virtual tours can be viewed on these devices. The virtual tour of the geological heritage can be used to show tourists the geological history of Las Quilamas Natural Park using new geomatics technologies (virtual globes, augmented reality, and QR codes).

  12. Google Earth Grand Tour Themes

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Whitmeyer, S. J.; Bentley, C.; Dordevic, M. M.

    2014-12-01

    As part of an NSF TUES Type 3 project entitled "Google Earth for Onsite and Distance Education (GEODE)," we are assembling a "Grand Tour" of locations on Earth and other terrestrial bodies that every geoscience student should know about and visit at least in virtual reality. Based on feedback from colleagues at previous meetings, we have identified nine Grand Tour themes: "Plates and Plumes," "Rocks and Regions," "Geology Through Time," "The Mapping Challenge*," "U.S. National Parks*," "The Magical Mystery Tour*," "Resources and Hazards," "Planets and Moons," and "Top of the Pops." Themes marked with an asterisk are most developed at this stage and will be demonstrated in real time. The Mapping Challenge invites students to trace geological contacts, measure bedding strike and dip and the plunge, trend, and facing of a fold. There is an advanced tool for modeling periclinal folds. The challenge is presented in a game-like format with an emphasis on puzzle-solving that will appeal to students regardless of gender. For the tour of U.S. national parks, we divided the most geologically important parks into four groups—Western Pacific, West Coast, Rockies, and East Coast. We are combining our own team's GigaPan imagery with imagery already available on the Internet. There is a great deal of imagery just waiting to be annotated for geological education purposes. The Magical Mystery Tour takes students to Google Streetview locations selected by instructors. Students are presented with questions or tasks and are given automatic feedback. Other themes are under development. Within each theme, we are crowd-sourcing contributions from colleagues and inviting colleagues to vote for or against proposed locations and student interactions. The GEODE team includes the authors and: Heather Almquist, Stephen Burgin, Cinzia Cervato, Gene Cooper, Paul Karabinos, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Kristen St. John, and Barb Tewksbury.

  13. Dagik: A Quick Look System of the Geospace Data in KML format

    NASA Astrophysics Data System (ADS)

    Yoshida, D.; Saito, A.

    2007-12-01

    Dagik (Daily Geospace data in KML) is a quick-look plot sharing system using Google Earth as a data browser. It provides daily data lists that contain network links to the KML/KMZ files of various geospace data. KML is a markup language to display data on Google Earth, and KMZ is a compressed form of KML. Users can browse the KML/KMZ files with the following procedure: 1) download "dagik.kml" from the Dagik homepage (http://www-step.kugi.kyoto-u.ac.jp/dagik/) and open it with Google Earth, 2) select a date, 3) select a data type to browse. Dagik is a collection of network links to KML/KMZ files. The daily Dagik files are available since 1957, though they contain only geomagnetic index data in the early periods. There are three activities of Dagik: the first is the generation of the daily data lists, the second is to provide several useful tools, such as observatory lists, and the third is to assist researchers in making KML/KMZ data plots. To make plot browsing easy, there are three rules for the Dagik plot format: 1) one file contains one UT day of data, 2) use a common plot panel size, 3) share the data list. There are three steps to join Dagik as a plot provider: 1) make KML/KMZ files of the data, 2) put the KML/KMZ files on the Web, 3) notify the Dagik group of the URL and a description of the files. The KML/KMZ files will then be included in the Dagik data list. As of September 2007, quick looks of several geospace data, such as GPS total electron content data, ionosonde data, magnetometer data, FUV imaging data from a satellite, ground-based airglow data, and satellite footprint data, are available. The system of Dagik is introduced in the presentation.
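    The network links at the heart of Dagik are ordinary KML NetworkLink elements, which tell Google Earth to fetch a remote KML/KMZ file on demand. A sketch of one such link (the target URL and plot name are hypothetical):

```python
# A KML NetworkLink: Google Earth resolves the <href> and loads the remote KML/KMZ.
def network_link_kml(name, href):
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>{name}</name>
    <Link><href>{href}</href></Link>
  </NetworkLink>
</kml>"""

if __name__ == "__main__":
    # Hypothetical daily quick-look plot for 2007-09-15.
    print(network_link_kml("GPS TEC 2007-09-15", "https://example.org/dagik/tec_20070915.kmz"))
```

    A daily list file is then just a Folder of such NetworkLinks, one per data type, so the heavy plot files are only downloaded when the user selects them.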

  14. Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services

    NASA Astrophysics Data System (ADS)

    Neiers, M.; Dwyer, J.; Neiers, S.

    2008-12-01

    The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.

  15. Quantifying Urban Texture in Nairobi, Kenya and its Implications for Understanding Natural Hazard Impact

    NASA Astrophysics Data System (ADS)

    Taylor, Faith E.; Malamud, Bruce D.; Millington, James D. A.

    2016-04-01

    The configuration of infrastructure networks such as roads, drainage and power lines can both affect and be affected by natural hazards such as earthquakes, intense rain, wildfires and extreme temperatures. In this paper, we present and compare two methods to quantify urban topology on approximate scales of 0.0005 km2 to 10 km2 and create classifications of different 'urban textures' that relate to the risk of natural hazard impact in an area. The methods we use focus on applicability in urban developing-country settings, where access to high-resolution and high-quality data may be difficult. We use the city of Nairobi, Kenya to trial these methods. Nairobi has a population >3 million, and is a mix of informal settlements and residential and commercial development. The city and its immediate surroundings are subject to a variety of natural hazards such as floods, landslides, fires, drought, hail, heavy wind and extreme temperatures; all of these hazards can occur singly, but one also has the potential to trigger another, providing a 'cascade' of hazards, or two hazards may occur spatially and temporally near each other and interact. We use two measures of urban texture: (i) street block textures and (ii) Google Earth land cover textures. The street block texture method builds on the methodology of Louf and Barthelemy (2014) and uses OpenStreetMap data to analyse the shape, size, complexity and pattern of individual blocks of land created by fully enclosed loops of the major and minor road network of Nairobi. We find >4000 of these blocks, ranging in size from approximately 0.0005 km2 to 10 km2, with approximately 5 classifications of urban texture. The Google Earth land cover texture method is a visual classification of homogeneous parcels of land performed in Google Earth using high-resolution airborne imagery and qualitative criteria for each land cover type.
Using the Google Earth land cover texture method, we identify >40 'urban textures' based on visual characteristics such as colour, texture, shadow and setting, and have created clear criteria for classifying an area based on its visual characteristics. These two methods for classifying urban texture in Nairobi are compared in a GIS and in the field to investigate whether there is a link between the visual appearance of an area and its network topology. From these urban textures, we may start to identify areas where (a) urban texture types may indicate a relative propensity to certain hazards and their interactions and (b) urban texture types may increase or decrease the impact of a hazard that occurs in that area.
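
    The street-block shape analysis described above can be sketched with a simple isoperimetric compactness measure; the function names and the square test block are illustrative, not taken from Louf and Barthelemy's implementation:

```python
import math

def polygon_area_perimeter(coords):
    """Shoelace area and perimeter of a closed polygon given as (x, y)
    vertices in metres (e.g. a street block traced from OpenStreetMap)."""
    area, perim = 0.0, 0.0
    n = len(coords)
    for i in range(n):
        x1, y1 = coords[i]
        x2, y2 = coords[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def compactness(coords):
    """Isoperimetric quotient 4*pi*A/P^2: 1.0 for a circle, lower for
    elongated or convoluted blocks -- one possible axis for texture classes."""
    area, perim = polygon_area_perimeter(coords)
    return 4.0 * math.pi * area / perim ** 2

# A 100 m x 100 m block scores pi/4 ~ 0.785; informal-settlement blocks
# with ragged outlines would score much lower.
square_block = [(0, 0), (100, 0), (100, 100), (0, 100)]
```

    Clustering blocks by such shape and size descriptors is one way the ~5 street-block texture classes could be derived.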

  16. Galaxy Portal: interacting with the galaxy platform through mobile devices.

    PubMed

    Børnich, Claus; Grytten, Ivar; Hovig, Eivind; Paulsen, Jonas; Čech, Martin; Sandve, Geir Kjetil

    2016-06-01

    We present the Galaxy Portal app, an open source interface to the Galaxy system for smartphones and tablets. The Galaxy Portal provides convenient and efficient monitoring of job completion, as well as opportunities for inspection of results and execution history. In addition to being useful to the Galaxy community, we believe that the app also exemplifies a useful way of exploiting mobile interfaces for research/high-performance computing resources in general. The source is freely available under a GPL license on GitHub, along with user documentation, pre-compiled binaries and instructions for several platforms: https://github.com/Tarostar/QMLGalaxyPortal. It is available for iOS version 7 (and newer) through the Apple App Store, and for Android version 4.1 (API 16) or newer through Google Play. Contact: geirksa@ifi.uio.no. © The Author 2016. Published by Oxford University Press.

  17. Evaluation of an electrocardiogram on QR code.

    PubMed

    Nakayama, Masaharu; Shimokawa, Hiroaki

    2013-01-01

    An electrocardiogram (ECG) is an indispensable tool for diagnosing cardiac diseases such as ischemic heart disease, myocarditis, arrhythmia, and cardiomyopathy. Since ECG patterns vary depending on patient status, the ECG is also used to monitor patients during treatment, and comparison with previous ECG results is important for accurate diagnosis. However, such comparison requires a connection to the ECG data server in a hospital, and data connectivity among hospitals is limited. To improve the portability and availability of ECG data regardless of server connection, we here introduce conversion of ECG data into 2D barcodes (QR codes) as text data, and decoding of the QR code to draw the ECG with the Google Chart API. Fourteen cardiologists and six general physicians evaluated the system using an iPhone and iPad. Overall, they were satisfied with the system's usability and with the accuracy of the decoded ECG compared to the original. This new coding system may be useful for utilizing ECG data irrespective of server connections.
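
    The round trip described here can be illustrated in a few lines: serialize an ECG trace as compact text for QR encoding, then turn the decoded text back into a chart URL. The field layout and helper names are assumptions for this sketch, and the Google Chart Image API style of URL it targets has since been deprecated by Google:

```python
def ecg_to_qr_text(samples_mv, lead="II", rate_hz=250):
    """Serialize one ECG lead as plain text suitable for QR-code encoding
    (illustrative field layout, not the paper's actual format)."""
    values = ",".join("%.2f" % v for v in samples_mv)
    return "ECG;LEAD=%s;HZ=%d;%s" % (lead, rate_hz, values)

def qr_text_to_chart_url(text, size="400x150"):
    """Turn decoded QR text back into a Google-Chart-style line-chart URL."""
    _, values = text.rsplit(";", 1)  # drop the header fields, keep the samples
    return "https://chart.googleapis.com/chart?cht=lc&chs=%s&chd=t:%s" % (size, values)
```

    The payload stays well under the ~3 KB capacity of a dense QR code for a short rhythm strip, which is what makes the approach workable on paper printouts.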

  18. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

    Traditionally, data storage for geophysical software systems has centered on file-based formats and libraries such as NetCDF and HDF5. In contrast, cloud-based infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We will also cover other advantages of this system, such as enhanced metadata search.
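
    The storage split described here (large binary objects for data chunks, a key-value service for small metadata) can be mimicked in a few lines; plain dicts stand in for the object store and database services, and all names are illustrative rather than the proposed library's API:

```python
class ChunkedDataset:
    """Minimal sketch of NetCDF/HDF5-style chunked storage mapped onto
    cloud services: one object per chunk, metadata as key-value pairs."""

    def __init__(self, name, chunk_size):
        self.name = name
        self.object_store = {}                       # stands in for e.g. S3
        self.kv_store = {"chunk_size": chunk_size}   # stands in for a DB service

    def write(self, values):
        """Split a 1-D array into fixed-size chunks, one object per chunk."""
        size = self.kv_store["chunk_size"]
        for i in range(0, len(values), size):
            key = "%s/chunk_%d" % (self.name, i // size)
            self.object_store[key] = values[i:i + size]
        self.kv_store["length"] = len(values)

    def read(self):
        """Reassemble the array by fetching each chunk object in order."""
        size = self.kv_store["chunk_size"]
        n_chunks = (self.kv_store["length"] + size - 1) // size
        out = []
        for i in range(n_chunks):
            out.extend(self.object_store["%s/chunk_%d" % (self.name, i)])
        return out
```

    Because each chunk is an independent object, reads and writes parallelize naturally across many clients, which is where the scalability of object stores comes from.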

  19. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    NASA Astrophysics Data System (ADS)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino-Alto Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphologies, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through computation of standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes. The geometric accuracy in terms of standard deviation and NMAD ranges, for SRTM, from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the performed analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.
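
    The NMAD statistic used in this assessment is a robust spread estimate of the elevation differences dh between the evaluated DSM and the reference. A minimal version (variable names ours):

```python
import statistics

def nmad(dh):
    """Normalized Median Absolute Deviation of elevation differences:
    1.4826 * median(|dh_i - median(dh)|). The 1.4826 factor makes NMAD
    comparable to the standard deviation for normally distributed errors,
    while staying insensitive to outliers such as gross DSM blunders."""
    med = statistics.median(dh)
    return 1.4826 * statistics.median(abs(d - med) for d in dh)
```

    Unlike the plain standard deviation, replacing one sample with a 1000 m blunder barely moves the NMAD, which is why it is reported alongside the standard deviation in DSM validation studies.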

  20. Case study of visualizing global user download patterns using Google Earth and NASA World Wind

    NASA Astrophysics Data System (ADS)

    Zong, Ziliang; Job, Joshua; Zhang, Xuesong; Nijim, Mais; Qin, Xiao

    2012-01-01

    Geo-visualization is significantly changing the way we view spatial data and discover information. On the one hand, large volumes of spatial data are generated every day. On the other hand, these data are not well utilized due to the lack of free and easy-to-use data-visualization tools. This becomes even worse when most of the spatial data remain in the form of plain text, such as log files. This paper describes a way of visualizing massive plain-text spatial data at no cost by utilizing Google Earth and NASA World Wind. We illustrate our methods by visualizing over 170,000 global download requests for satellite images maintained by the Earth Resources Observation and Science (EROS) Center of the U.S. Geological Survey (USGS). Our visualization results identify the most popular satellite images around the world and reveal global user download patterns. The benefits of this research are: 1. assisting in improving the satellite image downloading services provided by USGS, and 2. providing a proxy for analyzing the "hot spot" areas of research. Most importantly, our methods demonstrate an easy way to geo-visualize massive textual spatial data, which is highly applicable to mining spatially referenced data and information in a wide variety of research domains (e.g., hydrology, agriculture, atmospheric science, natural hazards, and global climate change).
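
    Turning such plain-text logs into something Google Earth can display is mostly string assembly. A minimal sketch, assuming each log line holds "lat,lon,count" (the real EROS log format will differ):

```python
def log_to_kml(lines):
    """Convert 'lat,lon,count' text records into a minimal KML document
    of placemarks that Google Earth (or NASA World Wind) can load."""
    placemarks = []
    for line in lines:
        lat, lon, count = line.strip().split(",")
        placemarks.append(
            "<Placemark><name>%s downloads</name>"
            "<Point><coordinates>%s,%s,0</coordinates></Point></Placemark>"
            % (count, lon, lat))  # note: KML coordinate order is lon,lat[,alt]
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            + "".join(placemarks) + "</Document></kml>")
```

    Saving the returned string with a .kml extension is enough to open it directly in Google Earth.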

  1. Developing a Global Database of Historic Flood Events to Support Machine Learning Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.

    2016-12-01

    There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated based on flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, to improve these models, both traditional physically based methods of flood prediction and data-driven techniques, such as machine learning, require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The resulting database will be available in the Earth Engine Catalog for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
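
    The water detection step can be illustrated with the simplest band-ratio approach, McFeeters' NDWI with a threshold. The actual Earth Engine algorithms used to build the database are more involved (cloud masking, multi-temporal compositing), so treat this as a sketch over per-pixel reflectance lists:

```python
def ndwi(green, nir):
    """McFeeters NDWI per pixel: (G - NIR) / (G + NIR). Water reflects
    green light strongly and absorbs near-infrared, so water pixels
    come out positive."""
    return [(g - n) / (g + n) if (g + n) else 0.0
            for g, n in zip(green, nir)]

def water_mask(green, nir, threshold=0.0):
    """Flag pixels with NDWI above the threshold as water; in practice the
    threshold is tuned per scene and cloud/shadow masking is applied first."""
    return [v > threshold for v in ndwi(green, nir)]
```

    Differencing such a mask against a dry-season baseline is one common way a discrete flood extent is derived.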

  2. AFRC2016-0054-528

    NASA Image and Video Library

    2016-02-27

    Sam Choi and Naiara Pinto observe Google Earth overlaid, in near real time, with what the synthetic aperture radar aboard the C-20A aircraft is mapping. Researchers were in the sky and on the ground to take measurements of plant mass; the distribution of trees, shrubs and ground cover; the diversity of plants; and how much carbon is absorbed by them.

  3. Smartphones and Time Zones

    ERIC Educational Resources Information Center

    Baird, William; Secrest, Jeffery; Padgett, Clifford; Johnson, Wayne; Hagrelius, Claire

    2016-01-01

    Using the Sun to tell time is an ancient idea, but we can take advantage of modern technology to bring it into the 21st century for students in astronomy, physics, or physical science classes. We have employed smartphones, Google Earth, and 3D printing to find the moment of local noon at two widely separated locations. By reviewing GPS…

  4. Using Google Earth to Teach the Magnitude of Deep Time

    ERIC Educational Resources Information Center

    Parker, Joel D.

    2011-01-01

    Most timeline analogies of geologic and evolutionary time are fundamentally flawed. They trade off the problem of grasping very long times for the problem of grasping very short distances. The result is an understanding of relative time with little comprehension of absolute time. Earlier work has shown that the distances most easily understood by…

  5. Content and Language Integrated Learning through an Online Game in Primary School: A Case Study

    ERIC Educational Resources Information Center

    Dourda, Kyriaki; Bratitsis, Tharrenos; Griva, Eleni; Papadopoulou, Penelope

    2014-01-01

    In this paper an educational design proposal is presented which combines two well established teaching approaches, that of Game-based Learning (GBL) and Content and Language Integrated Learning (CLIL). The context of the proposal was the design of an educational geography computer game, utilizing QR Codes and Google Earth for teaching English…

  6. Using Google Earth and Satellite Imagery to Foster Place-Based Teaching in an Introductory Physical Geology Course

    ERIC Educational Resources Information Center

    Monet, Julie; Greene, Todd

    2012-01-01

    Students in an introductory physical geology course often have difficulty making connections between basic course topics and assembling key concepts (beyond textbook examples) to interpret how geologic processes shape the characteristics of the local and regional natural environment. As an approach to address these issues, we designed and…

  7. Applying Modern Stage Theory to Mauritania: A Prescription to Encourage Entrepreneurship

    DTIC Science & Technology

    2014-12-01

    Keywords: entrepreneurship, stage theory, development, Africa, factor-driven, trade freedom, business freedom. Number of pages: 77. … SOUTH ASIA, SUB-SAHARAN AFRICA) from the Naval Postgraduate School, December 2014. Author: Jennifer M. Warren. Approved by: Robert E… Figure 2. Satellite map of West Africa (from Google Earth

  8. The World in Spatial Terms: Mapmaking and Map Reading

    ERIC Educational Resources Information Center

    Ekiss, Gale Olp; Trapido-Lurie, Barbara; Phillips, Judy; Hinde, Elizabeth

    2007-01-01

    Maps and mapping activities are essential in the primary grades. Maps are truly ubiquitous today, as evidenced by the popularity of websites such as Google Earth and Mapquest, and by devices such as Global Positioning System (GPS) units in cars, planes, and boats. Maps can give visual settings to travel stories and historical narratives and can…

  9. Creating global comparative analyses of tectonic rifts, monogenetic volcanism and inverted relief

    NASA Astrophysics Data System (ADS)

    van Wyk de Vries, Benjamin

    2016-04-01

    I have been all around the world, and to other planets, and have travelled from the present to the Archaean and back, to seek out the most significant tectonic rifts, monogenetic volcanoes and examples of inverted relief. I have done this to provide a broad foundation for the comparative analysis supporting the Chaîne des Puys - Limagne fault nomination to UNESCO World Heritage. This would have been an impossible task if not for the cooperation of the scientific community, and for Google Earth, Google Maps and academic search engines. In preparing global comparisons of geological features, these quite recently developed tools provide a powerful way to find and describe geological features. The ability to do scientific crowd-sourcing, rapidly discussing features with colleagues, allows large numbers of areas to be checked, and the open GIS tools (such as Google Earth) allow a standardised description. Search engines also allow the literature on areas to be checked and compared. I will present a comparative study of rifts of the world, monogenetic volcanic fields and inverted relief, integrated to analyse the full geological system represented by the Chaîne des Puys - Limagne fault. The analysis confirms that the site is an exceptional example of the first steps of continental drift in a mountain rift setting, and that this is necessarily seen through the combined landscape of tectonic, volcanic and geomorphic features. The analysis goes further to deepen the understanding of geological systems and stresses the need for more study of geological heritage using such a global and broad systems approach.

  10. A browser-based event display for the CMS Experiment at the LHC using WebGL

    NASA Astrophysics Data System (ADS)

    McCauley, T.

    2017-10-01

    Modern web browsers are powerful and sophisticated applications that support an ever-wider range of uses. One such use is rendering high-quality, GPU-accelerated, interactive 2D and 3D graphics in an HTML canvas. This can be done via WebGL, a JavaScript API based on OpenGL ES. Applications delivered via the browser have several distinct benefits for the developer and user. For example, they can be implemented using well-known and well-developed technologies, while distribution and use via a browser allows for rapid prototyping and deployment and ease of installation. In addition, delivery of applications via the browser allows for easy use on mobile, touch-enabled devices such as phones and tablets. iSpy WebGL is an application for visualization of events detected and reconstructed by the CMS Experiment at the Large Hadron Collider at CERN. The first event display developed for an LHC experiment to use WebGL, iSpy WebGL is a client-side application written in JavaScript, HTML, and CSS, built on three.js, a JavaScript library that wraps the WebGL API. iSpy WebGL is used for monitoring of CMS detector performance, for production of images and animations of CMS collision events for the public, as a virtual reality application using Google Cardboard, and as a tool available for public education and outreach, such as in the CERN Open Data Portal and the CMS masterclasses. We describe here its design, development, and usage as well as future plans.

  11. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images.

    PubMed

    Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste

    2014-01-01

    Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
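
    The abstract does not spell out the libraries' exact blur metric, so the sketch below uses a generic no-reference focus measure of the same flavour, a mean squared neighbour difference over a grayscale tile; it is illustrative only, not the FlexMIm algorithm:

```python
def sharpness_score(tile):
    """Mean squared difference between horizontally and vertically adjacent
    pixels of a grayscale tile (list of rows). Sharp tiles have strong local
    gradients and score high; defocused tiles score near zero."""
    h, w = len(tile), len(tile[0])
    total, count = 0.0, 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:  # horizontal neighbour
                total += (tile[y][x + 1] - tile[y][x]) ** 2
                count += 1
            if y + 1 < h:  # vertical neighbour
                total += (tile[y + 1][x] - tile[y][x]) ** 2
                count += 1
    return total / count
```

    Scoring each tile of a WSI this way is what allows below-threshold regions to be flagged for reacquisition while the slide is still on the scanner.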

  12. Argon Intercalibration Pipette System (APIS): Smoking from the Same Pipe

    NASA Astrophysics Data System (ADS)

    Turrin, B. D.; Swisher, C. C., III; Hemming, S. R.; Renne, P. R.; Deino, A. L.; Hodges, K. V.; Van Soest, M. C.; Heizler, M. T.

    2014-12-01

    40Ar/39Ar age inter-calibration experiments, conducted as part of the US NSF-sponsored EARTHTIME initiative (http://www.earth-time.org) using two of the most commonly used 40Ar/39Ar mineral standards, Fish Canyon (FC, ~28.2 Ma) and Alder Creek (AC, ~1.2 Ma) sanidines, have revealed significant inter-laboratory inconsistencies. The reported ages for the AC sanidines range from 1.173 to 1.200 Ma (FC 28.02) (±~2%), ~4 times greater than the reported precision. These experiments have caused the 40Ar/39Ar community to scrutinize procedures, and several informal lab intercalibrations have been conducted among different labs. This exercise is leading to better agreement, but discrepancies remain that need to be addressed. In an effort to isolate the cause(s) of these inconsistencies, two Argon Inter-calibration Pipette Systems (APIS) were designed and constructed. Each consists of three gas canisters; one contains atmospheric Ar, while the other two contain artificial gas mixtures with 40Ar/39Ar ratios similar to those of FC and AC. Each canister has 4x10-10 moles of 40Ar, is equipped with 0.1, 0.2 and 0.4 cc pipettes, and can deliver gas volumes from 0.1-0.7 cc. All volumes were determined manometrically to 0.4% or better and then filled to uniform pressure with Ar standard gases. This experimental design eliminates sample heterogeneity, leaving only interlaboratory variations in gas purification, data reduction, and isotopic measurement as potential sources of interlaboratory calibration discrepancies. APIS-1 was designated as a traveling unit that is brought to participating labs. APIS-2 is the reserve/master standard. Currently, APIS-1 is in the early stages of its voyage and has been to three labs (Rutgers, LDEO, and New Mexico Tech) as of this writing. The interlaboratory comparisons are ongoing, and will include ASU, BGC, Univ. of Wisconsin, and Oregon State University, plus additional laboratories of opportunity. A progress report will be presented at AGU.

  13. Subsetting and Formatting Landsat-7 L0R ETM+ Data Products

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.

    2000-01-01

    The Landsat-7 Processing System (LPS) processes Landsat-7 Enhanced Thematic Mapper Plus (ETM+) instrument data into large, contiguous segments called "subintervals" and stores them in Level 0R (L0R) data files. The LPS-processed subinterval products must be subsetted and reformatted before the Level 1 processing systems can ingest them. The initial full subintervals produced by the LPS are stored mainly in HDF Earth Observing System (HDF-EOS) format, which is an extension to the Hierarchical Data Format (HDF). The final L0R products are stored in native HDF format. Primarily the EOS Core System (ECS), and alternately the DAAC Emergency System (DES), subset the subinterval data for the operational Landsat-7 data processing systems. The HDF and HDF-EOS application programming interfaces (APIs) can be used for extensive data subsetting and data reorganization. A stand-alone subsetter tool has been developed which is based on some of the DES code. This tool makes use of the HDF and HDF-EOS APIs to perform Landsat-7 L0R product subsetting and demonstrates how HDF and HDF-EOS can be used for creating various configurations of full L0R products. How these APIs can be used to efficiently subset, format, and organize Landsat-7 L0R data, as demonstrated by the subsetter tool and the DES, is discussed.
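
    Conceptually, subsetting a subinterval means selecting bands and a scan-line range from a hierarchical file. The sketch below uses a dict of lists as a stand-in for HDF-EOS groups and datasets; the real subsetter works through the HDF and HDF-EOS C APIs, and the names here are ours:

```python
def subset_subinterval(subinterval, bands, scan_slice):
    """Copy only the requested bands, restricted to a range of scan lines,
    out of a dict-of-row-lists stand-in for a subinterval file."""
    return {name: rows[scan_slice]
            for name, rows in subinterval.items()
            if name in bands}

# Illustrative subinterval: two bands of four scan lines each.
demo = {"Band1": [[1, 2], [3, 4], [5, 6], [7, 8]],
        "Band2": [[9, 9], [8, 8], [7, 7], [6, 6]]}
```

    The same select-then-copy pattern, applied per HDF-EOS swath and dataset, is what produces the various configurations of L0R products.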

  14. Tracking changes of river morphology in Ayeyarwady River in Myanmar using earth observations and surface water mapping tool

    NASA Astrophysics Data System (ADS)

    Piman, T.; Schellekens, J.; Haag, A.; Donchyts, G.; Apirumanekul, C.; Hlaing, K. T.

    2017-12-01

    River morphology change is one of the key issues in the Ayeyarwady River in Myanmar, causing impacts on navigation, riverine habitats, agricultural lands, communities and livelihoods near the banks of the river. This study aims to track the changes in river morphology in the middle reach of the Ayeyarwady River over the last 30 years (1984-2014) to improve understanding of riverbank dynamics and erosion and deposition processes. Earth observations including Landsat-7, Landsat-8, Digital Elevation Models from SRTM Plus and ASTER-2, Google Maps and OpenStreetMap were obtained for the study. GIS and remote sensing tools were used to analyze changes in river morphology, while a surface water mapping tool was applied to determine the dynamic behaviour of surface water and the effect of river morphology changes. The tool consists of two components: (1) a Google Earth Engine (GEE) JavaScript or Python application that performs image analysis, and (2) a user-friendly site/app using Google's appspot.com that exposes the application to the users. The results of this study show that the fluvial morphology in the middle reach of the Ayeyarwady River is continuously changing under the influence of high water flows, particularly from extreme flood events, and land use change from mining and deforestation. It was observed that some meandering sections of the riverbank were straightened, which moved sediment downstream and created new sections of meandering riverbank. Several large islands have formed, stabilized by vegetation and reinforced by sedimentation, while many small bars formed and migrated dynamically due to changes in water levels and flow velocity in the wet and dry seasons. The main channel changed to a secondary channel in some sections of the river, resulting in a constant shift of the navigation route. We also found that some villages were facing riverbank erosion, which can force villagers to relocate.
The study results demonstrated that the products from earth observations and the surface water mapping tool can detect dynamic changes of river morphology in the Ayeyarwady River. This information is useful to support navigation and riverbank protection planning and to formulate mitigation measures for local communities that are affected by riverbank erosion.

  15. Addressing key concepts in physical geography through interactive learning activities in an online geo-ICT environment

    NASA Astrophysics Data System (ADS)

    Verstraeten, Gert; Steegen, An; Martens, Lotte

    2016-04-01

    The increasing number of geospatial datasets and free online geo-ICT tools offers new opportunities for education in Earth Sciences. Geospatial technology indeed provides an environment through which interactive learning can be introduced in Earth Sciences curricula. However, the effectiveness of such e-learning approaches in terms of learning outcomes has rarely been addressed. Here, we present our experience with the implementation of digital interactive learning activities within an introductory Physical Geography course attended by 90 undergraduate students in Geography, Geology, Biology and Archaeology. Two traditional lectures were replaced by interactive sessions (each 2 h) in a flexible classroom where students had to work both in teams and individually in order to explore some key concepts through the integrated use of geospatial data within Google Earth™. A first interactive lesson dealt with the classification of river systems and aimed to examine the conditions under which rivers tend to meander or to develop a braided pattern. Students were required to collect properties of rivers (river channel pattern, channel slope, climate, discharge, lithology, vegetation, etc.). All these data are available on a global scale and were added as separate map layers in Google Earth™. Each student collected data for at least two rivers and added this information to a Google Drive spreadsheet accessible to the entire group. This resulted in a database of more than one hundred rivers spread over various environments worldwide. In a second phase, small groups of students discussed the potential relationships between river channel pattern and its controlling factors. Afterwards, the findings of each discussion group were presented to the entire audience. The same set-up was followed in a second interactive session to explore spatial variations in ecosystem properties such as net primary production and soil carbon content.
The qualitative evaluation of both interactive sessions showed that the majority of students perceived them as very useful and inspiring. Students were more capable of exploring the spatial linkages between various environmental variables and processes than in traditional lectures. Furthermore, the format of the sessions offered a forum in which undergraduate students from a variety of disciplines discussed the learning content in mixed groups. The success of interactive learning activities, however, strongly depends on the quality of the educational infrastructure (flexible spaces, wireless connections with sufficient broadband capacity).

  16. A Scalable Infrastructure for Lidar Topography Data Distribution, Processing, and Discovery

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Krishnan, S.; Phan, M.; Cowart, C. A.; Arrowsmith, R.; Baru, C.

    2010-12-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology have emerged as a fundamental tool in the Earth sciences, and are also being widely utilized for ecological, planning, engineering, and environmental applications. Collected from airborne, terrestrial, and space-based platforms, these data are revolutionary because they permit analysis of geologic and biologic processes at resolutions essential for their appropriate representation. Public domain lidar data collected by federal, state, and local agencies are a valuable resource to the scientific community; however, the data pose significant distribution challenges because of the volume and complexity of data that must be stored, managed, and processed. Lidar data acquisition may generate terabytes of data in the form of point clouds, digital elevation models (DEMs), and derivative products. This massive volume of data is often challenging to host for resource-limited agencies. Furthermore, these data can be technically challenging for users who lack appropriate software, computing resources, and expertise. The National Science Foundation-funded OpenTopography Facility (www.opentopography.org) has developed a cyberinfrastructure-based solution to enable online access to Earth science-oriented high-resolution lidar topography data, online processing tools, and derivative products. OpenTopography provides access to terabytes of point cloud data, standard DEMs, and Google Earth image data, all co-located with computational resources for on-demand data processing. The OpenTopography portal is built upon a cyberinfrastructure platform that utilizes a Service-Oriented Architecture (SOA) to provide a modular system that is highly scalable and flexible enough to support the growing needs of the Earth science lidar community.
OpenTopography strives to host and provide access to datasets as soon as they become available, and also to expose greater application-level functionality to end-users (such as generation of custom DEMs via various gridding algorithms, and hydrological modeling algorithms). In the future, the SOA will enable direct authenticated access to back-end functionality through simple Web service Application Programming Interfaces (APIs), so that users may access data and compute resources via clients other than Web browsers. In addition to an overview of the OpenTopography SOA, this presentation will discuss our recently developed lidar data ingestion and management system for point cloud data delivered in the binary LAS standard. This system complements our existing partitioned database approach for data delivered in ASCII format, and permits rapid ingestion of data. The system has significantly reduced data ingestion times and has implications for data distribution in emergency response situations. We will also address ongoing work to develop a community lidar metadata catalog based on the OGC Catalogue Service for the Web (CSW) standard, which will help to centralize discovery of public domain lidar data.

  17. Post-Nor'Ida coastal oblique aerial photographs collected from Ocean City, Maryland, to Hatteras, North Carolina, December 4, 2009

    USGS Publications Warehouse

    Morgan, Karen L. M.; Krohn, M. Dennis; Guy, Kristy K.

    2015-01-01

    In addition to the photographs, a Google Earth Keyhole Markup Language (KML) file is provided and can be used to view the images by clicking on the marker and then clicking on either the thumbnail or the link above the thumbnail. The KML files were created using the photographic navigation files.

  18. Baseline Coastal Oblique Aerial Photographs Collected from Navarre Beach, Florida, to Breton Island, Louisiana, September 1, 2014

    USGS Publications Warehouse

    Morgan, Karen L. M.

    2015-08-31

    In addition to the photographs, a Google Earth Keyhole Markup Language (KML) file is provided and can be used to view the images by clicking on the marker and then clicking on either the thumbnail or the link above the thumbnail. The KML files were created using the photographic navigation files.

  19. Pinpointing Watershed Pollution on a Virtual Globe

    ERIC Educational Resources Information Center

    Saunders, Cheston; Taylor, Amy

    2014-01-01

    Pollution is not a problem we just read about anymore. It affects the air we breathe, the land we live on, and the water we consume. After noticing a lack of awareness among students, the authors developed a lesson that used Google Earth to pinpoint sources of pollution in the local area and in others across the country, and their effects on the surrounding…

  20. Automatic building detection based on Purposive FastICA (PFICA) algorithm using monocular high resolution Google Earth images

    NASA Astrophysics Data System (ADS)

    Ghaffarian, Saman; Ghaffarian, Salar

    2014-11-01

    This paper proposes an improved FastICA model, named Purposive FastICA (PFICA), initialized by a simple color-space transformation and a novel masking approach, to automatically detect buildings from high-resolution Google Earth imagery. ICA and FastICA algorithms are Blind Source Separation (BSS) techniques for unmixing source signals using reference data sets. To overcome the limitations of the ICA and FastICA algorithms and make them purposeful, we developed a novel method involving three main steps: (1) improving the FastICA algorithm using the Moore-Penrose pseudo-inverse matrix model; (2) automated seeding of the PFICA algorithm based on the LUV color space and simple proposed rules that split the image into three regions (shadow + vegetation, bare soil + roads, and buildings, respectively); and (3) masking out the final building detection results from the PFICA outputs using the K-means clustering algorithm with two clusters, followed by simple morphological operations to remove noise. Evaluation of the results shows that buildings detected from dense and suburban districts with diverse characteristics and color combinations using the proposed method achieve 88.6% and 85.5% overall pixel-based and object-based precision, respectively.
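
    The final masking step, K-means clustering with two clusters applied to the PFICA output, can be illustrated with a minimal two-cluster K-means in NumPy. This is only a stand-in run on a toy image, not the authors' implementation, and the morphological clean-up is omitted:

```python
import numpy as np

def kmeans_2cluster_mask(scores, iters=20):
    """Split pixel scores into two clusters (plain Lloyd iterations)
    and return a boolean mask for the higher-valued cluster, as an
    illustrative stand-in for the building/non-building masking step."""
    flat = scores.ravel().astype(float)
    # Deterministic initialization at the extremes of the value range.
    centers = np.array([flat.min(), flat.max()])
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute centers.
        labels = np.abs(flat[:, None] - centers[None, :]).argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = flat[labels == k].mean()
    high = centers.argmax()
    return (labels == high).reshape(scores.shape)

# Toy "PFICA output": a bright 3x3 blob on a dark background.
img = np.zeros((8, 8))
img[2:5, 2:5] = 1.0
mask = kmeans_2cluster_mask(img)
```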

  1. Assessment of rainwater harvesting potential using GIS

    NASA Astrophysics Data System (ADS)

    Hari, Durgasrilakshmi; Ramamohan Reddy, K.; Vikas, Kola; Srinivas, N.; Vikas, G.

    2018-03-01

    Rainwater harvesting (RWH) is one of the best practices to overcome the scarcity of water. Rainwater harvesting involves the local collection and storage of rainwater, through different technologies, for future use. It is also useful for livestock, groundwater recharge and irrigation practices. The potential of rainwater harvesting refers to the capacity of an individual catchment to harness the water falling on it during a particular year, considering all rainy days. The present study deals with the identification of the study area boundary and marking it as a polygon in Google Earth Pro. Later, the rooftops of various house entities and the roads were digitized using the Polygon command in Google Earth Pro. The GIS technique is employed for locating the boundaries of the study area and for calculating the areas of the various types of rooftops and roads. With the application of GIS, it is possible to assess the total potential of water that can be harvested. The present study will enable us to identify the suitable type of water harvesting structure along with the number of structures required, and offers an ideal and effective way to address the water crisis through water conservation in the study area.
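
    The rooftop areas digitized in Google Earth Pro feed a standard harvest-potential calculation. A sketch using the common area x rainfall x runoff-coefficient formula (the coefficient value below is illustrative, and this is not necessarily the study's exact method):

```python
def rwh_potential_litres(area_m2, annual_rainfall_mm, runoff_coeff):
    """Harvestable volume (litres) for one catchment:
    area (m^2) x rainfall (mm) x runoff coefficient.
    1 mm of rain falling on 1 m^2 equals 1 litre."""
    return area_m2 * annual_rainfall_mm * runoff_coeff

# Example: a 120 m^2 hard rooftop, 800 mm annual rainfall, and a
# runoff coefficient of 0.8 (a commonly quoted value for hard roofs).
volume = rwh_potential_litres(120, 800, 0.8)  # 76,800 litres
```

    Summing this quantity over all digitized rooftops gives the total potential for the study area, which in turn sizes the number and type of harvesting structures.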

  2. Mapping of Sample Collection Data: GIS Tools for the Natural Product Researcher

    PubMed Central

    Oberlies, Nicholas H.; Rineer, James I.; Alali, Feras Q.; Tawaha, Khaled; Falkinham, Joseph O.; Wheaton, William D.

    2009-01-01

    Scientists engaged in the research of natural products often either conduct field collections themselves or collaborate with partners who do, such as botanists, mycologists, or SCUBA divers. The information gleaned from such collecting trips (e.g., longitude/latitude coordinates, geography, elevation, and a multitude of other field observations) has provided valuable data to the scientific community (e.g., biodiversity), even if it is tangential to the direct aims of the natural products research, which is often focused on drug discovery and/or chemical ecology. Geographic Information Systems (GIS) have been used to display, manage, and analyze geographic data, including collection sites for natural products. However, to the uninitiated, these tools are often beyond the financial and/or computational means of the natural product scientist. With new, free, and easy-to-use geospatial visualization tools, such as Google Earth, mapping and geographic imaging of sampling data are now within the reach of natural products scientists. The goals of the present study were to develop simple tools that are tailored for the natural products setting, thereby presenting a means to map such information, particularly via free software like Google Earth. PMID:20161345

  3. The CRUTEM4 land-surface air temperature data set: construction, previous versions and dissemination via Google Earth

    NASA Astrophysics Data System (ADS)

    Osborn, T. J.; Jones, P. D.

    2014-02-01

    The CRUTEM4 (Climatic Research Unit Temperature, version 4) land-surface air temperature data set is one of the most widely used records of the climate system. Here we provide an important additional dissemination route for this data set: online access to monthly, seasonal and annual data values and time series graphs via Google Earth. This is achieved via an interface written in Keyhole Markup Language (KML) and also provides access to the underlying weather station data used to construct the CRUTEM4 data set. A mathematical description of the construction of the CRUTEM4 data set (and its predecessor versions) is also provided, together with an archive of some previous versions and a recommendation for identifying the precise version of the data set used in a particular study. The CRUTEM4 data set used here is available from doi:10.5285/EECBA94F-62F9-4B7C-88D3-482F2C93C468.

  4. A virtual, interactive and dynamic excursion in Google Earth on soil management and conservation (AgroGeovid)

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, Tom; Giráldez, Juan Vicente

    2013-04-01

    Many courses on natural resources require hands-on practical knowledge and experience that students traditionally could only acquire through expensive and time-consuming field excursions. New technologies and social media, however, provide an interesting alternative to train students and help them improve their practical knowledge. AgroGeovid is a virtual excursion, based on Google Earth, Youtube, Facebook and Twitter, that is aimed at agricultural engineering students but equally useful for any student interested in soil management and conservation, e.g. in geography, geology and environmental resources. AgroGeovid provides the framework for teachers and students to upload geotagged photos, comments and discussions. After the initial startup phase, in which the teacher uploaded material on e.g. soil erosion phenomena, soil conservation structures and different soil management strategies under different agronomic systems, students contributed their own material gathered throughout the academic year. All students chose to contribute via Facebook instead of Twitter, which most of them did not know. The final result was a visual and dynamic tool that students could use to train and perfect skills acquired in the classroom, using case studies and examples from their immediate environment.

  5. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization.

    PubMed

    Bernal-Rusiel, Jorge L; Rannou, Nicolas; Gollub, Randy L; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution.
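
    The abstract describes the collaborative data model as a small JSON representation of the renderers' state. A sketch of what serializing and merging such a state might look like; the field names here are hypothetical, not the actual XTK/MedView schema:

```python
import json

# Hypothetical shared-viewer-state fields; the real XTK renderer state
# schema is not specified in the abstract.
def serialize_state(camera_position, slice_index, window_level):
    """Pack the shared viewer state into a compact JSON payload for sync."""
    return json.dumps({
        "camera": camera_position,     # [x, y, z] camera position
        "slice": slice_index,          # currently displayed slice
        "windowLevel": window_level,   # [level, width] display window
    }, separators=(",", ":"))

def apply_state(local_state, payload):
    """Merge a remote peer's state update into the local viewer state."""
    local_state.update(json.loads(payload))
    return local_state

payload = serialize_state([0.0, 0.0, 250.0], 42, [40, 400])
viewer = apply_state({"slice": 0}, payload)
```

    Keeping the synchronized object this small is what makes real-time propagation of every camera or slice change cheap enough for interactive collaboration.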

  6. Global Coastal and Marine Spatial Planning (CMSP) from Space Based AIS Ship Tracking

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.; Foulkes, J. A.; Lorenzini, D.; Kanawati, M.

    2011-12-01

    All nations need to develop long-term integrated strategies for how to use and preserve our natural resources. As a part of these strategies, we must evaluate how communities of users react to changes in the rules and regulations of ocean use. Global characterization of the vessel traffic on our Earth's oceans is essential to understanding existing uses and to developing international Coastal and Marine Spatial Planning (CMSP). Ship traffic within 100-200 km of shore is beginning to be effectively covered at low latitudes by ground-based receivers collecting position reports from the maritime Automatic Identification System (AIS). Unfortunately, remote islands, high latitudes, and open-ocean Marine Protected Areas (MPA) are not covered by these ground systems. Deploying enough autonomous airborne (UAV) and surface (USV) vessels and buoys to provide adequate coverage is a difficult task: while individual device costs are plummeting, a large fleet of AIS receivers is expensive to maintain. The global AIS coverage from SpaceQuest's low Earth orbit satellite receivers, combined with the visualization and data storage infrastructure of Google (e.g. Maps, Earth, and Fusion Tables), provides a platform that enables researchers and resource managers to begin answering the question of how ocean resources are being utilized. Near real-time vessel traffic data will allow managers of marine resources to understand how changes to education, enforcement, rules, and regulations alter usage and compliance patterns. We will demonstrate the potential of this system using a sample SpaceQuest data set processed with libais, which stores the results in a Fusion Table. From there, the data are imported into PyKML and visualized in Google Earth with a custom gx:Track visualization utilizing KML's extended data functionality to facilitate ship track interrogation. Analysts can then annotate and discuss vessel tracks in Fusion Tables.
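
    A KML gx:Track pairs a sequence of <when> timestamps with matching <gx:coord> positions, which is what lets Google Earth animate a vessel's movement. A minimal sketch of emitting such a track for one vessel (the position data below are invented, not SpaceQuest data):

```python
# Sketch: emit a KML <gx:Track> from timestamped AIS position reports,
# in the spirit of the pipeline described above (libais -> Fusion
# Tables -> PyKML -> Google Earth).
def gx_track_kml(fixes):
    """fixes: list of (iso_time, lon, lat) tuples for one vessel.
    Timestamps and coordinates are emitted in parallel order, as
    required by the gx:Track element."""
    whens = "".join(f"<when>{t}</when>" for t, _, _ in fixes)
    coords = "".join(f"<gx:coord>{lon} {lat} 0</gx:coord>"
                     for _, lon, lat in fixes)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2" '
        'xmlns:gx="http://www.google.com/kml/ext/2.2">'
        f"<Placemark><gx:Track>{whens}{coords}</gx:Track></Placemark></kml>"
    )

fixes = [("2011-06-01T00:00:00Z", -158.0, 21.3),
         ("2011-06-01T00:10:00Z", -158.1, 21.4)]
kml = gx_track_kml(fixes)
```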

  7. Evolution of errors in the altimetric bathymetry model used by Google Earth and GEBCO

    NASA Astrophysics Data System (ADS)

    Marks, K. M.; Smith, W. H. F.; Sandwell, D. T.

    2010-09-01

    We analyze errors in the global bathymetry models of Smith and Sandwell that combine satellite altimetry with acoustic soundings and shorelines to estimate depths. Versions of these models have been incorporated into Google Earth and the General Bathymetric Chart of the Oceans (GEBCO). We use Japan Agency for Marine-Earth Science and Technology (JAMSTEC) multibeam surveys not previously incorporated into the models as "ground truth" to compare against model versions 7.2 through 12.1, defining vertical differences as "errors." Overall error statistics improve over time: 50th percentile errors declined from 57 to 55 to 49 m, and 90th percentile errors declined from 257 to 235 to 219 m, in versions 8.2, 11.1 and 12.1. This improvement is partly due to an increasing number of soundings incorporated into successive models, and partly to improvements in the satellite gravity model. Inspection of specific sites reveals that changes in the algorithms used to interpolate across survey gaps with altimetry have affected some errors. Versions 9.1 through 11.1 show a bias in the scaling from gravity in milliGals to topography in meters that affected the 15-160 km wavelength band. Regionally averaged (>160 km wavelength) depths have accumulated error over successive versions 9 through 11. These problems have been mitigated in version 12.1, which shows no systematic variation of errors with depth. Even so, version 12.1 is in some respects not as good as version 8.2, which employed a different algorithm.
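
    The percentile error statistics quoted above can be reproduced for any model-versus-survey comparison with NumPy. A sketch with invented depth values (not JAMSTEC data):

```python
import numpy as np

def error_percentiles(model_depths, survey_depths, pcts=(50, 90)):
    """Percentiles of absolute model-minus-'ground truth' depth
    differences, the statistic used to score the bathymetry models."""
    err = np.abs(np.asarray(model_depths, float)
                 - np.asarray(survey_depths, float))
    return np.percentile(err, pcts)

# Synthetic illustration: five model depths vs. multibeam 'truth' (m).
model = [4000, 4100, 3950, 5200, 4400]
truth = [4050, 4080, 4000, 5000, 4395]
p50, p90 = error_percentiles(model, truth)
```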

  8. Dynamic Server-Based KML Code Generator Method for Level-of-Detail Traversal of Geospatial Data

    NASA Technical Reports Server (NTRS)

    Baxes, Gregory; Mixon, Brian; Linger, TIm

    2013-01-01

    Web-based geospatial client applications such as Google Earth and NASA World Wind must listen to data requests, access appropriate stored data, and compile a data response to the requesting client application. This process occurs repeatedly to support multiple client requests and application instances. Newer Web-based geospatial clients also provide user-interactive functionality that is dependent on fast and efficient server responses. With massively large datasets, server-client interaction can become severely impeded because the server must determine the best way to assemble data to meet the client application's request. In client applications such as Google Earth, the user interactively wanders through the data using visually guided panning and zooming actions. With these actions, the client application continually issues data requests to the server without knowledge of the server's data structure or extraction/assembly paradigm. A method has been developed for efficiently controlling the networked access of a Web-based geospatial browser to server-based datasets, in particular massively sized datasets. The method specifically uses the Keyhole Markup Language (KML), an Open Geospatial Consortium (OGC) standard used by Google Earth and other KML-compliant geospatial client applications. The innovation is based on establishing a dynamic cascading KML strategy that is initiated by a KML launch file provided by a data server host to a Google Earth or similar KML-compliant geospatial client application user. Upon execution, the launch KML code issues a request for image data covering an initial geographic region. The server responds with the requested data along with subsequent dynamically generated KML code that directs the client application to make follow-on requests for higher level-of-detail (LOD) imagery to replace the initial imagery as the user navigates into the dataset.
The approach provides an efficient data traversal path and mechanism that can be flexibly established for any dataset regardless of size or other characteristics. The method yields significant improvements in user-interactive geospatial client and data server interaction and associated network bandwidth requirements. The innovation uses a C- or PHP-code-like grammar that provides a high degree of processing flexibility. A set of language lexer and parser elements is provided that offers a complete language grammar for writing and executing language directives. A script is wrapped and passed to the geospatial data server by a client application as a component of a standard KML-compliant statement. The approach provides an efficient means for a geospatial client application to request server preprocessing of data prior to client delivery. Data are structured in a quadtree format. As the user zooms into the dataset, geographic regions are subdivided into four child regions. Conversely, as the user zooms out, four child regions collapse into a single, lower-LOD region.
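
    The quadtree behaviour described above can be sketched as a simple region-subdivision function. The bounds convention here is assumed for illustration and is not taken from the innovation itself:

```python
# Sketch of the quadtree step: as the user zooms in, a geographic
# region splits into four children (and collapses back on zoom-out).
# Region bounds are (west, south, east, north) in degrees.
def subdivide(region):
    """Return the four child regions of a (west, south, east, north) box."""
    west, south, east, north = region
    mid_lon = (west + east) / 2.0
    mid_lat = (south + north) / 2.0
    return [
        (west, mid_lat, mid_lon, north),   # NW child
        (mid_lon, mid_lat, east, north),   # NE child
        (west, south, mid_lon, mid_lat),   # SW child
        (mid_lon, south, east, mid_lat),   # SE child
    ]

# Splitting the whole globe yields the four hemispheric quadrants.
children = subdivide((-180.0, -90.0, 180.0, 90.0))
```

    In the cascading-KML scheme, each child region would be served as a KML NetworkLink whose higher-LOD imagery loads only when the user zooms into that quadrant.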

  9. GIS Database and Google Map of the Population at Risk of Cholangiocarcinoma in Mueang Yang District, Nakhon Ratchasima Province of Thailand.

    PubMed

    Kaewpitoon, Soraya J; Rujirakul, Ratana; Joosiri, Apinya; Jantakate, Sirinun; Sangkudloa, Amnat; Kaewthani, Sarochinee; Chimplee, Kanokporn; Khemplila, Kritsakorn; Kaewpitoon, Natthawut

    2016-01-01

    Cholangiocarcinoma (CCA) is a serious problem in Thailand, particularly in the northeastern and northern regions. A database of the population at risk is required for monitoring, surveillance, home health care, and home visits. Therefore, this study aimed to develop a geographic information system (GIS) database and Google map of the population at risk of CCA in Mueang Yang district, Nakhon Ratchasima province, northeastern Thailand, during June to October 2015. Populations at risk were screened using the Korat CCA verbal screening test (KCVST). Software included Microsoft Excel, ArcGIS, and Google Maps. The secondary data, including village points, sub-district boundaries, district boundaries, and the hospital point in Mueang Yang district, were used to create the spatial database. The populations at risk for CCA and opisthorchiasis were used to create an attribute database. Data were transformed to WGS84 UTM Zone 48. After the conversion, all of the data were imported into Google Earth using the online web page www.earthpoint.us. Some 222 of the 4,800 people at risk for CCA constituted a high-risk group. The geo-visual display is available at www.google.com/maps/d/u/0/edit?mid=zPxtcHv_iDLo.kvPpxl5mAs90&hl=th. The geo-visual display comprises 5 layers: layer 1, village locations and the number of the population at risk for CCA; layer 2, sub-district health promotion hospitals in Mueang Yang district and the number of opisthorchiasis cases; layer 3, sub-districts and the number of the population at risk for CCA; layer 4, the district hospital and the numbers of the population at risk for CCA and of opisthorchiasis cases; and layer 5, the district and the numbers of the population at risk for CCA and of opisthorchiasis cases. This GIS database and Google map production process is suitable for further monitoring, surveillance, and home health care for CCA sufferers.

  10. Copernicus Big Data and Google Earth Engine for Glacier Surface Velocity Field Monitoring: Feasibility Demonstration on San Rafael and San Quintin Glaciers

    NASA Astrophysics Data System (ADS)

    Di Tullio, M.; Nocchi, F.; Camplani, A.; Emanuelli, N.; Nascetti, A.; Crespi, M.

    2018-04-01

    Glaciers are a natural global resource and one of the principal climate change indicators at global and local scales, being influenced by changes in temperature and snow precipitation. Among the parameters used for glacier monitoring, surface velocity is a key element, since it is connected to glacier changes (mass balance, hydro balance, glacier stability, landscape erosion). The leading idea of this work is to continuously retrieve glacier surface velocity using free ESA Sentinel-1 SAR imagery and to exploit the potential of the Google Earth Engine (GEE) platform. GEE was recently released by Google as a platform for petabyte-scale scientific analysis and visualization of geospatial datasets. The SAR offset-tracking algorithm developed at the Geodesy and Geomatics Division of the University of Rome La Sapienza has been integrated into a cloud-based platform that automatically processes large stacks of Sentinel-1 data to retrieve glacier surface velocity field time series. We processed about 600 Sentinel-1 image pairs to obtain a continuous time series of velocity field measurements over 3 years, from January 2015 to January 2018, for two wide glaciers located in the Northern Patagonian Ice Field (NPIF), the San Rafael and San Quintin glaciers. Several results for these glaciers, validated against already available and renowned software (i.e. ESA SNAP, CIAS) and against optical sensor measurements (i.e. Landsat 8), highlight the potential of Big Data analysis to automatically monitor glacier surface velocity fields at global scale, exploiting the synergy between GEE and Sentinel-1 imagery.
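
    Offset tracking estimates surface displacement by finding the shift that best matches an image patch between two co-registered acquisitions. The Sapienza algorithm itself is not described in this abstract; the following is only a brute-force correlation sketch of the general idea:

```python
import numpy as np

def offset_track(ref, sec, max_shift=3):
    """Brute-force patch offset: slide the secondary image over
    +/- max_shift pixels and return the (dy, dx) shift that maximizes
    correlation with the reference image's central window.
    Illustrative only; real offset tracking uses normalized
    cross-correlation over oversampled SAR patches."""
    n = max_shift
    core = ref[n:-n, n:-n]                  # central reference window
    best, best_score = (0, 0), -np.inf
    for dy in range(-n, n + 1):
        for dx in range(-n, n + 1):
            win = sec[n + dy: sec.shape[0] - n + dy,
                      n + dx: sec.shape[1] - n + dx]
            score = np.sum(core * win)      # unnormalized correlation
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

# Toy test: a bright feature that moved 2 px down and 1 px right
# between the two acquisitions.
ref = np.zeros((12, 12)); ref[4, 5] = 1.0
sec = np.zeros((12, 12)); sec[6, 6] = 1.0
dy, dx = offset_track(ref, sec)
```

    Dividing the recovered pixel offset by the time separation of the image pair converts it to a surface velocity, which is the quantity mapped over the NPIF glaciers.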

  11. Integrating and Visualizing Tropical Cyclone Data Using the Real Time Mission Monitor

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Blakeslee, Richard; Conover, Helen; Hall, John; He, Yubin; Regner, Kathryn

    2009-01-01

    The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the NASA Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM is extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, scientists, and managers appreciate the contributions that RTMM makes to their flight projects. A broad spectrum of interdisciplinary scientists used RTMM during field campaigns including the hurricane-focused 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 NOAA-NASA Aerosonde Hurricane Noel flight, 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), plus a soil moisture (SMAP-VEX) and two arctic research experiments (ARCTAS) in 2008. Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. Individual flight legs are automatically calculated "on the fly". The resultant flight plan is then immediately posted to the Google Earth-based RTMM for interested scientists to view the planned flight track and subsequently compare it to the actual real time flight progress. 
We are planning additional capabilities to RTMM including collaborations with the Jet Propulsion Laboratory in the joint development of a Tropical Cyclone Integrated Data Exchange and Analysis System (TC IDEAS) which will serve as a web portal for access to tropical cyclone data, visualizations and model output.

  12. KAGLVis - On-line 3D Visualisation of Earth-observing-satellite Data

    NASA Astrophysics Data System (ADS)

    Szuba, Marek; Ameri, Parinaz; Grabowski, Udo; Maatouki, Ahmad; Meyer, Jörg

    2015-04-01

    One of the goals of the Large-Scale Data Management and Analysis project is to provide a high-performance framework facilitating management of data acquired by Earth-observing satellites such as Envisat. On the client-facing facet of this framework, we strive to provide a visualisation and basic analysis tool that can be used by scientists with minimal to no knowledge of the underlying infrastructure. Our tool, KAGLVis, is a JavaScript client-server Web application that leverages modern Web technologies to provide three-dimensional visualisation of satellite observables on a wide range of client systems. It takes advantage of the WebGL API to employ locally available GPU power for 3D rendering; this approach has been demonstrated to perform well even on relatively weak hardware, such as the integrated graphics chipsets found in modern laptop computers, and with some user-interface tuning could even be usable on embedded devices such as smartphones or tablets. Data are fetched from the database back-end using a ReST API and cached locally, both in memory and using HTML5 Web Storage, to minimise network use. Computations, such as the calculation of cloud altitude from cloud-index measurements, can, depending on configuration, be performed on either the client or the server side. Keywords: satellite data, Envisat, visualisation, 3D graphics, Web application, WebGL, MEAN stack.

  13. CEO Sites Mission Management System (SMMS)

    NASA Technical Reports Server (NTRS)

    Trenchard, Mike

    2014-01-01

    Late in fiscal year 2011, the Crew Earth Observations (CEO) team was tasked to upgrade its science site database management tool, which at the time was integrated with the Automated Mission Planning System (AMPS) originally developed for Earth Observations mission planning in the 1980s. Although AMPS had been adapted and was reliably used by CEO for International Space Station (ISS) payload operations support, the database structure was dated, and the compiler required for modifications would not be supported in the Windows 7 64-bit operating system scheduled for implementation the following year. The Sites Mission Management System (SMMS) is now the tool used by CEO to manage a heritage Structured Query Language (SQL) database of more than 2,000 records for Earth science sites. SMMS is a carefully designed and crafted in-house software package with complete and detailed help files available for the user and meticulous internal documentation for future modifications. It was delivered in February 2012 for test and evaluation. Following acceptance, it was implemented for CEO mission operations support in April 2012. The database spans the period from the earliest systematic requests for astronaut photography during the shuttle era to current ISS mission support of the CEO science payload. Besides logging basic image information (site names, locations, broad application categories, and mission requests), the upgraded database management tool now tracks dates of creation, modification, and activation; imagery acquired in response to requests; the status and location of ancillary site information; and affiliations with studies, their sponsors, and collaborators. SMMS was designed to facilitate overall mission planning in terms of site selection and activation and provide the necessary site parameters for the Satellite Tool Kit (STK) Integrated Message Production List Editor (SIMPLE), which is used by CEO operations to perform daily ISS mission planning. 
The CEO team uses the SMMS for three general functions - database queries of content and status, individual site creation and updates, and mission planning. The CEO administrator of the science site database is able to create or modify the content of sites and activate or deactivate them based on the requirements of the sponsors. The administrator supports and implements ISS mission planning by assembling, reporting, and activating mission-specific site selections for management; deactivating sites as requirements are met; and creating new sites, such as International Charter sites for disasters, as circumstances warrant. In addition to the above CEO internal uses, when site planning for a specific ISS mission is complete and approved, the SMMS can produce and export those essential site database elements for the mission into XML format for use by onboard Earth-location systems, such as Worldmap. The design, development, and implementation of the SMMS resulted in a superior database management system for CEO science sites by focusing on the functions and applications of the database alone instead of integrating the database with the multipurpose configuration of the AMPS. Unlike the AMPS, it can function and be modified within the existing Windows 7 environment. The functions and applications of the SMMS were expanded to accommodate more database elements, report products, and a streamlined interface for data entry and review. A particularly elegant enhancement in data entry was the integration of the Google Earth application for the visual display and definition of site coordinates for site areas defined by multiple coordinates. Transfer between the SMMS and Google Earth is accomplished with a Keyhole Markup Language (KML) expression of geographic data (see figures 3 and 4). 
Site coordinates may be entered into the SMMS panel directly for display in Google Earth, or the coordinates may be defined on the Google Earth display as a mouse-controlled polygonal definition and transferred back into the SMMS as KML input. This significantly reduces the possibility of errors in coordinate entries and provides visualization of the scale of the site being defined. CEO now has a powerful tool for managing and defining sites on the Earth's surface as targets for both astronaut photography and other onboard remote sensing systems. It can also record and track results by sponsor, collaborator, or type of study.

  14. Baseline coastal oblique aerial photographs collected from the Virginia/North Carolina border to Montauk Point, New York, October 5-6, 2014

    USGS Publications Warehouse

    Morgan, Karen L. M.

    2015-10-02

    In addition to the photographs, a Google Earth Keyhole Markup Language (KML) file is provided and can be used to view the images by clicking on the marker and then clicking on either the thumbnail or the link above the thumbnail. The KML files were created using the photographic navigation files.

  15. Visualization of Wind Data on Google Earth for the Three-dimensional Wind Field (3DWF) Model

    DTIC Science & Technology

    2012-09-01

    ActiveX components or XPCOM extensions can be used by JavaScript to write data to the local file system. Since there is an inherent risk, it is very...important to only use these types of objects ( ActiveX or XPCOM) from a trusted source in order to minimize the exposure of a computer system to malware

  16. Tactical Level Commander and Staff Toolkit

    DTIC Science & Technology

    2010-01-01

    Sites Geodata.gov (for maps) http://gos2.geodata.gov Google Earth for .mil (United States Army Corps of Engineers (USACE) site) https...the eyes, ears, head, hands, back, and feet. When appropriate, personnel should wear protective lenses, goggles, or face shields . Leaders should...Typical hurricanes are about 300 miles wide, although they can vary considerably. Size is not necessarily an indication of hurricane intensity. The

  17. ODM2 Admin Pilot Project- a Data Management Application for Observations of the Critical Zone.

    NASA Astrophysics Data System (ADS)

    Leon, M.; McDowell, W. H.; Mayorga, E.; Setiawan, L.; Hooper, R. P.

    2017-12-01

    ODM2 Admin is a tool to manage data stored in a relational database using the Observation Data Model 2 (ODM2) information model. Originally developed by the Luquillo Critical Zone Observatory (CZO) to manage a wide range of Earth observations, it has now been deployed at 6 projects: the Catalina Jemez CZO, the Dry Creek Experimental Forest, Au Sable and Manistee River sites managed by Michigan State, Tropical Response to Altered Climate Experiment (TRACE) and the Critical Zone Integrative Microbial Ecology Activity (CZIMEA) EarthCube project; most of these deployments are hosted on a Microsoft Azure cloud server managed by CUAHSI. ODM2 Admin is a web application built on the Python open-source Django framework and available for download from GitHub and DockerHub. It provides tools for data ingestion, editing, QA/QC, data visualization, browsing, mapping and documentation of equipment deployment, methods, and citations. Additional features include the ability to generate derived data values, automatically or manually create data annotations and create datasets from arbitrary groupings of results. Over 22 million time series values for more than 600 time series are being managed with ODM2 Admin across the 6 projects as well as more than 12,000 soil profiles and other measurements. ODM2 Admin links with external identifier systems through DOIs, ORCiDs and IGSNs, so cited works, details about researchers and earth sample meta-data can be accessed directly from ODM2 Admin. This application is part of a growing open source ODM2 application ecosystem under active development. ODM2 Admin can be deployed alongside other tools from the ODM2 ecosystem, including ODM2API and WOFpy, which provide access to the underlying ODM2 data through a Python API and Water One Flow web services.

  18. From Geocaching to Virtual Reality: Technology tools that can transform courses into interactive learning expeditions

    NASA Astrophysics Data System (ADS)

    Moysey, S. M.; Lazar, K.; Boyer, D. M.; Mobley, C.; Sellers, V.

    2016-12-01

Transforming classrooms into active learning environments is a key challenge in introductory-level courses. The technology explosion over the last decade, from the advent of mobile devices to virtual reality, is creating innumerable opportunities to engage students within and outside of traditional classroom settings. In particular, technology can be an effective tool for providing students with field experiences that would otherwise be logistically difficult in large, introductory earth science courses. For example, we have created an integrated platform for mobile devices using readily accessible "off the shelf" components (e.g., Google Apps, Geocaching.com, and Facebook) that allow individual students to navigate to geologically relevant sites, perform and report on activities at these locations, and share their findings through social media by posting "geoselfies". Students compete with their friends on a leaderboard, while earning incentives for completing extracurricular activities in courses. Thus, in addition to being exposed to a wider range of meaningful and accessible geologic field experiences, students also build a greater sense of community and identity within the context of earth science classrooms. Rather than sending students to the field, we can also increasingly bring the field to students in classrooms using virtual reality. Ample mobile platforms are emerging that easily allow for the creation, curation, and viewing of photospheres (i.e., 360° images) with mobile phones and low-cost headsets; Google Street View, Earth, and Expeditions are leading the way in terms of ease of content creation and implementation in the classroom. While these tools are an excellent entry point to show students real-world sites, they currently lack the capacity for students to interact with the environment. 
We have therefore also developed an immersive virtual reality game that allows students to study the geology of the Grand Canyon using their smartphone and Google Cardboard viewer. Students navigate the terrain, collect rock samples, and investigate outcrops using a variety of tests and comparative analyses built into the game narrative. To enhance the realism of the game, real-world samples and outcrops from the Grand Canyon were scanned and embedded within the VR environment.

  19. Mapping for the masses: using free remote sensing data for disaster management

    NASA Astrophysics Data System (ADS)

    Teeuw, R.; McWilliam, N.; Morris, N.; Saunders, C.

    2009-04-01

We examine the uses of free satellite imagery and Digital Elevation Models (DEMs) for disaster management, targeting three data sources: the United Nations Charter on Space and Disasters, Google Earth, and internet-based satellite data archives such as the Global Land Cover Facility (GLCF). The research has assessed SRTM and ASTER DEM data, Landsat TM/ETM+ and ASTER imagery, as well as utilising datasets and basic GIS operations available via Google Earth. As an aid to Disaster Risk Reduction, four sets of maps can be produced from satellite data: (i) Multiple Geohazards: areas prone to slope instability, coastal inundation and fluvial flooding; (ii) Vulnerability: population density, habitation types, land cover types and infrastructure; (iii) Disaster Risk: produced by combining severity scores from (i) and (ii); (iv) Reconstruction: zones of rock/sediment with construction uses; areas of woodland (for fuel/construction); water sources; transport routes; zones suitable for re-settlement. This set of Disaster Risk Reduction maps is ideal for regional (1:50,000 to 1:250,000 scale) planning in low-income countries: more detailed assessments require relatively expensive high-resolution satellite imagery or aerial photography, although Google Earth has a good track record for posting high-resolution imagery of disaster zones (e.g. the 2008 Burma storm surge). The Disaster Risk maps highlight areas of maximum risk to a region's emergency planners and decision makers, enabling various types of public education and other disaster mitigation measures. The Reconstruction map also helps to save lives by facilitating disaster recovery. Several problems have been identified. Access to UN Charter imagery is straightforward after a disaster, but very difficult when assessing pre-disaster indicators; the data supplied also tend to be pre-processed, when some relief agencies would prefer to have raw data. 
The limited and expensive internet access in many developing countries restricts access to archives of free satellite data, such as the GLCF. Finally, data integration, spatial/temporal analysis and map production are all hindered by the high price of most GIS software, making the development of suitable open-source software a priority.
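
    The risk-combination step in (iii) can be sketched as a simple cell-by-cell overlay of the two score layers; the severity scores below are invented, and a multiplicative rule is only one of several possible combination schemes:

    ```python
    # Hypothetical severity scores (1 = low, 5 = high) for two small raster layers.
    hazard = [
        [1, 3, 5],
        [2, 4, 4],
    ]
    vulnerability = [
        [2, 2, 5],
        [1, 3, 2],
    ]

    def combine_risk(hazard, vulnerability):
        """Cell-by-cell disaster risk as the product of hazard and vulnerability."""
        return [[h * v for h, v in zip(hrow, vrow)]
                for hrow, vrow in zip(hazard, vulnerability)]

    risk = combine_risk(hazard, vulnerability)
    ```

    In a GIS this is an ordinary raster-algebra operation; the same logic applies whether the layers come from SRTM-derived slope maps or Landsat-derived land cover.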

  20. Landsat Based Woody Vegetation Loss Detection in Queensland, Australia Using the Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Johansen, K.; Phinn, S. R.; Taylor, M.

    2014-12-01

    Land clearing detection and woody Foliage Projective Cover (FPC) monitoring at the state and national level in Australia has mainly been undertaken by state governments and the Terrestrial Ecosystem Research Network (TERN) because of the considerable expense, expertise, sustained duration of activities and staffing levels needed. Only recently have services become available, providing low budget, generalized access to change detection tools suited to this task. The objective of this research was to examine if a globally available service, Google Earth Engine Beta, could be used to predict woody vegetation loss with accuracies approaching the methods used by TERN and the government of the state of Queensland, Australia. Two change detection approaches were investigated using Landsat Thematic Mapper time series and the Google Earth Engine Application Programming Interface: (1) CART and Random Forest classifiers; and (2) a normalized time series of Foliage Projective Cover (FPC) and NDVI combined with a spectral index. The CART and Random Forest classifiers produced high user's and producer's mapping accuracies of clearing (77-92% and 54-77%, respectively) when detecting change within epochs for which training data were available, but extrapolation to epochs without training data reduced the mapping accuracies. The use of FPC and NDVI time series provided a more robust approach for calculation of a clearing probability, as it did not rely on training data but instead on the difference of the normalized FPC / NDVI mean and standard deviation of a single year at the change point in relation to the remaining time series. However, the FPC and NDVI time series approach represented a trade-off between user's and producer's accuracies. Both change detection approaches explored in this research were sensitive to ephemeral greening and drying of the landscape. 
However, the developed normalized FPC and NDVI time series approach can be tuned to provide automated alerts for large woody vegetation clearing events by selecting suitable thresholds to identify very likely clearing. This research provides a comprehensive foundation to build further capacity to use globally accessible, free, online image datasets and processing tools to accurately detect woody vegetation clearing in an automated and rapid manner.
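
    The normalized time series idea (flagging a year whose FPC or NDVI departs strongly from the rest of the series) can be sketched as follows; the NDVI values and the threshold are illustrative, not the paper's calibrated parameters:

    ```python
    from statistics import mean, stdev

    # Hypothetical annual mean NDVI for one pixel; the sharp drop in the final
    # year mimics a woody vegetation clearing event.
    ndvi = [0.62, 0.60, 0.64, 0.61, 0.63, 0.35]

    def clearing_score(series, year_index):
        """Drop of one year below the rest of the series, in standard deviations."""
        rest = series[:year_index] + series[year_index + 1:]
        return (mean(rest) - series[year_index]) / stdev(rest)

    def likely_clearing(series, year_index, threshold=3.0):
        """Flag a very likely clearing when the drop exceeds the threshold."""
        return clearing_score(series, year_index) > threshold
    ```

    Because the score is computed from the series itself, no training data are needed, which is the robustness advantage the abstract describes; the threshold choice is the trade-off between user's and producer's accuracies.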

  1. Development of Visualizations and Loggable Activities for the Geosciences. Results from Recent TUES Sponsored Projects

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Bailey, J. E.; Whitmeyer, S. J.

    2012-12-01

Our TUES research centers on the role of digital data, visualizations, animations, and simulations in undergraduate geoscience education. Digital hardware (smartphones, tablets, GPS units, GigaPan robotic camera mounts, etc.) is revolutionizing field data collection. Software products (GIS, 3-D scanning and modeling programs, virtual globes, etc.) have truly transformed the way geoscientists teach, learn, and do research. Whilst Google-Earth-style visualizations are famously user-friendly for the person browsing, they can be notoriously unfriendly for the content creator. Therefore, we developed tools to help educators create and share visualizations as easily as if posting on Facebook. Anyone who wishes to display geological cross sections on Google Earth can go to digitalplanet.org, upload image files, position them on a line of section, and share them with the world through our KMZ hosting service. Other tools facilitate screen overlay and 3-D map symbol generation. We advocate use of such technology to enable undergraduate students to 'publish' their first mapping efforts even while they are working in the field. A second outcome of our TUES projects merges Second-Life-style interaction with Google Earth. We created games in which students act as first responders for natural hazard mitigation, prospectors for natural resource exploration, and structural geologists for map-making. Students are represented by avatars and collaborate by exchange of text messages - the natural mode of communication for the current generation. Teachers view logs showing student movements as well as transcripts of text messages, and can scaffold student learning and geofence students to prevent wandering. Early results of in-class testing show positive learning outcomes. The third aspect of our program emphasizes dissemination. Experience shows that great effort is required to overcome activation energy and ensure adoption of new technology into the curriculum. 
We organized a GSA Penrose Conference, a GSA Pardee Keynote Symposium, an AGU Town Hall meeting, and numerous workshops at annual and regional meetings, and set up a web site dedicated to dissemination of program products. Future plans include development of augmented reality teaching resources, hosting of community mapping services, and creation of a truly 4-D virtual globe.
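
    A minimal sketch of the kind of KML a cross-section hosting service might emit for a positioned image; the real service supports true vertical sections (which plain KML ground overlays cannot represent), and the name, URL, and coordinates below are invented:

    ```python
    def section_kml(name, image_url, west, south, east, north):
        """Minimal KML that drapes a cross-section image over its line of section.
        (Real vertical sections need COLLADA models; this flat overlay is a sketch.)"""
        return f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <GroundOverlay>
        <name>{name}</name>
        <Icon><href>{image_url}</href></Icon>
        <LatLonBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonBox>
      </GroundOverlay>
    </kml>"""

    doc = section_kml("Section A-A", "http://example.org/section.png",
                      west=-9.9, south=53.4, east=-9.5, north=53.6)
    ```

    Zipping such a document with its image yields the KMZ file that the hosting service shares.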

  2. Google Earth Visualizations of the Marine Automatic Identification System (AIS): Monitoring Ship Traffic in National Marine Sanctuaries

    NASA Astrophysics Data System (ADS)

    Schwehr, K.; Hatch, L.; Thompson, M.; Wiley, D.

    2007-12-01

The Automatic Identification System (AIS) is a new technology that provides ship position reports with location, time, and identity information, broadcast without human intervention from ships carrying the transponders to any receiver listening. In collaboration with the USCG's Research and Development Center, NOAA's Stellwagen Bank National Marine Sanctuary (SBNMS) has installed 3 AIS receivers around Massachusetts Bay to monitor ship traffic transiting the sanctuary and surrounding waters. The SBNMS and the USCG also worked together to propose shifting the shipping lanes (termed the traffic separation scheme; TSS) that transit the sanctuary slightly to the north, to reduce the probability of ship strikes of whales that frequent the sanctuary. Following approval by the United Nations' International Maritime Organization, AIS provided a means for NOAA to assess changes in the distribution of shipping traffic caused by the formal change in the TSS, effective July 1, 2007. However, there was no easy way to visualize this type of time series data. We have created a software package called noaadata-py to process the AIS ship reports and produce KML files for viewing in Google Earth. Ship tracks can be shown changing over time to allow the viewer to feel the motion of traffic through the sanctuary. The ship tracks can also be gridded to create ship traffic density reports for specified periods of time. The density is displayed as a map draped on the sea surface or as vertical histogram columns. Additional visualizations such as bathymetry images, S57 nautical charts, and USCG Marine Information for Safety and Law Enforcement (MISLE) data can be combined with the ship traffic visualizations to give a more complete picture of the maritime environment. AIS traffic analyses have the potential to give managers throughout NOAA's National Marine Sanctuaries an improved ability to assess the impacts of ship traffic on the marine resources they seek to protect. 
Viewing ship traffic data through Google Earth provides ease and efficiency for people not trained in GIS data processing.
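
    Gridding position reports into density counts can be sketched as follows; this is not the noaadata-py implementation, and the position reports are invented:

    ```python
    from collections import Counter

    def density_grid(positions, lat0, lon0, cell_deg):
        """Count AIS position reports falling in each cell of a regular lat/lon grid."""
        counts = Counter()
        for lat, lon in positions:
            i = int((lat - lat0) / cell_deg)   # row index from the grid origin
            j = int((lon - lon0) / cell_deg)   # column index from the grid origin
            counts[(i, j)] += 1
        return counts

    # Hypothetical position reports near Stellwagen Bank (roughly 42.4N, 70.3W).
    reports = [(42.41, -70.32), (42.42, -70.31), (42.55, -70.20)]
    grid = density_grid(reports, lat0=42.0, lon0=-71.0, cell_deg=0.1)
    ```

    Each cell's count can then be rendered in KML as a colored polygon draped on the sea surface or extruded as a histogram column.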

  3. The Real Time Mission Monitor: A Platform for Real Time Environmental Data Integration and Display during NASA Field Campaigns

    NASA Astrophysics Data System (ADS)

    He, M.; Hardin, D. M.; Goodman, M.; Blakeslee, R.

    2008-05-01

The Real Time Mission Monitor (RTMM) is an interactive visualization application based on Google Earth that provides situational awareness and field asset management during NASA field campaigns. The RTMM can integrate data and imagery from numerous sources including the GOES-12, GOES-10, and TRMM satellites. Simultaneously, it can display data and imagery from surface observations including Nexrad, NPOL and SMART-R radars. In addition, it can display output from models and real-time flight tracks of all aircraft involved in the experiment. In some instances the RTMM can also display measurements from scientific instruments as they are being flown. All data are recorded and archived in an on-line system enabling playback and review of all sorties. This is invaluable in preparing for future deployments and in exercising case studies. The RTMM facilitates pre-flight planning, in-flight monitoring, development of adaptive flight strategies and post-flight data analyses and assessments. Since the RTMM is available via the internet during the actual experiment, project managers, scientists and mission planners can collaborate no matter where they are located, as long as they have a viable internet connection. In addition, the system is open so that the general public can also view the experiment, in progress, with Google Earth. Predecessors of RTMM were originally deployed in 2002 as part of the Altus Cumulus Electrification Study (ACES) to monitor uninhabited aerial vehicles near thunderstorms. In 2005 an interactive Java-based web prototype supported the airborne Lightning Instrument Package (LIP) during the Tropical Cloud Systems and Processes (TCSP) experiment. In 2006 the technology was adapted to the 3D Google Earth virtual globe, and in 2007 its capabilities were extended to support multiple NASA aircraft (ER-2, WB-57, DC-8) during the Tropical Composition, Clouds and Climate Coupling (TC4) experiment and the 2007 Summer Aerosonde field study. 
In April 2008 the RTMM will be flown in the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) experiment to study the atmospheric composition in the Arctic.

  4. Cross-disciplinary Undergraduate Research: A Case Study in Digital Mapping, western Ireland

    NASA Astrophysics Data System (ADS)

    Whitmeyer, S. J.; de Paor, D. G.; Nicoletti, J.; Rivera, M.; Santangelo, B.; Daniels, J.

    2008-12-01

    As digital mapping technology becomes ever more advanced, field geologists spend a greater proportion of time learning digital methods relative to analyzing rocks and structures. To explore potential solutions to the time commitment implicit in learning digital field methods, we paired James Madison University (JMU) geology majors (experienced in traditional field techniques) with Worcester Polytechnic Institute (WPI) engineering students (experienced in computer applications) during a four week summer mapping project in Connemara, western Ireland. The project consisted of approximately equal parts digital field mapping (directed by the geology students), and lab-based map assembly, evaluation and formatting for virtual 3D terrains (directed by the engineering students). Students collected geologic data in the field using ruggedized handheld computers (Trimble GeoExplorer® series) with ArcPAD® software. Lab work initially focused on building geologic maps in ArcGIS® from the digital field data and then progressed to developing Google Earth-based visualizations of field data and maps. Challenges included exporting GIS data, such as locations and attributes, to KML tags for viewing in Google Earth, which we accomplished using a Linux bash script written by one of our engineers - a task outside the comfort zone of the average geology major. We also attempted to expand the scope of Google Earth by using DEMs of present-day geologically-induced landforms as representative models for paleo-geographic reconstructions of the western Ireland field area. As our integrated approach to digital field work progressed, we found that our digital field mapping produced data at a faster rate than could be effectively managed during our allotted time for lab work. This likely reflected the more developed methodology for digital field data collection, as compared with our lab-based attempts to develop new methods for 3D visualization of geologic maps. 
However, this experiment in cross-disciplinary undergraduate research was a big success, with an enthusiastic interchange of expertise between undergraduate geology and engineering students that produced new, cutting-edge methods for visualizing geologic data and maps.
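
    The GIS-attributes-to-KML conversion, done in the project with a Linux bash script, can be sketched in a few lines of Python; the station names, coordinates, and attributes below are invented:

    ```python
    # Hypothetical field stations exported from the GIS as (name, lat, lon, attribute).
    stations = [
        ("ST-01", 53.49, -9.77, "foliated granite"),
        ("ST-02", 53.51, -9.74, "marble, sheared contact"),
    ]

    def placemark(name, lat, lon, description):
        """One KML Placemark; KML expects lon,lat(,alt) coordinate order."""
        return (f"<Placemark><name>{name}</name>"
                f"<description>{description}</description>"
                f"<Point><coordinates>{lon},{lat}</coordinates></Point></Placemark>")

    kml_body = "\n".join(placemark(*s) for s in stations)
    ```

    The lon-before-lat ordering is a classic stumbling block in exactly this kind of GIS-to-KML export.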

  5. IFEQ, app developed based on the J-SHIS system

    NASA Astrophysics Data System (ADS)

    Azuma, H.; Hao, K. X.; Fujiwara, H.

    2015-12-01

Raising awareness of earthquake disaster prevention is an important issue in Japan. To that end, we have developed a smartphone app, IFEQ, based on the APIs provided by J-SHIS, an integrated system for seismic hazard assessment. IFEQ simulates the earthquake disaster situation a user might experience at their current location. The idea for IFEQ came from the question "What should I do IF a big EarthQuake hit now?" The earthquake risk situation is estimated from location information acquired by GPS and from the detailed 250 m mesh information obtained through the J-SHIS APIs, including the geomorphological classification and the probability of JMA intensity 6-lower shaking within 30 years. With a single touch, a user's photo is displayed overlaid with the estimated risk situation. IFEQ helps users bridge the gap between familiar scenes and a potential disaster by stimulating their imagination. Our results show that people have more ideas about how to handle risk situations after using the app. IFEQ's features are summarized below: 1. Visualizing an image photo with the possible risks from a coming earthquake at the present spot. 2. Displaying the 30-year exceedance probability and the 10,000-year maximum seismic intensity for the present location. 3. Offering advice on how to prepare for possible risks. 4. Rating risks on a five-rank scale, particularly for building collapse, liquefaction and landslide. IFEQ can be downloaded freely from http://www.j-shis.bosai.go.jp/app-ifearthquake and the J-SHIS APIs can be obtained from http://www.j-shis.bosai.go.jp/en/category/opencat/api
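
    J-SHIS hazard information is indexed by standard Japanese grid-square mesh codes. A generic sketch of deriving a 250 m mesh code from a GPS position follows; this is an illustration of the grid-square scheme (JIS X 0410), not IFEQ's actual code, and the quadrant-digit convention should be checked against the specification before use:

    ```python
    def mesh_code_250m(lat, lon):
        """Grid-square mesh code down to the 250 m level for a lat/lon in Japan."""
        p, a = divmod(lat * 1.5, 1)    # 1st mesh: 40-minute latitude bands
        u, f = divmod(lon - 100, 1)    # 1st mesh: 1-degree longitude bands
        q, b = divmod(a * 8, 1)        # 2nd mesh: 1/8 subdivision
        v, g = divmod(f * 8, 1)
        r, c = divmod(b * 10, 1)       # 3rd (1 km) mesh: 1/10 subdivision
        w, h = divmod(g * 10, 1)
        code = f"{int(p):02d}{int(u):02d}{int(q)}{int(v)}{int(r)}{int(w)}"
        for _ in range(2):             # 500 m then 250 m quadrant digits
            sc, c = divmod(c * 2, 1)
            sh, h = divmod(h * 2, 1)
            code += str(int(2 * sc + sh + 1))
        return code
    ```

    A mesh code of this kind identifies the 250 m cell whose hazard attributes the app requests from the J-SHIS APIs.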

  6. Open Core Data: Connecting scientific drilling data to scientists and community data resources

    NASA Astrophysics Data System (ADS)

    Fils, D.; Noren, A. J.; Lehnert, K.; Diver, P.

    2016-12-01

    Open Core Data (OCD) is an innovative, efficient, and scalable infrastructure for data generated by scientific drilling and coring to improve discoverability, accessibility, citability, and preservation of data from the oceans and continents. OCD is building on existing community data resources that manage, store, publish, and preserve scientific drilling data, filling a critical void that currently prevents linkages between these and other data systems and tools to realize the full potential of data generated through drilling and coring. We are developing this functionality through Linked Open Data (LOD) and semantic patterns that enable data access through the use of community ontologies such as GeoLink (geolink.org, an EarthCube Building Block), a collection of protocols, formats and vocabularies from a set of participating geoscience repositories. Common shared concepts of classes such as cruise, dataset, person and others allow easier resolution of common references through shared resource IDs. These graphs are then made available via SPARQL as well as incorporated into web pages following schema.org approaches. Additionally the W3C PROV vocabulary is under evaluation for use for documentation of provenance. Further, the application of persistent identifiers for samples (IGSNs); datasets, expeditions, and projects (DOIs); and people (ORCIDs), combined with LOD approaches, provides methods to resolve and incorporate metadata and datasets. Application Program Interfaces (APIs) complement these semantic approaches to the OCD data holdings. APIs are exposed following the Swagger guidelines (swagger.io) and will be evolved into the OpenAPI (openapis.org) approach. Currently APIs are in development for the NSF funded Flyover Country mobile geoscience app (fc.umn.edu), the Neotoma Paleoecology Database (neotomadb.org), Magnetics Information Consortium (MagIC; earthref.org/MagIC), and other community tools and data systems, as well as for internal OCD use.
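
    The Linked Open Data pattern (shared persistent identifiers letting one record resolve another) can be sketched with a toy triple store; the identifiers below are invented, and the real system expresses such graphs in RDF and queries them with SPARQL:

    ```python
    # Toy triples linking a dataset, an expedition, and a person via persistent IDs.
    triples = {
        ("dataset:D1", "hasDOI", "doi:10.0000/example"),
        ("dataset:D1", "generatedBy", "expedition:E1"),
        ("expedition:E1", "hasParticipant", "person:P1"),
        ("person:P1", "hasORCID", "orcid:0000-0000-0000-0000"),
    }

    def query(s=None, p=None, o=None):
        """Match triples against a pattern; None acts like a SPARQL variable."""
        return [t for t in triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]
    ```

    Chaining such pattern matches is how a dataset's DOI can be followed to the expedition that produced it and on to the ORCID of a participant.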

  7. Integrating Authentic Earth Science Data in Online Visualization Tools and Social Media Networking to Promote Earth Science Education

    NASA Astrophysics Data System (ADS)

    Carter, B. L.; Campbell, B.; Chambers, L.; Davis, A.; Riebeek, H.; Ward, K.

    2008-12-01

The Goddard Space Flight Center (GSFC) is one of the largest Earth Science research-based institutions in the nation. Along with the research comes a dedicated group of people who are tasked with developing Earth science research-based education and public outreach materials to reach the broadest possible range of audiences. The GSFC Earth science education community makes use of a wide variety of platforms in order to reach its goals of communicating science. These platforms include social media networking such as Twitter and Facebook, as well as geo-spatial tools such as MY NASA DATA, NASA World Wind, NEO, and Google Earth. Using a wide variety of platforms serves the dual purposes of promoting NASA Earth Science research and making authentic data available to educational communities that might not otherwise be granted access. Making data available to education communities promotes scientific literacy through the investigation of scientific phenomena using the same data that are used by the scientific community. Data from several NASA missions will be used to demonstrate the ways in which Earth science data are made available for the education community.

  8. Complaint go: an online complaint registration system using web services and android

    NASA Astrophysics Data System (ADS)

    Mareeswari, V.; Gopalakrishnan, V.

    2017-11-01

In many nations, cities are maintained and run by local governing bodies, generally called Municipal Corporations (MCs). An MC may install CCTV cameras and other surveillance devices to help ensure the city runs smoothly and efficiently, but it is equally important for an MC to learn of faults and problems occurring within the city. At present this is practically possible only by installing sensors and cameras, or by enabling citizens to report issues directly to the authorities. The second option is generally preferred because it provides proper, substantive information, and government authorities therefore typically allow residents to register grievances through several channels. In this application, citizens can send complaints directly from their smartphones to the responsible officials. Several APIs function as web services that make registering a complaint easier, such as the Google Places API, which detects the user's current location and shows it on a map. A web portal, supported by different web services, is used to process the various complaints.
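
    A sketch of the server-side validation a complaint-registration web service might perform before storing a submission; the field names and rules are illustrative, not the application's actual schema:

    ```python
    REQUIRED = ("description", "category", "lat", "lon")

    def validate_complaint(payload):
        """Basic checks before a complaint is registered.
        (Field names are illustrative, not the app's real schema.)"""
        missing = [k for k in REQUIRED if k not in payload]
        if missing:
            return False, f"missing fields: {', '.join(missing)}"
        if not (-90 <= payload["lat"] <= 90 and -180 <= payload["lon"] <= 180):
            return False, "location out of range"
        return True, "ok"

    # Coordinates as a Places-style location lookup on the phone would supply them.
    ok, msg = validate_complaint(
        {"description": "broken streetlight", "category": "electrical",
         "lat": 12.97, "lon": 80.22})
    ```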

  9. Artificial Intelligence and NASA Data Used to Discover Eighth Planet Circling Distant Star

    NASA Image and Video Library

    2017-12-12

Our solar system is now tied for the most planets known around a single star, with the recent discovery of an eighth planet circling Kepler-90, a Sun-like star 2,545 light-years from Earth. The planet was discovered in data from NASA’s Kepler space telescope. The newly discovered Kepler-90i -- a sizzling hot, rocky planet that orbits its star once every 14.4 days -- was found by researchers from Google and The University of Texas at Austin using machine learning. Machine learning is an approach to artificial intelligence in which computers “learn.” In this case, computers learned to identify planets by finding instances in Kepler data where the telescope recorded signals from planets beyond our solar system, known as exoplanets. Video Credit: NASA Ames Research Center / Google

  10. Building Knowledge Graphs for NASA's Earth Science Enterprise

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Lee, T. J.; Ramachandran, R.; Shi, R.; Bao, Q.; Gatlin, P. N.; Weigel, A. M.; Maskey, M.; Miller, J. J.

    2016-12-01

Inspired by Google Knowledge Graph, we have been building a prototype Knowledge Graph for Earth scientists, connecting information and data in NASA's Earth science enterprise. Our primary goal is to advance the state-of-the-art NASA knowledge extraction capability by going beyond traditional catalog search and linking different distributed information (such as data, publications, services, tools and people). This will enable a more efficient pathway to knowledge discovery. While Google Knowledge Graph provides impressive semantic-search and aggregation capabilities, it is limited to search topics for the general public. We use a similar knowledge graph approach to semantically link information gathered from a wide variety of sources within the NASA Earth Science enterprise. Our prototype serves as a proof of concept on the viability of building an operational "knowledge base" system for NASA Earth science. Information is pulled from structured sources (such as the NASA CMR catalog, GCMD, and Climate and Forecast Conventions) and unstructured sources (such as research papers). Leveraging modern techniques of machine learning, information retrieval, and deep learning, we provide an integrated data mining and information discovery environment to help Earth scientists use the best data, tools, methodologies, and models available to answer a hypothesis. Our knowledge graph would be able to answer questions like: Which articles discuss topics investigating similar hypotheses? How have these methods been tested for accuracy? Which approaches have been highly cited within the scientific community? What variables were used for this method and what datasets were used to represent them? What processing was necessary to use this data? These questions then lead researchers and citizen scientists to investigate the sources where data can be found, available user guides, information on how the data was acquired, and available tools and models to use with this data. 
As a proof of concept, we focus on a well-defined domain - Hurricane Science - linking research articles and their findings, data, people and tools/services. Modern information retrieval, natural language processing, machine learning, and deep learning techniques are applied to build the knowledge network.
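
    The kind of question-answering such a graph supports can be sketched with a toy edge list and a reverse index; the nodes and predicates below are invented stand-ins for the prototype's richer schema:

    ```python
    from collections import defaultdict

    # Invented edges linking articles to the datasets and methods they use.
    edges = [
        ("article:A1", "usesDataset", "dataset:TRMM_3B42"),
        ("article:A2", "usesDataset", "dataset:TRMM_3B42"),
        ("article:A2", "usesMethod", "method:wavelet_analysis"),
        ("article:A1", "citedBy", "article:A2"),
    ]

    index = defaultdict(list)
    for subj, pred, obj in edges:
        index[(pred, obj)].append(subj)   # reverse index: (predicate, object) -> subjects

    def who_uses(dataset):
        """Answer 'which articles used this dataset?' via the reverse index."""
        return sorted(index[("usesDataset", dataset)])
    ```

    Questions like "which approaches have been highly cited?" reduce to the same pattern with different predicates.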

  11. EarthScope Plate Boundary Observatory Data in the College Classroom (Invited)

    NASA Astrophysics Data System (ADS)

    Eriksson, S. C.; Olds, S. E.

    2009-12-01

The Plate Boundary Observatory (PBO) is the geodetic component of the EarthScope project, designed to study the 3-D strain field across the active boundary zone between the Pacific and North American tectonic plates in the western United States. All PBO data are freely available to scientific and educational communities and have been incorporated into a variety of activities for college and university classrooms. UNAVCO Education and Outreach program staff have worked closely with faculty users, scientific researchers, and facility staff to create materials that are scientifically and technically accurate as well as useful to the classroom user. Availability of processed GPS data is not new to the geoscience community. However, PBO data staff have worked with education staff to deliver data that are readily accessible to educators. The UNAVCO Data for Educators webpage, incorporating an embedded Google Map with PBO GPS locations and providing current GPS time series plots and downloadable data, extends and updates the datasets available to our community. Google Earth allows the visualization of GPS data alongside other types of datasets, e.g. LiDAR, while maintaining the self-contained and easy-to-use interface of UNAVCO’s Jules Verne Voyager map tools, which have multiple sets of geological and geophysical data. Curricular materials provide scaffolds for using EarthScope data in a variety of forms for different learning goals. Simple visualization of earthquake epicenters and locations of volcanoes can be used with velocity vectors to make simple deductions of plate boundary behaviors. Readily available time series plots provide opportunities for additional science skills, and there are web and paper-based support materials for downloading data, manipulating tables, and using plotting programs for processed GPS data. Scientists have provided contextual materials to explore the importance of these data in interpreting the structure and dynamics of the Earth. 
These data and their scientific context are now incorporated into the Active Earth Display developed by IRIS. Formal and informal evaluations during the past five years have provided useful data for revision and on-line implementation.
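
    Estimating a station velocity from a processed GPS time series is exactly the kind of classroom exercise these data enable; a minimal least-squares sketch with invented daily positions:

    ```python
    # Hypothetical daily east-component positions (mm) for one GPS station.
    days = list(range(10))
    east_mm = [0.1, 2.2, 3.9, 6.1, 8.0, 10.2, 11.8, 14.1, 16.0, 18.1]

    def velocity_mm_per_day(t, y):
        """Slope of the ordinary least-squares line through (t, y)."""
        n = len(t)
        tbar, ybar = sum(t) / n, sum(y) / n
        num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
        den = sum((ti - tbar) ** 2 for ti in t)
        return num / den

    v = velocity_mm_per_day(days, east_mm)
    ```

    Students can then compare slopes across stations to deduce relative plate motion, the same reasoning the velocity-vector activities build on.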

  12. Enhancements and Evolution of the Real Time Mission Monitor

    NASA Astrophysics Data System (ADS)

    Goodman, M.; Blakeslee, R.; Hardin, D.; Hall, J.; He, Y.; Regner, K.

    2008-12-01

The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual earth application. RTMM has proven extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, mission scientists, instrument scientists and program managers alike appreciate the contributions that RTMM makes to their flight projects. RTMM has received numerous plaudits from a wide variety of scientists who used RTMM during recent field campaigns including the 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), and 2008 Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) missions, the 2007-2008 NOAA-NASA Aerosonde Hurricane flights and the 2008 Soil Moisture Active-Passive Validation Experiment (SMAP-VEX). Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. 
Individual flight legs are automatically calculated for altitude, latitude, longitude, flight leg distance, cumulative distance, flight leg time, cumulative time, and satellite overpass intersections. The resultant flight plan is then generated in KML and quickly posted to the Google Earth-based RTMM for planning discussions, as well as comparisons to real time flight tracks in progress. A description of the system architecture, components, and applications along with reviews and animations of RTMM during the field campaigns, plus planned enhancements and future opportunities will be presented.
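The KML flight plans the abstract describes can be sketched in a few lines. The following is an illustrative example, not RTMM's actual implementation: it serializes a list of waypoints into a KML LineString that Google Earth renders as a planned track (KML coordinates are ordered longitude,latitude,altitude). The waypoint values are invented.

```python
def flight_plan_kml(name, waypoints):
    """waypoints: list of (lat, lon, alt_m) tuples -> KML document string."""
    # KML expects lon,lat,alt triples separated by whitespace.
    coords = " ".join(f"{lon},{lat},{alt}" for lat, lon, alt in waypoints)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
        f"<Placemark><name>{name}</name>"
        "<LineString><altitudeMode>absolute</altitudeMode>"
        f"<coordinates>{coords}</coordinates></LineString>"
        "</Placemark></Document></kml>"
    )

# Hypothetical two-waypoint leg at 12 km altitude.
kml = flight_plan_kml("Leg 1", [(28.5, -80.6, 12000.0), (25.8, -71.0, 12000.0)])
```

A file like this, posted to a shared server, is all Google Earth needs to overlay the plan on real-time flight tracks.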

  13. Global OpenSearch

    NASA Astrophysics Data System (ADS)

    Newman, D. J.; Mitchell, A. E.

    2015-12-01

At AGU 2014, NASA EOSDIS demonstrated a case study of an OpenSearch framework for Earth science data discovery. That framework leverages the IDN and CWIC OpenSearch API implementations to provide seamless discovery of data through the 'two-step' discovery process outlined in the Federation of Earth Science Information Partners (ESIP) OpenSearch Best Practices. But how would an Earth scientist leverage this framework, and what are the benefits? Using a client that understands the OpenSearch specification and, for further clarity, the various best practices and extensions, a scientist can discover a plethora of data not normally accessible either by traditional methods (NASA Earthdata Search, Reverb, etc.) or by direct methods (going to the source of the data). We will demonstrate, via the CWICSmart web client, how an Earth scientist can access regional data on regional phenomena in a uniform and aggregated manner. We will demonstrate how an Earth scientist can 'globalize' their discovery. You want to find local data on 'sea surface temperature of the Indian Ocean'? We can help you with that. 'European meteorological data'? Yes. 'Brazilian rainforest satellite imagery'? That too. CWIC allows you to get Earth science data in a uniform fashion from a large number of disparate, worldwide agencies. This is what we mean by Global OpenSearch.
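The mechanics behind the 'two-step' pattern are simple: step one discovers a collection identifier, step two fills that identifier (plus spatial and temporal constraints) into the collection's OpenSearch URL template. A minimal sketch of the template substitution, with an invented endpoint and template rather than the real CWIC/IDN URLs:

```python
import re
from urllib.parse import quote

# Illustrative OSDD-style URL template; {ns:name?} marks an optional parameter.
TEMPLATE = ("https://cwic.example.org/opensearch/granules.atom?"
            "datasetId={geo:uid}&bbox={geo:box}&start={time:start}&"
            "end={time:end}&count={os:count?}")

def fill_template(template, params):
    """Substitute {ns:name} placeholders; optional ('?') params default to empty."""
    def repl(m):
        key, optional = m.group(1), m.group(2) == "?"
        if key in params:
            return quote(str(params[key]), safe=",:-")
        if optional:
            return ""
        raise KeyError(f"missing required OpenSearch parameter: {key}")
    return re.sub(r"\{([\w:]+)(\??)\}", repl, template)

# Step 2: granule search for a collection found in step 1 (values invented).
url = fill_template(TEMPLATE, {
    "geo:uid": "C1234-TESTPROV",
    "geo:box": "60,5,100,25",       # rough Indian Ocean bounding box
    "time:start": "2015-01-01",
    "time:end": "2015-12-31",
})
```

Any client that can perform this substitution against a server's OpenSearch Description Document can query every participating agency the same way, which is what makes the aggregation uniform.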

  14. A Data Services Upgrade for Advanced Composition Explorer (ACE) Data

    NASA Astrophysics Data System (ADS)

    Davis, A. J.; Hamell, G.

    2008-12-01

    Since early in 1998, NASA's Advanced Composition Explorer (ACE) spacecraft has provided continuous measurements of solar wind, interplanetary magnetic field, and energetic particle activity from L1, located approximately 0.01 AU sunward of Earth. The spacecraft has enough fuel to stay in orbit about L1 until ~2024. The ACE Science Center (ASC) provides access to ACE data, and performs level 1 and browse data processing for the science instruments. Thanks to a NASA Data Services Upgrade grant, we have recently retooled our legacy web interface to ACE data, enhancing data subsetting capabilities and improving online plotting options. We have also integrated a new application programming interface (API) and we are working to ensure that it will be compatible with emerging Virtual Observatory (VO) data services standards. The new API makes extensive use of metadata created using the Space Physics Archive Search and Extract (SPASE) data model. We describe these recent improvements to the ACE Science Center data services, and our plans for integrating these services into the VO system.

  15. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

Abstract Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and applying computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions, drawing mainly on the experience we gained while creating the Atlas of Cancer Signalling Network. In a step-by-step procedure, we describe the map construction process and suggest solutions for managing map complexity by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology that uses the Google Maps API to explore comprehensive molecular maps in the same way as geographical maps, and explain the advantages of semantic zooming principles for map navigation. We also outline how to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383
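The semantic-zooming principle mentioned above can be illustrated in a few lines: instead of magnifying one image, each zoom level selects a different level of detail to draw. The thresholds and layer names below are invented for illustration, not NaviCell's actual scheme:

```python
def visible_layers(zoom):
    """Map a Google-Maps-style integer zoom level to the entities to render."""
    layers = ["pathway-modules"]                # coarsest view: module outlines
    if zoom >= 3:
        layers.append("reactions")              # mid zoom: reaction edges appear
    if zoom >= 5:
        layers.append("proteins-and-complexes")
    if zoom >= 7:
        layers.append("post-translational-modifications")  # finest detail
    return layers
```

A tile server keyed on such a function serves pre-rendered tiles per zoom level, which is exactly how the Google Maps API expects map content to be organized.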

  16. Map based multimedia tool on Pacific theatre in World War II

    NASA Astrophysics Data System (ADS)

    Pakala Venkata, Devi Prasada Reddy

Maps have been used for depicting data of all kinds in the educational community for many years. One rapidly developing method of teaching is through interactive and dynamic maps. The emphasis of this thesis is the development of an intuitive map-based multimedia tool that provides a timeline of battles and events in the Pacific theatre of World War II. The tool contains summaries of major battles and commanders and has multimedia content embedded in it. The primary advantage of this map tool is that one can quickly learn about all the battles and campaigns of the Pacific theatre by accessing, in an interactive way, a timeline of battles in each region, individual battles in each region, or a summary of each battle. The tool can be accessed via any standard web browser and motivates users to learn more about the battles of the Pacific theatre. It was made responsive using the Google Maps API, JavaScript, HTML5 and CSS.

  17. Destination Information System for Bandung City Using Location-Based Services (LBS) on Android

    NASA Astrophysics Data System (ADS)

    Kurniawan, B.; Pranoto, H.

    2018-02-01

Bandung is a city in West Java, Indonesia, with many interesting locations to visit. For the most popular destinations, we can easily search on Google and find blogs discussing related content; the problem is that we cannot guarantee the destination is actually frequented by visitors. In this research, we developed an application to help users choose destinations that are frequented by visitors. The use of information technology in the form of pictures, maps, and text in an Android application makes it possible for users to obtain information about a destination together with its visitor numbers over a period of time. If a destination has a visit history, a proper destination can be suggested on the basis of fresh information. The application runs well on Android Lollipop (API Level 21) or above with a minimum of 2 GB of RAM, since it compares two coordinates for every data point. The app makes it possible to access information about a location together with its visitor history and can help users choose proper destinations.
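The per-record coordinate comparison the abstract alludes to is typically a radius test: does a recorded visitor fix lie within some distance of the destination? A minimal sketch using the haversine formula (the function and field names are ours, not the app's):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * asin(sqrt(a))  # mean Earth radius in metres

def visited(dest, fix, radius_m=100.0):
    """True if a GPS fix (lat, lon) counts as a visit to dest (lat, lon)."""
    return haversine_m(dest[0], dest[1], fix[0], fix[1]) <= radius_m
```

Running this test for every (destination, fix) pair explains the memory and CPU demands the abstract mentions; a spatial index would reduce the pair count.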

  18. Towards the Ubiquitous Deployment of DNSSEC

    DTIC Science & Technology

    2016-01-01

    with other deployment partners around the world, there is now a significant and growing number of TLDs that have been signed, and a number of...as Google Earth, the Blackberry 10 operating system, and the entire set of K Desktop Environment (KDE) windowing system applications are based on...differentiate between transient errors and legitimate DNS spoofing attacks is likely going to be very important as deployment grows. The importance of

  19. Reaching Forward in the War against the Islamic State

    DTIC Science & Technology

    2016-12-07

    every week with U.S.- and Coalition-advised ISOF troops taking the lead in combat operations using cellular communications systems that link them...tions—Offline Maps, Google Earth, and Viber, to name a few—which allowed them to bring tablets and phones on their operations to help communicate...provided an initial Remote Advise and Assist capability that enabled the special forces advisors to track, communicate, and share limited data with

  20. Baseline coastal oblique aerial photographs collected from Navarre Beach, Florida, to Breton Island, Louisiana, September 18–19, 2015

    USGS Publications Warehouse

    Morgan, Karen L. M.

    2016-08-01

    In addition to the photographs, a Google Earth Keyhole Markup Language (KML) file is provided and can be used to view the images by clicking on the marker and then the thumbnail or the link below the thumbnail. The KML file was created using the photographic navigation files. This KML file can be found in the kml folder.
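A KML file of the kind described, one clickable marker per photo with a thumbnail and link in its balloon, can be generated directly from the navigation records. This is an assumed sketch, not the USGS production code; the file names and URLs are placeholders:

```python
from xml.sax.saxutils import escape

def photo_placemark(name, lat, lon, thumb_url, full_url):
    """One KML Placemark whose balloon shows a thumbnail linking to the photo."""
    # HTML in a <description> must be entity-escaped (or wrapped in CDATA).
    desc = escape(f'<img src="{thumb_url}"/><br/><a href="{full_url}">full image</a>')
    return (f"<Placemark><name>{escape(name)}</name>"
            f"<description>{desc}</description>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>")

def photo_kml(placemarks):
    """Wrap a list of placemark strings in a complete KML document."""
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            f"<Document>{''.join(placemarks)}</Document></kml>")

pm = photo_placemark("IMG_0001.jpg", 30.4, -86.9,
                     "https://example.org/thumb/IMG_0001.jpg",
                     "https://example.org/full/IMG_0001.jpg")
doc = photo_kml([pm])
```

Google Earth renders the escaped HTML in the balloon, which is how clicking a marker reveals the thumbnail and the link to the full-resolution image.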

  1. Interpretation of earthquake-induced landslides triggered by the 12 May 2008, M7.9 Wenchuan earthquake in the Beichuan area, Sichuan Province, China using satellite imagery and Google Earth

    USGS Publications Warehouse

    Sato, H.P.; Harp, E.L.

    2009-01-01

The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of publicly available 3-day-post- and pre-earthquake imagery provided in GE from FORMOSAT-2 (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 imagery in GE; as a result, 257 large landslides were mapped, with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km2 represents a minimum bound on density given the resolution of the available imagery; higher-resolution data would have identified more landslides. This is a preliminary study, and further study is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from satellite imagery of high resolution, it was found that GE is an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.

  2. An Interactive Visual Analytics Framework for Multi-Field Data in a Geo-Spatial Context

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyuan; Tong, Xiaonan; McDonnell, Kevin T.

    2013-04-01

    Climate research produces a wealth of multivariate data. These data often have a geospatial reference, and so it is of interest to show them within their geospatial context. One can consider this configuration as a multi-field visualization problem, where the geospace provides the expanse of the field. However, there is a limit on the amount of multivariate information that can fit within a certain spatial location, and the use of linked multivariate information displays has previously been devised to bridge this gap. In this paper we focus on the interactions in the geographical display, present an implementation that uses Google Earth, and demonstrate it within a tightly linked parallel coordinates display. Several other visual representations, such as pie and bar charts, are integrated into the Google Earth display and can be interactively manipulated. Further, we also demonstrate new brushing and visualization techniques for parallel coordinates, such as fixed-window brushing and correlation-enhanced display. We conceived our system with a team of climate researchers, who have already made a few important discoveries using it. This demonstrates our system's great potential to enable scientific discoveries, possibly also in other domains where data have a geospatial reference.
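The fixed-window brushing technique named above can be sketched simply: keep only the records whose value on a brushed axis falls inside a window of fixed width centred on the brush position. This is our illustrative reading of the term, with invented field names, not the paper's code:

```python
def fixed_window_brush(records, axis, centre, width):
    """records: list of dicts; return those whose `axis` value lies in the
    fixed-width window [centre - width/2, centre + width/2]."""
    lo, hi = centre - width / 2.0, centre + width / 2.0
    return [r for r in records if lo <= r[axis] <= hi]

# Hypothetical multivariate climate records.
data = [{"temp": 12.0, "precip": 3.1},
        {"temp": 25.5, "precip": 0.0},
        {"temp": 19.0, "precip": 1.2}]
selected = fixed_window_brush(data, "temp", centre=20.0, width=10.0)
```

Dragging the window's centre along an axis while its width stays constant gives the analyst a consistent slice of the data to highlight in the linked Google Earth view.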

  3. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

    Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources, using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various open-source data-sharing technologies, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The capability of visualizing various measurements in GE dramatically extends the awareness and visibility of scientific results. Using the geographic information embedded in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.
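A platform like this pulls imagery into GE through standard OGC requests. As a hedged illustration of the plumbing (the server URL and layer name are placeholders, not the paper's endpoints), here is a WMS 1.1.1 GetMap request builder:

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512, srs="EPSG:4326"):
    """Build a WMS 1.1.1 GetMap URL; bbox is (minx, miny, maxx, maxy)."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "SRS": srs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height,
        "FORMAT": "image/png", "TRANSPARENT": "TRUE",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://wms.example.org/wms", "air_temperature",
                 (-180, -90, 180, 90))
```

Wrapping such a URL in a KML GroundOverlay (or a NetworkLink that regenerates it per view) is the usual way a WMS layer appears inside Google Earth.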

  4. Epidemiologic study of residential proximity to transmission lines and childhood cancer in California: description of design, epidemiologic methods and study population

    PubMed Central

    Kheifets, Leeka; Crespi, Catherine M; Hooper, Chris; Oksuzyan, Sona; Cockburn, Myles; Ly, Thomas; Mezei, Gabor

    2015-01-01

    We conducted a large epidemiologic case-control study in California to examine the association between childhood cancer risk and distance from the home address at birth to the nearest high-voltage overhead transmission line as a replication of the study of Draper et al. in the United Kingdom. We present a detailed description of the study design, methods of case ascertainment, control selection, exposure assessment and data analysis plan. A total of 5788 childhood leukemia cases and 3308 childhood central nervous system cancer cases (included for comparison) and matched controls were available for analysis. Birth and diagnosis addresses of cases and birth addresses of controls were geocoded. Distance from the home to nearby overhead transmission lines was ascertained on the basis of the electric power companies’ geographic information system (GIS) databases, additional Google Earth aerial evaluation and site visits to selected residences. We evaluated distances to power lines up to 2000 m and included consideration of lower voltages (60–69 kV). Distance measures based on GIS and Google Earth evaluation showed close agreement (Pearson correlation >0.99). Our three-tiered approach to exposure assessment allowed us to achieve high specificity, which is crucial for studies of rare diseases with low exposure prevalence. PMID:24045429
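The core exposure metric, distance from a geocoded residence to the nearest overhead line, reduces to point-to-segment distance once both are in a projected coordinate system. A hypothetical sketch (planar metres, which is reasonable over distances of up to 2000 m; the function is ours, not the study's GIS code):

```python
def point_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b; all inputs are
    (x, y) tuples in projected metres."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:                       # degenerate segment: a == b
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Project p onto the line and clamp to the segment's extent.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * dx, ay + t * dy       # closest point on the segment
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
```

Taking the minimum of this quantity over all nearby line segments gives the residence-to-line distance used to assign exposure categories.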

  5. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth(TM) imagery in a population-based mortality survey in Iraq.

    PubMed

    Galway, LP; Bell, Nathaniel; Sae, Al Shatari; Hagopian, Amy; Burnham, Gilbert; Flaxman, Abraham; Weiss, William M; Rajaratnam, Julie; Takaro, Tim K

    2012-04-27

    Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing a challenge of estimating mortality using retrospective population-based surveys. We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage and Google Earth TM imagery and sampling grids to select households in the second sampling stage. The sampling method is implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Sampling is a challenge in retrospective population-based mortality studies and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context specific challenges of the study setting. This sampling strategy, or variations on it, are adaptable and should be considered and tested in other conflict settings.
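The first sampling stage described, selecting grid cells from gridded population data, is commonly done with probability-proportional-to-size (PPS) systematic sampling over the cumulative population. A sketch of that general technique (the cell data and function are illustrative, not the study's actual procedure):

```python
import random

def pps_systematic(cells, n, rng=random.random):
    """cells: list of (cell_id, population); return n cell ids drawn by
    PPS systematic sampling along the cumulative population total."""
    total = sum(pop for _, pop in cells)
    step = total / n
    start = rng() * step                      # random start in [0, step)
    targets = [start + i * step for i in range(n)]
    chosen, cum, idx = [], 0.0, 0
    for cell_id, pop in cells:
        cum += pop
        while idx < n and targets[idx] < cum: # target falls in this cell
            chosen.append(cell_id)
            idx += 1
    return chosen

# Deterministic example: fixed rng for reproducibility.
picked = pps_systematic([("a", 100), ("b", 300), ("c", 600)], 2,
                        rng=lambda: 0.5)
```

The second stage, overlaying a sampling grid on Google Earth imagery of each selected cell to pick households, follows the same proportional logic at finer resolution.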

  6. A two-stage cluster sampling method using gridded population data, a GIS, and Google EarthTM imagery in a population-based mortality survey in Iraq

    PubMed Central

    2012-01-01

    Background Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing a challenge of estimating mortality using retrospective population-based surveys. Results We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage and Google Earth TM imagery and sampling grids to select households in the second sampling stage. The sampling method is implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion Sampling is a challenge in retrospective population-based mortality studies and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context specific challenges of the study setting. This sampling strategy, or variations on it, are adaptable and should be considered and tested in other conflict settings. PMID:22540266

  7. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization

    PubMed Central

    Bernal-Rusiel, Jorge L.; Rannou, Nicolas; Gollub, Randy L.; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E.; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution. PMID:28507515
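The synchronization pattern described, a small JSON object representing renderer state shared among clients, can be sketched independently of the Google Drive Realtime machinery. The field names below are invented for illustration and are not the actual XTK/MedView schema:

```python
import json

def diff_state(old, new):
    """Return only the fields of `new` that differ from `old` (the delta
    each client publishes)."""
    return {k: v for k, v in new.items() if old.get(k) != v}

def apply_update(state, update_json):
    """Merge a JSON-encoded partial update into a local renderer state."""
    state.update(json.loads(update_json))
    return state

# Two clients start from the same state; one changes the slice index.
local = {"volume": "t1.nii", "slice_z": 42, "camera": [0, 0, 120]}
remote = dict(local)
update = diff_state(remote, {**local, "slice_z": 57})
remote = apply_update(remote, json.dumps(update))
```

Shipping only the delta keeps each synchronization message tiny, which is what makes real-time collaboration over an ordinary web connection practical.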

  8. Profile-IQ: Web-based data query system for local health department infrastructure and activities.

    PubMed

    Shah, Gulzar H; Leep, Carolyn J; Alexander, Dayna

    2014-01-01

    To demonstrate the use of National Association of County & City Health Officials' Profile-IQ, a Web-based data query system, and how policy makers, researchers, the general public, and public health professionals can use the system to generate descriptive statistics on local health departments. This article is a descriptive account of an important health informatics tool based on information from the project charter for Profile-IQ and the authors' experience and knowledge in design and use of this query system. Profile-IQ is a Web-based data query system that is based on open-source software: MySQL 5.5, Google Web Toolkit 2.2.0, Apache Commons Math library, Google Chart API, and Tomcat 6.0 Web server deployed on an Amazon EC2 server. It supports dynamic queries of National Profile of Local Health Departments data on local health department finances, workforce, and activities. Profile-IQ's customizable queries provide a variety of statistics not available in published reports and support the growing information needs of users who do not wish to work directly with data files for lack of staff skills or time, or to avoid a data use agreement. Profile-IQ also meets the growing demand of public health practitioners and policy makers for data to support quality improvement, community health assessment, and other processes associated with voluntary public health accreditation. It represents a step forward in the recent health informatics movement of data liberation and use of open source information technology solutions to promote public health.

  9. Interfaces to PeptideAtlas: a case study of standard data access systems

    PubMed Central

    Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John

    2012-01-01

    Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP), and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959

  10. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    NASA Technical Reports Server (NTRS)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. 
This provides a statistical counterpart to the existing S'COOL data visualization tool, which is used for individual ground-to-satellite correspondences. The Google(tm) Earth file contains a set of placemarks and ground overlays to show participating students the area around their school that the satellite is measuring. This approach will be automated and made interactive by the S'COOL database expert and will also be used to help refine the latitude/longitude location of the participating schools. Once complete, these new data analysis tools will be posted on the S'COOL website for use by the project participants in schools around the US and the world.

  11. Use of Openly Available Satellite Images for Remote Sensing Education

    NASA Astrophysics Data System (ADS)

    Wang, C.-K.

    2011-09-01

    With the advent of Google Earth, Google Maps, and Microsoft Bing Maps, high-resolution satellite imagery is becoming more easily accessible than ever. It may well be the case that college students already have a wealth of experience with high-resolution satellite imagery from using these software and web services prior to any formal remote sensing education. It is obvious that remote sensing education should be adjusted to the fact that the audience is already a consumer of remote sensing products (through the use of the above-mentioned services). This paper reports the use of openly available satellite imagery as a term project in an introductory-level remote sensing course in the Department of Geomatics of National Cheng Kung University. The experience gained in the fall terms of 2009 and 2010 shows that this term project effectively aroused the students' enthusiasm for remote sensing.

  12. Doing One Thing Well: Leveraging Microservices for NASA Earth Science Discovery and Access Across Heterogenous Data Sources

    NASA Astrophysics Data System (ADS)

    Baynes, K.; Gilman, J.; Pilone, D.; Mitchell, A. E.

    2015-12-01

    The NASA EOSDIS (Earth Observing System Data and Information System) Common Metadata Repository (CMR) is a continuously evolving metadata system that merges all existing capabilities and metadata from the EOS ClearingHouse (ECHO) and Global Change Master Directory (GCMD) systems. This flagship catalog has been developed against several key requirements: fast search and ingest performance; the ability to integrate heterogeneous external inputs and outputs; high availability and resiliency; scalability; and evolvability and expandability. This talk will focus on the advantages and potential challenges of tackling these requirements using a microservices architecture, which decomposes system functionality into smaller, loosely coupled, individually scalable elements that communicate via well-defined APIs. In addition, time will be spent examining specific elements of the CMR architecture and identifying opportunities for future integrations.

  13. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  14. Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration

    NASA Astrophysics Data System (ADS)

    Schumacher, J. A.; Yetman, G. G.

    2007-12-01

    The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10- year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.

  15. Digital Earth reloaded - Beyond the next generation

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Woodgate, P.; Annoni, A.; Schade, S.

    2014-02-01

    Digital replicas (or 'mirror worlds') of complex entities and systems are now routine in many fields such as aerospace engineering, archaeology, medicine, and even fashion design. The Digital Earth (DE) concept as a digital replica of the entire planet appears in Al Gore's 1992 book Earth in the Balance and was popularized in his speech at the California Science Center in January 1998. It played a pivotal role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many elements of this vision. Almost 15 years after Al Gore's speech, the concept of DE needs to be re-evaluated in light of the many scientific and technical developments in the fields of information technology, data infrastructures, citizens' participation, and earth observation that have taken place since. This paper looks beyond the next generation, predominantly based on developments in fields outside the spatial sciences, where concepts, software, and hardware with strong relationships to DE are being developed without referring to this term. It also presents a number of guiding criteria for future DE developments.

  16. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images

    PubMed Central

    2014-01-01

    Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation at multiple sites, both public university hospitals and private entities. It is part of the FlexMIm R&D project, which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7 GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using the VIPS and OpenSlide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, and 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tools for WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire complete slides that fall below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions.
Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow. PMID:25565494
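
    The abstract does not reproduce the blur metric itself, but the flavor of tile-level, no-reference sharpness scoring can be sketched with a common stand-in, the variance of a Laplacian response (an illustrative proxy, not the FlexMIm method):

```python
import numpy as np

def laplacian_variance(gray):
    """No-reference sharpness score: variance of a 3x3 Laplacian response.
    Higher values indicate sharper tiles; a threshold then separates
    in-focus from blurred regions.  gray: 2-D float array in [0, 1]."""
    lap = (-4 * gray
           + np.roll(gray, 1, 0) + np.roll(gray, -1, 0)
           + np.roll(gray, 1, 1) + np.roll(gray, -1, 1))
    return float(lap[1:-1, 1:-1].var())  # ignore wrap-around borders

# A sharp checkerboard scores higher than a 2x2 mean-filtered copy of it.
tile = np.indices((64, 64)).sum(0) % 2.0
blurred = (tile + np.roll(tile, 1, 0) + np.roll(tile, 1, 1)
           + np.roll(np.roll(tile, 1, 0), 1, 1)) / 4
```

Scores like these are cheap enough to compute per tile during scanning, which is what makes real-time reacquisition decisions feasible.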

  17. Baseline coastal oblique aerial photographs collected from Key Largo, Florida, to the Florida/Georgia border, September 5-6, 2014

    USGS Publications Warehouse

    Morgan, Karen L. M.

    2015-09-14

    In addition to the photographs, a Google Earth Keyhole Markup Language (KML) file is provided and can be used to view the images by clicking on the marker and then clicking on either the thumbnail or the link above the thumbnail. The KML files were created using the photographic navigation files. These KML files can be found in the kml folder.
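
    A minimal example of what such a per-photo KML entry looks like, generated programmatically (the coordinates and file names here are made up, not from the actual survey):

```python
def photo_placemark(name, lon, lat, thumb_url, link_url):
    """Return a KML Placemark whose balloon shows a clickable thumbnail.
    Coordinates are in lon,lat[,alt] order, as KML requires."""
    return f"""<Placemark>
  <name>{name}</name>
  <description><![CDATA[<a href="{link_url}"><img src="{thumb_url}"/></a>]]></description>
  <Point><coordinates>{lon},{lat},0</coordinates></Point>
</Placemark>"""

# Hypothetical photo location and file names, for illustration only.
kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
       + photo_placemark("IMG_0001", -80.45, 25.12,
                         "IMG_0001_thumb.jpg", "IMG_0001.jpg")
       + "\n</Document></kml>")
```

Opening such a file in Google Earth places one clickable marker per photograph along the flight line.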

  18. Isosurface Display of 3-D Scalar Fields from a Meteorological Model on Google Earth

    DTIC Science & Technology

    2013-07-01

    facets to four, we have chosen to adopt and implement a revised method discussed and made available by Bourke (1994), which can accommodate up to...five facets for a given grid cube. While the published code from Bourke (1994) is in the public domain, it was originally implemented in the C...and atmospheric temperatures. References: Bourke, P. Polygonising a Scalar Field. http://paulbourke.net/geometry/polygonise

  19. Effects of Spatial Ability, Gender Differences, and Pictorial Training on Children Using 2-D and 3-D Environments to Recall Landmark Locations from Memory

    ERIC Educational Resources Information Center

    Kopcha, Theodore J.; Otumfuor, Beryl A.; Wang, Lu

    2015-01-01

    This study examines the effects of spatial ability, gender differences, and pictorial training on fourth grade students' ability to recall landmark locations from memory. Ninety-six students used Google Earth over a 3-week period to locate landmarks (3-D) and mark their location on a 2-D topographical map. Analysis of covariance on posttest scores…

  20. Too Cool for School? No Way! Using the TPACK Framework: You Can Have Your Hot Tools and Teach with Them, Too

    ERIC Educational Resources Information Center

    Mishra, Punya; Koehler, Matthew

    2009-01-01

    This is the age of cool tools. Facebook, iPhone, Flickr, blogs, cloud computing, Smart Boards, YouTube, Google Earth, and GPS are just a few examples of new technologies that bombard people from all directions. As individuals people see a new technology and can appreciate its coolness, but as educators they wonder how these tools can be used for…

  1. Automated Detection of Thermo-Erosion in High Latitude Ecosystems

    NASA Astrophysics Data System (ADS)

    Lara, M. J.; Chipman, M. L.; Hu, F.

    2017-12-01

    Detecting permafrost disturbance is of critical importance as the severity of climate change and the associated increase in wildfire frequency and magnitude impact regional to global carbon dynamics. However, it has not been possible to evaluate spatiotemporal patterns of permafrost degradation over large regions of the Arctic, due to the limited spatial and temporal coverage of high-resolution optical, radar, lidar, or hyperspectral remote sensing products. Here we present the first automated multi-temporal analysis for detecting disturbance in response to permafrost thaw, using meso-scale, high-frequency remote sensing products (i.e., the entire Landsat image archive). This approach was developed, tested, and applied in the Noatak National Preserve (26,500 km2) in northwestern Alaska. We identified thermo-erosion (TE) by capturing the indirect spectral signal associated with episodic sediment plumes in adjacent waterbodies following TE disturbance. We isolated this turbidity signal within lakes during summer (mid-summer & late-summer) and annual time-period image composites (1986-2016), using the cloud-based geospatial parallel processing platform, the Google Earth Engine™ API. We validated the TE detection algorithm using seven consecutive years of sub-meter high-resolution imagery (2009-2015) covering 798 (~33%) of the 2456 total lakes in the Noatak lowlands. Our approach had "good agreement" with sediment pulses and landscape deformation in response to permafrost thaw (overall accuracy and kappa coefficient of 85% and 0.61). We identified active TE impacting 10.4% of all lakes, although activity was inter-annually variable, with the highest and lowest TE years represented by 1986 (~41.1%) and 2002 (~0.7%), respectively. We estimate that thaw slumps, lake erosion, lake drainage, and gully formation account for 23.3%, 61.8%, 12.5%, and 1.3% of all active TE across the Noatak National Preserve. Preliminary analysis suggests TE may be subject to a hysteresis effect following extreme climatic conditions or wildfire. This work demonstrates the utility of meso-scale, high-frequency remote sensing products for advancing high-latitude permafrost research.
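
    Reduced to its essence, the detection step thresholds a turbidity proxy within known lake extents of a seasonal composite. A toy NumPy sketch of that core idea (the band choice and the 0.08 threshold are illustrative assumptions, not the study's calibrated algorithm, which ran on Google Earth Engine):

```python
import numpy as np

def flag_turbid_lakes(red_composite, lake_ids, threshold=0.08):
    """Flag lakes whose mean red-band reflectance in a summer composite
    exceeds a turbidity threshold (sediment plumes brighten the red band).

    red_composite : 2-D array of surface reflectance
    lake_ids      : 2-D int array, 0 = land, k > 0 = lake k
    Returns the set of lake ids flagged as turbid."""
    flagged = set()
    for k in np.unique(lake_ids):
        if k == 0:
            continue  # skip land pixels
        if red_composite[lake_ids == k].mean() > threshold:
            flagged.add(int(k))
    return flagged

# Tiny synthetic scene: lake 1 is clear, lake 2 carries a sediment plume.
scene = np.full((4, 4), 0.02)
ids = np.zeros((4, 4), int)
ids[:2, :2] = 1
ids[2:, 2:] = 2
scene[2:, 2:] = 0.15
```

Run per year, the same per-lake statistic yields the interannual TE time series the study describes.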

  2. Development of RESTful services and map-based user interface tools for access and delivery of data and metadata from the Marine-Geo Digital Library

    NASA Astrophysics Data System (ADS)

    Morton, J. J.; Ferrini, V. L.

    2015-12-01

    The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for the hundreds of thousands of data files curated in the Marine-Geo Digital Library has allowed for precise geographic bounds, enabling geographic searches to an extent not previously possible. All MGDS map interfaces utilize the web services of the Global Multi-Resolution Topography (GMRT) synthesis for displaying global basemap imagery and for dynamically providing depth values at the cursor location.

  3. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis and visualization tools that make full use of the third dimension. Many human activities are now moving toward the third dimension, including land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, its construction (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service to those interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level-of-detail (LOD) 4 data, i.e., photo-realistic textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we present the composition and infrastructure of the web-based 3D map service system and a comparison of V-World with the Google Earth service. We also present Open-API-based service cases and discuss the protection of location privacy in the construction of 3D indoor building models. In order to prevent invasions of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and advanced geospatial information policy is emphasized in Korea. Thus, further progress of the Korean spatial information industry is expected in the near future.

  4. Simulating a 40-year flood event climatology of Australia with a view to ocean-land teleconnections

    NASA Astrophysics Data System (ADS)

    Schumann, Guy J.-P.; Andreadis, Konstantinos; Stampoulis, Dimitrios; Bates, Paul

    2015-04-01

    We develop, for the first time, a proof-of-concept version of a high-resolution global flood inundation model to generate a flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent at a native 1 km resolution. The objectives of our study include (1) deriving an inundation climatology for a continent (Australia) as a demonstrator case to understand the requirements for expanding globally; (2) developing a test bed to assess the potential and value of current and future satellite missions (GRACE, SMAP, ICESat-2, AMSR-2, Sentinels and SWOT) in flood monitoring; and (3) answering science questions such as the linking of inundation to ocean circulation teleconnections. We employ the LISFLOOD-FP hydrodynamic model to generate the flood inundation climatology. The model is built from freely available SRTM-derived data (channel widths, bank heights and floodplain topography corrected for vegetation canopy using ICESat canopy heights). Lakes and reservoirs are represented, and channel hydraulics are resolved using actual channel data with bathymetry inferred from hydraulic geometry. Simulations are run with gauged flows, and the floodplain inundation climatology is compared to observations from GRACE and flood maps from Landsat, SAR, and MODIS. Simulations have been completed for the entire Australian continent. Additionally, changes in flood inundation have been correlated with indices related to global ocean circulation, such as the El Niño Southern Oscillation index. We will produce data layers on flood event climatology and other derived (default) products from the proposed model, including channel and floodplain depths, flow direction, velocity vectors, floodplain water volume, shoreline extent and flooded area. These data layers will be in the form of simple vector and raster formats. Since outputs will be large in size, we propose to upload them onto Google Earth under the GEE API license.

  5. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Ostrenga, D.; Vollmer, B.; Kempler, S.; Deshong, B.; Greene, M.

    2015-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is also home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 17 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include: Level-1 GPM Microwave Imager (GMI), partner radiometer, and DPR products; Level-2 Goddard Profiling Algorithm (GPROF) GMI, partner, and DPR products; Level-3 daily and monthly products and DPR products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, and a help desk; and monitoring services (e.g., Current Conditions) for applications. The Unified User Interface (UUI) is the next step in the evolution of the GES DISC web site. It attempts to provide seamless access to data, information and services through a single interface without sending the user to different applications or URLs (e.g., search, access, subset, Giovanni, documents).

  6. Leveraging Global Geo-Data and Information Technologies to Bring Authentic Research Experiences to Students in Introductory Geosciences Courses

    NASA Astrophysics Data System (ADS)

    Ryan, J. G.

    2014-12-01

    The 2012 PCAST report identified the improvement of "gateway" science courses as critical to increasing the number of STEM graduates to levels commensurate with national needs. The urgent need to recruit/retain more STEM graduates is particularly acute in the geosciences, where growth in employment opportunities, an aging workforce and flat graduation rates are leading to substantial unmet demand for geoscience-trained STEM graduates. The need to increase the number of Bachelors-level geoscience graduates was an identified priority at the Summit on the Future of Undergraduate Geoscience Education (http://www.jsg.utexas.edu/events/future-of-geoscience-undergraduateeducation/), as was the necessity of focusing on 2-year colleges, where a growing number of students are being introduced to the geosciences. Undergraduate research as an instructional tool can help engage and retain students, but it has largely not been part of introductory geoscience courses because of the challenge of scaling such activities for large student numbers. However, burgeoning information technology resources, including publicly available earth and planetary data repositories and freely available, intuitive data visualization platforms, make structured, in-classroom investigations of geoscience questions tractable, and open-ended student inquiry possible. Examples include "MARGINS Mini-Lessons", instructional resources developed with the support of two NSF-DUE grant awards that involve investigations of marine geosciences data resources (overseen by the Integrated Earth Data Applications (IEDA) portal: www.iedadata.org) and data visualization using GeoMapApp (www.geomapapp.org); and the growing suite of Google Earth-based data visualization and exploration activities overseen by the Google Earth in Onsite and Distance Education project (geode.net).
Sample-based investigations are also viable in introductory courses, thanks to remote instrument operations technologies that allow real student participation in instrument-based data collection and interpretation. It is thus possible to model for students nearly the entire scientific process in introductory geoscience courses, allowing them to experience the excitement of "doing" science and thereby enticing more of them into the field.

  7. Landforms in Lidar: Building a Catalog of Digital Landforms for Education and Outreach

    NASA Astrophysics Data System (ADS)

    Kleber, E.; Crosby, C.; Olds, S. E.; Arrowsmith, R.

    2012-12-01

    Lidar (Light Detection and Ranging) has emerged as a fundamental tool in the earth sciences. The collection of high-resolution lidar topography from an airborne or terrestrial platform allows landscapes and landforms to be spatially represented at sub-meter resolution and in three dimensions. While the growing availability of lidar has led to numerous new scientific findings, these data also have tremendous value for earth science education. The study of landforms is an essential and basic element of earth science education that helps students grasp fundamental earth system processes and how they manifest themselves in the world around us. Historically, students have been introduced to landforms and related processes through diagrams and images in earth science textbooks. Lidar data, coupled with free tools such as Google Earth, provide a means for students and the interested public to visualize, explore, and interrogate these same landforms in an interactive manner not possible with two-dimensional remotely sensed imagery. The NSF-funded OpenTopography facility hosts data collected for geologic, hydrologic, and biological research, covering a diverse range of landscapes, and thus provides a wealth of data that could be incorporated into educational materials. OpenTopography, in collaboration with UNAVCO, is developing a catalog of classic geologic landforms depicted in lidar. Beginning with textbook examples of features such as faults and tectonic landforms, dunes, fluvial and glacial geomorphology, and natural hazards such as landslides and volcanoes, the catalog will be an online resource for educators and the interested public. Initially, the landforms will be sourced from pre-existing datasets hosted by OpenTopography. Users will see an image representative of the landform and then have the option to download the data in Google Earth KMZ format, as a digital elevation model, or as the original lidar point cloud file.
By providing the landform in a range of data types, educators can choose to load the image into a presentation, work with the data in a GIS, or do more advanced data analysis on the original point cloud data. In addition, for each landform, links to additional online resources and a bibliography of select publications will be provided. OpenTopography will initially seed the lidar landform catalog, but ultimately the goal is to solicit community contributions as well. We envision the catalog development as the first phase of this activity, and hope that later activities will focus on building curriculum that leverages the catalog and lidar data to teach earth system processes.

  8. Using NASA's Giovanni Web Portal to Access and Visualize Satellite-based Earth Science Data in the Classroom

    NASA Technical Reports Server (NTRS)

    Lloyd, Steven; Acker, James G.; Prados, Ana I.; Leptoukh, Gregory G.

    2008-01-01

    One of the biggest obstacles for the average Earth science student today is locating and obtaining satellite-based remote sensing data sets in a format that is accessible and optimal for their data analysis needs. At the Goddard Earth Sciences Data and Information Services Center (GES-DISC) alone, on the order of hundreds of terabytes of data are available for distribution to scientists, students and the general public. The single biggest and most time-consuming hurdle for most students when they begin their study of the various datasets is how to slog through this mountain of data to arrive at a properly subsetted and manageable data set to answer their science question(s). The GES DISC provides a number of tools for data access and visualization, including the Google-like Mirador search engine and the powerful GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) web interface.

  9. Illuminating Northern California’s Active Faults

    USGS Publications Warehouse

    Prentice, Carol S.; Crosby, Christopher J.; Whitehill, Caroline S.; Arrowsmith, J. Ramon; Furlong, Kevin P.; Philips, David A.

    2009-01-01

    Newly acquired light detection and ranging (lidar) topographic data provide a powerful community resource for the study of landforms associated with the plate boundary faults of northern California (Figure 1). In the spring of 2007, GeoEarthScope, a component of the EarthScope Facility construction project funded by the U.S. National Science Foundation, acquired approximately 2000 square kilometers of airborne lidar topographic data along major active fault zones of northern California. These data are now freely available in point cloud (x, y, z coordinate data for every laser return), digital elevation model (DEM), and KMZ (zipped Keyhole Markup Language, for use in Google Earth™ and other similar software) formats through the GEON OpenTopography Portal (http://www.OpenTopography.org/data). Importantly, vegetation can be digitally removed from lidar data, producing high-resolution images (0.5- or 1.0-meter DEMs) of the ground surface beneath forested regions that reveal landforms typically obscured by vegetation canopy (Figure 2).

  10. A proposed-standard format to represent and distribute tomographic models and other earth spatial data

    NASA Astrophysics Data System (ADS)

    Postpischl, L.; Morelli, A.; Danecek, P.

    2009-04-01

    Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found only in related documents or publications (if available at all). As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be, and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data structures, both human and machine readable, that are automatically recognised by general-purpose software agents, and easily imported into the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on JSON (JavaScript Object Notation), a plain-text, human-readable, lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure, etc.) into a single resource. It is equally suited to represent other geo-referenced volumetric quantities — beyond tomographic models — as well as (structured and unstructured) computational meshes. This approach can exploit the capabilities of the web browser as a computing platform: a series of in-page quick tools for comparative analysis between models will be presented, as well as visualisation techniques for tomographic layers in Google Maps and Google Earth. We are working on tools for conversion into common scientific formats like netCDF, to allow easy visualisation in GEON-IDV or GMT.
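
    The abstract does not fix the schema, but a self-describing, grid-based model record in the proposed spirit might look like the following (field names and values are illustrative, not the actual specification):

```python
import json

# A toy self-describing tomographic model: metadata and grid axes travel
# with the values in one JSON resource.
model = {
    "metadata": {
        "title": "Toy P-wave model",
        "authors": ["A. Researcher"],
        "units": {"vp": "km/s", "depth": "km"},
        "reference": "doi:10.0000/example",
    },
    "grid": {
        "lat": [40.0, 41.0],
        "lon": [10.0, 11.0],
        "depth": [0.0, 50.0],
    },
    # values indexed as [depth][lat][lon]
    "values": {"vp": [[[5.8, 6.1], [5.9, 6.2]],
                      [[8.0, 8.1], [8.0, 8.2]]]},
}

text = json.dumps(model, indent=2)   # human-readable on disk
restored = json.loads(text)          # machine-readable round trip
```

Because JSON is native to the browser, the same record can feed in-page comparison tools or a Google Maps/Earth overlay without any format conversion.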

  11. Enhancements and Evolution of the Real Time Mission Monitor

    NASA Technical Reports Server (NTRS)

    Goodman, Michael; Blakeslee, Richard; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn

    2008-01-01

    The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM has proven extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, mission scientists, instrument scientists and program managers alike appreciate the contributions that RTMM makes to their flight projects. We have received numerous plaudits from a wide variety of scientists who used RTMM during recent field campaigns, including the 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), and 2008 Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) missions, the 2007-2008 NOAA-NASA Aerosonde Hurricane flights, and the 2008 Soil Moisture Active-Passive Validation Experiment (SMAP-VEX). Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface.
Individual flight legs are automatically calculated for altitude, latitude, longitude, flight leg distance, cumulative distance, flight leg time, cumulative time, and satellite overpass intersections. The resultant flight plan is then generated in KML and quickly posted to the Google Earth-based RTMM for interested scientists to view the planned flight track and then compare it to the actual real time flight progress. A description of the system architecture, components, and applications along with reviews and animations of RTMM during the field campaigns, plus planned enhancements and future opportunities will be presented.
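
    The per-leg quantities the Waypoint Planning Tool automates are straightforward great-circle geometry. A sketch of the distance part, using the haversine formula (the waypoints are hypothetical; the tool itself is Java-based and also handles altitude, leg times, and satellite overpasses):

```python
import math

def leg_lengths_km(waypoints):
    """Great-circle length of each flight leg (haversine formula,
    mean Earth radius 6371 km) for (lat, lon) waypoints in degrees."""
    R = 6371.0
    legs = []
    for (la1, lo1), (la2, lo2) in zip(waypoints, waypoints[1:]):
        p1, p2 = math.radians(la1), math.radians(la2)
        dp = math.radians(la2 - la1)
        dl = math.radians(lo2 - lo1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        legs.append(2 * R * math.asin(math.sqrt(a)))
    return legs

# Hypothetical three-waypoint track: one northward leg, one eastward leg.
track = [(34.65, -86.68), (35.0, -86.68), (35.0, -85.0)]
legs = leg_lengths_km(track)
```

Cumulative distance is then a running sum of `legs`, and leg time follows from dividing by the planned ground speed.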

  12. Spatial Distribution of the Population at Risk of Cholangiocarcinoma in Chum Phaung District, Nakhon Ratchasima Province of Thailand.

    PubMed

    Kaewpitoon, Soraya J; Rujirakul, Ratana; Loyd, Ryan A; Matrakool, Likit; Sangkudloa, Amnat; Kaewthani, Sarochinee; Khemplila, Kritsakorn; Eaksanti, Thawatchai; Phatisena, Tanida; Kujapun, Jirawoot; Norkaew, Jun; Joosiri, Apinya; Kaewpitoon, Natthawut

    2016-01-01

    Cholangiocarcinoma (CCA) is a serious health problem in Thailand, particularly in the northeastern and northern regions, but epidemiological studies are scarce and the spatial distribution of CCA remains to be determined. A database for the population at risk is required for monitoring, surveillance and organization of home health care. The aim of this study was to geo-visually display the distribution of CCA risk in northeast Thailand, using a geographic information system and Google Earth. A cross-sectional survey was carried out in 9 sub-districts and 133 villages in Chum Phuang district, Nakhon Ratchasima province, between June and October 2015. Data on demography and the population at risk for CCA were combined with the points of villages, sub-district boundaries, district boundaries, and points of hospitals in districts, then fed into a geographical information system. After conversion, all of the data were imported into Google Earth for geo-visualization. A total of 11,960 of the 83,096 population were included in this study. Females and males accounted for 52.5% and 47.8%, and the 41-50 years age group for 33.3%. Individual risk for CCA was identified and classified using the Korat CCA verbal screening test as low risk (92.8%), followed by high risk (6.74%), and no risk (0.49%), respectively. Gender (χ2=1143.63, p-value=0.001), age group (χ2=211.36, p-value=0.0001), and sub-district (χ2=1471.858, p-value=0.0001) were significantly associated with CCA risk. The spatial distribution of the population at risk for CCA in Chum Phuang district was viewed with Google Earth. The geo-visual display comprised Layer 1: district, Layer 2: sub-district, Layer 3: number at low risk in each village, Layer 4: number at high risk in each village, and Layer 5: hospitals in Chum Phuang district and their related catchment areas. We present the first risk geo-visual display of CCA in this rural community, which is important for spatial targeting of control efforts. Risk appears to be strongly associated with gender, age group, and sub-district. Therefore, geo-visual display of the spatial distribution is suitable for use in further monitoring, surveillance, and home health care for CCA.

  13. Mapping rice extent map with crop intensity in south China through integration of optical and microwave images based on google earth engine

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Wu, B.; Zhang, M.; Zeng, H.

    2017-12-01

    Rice is one of the main staple foods in East Asia and Southeast Asia; it feeds more than half of the world's population while occupying 11% of cultivated land. Studies of rice can provide direct or indirect information on food security and water resource management. Remote sensing has proven to be the most effective method for monitoring cropland at large scale, using temporal and spectral information. Two main kinds of satellites have been used to map rice: microwave and optical. Rice, the main crop of paddy fields, differs from other crops chiefly in the flooding of its fields at the planting stage (Figure 1). Microwave satellites can penetrate clouds and are efficient at monitoring this flooding phenomenon. Meanwhile, vegetation indices based on optical satellites can distinguish rice well from other vegetation. Google Earth Engine is a cloud-based platform that makes it easy to access high-performance computing resources for processing very large geospatial datasets. Google has collected a large volume of remote sensing satellite data from around the world, giving researchers the possibility of applying multi-source remote sensing data over large areas. In this work, we map rice planting area in south China through integration of Landsat-8 OLI, Sentinel-2, and Sentinel-1 Synthetic Aperture Radar (SAR) images. The flowchart is shown in Figure 2. First, a threshold method applied to the VH-polarized backscatter from the SAR sensor and to vegetation indices from the optical sensors, including the normalized difference vegetation index (NDVI) and enhanced vegetation index (EVI), was used to classify the rice extent map. Forest and water surface extent maps provided by Earth Engine were used to mask forest and water. To overcome the "salt and pepper effect" of pixel-based classification at increased spatial resolution, we segment the optical image and merge the pixel-based classification results with the object-oriented segmentation data to obtain the final rice extent map. Finally, using time series analysis, the peak count was obtained for each rice area to determine the crop intensity. Rice ground points from the GVG crowdsourcing smartphone application and rice area statistics from the National Bureau of Statistics were used to validate and evaluate our result.
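The per-pixel threshold step described above can be sketched in a few lines. This is a simplified illustration using only a VH and an NDVI threshold; the threshold values, function name, and sample arrays below are assumptions for the sketch, not values from the study:

```python
import numpy as np

# Hypothetical thresholds; the study's actual values are not given here.
VH_FLOOD_DB = -18.0   # VH backscatter below this suggests a flooded paddy
NDVI_VEG = 0.4        # NDVI above this suggests established vegetation

def rice_mask(vh_flood_season, ndvi_growth_season, forest_mask, water_mask):
    """Pixel-wise rice classification: flooded at planting AND vegetated
    later in the season, excluding permanent forest and water."""
    flooded = vh_flood_season < VH_FLOOD_DB
    vegetated = ndvi_growth_season > NDVI_VEG
    return flooded & vegetated & ~forest_mask & ~water_mask

vh = np.array([[-22.0, -10.0], [-20.0, -25.0]])
ndvi = np.array([[0.6, 0.7], [0.2, 0.5]])
forest = np.array([[False, False], [False, False]])
water = np.array([[False, False], [False, True]])
print(rice_mask(vh, ndvi, forest, water))
```

Only the pixel that is both flooded in the SAR scene and vegetated in the optical scene, and not masked as forest or water, survives the classification.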

  14. Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges

    NASA Technical Reports Server (NTRS)

    He, Matt; Hardin, Danny; Mayer, Paul; Blakeslee, Richard; Goodman, Michael

    2012-01-01

    Airborne real-time observations are a major component of NASA's Earth science research and satellite ground validation studies. Multiple aircraft are involved in most NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes, and constantly evolving, dynamic weather conditions often determines the success of the campaign. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real-time situational awareness of the weather conditions that affect the aircraft track. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks. Scientists at the University of Alabama in Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. The Waypoint Planning Tool has so far gone through three rounds of development and analysis. Its development has been directly shaped by advances in GIS and mapping technologies: from the standalone Google Earth application with simple KML functionality, to the Google Earth Plugin on a web platform, to the rising open-source GIS tools built on new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives.
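As a minimal sketch of the KML stage of such a tool, a flight plan can be serialized as a KML LineString that Google Earth (or the Google Earth Plugin) will display. The function name and sample waypoints below are hypothetical, not part of the Waypoint Planning Tool itself:

```python
def waypoints_to_kml(name, waypoints):
    """Serialize a flight plan as a KML LineString for Google Earth.
    waypoints is a list of (longitude, latitude, altitude_m) tuples."""
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in waypoints)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        f"  <Placemark><name>{name}</name>\n"
        "    <LineString><altitudeMode>absolute</altitudeMode>\n"
        f"      <coordinates>{coords}</coordinates>\n"
        "    </LineString>\n"
        "  </Placemark>\n"
        "</kml>\n"
    )

# Two made-up waypoints near Huntsville, AL at 6000 m altitude.
plan = [(-86.66, 34.72, 6000.0), (-86.10, 34.90, 6000.0)]
print(waypoints_to_kml("Demo flight plan", plan))
```

The resulting file can be saved with a `.kml` extension and opened directly in Google Earth.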

  15. Using Google Earth to Explore Multiple Data Sets and Plate Tectonic Concepts

    NASA Astrophysics Data System (ADS)

    Goodell, L. P.

    2015-12-01

    Google Earth (GE) offers an engaging and dynamic environment for exploration of earth science data. While GIS software offers higher-level analytical capability, it comes with a steep learning curve and a complex interface that is not easy for the novice, and in many cases the instructor, to negotiate. In contrast, the intuitive interface of GE makes it easy for students to quickly become proficient in manipulating the globe and independently exploring relationships between multiple data sets at a wide range of scales. Inquiry-based, data-rich exercises have been developed for both introductory and upper-level activities, including: exploration of plate boundary characteristics and relative motion across plate boundaries; determination and comparison of short-term and long-term average plate velocities; crustal strain analysis (modeled after the UNAVCO activity); and determining earthquake epicenters, body-wave magnitudes, and focal plane solutions. Used successfully in undergraduate course settings, for TA training, and for professional development programs for middle and high school teachers, the exercises use the following GE data sets (with sources) that have been collected/compiled by the author and are freely available for non-commercial use: 1) tectonic plate boundaries and plate names (Bird, 2003 model); 2) real-time earthquakes (USGS); 3) 30 years of M>=5.0 earthquakes, plotted by depth (USGS); 4) seafloor age (Mueller et al., 1997, 2008); 5) location and age data for hot spot tracks (published literature); 6) Holocene volcanoes (Smithsonian Global Volcanism Program); 7) GPS station locations with links to time series (JPL, NASA, UNAVCO); 8) short-term motion vectors derived from GPS time series; 9) long-term average motion vectors derived from plate motion models (UNAVCO plate motion calculator); 10) earthquake data sets consisting of seismic station locations and links to relevant seismograms (Rapid Earthquake Viewer, USC/IRIS/DLESE).
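The short-term motion vectors in item 8 come down to a linear least-squares fit through a GPS position time series. A minimal sketch of that computation, using synthetic station data (the 25 mm/yr rate and noise level are invented for the demo):

```python
import numpy as np

# Synthetic daily east-component positions (mm) for a GPS station over
# 5 years; a linear least-squares fit recovers the average velocity.
t_years = np.arange(0, 5, 1 / 365.25)
true_rate = 25.0                                 # mm/yr, assumed for the demo
rng = np.random.default_rng(0)
east_mm = true_rate * t_years + rng.normal(0, 2.0, t_years.size)

rate, intercept = np.polyfit(t_years, east_mm, 1)
print(f"estimated velocity: {rate:.1f} mm/yr")
```

Fitting the north component the same way gives the second component of the motion vector that would be plotted in GE.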

  16. Measuring river from the cloud - River width algorithm development on Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Yang, X.; Pavelsky, T.; Allen, G. H.; Donchyts, G.

    2017-12-01

    Rivers are some of the most dynamic features of the terrestrial land surface. They help distribute freshwater, nutrients, and sediment, and they are also responsible for some of the greatest natural hazards. Despite their importance, our understanding of river behavior is limited at the global scale, in part because we do not have a river observational dataset that spans both time and space. Remote sensing data represent a rich, largely untapped resource for observing river dynamics. In particular, publicly accessible archives of satellite optical imagery, which date back to the 1970s, can be used to study the planview morphodynamics of rivers at the global scale. Here we present an image processing algorithm, developed using the Google Earth Engine cloud-based platform, that automatically extracts river centerlines and widths from Landsat 5, 7, and 8 scenes at 30 m resolution. Our algorithm makes use of the latest monthly global surface water history dataset and an existing Global River Width from Landsat (GRWL) dataset to efficiently extract river masks from each Landsat scene. A combination of distance-transform and skeletonization techniques is then used to extract river centerlines. Finally, our algorithm calculates the wetted river width at each centerline pixel, perpendicular to its local centerline direction. We validated this algorithm using in situ data from 16 USGS gauge stations (N=1781). We find that 92% of the width differences are within 60 m (i.e. the minimum length of 2 Landsat pixels). Leveraging Earth Engine's infrastructure of collocated data and processing power, our goal is to use this algorithm to reconstruct the morphodynamic history of rivers globally by processing over 100,000 Landsat 5 scenes covering 1984 to 2013.
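The distance-transform idea at the heart of the width calculation can be illustrated on a synthetic 30 m resolution water mask. This is a simplified sketch of the concept, not the authors' Earth Engine implementation (which also uses skeletonization and local centerline directions); the crude per-column centerline below is an assumption that only works for an east-west river:

```python
import numpy as np
from scipy import ndimage

# Synthetic 30 m/pixel water mask: a straight river 7 pixels (210 m) wide.
mask = np.zeros((40, 40), dtype=bool)
mask[16:23, :] = True

# Distance from each water pixel to the nearest bank, in metres.
dist_m = ndimage.distance_transform_edt(mask) * 30.0

# Crude centerline: the row of maximum bank distance in each column.
centerline_rows = dist_m.argmax(axis=0)
# Wetted width at a centerline pixel is roughly twice its bank distance,
# minus one pixel so the centre pixel is not counted twice.
widths = 2 * dist_m[centerline_rows, np.arange(mask.shape[1])] - 30.0
print(widths[:3])
```

For this idealized mask the estimate recovers the true 210 m width exactly; on real masks the skeletonization and perpendicular-direction steps handle meandering channels.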

  17. The E3 ubiquitin ligases β-TrCP and FBXW7 cooperatively mediates GSK3-dependent Mcl-1 degradation induced by the Akt inhibitor API-1, resulting in apoptosis.

    PubMed

    Ren, Hui; Koo, Junghui; Guan, Baoxiang; Yue, Ping; Deng, Xingming; Chen, Mingwei; Khuri, Fadlo R; Sun, Shi-Yong

    2013-11-22

    The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. API-1 rapidly and potently reduced the levels of Mcl-1, primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued the Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels, accompanied by a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited the Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued the Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. API-1 thus induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis.

  18. The E3 ubiquitin ligases β-TrCP and FBXW7 cooperatively mediates GSK3-dependent Mcl-1 degradation induced by the Akt inhibitor API-1, resulting in apoptosis

    PubMed Central

    2013-01-01

    Background The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. Results API-1 rapidly and potently reduced the levels of Mcl-1, primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued the Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels, accompanied by a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited the Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued the Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. Conclusions API-1 induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis. PMID:24261825

  19. Advances in Integrating Autonomy with Acoustic Communications for Intelligent Networks of Marine Robots

    DTIC Science & Technology

    2013-02-01

    [Search snippet: fragments of a UML deployment diagram showing environmental sampling AUVs (the OEX Ocean Explorer, Hammerhead Iver2, and the Unicorn and Macrura Bluefin 21 AUVs) with MOOS computers, GPS, and a topside MOOS computer, linked by acoustic Micro-Modem and Edgetech connections, serial, wired, and 5.0 GHz WiLan wifi links, with Google Earth as an executable component.]

  20. Analysis of the Global Maritime Transportation System and Its Resilience

    DTIC Science & Technology

    2017-06-01

    shortest/cheapest available route. • We establish re-routing strategies that apply, if a part of a route becomes impassable for container ships. We...The currently available throughput data is mostly from 2011, with few exceptions of 2010 and 2012. In total, there are 94 container seaports from 58...port or to the next available container port for further transportation by sea. Figure 3.5. The road layer visualized in Google Earth. Because our

  1. Naval Fuel Management System (NFMS): A Decision Support System for a Limited Resource

    DTIC Science & Technology

    2010-09-01

    Figure 14. Proof of concept displayed on Google Earth. Figure 15. GTG fuel burn rate. From [10...the gas turbine generators ( GTG ) to provide electricity. An average of 280 gallons per hour (GPH) was used for the GTG to take into account changing... GTGs or operating on more than one GTG for short periods of time. The amount of fuel burned by the GTGs was found in the Shipboard Energy

  2. The Chandra Source Catalog : Google Earth Interface

    NASA Astrophysics Data System (ADS)

    Glotfelty, Kenny; McLaughlin, W.; Evans, I.; Evans, J.; Anderson, C. S.; Bonaventura, N. R.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, H.; Houck, J. C.; Karovska, M.; Kashyap, V. L.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Mossman, A. E.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. R.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) contains multi-resolution, exposure-corrected, background-subtracted, full-field images that are stored as individual FITS files and as three-color JPEG files. In this poster we discuss how we took these data and were able to, with relatively minimal effort, convert them for use with the Google Earth application in its "Sky" mode. We highlight some of the challenges, which include converting the data to the required Mercator projection, reworking the three-color algorithm for pipeline processing, and ways to reduce the data volume through re-binning, using color maps, and special Keyhole Markup Language (KML) tags that load images only on demand. The result is a collection of some 11,000 three-color images that are available for all the individual observations in CSC Release 1. We have also made available all ~4000 field-of-view outlines (with per-chip regions), which turn out to be trivial to produce starting with a simple dmlist command. In the first week of release, approximately 40% of the images were accessed at least once through some 50,000 individual web hits, which served over 4 GB of data to roughly 750 users in 60+ countries. We also highlight some future directions we are exploring, including real-time catalog access to individual source properties and eventual access to file-based products such as FITS images, spectra, and light curves.
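The on-demand loading mentioned above is typically achieved in KML by wrapping a NetworkLink in a Region: Google Earth fetches the link only once the region occupies enough screen pixels. A minimal sketch of generating such a fragment follows; the file name, bounding box, and function name are invented for illustration:

```python
def on_demand_link(href, north, south, east, west, min_lod_pixels=128):
    """Build a KML NetworkLink wrapped in a Region: Google Earth fetches
    href only when the region covers at least min_lod_pixels on screen,
    keeping the initial catalog download small."""
    return f"""\
<NetworkLink>
  <Region>
    <LatLonAltBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonAltBox>
    <Lod><minLodPixels>{min_lod_pixels}</minLodPixels></Lod>
  </Region>
  <Link><href>{href}</href><viewRefreshMode>onRegion</viewRefreshMode></Link>
</NetworkLink>"""

# A hypothetical per-observation image overlay, loaded only when zoomed in.
print(on_demand_link("obsid_635_img.kml", 10.5, 10.2, 83.9, 83.6))
```

Emitting one such fragment per observation lets a catalog of thousands of images stay responsive, since only the overlays in view are ever downloaded.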

  3. Harnessing Satellite Imageries in Feature Extraction Using Google Earth Pro

    NASA Astrophysics Data System (ADS)

    Fernandez, Sim Joseph; Milano, Alan

    2016-07-01

    Climate change has been a long-standing concern worldwide. Impending flooding, for one, is among its unwanted consequences. The Phil-LiDAR 1 project of the Department of Science and Technology (DOST), Republic of the Philippines, has developed an early warning system for flood hazards. The project uses remote sensing technologies to determine the lives in probable danger by mapping and attributing building features using LiDAR datasets and satellite imagery. A free mapping software package named Google Earth Pro (GEP) is used to load these satellite images as base maps. Geotagging of building features has so far been done with handheld Global Positioning System (GPS) units. Alternatively, mapping and attribution of building features using GEP saves a substantial amount of resources such as manpower, time, and budget. Accuracy-wise, geotagging in GEP depends on either the satellite imagery or the half-meter-resolution orthophotographs obtained during LiDAR acquisition, rather than on the three-meter accuracy of the GPS. The attributed building features are overlain on the Phil-LiDAR 1 flood hazard map to determine the exposed population. Building features obtained from satellite imagery may be used not only in flood exposure assessment but also in assessing other hazards, among a number of other uses. Several other features may also be extracted from the satellite imagery.

  4. The impact of land use changes in the Banjarsari village, Cerme district of Gresik Regency, East Java Province

    NASA Astrophysics Data System (ADS)

    Ayu Larasati, Dian; Hariyanto, Bambang

    2018-01-01

    High population growth and development activities in various fields lead to a growing demand for land. Cerme is a district close to the city of Surabaya; consequently, much agricultural land in Cerme has been converted to housing and industry to support population growth that land within Surabaya itself can no longer accommodate. This fact motivated the research. The aims of this research are to determine the pattern of land use changes in recent years and to analyze the socioeconomic changes in the Banjarsari village, Gresik Regency. Determining the socioeconomic changes in the study area requires: a) population change data from 2010 to 2015, and b) Google Earth imagery from 2010 to 2015. The population data and the changes in type of work are described by time series and land cover change analysis. To analyze the land use conversion we also use Google Earth imagery with ArcGIS applications, with astronomical layout correction based on GPS field checks and the RBI map. The findings of this study are: 1) farmland converted into residential settlements in 2004-2014 amounts to 12%; 2) people who changed their livelihood amount to 39%. These occupational changes affect population income, which ranges from 500,000 IDR to 1,000,000 IDR per month per capita.

  5. A campus-based course in field geology

    NASA Astrophysics Data System (ADS)

    Richard, G. A.; Hanson, G. N.

    2009-12-01

    GEO 305: Field Geology offers students practical experience in the field and in the computer laboratory conducting geological field studies on the Stony Brook University campus. Computer laboratory exercises feature mapping techniques and field studies of glacial and environmental geology, and include geophysical and hydrological analysis, interpretation, and mapping. Participants learn to use direct measurement and mathematical techniques to compute the location and geometry of features and gain practical experience in representing raster imagery and vector geographic data as features on maps. Data collecting techniques in the field include the use of hand-held GPS devices, compasses, ground-penetrating radar, tape measures, pacing, and leveling devices. Assignments that utilize these skills and techniques include mapping campus geology with GPS, using Google Earth to explore our geologic context, data file management and ArcGIS, tape and compass mapping of woodland trails, pace and compass mapping of woodland trails, measuring elevation differences on a hillside, measuring geologic sections and cores, drilling through glacial deposits, using ground penetrating radar on glaciotectonic topography, mapping the local water table, and the identification and mapping of boulders. Two three-hour sessions are offered per week, apportioned as needed between lecture; discussion; guided hands-on instruction in geospatial and other software such as ArcGIS, Google Earth, spreadsheets, and custom modules such as an arc intersection calculator; outdoor data collection and mapping; and writing of illustrated reports.

  6. DyninstAPI Patches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LeGendre, M.

    2012-04-01

    We are seeking a code review of patches against DyninstAPI 8.0. DyninstAPI is an open source binary instrumentation library from the University of Wisconsin and University of Maryland. Our patches port DyninstAPI to the BlueGene/P and BlueGene/Q systems, as well as fix DyninstAPI bugs and implement minor new features in DyninstAPI.

  7. Presidential Citation for Science and Society

    NASA Astrophysics Data System (ADS)

    2012-07-01

    AGU presented its Presidential Citation for Science and Society to three recipients at a reception on 1 May 2012 in the Rayburn House Office Building as part of the inaugural AGU Science Policy Conference. Google Earth, Jane Lubchenco, who is the under secretary of Commerce for oceans and atmosphere and administrator of the National Oceanic and Atmospheric Administration, and Sen. Olympia Snowe (R-Maine) were recognized for their leadership and vision in shaping policy and heightening public awareness of the value of Earth and space science. “This is an important award because with it AGU brings to light the importance of cutting-edge use-inspired science that helps people, communities, and businesses adapt to climate change and sustainably manage our oceans and coasts,” Lubchenco said.

  8. TerraceM: A Matlab® tool to analyze marine terraces from high-resolution topography

    NASA Astrophysics Data System (ADS)

    Jara-Muñoz, Julius; Melnick, Daniel; Strecker, Manfred

    2015-04-01

    Light detection and ranging (LiDAR) high-resolution topographic data sets now enable remote identification of submeter-scale geomorphic features, providing valuable information about the landscape and about geomorphic markers of tectonic deformation such as fault-scarp offsets and fluvial and marine terraces. Recent studies of marine terraces using LiDAR data have demonstrated that these landforms can be readily isolated from other landforms in the landscape using slope and roughness parameters, allowing regional extents of terrace sequences to be mapped unambiguously. Marine terrace elevations have been used for decades as geodetic benchmarks of Quaternary deformation. Uplift rates may be estimated by locating the shoreline angle, a geomorphic feature correlated with the high-stand position of past sea levels. Indeed, precise identification of the shoreline-angle position is an important requirement for obtaining reliable tectonic rates and coherent spatial correlations. To improve our ability to rapidly assess and map shoreline angles at a regional scale we have developed the TerraceM application. TerraceM is a Matlab® tool for estimating the shoreline angle and its associated error from high-resolution topography. For convenience, TerraceM includes a graphical user interface (GUI) linked with the Google Maps® API. The analysis starts by defining swath profiles, oriented orthogonally to the terrace riser, from a shapefile created on a GIS platform. TerraceM functions are included to extract and analyze the swath profiles. Two types of coastal landscape may be analyzed, using different methodologies: staircase sequences of multiple terraces, and rough, rocky coasts. The former are measured by outlining the paleo-cliffs and paleo-platforms, whereas the latter are assessed by picking the elevations of sea-stack tops. In staircase terraces, we define the shoreline angle by calculating the intersection between first-order interpolations of the maximum topography of the swath profiles. For rocky coasts, the maximum stack peaks within a defined search radius, together with a defined inflection point on the adjacent main cliff, are interpolated to calculate the shoreline angle at the intersection with the cliff. Error estimates are based on the standard deviation of the linear regressions. The geomorphic age of terraces (Kt) can also be calculated via the linear diffusion equation (Hanks et al., 1989), with the best-fitting model found by minimizing the RMS error. TerraceM can efficiently process several profiles in a batch-mode run. Results may be exported in various formats, including Google Earth and ArcGIS, and basic statistics are automatically computed. Test runs have been made at Santa Cruz, California, using various topographic data sets and comparing results with published field measurements (Anderson and Menking, 1994). Repeatability was evaluated using multiple test runs made by students in a classroom setting.
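The staircase-terrace method, intersecting first-order fits to the paleo-platform and paleo-cliff, reduces to intersecting two regression lines. A minimal sketch (the swath-profile points below are invented for illustration, and TerraceM itself is Matlab®, not Python):

```python
import numpy as np

def shoreline_angle(platform_xz, cliff_xz):
    """First-order (linear) fits to paleo-platform and paleo-cliff points
    from a swath profile; their intersection approximates the shoreline
    angle. Each argument is a (distances, elevations) pair; returns (x, z)."""
    m1, b1 = np.polyfit(*platform_xz, 1)
    m2, b2 = np.polyfit(*cliff_xz, 1)
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

# Hypothetical maximum-topography points (distance m, elevation m).
platform = (np.array([0.0, 50, 100, 150]), np.array([20.0, 20.5, 21.0, 21.5]))
cliff = (np.array([160.0, 170, 180]), np.array([25.0, 30.0, 35.0]))
x, z = shoreline_angle(platform, cliff)
print(round(x, 1), round(z, 2))
```

The elevation z of the intersection is the shoreline-angle estimate; the spread of the regression residuals would supply the error bar described in the abstract.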

  9. Fusion of the C-terminal triskaidecapeptide of hirudin variant 3 to alpha1-proteinase inhibitor M358R increases the serpin-mediated rate of thrombin inhibition

    PubMed Central

    2013-01-01

    Background Alpha-1 proteinase inhibitor (API) is a plasma serpin superfamily member that inhibits neutrophil elastase; variant API M358R inhibits thrombin and activated protein C (APC). Fusing residues 1-75 of another serpin, heparin cofactor II (HCII), to API M358R (in HAPI M358R) was previously shown to accelerate thrombin inhibition over API M358R by conferring thrombin exosite 1 binding properties. We hypothesized that replacing the HCII 1-75 region with the 13 C-terminal residues (triskaidecapeptide) of hirudin variant 3 (HV3 54-66) would further enhance the inhibitory potency of API M358R fusion proteins. We therefore expressed HV3API M358R (HV3 54-66 fused to API M358R) and HV3API RCL5 (HV3 54-66 fused to API F352A/L353V/E354V/A355I/I356A/I460L/M358R) as N-terminally hexahistidine-tagged polypeptides in E. coli. Results HV3API M358R inhibited thrombin 3.3-fold more rapidly than API M358R; for HV3API RCL5 the rate enhancement was 1.9-fold versus API RCL5; neither protein inhibited thrombin as rapidly as HAPI M358R. While the thrombin/activated protein C rate constant ratio was 77-fold higher for HV3API RCL5 than for HV3API M358R, most of the increased specificity derived from the API F352A/L353V/E354V/A355I/I356A/I460L (API RCL5) mutations, since API RCL5 remained 3-fold more specific than HV3API RCL5. An HV3 54-66 peptide doubled the thrombin clotting time (TCT) and halved the binding of thrombin to immobilized HCII 1-75 at lower concentrations than free HCII 1-75. HV3API RCL5 bound active site-inhibited FPR-chloromethyl ketone-thrombin more effectively than HAPI RCL5. Transferring the fused HV3 triskaidecapeptide to the C-terminus of API M358R decreased the rate of thrombin inhibition relative to that mediated by HV3API M358R by 11- to 14-fold. Conclusions Fusing the C-terminal triskaidecapeptide of HV3 to API M358R-containing serpins significantly increased their effectiveness as thrombin inhibitors, but the enhancement was less than that seen in HCII 1-75-API M358R fusion proteins. HCII 1-75 was a superior fusion partner, in spite of the greater affinity of the HV3 triskaidecapeptide, manifested in both isolated and API-fused form, for thrombin exosite 1. Our results suggest that HCII 1-75 binds thrombin exosite 1 and orients the attached serpin scaffold for more efficient interaction with the active site of thrombin than does the HV3 triskaidecapeptide. PMID:24215622

  10. Extending the Common Framework for Earth Observation Data to other Disciplinary Data and Programmatic Access

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Wyborn, L. A.; Druken, K. A.; Richards, C. J.; Trenham, C. E.; Wang, J.

    2016-12-01

    The Australian National Computational Infrastructure (NCI) manages a large geospatial repository (10+ PBytes) of Earth systems, environmental, water management, and geophysics research data, co-located with a petascale supercomputer and an integrated research cloud. NCI has applied the principles of the "Common Framework for Earth-Observation Data" (the Framework) to the organisation of these collections, enabling a diverse range of researchers to explore different aspects of the data and, in particular, enabling seamless programmatic data analysis, both in-situ and via data services. NCI provides access to the collections through the National Environmental Research Data Interoperability Platform (NERDIP), a comprehensive and integrated data platform with both established and emerging services designed to enable data accessibility and citability. Applying the Framework across the range of datasets ensures that programmatic access, whether through in-situ or network methods, works as uniformly as possible for any dataset, using both APIs and data services. NCI has also created a comprehensive quality assurance framework to regularise compliance checks across the data, library APIs, and data services, and to establish a comprehensive set of benchmarks that quantify the Framework from both functionality and performance perspectives. The quality assurance includes organisation of datasets through a data management plan, which anchors the data directory structure, version controls, and data information services so that they are kept aligned with operational changes over time. Particular attention has been paid to the way data are packed inside the files. Our experience has shown that complying with standards such as CF and ACDD is still not enough to ensure that all data services or software packages correctly read the data. Further, data may not be optimally organised for the different access patterns, which causes poor CPU performance and bandwidth utilisation. 
We will also discuss some gaps in the Framework that have emerged and our approach to resolving these.

  11. Pulling on the Long Tail with Flyover Country, a Mobile App to Expose, Visualize, Discover, and Explore Open Geoscience Data

    NASA Astrophysics Data System (ADS)

    Myrbo, A.; Loeffler, S.; Ai, S.; McEwan, R.

    2015-12-01

    The ultimate EarthCube product has been described as a mobile app that provides all of the known geoscience data for a geographic point or polygon, from the top of the atmosphere to the core of the Earth, throughout geologic time. The database queries are hidden from the user, and the data are visually rendered for easy recognition of patterns and associations. This fanciful vision is not so remote: NSF EarthCube and Geoinformatics support has already fostered major advances in database interoperability and harmonization of APIs; numerous "domain repositories," databases curated by subject matter experts, now provide a vast wealth of open, easily-accessible georeferenced data on rock and sediment chemistry and mineralogy, paleobiology, stratigraphy, rock magnetics, and more. New datasets accrue daily, including many harvested from the literature by automated means. None of these constitute big data - all are part of the long tail of geoscience, heterogeneous data consisting of relatively small numbers of measurements made by a large number of people, typically on physical samples. This vision of mobile data discovery requires a software package to cleverly expose these domain repositories' holdings; currently, queries mainly come from single investigators to single databases. The NSF-funded mobile app Flyover Country (FC; fc.umn.edu), developed for geoscience outreach and education, has been welcomed by data curators and cyberinfrastructure developers as a testing ground for their API services, data provision, and scalability. FC pulls maps and data within a bounding envelope and caches them for offline use; location-based services alert users to nearby points of interest (POI). The incorporation of data from multiple databases across domains requires parsimonious data requests and novel visualization techniques, especially for mapping of data with a time or stratigraphic depth component. 
The preservation of data provenance and authority is critical for researcher buy-in to all community databases, and further allows exploration and suggestions of collaborators, based upon geography and topical relevance.
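The pull-maps-and-data-within-a-bounding-envelope-and-cache pattern described above can be sketched in a few lines of plain data handling. The record fields (`lat`, `lon`) and the JSON cache format below are illustrative assumptions, not Flyover Country's actual API or storage scheme:

```python
import json

def in_envelope(poi, min_lat, max_lat, min_lon, max_lon):
    """True if a point of interest lies inside the bounding envelope."""
    return min_lat <= poi["lat"] <= max_lat and min_lon <= poi["lon"] <= max_lon

def cache_pois(pois, envelope, cache_path):
    """Keep only POIs inside the flight-path envelope and cache them as JSON
    so they remain available offline (e.g., in airplane mode)."""
    selected = [p for p in pois if in_envelope(p, *envelope)]
    with open(cache_path, "w") as f:
        json.dump(selected, f)
    return selected
```

The same filter-then-cache step would be repeated per domain repository, which is why parsimonious requests matter when many databases are queried for one flight path.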

  12. Neighbourhood looking glass: 360º automated characterisation of the built environment for neighbourhood effects research.

    PubMed

    Nguyen, Quynh C; Sajjadi, Mehdi; McCullough, Matt; Pham, Minh; Nguyen, Thu T; Yu, Weijun; Meng, Hsien-Wen; Wen, Ming; Li, Feifei; Smith, Ken R; Brunisholz, Kim; Tasdizen, Tolga

    2018-03-01

    Neighbourhood quality has been connected with an array of health issues, but neighbourhood research has been limited by the lack of methods to characterise large geographical areas. This study uses innovative computer vision methods and a new big data source of street view images to automatically characterise neighbourhood built environments. A total of 430 000 images were obtained using Google's Street View Image API for Salt Lake City, Chicago and Charleston. Convolutional neural networks were used to create indicators of street greenness, crosswalks and building type. We implemented log Poisson regression models to estimate associations between built environment features and individual prevalence of obesity and diabetes in Salt Lake City, controlling for individual-level and zip code-level predisposing characteristics. Computer vision models had an accuracy of 86%-93% compared with manual annotations. Charleston had the highest percentage of green streets (79%), while Chicago had the highest percentage of crosswalks (23%) and commercial buildings/apartments (59%). Built environment characteristics were categorised into tertiles, with the highest tertile serving as the referent group. Individuals living in zip codes with the most green streets, crosswalks and commercial buildings/apartments had relative obesity prevalences that were 25%-28% lower and relative diabetes prevalences that were 12%-18% lower than individuals living in zip codes with the least abundance of these neighbourhood features. Neighbourhood conditions may influence chronic disease outcomes. Google Street View images represent an underused data resource for the construction of built environment features. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
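The tertile comparison reported above can be sketched as a crude prevalence ratio. The study itself fitted log Poisson models adjusting for individual- and zip-code-level covariates; this minimal version omits that adjustment, and all counts below are invented:

```python
def prevalence(cases, population):
    """Crude prevalence: total cases over total population."""
    return sum(cases) / sum(population)

# Hypothetical zip-code-level counts, grouped by tertile of street greenness.
least_green = {"cases": [120, 90], "pop": [1000, 800]}   # referent tertile
most_green = {"cases": [70, 60], "pop": [1000, 900]}

# Relative prevalence in the greenest tertile vs the referent tertile;
# a value below 1.0 means lower prevalence among the greenest zip codes.
ratio = (prevalence(most_green["cases"], most_green["pop"])
         / prevalence(least_green["cases"], least_green["pop"]))
```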

  13. N-Terminal Ile-Orn- and Trp-Orn-Motif Repeats Enhance Membrane Interaction and Increase the Antimicrobial Activity of Apidaecins against Pseudomonas aeruginosa

    PubMed Central

    Bluhm, Martina E. C.; Schneider, Viktoria A. F.; Schäfer, Ingo; Piantavigna, Stefania; Goldbach, Tina; Knappe, Daniel; Seibel, Peter; Martin, Lisandra L.; Veldhuizen, Edwin J. A.; Hoffmann, Ralf

    2016-01-01

    The Gram-negative bacterium Pseudomonas aeruginosa is a life-threatening nosocomial pathogen due to its generally low susceptibility toward antibiotics. Furthermore, many strains have acquired resistance mechanisms requiring new antimicrobials with novel mechanisms to enhance treatment options. Proline-rich antimicrobial peptides, such as the apidaecin analog Api137, are highly efficient against various Enterobacteriaceae infections in mice, but less active against P. aeruginosa in vitro. Here, we extended our recent work by optimizing lead peptides Api755 (gu-OIORPVYOPRPRPPHPRL-OH; gu = N,N,N′,N′-tetramethylguanidino, O = L-ornithine) and Api760 (gu-OWORPVYOPRPRPPHPRL-OH) by incorporation of Ile-Orn- and Trp-Orn-motifs, respectively. Api795 (gu-O(IO)2RPVYOPRPRPPHPRL-OH) and Api794 (gu-O(WO)3RPVYOPRPRPPHPRL-OH) were highly active against P. aeruginosa with minimal inhibitory concentrations of 8–16 and 8–32 μg/mL against Escherichia coli and Klebsiella pneumoniae. Assessed using a quartz crystal microbalance, these peptides inserted into a membrane layer and the surface activity increased gradually from Api137, over Api795, to Api794. This mode of action was confirmed by transmission electron microscopy indicating some membrane damage only at the high peptide concentrations. Api794 and Api795 were highly stable against serum proteases (half-life times >5 h) and non-hemolytic to human erythrocytes at peptide concentrations of 0.6 g/L. At this concentration, Api795 reduced the cell viability of HeLa cells only slightly, whereas the IC50 of Api794 was 0.23 ± 0.09 g/L. Confocal fluorescence microscopy revealed no colocalization of 5(6)-carboxyfluorescein-labeled Api794 or Api795 with the mitochondria, excluding interactions with the mitochondrial membrane. Interestingly, Api795 was localized in endosomes, whereas Api794 was present in endosomes and the cytosol. 
This was verified using flow cytometry showing a 50% higher uptake of Api794 in HeLa cells compared with Api795. The uptake was reduced for both peptides by 50 and 80%, respectively, after inhibiting endocytotic uptake with dynasore. In summary, Api794 and Api795 were highly active against P. aeruginosa in vitro. Both peptides passed across the bacterial membrane efficiently, most likely then disturbing the ribosome assembly, and resulting in further intracellular damage. Api795 with its IOIO-motif, which was particularly active and only slightly toxic in vitro, appears to represent a promising third generation lead compound for the development of novel antibiotics against P. aeruginosa. PMID:27243004

  14. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and currently supports the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative-based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying APIs; * Conducts code reviews.

  15. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  16. Impact of Nosema ceranae and Nosema apis on individual worker bees of the two host species (Apis cerana and Apis mellifera) and regulation of host immune response.

    PubMed

    Sinpoo, Chainarong; Paxton, Robert J; Disayathanoowat, Terd; Krongdang, Sasiprapa; Chantawannakul, Panuwan

    Nosema apis and Nosema ceranae are obligate intracellular microsporidian parasites that infect the midgut epithelial cells of adult honey bees, originally Apis mellifera and Apis cerana respectively. Each microsporidian can cross-infect the other host, and both now have a worldwide distribution. In this study, cross-infection experiments using both N. apis and N. ceranae in both A. mellifera and A. cerana were carried out to compare pathogen proliferation and impact on hosts, including the host immune response. Infection by N. ceranae led to higher spore loads than by N. apis in both host species, and there was greater proliferation of microsporidia in A. mellifera than in A. cerana. Both N. apis and N. ceranae were pathogenic in both host Apis species. N. ceranae induced subtly, though not significantly, higher mortality than N. apis in both host species, yet survival of A. cerana was no different from that of A. mellifera in response to N. apis or N. ceranae. Infections of both host species with N. apis and N. ceranae caused significant up-regulation of AMP genes and cellular-mediated immune genes but did not greatly alter apoptosis-related gene expression. In this study, A. cerana mounted a higher immune response and displayed lower loads of N. apis and N. ceranae spores than A. mellifera, suggesting it may be better able to defend itself against microsporidian infection. We caution against over-interpretation of our results, though, because differences between host and parasite species in survival were insignificant and because size differences between microsporidia species and between host Apis species may alternatively explain the differential proliferation of N. ceranae in A. mellifera. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Comparison of plasma amino acid profile-based index and CA125 in the diagnosis of epithelial ovarian cancers and borderline malignant tumors.

    PubMed

    Miyagi, Etsuko; Maruyama, Yasuyo; Mogami, Tae; Numazaki, Reiko; Ikeda, Atsuko; Yamamoto, Hiroshi; Hirahara, Fumiki

    2017-02-01

    We previously developed a new plasma amino acid profile-based index (API) to detect ovarian, cervical, and endometrial cancers. Here, we compared API to serum cancer antigen 125 (CA125) for distinguishing epithelial ovarian malignant tumors from benign growths. API and CA125 were measured preoperatively in patients with ovarian tumors, which were later classified into 59 epithelial ovarian cancers, 21 epithelial borderline malignant tumors, and 97 benign tumors including 40 endometriotic cysts. The diagnostic accuracy and cutoff points of API were evaluated using receiver operating characteristic (ROC) curves. The area under the ROC curves showed the equivalent performance of API and CA125 to discriminate between malignant/borderline malignant and benign tumors (both 0.77), and API was superior to CA125 for discrimination between malignant/borderline malignant lesions and endometriotic cysts (API, 0.75 vs. CA125, 0.59; p < 0.05). At the API cutoff level of 6.0, API and CA125 had equal positive rates of detecting cancers and borderline malignancies (API, 0.71 vs. CA125, 0.74; p = 0.84) or cancers alone (API, 0.73 vs. CA125, 0.85; p = 0.12). However, API had a significantly lower detection rate of benign endometriotic cysts (0.35; 95 % CI, 0.21-0.52) compared with that of CA125 (0.65; 95 % CI, 0.48-0.79) (p < 0.05). API is an effective new tumor marker to detect ovarian cancers and borderline malignancies with a low false-positive rate for endometriosis. A large-scale prospective clinical study using the cutoff value of API determined in this study is warranted to validate API for practical clinical use.
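The areas under the ROC curves reported above have a simple probabilistic reading: the chance that a randomly chosen malignant case scores higher than a randomly chosen benign one. A minimal sketch of that computation, using invented scores rather than the study's data:

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney statistic: the fraction of (positive, negative)
    pairs in which the positive case gets the higher score (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Invented index values for malignant and benign tumours.
malignant = [6.5, 7.2, 8.1]
benign = [5.0, 6.0, 7.0]
area = auc(malignant, benign)
```

An AUC of 0.5 would mean the marker carries no discriminating information; 1.0 means perfect separation of the two groups.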

  18. LANCE in ECHO - Merging Science and Near Real-Time Data Search and Order

    NASA Astrophysics Data System (ADS)

    Kreisler, S.; Murphy, K. J.; Vollmer, B.; Lighty, L.; Mitchell, A. E.; Devine, N.

    2012-12-01

    NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) Land Atmosphere Near real-time Capability for EOS (LANCE) project provides expedited data products from the Terra, Aqua, and Aura satellites within three hours of observation. In order to satisfy latency requirements, LANCE data are produced with relaxed ancillary data resulting in a product that may have minor differences from its science quality counterpart. LANCE products are used by a number of different groups to support research and applications that require near real-time earth observations, such as disaster relief, hazard and air quality monitoring, and weather forecasting. LANCE elements process raw rate-buffered and/or session-based production datasets into higher-level products, which are freely available to registered users via LANCE FTP sites. The LANCE project also generates near real-time full resolution browse imagery from these products, which can be accessed through the Global Imagery Browse Services (GIBS). In an effort to support applications and services that require timely access to these near real-time products, the project is currently implementing the publication of LANCE product metadata to the EOS ClearingHouse (ECHO), a centralized EOSDIS registry of EOS data. Metadata within ECHO is made available through an Application Program Interface (API), and applications can utilize the API to allow users to efficiently search and order LANCE data. Publishing near real-time data to ECHO will permit applications to access near real-time product metadata prior to the release of its science quality counterpart and to associate imagery from GIBS with its underlying data product.

  19. EpiCollect: linking smartphones to web applications for epidemiology, ecology and community data collection.

    PubMed

    Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G

    2009-09-16

    Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications which, in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by individual field workers or, for example, those data within certain values of a measured variable or a time period. Data collection frameworks utilising mobile phones, with data submission to and from central databases, are widely applicable and can give a field worker similar display and analysis tools on their mobile phone to those they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phone.
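The submit-and-filter workflow described here can be sketched as plain data handling. The record fields and filter parameters below are illustrative assumptions, not EpiCollect's actual schema or API:

```python
from datetime import datetime, timezone

def make_record(worker_id, lat, lon, fields):
    """Build a GPS- and time-stamped record ready to submit to the project's
    central web database."""
    record = {"worker": worker_id, "lat": lat, "lon": lon,
              "timestamp": datetime.now(timezone.utc).isoformat()}
    record.update(fields)
    return record

def filter_records(records, worker=None, key=None, lo=None, hi=None):
    """Filter by field worker and/or by a measured variable lying in [lo, hi],
    mimicking the display-filtering options mentioned in the abstract."""
    out = records
    if worker is not None:
        out = [r for r in out if r["worker"] == worker]
    if key is not None:
        out = [r for r in out if key in r and lo <= r[key] <= hi]
    return out
```

In the real system the filtered subset would then be rendered as markers on Google Maps, on either the phone or the web application.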

  20. TCL2 Ocean Scenario Replay

    NASA Technical Reports Server (NTRS)

    Mohlenbrink, Christoph P.; Omar, Faisal Gamal; Homola, Jeffrey R.

    2017-01-01

    This is a video replay of system data that was generated from the UAS Traffic Management (UTM) Technical Capability Level (TCL) 2 flight demonstration in Nevada and rendered in Google Earth. The replay depicts a particular set of flights conducted as part of what was referred to as the Ocean scenario. The test range and surrounding area are presented, followed by an overview of operational volumes. System messaging is also displayed, as well as a replay of all five test flights as they occurred.

  1. Smartphones and Time Zones

    NASA Astrophysics Data System (ADS)

    Baird, William; Secrest, Jeffery; Padgett, Clifford; Johnson, Wayne; Hagrelius, Claire

    2016-09-01

    Using the Sun to tell time is an ancient idea, but we can take advantage of modern technology to bring it into the 21st century for students in astronomy, physics, or physical science classes. We have employed smartphones, Google Earth, and 3D printing to find the moment of local noon at two widely separated locations. By reviewing GPS time-stamped photos from each place, we are able to illustrate that local noon is longitude-dependent and therefore explain the need for time zones.
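The longitude dependence of local noon is easy to quantify: the Earth turns 360° in 24 hours, so the Sun's meridian crossing shifts by 4 minutes for every degree of longitude. A minimal sketch, ignoring the equation of time (which adds a seasonal offset of up to roughly ±16 minutes); the example site coordinates are invented:

```python
def local_noon_offset_minutes(longitude_deg, zone_meridian_deg):
    """Minutes after clock noon at which the Sun crosses the local meridian:
    4 minutes per degree west of the time zone's reference meridian
    (equation of time ignored)."""
    return 4.0 * (zone_meridian_deg - longitude_deg)

# A site at 81.1 deg W in the US Eastern zone (reference meridian 75 deg W):
offset = local_noon_offset_minutes(-81.1, -75.0)  # about 24 min after 12:00
```

Comparing GPS-time-stamped photos of local noon from two such sites shows exactly this longitude-dependent offset.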

  2. Infusion of a Gaming Paradigm into Computer-Aided Engineering Design Tools

    DTIC Science & Technology

    2012-05-03

    Virtual Test Bed (VTB), and the gaming tool, Unity3D. This hybrid gaming environment coupled a three-dimensional (3D) multibody vehicle system model...from Google Earth to the 3D visual front-end fabricated around Unity3D. The hybrid environment was sufficiently developed to support analyses of the...The VTB simulation of the vehicle dynamics ran concurrently with and interacted with the gaming engine, Unity3D, which

  3. A tutorial for software development in quantitative proteomics using PSI standard formats

    PubMed Central

    Gonzalez-Galarza, Faviel F.; Qi, Da; Fan, Jun; Bessant, Conrad; Jones, Andrew R.

    2014-01-01

    The Human Proteome Organisation — Proteomics Standards Initiative (HUPO-PSI) has been working for ten years on the development of standardised formats that facilitate data sharing and public database deposition. In this article, we review three HUPO-PSI data standards — mzML, mzIdentML and mzQuantML, which can be used to design a complete quantitative analysis pipeline in mass spectrometry (MS)-based proteomics. In this tutorial, we briefly describe the content of each data model, sufficient for bioinformaticians to devise proteomics software. We also provide guidance on the use of recently released application programming interfaces (APIs) developed in Java for each of these standards, which makes it straightforward to read and write files of any size. We have produced a set of example Java classes and a basic graphical user interface to demonstrate how to use the most important parts of the PSI standards, available from http://code.google.com/p/psi-standard-formats-tutorial. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23584085

  4. Travel behavior of low income older adults and implementation of an accessibility calculator

    PubMed Central

    Moniruzzaman, Md; Chudyk, Anna; Páez, Antonio; Winters, Meghan; Sims-Gould, Joanie; McKay, Heather

    2016-01-01

    Given the aging demographic landscape, the concept of walkable neighborhoods has emerged as a topic of interest, especially during the last decade. However, we know very little about whether walkable neighborhoods promote walking among older adults, particularly those with lower incomes. Therefore in this paper we: (i) examine the relation between trip distance and sociodemographic attributes and accessibility features of lower income older adults in Metro Vancouver; and, (ii) implement a web-based application to calculate the accessibility of lower income older adults in Metro Vancouver based on their travel behavior. We use multilevel linear regression to estimate the determinants of trip length. We find that in this population distance traveled is associated with gender, living arrangements, and dog ownership. Furthermore, significant geographical variations (measured using a trend surface) were also found. To better visualize the impact of travel behavior on accessibility by personal profile and location, we also implemented a web-based calculator that generates an Accessibility (A)-score using Google Maps API v3 that can be used to evaluate the accessibility of neighborhoods from the perspective of older adults. PMID:27104148
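An accessibility score of the kind described above can be sketched as a distance-decay sum over reachable destinations. The exponential weight and the 1 km decay constant below are illustrative assumptions, not the published A-score formula:

```python
import math

def a_score(distances_m, decay_m=1000.0):
    """Toy accessibility score: each destination contributes exp(-d/decay),
    so nearby destinations count almost fully and distant ones fade out."""
    return sum(math.exp(-d / decay_m) for d in distances_m)

# A neighbourhood with a shop 200 m away scores higher than one 900 m away.
near = a_score([200.0])
far = a_score([900.0])
```

In the published calculator the distances come from Google Maps API v3 routing, and the decay would be calibrated against the trip lengths observed for this population.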

  5. Active pharmaceutical ingredients for antiretroviral treatment in low- and middle-income countries: a survey.

    PubMed

    Fortunak, Joseph M; de Souza, Rodrigo O M A; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro S M

    2014-01-01

    Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President's Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. 
The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent.

  6. Active pharmaceutical ingredients for antiretroviral treatment in low- and middle-income countries: a survey

    PubMed Central

    Fortunak, Joseph M; de Souza, Rodrigo OMA; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro SM

    2015-01-01

    Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President’s Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. 
The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent. PMID:25310430

  7. Smarter Earth Science Data System

    NASA Technical Reports Server (NTRS)

    Huang, Thomas

    2013-01-01

    The explosive growth in Earth observational data over the recent decade demands a better method of interoperability across heterogeneous systems. The Earth science data system community has mastered the art of storing large volumes of observational data, but it is still unclear how this traditional method scales over time as we enter the age of Big Data. Indexed search solutions such as Apache Solr (Smiley and Pugh, 2011) provide fast, scalable search via keywords or phrases, without any reasoning or inference. Modern search solutions, such as Google's Knowledge Graph (Singhal, 2012) and Microsoft Bing, utilize semantic reasoning to improve the accuracy of searches. The Earth science user community is demanding an intelligent solution to help them find the right data for their research. The Ontological System for Context Artifacts and Resources (OSCAR) (Huang et al., 2012) was created in response to the DARPA Adaptive Vehicle Make (AVM) program's need for an intelligent context models management system to empower its terrain simulation subsystem. The core component of OSCAR, the Environmental Context Ontology (ECO), is built using the Semantic Web for Earth and Environmental Terminology (SWEET) (Raskin and Pan, 2005). This paper presents the current data archival methodology within NASA Earth science data centers and discusses using the semantic web to improve the way we capture and serve data to our users.
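The gap between plain indexed search and semantically enhanced search can be illustrated with a toy example: expand the query through ontology relations before matching keywords. The tiny hand-made vocabulary below merely stands in for a real ontology such as SWEET:

```python
# Tiny hand-made vocabulary standing in for a real ontology such as SWEET:
# maps a concept to its narrower terms.
NARROWER = {"precipitation": {"rain", "snow", "drizzle"}}

def keyword_search(query, datasets):
    """Plain indexed search: exact keyword match only, no inference."""
    return [d["id"] for d in datasets if query in d["keywords"]]

def semantic_search(query, datasets):
    """Expand the query with narrower ontology terms, then match keywords."""
    terms = {query} | NARROWER.get(query, set())
    return [d["id"] for d in datasets if terms & set(d["keywords"])]
```

A dataset tagged only "rain" is invisible to a keyword query for "precipitation" but is found once the ontology supplies the narrower terms, which is the kind of inference the abstract argues indexed search alone cannot provide.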

  8. Mechanical analysis of the dry stone walls built by the Incas

    NASA Astrophysics Data System (ADS)

    Castro, Jaime; Vallejo, Luis E.; Estrada, Nicolas

    2017-06-01

    In this paper, the retaining walls in the agricultural terraces built by the Incas are analyzed from a mechanical point of view. In order to do so, ten different walls from the Lower Agricultural Sector of Machu Picchu, Perú, were selected using images from Google Street View and Google Earth Pro. These walls were then digitized and their mechanical stability was evaluated. Firstly, it was found that these retaining walls are characterized by two distinctive features: disorder and a block size distribution with a large size span, i.e., the particle size varies from blocks that can be carried by one person to large blocks weighing several tons. Secondly, it was found that, thanks to the large span of the block size distribution, the factor of safety of the Inca retaining walls is remarkably close to those recommended in modern geotechnical design standards. This suggests that these structures were not only functional but also highly optimized, probably as a result of a careful trial-and-error procedure.
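A factor of safety compares resisting forces to driving forces. As a generic illustration (not the authors' analysis method), the classical sliding check for a gravity retaining wall under Rankine active earth pressure looks like this, with all input values invented:

```python
import math

def rankine_ka(phi_deg):
    """Rankine active earth-pressure coefficient for soil friction angle phi."""
    phi = math.radians(phi_deg)
    return (1 - math.sin(phi)) / (1 + math.sin(phi))

def sliding_factor_of_safety(wall_weight_kN, phi_deg, gamma_kN_m3, height_m):
    """FS against sliding = frictional resistance at the base / active soil
    thrust on the back of the wall (per metre run of wall)."""
    thrust = 0.5 * rankine_ka(phi_deg) * gamma_kN_m3 * height_m ** 2
    return wall_weight_kN * math.tan(math.radians(phi_deg)) / thrust
```

Modern design standards typically require FS well above 1; the paper's finding is that the Inca walls, built without such formulas, land remarkably close to those recommended values.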

  9. The Use of Remote Sensing Data for Modeling Air Quality in the Cities

    NASA Astrophysics Data System (ADS)

    Putrenko, V. V.; Pashynska, N. M.

    2017-12-01

    Monitoring of environmental pollution in cities by remote sensing of the Earth is a topical area of research for sustainable development. Ukraine has a poorly developed network of monitoring stations for air quality, and their technical condition has deteriorated in recent years. Therefore, the possibility of obtaining data on the condition of the air by remote sensing methods is of great importance. This paper considers the possibility of using atmospheric data from the AERONET project to assess air quality in Ukraine. The main pollution indicators used were data on fine particulate matter (PM2.5) and nitrogen dioxide (NO2) content in the atmosphere. The main indicator of air quality in Ukraine is the air pollution index (API). We built regression models of the relationship between NO2 indicators measured by remote sensing methods and ground-based measurements of the same indicators. Regression models were also built relating the ground-adjusted NO2 data to the API. To model the relationship between the API and PM2.5, a geographically weighted regression model was used, which takes into account the territorial differentiation between these indicators. As a result, maps showing the distribution of the main types of pollution across the territory of Ukraine were constructed. PM2.5 modeling remains complicated with existing indicators, and requires a dedicated observation network for PM2.5 content in the atmosphere for sustainable development in the cities of Ukraine.
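The satellite-to-ground calibration step described above is, at its core, a regression of ground-station readings on remotely sensed values. A minimal least-squares sketch, with invented paired NO2 readings (the study's geographically weighted model additionally lets the coefficients vary over space):

```python
def ols_fit(x, y):
    """Slope and intercept of the simple least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Invented pairs: satellite NO2 column value vs ground-station concentration.
satellite = [1.0, 2.0, 3.0, 4.0]
ground = [3.1, 4.9, 7.1, 8.9]
slope, intercept = ols_fit(satellite, ground)
```

Once fitted, the line converts satellite retrievals into estimated ground-level values wherever no monitoring station exists.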

  10. IgE-Api m 4 Is Useful for Identifying a Particular Phenotype of Bee Venom Allergy.

    PubMed

    Ruiz, B; Serrano, P; Moreno, C

    Different clinical behaviors have been identified in patients allergic to bee venom. Compound-resolved diagnosis could be an appropriate tool for investigating these differences. The aims of this study were to analyze whether specific IgE to Api m 4 (sIgE-Api m 4) can identify a particular kind of bee venom allergy and to describe the response to bee venom immunotherapy (bVIT). Prospective study of 31 patients allergic to bee venom who were assigned to phenotype group A (sIgE-Api m 4 <0.98 kU/L), treated with native aqueous (NA) extract, or phenotype group B (sIgE-Api m 4 ≥0.98 kU/L), treated with purified aqueous (PA) extract. Sex, age, cardiovascular risk, severity of the preceding sting reaction, exposure to beekeeping, and immunological data (intradermal test, sIgE/sIgG4-Apis-nApi m 1, and sIgE-rApi m 2-Api m 4) were analyzed. Systemic reactions (SRs) during bVIT build-up were analyzed. Immunological and sting challenge outcomes were evaluated in each group after 1 and 2 years of bVIT. Phenotype B patients had more severe reactions (P=.049) and higher skin sensitivity (P=.011), baseline sIgE-Apis (P=.0004), sIgE-nApi m 1 (P=.0004), and sIgG4-Apis (P=.027) than phenotype A patients. Furthermore, 41% of patients in group B experienced SRs during the build-up phase with NA; the sting challenge success rate in this group was 82%. There were no significant reductions in serial intradermal test results, but an intense reduction in sIgE-nApi m 1 (P=.013) and sIgE-Api m 4 (P=.004) was observed after the first year of bVIT. Use of IgE-Api m 4 as the only discrimination criterion demonstrated differences in bee venom allergy. Further investigation with larger populations is necessary.

  11. Chimeras of Bet v 1 and Api g 1 reveal heterogeneous IgE responses in patients with birch pollen allergy

    PubMed Central

    Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian

    2014-01-01

    Background Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. Objective We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Methods Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1–sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. Results rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. Conclusion The Bet v 1–specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. PMID:24529686

  12. Predominant Api m 10 sensitization as risk factor for treatment failure in honey bee venom immunotherapy.

    PubMed

    Frick, Marcel; Fischer, Jörg; Helbling, Arthur; Ruëff, Franziska; Wieczorek, Dorothea; Ollert, Markus; Pfützner, Wolfgang; Müller, Sabine; Huss-Marp, Johannes; Dorn, Britta; Biedermann, Tilo; Lidholm, Jonas; Ruecker, Gerta; Bantleon, Frank; Miehe, Michaela; Spillner, Edzard; Jakob, Thilo

    2016-12-01

    Component resolution recently identified distinct sensitization profiles in honey bee venom (HBV) allergy, some of which were dominated by specific IgE to Api m 3 and/or Api m 10, which have been reported to be underrepresented in therapeutic HBV preparations. We performed a retrospective analysis of component-resolved sensitization profiles in HBV-allergic patients and association with treatment outcome. HBV-allergic patients who had undergone controlled honey bee sting challenge after at least 6 months of HBV immunotherapy (n = 115) were included and classified as responder (n = 79) or treatment failure (n = 36) on the basis of absence or presence of systemic allergic reactions upon sting challenge. IgE reactivity to a panel of HBV allergens was analyzed in sera obtained before immunotherapy and before sting challenge. No differences were observed between responders and nonresponders regarding levels of IgE sensitization to Api m 1, Api m 2, Api m 3, and Api m 5. In contrast, Api m 10 specific IgE was moderately but significantly increased in nonresponders. Predominant Api m 10 sensitization (>50% of specific IgE to HBV) was the best discriminator (specificity, 95%; sensitivity, 25%) with an odds ratio of 8.444 (2.127-33.53; P = .0013) for treatment failure. Some but not all therapeutic HBV preparations displayed a lack of Api m 10, whereas Api m 1 and Api m 3 immunoreactivity was comparable to that of crude HBV. In line with this, significant Api m 10 sIgG 4 induction was observed only in those patients who were treated with HBV in which Api m 10 was detectable. Component-resolved sensitization profiles in HBV allergy suggest predominant IgE sensitization to Api m 10 as a risk factor for treatment failure in HBV immunotherapy. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
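
The discrimination figures quoted in this abstract (specificity 95%, sensitivity 25%, an odds ratio with a confidence interval) follow from a standard 2x2 contingency table. The sketch below computes them from a hypothetical table; the counts are not the study's patient numbers.

```python
import math

def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, odds ratio, and Woolf 95% CI from a 2x2 table.

    tp/fn: marker-positive/negative among treatment failures,
    fp/tn: marker-positive/negative among responders (hypothetical layout).
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn)
    # Woolf's method: normal approximation on the log odds ratio.
    se = math.sqrt(1 / tp + 1 / fp + 1 / fn + 1 / tn)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se)
    hi = math.exp(math.log(odds_ratio) + 1.96 * se)
    return sensitivity, specificity, odds_ratio, (lo, hi)

# Hypothetical counts for illustration only.
sens, spec, or_, ci = diagnostics(tp=9, fp=4, fn=27, tn=75)
```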

  13. Chimeras of Bet v 1 and Api g 1 reveal heterogeneous IgE responses in patients with birch pollen allergy.

    PubMed

    Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian

    2014-07-01

    Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1-sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. The Bet v 1-specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. Copyright © 2014 The Authors. Published by Mosby, Inc. All rights reserved.

  14. Temporal and spatial behavior of pharmaceuticals in ...

    EPA Pesticide Factsheets

    The behavior of active pharmaceutical ingredients (APIs) in urban estuaries is not well understood. In this study, 15 high volume usage APIs were measured over a one year period throughout Narragansett Bay, RI, USA to determine factors controlling their concentration and distribution. Dissolved APIs ranged in concentration from not detected to 310 ng/L, with numerous APIs present at all sites and sampling periods. Eight APIs were present in suspended particulate material, ranging in concentration from <1 ng/g to 44 ng/g. Partitioning coefficients (Kds) were determined for APIs present in both the dissolved and particulate phases, with their range and variability remaining relatively constant during the study. Organic carbon normalization reduced the observed variability of several APIs to a small extent; however, other factors appear to play a role in controlling partitioning behavior. The continuous discharge of wastewater treatment plant effluents into upper Narragansett Bay sustained API levels there, creating a zone of “pseudo-persistence.” For most of the APIs, there was a strong relationship with salinity, indicating conservative behavior within the estuary. Short flushing times in Narragansett Bay, coupled with APIs present primarily in the dissolved phase, suggest that most APIs will be diluted and transported out of the estuary, with only small amounts of several compounds removed to and sequestered in sediments.
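
The partitioning coefficient mentioned above is conventionally the ratio of particulate-phase to dissolved-phase concentration, Kd = Cp/Cd, and organic carbon normalization divides Kd by the organic-carbon fraction of the particles, Koc = Kd/foc. A minimal sketch, using hypothetical concentrations and an assumed foc value:

```python
def partition_coefficient(c_particulate_ng_per_g, c_dissolved_ng_per_l):
    """Kd in L/g: particulate-phase over dissolved-phase concentration."""
    return c_particulate_ng_per_g / c_dissolved_ng_per_l

def oc_normalized(kd, f_oc):
    """Koc: Kd normalized by the organic-carbon fraction of the particles."""
    return kd / f_oc

# Hypothetical API measured at 44 ng/g on particles and 310 ng/L dissolved;
# f_oc = 0.05 is an assumed organic-carbon fraction, not a study value.
kd = partition_coefficient(44.0, 310.0)   # L/g
koc = oc_normalized(kd, f_oc=0.05)        # L/g organic carbon
```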

  15. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  16. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  17. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  18. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  19. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  20. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  1. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  2. 78 FR 48738 - Self-Regulatory Organizations; C2 Options Exchange, Incorporated; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... depend upon the Application Programming Interface (``API'') a Permit Holder is using.\\4\\ Currently, the Exchange offers two APIs: CBOE Market Interface (``CMi'') API and Financial Information eXchange (``FIX... available APIs, and if applicable, which version, it would like to use. \\4\\ An API is a computer interface...

  3. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  4. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...

  5. 49 CFR 195.565 - How do I install cathodic protection on breakout tanks?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...

  6. Earth Observation oriented teaching materials development based on OGC Web services and Bashyt generated reports

    NASA Astrophysics Data System (ADS)

    Stefanut, T.; Gorgan, D.; Giuliani, G.; Cau, P.

    2012-04-01

    Creating e-Learning materials in the Earth Observation domain is a difficult task, especially for non-technical specialists who have to deal with distributed repositories, large amounts of information, and intensive processing requirements. Furthermore, due to the lack of specialized applications for developing teaching resources, technical knowledge is also required for defining data presentation structures or for developing and customizing user interaction techniques for better teaching results. In response to these issues, during the GiSHEO FP7 project [1] and later in the EnviroGRIDS FP7 project [2], we developed the eGLE e-Learning Platform [3], a tool-based application that provides dedicated functionalities to Earth Observation specialists for developing teaching materials. The proposed architecture is built around a client-server design that provides the core functionalities (e.g. user management, tool integration, teaching material settings) and has been extended with a distributed component implemented through the tools integrated into the platform, as described further. Our approach to dealing with multiple transfer protocol types, heterogeneous data formats, and various user interaction techniques involves the development and integration of highly specialized elements (tools) that trainers can customize visually through simple user interfaces. In our concept, each tool is dedicated to a specific data type, implementing optimized mechanisms for searching, retrieving, visualizing, and interacting with it. At the same time, any number of tools can be integrated into each learning resource through drag-and-drop interaction, allowing the teacher to retrieve pieces of data of various types (e.g. images, charts, tables, text, videos) from different sources (e.g. OGC web services, charts created through the Bashyt application) through different protocols (e.g. WMS, BASHYT API, FTP, HTTP)
and to display them all together in a unitary manner using the same visual structure [4]. To address the high-performance computing requirements of processing environmental data, our platform can easily be extended through tools that connect to GRID infrastructures, WCS web services, the Bashyt API (for creating specialized hydrological reports), or any other specialized services reachable over the Internet (e.g. graphics cluster visualization). At run time, each tool is launched on the trainee's computer in an asynchronous running mode and connects to the data source established by the teacher, retrieving and displaying the information to the user. The data transfer is accomplished directly between the trainee's computer and the corresponding services (e.g. OGC, Bashyt API) without passing through the core server platform. In this manner, the eGLE application can provide better and more responsive connections to a large number of users.

  7. Foraging behavior of honey bees (hymenoptera: Apidae) on Brassica nigra and B. rapa grown under simulated ambient and enhanced UV-B radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, S.A.; Robinson, G.E.; Conner, J.K.

    Two species of mustard, Brassica nigra and B. rapa, were grown under simulated ambient and enhanced ultraviolet-B (UV-B) radiation and exposed to pollinators, Apis mellifera L. Observations were made to determine whether UV-B-induced changes in these plants affected pollinator behavior. Total duration of the foraging trip, number of flowers visited, foraging time per flower, search time per flower, total amount of pollen collected, and pollen collected per flower were measured. There were no significant differences between UV-B treatments in any of the behaviors measured or in any of the pollen measurements. These results suggest that increases in the amount of solar UV-B reaching the earth's surface may not have a negative effect on the relationship between these members of the genus Brassica and their honey bee pollinators. 28 refs., 2 figs., 1 tab.

  8. A Global Repository for Planet-Sized Experiments and Observations

    NASA Technical Reports Server (NTRS)

    Williams, Dean; Balaji, V.; Cinquini, Luca; Denvil, Sebastien; Duffy, Daniel; Evans, Ben; Ferraro, Robert D.; Hansen, Rose; Lautenschlager, Michael; Trenham, Claire

    2016-01-01

    Working across U.S. federal agencies, international agencies, and multiple worldwide data centers, and spanning seven international network organizations, the Earth System Grid Federation (ESGF) allows users to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. Data served by ESGF not only include model output (i.e., CMIP simulation runs) but also include observational data from satellites and instruments, reanalyses, and generated images. Metadata summarize basic information about the data for fast and easy data discovery.
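
ESGF data discovery is typically driven by a faceted search REST endpoint on an index node. As a minimal sketch of how such a query URL might be assembled: the hostname and the facet names (`project`, `variable_id`) below are illustrative assumptions, and a live ESGF index node should be consulted for the facets it actually supports.

```python
from urllib.parse import urlencode

# Hypothetical ESGF index node; substitute the node you intend to query.
BASE = "https://esgf-node.llnl.gov/esg-search/search"

def esgf_search_url(**facets):
    """Build a faceted search URL with distributed search and JSON output."""
    params = {
        "distrib": "true",                    # federate across peer nodes
        "format": "application/solr+json",    # machine-readable results
        "limit": 10,                          # cap the result page size
    }
    params.update(facets)
    return BASE + "?" + urlencode(params)

url = esgf_search_url(project="CMIP6", variable_id="tas")
```

The returned JSON would then list matching datasets and the download or OPeNDAP endpoints that serve them.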

  9. Re-Organizing Earth Observation Data Storage to Support Temporal Analysis of Big Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2017-01-01

    The Earth Observing System Data and Information System archives many datasets that are critical to understanding long-term variations in Earth science properties. Thus, some of these are large, multi-decadal datasets. Yet the challenge in long time series analysis comes less from the sheer volume than the data organization, which is typically one (or a small number of) time steps per file. The overhead of opening and inventorying complex, API-driven data formats such as Hierarchical Data Format introduces a small latency at each time step, which nonetheless adds up for datasets with O(10^6) single-timestep files. Several approaches to reorganizing the data can mitigate this overhead by an order of magnitude: pre-aggregating data along the time axis (time-chunking); storing the data in a highly distributed file system; or storing data in distributed columnar databases. Storing a second copy of the data incurs extra costs, so some selection criteria must be employed, which would be driven by expected or actual usage by the end user community, balanced against the extra cost.
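
The per-file overhead argument in this abstract can be made concrete with a back-of-the-envelope model: if each file open costs a fixed latency, reading a long time series scales with the number of files, not the data volume. The costs below are arbitrary units, not measured HDF latencies.

```python
# Why time-chunking helps: fixed open/inventory overhead is paid once per
# file, so aggregating timesteps into fewer files shrinks total overhead.

OPEN_COST = 1          # fixed overhead per file opened (arbitrary units)
N_TIMESTEPS = 1000     # length of the time series to read
CHUNK = 100            # timesteps aggregated per file after reorganization

def read_cost(n_timesteps, steps_per_file):
    """Total fixed open overhead to read the full time series."""
    files_opened = -(-n_timesteps // steps_per_file)  # ceiling division
    return files_opened * OPEN_COST

unchunked = read_cost(N_TIMESTEPS, 1)      # one timestep per file
chunked = read_cost(N_TIMESTEPS, CHUNK)    # pre-aggregated along time
```

With these numbers the reorganized layout pays the open cost 100 times less often, matching the order-of-magnitude mitigation the abstract describes.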

  10. Re-organizing Earth Observation Data Storage to Support Temporal Analysis of Big Data

    NASA Astrophysics Data System (ADS)

    Lynnes, C.

    2017-12-01

    The Earth Observing System Data and Information System archives many datasets that are critical to understanding long-term variations in Earth science properties. Thus, some of these are large, multi-decadal datasets. Yet the challenge in long time series analysis comes less from the sheer volume than the data organization, which is typically one (or a small number of) time steps per file. The overhead of opening and inventorying complex, API-driven data formats such as Hierarchical Data Format introduces a small latency at each time step, which nonetheless adds up for datasets with O(10^6) single-timestep files. Several approaches to reorganizing the data can mitigate this overhead by an order of magnitude: pre-aggregating data along the time axis (time-chunking); storing the data in a highly distributed file system; or storing data in distributed columnar databases. Storing a second copy of the data incurs extra costs, so some selection criteria must be employed, which would be driven by expected or actual usage by the end user community, balanced against the extra cost.

  11. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...: Prevention measure Standard Credit(percent) Secondary containment >100% NFPA 30 50 Built/repaired to API standards API STD 620/650/653 10 Overfill protection standards API RP 2350 5 Testing/cathodic protection API...

  12. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...: Prevention measure Standard Credit(percent) Secondary containment > 100% NFPA 30 50 Built/repaired to API standards API STD 620/650/653 10 Overfill protection standards API RP 2350 5 Testing/cathodic protection API...

  13. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...: Prevention measure Standard Credit(percent) Secondary containment >100% NFPA 30 50 Built/repaired to API standards API STD 620/650/653 10 Overfill protection standards API RP 2350 5 Testing/cathodic protection API...

  14. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...: Prevention measure Standard Credit(percent) Secondary containment > 100% NFPA 30 50 Built/repaired to API standards API STD 620/650/653 10 Overfill protection standards API RP 2350 5 Testing/cathodic protection API...

  15. 49 CFR 194.105 - Worst case discharge.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...: Prevention measure Standard Credit(percent) Secondary containment > 100% NFPA 30 50 Built/repaired to API standards API STD 620/650/653 10 Overfill protection standards API RP 2350 5 Testing/cathodic protection API...

  16. The novel Akt inhibitor API-1 induces c-FLIP degradation and synergizes with TRAIL to augment apoptosis independent of Akt inhibition

    PubMed Central

    Li, Bo; Ren, Hui; Yue, Ping; Chen, Mingwei; Khuri, Fadlo R.; Sun, Shi-Yong

    2012-01-01

    API-1 is a novel small molecule inhibitor of Akt, which acts by binding to Akt and preventing its membrane translocation, and has promising preclinical antitumor activity. In this study, we reveal a novel function of API-1 in regulation of c-FLIP levels and tumor necrosis factor-related apoptosis-inducing ligand (TRAIL)-induced apoptosis, independent of Akt inhibition. API-1 effectively induced apoptosis in tested cancer cell lines including activation of caspase-8 and caspase-9. It reduced the levels of c-FLIP without increasing the expression of DR4 or DR5. Accordingly, it synergized with TRAIL to induce apoptosis. Enforced expression of ectopic c-FLIP did not attenuate API-1-induced apoptosis, but inhibited its ability to enhance TRAIL-induced apoptosis. These data indicate that downregulation of c-FLIP mediates enhancement of TRAIL-induced apoptosis by API-1, but is not sufficient for API-1-induced apoptosis. API-1-induced reduction of c-FLIP could be blocked by the proteasome inhibitor MG132. Moreover, API-1 increased c-FLIP ubiquitination and decreased c-FLIP stability. These data together suggest that API-1 downregulates c-FLIP by facilitating its ubiquitination and proteasome-mediated degradation. Since other Akt inhibitors including API-2 and MK2206 had minimal effects on reducing c-FLIP and enhancement of TRAIL-induced apoptosis, it is likely that API-1 reduces c-FLIP and enhances TRAIL-induced apoptosis independent of its Akt-inhibitory activity. PMID:22345097

  17. Examples to Keep the Passion for the Geosciences

    NASA Astrophysics Data System (ADS)

    Fernández Raga, María; Palencia Coto, Covadonga; Cerdà, Artemi

    2014-05-01

    It is said that beasts can smell fear. Translated to education, students know whether teaching is truly our vocation or merely a side effect of a research career path. To become a good teacher, you need to love teaching! Educational work requires a dynamic appeal to students: it should be entertaining, motivating, interactive and dynamic. In this session I will present several tips and examples for holding attention in your geology sessions: 1. The teacher should maintain a high interest in the subject of the work. Motivation is contagious: if you show passion, the others will feel it. 2. Change the attitude of students. Some activities can help with this, such as asking students to prepare an experiment and analyze the results; some examples will be shown. 3. Arouse the curiosity of the students, for instance by asking questions in novel, controversial or inconsistent ways, or by posing conceptual conflicts and paradoxes that seem at odds with what is being studied. 4. Use tools to get the attention of the students, such as Google Maps and Google Earth (teaching them to design routes and mark study sites), Google Drive (to create documents online in a team and share files), and Google Plus (to post interesting news). 5. Examine students each week. Although laborious, this makes their work and learning more gradual. 6. Increase levels of healthy competition among peers. 7. Relate what students already know to what they are learning; it is very important to be aware of pupils' prior knowledge, for example through a prior-knowledge test. 8. Feel competent. A teacher's confidence is vital when teaching a class: be aware of your weaknesses and stay humble, but let your nerves push you to improve the quality of your classes. 9. Individualize teaching and learning.
Numerous psychological and sociological studies suggest that social networks contribute to a person's welfare and health. Applied to training, promoting social networks within the classroom encourages participation and aids student learning. The criterion for dissolving or strengthening these natural groups is whether they suit the proposed educational approach (objectives, content, interests, etc.). And last but not least… 10. Never stop learning! Teaching geosciences needs passion for the Earth, its processes, its forms, and for showing all of this to students in the field.

  18. A Ruby API to query the Ensembl database for genomic features.

    PubMed

    Strozzi, Francesco; Aerts, Jan

    2011-04-01

    The Ensembl database makes genomic features available via its Genome Browser. It is also possible to access the underlying data through a Perl API for advanced querying. We have developed a full-featured Ruby API to the Ensembl databases, providing the same functionality as the Perl interface with additional features. A single Ruby API is used to access different releases of the Ensembl databases and is also able to query multi-species databases. Most functionality of the API is provided using the ActiveRecord pattern. The library depends on introspection to make it release independent. The API is available through the Rubygem system and can be installed with the command gem install ruby-ensembl-api.

  19. Digital Earth - Young generation's comprehension and ideas

    NASA Astrophysics Data System (ADS)

    Bandrova, T.; Konecny, M.

    2014-02-01

    The authors are experienced in working with children and students in the field of early warning and crises management and cartography. All these topics are closely connected to Digital Earth (DE) ideas. On the basis of a questionnaire, the young generation's comprehension of DE concept is clarified. Students from different age groups (from 19 to 36) from different countries and with different social, cultural, economical and political backgrounds are asked to provide definition of DE and describe their basic ideas about meaning, methodology and applications of the concept. The questions aim to discover the young generation's comprehension of DE ideas. They partially cover the newest trends of DE development like social, cultural and environmental issues as well as the styles of new communications (Google Earth, Facebook, LinkedIn, etc.). In order to assure the future development of the DE science, it is important to take into account the young generation's expectations. Some aspects of DE development are considered in the Conclusions.

  20. 49 CFR 195.205 - Repair, alteration and reconstruction of aboveground breakout tanks that have been in service.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-refrigerated and tanks built to API Standard 650 or its predecessor Standard 12C, repair, alteration, and reconstruction must be in accordance with API Standard 653. (2) For tanks built to API Specification 12F or API..., examination, and material requirements of those respective standards. (3) For high pressure tanks built to API...
